Bot upvotes vs real upvotes: the 12.5-hour decay test
Bot upvotes purge fast, real upvotes hold weight. The 12.5-hour decay test, retention curves by voter tier, and the account-graph signal Reddit reads first.
The 12.5-hour decay test names the only window that decides whether a Reddit upvote campaign actually moved the post. The hot algorithm preserved in the open-source clux/decay reference uses log10(max(|score|, 1)) + (timestamp / 45000), where 45,000 seconds is 12.5 hours - the interval over which a post needs 10× the score to hold the same rank. Inside that window, every vote that survives a vote-integrity sweep and clears the Contributor Quality Score weighting applies at full leverage; outside it, the leverage has decayed by 90%+ and the post is functionally dead. Bot upvote networks fail the test on both axes: they purge inside the sweep window and weight near zero through CQS scaling. Signals runs an aged Reddit account marketplace plus an editorial network for AI brand mentions across Reddit, Quora, Product Hunt, and Threads, and the retention data we ship into client campaigns lines up with the public testing below. Operators choosing to buy Reddit upvotes on a budget rarely see the test fail in real time - they see a stalled post and blame fuzzing.
The decay constant: every 12.5 hours of post age costs a full point on the log-scale score, defining the window in which vote weight matters.
What is the 12.5-hour decay test for paid Reddit upvotes?
The 12.5-hour decay test measures whether a paid vote actually applies leverage inside the window where Reddit's ranking algorithm cares about it. The hot-sort score formula divides the post's age in seconds by 45,000, so a post 12.5 hours old needs 10× the score of a fresh post to hold rank, and a 25-hour post needs 100×. The test is straightforward: count how many of your paid votes are still on the post (and weighted at near-full algorithm weight) at 1 hour, 12.5 hours, and 24 hours. A campaign that retains 90%+ at 12.5 hours with Moderate-or-higher voter CQS scored full leverage on every vote. A campaign that retains 50% at 12.5 hours and is running Lowest-CQS voters delivered roughly 5-10× less effective ranking weight than the order page promised. Most order pages do not show this number because most vendors fail it.
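The arithmetic behind the test can be sketched in a few lines of Python. The formula follows the simplified clux/decay form quoted above; the epoch value and vote counts are illustrative, not Reddit's internal numbers.

```python
import math

DECAY_DIVISOR = 45_000  # seconds = 12.5 hours: one log10 point per window

def hot_score(score: int, created_ts: float) -> float:
    # Simplified hot-sort form quoted above:
    # log10(max(|score|, 1)) + (timestamp / 45000)
    return math.log10(max(abs(score), 1)) + created_ts / DECAY_DIVISOR

now = 1_700_000_000  # arbitrary illustrative epoch

# A fresh post at 100 net votes ties a 12.5-hour-old post at 1,000
# and a 25-hour-old post at 10,000 - the 10x-per-window requirement.
fresh = hot_score(100, now)
aged_12_5h = hot_score(1_000, now - 12.5 * 3600)
aged_25h = hot_score(10_000, now - 25 * 3600)
print(fresh, aged_12_5h, aged_25h)  # equal to within floating-point noise
```

Running the three calls side by side makes the 10×-per-window cost concrete: each extra 12.5 hours of age demands another decade of score just to tie.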
How does Reddit's detection stack distinguish bot votes from real votes?
Reddit reads three layers - voter account quality, IP and fingerprint graph, and vote timing - and the bot vs real distinction shows up in all three at once. Real aged voter accounts have months or years of comment history, posting cadence, subreddit diversity, and engagement variance; farmed bot accounts have shallow karma stacks built from copypasta on low-stakes subs and almost no comment depth. The Contributor Quality Score documentation confirms account behavior, security, and engagement feed the trust rating, with email-verified and consistently-upvoted accounts scoring higher. Derek Hsieh's Kafka Summit 2021 talk on Reddit's ksqlDB streaming detector adds the timing layer - bursts that miss the natural launch-curve envelope flag inside minutes. Bot networks score weakly across all three and produce a graph signature Reddit's vote-cheating policy was written to catch.
What does bot vs real retention look like at 1 hour?
At the 1-hour mark, bot upvotes lose 30-50% of vote count to first-window sweeps while real-account drips usually retain 95%+. The streaming detector Hsieh described moved purge timing from hourly Airflow batches to minute-scale ksqlDB jobs, which means a bot-network blast on a fresh post often gets reversed before the post leaves Rising. The REDAccs $480 service audit tested six providers with controlled orders and the bottom-tier services retained under 50% past four hours - a number that lines up with what operator threads on BlackHatWorld describe for bulk bot packages. Real-account drip campaigns from the audit's top-tier providers (UpvoteShop and REDAccs) showed 0.8% and 2.1% drop at 7 days, implying near-zero loss inside the first hour. The 1-hour gap is the largest single signal that separates the two.
What does bot vs real retention look like at 12.5 hours and 7 days?
By 12.5 hours, bulk bot networks have typically lost 40-60% of votes; by day 7, cumulative loss sits at 26-31% on the audited bottom-tier services and 50%+ on the cheapest bulk pools. The audit's 7-day drop figures are the cleanest public data set: UpvoteShop 0.8%, REDAccs 2.1%, SocialPro 11.4%, SocialPlug 18.7%, RedditUpvote.Net 26.3%, UpvoteMax 31.2%. The retention gradient maps directly to voter quality - aged-account drips at the top, bulk-account blasts at the bottom. The cumulative loss matters because Reddit's vote-integrity sweeps run on a cadence rather than once - a vote that survives the first sweep can still be removed by the second or third pass once the account graph updates with stronger signal.
| Window | Aged-account drip retention | Bulk bot-network retention | Effective weight gap |
|---|---|---|---|
| 1 hour | 95-99% | 50-70% | 4-8× |
| 12.5 hours | 90-97% | 40-60% | 5-10× |
| 24 hours | 88-95% | 35-55% | 6-12× |
| 7 days | 80-95% | 50-74% (audited bottom tier; cheapest bulk pools run lower) | 3-15× depending on package |
The "effective weight gap" combines retention with CQS-scaled vote weight - a Lowest-CQS vote that survives a sweep still contributes ~0.1× algorithm weight, so the gap is the multiplier between effective ranking lift on quality vs cheap packages.
How does the account graph expose farmed vote sources?
Farmed vote sources expose themselves through three signals the account graph reads simultaneously: shallow comment-to-vote ratio, IP and fingerprint clustering, and creation-date bunching. Bot farms produce accounts that vote constantly and comment rarely - the inverse of organic behavior, where comments outnumber new posts but vote activity is balanced across many subs. Reddit's vote-cheating policy names "creating multiple accounts to vote" as the textbook violation, and the graph traces it through shared infrastructure - a 50-account farm running on a single residential proxy pool produces a star pattern around that IP block that the vote-manipulation review sweeps catch on quarterly cadences. Creation-date bunching is the third signal: 30 accounts created within a week, all warming up on r/freekarma4u or r/karmacourt, then activating on the same buyer post in week 2 - the cluster is geometric and obvious. Real aged accounts have organic creation history spread across months or years, varied participation, and IP movement that matches a real user's ISP and travel patterns.
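Creation-date bunching is the easiest of the three signals to illustrate in code. The sliding-window check below is a hypothetical sketch of the signal, not Reddit's implementation; the 7-day window and 10-account threshold are assumed values chosen to match the 30-accounts-in-a-week example above.

```python
def creation_bunched(created_days: list[float],
                     window: float = 7.0,
                     threshold: int = 10) -> bool:
    """Flag a voter pool if `threshold` or more accounts were created
    within any `window`-day span - the geometric cluster described above."""
    created = sorted(created_days)
    left = 0
    for right in range(len(created)):
        # Shrink the window until it spans at most `window` days.
        while created[right] - created[left] > window:
            left += 1
        if right - left + 1 >= threshold:
            return True
    return False

# 30 farm accounts registered across a single week -> flagged
print(creation_bunched([100 + i * 0.2 for i in range(30)]))   # True
# 30 organic accounts spread across ~4 years -> not flagged
print(creation_bunched([i * 50.0 for i in range(30)]))        # False
```

Real detection would weight this against the IP graph and comment depth, but even this one-signal sketch shows why bunched registration dates are cheap to catch.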
What does cost-per-surviving-vote actually look like?
The cost-per-surviving-weighted-vote multiplier on a bot package is typically 4-20× the order-page price after retention and CQS scaling. A $0.05/vote bot package that retains 50% at 24 hours and weights at 0.1× CQS contributes effective leverage equivalent to ~$1.00/vote at full weight - and that's before counting the post that didn't move. A $0.20/vote aged-account drip that retains 92% at 24 hours and weights at 0.9× CQS contributes effective leverage equivalent to ~$0.24/vote at full weight, often cheaper per moved-rank-position than the cheap order. The full pricing breakdown by tier is in our 2026 Reddit upvote cost map, and the timing protocol that makes either tier survive the test is in why drip beats blast. Bot packages look cheap at the order line and almost always cost more per outcome.
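The arithmetic above reduces to one formula: order price divided by (retention × CQS weight). A minimal sketch using the example figures from the paragraph - the CQS multipliers are illustrative assumptions, not Reddit-published values:

```python
def cost_per_effective_vote(price: float, retention: float,
                            cqs_weight: float) -> float:
    # Effective leverage per purchased vote = retention x CQS weight,
    # so the real cost per full-weight vote is price divided by that.
    return price / (retention * cqs_weight)

bot = cost_per_effective_vote(0.05, retention=0.50, cqs_weight=0.1)
drip = cost_per_effective_vote(0.20, retention=0.92, cqs_weight=0.9)
print(f"bot package: ${bot:.2f} per effective vote")   # $1.00
print(f"aged drip:   ${drip:.2f} per effective vote")  # $0.24
```

The $0.05 order line ends up twenty times its sticker price per unit of ranking leverage, while the $0.20 drip barely moves from its sticker price.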
When does a bot upvote actually move rank?
Almost never on a fresh promotional post, and only marginally on a niche-sub post that was already organically lifting. The narrow case where a Lowest-CQS bot vote contributes is on a sub small enough that the algorithm's velocity sensitivity overwhelms its CQS scaling - a 100-subscriber niche sub where any vote at all moves the post into Hot. For any sub with active organic engagement, the CQS scaling drops bot vote weight close enough to zero that the hot algorithm does not notice the contribution, even when the votes are not purged. The retention numbers above describe what the post displays; the rank-impact numbers are smaller still. Operators reading "I bought 500 votes and the count stuck but nothing happened" are typically reading the difference between display retention and effective algorithm weight - the votes survived the sweep but contributed near-zero leverage.
How to run the 12.5-hour decay test on your own campaign
Run the test with three measurements: capture vote count at 1 hour, 12.5 hours, and 24 hours; capture position on Hot and Rising at the same intervals; and capture the comment-to-vote ratio across the same windows. Vote count alone gives a misleading signal because Reddit deliberately fuzzes the displayed numbers - the Reddiquette page confirms the practice, and TechCrunch's 2016 reporting on the algorithm overhaul is the canonical reference for the fuzz layer. Position changes across the windows tell you whether weight is actually being applied. A package that retains count but loses position by hour 6 is showing CQS scaling masked behind display retention - the votes are visible but contributing near-zero leverage. The clean pass is a consistent or improving Hot position from minute 30 through hour 12.5, a comment-to-vote ratio in the 1:1 band per Reddit's anti-spam weighting, and minimal vote-count drops between sweeps.
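The measurements fold into a single pass/fail check. This is a hypothetical scoring sketch of the protocol above - the 90% retention bar is the article's pass threshold, and `hot_position` is whatever rank you logged on the sub's Hot page (lower is better):

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    hours: float        # post age when measured
    votes: int          # displayed (fuzzed) vote count
    hot_position: int   # rank on the sub's Hot page; lower is better

def decay_test(snaps: list[Snapshot]) -> bool:
    snaps = sorted(snaps, key=lambda s: s.hours)
    peak = max(s.votes for s in snaps)
    mid = next(s for s in snaps if s.hours >= 12.5)  # the 12.5h reading
    retained = mid.votes / peak
    # "Consistent or improving" Hot position between measurements:
    position_held = all(later.hot_position <= earlier.hot_position
                        for earlier, later in zip(snaps, snaps[1:]))
    return retained >= 0.90 and position_held

clean = [Snapshot(1, 100, 8), Snapshot(12.5, 93, 6), Snapshot(24, 91, 6)]
purged = [Snapshot(1, 100, 8), Snapshot(12.5, 55, 20), Snapshot(24, 50, 40)]
print(decay_test(clean))   # True
print(decay_test(purged))  # False
```

The comment-to-vote ratio is deliberately left out of this sketch because it needs a per-sub organic baseline; log it alongside the snapshots and compare by hand.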
Frequently asked questions
How fast does Reddit purge bot upvotes?
The fastest sweeps fire inside minutes via the streaming ksqlDB detector Derek Hsieh described in 2021, with follow-up sweeps running across hours and days. Bot-network blasts on fresh posts typically lose 30-50% of votes within the first hour and continue purging through the 24-hour mark as the account-graph review updates. The 7-day window is when the largest cumulative loss shows up - the REDAccs $480 audit found bottom-tier providers dropped 26-31% across that window.
Are real upvotes ever flagged as bots by accident?
Rarely, but the misclassification path exists. Reddit's vote-cheating policy names coordinated voting from related accounts as the violation, so a small group of friends voting on the same post from the same IP block can trip the same graph signal as a farm. Genuine misfires are reversible through Reddit's appeal process. The dominant failure mode in paid campaigns is not false-positive flagging - it is correct identification of farmed accounts whose graph signature was always going to fail.
Does retention rate alone tell me if my upvotes are bot or real?
No - retention is the cleanest single signal but not sufficient. A package can retain visible count and still contribute near-zero ranking weight if voter accounts are scoring at the bottom of the CQS distribution. The fuller test combines retention, position movement, and comment-to-vote ratio. Position is the only metric that confirms weight was actually applied; retention without position is votes that survived the sweep without earning rank.
What is the difference between a bot upvote and a low-CQS upvote?
A bot upvote comes from an automated script firing through stolen or scripted credentials; a low-CQS upvote comes from a real human-operated account that scored poorly on Reddit's trust signals. In practice the two converge - bot-farmed accounts almost always score Lowest-CQS because the warmup behavior is detectable - but the categories differ at the technical layer. Both produce similar campaign outcomes: poor retention through sweeps and near-zero algorithm weight. Vendors blurring the distinction usually do so to avoid the "bot" label while still selling Lowest-CQS inventory.
Can a campaign mix real and bot upvotes safely?
It can, but the bot portion contributes little and risks the rest. The streaming detector reads the full vote stream on a post, so a bot burst layered on top of a real drip can still trip the timing-entropy threshold and trigger sweeps that catch the real votes alongside the farmed ones. The clean operator practice is to run a homogeneous voter pool at Moderate CQS or higher with a drip-paced delivery curve. Mixing tiers to lower the order line typically loses more than it saves through cross-contamination during sweeps.
Why do my upvotes sometimes count for a few hours then drop?
That pattern is the second-pass sweep. The first pass fires within minutes via streaming detection and catches the loudest signals; the second pass runs hours later as the account-graph review updates with cross-account pattern matching. Votes that passed the first pass on individual-account weakness can still fail the second pass when the cross-account graph reveals the farm structure. The Sage Journals 2021 case study on Reddit's voting practices traces the multi-pass design back to the early-2010s redesign of the manipulation pipeline.
Will buying bot upvotes get my account banned?
The poster's account is at low risk because they typically do not vote on their own post; the voter accounts absorb enforcement under Reddit's Disrupting Communities policy. Reddit's H2 2024 Transparency Report shows content manipulation accounted for 0.7% of admin removals across 158.96M items. The poster-side enforcement risk is low but not zero, and repeated campaigns against the same account can compound. The honest assessment of poster risk by purchase profile is in our breakdown of whether buying Reddit upvotes gets you banned in 2026.
What does a healthy upvote retention curve look like graphed over 7 days?
A healthy curve is a quick early ramp during delivery, a steady plateau after the drip ends, and a slow downward drift of <5% across the next 7 days. A bot-network curve is a fast ramp followed by a sharp first-hour drop of 20-40%, a partial recovery as fuzzing oscillates, then continued losses as later sweeps fire. The shape of the curve - not the height - is what tells the story. Operators logging counts at minute 60, hour 12.5, hour 24, and day 7 will see the two patterns separate cleanly inside the first 24 hours.
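The two curve shapes separate on two numbers: the first-hour drop and the post-plateau drift. A hypothetical classifier sketch using the thresholds from the paragraph above (20%+ first-hour drop for a bot blast, <5% week drift for a healthy plateau):

```python
def curve_shape(peak: int, hour_1: int, hour_24: int, day_7: int) -> str:
    first_hour_drop = 1 - hour_1 / peak   # sharp on bot-network blasts
    week_drift = 1 - day_7 / hour_24      # slow on healthy plateaus
    if first_hour_drop >= 0.20:
        return "bot-network"
    if week_drift < 0.05:
        return "healthy"
    return "indeterminate"  # log more windows before judging

print(curve_shape(peak=500, hour_1=495, hour_24=490, day_7=480))  # healthy
print(curve_shape(peak=500, hour_1=350, hour_24=300, day_7=250))  # bot-network
```

Fuzzing oscillation means a single reading can mislead; averaging two or three counts per window before classifying keeps the shape signal clean.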