Independent backbone · Growing daily · Exclusive tape archives

Usenet Retention: What the Numbers Don't Tell You

Most providers advertise retention days, but they're all pulling from the same pool. We run our own backbone, we recovered articles from 20-year-old tape archives, and our retention grows every day. That combination matters more than a number.

🔧 Independent Backbone
📼 Exclusive Tape Archive Recovery
📦 ~500TB Ingested Daily
⚡ Sub-3ms NVMe Latency
✅ 99%+ Completion Rate
Current Binary Retention: 5,695+ days
Over 15.6 years of articles, growing daily

~500TB · New Articles Ingested Yesterday
99%+ · Completion Rate, verified by TechRadar
<3ms · NVMe Spool Article Latency
20+ yrs · Oldest Articles, via Tape Recovery

What Is Usenet Retention?

When someone posts an article to Usenet, it gets distributed across servers worldwide. Retention is how far back your provider keeps those articles available, measured in days. A provider with 1,000 days of retention has articles going back about 2.7 years. Anything older? Gone.

We currently sit at 5,695+ days, which works out to over 15.6 years of stored articles. That number goes up by one every day.

But retention days are only one dimension. The other question — and the one most people don't think to ask — is which articles does your provider actually have?
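The day-to-year conversions above are simple arithmetic; here is a minimal sketch, assuming a 365.25-day average year:

```python
def retention_years(days: int) -> float:
    """Convert a retention window in days to years (365.25-day year)."""
    return days / 365.25

# 1,000 days is roughly 2.7 years; 5,695 days is roughly 15.6 years.
print(round(retention_years(1_000), 1))  # 2.7
print(round(retention_years(5_695), 1))  # 15.6
```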

Why Most Retention Numbers Are Misleading

If you've shopped around for Usenet providers, you've probably noticed that a lot of them advertise similar retention numbers — 6,000+ days, 6,400+ days, etc. That's not a coincidence.

The majority of Usenet providers don't run their own infrastructure. They resell access to the same upstream backbone. Dozens of brands — big and small — all pull from the same article pool. When one of them claims 6,400 days of retention, they all have the same 6,400 days of the same articles.

That has a practical consequence: if an article is missing from one of those providers, it's missing from all of them. Switching between resellers doesn't give you access to different content — you're searching the same pool with a different logo on it.

NewsDemon runs its own independent backbone. We are not a reseller. We built and operate our own ingestion, storage, and serving infrastructure. Our article inventory is different from the shared pool that most of the market relies on, which means we carry articles that other providers simply don't have. Learn what "independent" really means in Usenet →

Tape Archive Recovery: Articles Nobody Else Has

A few years back, we got our hands on a large collection of old Usenet articles stored on magnetic tape. These are articles going back over 20 years — posts from the early 2000s that were never migrated to modern storage by other providers. For most of the Usenet world, these articles simply vanished.

We loaded them into our infrastructure and made them available to every NewsDemon member. No extra charge, no special plan tier required. If you're searching for something old and can't find it on your current provider, there's a real chance we have it sitting in our archive from that tape recovery.

No other Usenet provider has these articles. They're exclusive to NewsDemon.

Binary Retention vs. Text Retention

Not all retention numbers describe the same thing. Binary retention covers files and attachments — the large downloads most people are looking for. Text retention covers discussion posts. Some providers inflate their headline number by advertising text retention while keeping binary retention shorter.

Type | What It Stores | NewsDemon
Binary Retention | Files, images, software, archives — the large attachments in binary newsgroups. This is what most people are actually looking for. | 5,695+ days
Text Retention | Discussion posts and text-only messages. Smaller in size, but valuable for research and community archives. | 5,695+ days

When we say 5,695+ days, we mean the articles you're actually trying to download. Plus the exclusive tape archive content going back 20+ years on top of that.

How Our Retention Works

Our retention goes up by one day, every day. It has been doing that for years. Here's what's happening under the hood.

Daily Feed Stored

We pull in around 500 terabytes of new Usenet articles per day across all major newsgroup hierarchies. The exact number fluctuates — some days are heavier than others — but it's consistently in that range. New posts are indexed and made available to members within seconds.

Intelligent Filtering

We run an AI-driven filtering system that analyzes incoming articles for junk, spam, sporge, and other noise. Combined with article deduplication, we keep what matters and throw out the garbage — so our archive is cleaner and more usable than a provider that just stores everything indiscriminately. Retention baseline: 5,695 days as of March 17, 2026, ticking up by one each day.
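Every Usenet article carries a globally unique Message-ID header, so deduplication is commonly keyed on it. Here is a minimal sketch of that idea (the function names and in-memory set are illustrative, not NewsDemon's actual pipeline):

```python
import hashlib

# Keys for articles already stored (a production system would use a
# persistent store, not an in-memory set).
seen: set[str] = set()

def dedup_key(message_id: str) -> str:
    """Stable key for an article: hash of its Message-ID header."""
    return hashlib.sha256(message_id.encode()).hexdigest()

def should_store(message_id: str) -> bool:
    """Store an article only the first time its Message-ID is seen."""
    key = dedup_key(message_id)
    if key in seen:
        return False  # duplicate: skip it
    seen.add(key)
    return True

print(should_store("<abc123@news.example>"))  # True
print(should_store("<abc123@news.example>"))  # False (duplicate)
```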

Three-Region Replication

Every article gets replicated across all three server regions — US East, US West, and EU Netherlands. Same retention depth on every farm, and the redundancy protects against data loss.

Speed: NVMe Spools and Why It Matters

Having deep retention doesn't help much if article retrieval is slow. We've invested heavily in making the actual download experience fast.

Article Latency (NVMe Spool): <3ms (down from 15ms on spinning disks)
Daily Feed Rate: ~500TB (across all major hierarchies)
Server Regions: 3 worldwide (US East · US West · EU Netherlands)
Connections Per Account: 50 SSL (256-bit encryption on every connection)

When we deployed NVMe spool sets for our most frequently accessed articles, average retrieval latency dropped from 15ms to under 3ms. That's a 5x improvement, and you feel it — downloads start faster and searches return quicker.

Older articles deeper in the archive sit on high-density storage optimized for sequential reads. New stuff is fast, old stuff is still there and still accessible.

Completion Rates

Retention days are only part of the picture. Completion rate measures what percentage of an article's segments are actually available when you go to download it. A provider can advertise thousands of days of retention, but if articles are incomplete, downloads fail.
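Since a binary post is split into many segments, completion is simply the fraction of those segments the server can actually return. A minimal sketch (the segment counts are hypothetical):

```python
def completion_rate(total_segments: int, available_segments: int) -> float:
    """Fraction of an article's segments the server can return."""
    return available_segments / total_segments

# A hypothetical file split into 200 segments with 2 segments missing:
print(completion_rate(200, 198))  # 0.99 — one missing segment can fail a download
```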

We maintain over 99% completion across the full retention window — a number verified by TechRadar and NGR Blog.

A big reason for that: we run our own independent backbone. We control ingestion, storage, and serving end-to-end. No gaps from upstream provider decisions, no articles quietly disappearing because a reseller's source decided to drop them.

Why This Matters for You

You Find Articles Others Don't Have

Because we run our own backbone and we recovered exclusive tape archives, our article inventory is different from the majority of providers. Members tell us regularly that they're finding articles on NewsDemon that they couldn't get anywhere else — stuff they'd given up on.

Independent Infrastructure, Not a Reseller

Most Usenet providers are reselling access to the same shared backbone. If an article is missing from one, it's missing from all of them. NewsDemon's pool is genuinely different. That independence is why the r/NewsDemon community consistently recommends us. (And yes, "independent" means something specific — not every provider using that word qualifies.)

Same Archive on Every Plan

We don't gate retention behind premium tiers. The $3/month metered plan gets the exact same 5,695+ day archive — including the tape recovery content — as the unlimited annual. No asterisks.

Frequently Asked Questions

What is Usenet retention?
Usenet retention is how far back in time a provider stores articles on its servers, measured in days. A provider with 5,000 days of retention has articles going back roughly 13.7 years. NewsDemon currently offers 5,695+ days of binary retention, growing by one day every day.
What's the difference between binary and text retention?
Binary retention covers files and attachments in binary newsgroups — the large downloads most users are looking for. Text retention covers discussion posts and text-only messages. Binary requires far more storage and is the number that matters most when comparing providers.
Why does NewsDemon have articles other providers don't?
Two reasons. First, we operate our own independent backbone, separate from the shared network that most other providers resell. Our article inventory is genuinely different. Second, we recovered a large collection of articles from magnetic tape archives going back over 20 years — content that was never migrated by other providers and effectively disappeared from Usenet.
Does higher retention always mean better service?
Not necessarily. Many providers share the same backbone and the same articles. Two providers with identical retention numbers on the same shared network have the exact same content. An independent provider with a different article pool, high completion, and exclusive archive content can deliver articles that shared-backbone providers simply don't carry.
What is completion rate?
Completion rate measures what percentage of an article's segments are available for download. NewsDemon maintains 99%+ completion across the full retention window, verified by independent review sites including TechRadar and NGR Blog.
Why do people recommend NewsDemon on Reddit?
NewsDemon operates its own independent backbone with a different article pool than the shared network most providers resell. Members report finding articles on NewsDemon that aren't available elsewhere. Combined with exclusive tape archive content, competitive pricing, and 99%+ completion, it's consistently recommended in the r/NewsDemon community.

Ready to See the Difference?

Every plan gets the full archive — including our exclusive tape recovery content. 50 SSL connections, free VPN, 30-day money-back guarantee.

View Plans