Usenet was designed around a model of sporadic connectivity. Every site kept a copy of the articles in its subscribed groups and would refresh them from its upstream servers periodically.
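A minimal sketch of that pull model, assuming Python's nntplib (in the standard library through 3.12) and a made-up upstream host and subscription list, with the local spool reduced to a dict keyed by Message-ID:

```python
import nntplib

SUBSCRIBED = ["comp.lang.python", "sci.crypt"]  # hypothetical subscriptions
local_store = {}  # message-id -> overview dict; stands in for the site's spool

def refresh_from_upstream(host="news.example.com"):
    """Pull overviews for subscribed groups, keep any articles we don't have yet."""
    with nntplib.NNTP(host) as server:
        for group in SUBSCRIBED:
            _resp, _count, first, last, _name = server.group(group)
            # Overview data includes subject, from, message-id, references, ...
            _resp, overviews = server.over((first, last))
            for _art_num, over in overviews:
                msg_id = over["message-id"]
                if msg_id not in local_store:  # already have it? skip
                    local_store[msg_id] = over

# Run on whatever schedule the site's connectivity allows (e.g. hourly from cron);
# the store only grows by articles the site hasn't seen before.
```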
This is a good model if connectivity is scarce or expensive, but inefficient if it’s pervasive and cheap.
Starting to use Usenet again (since it has never actually gone away) doesn't really make sense. It is a product of a different set of preconditions.
> This is a good model if connectivity is scarce or expensive, but inefficient if it’s pervasive and cheap.
The web is absolutely packed with data duplication: every CDN ever keeps a copy of everything it serves in multiple locations. There's nothing wrong with duplicating data. So long as there's some agreement on the uniqueness of a message, it doesn't really matter how many times it's duplicated across the network.
It's actually an advantage, because it means multiple parties can maintain backups of messages and reconstruct conversations after the fact. If some central source of threads goes down, it can be rebuilt from partial copies. It's the reason DejaNews even had extensive Usenet archives for Google to purchase.
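A sketch of that reconstruction idea: given partial dumps from several archives, deduplicate by Message-ID and rebuild threads by following the References header. The dump shape (an iterable of dicts with lowercase header keys) is an assumption for illustration, not any particular archive's format:

```python
from collections import defaultdict

def merge_archives(*dumps):
    """Union several partial archives, using Message-ID as the unique key."""
    merged = {}
    for dump in dumps:  # each dump: iterable of article dicts
        for article in dump:
            merged.setdefault(article["message-id"], article)
    return merged

def rebuild_threads(articles):
    """Group articles under their parents by walking the References header."""
    children = defaultdict(list)
    for msg_id, art in articles.items():
        refs = art.get("references", "").split()
        parent = refs[-1] if refs else None  # last reference is the direct parent
        children[parent].append(msg_id)
    return children  # children[None] are the thread roots

# Any two sites holding overlapping copies can pool them:
#   merged = merge_archives(site_a_dump, site_b_dump)
#   threads = rebuild_threads(merged)
```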