I’ve been doing a bunch of reading about web feeds and syndication and RSS and Atom and related stuff recently. And something struck me the other day.
This morning Rogers Cadenhead wrote on his blog about the scaling problems of web feeds: if you have a popular feed, you’ve got zillions of feed aggregators hitting your site a zillion times a day to see if it has changed. That’s not very efficient.
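One standard HTTP-level mitigation (not something the post itself proposes) is conditional GET: aggregators send `If-None-Match` or `If-Modified-Since` headers, and the server answers `304 Not Modified` with an empty body when nothing has changed, instead of shipping the whole feed on every poll. A minimal server-side sketch, where the function name and the feed-state arguments are hypothetical (the header names and status codes are real HTTP):

```python
from email.utils import parsedate_to_datetime

def respond_to_poll(request_headers, feed_etag, feed_last_modified):
    """Decide whether a polling aggregator needs the full feed.

    feed_etag is the feed's current ETag string; feed_last_modified is a
    timezone-aware datetime. If the client's validators match, return a
    304 with an empty body instead of re-sending the whole feed.
    """
    if request_headers.get("If-None-Match") == feed_etag:
        return 304, b""
    ims = request_headers.get("If-Modified-Since")
    if ims is not None and parsedate_to_datetime(ims) >= feed_last_modified:
        return 304, b""
    # Client has no current copy: send the full feed (placeholder body here).
    return 200, b"<rss>...full feed...</rss>"
```

This doesn’t reduce the number of requests, but it shrinks most of them to a few hundred bytes, which is why well-behaved aggregators were already being urged to support it.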
I read a similar complaint just yesterday in Wired (the actual magazine, not online) from Meg of Kinja, who suggested some sort of peer-to-peer solution to cut down on the web traffic from aggregators.
Last week I was reading Brent Simmons’ weblog (Brent writes the really great Mac feed reader NetNewsWire, which I use and am using right this second to type this post), and he was talking about using guids to uniquely identify weblog posts, and how important it is that feed publishers actually use them.
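The reason guids matter is deduplication: an aggregator that can key entries by a stable guid knows exactly which posts it has already seen, while one without guids has to guess from links and titles and ends up re-showing posts whose wording was edited. A toy sketch of the guid-keyed case (the function and the entry dicts are my own illustration, not any particular aggregator's code):

```python
def merge_new_entries(seen_guids, fetched_entries):
    """Return only the entries we haven't seen before, keyed by guid.

    seen_guids is a set of guid strings the aggregator persists between
    polls; fetched_entries is a list of dicts with a "guid" key.
    """
    new = []
    for entry in fetched_entries:
        guid = entry["guid"]
        if guid not in seen_guids:
            seen_guids.add(guid)   # remember it for future polls
            new.append(entry)
    return new
```

Run the same feed through twice and the second pass yields nothing new, even if every title in it was reworded in the meantime.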
Why am I noting these things? Issues of distributing news in either a one-to-many fashion or peer-to-peer, or of uniquely identifying a single post: this is stuff that the people who did Usenet figured out nearly fifteen years ago.
OK, everyone knows that Usenet is dead, that it’s a cesspool of spam and porn and pointless flameage, but the technology for distributing and keeping track of posts on Usenet works really well. It’s not HTTP and it’s not XML, and the technology certainly has its problems, but surely there are some lessons from how Usenet posts are distributed and stored that could be valuable to the web feed world. Surely fifteen years is not so long that we have to completely reinvent the same stuff all over again.
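The core Usenet ideas are exactly the two problems above: articles flood from server to server (so readers hit a nearby server, not the origin), and every article carries a globally unique Message-ID that lets a server refuse copies it already has. A loose toy model of that flood-fill, very much simplified from real NNTP (the class and method names are mine, and real servers negotiate with IHAVE rather than pushing bodies blindly):

```python
class NewsNode:
    """A toy news server: stores articles and floods them to its peers,
    using the Message-ID to drop duplicates and stop the flood."""

    def __init__(self, name):
        self.name = name
        self.peers = []       # other NewsNode instances
        self.articles = {}    # Message-ID -> article body

    def offer(self, message_id, body):
        if message_id in self.articles:
            return            # already have it; the flood stops here
        self.articles[message_id] = body
        for peer in self.peers:
            peer.offer(message_id, body)
```

Wire a few nodes into a cycle, post an article once, and every node ends up with exactly one copy; the Message-ID check is what keeps the flood from looping forever, the same job guids would do for feeds.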
Just a thought.