Can a browser handle HTTP requests the way BitTorrent does?
June 30, 2005 3:58 PM

Is there any reason why standard HTTP requests can't work the same way as the BitTorrent protocol? It seems like this would let websites avoid the "slashdot/metafilter" effect, where a surge of traffic to an individual site overwhelms its servers. Seems like this could happen on the client side, with suitably enabled web browsers passing requested data around the way a torrent client does. I don't know enough about the underlying technologies to tell whether this is a stupid question or not, so please be gentle as you take me down a notch.
posted by hulette to Computers & Internet (13 answers total)
 
Perhaps because you and I don't want to have to archive the internet?

'cause that's how BT works: everyone shares whatever bits of the file they've managed to download and keep on their computer. That's a large part of why torrents expire: no one bothers keeping the downloads on their drive.

Further, many sites are dynamically generated. AskMe is: when you request it, the back end generates the web page on the fly just for you. With BT, you get the exact same file as everyone else.
posted by five fresh fish at 4:04 PM on June 30, 2005


The load from serving page views usually isn't bandwidth but processing, so serving up a list of peers would be almost as stressful as serving the page itself.
posted by abcde at 4:06 PM on June 30, 2005


Most websites that might suffer this type of traffic volume are customized for each user. For example, the "(3 new)" links here at MeFi. Thus every request needs to talk to the one central database. Letting multiple nodes distribute the content wouldn't allow this customization, since Matt is not going to give your node access to his database.

Said another way: the HTML and web server aren't the bottleneck, the database is.
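To make that concrete, here's a toy sketch in Python (made-up table names, nothing like Matt's actual code) of why a personalized page can't come out of someone else's cache:

    # Toy illustration only: hypothetical schema, not MetaFilter's code.
    import sqlite3

    def render_front_page(db: sqlite3.Connection, user_id: int) -> str:
        # The "(3 new)" count depends on *this* user's last visit,
        # so the HTML is different for every user who asks for it.
        (new_count,) = db.execute(
            "SELECT COUNT(*) FROM comments WHERE posted_at >"
            " (SELECT last_visit FROM users WHERE id = ?)",
            (user_id,),
        ).fetchone()
        return f"<a href='/mefi/20609'>some thread ({new_count} new)</a>"

No matter how many peers volunteer to serve the HTML bytes, every single request still has to reach the one central database to compute that count.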
posted by y6y6y6 at 4:07 PM on June 30, 2005


If you ignore the dynamically-generated content issue, I believe Freenet does something very like what you describe, though for completely different reasons.
posted by squidlarkin at 4:12 PM on June 30, 2005


Hulette, you might also look at the Coral Project, which aims at what you're describing.
posted by Rothko at 4:17 PM on June 30, 2005


Response by poster: dang, I should have known it couldn't be that easy. thanks for the answers everybody.

and smart people... don't let the apparent impossibility of this concept stop you from trying to make it work anyway!
posted by hulette at 4:20 PM on June 30, 2005


Best answer: The dynamic nature of many web pages can be dealt with -- HTTP already has a cache control system that works.
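For instance, here's roughly the freshness check any HTTP cache performs. The Cache-Control header is real HTTP/1.1; the code around it is just a sketch I made up:

    # Sketch of an HTTP/1.1 freshness check. The header is real;
    # the surrounding code is illustrative only.
    import time

    def is_fresh(cached_at: float, headers: dict) -> bool:
        # "Cache-Control: max-age=300" means any cache may reuse the
        # stored copy for 300 seconds without asking the origin again.
        for directive in headers.get("Cache-Control", "").split(","):
            directive = directive.strip()
            if directive.startswith("max-age="):
                max_age = int(directive.split("=", 1)[1])
                return time.time() - cached_at < max_age
        return False  # no freshness info: revalidate with the origin

A dynamic page can opt in with something like max-age=60, letting any cache, peer or proxy, serve it for a minute at a time.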

There's no reason why a BitTorrent-like system can't work for normal HTTP access. Let's say I access a page from server X. The next user who wants that page could equally get it from server X, or from me. In that sense, I become server Y.

What you now have is a distributed cache. The problem here is that I might go offline, or my cached copy of the page might go away (since my machine can only store a limited amount of data), or my firewall settings might not permit incoming traffic, and so on. Also, I'm just one of a huge number of possible caches; you'd need some kind of coordinator, or coordinators, to keep tabs on who is caching what, and to keep this manageable you'd probably want some kind of "supernode" system.
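To sketch what such a coordinator might look like (purely hypothetical; no such protocol actually exists for plain HTTP):

    # Hypothetical coordinator: a tracker-like directory of which
    # peer is caching which URL. Illustration only.
    class Coordinator:
        def __init__(self) -> None:
            self.holders: dict[str, set[str]] = {}  # url -> peer addresses

        def announce(self, url: str, peer: str) -> None:
            # A browser that just fetched `url` offers to re-serve it.
            self.holders.setdefault(url, set()).add(peer)

        def lookup(self, url: str, origin: str) -> list[str]:
            # Candidate sources: any announced peers, with the origin
            # server as the fallback of last resort.
            return sorted(self.holders.get(url, set())) + [origin]

        def departed(self, peer: str) -> None:
            # Peers vanish constantly (offline, evicted, firewalled),
            # which is why dedicated caches end up making more sense.
            for peers in self.holders.values():
                peers.discard(peer)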

What makes more sense, then, is to set up dedicated distributed caches. Several such systems exist and work well, like the Coral cache or the commercial, closed Akamai system (used by Microsoft and others).

Coral is a public system that's usable with any site -- just add .nyud.net:8090 to the end of the host name, as in http://ask.metafilter.com.nyud.net:8090/mefi/20609. There are plugins to do this more easily inside your browser.
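That rewrite rule is mechanical enough to script; a minimal sketch:

    # Coralize a URL by appending .nyud.net:8090 to the host name.
    from urllib.parse import urlsplit, urlunsplit

    def coralize(url: str) -> str:
        parts = urlsplit(url)
        return urlunsplit(parts._replace(netloc=parts.hostname + ".nyud.net:8090"))

    # coralize("http://ask.metafilter.com/mefi/20609")
    # -> "http://ask.metafilter.com.nyud.net:8090/mefi/20609"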
posted by gentle at 4:21 PM on June 30, 2005


Response by poster: ah, that coral system looks pretty cool. i'll look into that.
posted by hulette at 4:24 PM on June 30, 2005


There's also Dijjer.
posted by gentle at 4:28 PM on June 30, 2005


It depends on what you mean by "BitTorrent-like system". BT is a very clever protocol, but it's definitely optimized for multi-megabyte files. The protocol overhead means it's not a good match for the usual web page, which consists of several files of 10-100 kB each.

There's no reason larger files couldn't be served that way, and in fact some ISPs like he.net make it easy to create new torrents by providing a tracker service to their customers. The real limitation here is that not every browser has BT support. For some reason BT still has a reputation for being "hard to use".

Coral is definitely a good alternative solution.
posted by Nelson at 5:10 PM on June 30, 2005


Overhead is the big one. BT makes sense when you're distributing 700 MB CD images, but when you're delivering a 5 kB HTML page or a 30 kB image, you'll waste more bandwidth figuring out where to pull it from than you would just downloading the damn thing.
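Some back-of-envelope numbers (rough guesses, not measurements) to show where the bytes go for a small page:

    # Very rough byte counts; all figures here are guesses except the
    # 68-byte BitTorrent handshake, which is fixed by the protocol.
    PAGE = 5_000  # bytes, a small HTML page

    # Plain HTTP: one request (~400 B of headers) plus the response.
    http_total = 400 + PAGE

    # Swarm-style: a tracker round trip, handshakes and bitfields for
    # a few peers, then request/piece messages for the actual data.
    tracker = 600                 # announce request + peer-list response
    handshakes = 4 * (68 + 100)   # handshake + bitfield, for 4 peers
    messages = 17 + 13            # one request msg + one piece header
    swarm_total = tracker + handshakes + messages + PAGE

    print(http_total, swarm_total)  # ~5400 vs ~6300 bytes

The byte overhead is modest, but every one of those exchanges is an extra round trip, and latency is what really kills you on a 5 kB page.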
posted by devilsbrigade at 5:37 PM on June 30, 2005


Not to mention I don't want to share my client computer with you so you can browse faster. Or at all. I have a server for that. It runs really fast and has a lot of bandwidth. Heck, I even visit it myself every once in a while just to say hello.

Hey, unless you want to pay me for the use of my processor, memory, bandwidth, and drives on the laptop too!
posted by realcountrymusic at 9:08 PM on June 30, 2005


"What you now have is a distributed cache."

If you're in a corporate environment, and you need something on a professional scale (i.e. you want to scale to millions of users hitting your site and you don't want to build out the infrastructure to handle it), you should check out Akamai.

They basically carve up your web page and substitute anything static or near-static with references to their distributed servers, which serve those elements from a server topologically "nearer" to the user than yours is. For example: if you're based in California and some guy in New York wants to look at your web page, he may be pulling the static imagery off a local web server in Manhattan while getting the dynamic text off your website.
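In spirit, the rewrite looks something like this (hypothetical host names, not Akamai's actual mechanism):

    # Illustrative sketch: point static assets at a made-up CDN host
    # and leave the dynamic parts of the page alone.
    import re

    CDN_HOST = "static.example-cdn.net"  # hypothetical edge server

    def offload_static(html: str, origin_host: str) -> str:
        pattern = (rf'(src|href)="https?://{re.escape(origin_host)}'
                   r'(/[^"]+\.(?:png|gif|jpg|css|js))"')
        return re.sub(pattern, rf'\1="http://{CDN_HOST}\2"', html)

    page = '<img src="http://example.com/logo.gif"> <p>3 new comments</p>'
    print(offload_static(page, "example.com"))
    # <img src="http://static.example-cdn.net/logo.gif"> <p>3 new comments</p>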
posted by thanotopsis at 9:29 PM on June 30, 2005


This thread is closed to new comments.