Quantifying the Slashdot Effect
October 24, 2008 4:54 PM

I'm trying to figure out if a server I'm working on can withstand being linked to by very high traffic websites. Can anyone tell me the basic traffic characteristics of, say, a front page link from: Drudge Report, Yahoo! Buzz, Huffington Post, Slashdot, Reddit, or Digg? I'm using ab (ApacheBench) and I'd like to have some idea of how many requests, and at what level of concurrency, I should be trying to make this server capable of pushing out. Thanks!

I already know what to do (log and observe, cache cache cache, tune and tweak db and web server configuration, etc) and am experienced in that process. I just need some typical numbers to help me figure out when I'm done!
posted by evariste to Computers & Internet (5 answers total) 2 users marked this as a favorite
 
Response by poster: The particular server I'm working on right now seems to handle
ab -n 10000 -c 500
really well: that's ten thousand requests for a page, with 500 concurrent connections. Some of the requests are slow, but 75% complete in under 2 seconds and 99% in under 17 seconds. The slowest took 45 seconds, and one request couldn't be satisfied. Throughout, the server's load average remained well below 1. A good start? Just average? I have no basis on which to judge this.
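[The percentile thresholds described above can be checked programmatically once you have a list of request latencies. A minimal sketch in pure Python; the latency values here are made-up illustrative numbers, not real ab output, and the budgets are the 2s/17s figures from the post:]

```python
import math

def percentile(latencies_ms, pct):
    """Return the pct-th percentile of a latency list (nearest-rank method,
    which is what ab's percentile table effectively reports)."""
    ranked = sorted(latencies_ms)
    # nearest-rank: ceil(pct/100 * n), converted to a 0-based index
    idx = max(0, math.ceil(pct / 100 * len(ranked)) - 1)
    return ranked[idx]

# Illustrative latencies only -- substitute real measurements.
latencies = [120, 300, 450, 800, 1500, 1900, 2100, 9000, 17000, 45000]
for pct, budget_ms in [(75, 2000), (99, 17000)]:
    p = percentile(latencies, pct)
    verdict = "OK" if p <= budget_ms else "SLOW"
    print(f"p{pct} = {p} ms (budget {budget_ms} ms): {verdict}")
```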

It would be cool if someone could tell me, e.g., "Slashdot's traffic comes in spurts of about 5000, maybe 20 concurrently, until you've seen about two hundred thousand pageloads" because that way I could write a script that simulated a Slashdotting (or whatever).
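[A traffic spike like the hypothetical one above could be simulated as an initial peak that decays over time. A hedged sketch; the 5,000-per-minute peak and 200,000-request total are the made-up numbers from the comment, not measured Slashdot traffic, and the exponential decay shape is an assumption:]

```python
import math

def burst_schedule(total_requests=200_000, peak_per_min=5_000, decay_minutes=60):
    """Yield per-minute request counts: a peak that decays exponentially
    until roughly total_requests have been issued. All parameters are
    illustrative guesses, not measured data."""
    issued = 0
    minute = 0
    while issued < total_requests:
        n = max(1, int(peak_per_min * math.exp(-minute / decay_minutes)))
        n = min(n, total_requests - issued)  # don't overshoot the total
        yield n
        issued += n
        minute += 1

schedule = list(burst_schedule())
print(f"{len(schedule)} minutes, {sum(schedule)} requests, "
      f"first minute {schedule[0]}")
```

Each minute's count could then be handed to ab in turn, e.g. `ab -n <count> -c 20 <url>`, to approximate the spurty arrival pattern rather than one flat run.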
posted by evariste at 5:01 PM on October 24, 2008


Best answer: This seems to be a quantitative description of the Slashdot effect from ten years ago. So it's definitely out of date, but might still be useful.
posted by qxntpqbbbqxl at 5:31 PM on October 24, 2008


Response by poster: A great start, thank you!
posted by evariste at 5:33 PM on October 24, 2008


Best answer: Kottke: digg vs Slashdot
posted by mandal at 6:45 PM on October 24, 2008


Best answer: I work for a company that, among other things, hosts the site that was the top Google search result for a term linked from a 'Google Doodle' (one-off logo) on several of Google's regional sites.

These are the stats for the 24-hour period:

Requests per second: up to ~200
Data transfer: up to ~1.8MB/second (6GB/hour)
Requests being processed by apache: up to ~600 simultaneous
Total data transferred: >100GB

This represents about 200,000 visits. We couldn't really get any indication of what to expect, so we had moved the site to a dedicated box ahead of time.
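[The figures above can be cross-checked with simple arithmetic. A sketch; all inputs are the numbers reported in the comment:]

```python
# Back-of-envelope cross-check of the reported Google Doodle traffic.
total_gb = 100          # "Total data transferred: >100GB"
visits = 200_000        # "about 200,000 visits"
peak_mb_per_s = 1.8     # "up to ~1.8MB/second"

mb_per_visit = total_gb * 1024 / visits          # average payload per visit
peak_gb_per_hour = peak_mb_per_s * 3600 / 1024   # sustained peak rate

print(f"~{mb_per_visit:.2f} MB transferred per visit on average")
print(f"peak rate sustains ~{peak_gb_per_hour:.1f} GB/hour "
      f"(comment reports 6GB/hour)")
```

The peak-rate figure lands close to the reported 6GB/hour, so the numbers are internally consistent.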
posted by tallus at 10:40 AM on October 25, 2008 [1 favorite]


This thread is closed to new comments.