How can I give my server some relief?
September 1, 2011 10:15 AM

Quick web question: Does splitting image-heavy content into pages increase or decrease server load?

Howdy,

My new t-shirt website (which got linked on the front page here yesterday, and on a few other big sites today) is getting slammed with traffic and is now responding too slowly for my comfort. I'm trying to think of ways to reduce the load on the server, and I've done the basic stuff: combined all the CSS and JavaScript I could to reduce overhead, and combined images into CSS sprites wherever possible. That has improved things, but not as much as I'd like.

Currently my index displays all 21 of my shirt designs as images (and I'll be adding a few more shortly), so the next thing I was thinking of doing is splitting the index into two pages. Instead of loading all 21 images at once, it would load, say, 15 per page (cutting the first page down by 6 images for now, and by a few more once I add new designs).

Assuming, say, 30% of folks just hit the index and bounce, never to click elsewhere, would this help reduce server load? Or would the fact that a lot of folks would click through to page 2 and start a whole additional page load more than make up for any gains from reducing the number of images loaded at once?

Thanks much.
posted by Jezztek to Computers & Internet (23 answers total)
To clarify, are you talking about trying to reduce bandwidth or shorten loading time? I don't imagine that congested bandwidth is slowing down your page loads.

I am not an expert on this however.
posted by humboldt32 at 10:35 AM on September 1, 2011


How and where are you hosting?
posted by humboldt32 at 10:36 AM on September 1, 2011


Splitting it up will work in your favor if people weren't going to view all n items anyway. In other words, if they were going to lose interest by the 5th item, it's to your advantage not to spend the bandwidth displaying all 21 up front.

That is the technical answer for how it might save you bandwidth, but there are two counterpoints I would raise to this plan:

1) For those who do want to view all 21, you end up using more bandwidth, because now you deliver the 21 images plus the surrounding page content (HTML, CSS, JS) once per page instead of once total.

2) From a business standpoint, if the 20th item is the design that makes the sale, do you really want to bury it on a second page? With an inventory this small, I would show it all up front; the drop-off between pages could cost you sales.

I would use the YSlow Firefox extension to analyze what actually takes up time and bandwidth on the site. Your JS+CSS work is likely a good optimization, but it may not be a silver bullet. If the images make up the majority of the bandwidth, you might be able to compress them a little, or host them elsewhere to distribute the load, for example.
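For instance, if the shirt images are PNGs or JPEGs and you have shell access, something like the following could squeeze them with little or no visible quality loss (this assumes the optipng and jpegoptim utilities are installed, and /path/to/images is just a placeholder):

# Losslessly recompress every PNG under the images directory
find /path/to/images -name '*.png' -exec optipng -o2 {} \;
# Recompress JPEGs, capping quality at 85 (slightly lossy, usually invisible)
find /path/to/images -name '*.jpg' -exec jpegoptim --max=85 {} \;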
posted by dgran at 10:37 AM on September 1, 2011


Oh yeah, I guess I should have clarified. I'm guessing it isn't total bandwidth that's causing the problems (but I could be wrong); it seems to be the number of HTTP requests being made. So it doesn't really seem to be the size of the page in KB (though I'm sure that doesn't help) that's causing the slow load times, but rather that the server bogs down from having too many requests to process at once.

I'm hosting on a VPS through InMotion Hosting.
posted by Jezztek at 10:44 AM on September 1, 2011


Have you correctly identified the bottleneck? Is the site database driven? If it's just high CPU load from the number of requests, then splitting it into 2 pages will increase requests by 3 (page, CSS and JS) for every person who wants to look at all the images, and reduce them by 6 for every person who only views the homepage. Whether it's worth it obviously depends on your actual bounce rate rather than a guess.

Of course this may not be the problem at all - if your CPU load is fine, your Apache connection limit may be too low and that is why your site is slow.
posted by missmagenta at 10:47 AM on September 1, 2011


If it isn't bandwidth, then your VPS provider may have process or memory limits on your hosting plan. If you are slammed right now, I would contact the hosting provider and ask them if they can upgrade you for 30 days while you get through this bump in traffic. MeMail me if they aren't responsive to this type of request and I can recommend a hosting provider that I know is good at this sort of thing.
posted by dgran at 10:55 AM on September 1, 2011


Yeah, to sum up what has already been said: you don't know what the bottleneck is, and you need to find that out. Ask your host what your CPU/memory usage looks like. What does your backend look like? Is it a PHP app? An off-the-shelf CMS or e-store? Many of those have caching functions/plugins that you might be able to enable.
posted by Jairus at 10:57 AM on September 1, 2011


The site is database driven, but I cache everything except a few of the info pages.

I'm not sure how to troubleshoot any of that stuff directly; cPanel is about as far as I can get with my limited technical knowledge. When I called my host they seemed convinced that the problem was the number of requests per page (this was before I had combined everything I could), and given that my changes made a significant improvement, I figured the more I could do along those lines the better.

It might simply be that I'd need my own dedicated server to survive this much traffic, but I'm guessing that by the time they switched me over, things would have long since died down.

On preview: dgran - they said they could give me a 30-day upgrade, but that it would take 2+ days to get me over to a new server. I'm guessing this crazy amount of traffic will be manageable in 48 hours, once I fall off the front pages of Boing Boing and Pharyngula.
posted by Jezztek at 11:00 AM on September 1, 2011


Sorry - I take that back: you'd be down 6 and up 6 (the page, CSS and JS, plus 3 CSS sprite images), so you'd need a bounce rate of >50% for the split to be a net win.

Are you caching requests? You might want to create a .htaccess with something like the following:

# Requires mod_expires; the IfModule wrapper avoids breaking the site if that module isn't loaded
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresDefault "access plus 1 year"
    ExpiresByType image/gif "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType text/html "access plus 1 second"
    ExpiresByType text/javascript "access plus 1 year"
    ExpiresByType application/x-unknown-content-type "access plus 1 year"
    ExpiresByType application/x-javascript "access plus 1 year"
</IfModule>

Obviously you'd have to adjust the values to suit your needs, and it won't help if every visit is unique, but it will reduce traffic from repeat visitors, and if you split into multiple pages it will help a little.
posted by missmagenta at 11:03 AM on September 1, 2011


That's a good call, missmagenta. I had ExpiresByType set for my PNGs but not much else; I'll expand on that.
posted by Jezztek at 11:09 AM on September 1, 2011


If you're on a VPS, will your host not give you SSH root access? One of the main advantages of a VPS over regular shared hosting is having control over your own configuration and software.
posted by missmagenta at 11:14 AM on September 1, 2011


Yeah, I have SSH access. I just don't know how to use it =) I'm going through a basic tutorial right now, actually.
posted by Jezztek at 11:26 AM on September 1, 2011


Enter the command "top" (without the quote marks). That will tell you what the server load is, along with memory usage and some other stuff; it will also list the running processes using the most resources.
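A few other commands that are handy while you're poking around (these are standard on a typical Linux VPS, though tools can vary a bit by distro):

top                           # interactive: load averages, memory, busiest processes
uptime                        # just the 1/5/15-minute load averages on one line
free -m                       # memory usage in megabytes
ps aux --sort=-%cpu | head    # the processes currently eating the most CPU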
posted by missmagenta at 11:29 AM on September 1, 2011


Yeah, it looks like CPU load is my problem; at my last check I was running a load of 2.27 (8 CPUs).
posted by Jezztek at 11:31 AM on September 1, 2011


With 8 CPUs, a server load of 2.27 should be fine (I certainly wouldn't expect the site to be as slow as it is). How was your memory usage?

You could also try increasing ServerLimit and MaxClients in your Apache config; it could be that you're just hitting a cap.
posted by missmagenta at 11:37 AM on September 1, 2011


And don't forget to stop and start Apache after changing the config (apparently a plain restart doesn't do the trick with ServerLimit changes).
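On a typical CentOS-style cPanel box that's something like:

service httpd stop     # or: apachectl -k stop
service httpd start    # or: apachectl -k start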
posted by missmagenta at 11:41 AM on September 1, 2011


My memory looked good (as far as I can tell): Mem: 3145728k total, 175620k used, 2970108k free.

I'll try setting my ServerLimit / MaxClients higher. What are good numbers for those?
posted by Jezztek at 11:45 AM on September 1, 2011


What are they at currently?
posted by missmagenta at 11:47 AM on September 1, 2011


Sorry about the delay; I was trying to access my httpd.conf via SSH, and it took me a while to realize that access was blocked by my host and I needed to get them to copy over the info.

I believe this is what I'm looking for, right?


ServerLimit 100
MaxClients 100
StartServers 5
MinSpareServers 5
MaxSpareServers 10
MaxRequestsPerChild 1000


PS. Thanks so much for helping me with this!
posted by Jezztek at 12:11 PM on September 1, 2011


They're lower than the defaults, which I think are 256 for ServerLimit and 150 for MaxClients. Since your memory and CPU load are fine, I'd try doubling what you have. Can your host tell you how many httpd processes are running?
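For example, doubled versions of the values you pasted would look something like this (assuming the default prefork MPM; keep an eye on memory afterwards, since every extra httpd process costs RAM):

<IfModule prefork.c>
    # MaxClients must be <= ServerLimit
    ServerLimit          200
    MaxClients           200
    StartServers           5
    MinSpareServers        5
    MaxSpareServers       10
    MaxRequestsPerChild  1000
</IfModule>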
posted by missmagenta at 12:38 PM on September 1, 2011


It would also be good to run a second HTTP server dedicated to serving static content. When you serve everything from one server, those images (which I'm presuming are static) just clog up expensive workers on your main server. Separating static from dynamic content makes everything smoother. It also cuts page load time even when your server isn't overloaded, because browsers use per-domain heuristics to decide how many simultaneous connections to make: if your site is split between www.example.com and static.example.com, the browser will use twice as many connections, and none of the long-loading dynamic pages will block the quick static image requests.

You could also use a JS lazy-loading technique so that the images aren't actually requested until the user scrolls down the page. That gets you the best of both worlds in terms of fast page loading and server relief, and it also means you don't lose people who don't like pagination.
posted by Rhomboid at 1:22 PM on September 1, 2011


(And to be clear, this doesn't mean putting images on a separate subdomain/name-based vhost on the same Apache instance; it means a whole separate server on a different IP, preferably something tuned for static serving like lighttpd or nginx, i.e. no mod_php or any other scripting modules.)
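As a rough sketch, an nginx vhost for the static half could be as simple as this (the hostname and path are placeholders for wherever your images, CSS and JS actually live):

server {
    listen      80;
    server_name static.example.com;   # the static-only hostname
    root        /var/www/static;      # directory holding images, CSS and JS
    expires     30d;                  # long client-side caching for static files
}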
posted by Rhomboid at 1:26 PM on September 1, 2011


Ooooh, I hadn't even thought of doing the lazy-loading stuff. I should really get on that, thanks much.
posted by Jezztek at 2:54 PM on September 1, 2011

