A free utility to measure web site load times?
October 16, 2006 1:20 PM
I need a tool (web- or Mac-based) that will help me measure how long a web page will take to load. I'm familiar with tools that measure page weight based on the HTML in a document, but I need something that will measure load times including dynamic factors. Does such a beast exist?
I have a Wordpress-based site. Wordpress creates each page dynamically from various "modules" (such as a header, a footer, a sidebar, etc.) These modules also include connections to external content, such as ads, traffic meters, etc. Every little piece adds to the load time. My goal is to reduce the load time, and to that end, I want to measure the most egregious time sinks. (If my traffic meter adds two seconds to the load time, it's probably not worth it, for example.)
Within Terminal.app:
curl -s -w "%{time_total}\n" -o output.txt http://www.google.com/
You'll want to sample this measurement, say, 100 times, to estimate the mean download time. Replace the URL with a particular subsite.
posted by Blazecock Pileon at 2:01 PM on October 16, 2006 [1 favorite]
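A rough sketch of that sampling idea, assuming curl is installed; the URL and the run count are placeholders:

```shell
#!/bin/sh
# Fetch the URL N times and average curl's reported total time.
# Bump N toward 100 for a steadier estimate of the mean.
URL="http://www.google.com/"
N=10

for i in $(seq 1 "$N"); do
  curl -s -w "%{time_total}\n" -o /dev/null "$URL"
done | awk '{ sum += $1 } END { printf "mean over %d runs: %.3f s\n", NR, sum / NR }'
```

Writing to /dev/null instead of output.txt avoids clobbering a file on every iteration.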
That only downloads one URL though (the main HTML), not any of the other JS or CSS or images or anything else that might be referenced by the main HTML file. So it's not nearly what jdroth is asking for.
posted by xil at 2:22 PM on October 16, 2006
This seems like something PHP should offer. With ColdFusion, I can do stuff like measure processing time and get pretty complete debugging mode reports of how long each db connection took and how long each and every template took to load.
I would google around for "php debugging info" and see what you find.
posted by mathowie at 2:28 PM on October 16, 2006
The Safari debug menu has a "page load test" that might be useful. (Not at a Mac so can't try it myself.)
posted by smackfu at 2:28 PM on October 16, 2006
That only downloads one URL though (the main HTML), not any of the other JS or CSS or images or anything else
Replace the URL with the full path of URL/script.js etc. You could use Perl or any other scripting library to parse out img and Javascript and other components of a page. But curl is what you'd use to measure time to grab that specific static item. I suppose you could have .js etc. files generated dynamically, but I don't see that happening too often.
posted by Blazecock Pileon at 2:29 PM on October 16, 2006
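A crude sketch of that parse-and-time idea in shell. The regex is a stand-in for a real HTML parser, the page URL is a placeholder, and relative paths would need the site root prepended before curl could fetch them:

```shell
#!/bin/sh
# Pull src/href URLs out of a page with a crude regex (a real parser
# in Perl etc. would be far more robust), then time each item with curl.
# Relative URLs won't resolve as-is; prepend the site root for those.
PAGE="http://www.google.com/"
curl -s "$PAGE" \
  | grep -oE '(src|href)="[^"]+"' \
  | sed -E 's/^(src|href)="//; s/"$//' \
  | while read -r item; do
      t=$(curl -s -o /dev/null -w "%{time_total}" "$item")
      echo "$t  $item"
    done
```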
This seems like something PHP should offer.
I use a variant of this PHP Timer class to measure MySQL request speeds. One thought might be to use the PEAR::HTTP_Request sendRequest method within a Timer block. You'd still need to specify the target, although it might be easier to programmatically scrape timing targets from a root URL, within a larger PHP script.
posted by Blazecock Pileon at 2:36 PM on October 16, 2006
There are several factors here:
--time it takes to assemble and output the page (called "execution time") - usually you measure this internally
--time it takes to download the page once you've handed it to the user - usually you calculate the page weight and divide by the download speed of the user
--time it takes to execute dynamic things like javascript or Flash - depends on the user's PC, can't really be measured except in a very general sense
If you want to measure execution time in PHP, you can do something like this. I would expect that there's a pre-existing Wordpress plugin to measure the PHP execution time of pages, so you should probably just look for that first.
If you include content from another site, that content has to be generated, downloaded, and rendered. You can use curl or whatever to measure each external thing that you're including in your page.
posted by jellicle at 2:42 PM on October 16, 2006
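A minimal sketch of timing one external include in isolation with curl's write-out variables (the URL is a placeholder for whatever ad or traffic-meter script the page pulls in):

```shell
#!/bin/sh
# Time a single external resource in isolation, broken down by phase.
# The -w fields are curl's standard write-out variables; the URL is
# a placeholder for the real external include.
curl -s -o /dev/null \
  -w "dns: %{time_namelookup}s  connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n" \
  "http://example.com/traffic-meter.js"
```

A large gap between "first byte" and "connect" points at slow server-side generation; a large gap between "total" and "first byte" points at sheer download weight.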
For a more powerful tool, try Apache Bench (ab), preinstalled on your Mac.
from terminal:
man ab
simple example:
ab -n 50 -t 50 -c 3 http://localhost/
For some reason it doesn't like a missing trailing slash there, by the way.
posted by singingfish at 3:33 PM on October 16, 2006
Best answer: This free Web Page Analyzer can "calculate page size, composition, and download time. The script calculates the size of individual elements and sums up each type of web page component." It lists the sizes of all of the individual objects on the page.
posted by kirkaracha at 4:25 PM on October 16, 2006
Look at some of the Wordpress plugins. I saw one the other day that added a little snippet to the bottom of the page telling you how long it took for things to be generated. I can't remember which one it was; I think it may have been one of the comment spam protectors, like Spam Karma, or perhaps an extended options management module. But the point is that Wordpress already has stuff written for it that does what you want, included as part of some other plugins.
posted by Mr. Gunn at 4:34 PM on October 16, 2006
Jakarta JMeter is one of the standard tools for this. It's somewhat complex, at least more complex than a simple PHP script as mentioned above, but the documentation is quite good and it's very full-featured. You can easily set up multiple threads and simulate multiple users hitting your site at once. JMeter is also really useful for tuning your web server configuration, as you can see visually (on its graphs) how performance degrades as more users hit the site.
posted by zachlipton at 6:24 PM on October 16, 2006
Response by poster: These are all great suggestions, and I will try as many as possible. Thanks. For now I'm marking the Web Page Analyzer as best answer because it gave me the info I needed in a quick and dirty fashion. Thanks, kirkaracha.
posted by jdroth at 6:36 PM on October 16, 2006
FasterFox is an extension that speeds up Firefox by optimizing a few settings, and in addition adds a little meter in the bottom right corner of your browser that tells you how long that page took to load.
Doesn't give you the breakdown of individual page elements however.
posted by kaefer at 10:07 PM on October 16, 2006
This link made me think of this, and it has a pretty graph too.
posted by smackfu at 3:28 PM on October 19, 2006
This thread is closed to new comments.
posted by jdroth at 1:23 PM on October 16, 2006