ping equivalent for web servers that reports http latency?
May 3, 2009 9:57 PM
The equivalent of ping for http connections?
I want the equivalent of ping for web servers. I want to type in a simple command that generates one or more one-line reports on the transaction: number of bytes, response code, and round-trip latency.
For example, I would like to ask the tool to 'ping' my server once a minute for five minutes:
webping http://www.example.com/foo -i 60 -c 5
and see something like
1200 bytes (200) from www.example.com: seq=1 time = 240ms
1200 bytes (200) from www.example.com: seq=2 time = 240ms
1200 bytes (200) from www.example.com: seq=3 time = 240ms
1200 bytes (200) from www.example.com: seq=4 time = 240ms
300 bytes (404) from www.example.com: seq=5 time = 1500ms
Is there such a thing for Linux/OS X? If not, is there a way to get what I want out of wget (shell/Python munging of wget's STDOUT is perfectly acceptable)?
You could just put a 0 byte file somewhere and time how long wget (or whatever) takes to get it.
posted by aubilenon at 10:10 PM on May 3, 2009
Shell script running telnet to the port and doing 200's, or use curl...
posted by iamabot at 10:14 PM on May 3, 2009
Similar to aubilenon's suggestion, but post an x-sized file and have the page return a file of the same size.
posted by wongcorgi at 10:16 PM on May 3, 2009
you could also use apache bench... for instance:
$ ab -c 1 -n 1 -i http://metafilter.com/
This is ApacheBench, Version 2.3 <>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking metafilter.com (be patient).....done

Server Software:        Apache
Server Hostname:        metafilter.com
Server Port:            80

Document Path:          /
Document Length:        0 bytes

Concurrency Level:      1
Time taken for tests:   0.126 seconds
Complete requests:      1
Failed requests:        0
Write errors:           0
Non-2xx responses:      1
Total transferred:      189 bytes
HTML transferred:       0 bytes
Requests per second:    7.94 [#/sec] (mean)
Time per request:       125.882 [ms] (mean)
Time per request:       125.882 [ms] (mean, across all concurrent requests)
Transfer rate:          1.47 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:       59   59   0.0     59      59
Processing:    66   66   0.0     66      66
Waiting:       66   66   0.0     66      66
Total:        126  126   0.0    126     126
posted by mulligan at 10:26 PM on May 3, 2009
The closest thing to ping would probably be an OPTIONS or TRACE request.
posted by hattifattener at 10:58 PM on May 3, 2009
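[A hedged one-liner for this suggestion: curl can send an OPTIONS request (-X sets the method) and dump the response headers, so you see just the status line; the URL is a placeholder.]

```shell
# Send an OPTIONS request and print only the response status line.
# -X sets the method, -D - writes response headers to stdout, -o discards the body.
curl -sS -o /dev/null -D - -X OPTIONS http://www.example.com/ | head -n 1
```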
You can do that just by putting "time wget http://www.example.com/foo" at the command prompt, right? The output of that will include everything you asked for - python munging would get it into any format you like.
posted by mjg123 at 12:33 AM on May 4, 2009
I thought about it some more. What are you trying to measure? What your example will give you is the total time for a request/response, which includes network latency (as given by normal "ping") plus any delay caused by the web server and server-side processing. This is fine for testing the normal use case of your website, and HTTP Ping (suggested above by anarchivist) does seem to do what you want, apart from reporting the HTTP status code, which you could hack in easily.
For anything more serious, apachebench is excellent.
posted by mjg123 at 1:17 AM on May 4, 2009
What are you trying to measure?
I want to know when my server is taking too long to respond.
The main thing I want to measure is real-world response latency for a real URL (not just a test file; in this case, a web service that's generating content dynamically).
This measurement can include the latency of the TCP connection, or it can break it out separately like http_ping does. I'd like it to tell me when the complete transfer is done, and if it's not done, what the response code was (or whether the connection just timed out).
I want output that looks like ping - everything on one line.
Http_ping is a good find - I'll try it out.
Wrapping wget/curl in the appropriate scripting logic sounds good too. Are there any scripts that do this? I think wget needs external logic to: 1) hit a single URL repeatedly, 2) report the time between request and response, and 3) generate a single line of output.
posted by zippy at 7:19 AM on May 4, 2009
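[The three pieces of external logic zippy lists can be sketched around wget like this; it assumes GNU date (for %N nanosecond timestamps), scrapes the status code from wget's --server-response header dump, and the URL and interval are placeholders:]

```shell
#!/bin/sh
# Hedged sketch: loop (1), timing (2), and one-line output (3) around wget.
# Assumes GNU coreutils date (%N); status comes from wget's header dump on stderr.
url=http://www.example.com/foo
seq=1
while [ "$seq" -le 5 ]; do
    start=$(date +%s%N)
    # --server-response prints lines like "  HTTP/1.1 200 OK"; keep the last code seen
    status=$(wget --server-response -O /dev/null "$url" 2>&1 |
             awk '/^  HTTP\//{code=$2} END{print code}')
    end=$(date +%s%N)
    echo "(${status:-no response}) from $url: seq=$seq time = $(( (end - start) / 1000000 ))ms"
    seq=$((seq + 1))
    sleep 60
done
```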
How about a cronjob with wget? Add a line to your crontab file (mine's at /etc/crontab, not sure if everyone's is) something like this:
* * * * * username wget http://www.example.com/foo -a /etc/log_file
"-a /etc/log_file" appends the output from wget to /etc/log_file. The five stars at the beginning indicate to run it every minute. username is the user to run the cronjob as.
posted by jgunsch at 8:31 AM on May 4, 2009
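[A hedged variant of this crontab line using curl's --write-out, so each run logs the status code and total time on one line (the log path and URL are placeholders). One crontab gotcha: a literal % is special in crontab entries and must be escaped as \%:]

```
* * * * * username curl -s -o /dev/null -w "\%{http_code} \%{time_total}s\n" http://www.example.com/foo >> /var/log/webping.log
```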
Want to throw in a plug for wbox, which will do exactly what you want.
jquinby@ubuntu:~/src/wbox$ ./wbox www.metafilter.com
WBOX www.metafilter.com (174.132.172.58) port 80
0. 200 OK     60098 bytes    603 ms
1. 200 OK     60099 bytes    348 ms
2. 200 OK     60098 bytes    399 ms
3. 200 OK     60099 bytes    352 ms
4. 200 OK     60099 bytes    592 ms
--- 5 replies received, time min/avg/max = 348/458.80/603 ---

From the same folks who brought you hping, which is also quite useful.
posted by jquinby at 8:41 AM on May 4, 2009
wbox is what I want. Thanks, all, for the good suggestions (http_ping also appears to fill the bill). time + wget looks like the simplest existing-tool-plus-scripting-glue solution.
posted by zippy at 3:31 PM on May 4, 2009
posted by anarchivist at 10:05 PM on May 3, 2009