How do I write a program to analyze my data?
August 12, 2010 4:23 PM
I need help writing a simple program that will compare numbers in a specific way.
The internet connexion where I live is very unreliable, going out for a few minutes probably fifty times a day. This seems to be due to noise on the phone line (the service is DSL) and so is out of my control. In an attempt to describe patterns in the drops, I wrote a very simple BASH script that pings google.com a few times a minute and then appends the UNIX time to either online.log if the ping was a success or offline.log if the ping was a failure.
Obviously, the data are not useful in this format. I can't quite come up with a good way to view or categorize the data. I have a vague inkling that I want a list of all outages with the times when they occurred, but I can't imagine what that program would look like. Presumably, it would read both files in, find each run of numbers from offline.log that falls between two from online.log, and convert the first and last numbers of that run back to a human-readable time format.
I ask, then, if anyone can tell me how to write such a program. I've never been much of a programmer, but I have all the normal UNIX tools at my disposal here; I tried to write it in Perl but I can't quite make it work. The system is actually Mac OS X 10.6 with MacPorts.
posted by pharm at 4:40 PM on August 12, 2010
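A minimal sketch of such a program, assuming each log file holds one epoch timestamp per line (the format date +%s produces); the file names come from the question, and everything else here is illustrative:

# Tag each timestamp with its state, merge the two logs in time order,
# then report each run of "off" lines bracketed by "on" lines.
sort -n <(sed 's/$/ off/' offline.log) <(sed 's/$/ on/' online.log) |
awk '$2 == "off" && !down { down = 1; start = $1 }                  # an outage begins
     $2 == "on"  &&  down { down = 0; print start, $1, $1 - start }' |
while read start end secs
do
    # date -r converts epoch seconds to local time with BSD date (Mac OS X)
    echo "outage: $(date -r "$start") to $(date -r "$end") (${secs}s)"
done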
I would do this in a spreadsheet. You probably want two juxtaposed time-based histograms, or something along those lines.
That said, ping is by design unreliable, so keep that in mind.
I would ping something closer than Google, something at your ISP, at the other end of your DSL line; traceroute will give you some good hosts to ping.
There are also existing tools that will do exactly what you want (Nagios is the first thing that comes to mind and, while overkill, will do).
posted by antiquark at 4:45 PM on August 12, 2010
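By way of illustration, the first hop or two that traceroute reports past your own modem usually belong to the ISP and make good ping targets (the address placeholder below is hypothetical):

traceroute google.com             # lists every hop between you and google.com
ping -c 5 <address-of-early-hop>  # substitute an ISP-side hop from the traceroute output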
Best answer: Seems to me that what you want to do is record the transitions, not the persistent state.
So make a program/script that does something like this (keeping in mind that I don't know bash but messed around with csh (mumble) 20 years ago (mumble)):
echo "nothing" > /tmp/online while true do if pingFails && -e /tmp/online then echo "offline " `date` >> ~/transitions rm /tmp/online else if !pingFails && !-e /tmp/online then echo "online" `date` >> ~/transitions echo "nothing" > /tmp/online fi endThe take away is that you should keep state as to whether you were online last time - you can do that through the existence of a file. Then when your state changes, you write the state change and the date to the end of your log file. This file will tell you when you go up and down. Note that there are probably a bajillion ways to do this, but this is just one and it's simple.
As for analysis - each line tells you when something changed. The deltas between dates tell you how long a transition was.
posted by plinth at 4:51 PM on August 12, 2010
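One wrinkle with the sketch above: the default output of date is hard to do arithmetic on. If the script logged date +%s (epoch seconds) instead, an assumption on my part, the deltas fall out of a one-line awk pass over the transitions file:

awk '$1 == "offline"     { t = $2 }                                        # remember when we went down
     $1 == "online" && t { print "down for", $2 - t, "seconds"; t = 0 }' ~/transitions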
I can't see the advantage of having two files. What about instead writing one file with the time and the ping time? If it's offline, you could write a -1 (or whatever) for the ping. Then you could plot ping as a function of time. Maybe you could average the ping values over each 1-, 2-, or 5-minute window, since you'd have so much data. If there's a cause-and-effect relationship between rising ping times and disconnection, I think this would be a great way to demonstrate it.
posted by cali59 at 4:56 PM on August 12, 2010
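A rough sketch of that single-file approach (this is not cali59's code, and pulling the time= field out of BSD ping's output is an assumption about its format):

while true
do
    # extract the round-trip time in ms; empty when the ping fails
    rtt=$(ping -c 1 google.com 2>/dev/null | sed -n 's/.*time=\([0-9.]*\).*/\1/p')
    echo "$(date +%s),${rtt:--1}" >> ping.csv   # -1 marks a failed ping
    sleep 20
done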
Won't Excel convert Unix timestamps? You could pretty easily whip up a couple of graphs of success/fail by day/hour.
posted by ish__ at 5:00 PM on August 12, 2010
Ok; on what-should-have-been-preview I see it's more "how to write" than "what to do".
So - write a program that reads through each file and makes array histograms for day/hour.
posted by ish__ at 5:02 PM on August 12, 2010
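For the hour-of-day case, a minimal version of that idea with the standard tools mentioned in the question (the scale of one # per ten failures is arbitrary):

# convert each failure timestamp to its hour, count per hour, draw bars
while read t; do date -r "$t" +%H; done < offline.log |
sort | uniq -c |
awk '{ printf "%s:00 %6d  ", $2, $1
       for (i = 0; i < $1 / 10; i++) printf "#"
       print "" }'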
I would have logged to a single file, then graphed with the x-axis = time and the y-axis = online/offline (or, for more detail, the ping times from 1 to infinity).
posted by jeffamaphone at 5:13 PM on August 12, 2010
Best answer: If you install Splunk, you can just index the ping log file (make sure you include timestamps) and it'll make a fancy graph for you in real time. If you forward the appropriate ports to your PC on your router, you can just send your ISP a link to look at it.
I would also suggest downloading a program called mtr, so you can see where the outage is happening.
posted by empath at 5:28 PM on August 12, 2010
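For what it's worth, mtr is packaged in MacPorts, and the basic invocation is enough to watch per-hop packet loss (it usually needs root for raw sockets):

sudo port install mtr
sudo mtr google.com   # a live traceroute/ping hybrid; loss at an early hop implicates the line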
Best answer: Also, I would ping 8.8.8.8, which is Google's DNS server and is super reliable.
posted by empath at 5:29 PM on August 12, 2010 [1 favorite]
# BSD sed on Mac OS X needs an explicit (here empty) backup suffix with -i:
sed -i '' 's/$/,off/' offline.log
sed -i '' 's/$/,on/' online.log
sort -n offline.log online.log > combined.csv
Now you have the data you've already collected in a file, in comma-separated-value format.
I'm not sure what you mean by Unix time -- seconds since 1/1/1970, or Thu Aug 12 22:32:46 EDT 2010? If the former, you're cool. If the latter, you may need to munge it to make the spreadsheet able to do something useful with it.
Put it into OpenOffice's spreadsheet and make a chart out of it.
posted by novalis_dt at 7:38 PM on August 12, 2010
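If the stamps do turn out to be epoch seconds and the spreadsheet balks at them, one way to munge combined.csv with BSD date (a sketch of my own, not part of the answer above; readable.csv is an arbitrary name):

while IFS=, read t state
do
    # rewrite each "epoch,state" line with a spreadsheet-friendly date
    echo "$(date -r "$t" '+%Y-%m-%d %H:%M:%S'),$state"
done < combined.csv > readable.csv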
This thread is closed to new comments.