Offline Browser programs
March 17, 2006 8:30 AM
Is there a program that will download websites that include dynamic table functionality? For example, suppose a website has statistical information on all 50 states (as most government websites do), and by inputting the names of, say, three states (plus vehicle data) you get a screen that compares car insurance rates in those states for the car in question. My goal is to be able to download such a site and browse it offline when I can't connect to the web.
No. What you're asking is kind of like retrieving your email without being connected to the internet. It doesn't work because the email resides on some mail server somewhere and you need to be connected to it to get your newest email.
In the same way, all the work that's involved in getting your insurance quotes resides on some server somewhere. Think about what's involved: you need to find out what kind of car it is, how many accidents it's been in, how many tickets it's been issued, possibly even your credit score. These all need to be considered before an insurance company can give you any kind of quote, and for various reasons this data is only available at several online datacenters (I'm thinking: the DMV, Equifax, Geico, etc.).
Of course, when you get a quote online, you can simply save that single page, which you can then access offline at a later time, but you already know that.
That being said, there are some things, like conversion calculators, which are implemented fully in HTML and JavaScript. That means all the logic to take input and give output is coded into that single page; there is no dependency on any external data residing on a server. These types of applications should work fine if you just save the page.
posted by pooya at 9:39 AM on March 17, 2006
If you wanted to actually download all the possible answers, you'd be downloading 117600 possible comparisons!
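(For what it's worth, I'm guessing that figure is every ordered pick of three distinct states out of 50, before you even factor in different vehicles. A quick sanity check in PHP:

<?php
// My guess at where 117600 comes from: 50 choices for the first state,
// 49 for the second, 48 for the third.
echo 50 * 49 * 48; // prints 117600
?>

And that's only the state combinations, not the vehicle inputs.)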
Theoretically you could have an app that just downloads the data for the 50 states and does the comparison offline, but there's no general purpose app that does that for all different kinds of websites.
posted by aubilenon at 10:45 AM on March 17, 2006
You can copy and paste a table from a web browser into a spreadsheet program like Excel, and then use the spreadsheet (or a database program like Access) to run calculations or queries on the data.
Microsoft Excel also has a feature called Get External Data that can help automate the process of copying data from a table on a web page.
posted by mbrubeck at 12:21 PM on March 17, 2006
(This solution requires that the site has some way to display all of the data in a single table.)
posted by mbrubeck at 12:22 PM on March 17, 2006
I'm not sure if I'm understanding this, but if you have PHP or Perl(?) installed somewhere and you know a little of either, you could have cURL download these pages by cycling through an array of your parameters (states, etc.) and storing the results for you. This is assuming you don't need special cookies or login access, because then things can get pretty tricky. Google cURL, or look at the PHP docs: http://us3.php.net/manual/en/ref.curl.php
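Something like the sketch below is what I have in mind. It's untested, and the URL and parameter names (rates.php, state, make, model, year) are made up; you'd substitute whatever the real site's form actually submits.

<?php
// Untested sketch: fetch one result page per state with PHP's cURL binding
// and save it for offline browsing. The URL and parameter names are made up.
$states = array('Alabama', 'Alaska', 'Arizona'); // ...and the other 47

foreach ($states as $state) {
    $url = 'http://www.example.gov/rates.php?' . http_build_query(array(
        'state' => $state,
        'make'  => 'Honda',  // hypothetical vehicle data
        'model' => 'Civic',
        'year'  => 1999,
    ));

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the page as a string
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html !== false) {
        file_put_contents($state . '.html', $html); // one saved page per state
    }

    sleep(1); // don't hammer the server
}
?>

You'd then open the saved .html files in your browser whenever you're offline.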
If you don't know any of the above, you could always ask a kind soul to help you out. But depending on the site, this could be impossible or really hard.
posted by miniape at 1:08 PM on March 17, 2006
This thread is closed to new comments.