Recommended software to download a website?
February 27, 2012 11:48 AM Subscribe
I want to download an entire website to my computer for off-line viewing. I'm using OS X 10.6.8 with Firefox, but I'm fine with Chrome or Safari. I've studied the offerings on Google and CNet like SiteSucker, WebDevil, WebCopier, Getleft, etc. I'd prefer a free download since I plan to use it only once. A browser feature or add-on would be great. Before I decide, I'd like to hear about others' experience with this task in hopes of avoiding pitfalls. Thanks.
Unless it contains nothing but static images and text, it's not going to be possible to grab and preserve a whole website and still have it be functional.
posted by imagineerit at 11:58 AM on February 27, 2012
Use `wget`, which might already be on your machine. Like this.
imagineerit has a good warning: this won't work too well if the site is using JavaScript to fetch and display page content, but otherwise it will work fine. You might want to browse the site a bit with JavaScript turned off to see whether it's usable that way. Googling 'website mirror wget' will turn up other pages with more options you can pass to wget to configure the depth and breadth of what it tries to mirror.
posted by zengargoyle at 12:18 PM on February 27, 2012
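For what it's worth, a typical wget invocation for this looks roughly like the following; example.com is just a placeholder, and exact option names can vary a little between wget versions:

```
# Mirror a site for offline viewing (example.com is a placeholder):
#   -m   recursive mirroring with timestamping
#   -k   convert links so the saved pages work offline
#   -E   save pages with an .html extension
#   -p   also fetch images, CSS, and other page requisites
#   -np  don't climb up to the parent directory
wget -mkEp -np https://example.com/
```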
SiteSucker is free—there is a $5 "recommended donation" that might have stymied you while you were researching, but it's not required and the application is not crippled in any way. I've used it a few times and the only issue I've run into is with sites that require credentials; you can just log in using Safari in that case and it will use your keychain info automatically.
posted by bcwinters at 12:33 PM on February 27, 2012 [1 favorite]
I checked my Mac in the next room and wget doesn't seem to be on there (in Snow Leopard, anyway). It'd be easy enough to install via MacPorts. If you want to try a browser plugin, this one claims to do site mirroring.
I'm a CLI person, and wget would be my first choice for this sort of thing (above warnings about dynamic content notwithstanding).
posted by jquinby at 12:43 PM on February 27, 2012
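As a rough sketch, installing wget through MacPorts (assuming MacPorts itself is already set up) is a one-liner:

```
# assumes MacPorts has already been installed from macports.org
sudo port install wget
```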
cURL is a command-line utility similar to wget that comes preinstalled on Macs and should do what you need.
posted by no regrets, coyote at 1:03 PM on February 27, 2012
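One caveat worth knowing: curl fetches one URL at a time and doesn't follow links on its own, so it suits single pages better than full-site mirrors. A minimal sketch, with example.com as a placeholder:

```
# save a single page under its remote filename
curl -O https://example.com/index.html

# or pick the output filename yourself and follow any redirects
curl -L -o saved-page.html https://example.com/
```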
I've used httrack in the past on a PC. Looks like there's an OS X download too.
posted by backwards guitar at 1:34 PM on February 27, 2012 [1 favorite]
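For reference, a minimal HTTrack command-line sketch (example.com and the ./mirror output directory are placeholders) looks something like this:

```
# copy example.com into the ./mirror directory for offline browsing
httrack "https://example.com/" -O ./mirror
```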
I'd second the recommendations of wget or curl. Both can do what you want quite easily, and at least one of them comes with Mac OS X, so there's no need to trust random freeware or fork over for a commercial program.
If you need help using curl or wget, just run 'man curl' or 'man wget' respectively to read the (quite comprehensive) manual pages.
posted by -1 at 1:42 PM on February 27, 2012
This thread is closed to new comments.