Mac or Firefox solution to download local copies of web sites?
August 23, 2007 12:12 PM   Subscribe

I'm looking for Mac OS X software or a Firefox plug-in that will allow me to create local copies of web sites to be viewed later.

I will be without web access for about a week and I want to do some work on a few PHP and CSS projects. To do so, I will need to have tutorials handy. I'd like something for the Mac that will allow me to download a page including all graphics, and also download all the pages it links to (1 to 2 links deep).
posted by dripdripdrop to Computers & Internet (8 answers total)
 
You can get wget for OS X, with a GUI if you prefer.
posted by jjb at 12:17 PM on August 23, 2007
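For what it's worth, a plain wget invocation can do the "page plus everything it links to, 1–2 levels deep" job directly. This is a sketch using standard wget flags (the URL is a placeholder; swap in the tutorial site you actually want):

```shell
# Mirror a tutorial page, its graphics/CSS, and pages up to 2 links deep.
# --recursive / --level=2 : follow links, but only 2 hops down
# --page-requisites       : also grab images, stylesheets, etc. each page needs
# --convert-links         : rewrite links so the copy browses offline
# --no-parent             : don't wander up above the starting directory
wget --recursive --level=2 \
     --page-requisites \
     --convert-links \
     --no-parent \
     http://example.com/tutorial/
```

The `--convert-links` pass runs after the download finishes, so the saved pages point at your local copies instead of back out to the web.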


You might also take a look at SiteSucker. With the "Localize" HTML processing option turned on, it does exactly what you want.
posted by RichardP at 12:20 PM on August 23, 2007


I use ScrapBook. It's a great FF plugin.
posted by Cat Pie Hurts at 12:37 PM on August 23, 2007


Hmmm... SiteSucker and ScrapBook both look like they'll do the job. I'll download them and give them a try.
posted by dripdripdrop at 1:16 PM on August 23, 2007


Zotero is a pretty handy Firefox-based collection management program and might do what you want.
posted by singingfish at 2:19 PM on August 23, 2007


Yojimbo is a great, great application for this. Use the "web archive" option when you are importing a URL.
posted by raheel at 3:25 PM on August 23, 2007


I've always been partial to Web Devil. You can specify which types of media to grab (if, say, you just wanted all the images and none of the HTML), how deep down the hierarchy you want it to go, and so on.

I've used it to archive sites that I worry will go down someday, as well as to make an end run around crappy UIs in picture/video galleries.
posted by churl at 3:48 PM on August 23, 2007


I also swear by ScrapBook.
posted by katala at 10:20 PM on August 23, 2007


This thread is closed to new comments.