Can I download a website and all its contents for offline surfing?
November 15, 2008 7:22 AM Subscribe
Can I download an entire website, including all hosted files, and burn it all onto a DVD-ROM for offline use? I'd like to be able to "surf" the site as if I'm using the real online version. I'm on a Mac.
I tried Wget but couldn't figure out how to use it.
You were right to try wget.
Here's an example, mirroring the site example.com, placing all files in a directory named example.com, with a two-second wait between requests, and converting all hyperlinks to be local:
wget --mirror -p -k -w 2 http://example.com/
That's just a one-off example; wget is very robust. Spend some time with the documentation (man wget) to see everything it can do.
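If the short flags are cryptic, the same command spelled out with long options looks like this (example.com is just a stand-in for whatever site you want to mirror):
wget --mirror --page-requisites --convert-links --wait=2 http://example.com/
--mirror turns on recursion and timestamping, --page-requisites grabs the images and stylesheets each page needs, and --convert-links rewrites the links so they still work offline.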
posted by namewithoutwords at 7:53 AM on November 15, 2008
Though, yes, wget is pretty awesome. There are actually several free GUI front-ends for it that may help you. Here's one.
posted by mkultra at 7:59 AM on November 15, 2008
Response by poster: Wget is waay too complicated for me. I have no idea how to use it. I know next to nothing about computers. I can log onto the Internet, type documents, and watch movies. That's about it. I need something really basic and self explanatory.
posted by HotPatatta at 8:07 AM on November 15, 2008
If anyone can chime in with a Windows solution, especially one that will work with Blackboard classes, I'd appreciate it.
posted by lizzicide at 8:16 AM on November 15, 2008
I've used HTTrack before; works well, it's pretty simple for basic tasks, and it has ports for Windows, Mac and a bunch of Linux distros.
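HTTrack also has a command-line mode if you ever outgrow the GUI; a rough sketch of a mirror run (the output folder name and the example.com filter here are just placeholders) would be something like:
httrack "http://example.com/" -O "./example.com-mirror" "+*.example.com/*" -v
The graphical versions (WinHTTrack on Windows, WebHTTrack elsewhere) walk you through the same options with a wizard, which is probably the easier route if the command line feels intimidating.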
posted by Dipsomaniac at 8:38 AM on November 15, 2008
I second HTTrack. I've used it in the past; simple to use, from memory.
posted by therubettes at 9:02 AM on November 15, 2008
WinWget?
http://lifehacker.com/5086682/winwget-makes-automated-downloads-a-breeze
posted by JpMaxMan at 9:29 AM on November 15, 2008
Sorry - linked:
WinWget LifeHacker Article
WinWget Direct
posted by JpMaxMan at 9:31 AM on November 15, 2008
Also, for the OP: in the LifeHacker article they link to their guide to mastering wget from the command line. Might be worth checking out.
posted by JpMaxMan at 9:33 AM on November 15, 2008
Please be considerate of the host if you do this. Someone used to download my entire site every day, 3,000 pages, so he could stay up to date. It ran up my bandwidth bills and hosed my traffic stats.
posted by futility closet at 9:42 AM on November 15, 2008 [1 favorite]
Speaking as a web site owner, whenever I see anyone doing this to my site, I ban them in my firewall so that they can never, ever visit again.
posted by Class Goat at 9:59 AM on November 15, 2008 [1 favorite]
DeepVacuum is a very easy to use GUI front-end for wget for OS X.
posted by jjg at 10:35 AM on November 15, 2008 [2 favorites]
As Futility and CG said, be sure to set a nice polite delay between pages when using this so you don't hammer a website unkindly.
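With wget, for instance, a more considerate mirror might look like this (the two-second wait and 200 KB/s cap are just illustrative values):
wget --mirror --page-requisites --convert-links --wait=2 --random-wait --limit-rate=200k http://example.com/
--wait pauses between requests, --random-wait varies that pause so it looks less robotic, and --limit-rate keeps you from saturating the host's connection.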
posted by rokusan at 11:51 AM on November 15, 2008
There's a Mac frontend for wget called Get that you could try.
Alternatively, you could install the Scrapbook plugin for Firefox and use that to save pages selectively to view offline. It organizes saved pages much the same way your Bookmarks file does, and will save links, too. It's not a good tool to scrape an entire website, but it's a better tool than wget to, say, save all the tabs you have open to read offline later.
posted by gum at 1:49 PM on November 15, 2008
This thread is closed to new comments.