Saving webpages
September 7, 2006 6:57 AM   Subscribe

How can I best save a web page and the data for its images and links?

I'm viewing a web page, and I want to save it to disk for my own reference. In addition to setting the depth of the links to follow, what other ways can I expand and limit the bundle of information saved? Can I automatically reduce the size of images? Can I set different priorities for information local to that website, and for external links?

(I'm using XP at the moment, but I'm sure others may be interested in alternative platforms.)
posted by StickyCarpet to Computers & Internet (11 answers total) 3 users marked this as a favorite
 
Best answer: HTTrack
posted by prostyle at 7:04 AM on September 7, 2006 [1 favorite]


Be careful with HTTrack, though. With that program it's very easy to a) grab more of a website than you want, and b) slow the site down for everybody else while you're downloading. Make sure you set the maximum link depth to something reasonable (like the first two layers of links from the home page), and/or set a maximum size (like 5 MB of files total). Also, set the number of connections to 1 or 2 -- it will take longer for you to get the files, but you won't be screwing everybody else who's trying to view that site at the same time.

I'm not totally sure that you can automatically resize images with httrack, but it's very full-featured, so I wouldn't be surprised.
posted by Hildago at 7:26 AM on September 7, 2006
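
The conservative settings above can also be expressed on HTTrack's command line. This is just a sketch: the URL and output directory are placeholders, and the `-r`/`-c`/`-M` switches are assumed from HTTrack's documented CLI options (link depth, simultaneous connections, and maximum overall download size in bytes).

```shell
# Sketch of a polite HTTrack mirror (URL and output dir are placeholders):
#   -r2        follow links only two levels deep
#   -c2        at most two simultaneous connections
#   -M5000000  cap the total download at roughly 5 MB
cmd="httrack http://example.com/ -O ./site-mirror -r2 -c2 -M5000000"
echo "$cmd"   # printed rather than executed, since the URL is a placeholder
```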


Best answer: The easiest way: with the page in focus, press Ctrl-A, Ctrl-C, then open and focus a blank MS Word document and press Ctrl-V.
posted by Mr. Gunn at 7:35 AM on September 7, 2006


Best answer: I'm a fan of wget because of its flexibility. You'll need to learn some of its switches, however.
posted by owenkun at 7:51 AM on September 7, 2006 [1 favorite]
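
As a concrete starting point, here is one common combination of wget switches for saving a page along with its images while limiting link depth. The URL is a placeholder, and treat the flag choices as a sketch rather than a recipe:

```shell
#   --recursive --level=2  follow links up to two levels deep
#   --page-requisites      also fetch the images/CSS needed to render each page
#   --convert-links        rewrite links so the local copy works offline
#   --no-parent            don't climb above the starting directory
#   --wait=1               pause a second between requests, to be polite
cmd="wget --recursive --level=2 --page-requisites --convert-links --no-parent --wait=1 http://example.com/page.html"
echo "$cmd"   # printed rather than executed, since the URL is a placeholder
```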


Best answer: If you want to avoid dealing with command line switches (wget, etc) then Teleport Pro is very sweet. It's totally customisable - depth, identity, size, age, filetype, thread control, and so on.
posted by meehawl at 8:02 AM on September 7, 2006


Best answer: If you use Firefox, Scrapbook is an awesome extension for saving a static copy of webpages. It has pretty much replaced Bookmarks for me, as it'll save the page as it is at that moment in time, which is very useful for dynamic sites.
posted by mysterpigg at 8:30 AM on September 7, 2006


Best answer: I've never used it, but WinWGet looks to be pretty good.

It is basically a nice Windows user interface to the recommended "wget" application.
posted by mr_silver at 9:42 AM on September 7, 2006


I'll put in another vote for Scrapbook. It gets regularly updated and every time it gets easier and more powerful.
posted by Ookseer at 11:05 AM on September 7, 2006


If you only need that page plus all the images on it, I like the MAF extension for Firefox. It saves a single page that can be read with both Firefox and IE.
posted by Mitheral at 12:13 PM on September 7, 2006


Another late vote for Scrapbook. I've used HTTrack, but Scrapbook is much easier, integrated into Firefox, and more user-friendly. Assuming you use Firefox, of course....
posted by Boobus Tuber at 4:04 PM on September 7, 2006


Best answer: I'm an HTTrack user but just learned of BlackWidow yesterday and am planning to try it out. It's shareware.
posted by IndigoRain at 4:57 PM on September 7, 2006


This thread is closed to new comments.