How to download a website directory?
July 21, 2008 6:01 PM

I'm trying to download files from links en masse; are there any software solutions?

I'm trying to download files and links from a website "directory". It sorta looks like this. I guess it's a directory made with HTML links.

Wish I could just drag and drop the folders.

I found a website-copying program, but it doesn't handle spaces in the address very well. Any ideas besides clicking each link by hand?
posted by abdulf to Computers & Internet (11 answers total) 7 users marked this as a favorite
 
wget will do exactly what you're looking for. curl will also do it, but I learned wget first, so I recommend it first. It can create a local copy of any site to a specified depth, and it rewrites all the links and references so you can browse the resulting folder offline.
posted by odinsdream at 6:08 PM on July 21, 2008 [1 favorite]
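[If spaces in the address are the problem, as the question mentions, one common fix is to percent-encode them before handing the URL to a mirroring tool. A minimal sketch; the URL is a made-up placeholder, and the wget flags in the comment are standard options rather than anything from the answer above:]

```shell
# Percent-encode spaces before handing the URL to wget.
url='http://example.com/my files/index.html'
encoded=$(printf '%s' "$url" | sed 's/ /%20/g')
echo "$encoded"
# → http://example.com/my%20files/index.html

# Then mirror to a given depth, rewriting links for offline browsing:
#   wget --recursive --level=2 --convert-links --no-parent "$encoded"
```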


downThemAll. URLToys.
posted by a robot made out of meat at 6:08 PM on July 21, 2008


Downthemall rocks.
posted by CunningLinguist at 6:11 PM on July 21, 2008


I prefer Flashget; it integrates very well with Firefox and offers a deep set of tools for a variety of uses.

Hope that helps.
posted by door2summer at 6:12 PM on July 21, 2008 [1 favorite]


Nthing Downthemall and wget (or winwget).
posted by box at 6:12 PM on July 21, 2008


I came here to recommend DownThemAll. It seems I'm not the first.
posted by fogster at 6:37 PM on July 21, 2008


Seconding Flashget, though to integrate it into Firefox properly, as door2summer has said, you'll want the Firefox extension Flashgot.
posted by Effigy2000 at 7:08 PM on July 21, 2008


lftp (like wget, another *nix tool available for Windows) can do smart things with HTTP mirroring and recursion, downloading the contents of subdirectories and their subdirectories (which I'm not sure DownThemAll supports).

wget is by far the more powerful of the two, but I always seem to have to re-read the wget man page to get all the options right if I haven't used it for a mirror in a month or so. If all I want is to mirror a visible-in-links directory structure (such as Apache index pages), lftp is definitely the easier tool:

mkdir MyMirror
lftp http://example.com/path/to/file/listing/
mirror
posted by tgmayfield at 7:13 PM on July 21, 2008
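[For comparison with the lftp recipe above, the wget options tgmayfield alludes to might look like the following. This is a sketch that echoes the command rather than running it; the flags are standard wget options, the depth of 1 is an arbitrary choice, and the URL is the placeholder from the post:]

```shell
# Assemble the roughly equivalent wget mirror command; echoed, not executed.
cmd="wget --recursive --level=1 --no-parent --convert-links"
cmd="$cmd http://example.com/path/to/file/listing/"
echo "$cmd"
```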


Linky is another Firefox option.
posted by idb at 6:14 AM on July 22, 2008


If FF is not your primary browser, SiteSucker is a donation-ware Mac OS X app.
posted by Fin Azvandi at 9:03 AM on July 22, 2008


Blue Crab and Speed Download are also very good.
posted by LuckySeven~ at 9:21 AM on July 22, 2008

