Panther Replaced Wget with cURL -- So How Do I Back Up a Website?
June 14, 2004 12:42 PM

OS X: cURL help...

In Panther, as I understand it, wget was replaced by cURL. Can someone tell me how to make a backup of a web site with curl?

(I can only figure out how to get one file at a time, and I don't have FTP access. And yes, it's my site, but it's a TypePad site, and TypePad doesn't offer any way to export or back up photo galleries, of which I have many.)
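(For the record, the one-at-a-time approach I've managed is just something like this, with the URL made up for illustration:

  curl -O http://example.typepad.com/photos/my_gallery/photo1.jpg

which saves the file under its remote name. Doing that by hand for every photo in every gallery is what I'm trying to avoid.)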
posted by o2b to Computers & Internet (6 answers total)
 
I think that wget is still present in Panther -- I'm running 10.3.3 and use DeepVacuum, which is described by the developer as "a front end to wget command line utility."
posted by Dean King at 1:18 PM on June 14, 2004


I had to use Fink to get wget on my fresh 10.3 install...
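Assuming you already have Fink set up, it's a one-liner:

  fink install wget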
posted by esch at 1:29 PM on June 14, 2004


curl will get more than one file at a time, but it requires that you know the names of the files you're downloading; it won't follow links the way wget does. The easiest way to get wget on Panther is probably to download DeepVacuum, pointed to by Dean King above, then open the package and put the wget binary in /usr/local/bin. Or just use DeepVacuum itself; it looks like a nice front end.
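To illustrate the difference (URL and numbering are made up): if your files happen to be named sequentially, curl's URL globbing can grab a whole range in one command:

  curl -O "http://example.com/photos/img[1-100].jpg"

But only wget will actually crawl a site and follow its links, e.g.:

  wget --mirror --convert-links --page-requisites http://example.com/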
posted by kindall at 1:54 PM on June 14, 2004


Response by poster: I got DeepVacuum. It's working like a charm.

Thanks.
posted by o2b at 2:20 PM on June 14, 2004


Why don't you just download, compile, and install wget?
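Assuming the developer tools are installed, it's the usual routine (the version number here is just whatever is current as of this writing):

  curl -O http://ftp.gnu.org/gnu/wget/wget-1.9.1.tar.gz
  tar xzf wget-1.9.1.tar.gz
  cd wget-1.9.1
  ./configure && make && sudo make install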
posted by cheaily at 4:50 PM on June 14, 2004


Crap, I didn't read kindall's comment. Never mind.
posted by cheaily at 4:51 PM on June 14, 2004


This thread is closed to new comments.