Panther Replaced Wget with cURL -- So How Do I Back Up a Website?
June 14, 2004 12:42 PM Subscribe
OS X: cURL help...
In Panther, as I understand it, wget was replaced by cURL. Can someone tell me how to make a backup of a web site with curl?
(I can only figure out how to get one file at a time, and I don't have FTP access. Yes, it's my site, but it's a TypePad site, and they don't offer a way to export or back up photo galleries, of which I have many.)
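For what it's worth, curl can grab many files in one command when their names follow a pattern, via its URL globbing syntax. A minimal sketch, where the gallery URL and filename pattern are made up -- substitute your real TypePad paths:

```shell
# curl's URL globbing: [1-25] expands into 25 separate requests, and
# "#1" in the -o template is replaced by the current number from the
# glob. (URL and names below are hypothetical.)
#
#   curl -o "photo_#1.jpg" "http://example.typepad.com/photos/img[1-25].jpg"
#
# The expansion curl performs is equivalent to this loop:
for i in 1 2 3; do
  echo "fetching http://example.typepad.com/photos/img${i}.jpg"
done
```

This only helps when you already know the naming scheme; it doesn't crawl links the way wget's recursive mode does.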
I had to use fink to have wget on my fresh 10.3 install...
posted by esch at 1:29 PM on June 14, 2004
curl will get more than one file at a time, but it requires that you know the names of the files you're downloading; it won't follow links the way wget will. The easiest way to get wget on Panther is probably to download DeepVacuum, pointed to by Dean King, then open the package and put the wget binary in /usr/local/bin. Or just use DeepVacuum itself; it looks like a nice front end.
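For reference, once wget is in place (from Fink or pulled out of the DeepVacuum package), a typical site-mirroring invocation looks like the one below. The URL is a stand-in for your actual site, so the sketch builds and prints the command rather than running it:

```shell
# --mirror turns on recursion and timestamping, --page-requisites
# grabs the images and CSS each page needs, --convert-links rewrites
# links to point at the local copies, and --no-parent keeps the crawl
# from wandering above the starting directory. The URL is a placeholder.
cmd="wget --mirror --page-requisites --convert-links --no-parent http://yoursite.typepad.com/"
echo "$cmd"
```

Run against a real site, this produces a local directory tree you can browse offline.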
posted by kindall at 1:54 PM on June 14, 2004
Response by poster: I got DeepVacuum. It's working like a charm.
Thanks.
posted by o2b at 2:20 PM on June 14, 2004
posted by Dean King at 1:18 PM on June 14, 2004
This thread is closed to new comments.