Setup for creating my own optimized online backup?
January 26, 2009 10:41 AM

I am making an online backup of all my important data (about 20-100 GB). I am not looking for a service; I want to create my own, since I have 300 GB online waiting to be used. In creating my own service I need 1) a good FTP program, and 2) a program that works such that if I drag in a folder with the same name, it overwrites the folder, but only what is different, so it doesn't take so long after the initial upload.

I don't want to completely synchronize my disk to the online disk, because I only want certain files online. I'm sure there are lots of possibilities for this setup (an open-source/free setup would be best); I've just never dabbled in this stuff and wouldn't mind some advice! Thanks!
posted by figTree to Computers & Internet (12 answers total) 3 users marked this as a favorite
Are you committed to using FTP? I've heard good things about Amanda, an open source network backup system.
posted by aubilenon at 10:54 AM on January 26, 2009

I don't think you want FTP. Rsync?
posted by rokusan at 10:59 AM on January 26, 2009

Just to be clear: I am renting online space from dot5hosting. Here is a list of what their servers have. So I basically need to use that technology to upload my stuff and have it done intelligently, without overwriting the same files over and over. I know computers, but I know very little about this realm; I thought that in this case FTP is what is required?
posted by figTree at 11:11 AM on January 26, 2009

You want rsync, or one of its many wrappers.

9 times out of 10, when you want a backup solution, the answer is rsync.
posted by chrisamiller at 11:12 AM on January 26, 2009

I don't think you want FTP. I think you want rsync.
posted by majick at 11:13 AM on January 26, 2009

Transmit (an FTP client for the Mac) lets you keep two things in sync.

rsync is probably what you actually want to use, however.
posted by chunking express at 11:15 AM on January 26, 2009

Oh, Transmit is here. The synchronization feature does work well.
posted by chunking express at 11:16 AM on January 26, 2009

What you really want is duplicity. It will find the differences in your files and directories using the algorithm from the oft-mentioned rsync, package them up into archives, encrypt them (if you want) and upload them via FTP. You can also keep your history of changes for a few days (or a lot of days) in case you want to go back to a previous backup.
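For the curious, a sketch of what that workflow looks like (username, hostname, and paths are placeholders, and duplicity would need to be installed on your own machine — only FTP is needed on the host's side):

```shell
# Incremental backup over FTP (placeholder credentials).
# --no-encryption skips the GPG step; drop it to encrypt with a passphrase.
duplicity --no-encryption ~/important-data ftp://user@example.com/backups

# List what's currently stored in the backup:
duplicity list-current-files ftp://user@example.com/backups

# Restore a single file as it was three days ago (hypothetical file path):
duplicity -t 3D --file-to-restore docs/resume.doc \
    ftp://user@example.com/backups ~/restored-resume.doc
```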
posted by pocams at 11:18 AM on January 26, 2009 [1 favorite]

I use rdiff-backup. It does everything you want, should work or be installable on the host, and keeps a history of changes with minimal overhead.
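A sketch of typical rdiff-backup usage (hostname and paths are placeholders; it has to be installed on both ends, which is why "installable on the host" matters):

```shell
# Back up; only differences are transferred, and older versions
# are kept on the server as space-efficient reverse diffs.
rdiff-backup ~/important-data user@example.com::backups/important-data

# List the available increments, then restore the state from 2 weeks ago:
rdiff-backup --list-increments user@example.com::backups/important-data
rdiff-backup -r 2W user@example.com::backups/important-data ~/restored
```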
posted by roue at 12:13 PM on January 26, 2009

just to be clear, i am renting online space from dot5hosting

My experience with cheap hosting solutions is that they throttle bandwidth pretty aggressively. You might find that moving 100 GB takes a week or two, with a lot of dropped connections and errors. You're better off with something like rsync, which only copies the delta of each file after the initial big upload.

This is why people go with services like Mozy. They're designed to do this. A web hosting company isn't going to care much if your mega-transfers keep stalling. They care about hosting websites.
posted by damn dirty ape at 12:23 PM on January 26, 2009

Are people divining that you're a Mac or Unix user? For PCs, SyncBackSE has an FTP option and can "RSync" files intelligently.
posted by psyche7 at 1:12 PM on January 26, 2009

From Dot5Hosting's user agreement:

Hosting space is intended for normal use only, and is limited to Web files, e-mail and content of the hosted Web sites, not for storage of media or other data. Hosting space may not be used as offsite storage for electronic files or for third party electronic mail or FTP hosts.
posted by sageleaf at 1:13 PM on January 26, 2009

This thread is closed to new comments.