Online backups
February 27, 2005 9:27 AM   Subscribe

I'm looking for an online backup solution for my personal computers.

What I'm thinking of doing is having one big Linux box with lots of hard drives to store everything, and then doing incremental backups to a server somewhere. I don't need a dumbed-down service; any online hosting system that provides a lot of storage would do. Any recommendations?
posted by delmoi to Computers & Internet (14 answers total)
 
Are you trying to find a hosting system, a way to back up, or both? I can help with the latter.
posted by easyasy3k at 9:41 AM on February 27, 2005


I have a Mac and use .Mac and its iDisk. It works beautifully, with 250 MB for $100/year; you can buy more. You divvy this up between your iDisk and online email storage. You can use the free Backup program it provides to automatically back up files on your computer, as well as your bookmarks, addresses, calendars, etc. And it synchronizes those plus your mail to the iDisk and to other computers / handhelds. It has a "Public" folder where you can store stuff you want to share with others, with or without password protection. All in all, it's a good solution if you have a Mac.
posted by rabbus at 9:48 AM on February 27, 2005


Recent threads have mentioned a number of hosting companies, but it sounds like what you really want is a colocated server that you can fill with drives and push data to from many locations. This way, you aren't paying outrageous amounts for your disk space, and since you won't be using a lot of bandwidth (relatively speaking) you won't have to pay too much for connectivity. However, you will still be looking at about $100/mo for reasonable service. Here's another option: "colocate" at work. Keep a second PC at work, use ssh to connect your home computer to it (assuming you can get permission), and use that for off-site backup. This is approximately what I do, but it's easy for me since I handle most IT for our company.
posted by wzcx at 9:59 AM on February 27, 2005


You could just do an rsync/ssh combo every night to the hosting provider of your choosing.

But if this is a long-term plan and you're paying monthly, it might end up being cheaper to just get a used DLT8000 tape drive or something off of eBay.
posted by cmonkey at 10:07 AM on February 27, 2005


Another solution is to just run the Linux box yourself, hook it up to your LAN and your (assumed) broadband connection, run SSH on it behind a set of locked-down firewall rules (netfilter is your friend!), and use rsync over SSH to back up/restore. I do this and it works great for me. The only downsides are that 1) it's still only a broadband connection and 2) it's not offsite.

Email me if you want more specific information on this setup.
posted by thebabelfish at 10:18 AM on February 27, 2005


I'm very fond of GmailFS, which lets you mount your Gmail account as a filesystem in Linux.
posted by ori at 10:29 AM on February 27, 2005


Thanks for the suggestions about SSH/rsync. I'll probably do something like that. And don't be shy about posting complete solutions in the thread. I'll probably email you, but the whole point (I think) of Ask Me is so that other people can get advice too :)

I don't run a Mac, and .Mac is way too small for what I'm looking for. I've got a couple hundred gigs of data floating around on various machines that I'd like to 1) consolidate and 2) back up regularly somewhere. Setting up a box at work isn't really viable.

Does anyone know of any specific places I could send a box? $100 a month is reasonable.
posted by delmoi at 10:35 AM on February 27, 2005


Does anyone know of any specific places I could send a box? $100 a month is reasonable.

Rather than send the machine somewhere, find a colocation facility within driving distance (a Google search on "colocation [NEAREST MAJOR CITY]" should yield results). This way, if you decide you need additional space, or the motherboard craps out or whatever, you can go and fix it yourself as opposed to relying on whichever NOC monkey the facility has available. Also, you don't lose your machine if the facility goes bankrupt.

$100 a month should net you rent for the rack space, 1 to 5 IPs, and probably about 100 GB of transfer monthly.

As for rsync:

* First, set up SSH keys properly (passphrase-less, or agent-managed), so you can automate the process
* Then just set a cronjob that runs on your local machine:

rsync -avz --delete -e ssh /home/delmoi colocated.machine.com:/opt/backup

Now your directories will be synced. The rsync manpage has more details and options, like exclusion of files.
posted by cmonkey at 11:02 AM on February 27, 2005
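A minimal sketch of the key setup and cron job described above, assuming the colo box answers at colocated.machine.com (a placeholder hostname) and that a dedicated, passphrase-less key is acceptable for the automated job:

```shell
# Hypothetical sketch -- colocated.machine.com and the paths are placeholders.

# 1. Generate a passphrase-less key pair dedicated to the backup job:
ssh-keygen -t rsa -N "" -f "$HOME/.ssh/backup_key"

# 2. Install the public key on the remote machine (ssh-copy-id ships with OpenSSH):
#    ssh-copy-id -i "$HOME/.ssh/backup_key" colocated.machine.com

# 3. Cron entry (via crontab -e) to sync every night at 3:00 AM:
#    0 3 * * * rsync -avz --delete -e "ssh -i $HOME/.ssh/backup_key" /home/delmoi colocated.machine.com:/opt/backup
```

A dedicated key keeps your main SSH identity out of the unattended job, and the `-i` flag pins the cron job to it.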


Oh, and before you put the remote backup machine anywhere, run rsync with it on your local LAN so you're not doing a full dump remotely.
posted by cmonkey at 11:05 AM on February 27, 2005


TextDrive provides WebDAV access to offsite storage and is quite cheap. The great thing about WebDAV is that operating systems can see WebDAV folders as just another folder.

TextDrive also provides Subversion repositories, which could be used as a form of incremental backup. (Actually, people recommend that you not use version control systems to manage backups, but there was an article, which I can't find right now, that talked about successfully using version control for backup.)

For work, I use Connected, which provides nothing but online backup (Windows). This is a supremely hassle-free form of backup, and works well. 250 MB costs $80.00 a year and 2 GB of backup costs $15 a month.
(As noted, your big server would have to be a Windows server.)
posted by seanyboy at 11:14 AM on February 27, 2005
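On Linux, that "WebDAV folder looks like any other folder" behavior typically comes from davfs2; a hypothetical configuration sketch (the URL and mount point are made up, and mounting requires root):

```shell
# Hypothetical: mount a TextDrive-style WebDAV share as a local directory.
# Requires the davfs2 package; run as root. URL and paths are placeholders.
mkdir -p /mnt/webdav-backup
mount -t davfs https://example.textdrive.com/dav /mnt/webdav-backup

# Once mounted, ordinary tools treat it like any other folder:
#   rsync -av /home/delmoi/important/ /mnt/webdav-backup/
```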


Yeah, I didn't post a complete solution 'cause I was too lazy at the time. Anyway, cmonkey kinda summed it up.

As for the firewall, look at the online docs for netfilter for how to set up the iptables rules you'll want. Basically, since this will be a backup machine only, you'll want to REJECT (nice to the client) or DROP (makes the client time out) all incoming packets except those from the local interface (lo), established connections, and the ports of services you want to reach (SSH is port 22 by default). Make sure you get the firewall rules worked out BEFORE you put the machine anywhere not easily reachable, so you don't lock yourself out by mistake.
posted by thebabelfish at 11:31 AM on February 27, 2005
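That lockdown policy fits in a handful of iptables rules; a hypothetical sketch (run as root, and it assumes SSH is on its default port 22):

```shell
# Hypothetical firewall sketch for a backup-only box (requires root).
# Default-deny inbound, then allow loopback, established traffic, and SSH.
iptables -P INPUT DROP
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j ACCEPT

# Apply and test these from the console BEFORE colocating the machine;
# one typo in the SSH rule will lock you out remotely.
```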


rsync to an offsite disk is nice. But what kind of bandwidth does it need, in practice? I've only got 256kbits/sec upstream.
posted by Nelson at 1:30 PM on February 27, 2005


That just means it will take longer to push the same data through. I think rsync also has a nice compression option, which compresses the data for transfer and expands it back to normal on the other end; that should help.

Here's the relevant bit from the manual:

-z, --compress
With this option, rsync compresses any data from the files that
it sends to the destination machine. This option is useful on
slow links. The compression method used is the same method that
gzip uses.

Note that this option typically achieves better compression
ratios than can be achieved by using a compressing remote shell,
or a compressing transport, as it takes advantage of the
implicit information sent for matching data blocks.
posted by kjell at 2:04 PM on February 27, 2005


My version of rsync also has --bwlimit, to prevent you from blowing out your ADSL upstream. So we can compress and control, but is there enough bandwidth for me to actually sync my stuff? Guess it can't hurt to try.
posted by Nelson at 2:21 PM on February 27, 2005


This thread is closed to new comments.