Maybe it's time to start backing up...
October 11, 2007 8:20 PM

Help me figure out the world of computer backups.

We're slowly moving our backups from "oh, I guess it's time to drag the My Documents folder to the external hard drive" to something more sophisticated.

Ideally we'd like to make images of people's laptops and store them somewhere. I know that we can do this with various software, but to me this process becomes tedious. When you use something like Acronis or any other image backup software, you have to install it on each machine, and then manually tell it to back up.

Ideally, we'd like it to be done over the network. The workflow I envision is a small client-side program that runs in the background on the target computer. The program checks in with a server daily, and every two weeks or so makes an incremental image of everything that's changed.

We're all connected through gigabit Ethernet, and we're willing to commit the resources (i.e., $$) to making it work, but I haven't been able to find a program that does all this.

I've looked into Norton Ghost, but their website is a mess and they've segmented their product in so many different ways that I think their marketing department needs an overhaul.

Is Norton Ghost what I need? Is there a simpler, better workflow? We want something that will minimize human involvement. Shared network drives don't work for us because we want to back up whole systems, not just user-chosen files.

We're willing to spend money (to a point...), and I know that hard drive backups aren't real backups, but once we have all the computer images on the server, we'll back them up to tape.
posted by unexpected to Computers & Internet (23 answers total) 8 users marked this as a favorite
 
Retrospect?
posted by trevyn at 8:45 PM on October 11, 2007


Response by poster: Interesting. Do you know what version of Retrospect would provide this functionality? Their product is also market segmented to pieces.
posted by unexpected at 8:53 PM on October 11, 2007


I use and love SyncBack SE.

Its backups are file-level incremental, so unchanged files don't get sent across the network. It also gives you options for pure backup or synchronization (either two-way or one-way).

Once you get it set up, you never have to think about it again.
posted by toomuchpete at 9:42 PM on October 11, 2007


Retrospect 7.5 Multi-Server does this for me; I think it would do exactly what you're looking for too. It does fully automated HD and tape snapshots, and cross-platform support is good too (we have a mixed Windows, Mac and Linux environment).

It's a bit of a pain that the server-side component is Windows Server only, but the first time you do a full hard drive restore when a hard drive dies, and it results in something you can boot into that looks exactly the way it was yesterday before the HD death, you will be amazed.

You'll need a big, big, big RAID-5 array for this.
posted by eschatfische at 9:45 PM on October 11, 2007


Response by poster: eschatfische...interesting.

We're estimating that we're going to have about 15 terabytes of data to back up. Does Retrospect have an option to talk to other "host machines" so that we can hold this much data?

I'm also looking at something called NovaNET by NovaStor... anyone used that? Is Retrospect pretty much the gold standard?
posted by unexpected at 9:49 PM on October 11, 2007


Two more things to add:

SyncBack SE will also let you do your back-ups to an FTP server and purchasing one license allows you to use the program on up to five computers, which is really nice.
posted by toomuchpete at 9:50 PM on October 11, 2007


Look into running rsync via Cygwin. It works very well for me and my development laptop, but it might be a little more difficult in your situation.
posted by yellowbkpk at 10:00 PM on October 11, 2007


Oh, and rsync is open source and free. Once you set it up on one machine, it should be very easy to set up on other machines.
posted by yellowbkpk at 10:01 PM on October 11, 2007


Response by poster: toomuchpete, does SyncBack do images or does it only back up specific files?
posted by unexpected at 10:07 PM on October 11, 2007


It depends on your specific requirements, but most companies don't take images of everything; they standardize on a few particular configurations and keep a couple of standard images. That way you only need to back up user data, and there's tons of options for that. A backup agent running on the clients pushes changed data to the backup server, which also controls the storing of data onto your array/filer/SAN/whatever. That's the way at least Legato, Tivoli Storage Manager, and Veritas NetBackup do it, and probably many others.

Regarding the "hard drive backups aren't real backups" part, read this.
posted by dhoe at 10:27 PM on October 11, 2007


Response by poster: See, the reason we want to take images is that users keep data in random places. It's obvious that you back up My Documents and the Desktop, but how do backup programs account for the user who stores files in C:\Program Files\Skype\Critical Files that omg I can't delete?
posted by unexpected at 10:33 PM on October 11, 2007


Microsoft's SyncToy is free (if you have windows), can do incremental backups, and can be scheduled. No drive imaging, though.
posted by alexei at 10:44 PM on October 11, 2007


I have successfully restored Windows boxes from bare metal using a whole-of-system backup made with the much-derided Windows Backup that ships with Windows. It does actually work. In fact it works better to network shares or backup disk drives than it does to tape. You can easily set it to do scheduled backups, including incremental ones if you want. The only downside is that all the backups go into a single .bkf file that only Windows Backup understands, and the user interface is a bit crap. But restoring individual files out of a .bkf is at least as easy as restoring them from an image. Windows Backup is a perfectly good first move away from dragging My Documents.
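You can set the job up from the wizard, or as a one-line command dropped into Task Scheduler; something along these lines, where the share and job names are made up for illustration:

```shell
REM Sketch of a scheduled Windows Backup (ntbackup) job.
REM The server share, filename, and job name are hypothetical.
ntbackup backup C:\ /j "Weekly full" /f "\\server\backups\laptop01.bkf" /m normal

REM For the in-between runs, /m incremental backs up only files
REM changed since the last backup.
```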
posted by flabdablet at 11:16 PM on October 11, 2007


Dealing with users who insist on keeping important stuff in places other than subfolders of My Documents is best done by simply creating company IT policy that says exactly which folders will get backed up and how often, and then inflicting the pain of random data loss on people who still refuse to move their stuff into those places. There is really very little point spending (easily) ten times as much as you need to on backup storage, only to fill 90% of it with identical copies of Windows and all your installed software.

At my school site, users have My Documents redirected to folders on the network server. They also have roaming profiles, so stuff that goes in Application Data also ends up on the server. The server is regularly backed up in two ways. One scheduled task runs Robocopy to mirror everything except in-use system files to an external drive every night (that drive is formatted NTFS with compression, has Volume Shadow Copy turned on, and is shared read-only on the network, allowing users to retrieve any of their last two months of nightly backups). A second scheduled task uses Windows Backup to do a full backup of the whole server every Sunday, onto a second external drive that I swap out and take home with me when I come in on the Monday.
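The nightly Robocopy task is essentially a one-liner; a sketch of that kind of job, with source, destination, and log paths invented for illustration, looks like this:

```shell
REM Mirror the share tree to the backup drive every night.
REM /MIR mirrors (copies new/changed files, deletes removed ones),
REM /R:1 /W:1 keeps retries on in-use files short,
REM /LOG+ appends to a log file you can review in the morning.
robocopy D:\Shares E:\NightlyMirror /MIR /R:1 /W:1 /LOG+:C:\Logs\nightly.log
```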

Workstations don't get regular backups. For each make and model of workstation, I have a disk image (made with my own tools running on the Trinity Rescue Kit) that can plaster an image of the basic Windows install onto a machine in about five minutes; there's a domain-wide startup script that does unattended installs for any missing software (this takes almost an hour to run if it's all missing). So when a workstation dies, it costs me about ten minutes work to fix.

I'm pretty happy with this setup.
posted by flabdablet at 11:38 PM on October 11, 2007 [1 favorite]


Response by poster: flabdablet,

That's a good setup, and I can see how it works for you at a school, where you don't really care about their data: their data is their business, and if they lose it, it's their problem.

However, if a user loses data on their laptop, it quickly becomes my problem, with my ass on the line. I wish I could inflict random user data loss to users, but I value my job too highly to do that ;-).

Right now I'm leaning towards pitching Retrospect, but can anyone tell me how it handles a lot of storage? Can you have multiple slaves off of one master server to store to, or do you need separate licenses for each one?
posted by unexpected at 6:12 AM on October 12, 2007


We're estimating that we're going to have about 15 terabytes of data to backup. Does Retrospect have an option to talk to other "host machines" so that we can hold this much data?

Yep. Our main storage array is an external iSCSI array on a private network; you can also mount NASes via SMB/Windows File Sharing.
posted by eschatfische at 7:00 AM on October 12, 2007


I want to second dhoe here. Maintaining images is a huge undertaking, and the law of diminishing returns kicks in fast. Instead of backing up a gig or so per user, you're going to back up 10-20 gigs. On top of that, if your users have media files on there, like music, you're going to have massive backups.

Instead, you should target a folder (My Documents or whatever) or map a server share for them to use. You'll have manageable backups then, and it'll be easier for you to hand those backups off for off-site holding too.
posted by damn dirty ape at 8:01 AM on October 12, 2007


You are planning on keeping some backups off-site, aren't you? A lot of good your backups will do you when your office catches on fire.
posted by damn dirty ape at 8:30 AM on October 12, 2007


Response by poster: I've already expounded on why we don't want to target just a specific folder. This is our current practice and it works terribly for us. YMMV.

Ideally, we'd like to keep off-site backups... but baby steps. I've noticed that you can spend $200 on backup or $200,000. You've just got to pick something that fits your needs :-)
posted by unexpected at 9:18 AM on October 12, 2007


For your software, try Second Copy.
Runs in the background as often as you schedule it, looks for any new/changed files, copies them to some other place. There's probably something newer/sexier/cheaper, but this works.

What you use as your "other place" is up to you, and depends on how much and how often you want to back up. Try something from the Snap! line of network-attached storage devices.

I'd reconsider your ideas about having to be able to instantly restore any of the machines from an image. Disk images are machine-specific in Windows: if the machine itself is lost/stolen/broken, you can't just slap that image onto some other computer (Macs can do this, though). Also, you're never going to be able to protect yourself from other people's "mistakes".

The reason there are servers and network-attached storage devices and policies like flabdablet's is that IT professionals know there is no way to manage every user on every workstation all the time. What you have to manage is the data they work with. Redirecting all users to save to a single server/storage device, both in office policy and in software settings, means that you are responsible for maintaining, backing up, and providing access to that shared device, and only that. If an individual computer goes down, it can be replaced easily, since all it needs is access to the shared resources. If you think you can live a life that involves caring for a dozen or more individual local C: drives, and anything that a user might do to them, you're going to be miserable and full of hate until you burn out and quit.
posted by bartleby at 5:27 PM on October 12, 2007


However, if a user loses data on their laptop, it quickly becomes my problem, with my ass on the line. I wish I could inflict random user data loss to users, but I value my job too highly to do that ;-)

That's why the first step I recommended was creation of company IT policy. This needs to be signed off on from above, and it needs to articulate clearly who is responsible for the integrity of data in which locations.

It is absolutely unfeasible for any one person to be responsible for preventing any data loss on a fleet of laptops. Those laptop users need to be responsible for data they leave lying around in places other than those that company IT policy says that you, as sysadmin, are responsible for.

As a good sysadmin, it would of course be sensible of you to come up with standard ways of installing software on those laptops such that by default, what you install puts its stuff in places that you can and do back up. But you can't control everything your laptop users do, and if somebody installs something on their own laptop and/or keeps important data in a location that's not within your safety zone, then company policy needs to say that loss of that information puts their arse on the line, not yours.

If your company insists on you taking full responsibility for potential data loss that really isn't within your control, then the company has given you a psychologically impossible task, and that's an OH&S issue.

If you want to pursue the imaging road regardless of all the above, then what I would recommend is supplying each laptop user with an external USB2 hard drive containing a minimal Linux distro on the boot partition. Booting off this drive would automatically run a script that makes a bit-identical copy of all the laptop's hard drive partitions onto similarly-sized partitions on the external drive, then shuts the computer down. For a typical laptop hard disk, this should take about half an hour to run. Train your users how to boot their machines off these things and get them doing that routinely - maybe before leaving the office at night, or just before going to lunch, or whenever.
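The core of such a script, under the assumption that the internal disk shows up as /dev/sda and the external drive carries matching backup partitions at /dev/sdb2 and /dev/sdb3 (device names will vary and must be checked against the real hardware), is little more than:

```shell
#!/bin/sh
# Sketch of the boot-and-mirror script described above. Device names
# are assumptions; get them wrong and dd will happily destroy data.
set -e

# Bit-identical copy of each internal partition onto its matching
# partition on the external backup drive.
dd if=/dev/sda1 of=/dev/sdb2 bs=4M
dd if=/dev/sda2 of=/dev/sdb3 bs=4M

# Flush writes to the external drive, then shut the machine down.
sync
poweroff
```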

Keep a little fleet of these things offsite and regularly swap them over.

A bare-metal restore onto another identical laptop then becomes a very quick and easy thing to do, as does restoring individual files (just plug the external drive in and access the mirrored partitions as ordinary Windows drives). I think you'd find it very very unlikely that a given laptop and its most recent backup drive would both fail at once, and almost inconceivable that it and all its backup drives fail at once. Drives cost less than tape cartridges per gigabyte these days, and are also more reliable, so you can buy more of them and get more redundancy for a given budget.

If that sounds appealing, you shouldn't have much trouble finding a friendly local Linux geek to set you up with a suitable distro and the required scripts.
posted by flabdablet at 6:23 PM on October 12, 2007


I was fired from a job because an attorney's laptop went belly-up. Even though the machine was causing him problems, and I'd pestered him for weeks to let me have the machine for two days to back it up, wipe the machine, and rebuild it, he couldn't find the time. Then, when the machine died and I spent a week rebuilding it for him, he pressured management into firing me.

All that prelude is leading up to the fact that, as a law firm, we had made it policy to keep documents on the server (easily checked out to travelling laptop users) and to make the users responsible for all other laptop data (including Outlook archive and PST files). We gave them the tools and the training to keep their data safe, but it was up to them to utilize it. Doing anything else was officially considered an unreasonable burden on the minuscule IT department. It was a policy that worked well for most of the nine years I was there and is still in place today, over a year after my firing.
posted by lhauser at 12:46 PM on October 13, 2007


Oy, oy, oy. Sounds more to me like you were fired from a job because you were working for wilfully ignorant pricks.

If you find yourself working for people like that again, be aware that the mere existence of policy won't necessarily protect you. Policy is good underpants, but you will need to put them on by yourself.

When the attorney with the flaky laptop refused to let you fix it, did you immediately go to management and make sure that this fact was noted in writing?

Also, did you give this attorney the option of borrowing the machine for one hour just to make a disk image?
posted by flabdablet at 7:09 PM on October 13, 2007

