Comprehensive hard drive imaging
April 6, 2009 1:52 PM

Is there a fairly comprehensive method I can use to image a hard-drive quickly, in case I end up needing that backup image later?

Essentially it breaks down like this: I'd like to open a small repair shop, and I'm confident that on a regular customer's computer the chances of me accidentally borking something important whilst attempting a fix, are low. All the same, I've been using some drawn out and inefficient backup procedures to ensure that if everything goes to hell, I can get everything back to roughly the same place it was when they brought in their computer.

My question is, is there a "best" way to do this? I want to be able to take a customer's machine (PC or Mac) and quickly image the whole drive, so even if I just spend an hour deleting random files for fun, I can go back and restore the state of the drive as it was when they came in. It can be a hardware solution, a software program, anything.

Thanks in advance for any and all suggestions.
posted by geodave to Computers & Internet (13 answers total) 8 users marked this as a favorite
 
For Mac, you want SuperDuper! or Carbon Copy Cloner. Though if you don't know that, query whether you are qualified to open a small repair shop.
posted by raf at 1:56 PM on April 6, 2009


Best answer: Use the old Unix utility dd; it copies at the block level, which makes it perfect for imaging a drive regardless of OS.

Just boot a live CD, identify the drive you want to back up, e.g. /dev/sda, and then run "dd if=/dev/sda of=/backups/sda_machinex.dd.img".

To restore, reverse it: boot Knoppix and run "dd if=/backups/sda_machinex.dd.img of=/dev/sda".
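
Fleshed out a little, the whole round trip might look like this from a Knoppix shell (device names and paths here are examples, not gospel; run fdisk -l first so you're sure which disk is which):

# list the disks so you image the right one
fdisk -l
# mount your backup storage somewhere the live CD can see it
mkdir -p /backups && mount /dev/sdb1 /backups
# image the customer drive, block for block
dd if=/dev/sda of=/backups/sda_machinex.dd.img
# restore later by swapping if= and of=
dd if=/backups/sda_machinex.dd.img of=/dev/sda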
posted by zentrification at 2:02 PM on April 6, 2009 [2 favorites]


Response by poster: I had been using SuperDuper! for Mac drives, but I was hoping for a single workstation that would do nothing but image customer drives, so every computer that comes through gets processed before work begins. My hope was to get a single setup that works for any operating system, and dd is perfect.

Thanks z.
posted by geodave at 2:11 PM on April 6, 2009


The fastest and most reliable way to do this is to set up a centralized netboot server that lets you load a custom repair OS (Mac or PC) and image the contents of the drive over the network to a central server, without ever booting from the machine's own OS disk. Ideally that server has a good storage array and its own tape backup solution.

Repair shops I worked with in the past used to offer that as a flat-rate backup charge to the customer ($90), on top of their hourly charge. If it was just a hardware repair and the user had their own data backed up, they would sometimes waive it, but more often than not that would come back to bite them. An OEM warranty repair usually doesn't cover restoring the old data, so if you send a MacBook back to Apple and they decide to replace the drive, you don't get the old one back, and they won't back up your data.

DeployStudio is a solid framework developed for the Mac side of things. There are lightweight USB/CD boot discs floating around on the internet that work for PCs and include a network stack, so you can connect to a fileserver to back up the drive; but because there is such a huge variety of PC hardware, there may not be as simple a solution as there is for the Mac.

And may I suggest you spend some time working in an existing repair shop first, to get an idea of how they track work orders, computer status, customer records, billing, etc. It looks easy when it's just a small handful of machines you're doing part-time, but once you look at the volume you have to bring in each month to pay rent, utilities, power, and support equipment, you'll have enough work in the pipeline to need a central database to manage the work-order inventory. Nothing will kill a repair shop's rep faster than a machine walking off or a customer losing their data.
posted by mrzarquon at 2:11 PM on April 6, 2009 [1 favorite]


Using dd, you can make it go faster by increasing the block size with the bs parameter. For example:

"dd if=/dev/sda of=/backups/sda_machinex.dd.img bs=256K"

should be much faster. The downside is that a single read error can take out a whole 256K block rather than just one sector, so you need to keep an eye out for error messages to be sure it copied correctly. In fact, you need to do that anyway.
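
If dd does hit a bad sector it aborts by default; a variant along these lines (same example paths as above) keeps going, at the cost of zero-filling anything it couldn't read:

# noerror keeps going past read failures; sync pads each failed block with zeros
dd if=/dev/sda of=/backups/sda_machinex.dd.img bs=256K conv=noerror,sync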

There is also dcfldd, which will take one or more hashes at the same time the copy is made. You can then also hash the original drive to make sure the copy was bit-for-bit correct.
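
A sketch of that, using dcfldd's hash= and hashlog= options (paths are examples as before):

# image and log an MD5 of everything read from the disk, in one pass
dcfldd if=/dev/sda of=/backups/sda_machinex.dd.img bs=256k hash=md5 hashlog=/backups/sda_machinex.md5
# then hash the image and compare it to the logged value
md5sum /backups/sda_machinex.dd.img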
posted by procrastination at 2:25 PM on April 6, 2009


FWIW, DeployStudio can also be used to create images for PCs (Windows). The options are not as robust as on the Mac side, but it works. OTOH, it was originally developed in French, so the documentation can be lacking, though it's been getting better. And it's free.
posted by jmd82 at 2:31 PM on April 6, 2009


Ah, I didn't quite understand that you wanted one platform-independent solution; that makes more sense.
posted by raf at 2:48 PM on April 6, 2009


Quickly? Not dd. Acronis and Ghost are much faster, but cost money. Not to mention dd will sit there and copy the empty parts of the disk: a 500GB disk with 100GB of data == a 500GB backup. Ghost, Acronis, or DriveImageXML just copy the 100GB. Acronis and DriveImageXML will use the VSS service, so you don't even need to boot from a disc; just run them while XP or better is running.

Want free? Try DriveImageXML.
posted by damn dirty ape at 3:03 PM on April 6, 2009


I've been using Macrium Reflect on Vista, and it works great. You can run it without rebooting, and you can explore the contents of the image file to get at specific files. Actually, I can only attest to the fact that it backs up well -- I've never needed it to restore.
posted by Simon Barclay at 4:33 PM on April 6, 2009


Best answer: Instead of using dd, consider GNU ddrescue. Does essentially the same job as dd, but faster by default since there's no need to increase the default buffer size, gives you a progress report as it goes, and handles bad sectors intelligently: skips them on first pass, then comes back to them and rereads the hell out of them trying for a good copy. Also generates a log file showing you exactly where any bad sectors are. Also, being a Linux tool, runs on anything. Included in the Trinity Rescue Kit live CD. Installable into Debian or Ubuntu with sudo apt-get install gddrescue.

Like dd, it makes absolutely no assumptions about the data it's copying; specifically, it doesn't care what (if any) filesystem(s) are in use. That does make it slower than filesystem-aware tools like Acronis and ntfsclone and DriveImageXML and whatnot, but also means you can rely on it to restore a customer's drive to the exact block-for-block state of breakage it was in when you first started fiddling with it, or even restore that exact state of breakage onto an entirely new drive if replacing the original is required.
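
A typical two-pass run might look something like this (names are examples; the log file is what lets later passes pick up where earlier ones left off):

# first pass: grab everything that reads cleanly, skip the hard areas (-n)
ddrescue -n /dev/sda /backups/sda_machinex.img /backups/sda_machinex.log
# second pass: retry just the bad areas, up to three times each
ddrescue -r3 /dev/sda /backups/sda_machinex.img /backups/sda_machinex.log
# restore onto the original or a replacement drive; -f is required when
# the destination is a device rather than a file
ddrescue -f /backups/sda_machinex.img /dev/sda /backups/restore.log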
posted by flabdablet's sock puppet at 4:45 PM on April 6, 2009


1- I like Ghost, and have used it for many, many years for this kind of thing. Couldn't be simpler.
2- You might want to check into the legality/ethics of storing images of customers' PCs. Even if it's a fresh install with no data present, it might not be kosher to have all that stuff lying around.
3- For that scenario, you might want to look into a backup solution that eliminates duplicate files in the images that you store. I know Windows Home Server has that capability, as well as the capability to do incremental backups and to reimage a PC on demand. So the technology exists, but I doubt that's the right product for your scenario.
posted by gjc at 7:42 PM on April 6, 2009


Meh meh compression: pipe the output of your dd command through gzip and you save about the same space. "dd if=/dev/sda | gzip > sda.img.dd.gz", and to uncompress, "gunzip -c sda.img.dd.gz | dd of=/dev/sda".
posted by zentrification at 4:15 PM on April 7, 2009


I think the people recommending filesystem-aware backup solutions are missing an important point. Customer machines arrive at repair shops in assorted states of bustedness. Sometimes they have corrupted filesystems and/or busted partition tables, and will require the use of data recovery tools to fix. Sometimes data recovery tools can break things even worse than they were broken originally; sometimes you need to use several, and each of them will want to see the disk in its original broken state, rather than as "fixed" by a previous tool.

Having a complete block-for-block backup allows you to go back to square 1 and start again. A backup made using a filesystem-aware copier like ntfsclone or True Image will only get you copies of the blocks that the filesystem marked as Used; a file-based copier like Ghost will only get you copies of intact files. Neither of these is much use for rolling back a partially successful data recovery session.

Compression is a bit of a mixed blessing. Even with a fairly snappy CPU, making a compressed image is a lot slower than making an uncompressed one, and the resulting image is harder to loop-mount. In fact, for maximum flexibility it's probably best to make uncompressed images directly to another whole disk rather than to a file in a backup filesystem (compressed or not). 1.5TB drives cost $130 at newegg. Owning as many of those as there are customer jobs in the pipeline is not going to break the repair shop bank.
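
For the record, pulling individual files out of an uncompressed image by loop-mounting it goes something like this (the offset assumes the first partition starts at the traditional sector 63; check the actual start sector in the fdisk output):

# find where each partition starts inside the image (in sectors)
fdisk -lu sda_machinex.dd.img
# mount the first partition read-only; offset = start sector * 512 bytes
mount -o loop,ro,offset=$((63*512)) sda_machinex.dd.img /mnt/customer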

Anybody who does need a filesystem-aware whole-disk-to-compressed-files imaging solution is welcome to play with the save-image and load-image scripts I wrote for my own use. I use them with the Trinity Rescue Kit and Ubuntu, and they suit me very well.
posted by flabdablet's sock puppet at 5:03 AM on April 8, 2009 [1 favorite]

