Could turning on disk compression actually speed up disk access?
November 25, 2006 1:27 AM Subscribe
Could turning on hard drive compression actually speed up access on my laptop?
I have an ultraportable laptop. It's got a decent processor and enough RAM, but it has one of those tiny 80GB Toshiba iPod hard drives in it, and the speed, both access and transfer, is awful. (Plenty good for an iPod, horrible for real computing.)
I run a few applications that access the hard drive quite a bit, either loading or writing a decent amount of data. When I look at the task manager, my HD light is on solid while my CPU usage is around 15%. I'd, of course, love to speed up access.
So my thinking goes like this: If I click the "Compress drive to save disk space" it will put a little more data in every sector, so each seek and read should be a little more efficient. Since my CPU is near idle anyway it could use those extra cycles to process the compression.
So the question: Is this assumption true?
Are there other drawbacks to compressed NTFS volumes? (Like being more likely to go corrupt, harder to recover corrupted files, fragmentation, etc?)
(And I'll throw in an extra related Q: Is there a way to tell Windows XP Pro to try to fill up more real RAM before it starts paging?)
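One way to sanity-check the idea before flipping the checkbox: sample some files and see how well they actually deflate. (NTFS uses an LZ77-family compressor; zlib here is just a stand-in, so the ratios are indicative rather than exact, and `survey` is a hypothetical helper, not a Windows tool.)

```python
import os
import zlib

def compression_ratio(path, sample_bytes=1 << 20):
    """Compress the first chunk of a file and report compressed/original size."""
    with open(path, "rb") as f:
        data = f.read(sample_bytes)
    if not data:
        return 1.0
    return len(zlib.compress(data, 6)) / len(data)

def survey(root):
    """Walk a directory tree and return the average compression ratio."""
    ratios = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                ratios.append(compression_ratio(os.path.join(dirpath, name)))
            except OSError:
                continue  # skip unreadable files
    return sum(ratios) / len(ratios) if ratios else 1.0
```

If the average comes out well below 1.0 on your working set, compression has something to work with; near 1.0 (media, archives) it would only burn CPU.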
The decompression/compression will just take up more time, even if your CPU is not straining.
Upgrade the memory and/or the hard drive.
posted by stavrogin at 2:53 AM on November 25, 2006
Probably not; the amount of compression you will get for most things will be trivial. Applications themselves compress poorly, as do most things that come in large files: audio, video, pictures. (Well, those things do compress well, but the compression is usually built into the file type, e.g. an MP3 is already compressed.)
It appears that 1.8 inch hard drives are only offered at 4200 RPM, so I doubt you'll be able to pick up a faster drive. Your best bet is to buy more memory.
Windows does use every bit of memory. This article describes it fairly well. The more memory you have, the more windows can keep off the drive, and the faster things will load.
posted by borkencode at 4:41 AM on November 25, 2006
I agree that it depends on what you have to compress. The hard drive light being on permanently is something I suffered from until I upgraded the RAM.
If your files can be heavily compressed you may notice a difference. I used to have a party trick (for computer geek parties) of keeping a 400MB text file on a floppy disk. It was basically a file full of spaces, but when it decompressed it gave the impression of very high data transfer rates. (It also worked well as a hard drive filler when combined with a batch file.)
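The trick above is easy to reproduce at a smaller scale; a file of nothing but spaces is the best case for any LZ-family compressor (sizes here are scaled down from the 400MB original for a quick demo):

```python
import zlib

# A run of identical bytes is maximally redundant, so deflate shrinks it
# to roughly one token per back-reference window.
payload = b" " * (4 * 1024 * 1024)       # 4 MB of spaces
packed = zlib.compress(payload, 9)

# The compressed blob is on the order of 1000x smaller than the input.
print(len(payload), "->", len(packed))
```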
posted by tomble at 5:05 AM on November 25, 2006
Best answer: I say it really depends. Seek times tend to dominate everything, especially at 4200 RPM, so if there is heavy fragmentation it won't really matter whether the files are compressed or not. A good defragmentation strategy is definitely called for, like the latest Diskeeper that runs constantly in the background but using only idle cycles.
And as already mentioned, most media files are not compressible. However, I strongly disagree with "[a]pplications themselves compress poorly", as you can easily get 25% - 75% compression on .exe and .dll files. There is a reason that things like UPX exist and are very effective.
I used to have a party trick (for computer geek parties) of keeping a 400MB text file on a floppy disk. It was basically a file full of spaces, but when it decompressed it gave the impression of very high data transfer rates.
Bah, that's nothing. Here is a zip file that is 42374 bytes in size but results in a total extracted filesize of 4,503,599,626,321,920 bytes. This is accomplished by nesting 16 zip files, each containing 16 zip files, etc. to a nesting depth of 5, with each leaf node containing a single 4.3GB file. Tip: Try sending this in email if you want to see if your email scanner is brain dead or not. Some email scanners that look for malware in attachments will try to recursively extract the contents of zip files to scan them, and if implemented naively, will attempt to extract all 4 petabytes. (But don't actually try this unless you admin the mail server and can deal with it blowing up.)
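The nesting construction is easy to sketch at a safe, tiny scale (fanout 4 and depth 3 instead of the real thing's 16 and 5, with 64 KB leaves instead of 4.3 GB — do not scale these up casually):

```python
import io
import zipfile

def nest(payload, fanout=4, depth=3):
    """Build a zip-of-zips: each level holds `fanout` copies of the level below."""
    blob = payload
    for level in range(depth):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
            for i in range(fanout):
                name = f"level{level}_copy{i}.zip" if level else f"leaf{i}.txt"
                z.writestr(name, blob)
        blob = buf.getvalue()
    return blob

bomb = nest(b"0" * 65536)
# Fully extracting the leaves yields 4**3 * 64 KB = 4 MB of data, from an
# archive far smaller than a single leaf, because the identical
# highly-compressible members each deflate to almost nothing.
print(len(bomb))
```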
posted by Rhomboid at 8:36 AM on November 25, 2006 [3 favorites]
No, it generally doesn't help much.
Seek time is 90% of I/O time. CPU is hundreds or thousands of times faster by comparison, so compression generally doesn't hurt much, either. On files that compress well enough to shrink N blocks down to fewer than N, you lower the probability of fragmentation and eliminate some seek time.
A good filesystem is the best way you can help, or more RAM if your OS allows for read caching of all files.
Linux + ext2/3 are very good about both. DOS/Windows + FAT is just about the worst. Windows NTFS is better.
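To put rough numbers on that trade-off, here's a back-of-the-envelope model (the throughput figures are illustrative guesses, not measurements, and overlap between disk and CPU work is ignored, so it's a pessimistic, serial estimate):

```python
# When does reading compressed data win? Deliver 1 MB of logical data by
# fetching `ratio` MB from disk, then decompressing 1 MB on the CPU.
disk_mb_s = 20.0   # sustained throughput of a slow 1.8" 4200 RPM drive (guess)
cpu_mb_s = 200.0   # decompression throughput (guess)
ratio = 0.6        # compressed size / original size

time_plain = 1.0 / disk_mb_s                       # 50 ms per MB
time_compressed = ratio / disk_mb_s + 1.0 / cpu_mb_s   # 30 ms disk + 5 ms CPU

print(f"plain: {time_plain*1000:.1f} ms/MB, "
      f"compressed: {time_compressed*1000:.1f} ms/MB")
```

With these numbers compression wins on throughput; with incompressible data (ratio near 1.0) the disk term doesn't shrink and the CPU term is pure overhead.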
posted by cmiller at 11:27 AM on November 25, 2006
Response by poster: Just to follow up:
I should have mentioned I have a gig and a half of RAM in my machine, which is the maximum it can take, so buying more RAM is out of the question.
And yeah, I can't change the drive because it's the best performing one in that form factor.
I'm also not using compressed media files on the drive for the most part.
I didn't think I needed to mention that changing my OS is not an option. That suggestion has become a very tiresome bit of propaganda.
Anyway...
Since no one said that compression was at all dangerous I tried it. Turns out it had virtually no effect on speed (looks like gains and losses canceled out). Though I now have around 20% more space on the drive, so I'll keep it.
I usually keep it pretty defragged, but running Diskeeper instead of Windows defrag actually sped access up around 15%. Not spectacular, but noticeable. So for now I'll run with that.
Still worth it to have a 2 pound laptop that has 6 hours of battery life.
posted by Ookseer at 9:40 AM on November 26, 2006
Another option is to get a high-capacity USB flash drive. I do a lot of heavy audio on my studio PC, and when I have too many parts running my 7200 RPM drives are not enough. I then bounce a few of the heavier tracks (many files, lots of edits, etc.) onto a flash drive. Really helps get over a temporary performance issue, but capacity makes it practical for only a few parts, and it's basically temporary storage. I would kill (but not pay) for a 16-32 GB internal flash drive.
posted by cmicali at 12:38 PM on November 26, 2006
I would kill (but not pay) for a 16-32 gb internal flash drive.
I don't see why you would think it would be that expensive. You can get an 8GB USB flash drive for $135, and CF cards are currently around $65 for 4GB. With a Linux or BSD system you could probably create a nice JBOD volume to span two or more of these into one volume, so you're looking at about $300 max for 16GB of solid-state storage.
I wouldn't assume it would be tremendously faster though. Certainly flash wins on seek time, but current hard drives have higher sustained throughput. Therefore if your workload is highly sequential or streaming in nature it would not necessarily benefit.
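On Linux, the JBOD spanning mentioned above could be done with mdadm's linear mode; this is a sketch only (the device names are placeholders for whatever your flash devices actually enumerate as, and the commands need root):

```shell
# Concatenate two flash devices into one linear (JBOD) md volume.
# /dev/sdb and /dev/sdc are placeholders -- substitute your real devices.
mdadm --create /dev/md0 --level=linear --raid-devices=2 /dev/sdb /dev/sdc
mkfs.ext3 /dev/md0
mount /dev/md0 /mnt/flash
```

Linear mode just appends capacity with no striping, so a single device failure only loses the files stored on that device's region rather than the whole array's readability being guaranteed; treat it as scratch space.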
posted by Rhomboid at 4:42 AM on November 27, 2006
Oops, I linked to SD when I typed CF. But you get the idea.
posted by Rhomboid at 4:44 AM on November 27, 2006
You do get noticeable advantages if an app is written to read app-specific compressed files on disk that are laid out well. This is a key technique in making console games work right.
posted by Nelson at 1:40 AM on November 25, 2006
This thread is closed to new comments.