WTF is filling up my hard drive?
November 23, 2019 12:24 PM
My 2018 MacBook Pro's disk is apparently full. But I can only find files that comprise a fraction of the total when I look at everything on the hard drive. Where else can I look? Any thoughts on what could be driving it? What am I missing? (screenshots)
Note: I would STRONGLY prefer not to install any third-party apps to clean this up, unless it's literally my only choice.
This happened to me, and I found it was Time Machine.
posted by gregoreo at 12:29 PM on November 23, 2019 [1 favorite]
Your resistance to third party utilities is futile. Apple doesn't have a utility that makes pinpointing space hogs easy (and to be fair, Microsoft doesn't either). Disk Inventory X is a good, free option if you're running a version of OS X that pre-dates Catalina (I just tried it on Catalina, and it doesn't work). Otherwise, DaisyDisk (has a free trial, can be obtained as part of a subscription to SetApp, or purchased for $10 USD) is excellent for identifying space hogs for removal.
You can also use the free utility, Monolingual, to remove several gigabytes of data related to languages that you will never use. I don't know if this works on OS X Catalina, though.
posted by BrandonW at 12:35 PM on November 23, 2019 [2 favorites]
Do you use Time Machine with an external hard drive, and have you plugged that drive in and done a backup recently? Time Machine will basically keep deleted files "alive" until you connect your backup drive and run a backup. That can chew up a lot of space.
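If you want to check whether Time Machine's local snapshots are what's eating the space, Terminal can tell you without installing anything. This is just a sketch, and the tmutil snapshot subcommands are only available on High Sierra and later, as far as I know:
tmutil listlocalsnapshots /   # lists any local Time Machine snapshots on the boot volume
tmutil thinlocalsnapshots / 999999999999 4   # asks macOS to purge them; the byte count is just a large placeholder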
Otherwise, seconding Disk Inventory X to track it down (and I hope they update it for Catalina soon).
posted by parm at 12:46 PM on November 23, 2019
The (free, reliable) third party software Grand Perspective is great for visualizing what's taking up disk space.
posted by tapir-whorf at 12:46 PM on November 23, 2019 [4 favorites]
What does the legend you've not included in your screenshot identify the large gray portion as?
posted by humboldt32 at 12:52 PM on November 23, 2019
Photos/iPhoto "Trash" files, iTunes backups, and Lightroom caches would not be part of the 413 GB "Other" section shown in the OP's screenshot.
I have seen this exact symptom when the document versioning system flips out. There was a period of time where some apps (notably Sketch) were autosaving after almost every keystroke, keeping endless previous versions of files, and failing to purge old versions properly. The invisible folder that stores autosaves/versioning (/.DocumentRevisions-V100) is one of the items that is part of the "Other" category, as are Local Snapshots (the Time Machine feature gregoreo mentioned).
It isn't really possible to fix this without either installing 3rd party software (like Sketch Cache Cleaner or a similar utility for other apps) or diving into Terminal (as in this StackExchange explanation, which I wouldn't recommend to someone who isn't 100% comfortable on the command line).
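(If you do end up in Terminal anyway, a quick read-only way to see whether that versions folder is the culprit would be something along these lines; the path assumes the standard location of the hidden versions database:
sudo du -sh /.DocumentRevisions-V100   # prints the total size of the hidden document-versions folder
That only measures it; actually clearing it out safely is where the StackExchange instructions come in.)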
posted by bcwinters at 12:53 PM on November 23, 2019 [1 favorite]
Response by poster: Sorry, should have added a few more details:
-I don't use time machine & never have on this computer
-This is a work-owned computer, so I'm well aware that there may be additional things installed by my company that I can't see - though I do have full admin capabilities on the laptop, so is it even possible that they'd be completely obscured from view?
-I'm resistant to third-party apps not because I think they're not useful, but more because I'd have to run through a whole permissioning process to see if I could install them, and it's a massive pain. My company's tech support could theoretically help, but there's a whole call center process involved and it's often much faster & easier to DIY if at all possible. We do not have any sort of onsite IT person, unfortunately.
posted by mosst at 12:58 PM on November 23, 2019
Response by poster: And the large grey portion is, helpfully, "other".
posted by mosst at 1:00 PM on November 23, 2019
Best answer: Try opening a Terminal window and pasting in the following command line:
sudo du -k / | sort -rn >/tmp/report.txt
Sudo will first ask for your admin password. Type it in and press Enter; it won't echo anything at all for each character you type.
After entering the password, du will rattle your hard disk while appearing to do nothing for some time before the next command prompt appears. Once that's happened, find the system /tmp folder in the Finder and open report.txt with Textedit. You should see a list of every directory (folder) on your whole system, each one prefixed by a number showing the disk usage of all the files it contains in KiB (1024-byte) units, with the biggest ones sorted to the top.
Next, back in the Terminal window, type
df
and press Enter. This should show you actual amounts of space used on each in-use filesystem, and you should be able to match those up with the amounts that du showed you for some of the larger folders.
Never did trust those fancy newfangled GUI tools for this class of inquiry.
posted by flabdablet at 1:02 PM on November 23, 2019 [34 favorites]
Response by poster: Success thanks to flabdablet! Looks like I have 300+ GB of core dumps, for some reason (I /have/ crashed my computer a not-insignificant number of times recently). My understanding is that they are safe to delete, so I went ahead and did so, fingers crossed I didn't screw anything up further!
posted by mosst at 1:17 PM on November 23, 2019 [8 favorites]
Best answer: If you have core files (and if Mac is sufficiently unix/linux-like, which I think it is) you can probably use the "file" command on the core files to tell which program made them, since if they accumulated before, they probably will again. It would be a big clue as to why this is happening, if that's something you care about (you might not).
Example from linux, but should work similarly:
$ file core
core: ELF 64-bit LSB core file x86-64, version 1 (SYSV), SVR4-style, from './testcore'
In the above example the program that made the core is "testcore".
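(On a Mac, the files to inspect would be whatever actually shows up in /cores, along the lines of:
file /cores/core.12345   # the filename here is just a placeholder; use whatever "ls /cores" lists
Whether macOS's version of file reports the originating program the way the Linux example does, I'm not certain.)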
posted by smcameron at 6:16 AM on November 24, 2019 [1 favorite]
Best answer: On macOS, core files are stored in /cores. If you aren't debugging programs using lldb, you can safely delete any files you find in that directory.
See StackExchange and Stack Overflow for details of the ulimit command, which is what enables and disables coredumps. VirtualBox has a comment in their documentation which may prove helpful as well.
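For example, to see whether core dumps are currently enabled in your shell, and how much room the existing ones are taking, something like this should do it (the glob in the last line is illustrative and assumes the usual core.<pid> naming):
ulimit -c            # current core-file size limit; 0 means dumps are disabled, "unlimited" means they're not
sudo du -sh /cores   # total space used by existing dumps
sudo rm /cores/core.*   # delete them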
posted by blob at 8:01 AM on November 24, 2019 [2 favorites]
A neighbor had a similar issue and after getting in touch with a high level Apple tech found out QuickTime was recording in the background. In case you need another lead.
The solution was:
sudo du -shx /Users/yourusernamehere/* | grep $'G\t' | sort -rn | head -n 10
posted by terrapin at 12:16 PM on November 24, 2019
Best answer: That's essentially the same solution as I already recommended, only fancier and faster and therefore more likely to miss stuff. It would not have found core dumps in /cores, for example.
When you're doing this kind of work, simple and slow and thorough is good; fancy and fast and assumption-making, not so much.
For those still following along at home, the similarities and differences between
sudo du -k / | sort -rn >/tmp/report.txt
and
sudo du -shx /Users/yourusernamehere/* | grep $'G\t' | sort -rn | head -n 10
are as follows:
Both forms rely on the du command to generate the disk usage information we're interested in. However, the first form runs it against the root of the entire Unix file hierarchy, named /, while the second runs it against a list consisting of all files and folders immediately contained within the user's own home folder.
Both forms use sudo to make du run with administrative privileges rather than those of the user account whose Terminal it's launched from. The first form does this so that du will be able to see inside folders owned by the system or other user accounts, in order not to miss anything. Not sure why the second form would need to do that, since everything inside /Users/yourusername/ would normally be completely readable to the yourusername account even without sudo's help.
du produces one line of text output for each file or folder it reports on. Each output line consists of a number representing the amount of disk space actually used by the file, then a tab character, then the pathname of the file, then a newline. For folders, the amount of disk space used includes that used by all files and folders inside the folder as well as by the folder itself.
The k option passed to du in the first form makes all the disk usage numbers appear as a pure number based on a consistent size unit, the kibibyte (1024 bytes). The second form uses the alternative h (for "human-readable") option, which makes the numbers appear with a B, K, M, G, T or P suffix for bytes, kibibytes, mebibytes, gibibytes, tebibytes and pebibytes respectively. Note that the typical software-enginerd insistence on misusing SI unit symbols to denote IEC binary multiples means that the numbers are not particularly human-meaningful, especially for the larger units.
The other du options that the second form includes and the first doesn't are x, which makes du ignore filesystems on secondary devices like USB sticks that might have been mounted on folders inside the ones it's looking at, and s, which makes it not report explicitly on the sizes of folders inside those it's being specifically asked about, instead producing only a summary usage number for each such folder. The overall effect is that the second form produces one line of du output for each file and folder contained immediately inside the user's home folder, and doesn't descend into subfolders to produce detailed usage output for their subfolders.
Next, it filters that output through grep, using a filter that matches an uppercase G followed by a tab character. The filter will pass only those lines that contain a usage number with a G (gibibyte) suffix.
Both forms pass their disk usage output through sort, using options that make it sort that output into descending order by the first number found in each line, which will be the usage numbers that begin them.
Finally, the second form further filters the sorted output through head, which cuts off output delivered back into the Terminal window after the first 10 lines.
The first form, by contrast, produces a massive flood of unfiltered though sorted disk usage information that would be overwhelming and useless if simply allowed to scroll furiously by in the Terminal window. To prevent it from doing that, the > redirection operator diverts all of that output into /tmp/report.txt. From there it can be examined at leisure, to any desired degree of depth, via Textedit or less or any other convenient text viewing tool.
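(If you'd rather not leave the Terminal to read the report, you can skim just the biggest entries with, say,
head -n 40 /tmp/report.txt
or page through the whole thing with
less /tmp/report.txt
either of which avoids opening Textedit at all.)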
So the second form makes several assumptions, all of which I think are questionable given what we're actually trying to find out with these commands. The biggest one is, of course, that the space-consuming files we're trying to find will be contained in the user's own home folder; in fact there's no way to know that in advance. But I'm not super-fond of the idea that quickly generated, heavily filtered information packaged nicely for easy digestion is always more desirable than way-too-much information I might have to wait a minute or two for, either. If I wanted to be left completely in the dark for fear of information overload, I'd have reached for a GUI tool in the first place.
posted by flabdablet at 8:41 PM on November 24, 2019 [3 favorites]
This thread is closed to new comments.