windows file manager for folders with hundreds of millions of files
January 11, 2007 6:41 AM   Subscribe

Can anyone recommend a file manager for windows that can handle directories with hundreds of millions of folders/files?
posted by yeoz to Computers & Internet (6 answers total)
 
I would expect that EVERYTHING you do with a directory that size will be slow. To get an idea of the fastest the directory contents could possibly load, from a cmd prompt (Start -> Run -> cmd.exe) type "dir /b full_path_to_directory > NUL"

I think that scripts and commandline tools are going to be the way to go here.
posted by aubilenon at 9:06 AM on January 11, 2007


Another thing to consider is that the filesystem itself probably performs poorly when you have millions of files in a directory.
posted by chrchr at 9:07 AM on January 11, 2007


Best answer: Scripts are the way to go. Most interactive tools (including dir) will attempt to sort the directory before listing it, and that kinda screws you right there. The "/b" in aubilenon's post prevents the sorting so it's a good starting benchmark. GUI interactive tools might be even worse as lots of them like to animate actions which can make any action too slow to be practical.
posted by chairface at 10:54 AM on January 11, 2007
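
For the script route, here's an illustrative sketch (modern Python, not something from the thread) that streams directory entries one at a time, with no sorting, which is the same property that makes "dir /b" the fastest baseline:

```python
import os
import tempfile

# Illustrative sketch: os.scandir yields entries in filesystem order,
# one at a time, with no sorting and no need to hold the whole listing
# in memory at once.
def count_entries(path):
    count = 0
    with os.scandir(path) as it:  # lazy iterator over directory entries
        for _entry in it:
            count += 1
    return count

# Demo on a small temporary directory standing in for the huge one.
demo = tempfile.mkdtemp()
for i in range(5):
    open(os.path.join(demo, "file%d.txt" % i), "w").close()
print(count_entries(demo))  # prints 5
```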


I doubt you'll find anything that can deal with it efficiently. Like aubilenon said, if you want any utility out of it (i.e. sorting), the sorting is generally done on the application side, which means sorting hundreds of millions of records AND keeping them in memory so it knows what to display next when you scroll. So any solution will be (a) an enormous memory hog and (b) slow to start up.

Please don't tell me why you want to do this; my head hurts already.
posted by chundo at 12:34 PM on January 11, 2007
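
To put rough numbers on chundo's memory point (a toy-scale illustration, not the thread's actual data): even a modest listing gets expensive once every name has to be held in memory for sorting.

```python
import sys

# Keeping every filename in memory for sorting has a real cost. With
# just 100,000 short names (nowhere near hundreds of millions), the
# Python strings plus the list holding them already take several
# megabytes; scale that up and an interactive tool drowns.
names = ["file%09d.dat" % i for i in range(100_000)]
total = sys.getsizeof(names) + sum(sys.getsizeof(n) for n in names)
print(total > 4_000_000)  # prints True: several MB for the names alone
```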


As a side note: I had a folder with 1,200 images in it, and every time an app wrote to that folder my computer hung for a good couple of seconds.

So I would hate to have to work in a folder that big!

Maybe mounting the drive in a Linux machine could be a solution?
posted by moochoo at 1:13 PM on January 11, 2007


I had some folders with tens of thousands of image files (nowhere near hundreds of millions, but they did slow Windows Explorer noticeably).

I was frustrated enough with the slowness that I wrote a little command-line program to split the files into subdirectories of N files each (where N is a command-line option).

It doesn't pay any attention to folders, so it may not do what you need it to do, but I'd be happy to send it to you (with the usual disclaimers -- it was written in a half-hour and I can't guarantee it won't cause problems).

If you're interested, email scott(AT){DOT}com and I can send it to you.

It does read a list of all files in the target directory first, so there may be memory issues if you truly have hundreds of millions of files...

posted by scottnic at 3:40 PM on January 11, 2007
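
A rough sketch of the splitter scottnic describes (the original program isn't shown in the thread, so all names here are hypothetical): it reads the full file list first, matching the memory caveat above, then moves files into numbered subdirectories of a given size.

```python
import os
import shutil
import tempfile

# Hypothetical reconstruction of the splitter described above. It reads
# the complete file list up front (hence the memory caveat for truly
# huge directories), then moves files into numbered subdirectories of
# `chunk` files each, leaving existing folders alone.
def split_into_subdirs(src, chunk):
    names = sorted(e.name for e in os.scandir(src)
                   if e.is_file(follow_symlinks=False))
    subdir = None
    for i, name in enumerate(names):
        if i % chunk == 0:
            subdir = os.path.join(src, "part%05d" % (i // chunk))
            os.makedirs(subdir, exist_ok=True)
        shutil.move(os.path.join(src, name), subdir)
    return len(names)

# Demo: 7 files split into subdirectories of 3.
src = tempfile.mkdtemp()
for i in range(7):
    open(os.path.join(src, "img%03d.jpg" % i), "w").close()
moved = split_into_subdirs(src, 3)
print(moved, sorted(os.listdir(src)))
```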

