In search of a utility, Perl script, or other script to organize over 2,000 files quickly
December 2, 2005 3:51 PM

I have 2,028 zip archives and a problem that should be trivial but isn't for me. If I unzip them all with a batch utility and don't clear the path saved in each archive, I end up with 2,028 directories, each containing a single file. If I do clear the saved path, I end up with a mess. I need either a Windows utility or a Perl script to do the following:

Take the mess and sort it into logical directories. Call the top-level directory Books. In the first scenario I end up with:
Books/
./melville, herman - bartleby the scrivener (lit)/Melville, Herman - Bartleby the Scrivener.lit
./melville, herman - moby-dick or the white whale (illust) (lit)/Melville, Herman - Moby-Dick or The White Whale (Illust).lit
etc. etc. etc.

Or, if I do it the second way, I just end up with 2,028 files in a single directory. What I want is a way to loop through all those files, create a directory called melville, herman, and move every file that has the words melville, herman (ignoring capitalization) at the beginning of the file name into the new directory. I'm using Windows, but I have a Linux partition. I try to avoid mounting NTFS drives in Linux, but I'll do it if someone has a Perl script that'll get the job done.
Thanks
posted by Grod to Computers & Internet (6 answers total)
 
Response by poster: Actually, I can't figure out how to make the mess, so the first scenario I outlined is the one I have to work with. It's very annoying.
posted by Grod at 3:56 PM on December 2, 2005


Best answer: vBookSorter is an eBook sorter/librarian. Might be just what you want.
posted by evariste at 4:00 PM on December 2, 2005


If it were on Linux, you could use Perl: loop over each file, run a regular expression against the name to pull out the author/title information you want, and then use the "system" call to do the 'mkdir' and 'mv' work, something like the sketch below ... er... but maybe vBookSorter would be an easier, less geeky option.
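A rough, untested sketch of that idea, using Perl's built-in mkdir/rename instead of shelling out to 'mkdir' and 'mv', and with /books/in and /books/out as placeholder source and destination directories:

#!/usr/bin/perl
# Untested sketch: group files by their "Lastname, Firstname" prefix.
# /books/in and /books/out are placeholder paths -- change to suit.
use strict;
use warnings;

opendir(my $dh, '/books/in') or die "can't open /books/in: $!";
for my $f (readdir $dh) {
    next unless -f "/books/in/$f";       # skip . and .. and any subdirectories
    next unless $f =~ /^(\w+,\s*\w+)/;   # grab e.g. "Melville, Herman"
    my $author = lc $1;                  # fold case: "melville, herman"
    mkdir "/books/out/$author" unless -d "/books/out/$author";
    rename "/books/in/$f", "/books/out/$author/$f"
        or warn "couldn't move $f: $!";
}
closedir $dh;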
posted by Dag Maggot at 6:32 PM on December 2, 2005


CrayDrygu, that would just end up with all the files in a single directory, which he said in the post is not what he wanted. He could have accomplished that just by telling his unzip program not to create directories.

Dag Maggot, you don't need Linux to use Perl. You can run Perl on Windows just as you would on Linux. There are two main ports, ActiveState and Cygwin. I prefer the latter since it gives you a full *nix environment, whereas ActiveState's Perl tries to fit Perl into the Win32 environment.

You could try something like the following (untested):

find /inputpath -type f -print0 | perl -e 'use File::Basename; use File::Path; $/ = "\000"; while (<>) { chomp; my ($fn, $d) = fileparse($_); if ($fn =~ m/^(\w+)\s*,\s*(\w+)/) { my $dest = "/outputpath/" . lc "$1, $2"; mkpath($dest) unless -d $dest; rename("$d$fn", "$dest/$fn"); } }'

Note that /inputpath and /outputpath are the source and dest dirs.
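If you'd rather skip the shell pipeline entirely (say, under ActiveState Perl, where there's no find), a standalone script along these lines should do the same job -- again untested, and C:/books/unzipped and C:/books/sorted are just placeholder paths:

#!/usr/bin/perl
# Untested sketch: walk the unzipped tree, pull "Lastname, Firstname" off the
# front of each file name, and move the file into a per-author directory.
# C:/books/unzipped and C:/books/sorted are placeholder paths.
use strict;
use warnings;
use File::Find;
use File::Path qw(mkpath);
use File::Copy qw(move);

my $src  = 'C:/books/unzipped';
my $dest = 'C:/books/sorted';

find(sub {
    return unless -f $_;                  # skip directories
    return unless /^(\w+)\s*,\s*(\w+)/;   # e.g. "Melville, Herman - ..."
    my $dir = "$dest/" . lc "$1, $2";     # e.g. .../melville, herman
    mkpath($dir) unless -d $dir;
    move($File::Find::name, "$dir/$_")
        or warn "couldn't move $File::Find::name: $!";
}, $src);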
posted by Rhomboid at 11:41 PM on December 2, 2005


And if you want better Perl suggestions, flesh out your question into a complete specification and post it on perlmonks.org. You should get a fair number of replies.
posted by Rhomboid at 11:45 PM on December 2, 2005


You could always use WinRAR. One of its context-menu options lets you extract multiple archive files into a single folder: select all the files you want to extract, right-click on one of the files in the selection, and choose "Extract to <specified folder>".
posted by purephase at 7:00 AM on December 3, 2005


This thread is closed to new comments.