XCOPY command without directory replication
November 29, 2005 4:04 PM

DOS-FU: Help me figure out the DOS command for moving some files.

I have a drive with multiple nested directories. Inside each directory are some text files ("*.TXT"); I want to move all of the files into one directory with one command. The file names are unique. I could write code to do this, but I know there's got to be a simple command for it. What is the command for doing this in MS-DOS?

"xcopy /e *.txt C:\foo" is close, but I don't want to replicate the directory structure, I just want all the files to be in C:\foo
posted by blue_beetle to Computers & Internet (20 answers total)

I don't think this is feasible in one command in DOS.

The easiest way to do it that comes to mind is:

copy *.txt c:\foo
copy *\*.txt c:\foo
copy *\*\*.txt c:\foo

etc., for as many levels of nesting as there are. And as I don't have a DOS prompt at hand, I can't guarantee that this will work.
posted by IshmaelGraves at 4:09 PM on November 29, 2005
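[For readers following along on a modern system, the level-by-level idea above can be sketched in POSIX shell. This is an editor's sketch, not part of the thread; the src/ and flat/ paths are invented for the demo.]

```shell
# Level-by-level flatten, mirroring copy *.txt / copy *\*.txt / copy *\*\*.txt
# (the src/ and flat/ paths here are invented for the demo)
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/src/a/b" "$tmp/flat"
echo top > "$tmp/src/top.txt"
echo one > "$tmp/src/a/one.txt"
echo two > "$tmp/src/a/b/two.txt"
cd "$tmp/src"
cp ./*.txt     "$tmp/flat"   # level 0
cp ./*/*.txt   "$tmp/flat"   # level 1
cp ./*/*/*.txt "$tmp/flat"   # level 2
ls "$tmp/flat"
```

Note the same caveat applies as in DOS: a `cp` for a nesting level with no matches will complain, so you need one line per level that actually exists.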

Not entirely sure, but how about:

FOR /R C:\foo %f IN (*.txt) DO copy %f C:\bar
posted by sbutler at 4:14 PM on November 29, 2005

(that will only work under 2k/XP -- maybe NT -- btw)
posted by sbutler at 4:14 PM on November 29, 2005
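[For comparison, a POSIX-shell analogue of that FOR /R recursion — walk the tree and copy each match — as an editor's sketch with invented demo paths:]

```shell
# Roughly what FOR /R C:\foo %f IN (*.txt) DO copy %f C:\bar does:
# recurse from a root, copying every *.txt into one flat destination
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/foo/sub/deeper" "$tmp/bar"
echo a > "$tmp/foo/a.txt"
echo b > "$tmp/foo/sub/deeper/b.txt"
find "$tmp/foo" -name '*.txt' -exec cp {} "$tmp/bar" \;
ls "$tmp/bar"
```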

You need the Resource Kit utilities. FORFILES is the easiest command.

FORFILES -P c:\targetdir -S -M *.TXT -C "cmd /c copy @FILE c:\destdir"

If you have a list of directories, you wrap the whole thing in a for loop.

Robocopy, alas, will move the directory information as well.
posted by eriko at 4:16 PM on November 29, 2005
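[The "wrap the whole thing in a for loop" remark above can be sketched in POSIX shell: loop over a list of directories, copying each one's *.txt into a single destination. Editor's sketch; the d1/d2/dest paths are invented.]

```shell
# Given a list of directories, gather each one's *.txt into one place
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/d1" "$tmp/d2" "$tmp/dest"
echo 1 > "$tmp/d1/x.txt"
echo 2 > "$tmp/d2/y.txt"
for d in "$tmp/d1" "$tmp/d2"; do
    cp "$d"/*.txt "$tmp/dest"
done
ls "$tmp/dest"
```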

For moving files there's always this...

move c:\windows\temp\*.* c:\temp - This would move the files from c:\windows\temp to the temp directory in the root (...).
posted by Zack_Replica at 4:17 PM on November 29, 2005

eriko and sbutler have it. An alternative method is to put everything into a zip file (directories and all), then extract it without specifying that the directory structure should be used (e.g., uncheck 'use folder names' in WinZip).

So, you could put the whole structure in a zip file, sort by type, pull out everything that isn't a txt file, then do the extract-without-structure thing I mentioned. That will work regardless of Windows version, and you don't have to install the resource kit.
posted by jedicus at 4:19 PM on November 29, 2005

Ah, sbutler remembered the /R variation of the FOR command. Better answer. (Though you might need %%F, not %F, if you are scripting it.) And, yes, it works under NT4, but in all cases, you need to run cmd.exe, not command.com
posted by eriko at 4:20 PM on November 29, 2005

I'm doing this from memory, so please bear with me. If my DOS-Fu is still up to par (no guarantees), xcopy c:\*.txt c:\foo /s /d /i should do it.

Be sure to post success/failure here.
posted by Grensgeval at 4:29 PM on November 29, 2005

but it looks like eriko and sbutler may have beaten me to it.
posted by Grensgeval at 4:41 PM on November 29, 2005

The best thing to use for any heavy DOS lifting is:

XXCopy

Gathering files into one directory using XXCOPY
posted by meehawl at 6:36 PM on November 29, 2005

oh, btw, XXCopy is free for non-commercial use, 32-bit, but a 16-bit version is also available.

I use it for incremental network backups and cloning. It's awesome. The available switches are extensive.
posted by meehawl at 6:39 PM on November 29, 2005

Another option is to use 4NT or Take Command from J P Software. These products include a "global" command which will run the desired operator on all subdirectories within the specified directory. They also include additional switches on the operators themselves, including "copy" and "move".

These are excellent products but horrendously overpriced. J P has not learned the wisdom of the $19 registration fee. Use the shareware version for 30 days.
posted by yclipse at 7:51 PM on November 29, 2005

If you have WinZip or some equivalent, just zip up the directory and all subdirectories (the default). When complete, unzip into another directory with "Use Folder Names" unselected.
posted by rfs at 9:07 PM on November 29, 2005

In Unix / Linux / etc. it's easy:

find /path -name "*.txt" -exec cp {} /newpath \;

where /path is the path to the root directory of your text files, and /newpath is where you want them all to go. I believe you can download Cygwin for free, which (IIRC) contains a port of the GNU find command for Windows. If this is a one-time thing, you could also boot a Linux CD, mount your fs, and do it like that. Tho that might be a bit of overkill.
posted by ori at 11:03 PM on November 29, 2005

I definitely agree that you should install Cygwin and get a *real* shell and supporting tools, none of this CMD.EXE nastiness. I disagree that you should use "-exec cp {} /newpath" as above. That's going to be hideously inefficient if there are a lot of files, because it has to invoke cp once for each file, rather than just giving cp the list of files to copy. Under Cygwin especially this will be glacially slow because of fork() emulation. A first approximation of the solution to this would be:

cp $(find /path -name \*.txt) /newpath

However, this will break if there are files with spaces in their name, or if there are too many files to fit on a single command line. You can then call in 'xargs' to solve both of these problems:

find /path -name \*.txt -print0 | xargs -0 sh -c 'mv "$@" /newpath' sh

This solves all three problems: it will batch as many filenames into a single invocation of the command as possible, but it will invoke it multiple times if there are so many as to run into the maximum command-line length. And it will work with filenames that contain spaces (or tabs, newlines, etc.) in their names.
posted by Rhomboid at 12:59 AM on November 30, 2005

(Hey Rhomboid, I'm new (one month) to Linux, so I'm glad to be outdone in shell wizardry. I don't really understand how your second command works, tho. What does $@ expand to? How will xargs know to start a new invocation of mv? If you don't mind explaining, and if you don't want to derail the thread, you can email me at the address listed in my profile.)
posted by ori at 1:24 AM on November 30, 2005
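[Rhomboid's find/xargs pipeline can be exercised end to end on a throwaway tree, including a filename with a space. Editor's sketch; all paths are invented, and an exported `dest` variable stands in for the literal /newpath.]

```shell
# find -print0 | xargs -0 sh -c 'mv "$@" ...' sh, on a scratch tree
# that includes a filename containing a space (demo paths invented)
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/path/sub" "$tmp/newpath"
echo a > "$tmp/path/plain.txt"
echo b > "$tmp/path/sub/with space.txt"
export dest="$tmp/newpath"   # stands in for the literal /newpath
find "$tmp/path" -name '*.txt' -print0 |
    xargs -0 sh -c 'mv "$@" "$dest"' sh
ls "$tmp/newpath"
```

Both files, including the one with a space in its name, end up flat in the destination.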

sbutler had it - the FOR with the /R switch will recursively operate on the whole directory tree. The only changes are the /Y, to ensure that if there IS a duplicate file name it overwrites the file without pausing for human intervention, and the " " characters around the wild cards, which handle directories or files with spaces in their names.
for /R "C:\BARBAZ\" %f in ("*.txt") DO @copy "%f" C:\Foo\ /Y
Installing additional tool sets like Cygwin is nice if you're already more familiar with them, but the built-in DOS/vbscript hodgepodge is pretty flexible for solving most basic automation needs, and certainly what the poster was looking for.
posted by hincandenza at 3:47 AM on November 30, 2005

$@ is a shell variable which essentially means "all the command line arguments of the current shell." You can find a more detailed explanation of it in the bash manpage, section "Special Parameters".$0 is the name of the current script or program, $1 is the first argument (and$2 is the second argument, etc), and $* and$@ are all the arguments (expanded as a single word or multiple words, respectively.)

What xargs ends up executing is something like:

sh -c 'mv "$@" /newpath' sh file1 file2 file3 ...

If you read the manpage for bash under the -c option, it explains that any arguments specified after the -c command string get put into the positional parameters, starting at $0. So this is why I added the "sh" at the end, just as a placeholder to put into $0 so that the first filename goes into $1, the second into $2, etc. This means that when bash goes to execute the command given by -c, $@ holds the list of files that xargs supplied.
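[A one-liner illustration of that $0/$@ mechanic, as an editor's sketch; "placeholder", "file1", and "file2" are made-up arguments:]

```shell
# The word after the quoted command lands in $0; the rest land in $1, $2, ...
# so "$@" inside the -c string is exactly the argument list that follows.
sh -c 'echo "zeroth=$0"; printf "arg=%s\n" "$@"' placeholder file1 file2
```

This prints zeroth=placeholder, then one arg= line per remaining argument.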

Note that this -c business is only necessary because we need to /append/ a parameter to the end of the mv command - we want "mv file1 file2 file3 /destdir". Normally when you use xargs you don't care about this, such as when deleting files:

find /path -name \*.ext -print0 | xargs -0 rm

for example, to delete all files with extension ".ext" under /path. The stuff above about calling sh -c is only necessary to add the /destdir on the end.

The reason that xargs runs mv more than once if necessary is that it knows the limit of the longest command line possible for the system it is running on. It gets this value either by querying the system at runtime (by calling sysconf(_SC_ARG_MAX)) or by using a compiled-in default. It also has some heuristics to compensate for the size of the environment, since the limit is often on the combined argument+environment size. You can add the -S option to see status messages about how it is determining these sizes. You can also manually specify these limits on the maximum total argument length (-s) or total number of arguments (-n). Since the whole point of xargs is to overcome these limits, it will loop, executing batches of the given command as many times as necessary to process all the arguments given as input.
posted by Rhomboid at 4:19 AM on November 30, 2005
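[You can watch that batching behavior directly by forcing tiny batches with -n, as an editor's sketch:]

```shell
# xargs normally packs as many arguments per invocation as the OS limit
# allows; -n 2 forces batches of two so echo visibly runs multiple times
printf '%s\n' a b c d e | xargs -n 2 echo
```

This prints three lines - "a b", "c d", "e" - one per echo invocation; with real ARG_MAX-sized input, xargs loops the same way without needing -n.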

I like Unix as much as the next person, but compare and contrast:

Unix
find /path -name \*.txt -print0 | xargs -0 sh -c 'mv "\$@" /newpath' sh

XXCopy
XXCOPY C:\*.doc D:\mydocs\ /SR

Finds all .doc files, flattens their subdirs, and concatenates the path info into the filename, with the original filename on the right. Solves the problem of similar filenames rather simply.

Or, if you wanted simply version-stamped filenames:
XXCOPY C:\*.doc D:\mydocs\ /SG

Finds all .doc files, flattens their subdirs, and, for identical filenames, stamps them in descending order of age, e.g.: