Creating custom files
November 20, 2008 11:20 AM Subscribe
How do I create a "custom file" in Windows XP?
There's this biiig file I want to download. I have a slow and unreliable net connection. I have two sources: a torrent offering low speed, and links to a one-click hoster for a split multi-part version of the same file (.001, .002, etc.).
Problem: I don't have a premium membership at the hoster, so if a download gets interrupted I have to redownload that file from the start, costing me time and bandwidth, with no guarantee of success on the retry.
Potential solution: I attempt to download as much as I can from the HTTP hoster, saving completed & even truncated downloads. I then create a "custom file" of bytecount equal to my target file. I "paste in" the data from my attempted downloads, after calculating the offsets. I then load the torrent, point it to the custom file, and it downloads the remaining pieces which fail the hash check. I have successfully done this last part earlier but with a single non-split file.
How do I go about doing this?
P.S. if there's an easier solution, I'd like to know of it, but I'm interested in knowing if the above can be done.
wget resumes incomplete downloads
this also
posted by East Manitoba Regional Junior Kabaddi Champion '94 at 11:43 AM on November 20, 2008
Is Firefox's download manager not working? You should be able to resume downloads. Maybe you should try DownThemAll. Or is it that the website that hosts the files isn't going to cooperate with such a plan? Your scheme seems rather awkward and prone to failure. If, as dunkadunc says, the files you're downloading have been compressed, there will be no way to reproduce the original simply by concatenating them.
posted by I_pity_the_fool at 11:52 AM on November 20, 2008
Response by poster: To clarify the matter: I have two sources for the same file. One source just happens to offer it in parts (not compressed further or repackaged).
I want to download file.iso, which is, say, 8249247543 bytes (random no.). The torrent is for file.iso. The HTTP links are for file.iso.001, file.iso.002, file.iso.003....etc
Suppose I manage to successfully download 001, get half of 002, 2/3rds of 003, all of 004 and so on. I then join 001, 002, leave the requisite number of bytes for the remainder of 002, join 003... etc. Then I load this unified file in the torrent and continue thereon.
With a bit of trial and error, I should be able to slipstream the parts properly, but I need to know if the facility's available.
Doing the whole thing via torrent is always an option, but doing 35% via torrent is way better than doing 100% via torrent.
On preview: no, the hoster doesn't allow resume. If it did, there would be no need for this scheme.
posted by Gyan at 11:58 AM on November 20, 2008
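(As a sketch of the offset arithmetic the poster describes: with uniformly sized parts, part N starts at byte (N-1) × partsize in the joined file. The part size below is purely hypothetical.)

```shell
# Byte offset of each split part within the joined file, assuming
# every part except the last is the same size. 15 MiB is made up
# for illustration; use the real part size.
partsize=15728640

for i in 1 2 3 4; do
  offset=$(( (i - 1) * partsize ))
  printf 'file.iso.%03d starts at byte %d\n' "$i" "$offset"
done
```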
This isn't how files work. The functionality you're looking for (which I highly doubt exists) would need to be in the BitTorrent client itself - it's the BitTorrent client that keeps track of which parts of the file have been downloaded.
posted by 0xFCAF at 12:05 PM on November 20, 2008
mge has pointed out to me that this functionality does, in fact, exist - you can tell your client to re-check a file and it will compare the hashes of the blocks with the known-good version. We're talking offline on coming up with a way of merging the .00X files. Are they all of uniform size?
posted by 0xFCAF at 12:15 PM on November 20, 2008
Response by poster: All except the last piece are identically sized.
posted by Gyan at 12:21 PM on November 20, 2008
Response by poster: OK, if I can create a file of arbitrary size, I think I have this thing licked.
posted by Gyan at 12:23 PM on November 20, 2008
Okay, here's the procedure as best as I can figure:
0. Start your BitTorrent client, make it create the placeholder file (I'm assuming you've already done this)
1. Figure out how much space (in bytes) is between the end of each extant .00X file and the next .00X file
2. Make files of exactly that size. I don't know of a program that you'd have by default that would do this, but any simple hex editor should be able to create a file of arbitrary size pretty easily.
3. From a command line:
copy /b file.001+padding_001+file.002+padding_002+file.004+padding_003... output.iso
4. Replace the file created in step 0 with the one from step 3. Note that you'll lose whatever data you downloaded from BitTorrent - if you want to avoid this, you'll need to create the padding files using the corresponding data from the file in step 0 (presumably with the hex editor)
5. Have your BitTorrent client 'check' the file - it should detect that the sections from the .00X files match the actual thing and mark them as downloaded
posted by 0xFCAF at 12:30 PM on November 20, 2008
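(The numbered procedure above can be sketched in cygwin's bash. Everything here is hypothetical for illustration: three tiny 64-byte parts, one of them truncated; adapt the names and the part size to the real download.)

```shell
# Sketch of 0xFCAF's procedure: pad each incomplete part out to the
# full part size with zero bytes, then concatenate everything.
partsize=64

# Create three stand-in "downloaded" parts: complete, truncated, complete.
printf '%064d' 1 > part.001   # 64 bytes (complete)
printf '%030d' 2 > part.002   # 30 bytes (interrupted download)
printf '%064d' 3 > part.003   # 64 bytes (complete)

for f in part.001 part.002 part.003; do
  have=$(wc -c < "$f")
  pad=$(( partsize - have ))
  cp "$f" "$f.padded"
  if [ "$pad" -gt 0 ]; then
    # append $pad zero bytes so the padded part is exactly $partsize long
    dd if=/dev/zero bs=1 count="$pad" 2>/dev/null >> "$f.padded"
  fi
done

cat part.001.padded part.002.padded part.003.padded > output.iso
```

The zero padding is just a placeholder; the torrent client's hash check will flag those pieces as bad and re-download them.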
Response by poster: OK, it seems fsutil bundled with XP, along with a file joiner like FFSJ, should do the trick.
On preview: Thanks, 0xFCAF. I should note that your steps 0 & 4 are not required.
posted by Gyan at 12:33 PM on November 20, 2008
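(For reference, creating a file of an exact arbitrary size is a one-liner either way. On plain Windows XP, `fsutil file createnew target.bin 1000` does it; below is a cygwin/dd equivalent. The 1000-byte size is illustrative; the question's real target was ~8.2 GB.)

```shell
# Create a file of an exact arbitrary size (1000 bytes here).
size=1000
# Seek to (size - 1) and write a single zero byte; the file is then
# exactly $size bytes long (sparse where the filesystem supports it).
dd if=/dev/zero of=target.bin bs=1 count=1 seek=$(( size - 1 )) 2>/dev/null
```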
We're talking offline on coming up with a way of merging the .00X files. Are they all of uniform size?
My guess is that this will not work at all. As dunkadunc said, the .00X files are probably part of a multi-part rar archive. The BitTorrent client is expecting chunks of the unrared iso.
posted by burnmp3s at 12:36 PM on November 20, 2008
Install cygwin so you can use linux commands.
cat file01 file02 file03 > file
Cat is short for concatenate. It reads each file and outputs them in the order you specified. The > sends the output to a file rather than printing it to the screen.
Note that this probably isn't the best way if you're a windows user.
posted by valadil at 1:08 PM on November 20, 2008
I've done something like this, roughly following 0xFCAF's method (but I think I remember only needing to make a single empty padding file, and just specifying it multiple times in the "copy /b" command). It worked perfectly, with BitTorrent filling in the empty parts.
posted by zsazsa at 1:09 PM on November 20, 2008
Oh, wait. I just realized something. You don't even need to make an empty padding file. If the files are the same size, just copy one of the other parts multiple times in place of the missing segments. BitTorrent will simply recognize the chunks for that file as not matching the proper checksum and will replace them. Say if you have file.000, file.002, file.003, and file.006, use file.000 to take the place of the missing .001, .004, and .005 chunks. So, for the "copy /b" method, use this command:
copy /b file.000+file.000+file.002+file.003+file.000+file.000+file.006 file.iso
posted by zsazsa at 1:14 PM on November 20, 2008
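(zsazsa's shortcut, sketched with made-up names and a tiny 32-byte part size: any complete part can stand in for a missing one, because the client will fail those chunks' hash checks and re-download them anyway.)

```shell
# Sketch of zsazsa's trick: reuse a complete part as filler for the
# missing ones. Sizes and names are illustrative.
printf '%032d' 0 > file.000   # 32 bytes, complete
printf '%032d' 2 > file.002   # complete
printf '%032d' 3 > file.003   # complete
printf '%032d' 6 > file.006   # complete

# Parts .001, .004 and .005 are missing; substitute file.000 for each,
# in the same slot order as the "copy /b" command above.
cat file.000 file.000 file.002 file.003 file.000 file.000 file.006 > file.iso
```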
Also, if you want to do a quick check of whether or not 0xFCAF's method will work, download part of one of the .00X files, rename it to the .iso name, and (possibly) pad it out to the correct length. Then drop it into your BitTorrent download folder and force a recheck. If it shows up as 0%, then it's not going to work, if it shows up as >0%, then it's going to work.
posted by burnmp3s at 1:16 PM on November 20, 2008
My guess is that this will not work at all. As dunkadunc said, the .00X files are probably part of a multi-part rar archive. The BitTorrent client is expecting chunks of the unrared iso.
Second this. The 00X (most likely rar) files are not just slices of an ISO. It's part of an archive that splits files its own way and uses its own compression on top of it. ISO also uses its own format and its own compression. Combining these files with some kind of binary join will probably result in failure and a waste of time.
P.S. if there's an easier solution, I'd like to know of it, but I'm interested in knowing if the above can be done.
Unreliable net connection? Take a USB drive to the library or nearest free wifi place.
posted by damn dirty ape at 2:19 PM on November 20, 2008
Install cygwin so you can use proper file-handling commands and a proper shell. Then, assuming the intended size of your input files is 10485760 (10 megabytes) and that you have nine of them:
target=/path/to/output/file
partsize=10485760
for ((i=1; i<=9; i++))
do
    source="$(printf /path/to/input/files.%03d $i)"
    dd if="$source" of="$target" ibs=1M obs=$partsize seek=$((i-1))
done
posted by flabdablet at 3:00 PM on November 20, 2008
They told me the <pre> tag was fixed, dammit. And I believed them! Fool me twice, you can't get fooled again...
Be that as it may: as long as the size you put in partsize= is correct, the commands above will build the file you've asked for, even if some or all of your input files are too short. The output file will be fragmented, though, because it will be created as a sparse file with no disk sectors allocated in the gaps between the end of one too-short input file and the start of the next one.
Simplest way to avoid that is to create a zero-filled, full-size output file to begin with, which you should be able to do with
dd if=/dev/zero of=/path/to/output/file bs=1M count=N
where you substitute the desired size of your output file in megabytes for N.
posted by flabdablet at 3:16 PM on November 20, 2008
Oh yeah - if you do use the pre-filled initial output file, you should add
conv=notrunc
to the dd command inside the for loop, so that it won't truncate the output file as it's writing the input file pieces into it.
posted by flabdablet at 3:25 PM on November 20, 2008
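(The effect of conv=notrunc is easy to see with a small file; the 16-byte size and contents here are made up. Without it, dd truncates the output at the end of what it writes; with it, the rest of the pre-filled file survives.)

```shell
# Demo of conv=notrunc: write 4 bytes into the start of a 16-byte
# file without truncating the remainder.
printf 'AAAAAAAAAAAAAAAA' > out.bin   # 16-byte pre-filled output file
printf 'XXXX' > piece.bin             # a 4-byte "downloaded piece"
dd if=piece.bin of=out.bin conv=notrunc 2>/dev/null
# out.bin is still 16 bytes: XXXXAAAAAAAAAAAA
```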
Why not just download from the one-click hoster at your local library or coffee shop? (These places probably block BitTorrent). Seems like it would be a lot less painful.
posted by cnc at 8:08 PM on November 20, 2008
Response by poster: a) I can't believe this. As I've stated throughout, except for the splitting, both files are identical. There's no further packaging or compression.
b) I'm in India. The coffeeshop has a worse connection than me :), and there are no public libraries here offering internet access.
posted by Gyan at 9:50 PM on November 20, 2008
a) then the dd commands I listed will do exactly what you want
b) even in India.
posted by flabdablet at 11:22 PM on November 20, 2008
Arranging the multi-part portions in place in a copy of the full file:
1. Create blank copies of each incomplete part, the size of the complete part.
2. Use dd with notrunc to put the downloaded part into the start of the appropriate copy. If you have the complete part, copy it into place.
3. "Cat" all the pieces together.
You can then copy this file into place & let BitTorrent complete it.
If you've already gotten some of the file with BitTorrent, then you might try to merge the BT file with the other. I've done this sort of thing with a script involving xdd, but I don't have that script at hand & I'm not sure if Cygwin offers that binary.
Merging the two might also be doable given the right client, or, hm, running two clients, one with each version of the file & letting them share completed parts?
(As an aside, no, ISO is not a compressed format, & depending on the contents, compression might not help much.)
posted by Pronoiac at 10:51 PM on November 21, 2008
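(Pronoiac's three steps, sketched with made-up names and a small 16-byte part size: one complete part, one truncated part padded out via a blank copy and dd's notrunc.)

```shell
# Sketch of Pronoiac's procedure. Part size and names are hypothetical.
partsize=16

# Step 1: blank (zero-filled) copy at the full part size for an
# incomplete part.
dd if=/dev/zero of=blank.002 bs=1 count=$partsize 2>/dev/null

# A truncated download of part 2 (6 of 16 bytes).
printf 'PARTIA' > got.002

# Step 2: write the downloaded bytes into the start of the blank copy,
# without truncating it.
dd if=got.002 of=blank.002 conv=notrunc 2>/dev/null

# A complete part 1 needs no padding.
printf '%016d' 1 > part.001

# Step 3: cat the pieces together.
cat part.001 blank.002 > file.iso
```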
I don't see any way you could do this and have it work out: The torrent might not even be a split RAR, and there really isn't any way that you could properly splice everything together, byte for byte.
Re-download the whole thing using BitTorrent. At least that way you know you won't lose what you've already downloaded.
posted by dunkadunc at 11:31 AM on November 20, 2008