Browser download without a Content-Length header?
July 20, 2006 9:47 AM

How do I get the browser to prompt a user to save a huge file when I don't know its size in advance?

I'm generating a large file on the fly and streaming it to the user's browser. When the download starts I want them to be prompted to save the file. Usually you can just set the Content-Length header to make this happen but because I'm building the file as it's transmitted, I don't know the length ahead of time.

When Content-Length isn't set, the behavior of both IE and Firefox is to download the entire file, then prompt the user to save it. This means that the browser sits there a long time spinning its little spinner and probably making the user think nothing is happening.

Currently I'm setting Content-Type and Content-Disposition. If I set Content-Length to 2147483647 (i.e. 2^31 - 1), the save prompt comes up properly, but it says it expects 2 GB of data and will take hundreds of hours to complete.

I know I've seen Firefox send a file to the download manager without knowing its size. How do you do it?
posted by joegester to Technology (9 answers total)
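
In concrete terms, the setup described above amounts to something like the following (PHP is assumed here only because the answers below use it; $mimetype and $name are placeholders, and the bogus Content-Length is the workaround the poster mentions):

header("Content-Type: " . $mimetype);
header("Content-Disposition: attachment; filename=\"" . $name . "\"");
// The workaround in question: advertise a huge fake length so the save
// dialog appears immediately, at the cost of a wildly wrong size
// estimate and time-remaining figure in the download manager.
header("Content-Length: 2147483647");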
 
You could try (in php):

header("Content-Transfer-Encoding: binary");
header("Content-Type: " . $mimetype);
header("Content-Disposition: attachment; filename=\"" . $name .
"\";" );
posted by null terminated at 10:42 AM on July 20, 2006


I missed you saying you were already using content-type and content-disposition. Why can't you determine the size of the generated file?
posted by null terminated at 10:46 AM on July 20, 2006


Best answer: You should be able to use a variation of that code.
The code forcing the client to open a save as dialog is this:
header("Content-type: application/octet-stream\n");
So normally, even if you don't know the size of the file before hand, the dialog box should open and close as soon as the download is finished.
posted by McSly at 11:01 AM on July 20, 2006
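
A sketch of how that fits together when the body is streamed and flushed as it's generated (the chunk loop and get_next_chunk() are assumptions, not part of McSly's answer):

header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"export.csv\"");

// Because the headers go out before any body data, the save dialog
// should appear as soon as the first flushed chunk reaches the browser.
while (($chunk = get_next_chunk()) !== false) {   // get_next_chunk() is hypothetical
    echo $chunk;
    if (ob_get_level() > 0) {
        ob_flush();        // empty PHP's output buffer, if one is active
    }
    flush();               // push the data out to the client
}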


Response by poster: null terminated: The file is basically just the output of a database query in CSV format. Potentially, it could be several hundred megabytes, so it's too big to accumulate entirely in memory. Dumping it into a temp file takes a while and creates more disk traffic than would be ideal. The best solution is to just stream the results right out to the user as they come in from the database.

McSly: "Content-Type: octect-stream" causes the server to generate the entire output, then send it. It seems like that's the right way to do things though. I guess I just need to to fuss with the server to make it not do that.
posted by joegester at 11:21 AM on July 20, 2006
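
A minimal sketch of that row-by-row streaming, assuming PDO and CSV written straight to php://output (the connection, query, and filename are placeholders):

header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=\"report.csv\"");

$out  = fopen("php://output", "w");                  // write directly into the response body
$stmt = $pdo->query("SELECT * FROM big_table");      // an unbuffered/streamed query is assumed

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);   // one CSV line per record
    flush();               // hand each row to the browser as it arrives
}
fclose($out);

Nothing in the script holds more than one row at a time, so memory use stays flat whether the result is 25 rows or 250,000.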


Response by poster: A quick server patch stopped it from computing the file size, and now everything is hunky-dory.
posted by joegester at 11:39 AM on July 20, 2006
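
The thread doesn't say what the patch changed, but the usual culprit for "generate everything, then send it" behavior is a buffering layer between the script and the client; on the PHP side, the equivalent fix would look roughly like this (buffering in the web server or a proxy has to be disabled in the server config instead):

// Drain and remove any PHP output buffers so data isn't held back
// while something tries to compute a total length.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);   // flush to the client after every piece of output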


Is it possible to provide Content-Length with a reasonable estimate? If you could ballpark the total file size, that would be better than just providing MAXINT and no more complex.
posted by cortex at 11:39 AM on July 20, 2006


Oh. Well, never mind.
posted by cortex at 11:40 AM on July 20, 2006


Response by poster: Cortex: I considered that, actually. There's no easy way to do it because I don't even know how many records there will be. It could be 25 or it could be 250,000. Plus, if the estimate is too low, the browser will truncate the results.
posted by joegester at 12:32 PM on July 20, 2006


Sorta O/T:

I'd recommend attempting to zip/gzip the file to make the download a lot quicker. If you're just sending text output, you can get a killer compression rate and your users will thank you.
posted by hatsix at 4:11 PM on July 20, 2006
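
One caveat with streaming is that the compression also has to happen incrementally, or it reintroduces the buffer-everything problem. A sketch using PHP's incremental zlib API (deflate_init / deflate_add, available in later PHP releases); the chunk source is a placeholder, and checking the client's Accept-Encoding header is omitted:

header("Content-Encoding: gzip");
header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=\"report.csv\"");

$gz = deflate_init(ZLIB_ENCODING_GZIP);              // incremental gzip compressor

while (($chunk = get_next_chunk()) !== false) {      // get_next_chunk() is hypothetical
    echo deflate_add($gz, $chunk, ZLIB_SYNC_FLUSH);  // emit a complete compressed block now
    flush();
}
echo deflate_add($gz, "", ZLIB_FINISH);              // write the gzip trailer
flush();

Plain text and CSV typically compress very well, so the bytes on the wire can drop by an order of magnitude even with per-chunk flushes.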


This thread is closed to new comments.