PHP Uploads Result in Corrupt Files
March 5, 2008 1:35 PM

PHP gurus unite! Tell me why my PHP uploads always result in corrupt files and what I can do to fix it!

I've installed PHPWEBFTP on my website. I followed all of the installation instructions and can connect to my FTP server through the script. I can even upload and download files. The only problem is that any file that is either uploaded or downloaded via this script becomes corrupt. What gives?

This script is hosted on a dedicated server owned by The Planet.

You can try it yourself at https://store.gkmachine.com/secureftp2/

user: meta@gkmachine.com
pw: meta

For kicks, I uploaded the exact same script to another webserver (owned by Dotster) and it worked flawlessly. Uploads/downloads were not corrupted if I accessed the script on this webserver.

This leads me to believe that it's a setting in either php.ini or httpd.conf on The Planet server - endless searches have left me puzzled and frustrated.

Any help is much appreciated. Thanks!
posted by charlesroper to Technology (28 answers total)
 
Can you check the output of phpinfo and see what the differences are between the two servers? If you haven't used it before, there's a phpinfo page on PHP.net with the syntax for making a simple wee text file that'll do it.
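
Something like this should do it (save it as, say, phpinfo.php on the server and load it in your browser):

<?php
// Dumps the full PHP configuration for this server.
phpinfo();
?>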
posted by bcwinters at 1:40 PM on March 5, 2008


Response by poster: That was my first thought - I actually printed off both phpinfo pages and then changed The Planet server's php.ini to match the one on the Dotster server, so both php.ini files should be the same now. After a restart of Apache, uploads/downloads are still corrupt.
posted by charlesroper at 1:44 PM on March 5, 2008


Are both phpinfo pages the same after you changed php.ini? It's possible for a server php.ini that you don't control to override yours, so you might want to check again.

Also, use a text file to upload. How exactly is it corrupted?
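
If you want something more exact than eyeballing it, a quick script along these lines (the filenames are just placeholders) will at least tell you whether, and by how much, the two copies differ:

<?php
// Compare the original file against the copy that came back through the script.
$orig = 'original.gif';      // placeholder: the file you uploaded
$copy = 'transferred.gif';   // placeholder: the same file after a round trip
echo 'size: ' . filesize($orig) . ' vs ' . filesize($copy) . "\n";
echo 'md5:  ' . md5_file($orig) . ' vs ' . md5_file($copy) . "\n";
?>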
posted by rhizome at 1:56 PM on March 5, 2008


Are you sure it's not a size problem? If you're trying to upload a big file, you need to configure PHP to accept it; otherwise the file will be truncated. See some information here.
posted by McSly at 1:59 PM on March 5, 2008


I should clarify that the default max size is just 2 MB, so anything bigger will be truncated.
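
If you want to see what the server actually allows, a tiny script like this (the filename is just an example) will print the relevant limits:

<?php
// Print the upload-related limits currently in effect on this server.
header('Content-Type: text/plain');
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size: ' . ini_get('post_max_size') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
?>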
posted by McSly at 2:00 PM on March 5, 2008


Response by poster: Thanks for all of your responses.

Both phpinfo pages are the same after I changed php.ini.

I should have mentioned this earlier - the ONLY files that don't get corrupted are plain text files. If you try the script, you'll notice that there is an option to switch between binary/ASCII. I don't mess with that at all since, on the Dotster server, my uploads work just fine without touching it.

I uploaded a gif file and compared the original to the "post transfer" one, and they look about the same in a text editor - the characters just seemed to be moved around (i.e., added/removed spaces, line breaks, etc.). It's as if it doesn't get pieced together right. You're free to try it out with a small image of your own and compare the files - maybe you can see something I don't.

Concerning size - I'm only uploading a 15 KB file so that should be within all of the limits.

I look forward to any other suggestions.
posted by charlesroper at 2:10 PM on March 5, 2008


Response by poster: For reference, here are the two phpinfo pages I'm comparing:

PHPINFO #1: (the script does not work properly on this website) - https://store.gkmachine.com/secureftp2/phpinfo.php

PHPINFO #2: (the script works flawlessly on this site) -
http://vi-pak.com/phpinfo.php

What do you think?
posted by charlesroper at 2:46 PM on March 5, 2008


Sounds like a mimetype issue... (?)
Looks like vi-pak is running both mod_mime_magic and mod_mime, while gk is only running mod_mime.
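
If it isn't loaded on the gk box, enabling it is usually just a couple of lines in httpd.conf, something along these lines (the module path and magic-file location vary by install, so treat these as examples):

# Load the MIME-magic module and point it at Apache's magic database.
LoadModule mime_magic_module modules/mod_mime_magic.so
MIMEMagicFile conf/magic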
posted by misterbrandt at 2:49 PM on March 5, 2008


Response by poster: @misterbrandt: how would I fix that?
posted by charlesroper at 2:57 PM on March 5, 2008


Response by poster: I'm in the process of installing mod_mime_magic. I'll let you know how it turns out - stay tuned!
posted by charlesroper at 3:03 PM on March 5, 2008


On preview: yeah, do that.
posted by misterbrandt at 3:07 PM on March 5, 2008


Response by poster: Unfortunately, that didn't solve the problem - actually, now our main website is malfunctioning:
http://gkmachine.com

Any help?
posted by charlesroper at 3:29 PM on March 5, 2008


Response by poster: Apparently installing mod_mime_magic interfered with our PHP includes. Fixed that without having to uninstall mod_mime_magic.

Back to the upload issue: any other ideas?
posted by charlesroper at 3:49 PM on March 5, 2008


Not helpful in this situation, but net2ftp works great for me and it is PHP-based also.
posted by DJWeezy at 4:50 PM on March 5, 2008


Have you compared Apache configs on the two machines? I'd consider looking at the values for AddDefaultCharset and DefaultType, for starters.
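
For reference, both are standard httpd.conf directives; a fairly typical stock setup looks something like this (the actual values on your two boxes may well differ, which is exactly what's worth comparing):

# Don't tack a default character set onto responses.
AddDefaultCharset Off
# Fallback MIME type for files Apache can't otherwise classify.
DefaultType text/plain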
posted by so at 6:58 PM on March 5, 2008


Response by poster: I looked at both of those Apache values and they're the same. I've tried net2ftp on my server, and it works great - I just don't like the interface. I'm afraid it may be too confusing for some of my clients compared to PHPWEBFTP.

I've been working on this for 12 hours straight and still haven't found a solution. Time to give up I guess.

May just have to settle for net2ftp. Ack.
posted by charlesroper at 9:10 PM on March 5, 2008


I did a binary comparison of a file that was uploaded and then downloaded. The problem is that an extra character (0x0D, i.e. chr(13), the carriage return) is being added throughout the file. In all cases the original lines terminated in 0x0A (the line feed), but the corrupted versions have the spurious carriage return as well. This is a typical problem with Windows->Unix conversions: Windows uses CR+LF for end-of-line, while Unix only uses LF (read more about it here).

I believe the file is getting corrupted on upload, but as to the reason why... I don't know. It could be an Apache configuration issue, or even something as remote as switching the default content type from UTF-8 to ISO-8859-1 (in the index.php file). This is just a shot in the dark, though.
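
A few lines of PHP along these lines will show the same thing without a hex editor (the filenames here are just placeholders):

<?php
// Count carriage returns (0x0D) and line feeds (0x0A) in each copy.
// A clean binary round trip should leave both counts and the byte length unchanged.
$orig = file_get_contents('original.gif');       // placeholder filename
$copy = file_get_contents('transferred.gif');    // placeholder filename
printf("original:    %d CR, %d LF, %d bytes\n", substr_count($orig, "\r"), substr_count($orig, "\n"), strlen($orig));
printf("transferred: %d CR, %d LF, %d bytes\n", substr_count($copy, "\r"), substr_count($copy, "\n"), strlen($copy));
?>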
posted by Civil_Disobedient at 10:41 PM on March 5, 2008 [1 favorite]


Response by poster: @Civil_Disobedient: Thanks for your input. I'll look through the index.php, but I'm really not sure what to look for. If you have the time, you're welcome to download the script yourself at PHPWEBFTP and look through the index.php file to see what you think should be changed.

I just don't know if it's the index.php file, because BOTH servers I've tried it on are Linux servers - so you'd think if it were in the index file, the error would happen on both.

I appreciate the time you spent on this already. Thanks!
posted by charlesroper at 8:47 AM on March 6, 2008


I wonder if forcing binary mode in the fopen() calls in include/ftp.class.php might be all that's required. These occur at lines 130, 133, 249, and 252. Just change line 130 from this:
$fp = fopen($destination . $file, "a+");
to this:
$fp = fopen($destination . $file, "a+b");
and the same for the others. Read more here.
posted by so at 9:10 AM on March 6, 2008


Response by poster: @so: Good thought. I just made your recommended changes, but I'm still having the same problems.

Like Civil_Disobedient was saying, there's a good chance it's something to do with the Windows->Unix conversion. I just don't know what to change.

For what it's worth, this problem happens on both the upload AND the download.

I uploaded a known working file through an FTP client (CuteFTP) and then downloaded it through this script, and it was still corrupt.

I then uploaded a known working file through the script and downloaded through FTP client, and the file was corrupted again.

So it appears that the problem occurs on ANY sort of transfer through this script on this server.
posted by charlesroper at 9:44 AM on March 6, 2008


Yes, I went through the manual upload/scripted download and found corruption, too.

It looks like the script authors might have a little conceptual error in index.php. Try changing line 545 from this:

$data = readfile($zipfile);

to this:

$data = file($zipfile);

Or else comment-out the following loop around 'echo'. 'readfile' jams the file into the output buffer already, so the 'echo' is sending extraneous data. The method as written will be a serious memory pig for large files, and will probably fail.
posted by so at 10:24 AM on March 6, 2008


Response by poster: @so: Made your change to line 545 but still having corruption problems with this script.
posted by charlesroper at 10:36 AM on March 6, 2008


OK, I'm done. Sorry for the runaround. That whole 'readfile' section seems wrong, though. But then I can't explain why it would ever work, either. Sorry again, and good luck!
posted by so at 11:00 AM on March 6, 2008


Response by poster: No apologies needed - I appreciate EVERYONE'S advice on this matter. Thanks for your help!
posted by charlesroper at 11:46 AM on March 6, 2008


So it appears that the problem occurs on ANY sort of transfer through this script on this server.

Hmm. That's very strange. One would think that the developers have used the system long enough that they would have noticed corrupted binary file transfers. If you're running Apache, I would start looking into .htaccess parameters (or lack thereof) that might be the source of trouble. Something so obvious would have been discovered a long time back if it were a simple PHP error, so my bet is a website configuration problem.
posted by Civil_Disobedient at 5:54 PM on March 6, 2008


Hey, sorry to beat a dead horse, but I'm pretty sure we brushed up against the culprit earlier. In index.php, there's this construct starting at line 545:

$data = readfile($zipfile);
$i = 0;
while ($data[$i] != "")
{
    echo $data[$i];
    $i++;
}

The problem is that readfile places the contents of $zipfile into the output buffer and returns the number of bytes read (http://us.php.net/manual/en/function.readfile.php). This means several things:
  1. the file is already in the output buffer, so the 'echo' isn't required (and, in fact, doesn't even run because of #2)
  2. $data doesn't contain the contents of the file, but the size of the file as an int; thus, the while loop never even runs since $data[0] is null
Also, it's not entirely clear that 'readfile' is binary-safe in all 4.x versions of PHP. So, maybe the entire block above could be simply rewritten like this:

$data = file_get_contents($zipfile);
echo $data;

and be done with it. Please note that this is still kinda quick-and-dirty, and might get ugly for large files.
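
If memory for big files is a concern, yet another option (just a sketch - the header values are my assumption rather than anything taken from PHPWEBFTP, and it assumes no headers have been sent by that point) is to skip the variable entirely and let readfile stream the file itself:

// Send the archive straight to the client without buffering it in a PHP variable.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($zipfile) . '"');
header('Content-Length: ' . filesize($zipfile));
readfile($zipfile);   // writes the file directly to output and returns the byte count
exit;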
posted by so at 8:09 PM on March 6, 2008


Response by poster: @so: Thanks for the idea. I tried it, but I'm still having the same problems. I released all of my frustration by just downloading net2ftp and customizing it to look and feel as close to PHPWEBFTP as possible.

It has its quirks too, but at least it does the one thing it was designed to do: share files.

So, for now, I guess I'll just have to settle for that. Thanks for your help - I really appreciate it!
posted by charlesroper at 12:46 AM on March 7, 2008


I just want to say that I have this very same problem. Files transferred with PhpWebFTP end up corrupted, while Net2FTP transfers do not.
posted by lion at 10:51 AM on December 2, 2008

