I need an alternative to FTP, SCP, SFTP
August 18, 2006 11:25 AM   Subscribe

For various reasons, I need an open-source and user-friendly technology to replace FTP & SFTP. With today's technology, am I expecting the impossible?

I am running a home server and hosting people for free, but file transfers are giving me headaches, and all the googling I've done so far hasn't helped (more than two years spent diagnosing dropped connections, locked-out users, daemons that won't restart, and so on, across various software, Linux distributions, and hardware).

It could be a *really* good and secure Ajax web interface for users to manage files, or it could be a different protocol, or anything that comes to your mind that covers all the operations a typical webmaster needs.

What is important:
* no "ghost" connections blocking the user from accessing again, dropped connections without explanation (I think these are my #1 problems)
* lock the user into his own directory (my big gripe: chroot jails are a real pain in the arse to set up with SFTP, whereas FTP mostly "just works" in that respect)
* decent speed (I have 3 Mbit/s down and 800 kbit/s up; I'd like to maximize that)
* works on Linux
* not be a totally insecure piece of orangutan feces / a cheap PHP upload script
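
(An aside on the chroot gripe: newer OpenSSH releases (4.9 and up) can jail sftp users with a few lines of sshd_config, where older versions needed patches or wrappers like scponly. A sketch, with a made-up group name:)

```
# /etc/ssh/sshd_config (OpenSSH 4.9+; the "sftponly" group is invented)
Subsystem sftp internal-sftp
Match Group sftponly
    ChrootDirectory %h           # jail each user in their home; path must be root-owned
    ForceCommand internal-sftp
    AllowTcpForwarding no
```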

What would be nice (secondary objectives):
* ease of use on the client side. Must work for Linux and Windows
* works on Debian GNU/Linux or Ubuntu
* allows a user database; I mean not having to create a real "system user" for each user that needs an account.
* encryption
* could handle a frontal slashdotting (just kidding ;)

Hmm... That's about it, I tried to not make that question too long. If you need more info, just ask ;).
posted by a007r to Computers & Internet (22 answers total)
 
It seems strange that you're having such problems with your FTP servers. What server programs are you using? These are fairly old and reliable technologies by now.
posted by odinsdream at 11:49 AM on August 18, 2006


vsftpd is probably the best regular FTP server.

It sounds to me like your problems may be due to the "home server" part as much as anything. Home routers are generally crappy. Cable internet services tend to put arbitrary restrictions, such as bandwidth throttling, on their home users, especially those who appear to be running servers.
posted by jellicle at 11:54 AM on August 18, 2006


What about file transfer over ssh? WinSCP3 is a decent windows client for it. You'd have to create a user for each account though, but it covers encryption.
posted by Vantech at 12:05 PM on August 18, 2006


I assume the problems come from either my modem or my router. Actually, I suspect the router... but no matter which one I use, home routers WILL cause trouble. I've tried changing the router and modem a few times, no good.

I know about WinSCP, but that uses SFTP and SCP... and vsftpd is of course an FTP daemon. I actually used ProFTPD in the beginning, but switched to Pure-FTPd a while ago because it seemed better to me.

Basically, I'm tired of messing around with the SFTP/FTP protocols and software like that. I'm pretty sure my problems come from the fact that I am a home user, so if there is any way to work around that (with a different way to do transfers), I'd like to give it a try.

The server is currently running Gentoo Linux, but I plan on putting Debian testing on it soon. These may be "old and reliable" technologies, but perhaps they only work right in datacenter conditions (a server connected directly to the net with a good connection?), if my understanding is correct.
posted by a007r at 12:28 PM on August 18, 2006


Oh, I forgot to add this info: my ISP does not *seem* to impose restrictions; they look pretty home-server friendly at first glance. No port restrictions, no speed throttling. The speed issue I mentioned happened only with SFTP (ssh). This is a home ADSL line.
posted by a007r at 12:33 PM on August 18, 2006


How about WebDAV? Windows users can then create a mapped drive (see this) and it looks just like another drive on their computer.
posted by gus at 12:34 PM on August 18, 2006


I'm not sure how user-friendly or admin-friendly it would be, but some kind of XML-RPC interface would probably do it. You'd have to write both client and server, but if you're willing to do a bit of head scratching and Googling, it's quite feasible.
posted by tommorris at 12:48 PM on August 18, 2006


See gus' comment: WebDAV.

There's no need to reinvent the wheel.
posted by waldo at 1:57 PM on August 18, 2006


You could use rsync.
posted by devilsbrigade at 2:17 PM on August 18, 2006


Yeah, FTP is mature and old, but it sucks. The connect-back data port and figuring out passive mode are just yuck...

You know what works really really well? HTTP..

WebDAV uses good ole HTTP for file sharing. Or you could look into a version-control tool like svn over HTTP: you'd get file sharing and versioning/backup protection at the same time.

Score!
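
(The svn-over-HTTP workflow would look roughly like this; the repository URL and filenames are made up:)

```shell
# check out a working copy over HTTP (mod_dav_svn on the server)
svn checkout http://example.org/svn/site site
cd site
echo '<h1>new page</h1>' > new.html
svn add new.html
svn commit -m "upload new page"   # every upload is a versioned commit
```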
posted by PissOnYourParade at 2:42 PM on August 18, 2006


webdav
posted by blag at 5:24 PM on August 18, 2006


Another vote for webdav. Be advised the one OS that doesn't have a good native client is Windows.
posted by yerfatma at 5:42 PM on August 18, 2006


I assume the problems come from either my modem or my router.

Then why dump sftp?

no "ghost" connections blocking the user from accessing again

Where did you get this idea from? What the heck is a ghost connection?

decent speed (I have 3 Mbit/s down and 800 kbit/s up; I'd like to maximize that)

Both of the protocols you mention (and their most commonly used implementations) can far outrun three megabit. The protocol is not causing you speed problems.

ease of use on the client side. Must work for Linux and Windows

Pretty much every service you can run on a Linux box will have a Linux client, and your users who run Linux as a workstation will be able to use it with little guidance. You probably just need to search for Windows clients for Linux-based servers.

encryption

I think that ssh-based solutions are your quickest way to get there. I've had Windows customers use Cygwin to run rsync-over-ssh ksh scripts I wrote for them to upload their web sites. If you run the rsync daemon itself and use this approach, it's very simple to chroot people. Rsync is really simple to use.

allows a user database; I mean not having to create a real "system user" for each user that needs an account.

What is really the harm here if they don't have a shell and their account is locked? It just lets them own files, and useradd and userdel are easy to work with.
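
A sketch of that (the username is made up, and these commands need root, so treat it as illustration only):

```shell
# shell-less, password-locked account that can still own files
useradd -m -s /bin/false webuser1
passwd -l webuser1        # lock the password outright
# note: with no real shell, stock sftp breaks too; pair this with a
# wrapper like scponly or rssh if the user still needs file transfer
userdel -r webuser1       # cleanup: -r removes the home dir as well
```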

The server is currently running Gentoo Linux, but I plan on putting Debian testing on it soon. These may be "old and reliable" technologies, but perhaps they only work right in datacenter conditions (a server connected directly to the net with a good connection?), if my understanding is correct.

Your understanding is wrong. Having a datacenter does not affect the efficacy of a Linux box. Gentoo, Debian, Red Hat, LFS, it doesn't matter. Any of them can effectively do file-swapping over the Internet. Run ssh on one of them, and you've run ssh on all of them.
posted by popechunk at 8:37 PM on August 18, 2006


I know about WinSCP, but that's using SFTP and SCP...

...and that's a problem, why?

I don't see that you've really nailed down what you're trying to fix, and it seems to me like you'll be kicking dead whales down the beach until you do.

But I suspect abandoning SFTP won't be that solution.
posted by baylink at 8:42 PM on August 18, 2006


Where did you get this idea from? What the heck is a ghost connection?
Hm. Hard to explain. I think in French it's called "connexions rémanentes" (roughly, "lingering connections"), but basically my problem is very similar to this and this (I just googled it).

Connections will drop without explanation (no problems at all with HTTP and the like), and my best guess is that somewhere between my server and the client a connection died in a weird way and left a "ghost": that ghost prevents the client from continuing or reconnecting, because the server (or some other layer?) thinks the client is already connected. That's just a guess.

Both of the protocols you mention (and their most commonly used implementations) can far outrun three megabit. The protocol is not causing you speed problems.
Then why could I upload at full speed (70-80) with FTP and only maybe 15-30 with SFTP? I first assumed the encryption was slowing things down, and I tried tweaking pretty much everything, to no avail. I gave up after more unsuccessful research.

Having a datacenter does not affect the efficacy of a Linux box. Gentoo, Debian, Red Hat, LFS, it doesn't matter.
Hmm. What I meant is that maybe SFTP/FTP were made for the better, more reliable connections you find in datacenters, not for a home ADSL line with consumer hardware, no?

That's basically why I'm looking for alternatives: I'd like to try pretty much everything I can, and maybe I'll find something that doesn't behave so weirdly, or that copes better with my network...?
posted by a007r at 9:03 PM on August 18, 2006


Most of the protocols you could find work over TCP/IP. Network flakiness is handled (quite well, where possible) at the TCP/IP layers, rather than at the application layer, in most cases.

I've never heard of ghost connections. The examples you googled just look like random problems with ssh; you haven't explained how they relate to your issue.

I am coming to believe that you have some other problem going on (your link is saturated by some other traffic? your cables are bad? your router is bad? your ISP's service is bad? the PC you run your site on has a bad NIC? your DNS is screwed up?) that is giving you fits.

No application-layer (layer 7) protocol that I know of will overcome crappy network reliability the way you want it to. I recommend that you devise a way to test how reliable your network connection is (some kind of ping script? smokeping?).
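
A crude first pass at that, counting lost pings in a loop (the default target address is an assumption; point it at your router or any well-known host):

```shell
#!/bin/sh
# count how many of 5 pings to a target get lost
host=${1:-192.168.1.1}   # assumed router address; override with $1
lost=0
for i in 1 2 3 4 5; do
  ping -c 1 -W 1 "$host" >/dev/null 2>&1 || lost=$((lost + 1))
done
echo "lost $lost/5 pings to $host"
```

Run it from cron for a day and you'll have a rough picture of how flaky the link really is.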
posted by popechunk at 9:25 PM on August 18, 2006


Yeah, it sounds like some kind of problem with your firewall or router. Is there NAT going on? You could try running a simple web server, script it to send and receive some large files, and see if it works reliably. If not, nothing else along the lines of what you're thinking has much chance of working.
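
One way to script that reliability check is to checksum the payload at both ends; in this sketch a local copy stands in for the real transfer step:

```shell
#!/bin/sh
# make a 5 MB random payload, "transfer" it, and compare checksums
dd if=/dev/urandom of=payload.bin bs=1M count=5 2>/dev/null
cp payload.bin received.bin   # replace cp with your actual transfer
a=$(md5sum payload.bin  | cut -d' ' -f1)
b=$(md5sum received.bin | cut -d' ' -f1)
[ "$a" = "$b" ] && echo "transfer OK" || echo "transfer CORRUPTED"
```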

But to answer the actual question, NFS would be one alternative way to do this. It uses UDP by default, which means it has at least some chance of working better (and, of course, some chance of not working at all) with whatever your network problem is. I've no idea if there's a Windows client, but I'd guess there must be, since NFS has been around forever and is still fairly popular in some places.
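
Server-side, an NFS export is a single line in /etc/exports (the path and subnet below are made up), though you'd only want to expose it to a LAN or VPN, never the open internet:

```
# /etc/exports: share /home/users read-write with one trusted subnet
/home/users  192.168.1.0/24(rw,sync,root_squash)
# after editing, reload with: exportfs -ra
```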
posted by sfenders at 9:30 PM on August 18, 2006


sfenders, I'll bet you've hit it.

a007r: you're bringing these connections back into your server through a router, aren't you?

You're "reverse-NAT"ting them, which makes you dependent on the size of a particular mapping table in the router, and on how often its entries are cleared out (as we discussed in a thread a couple of days ago that I'm too lazy to go find :-).

If the server your people are talking to *does not have its own* personal public address, but is on a private RFC 1918 LAN address behind a router, then the problem is very likely that your router sucks.

You might look into replacing it with something commercial-grade, like a SnapGear/CyberGuard.
posted by baylink at 11:17 PM on August 18, 2006


yerfatma: "Another vote for webdav. Be advised the one OS that doesn't have a good native client is Windows."

Hogwash. "Add a network place" and point it at an http location. Works like a charm. Now the Mac's Finder, on the other hand... oy gevalt. Why must it try to write resource forks everywhere? And why must the machine freeze up when it isn't allowed to?

Apache's WebDAV implementation is a little tough to work with because setting up permissions is a pain, so try Davenport on top of a Samba server instead. Have Samba authenticate using PAM. The default "homes" share should keep users in their home dirs. Davenport then just acts as an HTTP-to-SMB gateway. Firewall your box so outsiders can only connect via WebDAV. You should use SSL, but getting certificates working is a challenge.
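
(For comparison, the straight-Apache route is not much config either, though the per-directory permission jail is indeed the hard part; the paths and password file below are hypothetical:)

```
# httpd.conf sketch: mod_dav plus basic auth
DavLockDB /var/lock/apache2/DavLock
<Location /dav>
    Dav On
    AuthType Basic
    AuthName "file area"
    # htpasswd-managed user database: no system accounts required
    AuthUserFile /etc/apache2/dav.passwd
    Require valid-user
</Location>
```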
posted by team lowkey at 2:15 AM on August 19, 2006


If it's gotten better, I'm happy to hear it. I had a tough time when I first got a textdrive account ('bout a year ago) and the posts in the forum suggested this was SOP on Windows. I had no problems on Mac.
posted by yerfatma at 6:21 PM on August 19, 2006


Same for me, yerfatma. I had a hell of a time getting webdav to work on textdrive natively on both mac and windows, and I sure wasn't going to fork out for a commercial client for either system. I haven't tried it again in quite some time, so perhaps it's gotten better.
posted by odinsdream at 4:20 AM on August 20, 2006


I never tried textdrive, so I can't speak to that, but I've been using the windows native client and davenport for a couple years without a hitch. And the Mac client works fine, too, as long as you have write access to all the directories. But if you don't... yikes.
posted by team lowkey at 3:43 PM on August 21, 2006


This thread is closed to new comments.