Which OS should I use on my fileserver?
December 7, 2006 2:44 PM
I'm in the process of setting up a file server that will interface with 12 Windows XP workstations. I want it to be directly mountable on those workstations, and (the catch) I also want it to work with Active Directory. I'd like to install Linux on the server, but should I use Windows instead?
I've always used Linux on servers in the past, and I want to do the same here, but it sounds like getting Linux to play nice with Active Directory may be not so easy. I could get a copy of Windows 2003 Server but I'm more familiar with administering Linux than Windows, and I feel like Linux is more reliable. I'm setting this up in a university without a lot of IT support so I'm sort of on my own.
I already have Active Directory set up for the workstations and my users have accounts set up in Active Directory that those workstations can authenticate against. I'd like to have the server do the same. Can I do this in Linux or should I bite the bullet and get a copy of Windows Server?
Linux:
+ Google!
http://ubuntuforums.org/archive/index.php/t-91510.html
posted by flabdablet at 4:09 PM on December 7, 2006
The Linux kernel has been much poorer quality since they went to 2.6, and the chance of encountering a bug (particularly on more exotic hardware, like your fileserver) is pretty high. From surveys they took, the kernel devs determined that about 20% of their userbase is affected by one or more kernel bugs and -- get this -- they're just fine with that. Development speed and fun of programming have become much more important to them than actually making the kernel work every single time without fail. They are adding features at an incredible rate, and they're never letting the code shake out. And they only support a given kernel for sixty days before abandoning it.
The distros can insulate you from a lot of the pain, but it's tremendously harder to retrofit quality; it's much easier and more effective to build it in from the start.
In the world of adults, where things need to work, I think you'd be better off with FreeBSD, Solaris, or Windows. Given your Active Directory needs, Windows would probably be the easiest approach, even though it costs so much. Your time spent learning a different system isn't free either, which raises the total cost of implementation, but it may cut the cost of ownership substantially. When you graduate or move on, the system will be fairly easy to maintain, and won't take much training.
That said, Samba DOES have the ability to join an Active Directory, and will use it for authentication. I haven't done it myself, but I think it's not even that difficult to configure. I don't believe Samba pays any attention to most of AD's features (server configurations and the like), but it will at least allow authentication.
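From what I've read, the recipe looks roughly like this: a sketch only, untested by me, and the realm, workgroup, and account names below are all placeholders you'd replace with your own.

```shell
# Untested sketch of Samba-joins-AD, circa Samba 3.x with winbind.
# EXAMPLE.EDU / EXAMPLE / Administrator are placeholder names.
cat > /tmp/smb.conf.sketch <<'EOF'
[global]
   security = ads
   realm = EXAMPLE.EDU
   workgroup = EXAMPLE
   # winbind maps AD accounts onto local uids/gids
   idmap uid = 10000-20000
   idmap gid = 10000-20000
EOF
# With that in place, joining the domain goes roughly:
#   kinit Administrator@EXAMPLE.EDU
#   net ads join -U Administrator
#   wbinfo -u     # lists domain users if the join worked
```

The `security = ads` line is the key bit: it tells Samba to authenticate via Kerberos against the domain rather than keeping its own password database.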
If you really want to use Linux, it should work all right, but you'll probably have better stability with Windows 2003 than with just about any 2.6 kernel.
posted by Malor at 4:10 PM on December 7, 2006
Of course,
Windows:
+ NTFS permissions model.
NTFS permissions beat POSIX permissions hands down for flexibility. But if you already know your way around POSIX-style access control, this shouldn't be a biggie.
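To make that concrete (the file path below is arbitrary): POSIX bits give you exactly one owner, one group, and everyone-else, so letting a second unrelated user write to a file means creating a shared group first, where NTFS would just add another access-control entry.

```shell
# POSIX permissions demo -- /tmp/perm-demo.txt is an arbitrary path.
f=/tmp/perm-demo.txt
touch "$f"
chmod 664 "$f"        # rw for owner and group, read-only for everyone else
stat -c '%a' "$f"     # prints 664 (GNU stat)
# NTFS (or POSIX ACLs, where the filesystem supports them) could instead
# grant one extra user directly, e.g.:  setfacl -m u:alice:rw "$f"
```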
posted by flabdablet at 4:13 PM on December 7, 2006
I would personally go with Windows in your situation. Like CrayDrygo said, it's almost worth it just for compatibility. Anytime you have different operating systems talking to each other there will be issues to work through, but using Active Directory will multiply those issues. I'm not saying it can't be done (obviously others do it) but you might spend a large amount of your time troubleshooting and still end up with an unstable environment. I also can't say enough good things about Shadow Copy. I used to cringe every time I got a restore request because our tape backups were often unreliable. Since upgrading to 2003, I don't even remember the last time I had to go to tape for a restore.
And CrayDrygo's other point is the reason my company has rarely used open source software in production; who's going to support it when you leave? We've used things like G4U with great success (and a lot of money saved) but it doesn't really require any knowledge of Unix. I consider it part of my job to set up an environment that is stable and can be easily managed if I left. However, if you have a dozen other people at your company that know Linux it's not a problem.
I'm sure some people will say to go with Linux because (just like in threads that mention Apple) people tend to blindly hate Microsoft, but it really depends on your needs for this particular project. If this was an internal project for just my department, I would seriously consider using an open source OS just for fun and for the challenge. I would never use unsupported software in production unless it was clearly the best option. Microsoft support is far from free, but it's available 24/7 if needed. If everything went to hell and my boss is breathing down my neck, I wouldn't want to post my problem in a forum and hope someone can help me.
posted by bda1972 at 4:47 PM on December 7, 2006
I've been tinkering with what you want for almost a year now, and all the time I wasted was made worthwhile when I discovered ClarkConnect, which is based on a stripped-down version of Red Hat Enterprise 4.
The best thing about it (besides being free) was that after a very simple install process (a handful of screens) you only need to access it through a web interface. I was up and running within an hour, and on the first try, which was never the case with Ubuntu (which I still use on my notebook). I'm using ClarkConnect for NAS purposes, but it's supposed to play fair with Active Directory and has a great support base.
posted by furtive at 6:18 PM on December 7, 2006
Guys, this is for a file server. Configuring Samba to authenticate against AD is child's play, and is quite robust. Issues like Kerberos and kernel bugs shouldn't be a concern.
posted by kableh at 6:51 AM on December 8, 2006
I dunno much regarding Samba stability and Linux bugs (do you have a source for that stuff, Malor?), but I had to respond to this:
And CrayDrygo's other point is the reason my company has rarely used open source software in production; who's going to support it when you leave?
Uh, any other Linux sysadmin? Even if there isn't one with the company, if the company is okay with using Linux as a file server they should be okay with hiring someone qualified to support it. It's the same deal when an all-win32 shop has to support a few Macs for designers: they either cross-train someone or hire someone to do it. Lots of companies have mixed environments and just hire the right sysadmins to deal with it.
And please don't play the vendor support card. When you are having critical issues, you will probably be *better* off going to open source forums, discussions, and issue trackers than trying to get help from a vendor like MS. "Vendor support" is a nice form of false security for IT managers, but I've never seen it pay off at any of the places I've worked, versus just getting people who know open source and leveraging the community.
posted by rsanheim at 7:25 AM on December 8, 2006
Thanks for all the advice - it's been very helpful. I'm more or less unconcerned with the issues of who will support all this after I leave - I was hired to run the instrumentation and the fact that I know some linux administration is just a fringe benefit. If I didn't know how to do it we'd do without the file server or run some totally kludgey system I guess.
I think I will spend a couple days trying to get Linux to work since it is free and I already have it installed on that machine. If that doesn't work I will bite the bullet and shell out for Windows Server.
posted by pombe at 9:48 AM on December 8, 2006
rsanheim: read lwn.net, best Linux source going. The 20% bug rate quote was from the weekly edition two or three weeks ago. Rik Van Riel has also said (also sourced from lwn) that if only 1 stable kernel in 3 is actually stable, he's just fine with that. That's nearly a direct quote.
I've observed plenty of bugs, personally. I've now spent several hundred dollars buying hardware to replace things that were actually working fine -- it was just the Linux kernel sucking. 2.6.15 was particularly awful. It broke traceroute, for God's sake. I haven't been personally bitten by anything since about 2.6.17, but plenty of others are having trouble. It's not unusual to see 30+ kernel patches, fixing bugs and security holes, in each two-month cycle.
Quality is not high on their priority list.
(while MeFi was down, I had time to look up the specific quote in question, from Linux Weekly News (lwn.net). It is:
Quote of the week
70% hit a bug -- Andrew Morton
1/7th think it's deteriorating
1/4th think lkml response is inadequate
3/5ths think bugzilla response is inadequate
2/5ths think we have features-vs-stability wrong
2/3rds hit a bug. Of those, 1/3rd remain unfixed
1/5th of users are presently impacted by a kernel bug
Happy with that?
Per the comments, LKML's response was "that's just fine".)
posted by Malor at 2:38 PM on December 8, 2006
Thanks for the further info, malor. I guess I had a false impression of the kernel's quality.
Is there anything like an automated build, an overall test suite, etc., for the kernel and low-level systems? I think I read somewhere some kernel developer saying "our users are our test system", which struck me as pretty stupid. Obviously testing and automation won't catch the unlimited number of bugs that could come up due to strange hardware conflicts in the wild, but it would at least catch a general baseline of things.
posted by rsanheim at 11:08 PM on December 10, 2006
Kernels are *all about* strange hardware conflicts in the wild. That's what kernels *do*.
I run a headless Ubuntu Server 6.06 box at home (kernel 2.6.15-27-server, and FWIW, traceroute works just fine). It runs Samba to share multimedia files to the rest of my machines, squid to speed up my browsing and Windows updates, and sshd to let me set up tunnels from assorted places. It runs 24*7 between power failures; longest uptime so far has been about two months. Overall, it's a very easy system to maintain.
My laptop runs Ubuntu 6.10 (kernel 2.6.17-10-generic). It's rock solid, too.
I also deal with four Windows Server 2003 boxes at a couple of schools. On one of these, I've had to put in a scheduled job to restart it every Sunday night. Without that, it will run for about ten days and then go almost totally unresponsive. In fact it becomes so unresponsive that I can't even run Task Manager to see if anybody's eaten all the RAM. All I can do is restart it, and even that takes about ten minutes. I have no idea why, and no way of finding out. It's frustrating as hell.
Maybe using a 2.6 Linux kernel is asking for trouble; personally, I've had none. I am not convinced that the Windows Server 2003 kernel is the last word in stability either. The plural of anecdote is not data, though, so YMM of course V.
posted by flabdablet at 4:43 AM on December 11, 2006
I just wanted to followup to my own question to let any future readers know what I did. I'm running Debian, which is still on the 2.4 kernel series. It's been very stable for me so that's not been a problem.
I basically followed the instructions posted above for Ubuntu to get Samba + AD authentication working. It took me about a day and a little bit of tweaking and it basically worked. I've further tweaked the configuration after reading the Samba Howto and it now seems to be very stable and working well. The only thing that confused me and didn't seem to be well documented was setting up the firewall. AD requires a lot of ports to be open to work, and also requires that your nameservers be able to access your machine. I had to spend a while looking at what packets were being dropped to get the firewall setup.
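For reference, the ports I ended up opening were roughly the following; your list may differ, so check your own dropped-packet logs as I did. The 10.0.0.0/24 subnet below is a stand-in for our real one.

```shell
# Prints the sort of iptables rules AD traffic needed in my setup.
# 53 DNS, 88 Kerberos, 135 RPC, 389 LDAP, 445 SMB, 464 Kerberos kpasswd.
# 10.0.0.0/24 is a placeholder for the workstation subnet.
for port in 53 88 135 389 445 464; do
  echo "iptables -A INPUT -p tcp --dport $port -s 10.0.0.0/24 -j ACCEPT"
done
for port in 53 88 389 464; do
  echo "iptables -A INPUT -p udp --dport $port -s 10.0.0.0/24 -j ACCEPT"
done
```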
While this may not be the way to go for someone in a corporate environment where things have to work, for an academic environment where the emphasis is on cheap and good enough, Linux and AD can definitely be made to interoperate. If anyone reads this thread trying to do this, feel free to email me for info.
posted by pombe at 6:00 PM on April 5, 2007
A few things I should have added - I already have a 3TB file server with hardware raid. I'm just trying to decide what OS to put on it.
The server setup is a little unusual in that it won't be used for long-term storage. I run a core facility, and the server will just be used for short-term data storage during data acquisition and analysis; the users will be responsible for long-term data storage. This will be enforced by having a cron job or similar delete any files over a month old. So the backup issues are moot - the main point of the server is to make file transfer between workstations transparent and to make it easy for users to move files over the network to their lab computers.
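The cleanup job itself is a one-liner with GNU find; the paths below are just for illustration, since the real share lives elsewhere.

```shell
# Simulate the monthly cleanup: -mtime +30 matches files last modified
# more than 30 days ago. /tmp/scratch-demo is an illustrative path only.
SCRATCH=/tmp/scratch-demo
mkdir -p "$SCRATCH"
touch -d '45 days ago' "$SCRATCH/old.dat"   # stale file (GNU touch)
touch "$SCRATCH/new.dat"                    # fresh file
find "$SCRATCH" -type f -mtime +30 -delete
ls "$SCRATCH"                               # only new.dat should remain
# In cron, something like:
#   0 3 * * *  find /data/scratch -type f -mtime +30 -delete
```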
The 12 workstations are for data acquisition and analysis - we will have more like 100 users on those workstations.
posted by pombe at 3:25 PM on December 7, 2006