How to set up a server room with a 10 TB rack?
April 24, 2008 7:40 AM   Subscribe

Going to be moving into a new office space, where we will get to redesign some of the space for our needs. We will need to set up a 10TB server rack -- need to figure out how much space we should set aside for a server room. If you were setting one up from scratch, what would you start considering?

What sort of things do we need to worry about -- room temperature, or airflow, or both? Cooling racks? How big would the room need to be?
posted by ShawnStruck to Computers & Internet (12 answers total) 1 user marked this as a favorite
 
If you aren't familiar with this sort of thing, it's probably something you want to hire a consultant to do, to make sure that everything is designed and set up correctly. Just saying.
posted by pete0r at 7:55 AM on April 24, 2008


agreed - getting a consultant or contractor would be what you want. temperature and airflow are important. so is physical security, and not having the floor carpeted would be good too.
posted by joshgray at 7:59 AM on April 24, 2008


We're planning on doing so; some general guidelines or best practices would be appreciated, too.
posted by ShawnStruck at 7:59 AM on April 24, 2008


You aren't really providing enough information here. You can fit 10TB of mirrored storage in 8 rack units. That's a small fraction of a single rack. But presumably you need some actual servers as well, and some networking gear....
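For illustration, that "8 rack units" estimate can be sanity-checked with some rough arithmetic. The drive size and chassis density below are assumptions (plausible for 2008-era hardware), not figures from this thread:

```python
import math

# Hypothetical hardware assumptions, not real product specs:
usable_tb = 10          # required usable storage
drive_tb = 0.5          # assumed 500 GB drives
drives_per_2u = 12      # assumed 12-bay 2U storage chassis

# Mirroring doubles the raw capacity needed.
raw_tb = usable_tb * 2
drives = math.ceil(raw_tb / drive_tb)        # 40 drives
chassis = math.ceil(drives / drives_per_2u)  # 4 chassis
rack_units = chassis * 2                     # 8U, out of ~42U in a full rack

print(drives, chassis, rack_units)
```

With larger drives or denser chassis the footprint shrinks further, which is the point: the storage itself is a small fraction of one rack, and the servers and network gear will dominate the space budget.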

You need to worry about room temperature and airflow. You will certainly need air conditioning. You should probably buy racks with integrated fans. Don't forget that you need to work out power distribution too - perhaps with a UPS? A raised floor makes cabling a lot easier. The room needs to be big enough to allow at least several feet of clearance at the front and back of the racks. There is a lot to think about, and it's fairly easy to get some of it wrong - particularly with regard to cooling.

In other words, pete0r is right, you should probably get a consultant in.
posted by standbythree at 8:06 AM on April 24, 2008


One thing to be aware of is, many office buildings shut off their central AC on weekends to conserve energy. If yours does this, you'll definitely need to arrange for separate, constant AC for your server room. I had to do this for my company when we moved, and it was an unexpected $10k increase in my budget.
posted by autojack at 8:51 AM on April 24, 2008


You want redundant, disparate power feeds from separate power conditioners. There should be sufficient nearline battery backup onsite for each power feed to allow generators to spin up. Onsite fuel for the generators should be sufficient to operate for 12 hours, or however long the SLA is with the local power company.
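A rough sketch of how that battery-bridge and fuel sizing might be worked out. Every number here (load, spin-up time, generator efficiency) is an illustrative assumption, not a figure from this thread:

```python
# Back-of-the-envelope sizing for battery bridge time and generator fuel.
# All inputs are hypothetical; substitute your facility's real numbers.

load_kw = 15.0            # assumed total critical load on one feed
gen_spinup_min = 2.0      # assumed generator start-and-stabilize time
safety_factor = 5         # run batteries well past the minimum spin-up time

battery_minutes = gen_spinup_min * safety_factor  # 10 min of battery per feed
battery_kwh = load_kw * battery_minutes / 60      # 2.5 kWh usable per feed

sla_hours = 12            # match the utility SLA, per the post
gen_kwh_per_gallon = 9.0  # assumed diesel generator efficiency
fuel_gallons = load_kw * sla_hours / gen_kwh_per_gallon  # 20 gallons on-site

print(round(battery_kwh, 1), round(fuel_gallons, 1))
```

The point of the arithmetic is that batteries only need to bridge minutes, while fuel scales with the SLA window, so the fuel tank dominates the sizing.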

You should have redundant cooling in place for the facility that is also on the generator system.

You should have air filtration in place in order to keep humidity and dust levels minimized.

Before you get into the above: what is your uptime goal, what is your projected growth, what are the access needs, and what are the performance needs? You may be better served by leasing a rack from a service provider. Building a fault-tolerant infrastructure is extremely expensive.
posted by iamabot at 8:56 AM on April 24, 2008


Well, you need power and cooling to the room, and that may require non-trivial electrical, HVAC, or plumbing work before you can even start installing the rack(s).

Exactly how much power you'll need for each rack depends on what you're going to be installing -- networking gear doesn't draw anything close to what densely packed blade servers do -- so you may need to work with a consultant several times over the course of the project.

But basically here's my feeling of the process you need to go through:

-Assuming you already have a firm idea of the business needs (what applications you want to run, now and in the future, client loads, storage requirements), make a list of the hardware you need, and an additional list of hardware you might want to add in the near future and far future, insofar as you can predict.

-Based on that, define requirements for rack space, power, cooling, and connectivity infrastructure. Add a lot of spare capacity to make up for things you didn't think of and to add flexibility down the road.

-Install electrics and HVAC as required, then raised flooring and/or cable-management. This is kind of the low-level infrastructure part. Door locks and physical security go here, too, before you start bringing the expensive stuff in.

-Then you actually install the racks and start wiring them up, and last you start installing, wiring, and configuring the hardware.
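The requirements step in the list above (power and cooling with spare capacity) can be sketched numerically. The inventory, wattages, and headroom factor are illustrative assumptions; the watts-to-BTU conversion (1 W ≈ 3.412 BTU/hr) is standard:

```python
# Rough power/cooling budget from a hardware list.
# Device counts and per-device wattages are hypothetical examples.

inventory = {
    "1U server": (6, 350),       # (count, watts each) - assumed draw
    "storage chassis": (4, 500),
    "network switch": (2, 150),
    "UPS overhead": (1, 200),
}

watts = sum(count * w for count, w in inventory.values())
headroom = 1.5                   # spare capacity for growth and surprises
budget_watts = watts * headroom

# Every watt drawn becomes heat the A/C must remove: 1 W ~= 3.412 BTU/hr.
cooling_btu_hr = budget_watts * 3.412

print(int(budget_watts), int(cooling_btu_hr))
```

Handing numbers like these to the electrician and HVAC contractor is usually what they need to size circuits and cooling capacity.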

This is based on my experience of infrastructure installs (well, it's based on my experience of how things should have gone, but rarely seem to). I think you're going to want the consultant in there a few times; in the beginning to help you define requirements, and at the end to do the install itself. You'll probably need to manage some of the low-level work (cooling and power) yourself, working with your physical plant or landlord's maintenance people.
posted by Kadin2048 at 8:59 AM on April 24, 2008


You'll need dedicated power - I know that doesn't have anything to do with the space, but it was a big chunk of change when we had to design the server room. Plus special outlets for APC UPS units.

Also, figure out if you're going to be sitting in the server room, and plan accordingly. At one place, we have to sit in there (brrr) because there simply aren't any spare cubicles. They left us about two feet of space to cram a stool in.
posted by Liosliath at 8:59 AM on April 24, 2008


Seconding everything. Also be sure to find out whether the floor can support the combined weight of the servers, HVAC, UPS, etc., and what kind of ducting you'll need for chiller exhaust. Some older construction has surprisingly low load limits for very heavy equipment in small areas.

Sort of a side answer but how sure are you that this data needs to be on-site? Is leasing colo space and a dedicated line an option? What about something like Sun's Blackbox, i.e. a datacenter in a shipping container that you could drop in the parking lot or on the roof?

Proper server space is expensive to construct and especially so when you're talking small areas where economies of scale work against you.
posted by Skorgu at 9:43 AM on April 24, 2008


How I've done this in the past:

1) Used Visio to lay out my proposed gear into racks to figure out how many racks I'd need. 10TB isn't much these days - does it really require a dedicated room? I also worked on making sure there was access all the way around the racks - some folks forget you need just as much room in the back as you do in the front.

2) Worked with an electrician who specialized in this; we couldn't afford redundant power, but we did go with an isolation transformer into the UPS system. Make sure you can take the UPS offline for when you need to swap out batteries - they die every so often.

3) Worked with an A/C specialist - redundant, dedicated cooling is absolutely critical. If you don't have cooling, you don't have a server room.

4) Worked with the security guy to get a separate keycode and lock.

5) Worked with the local Verizon rep to make sure the building's risers would allow me to get the various network drops we needed.

6) Bought a crapload of network cables (I make my own, usually, but with a full server-room load it gets time-prohibitive).

Basically, you're going to coordinate the activities of a whole bunch of expensive union folks- plumbers, electricians, telecom, carpenters, etc. This is a very expensive and involved undertaking. If at all possible, rent a secure colo somewhere else, as your gear will probably fit in a half cabinet. Half cabinets are going for around $500 / month these days...
posted by jenkinsEar at 9:56 AM on April 24, 2008 [1 favorite]


You may want to ask yourself whether you really need a racked server room, or more specifically whether everything needs to go in there.

Depending on your needs and the configuration, 10TB RAIDed and mirrored is doable with an ordinary PC (non-racked, no special cooling needs).

For what it's worth, I built a RAIDed 2TB fileserver 5 years ago for $5k, and it sat in an ordinary office. Because it wasn't dependent on special cooling, it had one less point of failure than our racked machine room. The one or two times our machine room lost AC, this server kept running.

Uptime was fine for non-critical use. Cheap big disks then were 250GB each. Now they're more like 1TB each, so I suspect ~10TB is doable in a tower case with enough bays.
posted by zippy at 10:01 AM on April 24, 2008


On review - I believe 10TB mirrored would require two tower PCs to hold the 20+ drives.
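That estimate checks out with some assumed numbers (the per-tower bay count is a guess, not something from the thread):

```python
import math

# Checking the drive-count arithmetic with assumed 2008-era parts.
usable_tb = 10
drive_tb = 1.0          # ~1 TB "cheap big disks", per the post above
bays_per_tower = 12     # assumed drive bays in a large tower case

drives = math.ceil(usable_tb / drive_tb) * 2   # mirroring doubles it: 20 drives
towers = math.ceil(drives / bays_per_tower)    # 2 tower PCs

print(drives, towers)
```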
posted by zippy at 10:03 AM on April 24, 2008


This thread is closed to new comments.