With some understanding of the basics, though, setting up your own server room for your small business network need not be an arcane process. Here are some tips for getting started.
Rack-mount equipment makes sense
It’s not uncommon for small businesses to begin operation by stacking server hardware and network appliances on a desk or shelf. Though such a deployment is inexpensive, the pile of equipment invariably expands into an unmanageable mess as the company grows. Exposed equipment is also completely open to physical tampering, and it’s a ticking time bomb for accidents: coffee spills, dust buildup, or a worker tripping over a wire.
Rack-mount equipment, by contrast, is designed specifically to house this type of hardware. While it tends to be pricier than non-rack-mount equivalents, the ease of management arguably far outweighs the cost premium. In addition, shelves and drawers designed to mount onto the server rack are widely available; these let racks accommodate non-rack-mount appliances as necessary.
There’s a server rack for all seasons
Before buying a server rack, it helps to understand its basic characteristics. Server racks are measured in rack units, usually written as “RU” or simply “U.” One rack unit equals 1.75 inches (44.45mm) in height, and compliant equipment is measured in multiples of “U.” Network switches are generally 1U to 2U, servers can range from 1U to 4U and blade servers can be anywhere from 5U to 10U or more.
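To see how this adds up in practice, here is a minimal sketch of the rack-unit arithmetic; the equipment list and the 24U rack size below are hypothetical examples, not recommendations:

    # Rack-unit arithmetic: 1U = 1.75 inches (44.45 mm)
    RACK_UNIT_INCHES = 1.75

    # Hypothetical equipment list with heights in rack units
    equipment_u = {
        "network switch": 1,
        "patch panel": 1,
        "server 1": 2,
        "server 2": 2,
        "NAS": 4,
        "UPS": 2,
    }

    rack_capacity_u = 24  # a common mid-size rack

    total_u = sum(equipment_u.values())
    print(f"Equipment height: {total_u}U "
          f"({total_u * RACK_UNIT_INCHES:.2f} inches)")
    print(f"Free space remaining: {rack_capacity_u - total_u}U")

Leaving a few units free makes it easier to add equipment, or shelves for non-rack-mount gear, later on.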
Isolate servers to reduce noise
Organizations without the luxury of a dedicated room for server equipment will want to consider noise management. Whenever possible, a small, partitioned room is worth the expense. Aside from substantially dampening or even eliminating productivity-sapping equipment noise, having a room for your server gear also offers the ability to secure IT equipment against casual theft or tampering.
Get an AC unit (or two)
If all you plan to deploy is a couple of network switches and a five-bay network attached storage (NAS) system, then you probably don’t need to worry about cooling. Pack in several more servers, a mid-sized uninterruptible power supply and a larger NAS, though, and the heat starts building up quickly. Needless to say, high temperatures can dramatically shorten equipment life and often culminate in inexplicable crashes or outages.
It’s possible to calculate the thermal output of your server equipment and compare it against the cooling available, but a common-sense approach of measuring the temperature inside the rack is often sufficient. Keeping your equipment cool isn’t just a matter of the rack’s heat-dissipation capabilities; it’s also directly affected by the ambient temperature outside the rack. That’s why installing air-conditioning units in the server room is recommended.
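If you do want a rough number to size an air-conditioning unit against, a common rule of thumb is that one watt of power consumed by equipment ends up as roughly 3.412 BTU per hour of heat. The sketch below illustrates the conversion; the wattage figures are hypothetical placeholders and should be replaced with your own equipment’s measured or rated draw:

    # Rough cooling-load estimate from equipment power draw.
    # 1 watt of heat is roughly 3.412 BTU per hour.
    WATTS_TO_BTU_PER_HOUR = 3.412

    # Hypothetical power-draw figures in watts
    power_draw_watts = {
        "servers": 800,
        "NAS": 150,
        "switches": 60,
        "UPS losses": 90,
    }

    total_watts = sum(power_draw_watts.values())
    btu_per_hour = total_watts * WATTS_TO_BTU_PER_HOUR
    print(f"Estimated heat output: {total_watts} W, "
          f"about {btu_per_hour:.0f} BTU/hr")

Compare that figure against the BTU/hr rating of a candidate AC unit, leaving headroom for ambient heat and future growth.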
Managing wires isn’t glamorous, but it’s necessary
Setting up a server rack is more than just twisting a few screws to secure the equipment into place. The importance of proper cable management can’t be overstated, as just about every piece of equipment in the rack is linked with Ethernet cables. Intra-cabinet wiring aside, it makes sense to terminate cable runs for Ethernet LAN points for desktop computers, IP cameras and other network appliances at the rack.
The best way to manage all these cables is to use an RJ45 patch panel to terminate Ethernet cable runs. The typical patch panel installs in 1U of space and offers up to 24 ports. Using a patch panel does require some hands-on work: stripping each cable, punching it into the patch panel and using a cable tester to verify connectivity. (If hiring a professional is in the budget, they can probably get everything installed in less than a day.)
Label everything — and keep it simple
Finally, don’t skimp on labeling and documenting your setup, even for relatively simple deployments. What may be obvious to the person setting it up could be missed by a new IT staffer or a vendor contracted to work on certain aspects of the system. Time savings aside, proper labeling reduces the likelihood of catastrophic mistakes, such as a mission-critical system getting unplugged or restarted without adequate warning.
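Documentation doesn’t need to be elaborate. A simple port map kept alongside the physical labels goes a long way; the sketch below (with made-up port names and locations) shows one way to record and print it:

    # Hypothetical port map: patch panel port -> switch port -> what it serves
    port_map = [
        ("A-01", "SW1-01", "Reception",   "Desktop"),
        ("A-02", "SW1-02", "Office 3",    "Desktop"),
        ("A-03", "SW1-03", "Front door",  "IP camera"),
        ("A-04", "SW1-04", "Server rack", "NAS"),
    ]

    for patch_port, switch_port, location, device in port_map:
        print(f"{patch_port} -> {switch_port}: {device} ({location})")

Whatever format you choose, keep it in one agreed-upon place and update it whenever a cable moves.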