Data center airflow management is vital to the successful operation of any network, yet keeping the facility at the correct ambient temperature is all too often overlooked. NetworkTigers takes a look at the issues…
Data is, and always has been, a valuable asset. Storing and processing this data is critical. And data storage facilities – data centers – are now a fundamental and vital part of our world.
Your data center can be an expensive and complicated behemoth that requires intensive management and monitoring. The big question is: how do you minimize the cost, and the time it takes, to maintain it?
Having invested a great deal of time, effort and resources into creating the perfect data center, you might assume maintenance would then be a top priority. But there’s a spanner in the works of every data center, and it’s often overlooked.
We’re talking about heat. NetworkTigers has seen far too many decent data centers – where it seems everything has been thought of, and the latest, greatest kit installed – ruined by poor airflow and cooling.
Cooling IT equipment accounts for approximately 40% of total energy consumption in data centers.
That might seem alarming in terms of the company energy bill – and some businesses are wasting a significant amount of cash because insufficient airflow management drives up power consumption. According to Peter Curtis, author of “Maintaining Mission Critical Systems in a 24/7 Environment”, efficient airflow management could cut business energy consumption by 30%.
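As a rough illustration of what those percentages mean in dollars, here is a back-of-the-envelope sketch. All inputs (total consumption, tariff) are assumed figures for the example, and we conservatively apply the 30% saving to the cooling portion only:

```python
# Rough estimate of annual savings from better airflow management.
# All input figures below are illustrative assumptions, not audit data.
annual_energy_kwh = 2_000_000   # total facility consumption (assumed)
cooling_share = 0.40            # cooling ~40% of total energy use
airflow_savings = 0.30          # ~30% saving, applied to cooling (assumed)
price_per_kwh = 0.12            # assumed electricity tariff, USD

cooling_kwh = annual_energy_kwh * cooling_share
saved_kwh = cooling_kwh * airflow_savings
saved_usd = saved_kwh * price_per_kwh
print(f"Cooling energy: {cooling_kwh:,.0f} kWh/yr")
print(f"Potential saving: {saved_kwh:,.0f} kWh/yr (~${saved_usd:,.0f})")
```

Even with these placeholder numbers, the saving is well into five figures a year, which is why airflow fixes usually pay for themselves quickly.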
A data center needs to be kept at a constant temperature to operate most effectively. Overheating can lead to equipment failure and downtime, both costly for a business.
Inadequate airflow management can cause air to bypass and recirculate, preventing it from reaching its intended location – this is inefficient and costly.
So what can be done? Here are NetworkTigers’ suggestions on how to ensure your data center airflow management system is optimized – it could even save you a few bucks.
Although a legacy layout may be the best option for small businesses looking to avoid unnecessary complexity (if it ain’t broke, don’t fix it), you should evaluate how efficiently older-style setups actually support airflow.
For example, some businesses are still using plexiglass to cover their server racks, severely restricting airflow into the racks. This can easily be replaced with perforated panel doors to immediately increase airflow.
Server rack arrangement is also crucial. Most large businesses adopt the ‘hot aisle, cold aisle’ arrangement – which does what it says on the tin – with server rows arranged so that cold intakes face cold intakes and hot exhausts face hot exhausts. This prevents hot exhaust from one row blowing into the intakes of another. Instead, it returns to the Computer Room Air Conditioning units (CRACs) to be cooled and recirculated. Some businesses are yet to adopt this layout.
Raised flooring in data centers is, of course, an efficient way for CRACs to cool server racks – thanks to perforated tiles distributing the cool air. A raised floor layout also offers adaptability as perforated tiles can be moved around to suit the airflow needs of the data center.
One simple thing that can be done to improve airflow is to check for air leakages. Leakages can come from unsealed raised floor openings, particularly under electrical gear and cable cutouts, and cause cold air to seep into undesired areas.
Leakages can also allow warm, dusty, and humid (undesirable) air from other places to enter the data room, reducing airflow quality. To prevent this, air restrictors can be used to close openings. Also, data facility managers should ensure gaps within and between IT racks are sealed.
Height of raised floor
Efficient airflow underneath raised floors is essential as it affects the airflow of the whole data room. Some data centers are not sufficiently optimizing their airflow due to the height of the raised floor.
The most efficient height should be ascertained, for two reasons. If the floor is too low, airflow is restricted. If the floor is too high, the air has further to travel to get where it’s needed, and there is an increased chance of impedance diverting the air to unintended areas.
Basic physics informs us that obstructions under the raised floor will affect air velocity and momentum. To optimize this space so air is efficiently supplied to the data room, obstructions should be limited both under the raised floor and in the data center itself.
A common obstruction in data centers is poor cable management.
Data center managers should check that cables are bundled tightly and routed along the direction of airflow. Modern data centers keep cabling away from under the raised floor, allowing better airflow.
If the servers require a larger volume of cold air than the cold aisles can deliver, hot air will likely recirculate into the cold aisles to make up the shortfall.
This is usually caused when CRACs are placed improperly, so not enough cold air is supplied to the correct aisles. Data center managers should assess where CRACs are placed to avoid this.
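A quick way to spot that shortfall is to compare what the racks demand with what the perforated tiles deliver. The sketch below uses the common rule of thumb CFM ≈ 3.16 × watts ÷ ΔT(°F); the rack loads, tile count, and per-tile flow are all assumed figures for illustration:

```python
# Back-of-envelope check: does cold-aisle supply meet the racks' demand?
# Rule of thumb: CFM = 3.16 * watts / delta_T_F. All other numbers are
# assumptions for illustration, not measurements from a real facility.
def required_cfm(watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow needed to carry away `watts` at the given temperature rise."""
    return 3.16 * watts / delta_t_f

rack_load_w = 8_000      # assumed IT load per rack
racks_per_aisle = 10     # assumed
tiles_per_aisle = 10     # assumed perforated tile count
cfm_per_tile = 500       # typical 25%-open tile under good pressure (assumed)

demand = required_cfm(rack_load_w) * racks_per_aisle
supply = cfm_per_tile * tiles_per_aisle
print(f"Demand: {demand:,.0f} CFM, supply: {supply:,.0f} CFM")
if demand > supply:
    print("Shortfall: hot air will likely recirculate into the cold aisle.")
```

With these example numbers the aisle is undersupplied by more than half, which is exactly the situation where hot-air recirculation appears.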
Without proper management, airflow can bypass its intended destination – increasing the possibility of hot air sneaking into the cold aisles.
When hot exhaust air mixes with cold supply air, it cools and becomes denser, losing the buoyancy that would normally carry it up to the ceiling return plenums, which provide a path for hot air to travel back to the CRACs to be cooled. Instead, the mixed air becomes stagnant – and these stagnant zones are very difficult to cool.
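The effect of that mixing on cold-aisle temperature is easy to estimate with a flow-weighted average (an idealized model that treats air density as constant; the flows and temperatures below are assumptions for illustration):

```python
# Mixed-air temperature as a flow-weighted average of the streams.
# Idealized: assumes constant air density, so CFM stands in for mass flow.
def mixed_temp_f(flows_cfm, temps_f):
    """Temperature of the blend of several air streams, in Fahrenheit."""
    total = sum(flows_cfm)
    return sum(f * t for f, t in zip(flows_cfm, temps_f)) / total

# Example: 300 CFM of 95 F exhaust leaking into 900 CFM of 65 F supply.
blend = mixed_temp_f([300, 900], [95, 65])
print(f"Cold aisle now receives air at about {blend:.1f} F")
```

Even a modest leak of exhaust air in this sketch raises the cold-aisle intake temperature by several degrees, which is why containment pays off.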
There is a range of cheap, effective equipment that can be used to increase containment of hot and cold air. Blanking panels, for example, can be added to cover empty rack slots, preventing hot air from seeping into the cold aisles. Plastic curtains (similar to those in commercial refrigeration) can also be used to seal off cold aisles.
Our final advice? Always analyze airflow efficiency before investing in costly air-conditioning units. And remember, true airflow optimization requires expert assistance.
What will future data centers look like?
Future data centers may look very different as companies innovate alternative solutions to suit their cooling needs. A few years ago, Microsoft submerged 864 of its servers on the ocean floor close to the northern coast of Scotland – sounds bizarre right?
There was actually a good reason for this experiment: the cold northern ocean provides a constant, natural cooling source.
Microsoft’s underwater data center has now resurfaced – and it seems the idea has potential: with no humans on site to interfere, the server failure rate was one-eighth that of comparable servers on land.
Maybe fish are better data center technicians than humans?