Keeping an open mind 

Traditional energy-hungry data centre cooling systems can be made more efficient by combining modern cooling technologies with an open mind. Dean Ward of Walter Meier (Climate UK) explains 

When designing cooling for data centres, it’s clear the priority is to maintain optimum conditions for the servers housed there. Historically, in such applications, the mission-critical nature of server rooms has meant that energy efficiency has taken a back seat. 

This situation is changing, however – not in terms of mission criticality but in the increasingly important role that reducing carbon emissions now plays in the corporate agenda of the organisations that operate data centres. This means that building services engineers need to try to maintain system resilience while introducing higher efficiencies. At the same time, as server processing power increases so too does the cooling load, making the whole situation even more challenging.

In meeting these challenges, I would suggest that designers need to take advantage of all the tools that are available to them and, without a doubt, one of these is free cooling. To that end, there are now computer room air conditioning (CRAC) units and chillers that have been optimised to maximise free cooling – even at ambient temperatures as high as 15°C. In this respect, they are designed to gradually reduce levels of mechanical cooling as ambient temperature falls, so that 100% free cooling can often be achieved at ambient temperatures of 0°C.
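As a rough illustration – not the actual control logic of any particular CRAC unit or chiller – the gradual blend of free and mechanical cooling described above can be sketched as a linear ramp between the two ambient thresholds quoted:

```python
def free_cooling_fraction(ambient_c, full_free_at=0.0, no_free_above=15.0):
    """Illustrative model: 100% free cooling at or below 0 degC,
    none above 15 degC, and a proportional blend in between.
    (Assumed linear ramp -- real plant controls are more sophisticated.)"""
    if ambient_c <= full_free_at:
        return 1.0
    if ambient_c >= no_free_above:
        return 0.0
    return (no_free_above - ambient_c) / (no_free_above - full_free_at)

# At 7.5 degC ambient, this simple model gives a 50/50 split between
# free and mechanical cooling.
```

Even this crude model shows why UK ambient conditions suit free cooling well: for much of the year the outdoor temperature sits below 15°C, so some mechanical cooling can be displaced almost year-round.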

Clearly this offers significant energy-saving opportunities. Indeed, field trials with plant optimised for free cooling, operating for 24 hours a day – as would be typical for a data centre – can achieve around 35% energy savings compared to traditional designs.

Fine tuning 

In parallel, there are also things that can be done with the design to maximise the opportunities to deploy free cooling. For example, much of the received wisdom on the design of data centre cooling systems is based on requirements of older style computer servers, which were very vulnerable to rising temperatures. 

Modern servers, however, are more tolerant and capable of operating at temperatures of up to 26°C without any impairment to their function. Consequently, it often isn’t necessary to design to 22°C at 50% relative humidity – though clearly this is a decision that needs to be made on the basis of a good understanding of the servers being used in each project. In any case, with heat loads of perhaps 1,500 W/m² the temperature will rise so quickly in the event of a major cooling failure that a few degrees’ difference in set point will make little difference to the end result. 
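A rough energy-balance estimate shows just how quickly the room air warms if all cooling is lost. The figures below are assumptions for illustration (a 3 m high air volume, standard air density and specific heat capacity), not measured data:

```python
def warming_rate_k_per_min(load_w_per_m2=1500.0, height_m=3.0,
                           air_density=1.2, cp_air=1005.0):
    """Rate of room air temperature rise (K per minute) if all cooling
    fails, from heat load per m2 of floor divided by the thermal
    capacity of the air column above it. Ignores the thermal mass of
    racks and building fabric, so it is a worst-case figure."""
    watts_per_kelvin = height_m * air_density * cp_air  # J/K per m2 floor
    return load_w_per_m2 / watts_per_kelvin * 60.0

rate = warming_rate_k_per_min()  # roughly 25 K/min with these assumptions
```

At that rate of rise, the difference between a 22°C and a 26°C set point buys only seconds of extra grace, which supports the point that the choice of set point should be driven by server tolerance rather than by failure scenarios.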

Where it is possible to operate at higher temperatures there is an opportunity to use higher chilled water temperatures; perhaps 10°C/16°C flow/return, rather than the traditional 6°C flow/12°C return. Here, it will often be necessary to increase air volumes as well, resulting in an increase in fan power, but the overall energy savings will still be significant. 

The high resilience of data centres also means that most have more CRAC units than are required at any one time, so they operate at part-capacity for much of the time. Here, variable-speed fans will also help to reduce energy consumption – and larger fans (where space allows) will add to these savings. 
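The saving from variable-speed fans follows from the fan affinity laws, under which absorbed power varies roughly with the cube of fan speed, so a modest speed reduction at part load gives a disproportionately large power reduction. A minimal sketch of the idealised relationship:

```python
def fan_power_fraction(speed_fraction):
    """Fan affinity law: absorbed power scales with the cube of fan
    speed. Idealised -- it ignores changes in motor and drive
    efficiency at part load."""
    return speed_fraction ** 3

# Running a fan at 70% speed needs only about 34% of full-speed power.
part_load_power = fan_power_fraction(0.7)
```

This is also why larger, slower-running fans help: moving the same air volume at a lower speed sits further down the cube curve.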

Needless to say, any such measures will need to maintain the required level of resilience for the facility (2N or 2N+1) to ensure any plant failure does not have catastrophic results. 

Maximising savings 

As with so many critical engineering projects, the overall energy savings that can be achieved will be limited by the need to address the end user’s key business priorities. Nevertheless, by looking at every aspect of the design there are often opportunities to make small differences here and there that add up to an overall saving that’s well worth having. 

In addition to the examples above, the humidification system is certainly worth a closer look. Traditionally, CRAC units have incorporated electrode boiler humidifiers, but clearly these are not the most efficient option. As an alternative, cold water spray humidification will not only be more efficient, it will also provide some free, adiabatic cooling. 

It’s also worth bearing in mind that data centres have strict maintenance regimes, so the considerations that often militate against cooling towers and water-cooled chillers in standard commercial buildings don’t apply here. And if water-cooled chillers are used, it’s possible to achieve a much higher Seasonal Energy Efficiency Ratio (SEER) than with an air-cooled chiller. Plus, any free cooling will be based on the ambient wet bulb temperature – which will be lower than the dry bulb temperature. 

In fact, there are many ways that energy savings can be achieved in data centres without compromising on the resilience of the systems. The key is to look at each element of the system in detail and to keep an open mind.

www.waltermeier.com