Page 35 - Climate Control News December 2018

Data Centres
LEFT: Artist's impression of the Singapore data centre
It is expected to achieve a Power Usage Effectiveness (PUE) of 1.19, making it one of the most efficient in the region.
The StatePoint system uses a liquid-to-air exchanger in which water evaporates through a membrane separation layer to cool the data centre. Nortek Air Solutions said the liquid-to-air membrane exchanger prevents cross contamination between the water and air streams.
With the addition of a pre-cooling coil, the system can maintain the required cooling water temperatures in humid climates using minimal supplemental mechanical cooling.
“When deployed, the new cooling system will allow us to build highly water- and energy-efficient Facebook data centres in places where direct cooling is not feasible,” Veerendra Mulay, Facebook’s research and development mechanical engineer, wrote in a blog.
“THE HEART OF THE SYSTEM IS A LIQUID-TO-AIR ENERGY EXCHANGER.”
– R&D ENGINEER, VEERENDRA MULAY
“Based on our testing in several different locations, we anticipate the (new) system can reduce water usage by more than 20 per cent for data centres in hot and humid climates and by almost 90 per cent in cooler climates in comparison with previous indirect cooling systems.”
The heart of the SPLC system is a liquid-to-air energy exchanger, where water is cooled as it evaporates through a membrane separation layer, he wrote in the blog post.
The cold water then cools the air inside the data centre and keeps servers at optimal temperatures. “When outside air temperatures are low, the SPLC’s most energy- and water-efficient mode uses that air to produce cold water. When outside air temperatures rise, the SPLC system will operate in an adiabatic mode, in which the system engages the heat exchanger to cool the warm outside air before it goes into the recovery coil to produce cold water,” Mulay wrote.
“In hot and humid weather, the SPLC will operate in super-evaporative mode, where outside air is cooled by a pre-cooling coil and then used to produce cold water.” ✺
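Mulay's description of the three operating modes amounts to a simple decision based on outdoor conditions. As a rough sketch, the logic might look like the following; the temperature and humidity thresholds here are illustrative assumptions, not Facebook's published setpoints:

```python
def splc_mode(outside_temp_c, outside_rh_pct):
    """Pick an SPLC operating mode from outdoor conditions.

    The 18C and 60% thresholds are illustrative guesses for the
    sketch, not values from Facebook or Nortek.
    """
    if outside_temp_c < 18:
        # Cool outside air alone can produce cold water.
        return "free-cooling"
    if outside_rh_pct < 60:
        # Warm but dry: evaporation in the exchanger does the work.
        return "adiabatic"
    # Hot and humid: the pre-cooling coil tempers the air first.
    return "super-evaporative"

print(splc_mode(12, 50))   # cool day
print(splc_mode(28, 40))   # warm, dry day
print(splc_mode(32, 80))   # hot, humid day
```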
Using higher temperatures to improve efficiency
WATER CHILLERS ACCOUNT for between 60 and 85% of overall cooling-system energy consumption in the data centre.
Consequently, data centres are designed, where possible, to keep usage of chillers to a minimum and to maximise the amount of available "free cooling", in which less power-hungry systems such as air coolers and cooling towers can keep the temperature of the IT space at a satisfactory level.
One approach to reducing water chiller energy consumption is to design the cooling system so that a higher chilled water (CHW) outlet temperature from the chillers can be tolerated while maintaining a sufficient cooling effect. In this way, chillers consume less energy by not having to work as hard, and the number of free cooling hours can be increased.
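As a rough illustration of why a higher setpoint saves energy, a common industry rule of thumb puts chiller energy savings at roughly 2–3 per cent per degree Celsius of CHW setpoint increase. The figure below is an assumption for illustration only, not one taken from the whitepaper:

```python
def chiller_power_kw(base_kw, base_chw_c, new_chw_c, pct_per_degc=0.025):
    """Estimate chiller power at a higher chilled-water setpoint.

    Applies the rule of thumb that chiller energy falls roughly
    2-3% per degree C of CHW setpoint increase (2.5% assumed here).
    """
    delta = new_chw_c - base_chw_c
    return base_kw * (1 - pct_per_degc) ** delta

# Raising the setpoint from 7C to 15C on a hypothetical 500 kW plant:
print(round(chiller_power_kw(500, 7, 15), 1))  # roughly 408 kW
```

The compounding form simply applies the per-degree saving once for each degree of setpoint increase; real chiller performance curves are non-linear and plant-specific.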
As with any complex system, attention needs to be paid to all parts of the infrastructure, as changes in one area can have direct implications for another. A new whitepaper from Schneider Electric, a specialist in energy management and automation, examines the effect on overall cooling-system efficiency of operating at higher chilled water temperatures.
The whitepaper is entitled “How higher chilled water temperature can improve data centre cooling system efficiency”.
It outlines the various strategies and techniques that can be deployed to permit satisfactory cooling at higher temperatures, whilst discussing the trade-offs that must be considered at each stage, comparing the overall effect of such strategies on two data centres operating in vastly different climates.
Among the trade-offs discussed were the need to install more air-handling units inside the IT space to offset the higher water-coolant temperatures, in addition to the need for redesigned equipment such as coils, to provide an adequate cooling effect when the chilled water temperature (CHW) exceeds 20°C.
The paper also advises the addition of adiabatic, or evaporative, cooling to further improve heat rejection efficiency. Each approach requires an additional capital investment, but results in lower long-term operating expenses due to the improved energy efficiency.
The two case studies cover real-world examples in differing climates; the first is in a temperate region (Frankfurt, Germany) and the second in a tropical monsoon climate (Miami, Florida).
In each case, data was collected to assess the energy savings that were accrued by deploying higher CHW temperatures at various increments, whilst comparing the effect of deploying additional adiabatic cooling.
The study found that an increased capital expenditure of 13% in both cases resulted in energy savings of between 41% and 64%, with improvements in TCO of between 12% and 16% over a three-year period.
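As a back-of-envelope illustration of how these percentages interact, the sketch below combines the study's headline figures with a hypothetical cost base; the article does not reproduce the whitepaper's absolute costs, so the dollar amounts here are invented:

```python
# Hypothetical cost base for illustration only.
base_capex = 1_000_000       # cooling-plant capital cost ($)
base_energy_3yr = 600_000    # three-year cooling energy cost ($)

capex = base_capex * 1.13             # study: +13% capital expenditure
energy = base_energy_3yr * (1 - 0.41) # study: 41% energy saving (low end)

base_tco = base_capex + base_energy_3yr
new_tco = capex + energy
saving_pct = 100 * (1 - new_tco / base_tco)
print(f"3-year TCO change: {saving_pct:.1f}% lower")
```

With this arbitrary capex/opex split the sketch lands at around 7%, below the study's reported 12–16% range; the real TCO improvement depends entirely on the actual ratio of capital to energy costs in each facility.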
Another inherent benefit of reducing the amount of energy expended on cooling is the improvement in a data centre’s PUE (Power Usage Effectiveness) rating. As this is calculated by dividing the total amount of power consumed by a data centre by the power consumed by its IT equipment alone, any reduction in energy expended on cooling will naturally reduce the PUE figure.
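The calculation is straightforward; a minimal sketch with hypothetical load figures:

```python
def pue(total_facility_kw, it_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

# A facility drawing 1,500 kW overall with 1,000 kW of IT load:
print(pue(1500, 1000))   # 1.5
# Cutting cooling load by 200 kW lowers only the numerator:
print(pue(1300, 1000))   # 1.3
```

An ideal facility, where every watt went to IT equipment, would score exactly 1.0; the Singapore data centre's projected 1.19 means only 19% overhead on top of the IT load.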
The Schneider Electric study found that PUE for the two data centres examined was reduced by 14% in the case of Miami and 16% in the case of Frankfurt. ✺

































































































