Grey-water recycling and rainwater capture are often proposed as ways to reduce the millions of litres of groundwater needed to cool large data centres. However, according to Sandia National Laboratories researcher David J. Martinez, the simpler answer in many climates is to use liquid refrigerant.
Based on this principle, Martinez (engineering project lead for Sandia’s infrastructure computing services) is helping design and monitor a cooling system expected to save four to five million gallons (15 to 19 million litres) annually in New Mexico if installed next year at Sandia’s computing centre, and hundreds of millions of gallons nationally if the method is widely adopted. It is currently being tested at the National Renewable Energy Laboratory in Colorado, which expects to save a million gallons (almost four million litres) annually.
The system, built by Johnson Controls and called the Thermosyphon Cooler Hybrid System, cools like a refrigerator without the expense and energy needs of a compressor.
Currently, many data centres use water to remove waste heat from servers. The warmed water is piped to cooling towers, where a separate stream of water is turned to mist and evaporates into the atmosphere. Like sweat evaporating from the body, the process removes heat from the piped water, which returns to chill the installation. However, large-scale replenishment is needed to continue the process. Thus, an increasing amount of water will be needed worldwide to evaporate heat from the growing number of data centres, which themselves are increasing in size as more users put information into the cloud.
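To give a sense of the scale of that replenishment, here is a rough, hypothetical back-of-envelope calculation (not from the article) of the water a cooling tower evaporates for a continuous heat load, assuming all rejected heat leaves as latent heat of vaporisation and ignoring drift and blowdown losses:

```python
# Back-of-envelope estimate of cooling-tower evaporation losses.
# Assumptions (illustrative, not from the article): all rejected heat leaves
# as latent heat of evaporation; drift and blowdown losses are ignored.

LATENT_HEAT_KJ_PER_KG = 2260       # latent heat of vaporisation of water
SECONDS_PER_YEAR = 365 * 24 * 3600

def annual_evaporation_litres(heat_load_kw: float) -> float:
    """Litres of water evaporated per year to reject a continuous heat load."""
    kg_per_second = heat_load_kw / LATENT_HEAT_KJ_PER_KG  # 1 kW = 1 kJ/s
    return kg_per_second * SECONDS_PER_YEAR               # 1 kg of water ~ 1 litre

# A hypothetical 1 MW heat load evaporates roughly 14 million litres a year,
# the same order of magnitude as the 15-19 million litres cited for Sandia.
print(f"{annual_evaporation_litres(1000):,.0f} litres/year")
```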
“My job is to eventually put cooling towers out of business,” said Martinez.
“So now most data centres use water to cool themselves, but I’m always looking at the future and I see refrigerant cooling coming in for half the data centres in the US, north and west of Texas, where the climate will make it work.”
The prototype method uses a liquid refrigerant instead of water to carry away heat. The system works like this: Water heated by the computing centre is pumped within a closed system into proximity with another system containing refrigerant. The refrigerant absorbs heat from the water so that the water, now cooled, can circulate to cool again. Meanwhile, the heated refrigerant vaporises and rises in its closed system to exchange heat with the atmosphere. As heat is removed from the refrigerant, it condenses and sinks to absorb more heat, and the cycle repeats.
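A lumped, minimal sketch of that two-loop arrangement is shown below; the heat loads, conductances, thermal masses and temperatures are illustrative assumptions, not Sandia's or Johnson Controls' figures. It shows how the water loop settles a few degrees above the refrigerant, which in turn settles a few degrees above the outside air:

```python
# Minimal two-loop sketch: a closed water loop picks up server heat, hands it
# to a refrigerant loop across a heat exchanger, and the refrigerant rejects
# it to outside air. All numbers below are illustrative assumptions.

SERVER_HEAT_KW = 500          # hypothetical IT heat load
UA_WATER_TO_REFRIG = 100      # heat-exchanger conductance, kW per degree C
UA_REFRIG_TO_AIR = 100        # condenser conductance, kW per degree C
WATER_THERMAL_MASS = 2000     # kJ per degree C in the water loop
REFRIG_THERMAL_MASS = 500     # kJ per degree C (lumped, effective)
OUTSIDE_AIR_C = 15.0

def simulate(hours: float = 1.0, dt_s: float = 1.0):
    """Step both loop temperatures forward until they settle."""
    water_c, refrig_c = 25.0, 20.0
    for _ in range(int(hours * 3600 / dt_s)):
        q_w_to_r = UA_WATER_TO_REFRIG * (water_c - refrig_c)        # kW
        q_r_to_air = UA_REFRIG_TO_AIR * (refrig_c - OUTSIDE_AIR_C)  # kW
        water_c += (SERVER_HEAT_KW - q_w_to_r) * dt_s / WATER_THERMAL_MASS
        refrig_c += (q_w_to_r - q_r_to_air) * dt_s / REFRIG_THERMAL_MASS
    return water_c, refrig_c

water_c, refrig_c = simulate()
print(f"water loop settles near {water_c:.1f} C, refrigerant near {refrig_c:.1f} C")
```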
“There’s no water loss like there is in a cooling tower that relies on evaporation,” said Martinez.
“We also don’t have to add chemicals such as biocides, another expense. This system does not utilise a compressor, which would incur more costs. The system utilises phase-changing refrigerant and only requires outside air that’s cool enough to absorb the heat.
“If you don’t have to cool a data centre to 45° Fahrenheit (7° Celsius) but instead only to 65 to 80° (18 to 27° Celsius), then a warmer outside temperature – just a little cooler than the necessary temperature in the data centre – could do the job.”
Inside the facility, better design of indirect air cooling delivers the right amount of cooling to the right location, allowing operating temperatures to be raised and the refrigerant cycle to be used for more of the year, as the sketch below illustrates.
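The following rough illustration (using made-up temperatures, not New Mexico weather data) shows why a warmer room setpoint lets the refrigerant loop do the work for more hours of the year: it simply counts the hours when outside air is cold enough to absorb the heat, comparing a 7°C setpoint with a 25°C one. The 5°C "approach" margin is an assumption for the sake of the example.

```python
# Count hours per year when outside air is cool enough for the refrigerant
# loop, for two room setpoints. Temperatures are randomly generated stand-ins,
# not real weather data; the approach margin is an assumption.

import random

random.seed(0)
outdoor_c = [random.uniform(-5, 35) for _ in range(365 * 24)]  # fake hourly data

APPROACH_C = 5  # assume outside air must be ~5 C below the room target

def refrigerant_hours(setpoint_c: float) -> int:
    return sum(1 for t in outdoor_c if t <= setpoint_c - APPROACH_C)

for setpoint in (7, 25):
    print(f"setpoint {setpoint:>2} C: {refrigerant_hours(setpoint):5d} h/year on the refrigerant loop")
```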
“At Sandia, we used to have to run at 45° Fahrenheit (7° Celsius). Now we’re at 65 to 78° Fahrenheit (18 to 25° Celsius). We arranged for air to flow more smoothly instead of ignoring whorls as it cycled in open spaces,” said Martinez.
“We did that by working with supercomputer architects and manufacturers of cooling units so they designed more efficient air-flow arrangements. Also, we installed fans sensitive to room temperature, so they slow down as the room cools from decreased computer usage and go faster as computer demand increases. This results in a more efficient and economical way to circulate air in a data centre.”
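A simple sketch of the temperature-sensitive fan behaviour Martinez describes might look like the following; the speed limits and the linear ramp are illustrative assumptions, not the actual controller logic, though the 18 to 27°C band comes from the temperatures quoted in the article.

```python
# Sketch of temperature-driven fan control: fans slow as the room cools and
# speed up as it warms. Setpoints and speed limits are illustrative.

MIN_SPEED, MAX_SPEED = 0.3, 1.0   # fraction of full fan speed (assumed)
LOW_C, HIGH_C = 18.0, 27.0        # room-temperature band from the article

def fan_speed(room_temp_c: float) -> float:
    """Linearly ramp fan speed across the allowed room-temperature band."""
    if room_temp_c <= LOW_C:
        return MIN_SPEED
    if room_temp_c >= HIGH_C:
        return MAX_SPEED
    frac = (room_temp_c - LOW_C) / (HIGH_C - LOW_C)
    return MIN_SPEED + frac * (MAX_SPEED - MIN_SPEED)

for temp in (17, 20, 24, 28):
    print(f"{temp} C -> {fan_speed(temp):.0%} fan speed")
```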
Big jobs that don’t have to be completed immediately can be scheduled at night when temperatures are cooler.
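As a toy illustration of that idea, a scheduler could defer non-urgent work until a cooler overnight window; the window below is a hypothetical choice, not a policy described in the article.

```python
# Toy illustration of deferring non-urgent jobs to cooler night hours.
# The 22:00-06:00 window is an assumption for the example.
from datetime import datetime

NIGHT_START, NIGHT_END = 22, 6

def next_night_slot(now: datetime) -> datetime:
    """Return `now` if it already falls in the night window, else tonight's 22:00."""
    if now.hour >= NIGHT_START or now.hour < NIGHT_END:
        return now
    return now.replace(hour=NIGHT_START, minute=0, second=0, microsecond=0)

print(next_night_slot(datetime(2024, 7, 1, 14, 30)))  # deferred to 22:00
print(next_night_slot(datetime(2024, 7, 1, 23, 15)))  # runs immediately
```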
“Improving efficiencies inside a system raises efficiencies in the overall system,” said Martinez.
“That saves still more water by allowing more use of the water-saving refrigerant system.”