How Can Data Centers Reduce Water Usage and Improve Efficiency?

Richard Pallardy | Jul 27, 2023

When most people think of computing, water may be far from their minds. This resource, however, is crucial to the cooling systems of most large data centers -- upon which the world’s digital infrastructure increasingly relies.

Through a variety of mechanisms, water absorbs the heat energy emitted by servers, keeping their temperature stable and allowing them to function without interruption.

However, our reliance on water to cool the contraptions that propel the digital era is creating new problems -- especially in arid regions acutely afflicted by water scarcity.

Data centers use enormous amounts of water to keep servers cool, employing a number of different methods.

“The high water usage in data centers is primarily due to the cooling systems that prevent heat buildup from the data servers,” says Ian Clatworthy, director of data platform product marketing at Hitachi Vantara. “Cooling systems like chillers, cooling towers, and air conditioners rely on water to maintain the optimal temperature of the equipment, resulting in considerable water consumption.”

Exactly how much water they use remains obscure, largely because major providers disclose little. It is also difficult to pin down exact figures because water is consumed both to cool the machines and to generate the electricity that powers them.

Electricity generation may use as much as four times the amount of water consumed in direct cooling. Data centers account for nearly 2% of total electricity use in the United States. And while around 40% of data is still stored on small, onsite servers, data is increasingly outsourced to massive offsite operations.

Research by the Uptime Institute suggests that a single large data center may use up to 6.75 million gallons of water a year. A 2021 paper estimates an average of around 150,000 gallons. Up to 57% may be sourced from potable water supplies -- unusual for industrial use.
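To see how these figures interact, here is a rough back-of-the-envelope sketch in Python. The inputs are illustrative placeholders taken from the numbers cited above, not measurements for any specific facility.

```python
# Back-of-the-envelope estimate of a data center's total water footprint,
# combining on-site cooling water with the water consumed off-site to
# generate the electricity that powers the facility. All inputs are
# illustrative placeholders drawn from the figures cited in the article.

DIRECT_COOLING_GALLONS_PER_YEAR = 6_750_000  # upper bound cited by the Uptime Institute
ELECTRICITY_WATER_MULTIPLIER = 4             # "as much as four times" the direct-cooling water


def total_water_footprint(direct_gallons: float, multiplier: float) -> float:
    """Direct cooling water plus estimated electricity-related water."""
    return direct_gallons + direct_gallons * multiplier


if __name__ == "__main__":
    total = total_water_footprint(DIRECT_COOLING_GALLONS_PER_YEAR,
                                  ELECTRICITY_WATER_MULTIPLIER)
    print(f"Estimated total water footprint: {total:,.0f} gallons per year")
```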

“Unfortunately, this practice can put a significant strain on local water supplies, particularly in drought-prone areas,” says data scientist and writer Kat Campbell. “Furthermore, the water used in cooling systems often ends up as wastewater, which can be an additional environmental concern.”

Much of the water usage problem is attributable to inefficient cooling systems -- cooling towers in particular. Cooling towers spray the hot air generated by servers with water or pass the air over wet media. Heat energy is removed from the air through contact with the water, and the cooler air is then recirculated. Substantial amounts of water are lost through evaporation. Many of these systems also use coolant devices -- chillers -- to further lower the temperature of the air. Because the water can breed pathogens, it must be treated with toxic chemicals, which may be released along with water vapor.
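The scale of evaporative loss can be sketched with a simplified model: if all of the heat rejected by a cooling tower were carried away by evaporation, the water consumed would be roughly the heat load divided by water's latent heat of vaporization. The sketch below is illustrative only and ignores blowdown, drift, and sensible cooling.

```python
# Simplified estimate of evaporative water loss in a cooling tower.
# Assumes all rejected heat is removed by evaporation and ignores
# blowdown, drift, and sensible cooling -- a deliberate oversimplification.

LATENT_HEAT_KJ_PER_KG = 2260.0   # approximate latent heat of vaporization of water
KJ_PER_KWH = 3600.0
LITERS_PER_KG = 1.0              # water density is roughly 1 kg per liter


def evaporative_loss_liters(it_load_kw: float, hours: float) -> float:
    """Water evaporated (liters) to reject a given IT heat load over time."""
    heat_kj = it_load_kw * hours * KJ_PER_KWH
    return heat_kj / LATENT_HEAT_KJ_PER_KG * LITERS_PER_KG


# Example: a 1 MW IT load running for 24 hours
print(f"{evaporative_loss_liters(1000, 24):,.0f} liters per day")
```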

Adiabatic cooling employs similar principles but only kicks in when ambient temperatures require it. While these systems may require less water, they also require more energy to run.

“The water consumed is dependent on many variables, including the IT supply air setpoint, the ambient temperature and humidity ranges for the geographical location, and the type of cooling system deployed,” claims Stuart Lawrence, vice president of product innovation and sustainability for Stream Data Centers. “Not all systems that use water to cool use a lot -- some use higher proportions of electrical energy combined with more equipment to reject the heat using air only and no water. Some only use evaporative cooling in peak summer months and, depending on the location, that might only require a small amount of water for a couple hours a day.”

Still, up to a fifth of data centers in the United States draw from watersheds that are already “moderately or highly” water-stressed, according to 2021 research from Virginia Tech and Lawrence Berkeley National Lab.

The placement of data centers in these regions has put them in conflict with local water needs, especially because many of them draw their supply from treated water that is ready for drinking, residential irrigation, and cleaning. In some areas, permits for new data centers have been denied due to the strain they would likely put on the water supply. In others, residents and environmental groups have protested.

Erecting data centers in cooler climates may reduce the need for water. By drawing on external air that is already cool, chillers may be unnecessary or needed only during warmer parts of the year. Google and Microsoft have experimented with this approach, known as “free cooling.”
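A minimal sketch of the control logic behind free cooling, assuming a single ambient-temperature cutoff; the threshold used here is an arbitrary illustration, not a vendor or industry recommendation.

```python
# Minimal sketch of a "free cooling" decision: use outside air when the
# ambient temperature is low enough, fall back to chillers otherwise.
# The cutoff is an assumed, illustrative value.

FREE_COOLING_MAX_AMBIENT_C = 18.0   # assumed cutoff for relying on outside air alone


def cooling_mode(ambient_c: float) -> str:
    """Choose a cooling mode for the current ambient temperature."""
    if ambient_c <= FREE_COOLING_MAX_AMBIENT_C:
        return "free-cooling"        # outside air only; chillers stay off
    return "chiller"                 # too warm outside; mechanical cooling needed


for temp in (5.0, 16.0, 27.0):
    print(f"{temp:>5.1f} °C -> {cooling_mode(temp)}")
```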

Google maintains that it is pursuing a hyperlocal approach to water use. “At each data center campus, our cooling decisions look at the local environment -- balancing the availability of carbon-free energy and responsibly-sourced water -- to minimize the net climate impact both today and in the future,” the company claims.

Still, the company argues that water-cooled data centers are more efficient than air-cooled centers. While this is true in some situations, it is not in others. Proponents of air cooling maintain that, overall, their method is more efficient and requires fewer resources to operate.

Continue reading this article on InformationWeek.
