AI Data Centers Face Growing Energy and Water Challenges

Why do AI data centers use so many resources?
Engadget

Key Points

  • AI data centers rely heavily on power‑hungry GPUs, driving a sharp rise in electricity use.
  • Heat from GPUs forces intensive cooling, dramatically increasing water consumption.
  • Traditional cooling methods consume billions of liters of water annually.
  • Closed‑loop liquid cooling and micro‑fluidic chip cooling offer significant heat‑reduction gains.
  • Free‑cooling and geothermal solutions are being piloted to cut water and energy use.
  • Industry experts are urging greater transparency about emissions and resource use.
  • Smaller, tuned AI models and better hardware design can lower overall demand.

The surge in AI workloads is driving rapid expansion of data centers that rely heavily on GPUs, which consume far more power and generate far more heat than traditional CPUs. This growth is straining electricity supplies and dramatically increasing water use for cooling. Industry leaders are exploring liquid cooling, micro‑fluidic chips, free cooling, and geothermal options, while experts call for greater transparency, smarter hardware design, and more efficient model use to curb the environmental impact.

Rising Resource Demands

AI’s rapid adoption has spurred the construction of new data centers packed with graphics processing units (GPUs). Unlike CPUs, GPUs “activate all of the processing units at once,” leading to higher power consumption and heat output. Reported U.S. data center electricity use rose from roughly 60 TWh in earlier years to 176 TWh by 2023, lifting data centers’ share of total U.S. electricity consumption from about 1.9 percent to nearly 4.4 percent.
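As a quick back-of-envelope check, those figures imply roughly a threefold jump in data center electricity use and a total U.S. draw of around 4,000 TWh. The sketch below uses only the numbers quoted above; nothing else comes from the source.

    # Back-of-envelope check using only the article's figures:
    # 60 TWh -> 176 TWh, with the latter equal to 4.4% of U.S. consumption.
    early_twh, recent_twh = 60.0, 176.0
    recent_share = 0.044

    growth_factor = recent_twh / early_twh        # roughly 2.9x more electricity
    implied_us_total = recent_twh / recent_share  # roughly 4,000 TWh nationwide

    print(f"Data center electricity grew about {growth_factor:.1f}x")
    print(f"Implied total U.S. consumption: about {implied_us_total:,.0f} TWh")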

Heat generated by GPUs must be managed to maintain performance and longevity. Standard guidelines advise keeping server rooms between 18 °C and 27 °C (64.4 °F to 80.6 °F). Meeting these limits requires intensive cooling, which in turn drives water consumption.
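The Fahrenheit equivalents quoted above follow from the standard conversion; the short check below is illustrative and not taken from the source.

    def c_to_f(celsius: float) -> float:
        """Convert a Celsius temperature to Fahrenheit."""
        return celsius * 9 / 5 + 32

    # Recommended server-room range cited above.
    for temp_c in (18, 27):
        print(f"{temp_c} °C = {c_to_f(temp_c):.1f} °F")  # prints 64.4 °F and 80.6 °F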

Cooling and Water Use

Traditional cooling methods—hot and cold aisle containment, direct and indirect evaporative cooling—rely on large volumes of water. U.S. data centers consumed 21.2 billion liters of water in 2014, climbing to 66 billion liters by 2018. AI‑focused facilities alone used about 55.4 billion liters in 2023, with projections suggesting a potential rise to 124 billion liters by 2028.
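Taken at face value, that projection implies a steep compound growth rate for AI-focused facilities. The calculation below is a rough sketch using only the article's 2023 and 2028 figures.

    # Implied compound annual growth rate (CAGR) from the article's projection:
    # about 55.4 billion liters in 2023 rising to about 124 billion liters by 2028.
    start_bl, end_bl = 55.4, 124.0
    years = 2028 - 2023

    cagr = (end_bl / start_bl) ** (1 / years) - 1
    print(f"Implied growth rate: about {cagr:.1%} per year")  # roughly 17.5% per year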

Much of this water, however, is lost to evaporation or treated with chemicals, limiting its reuse. Indirect water use—primarily from power plants that supply electricity—accounts for roughly three‑quarters of a data center’s total water footprint.
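To make the three-quarters share concrete, the hypothetical back-calculation below combines it with the 66-billion-liter direct-use figure from 2018; the resulting total is an illustration, not a reported statistic.

    # Illustration only: if indirect water (from electricity generation) is about
    # 75% of the total footprint, on-site cooling water is the remaining quarter.
    direct_liters = 66e9      # article's 2018 direct-use figure
    indirect_share = 0.75     # rough share attributed to power generation

    total_liters = direct_liters / (1 - indirect_share)
    indirect_liters = total_liters * indirect_share
    print(f"Implied total footprint: about {total_liters / 1e9:.0f} billion liters")
    print(f"Of which indirect use:   about {indirect_liters / 1e9:.0f} billion liters")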

Emerging Sustainable Solutions

Companies are testing a range of cooling innovations. Closed‑loop liquid cooling, already common in high‑end PCs, is being adopted in AI data centers to recycle coolant without loss. Microsoft’s micro‑fluidic cooling system demonstrated up to three times better heat removal than traditional cold plates, cutting the temperature rise of GPU silicon by 65 percent in lab tests.
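To give a feel for what a 65 percent smaller temperature rise means, the sketch below uses invented operating numbers; only the 65 percent figure comes from the reporting.

    # Hypothetical illustration of a 65% lower silicon temperature rise.
    # The coolant temperature and baseline rise are assumed, not Microsoft's data.
    coolant_inlet_c = 30.0    # assumed coolant inlet temperature
    baseline_rise_c = 40.0    # assumed rise above coolant with a cold plate
    reduction = 0.65          # reduction reported in the lab tests

    microfluidic_rise_c = baseline_rise_c * (1 - reduction)
    print(f"Cold plate:    about {coolant_inlet_c + baseline_rise_c:.0f} °C at the silicon")
    print(f"Micro-fluidic: about {coolant_inlet_c + microfluidic_rise_c:.0f} °C at the silicon")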

Free‑cooling leverages ambient air or seawater, sometimes combined with rainwater harvesting. Start Campus in Portugal plans to circulate seawater through its cooling loop, pledging no meaningful water loss. Immersion cooling—submerging hardware in non‑conductive liquids—remains niche but shows promise for specific applications.

Geothermal options are gaining traction. Iron Mountain Data Centers uses a 35‑acre underground reservoir for year‑round cooling, while Meta has partnered with Sage Geosystems to source up to 150 MW of geothermal power for future facilities.
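For a sense of scale, 150 MW of geothermal capacity running around the clock would deliver on the order of a terawatt-hour of electricity per year; the estimate below is a rough ceiling, not a figure from the source.

    # Rough upper bound on annual energy from 150 MW of geothermal capacity,
    # assuming (hypothetically) continuous full-power output all year.
    capacity_mw = 150
    hours_per_year = 24 * 365

    annual_twh = capacity_mw * hours_per_year / 1e6   # MWh -> TWh
    print(f"Up to about {annual_twh:.1f} TWh per year")  # roughly 1.3 TWh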

Industry Calls for Transparency and Efficiency

Experts stress that greater transparency about emissions and resource use is essential. Vijay Gadepally of MIT’s Lincoln Laboratory urges AI firms to disclose footprints and to prioritize smarter chip designs and better utilization of existing capacity. He notes that many AI models are “vastly overpowered,” suggesting that smaller, tuned models can achieve comparable performance while reducing computational waste.

By focusing on efficient hardware, smarter workload allocation, and sustainable cooling, the industry can mitigate the strain on power grids and water supplies, avoiding “rolling blackouts, less potable water and rising utility costs” for surrounding communities.

