OpenAI CEO Sam Altman Dismisses Claims About ChatGPT’s Water and Energy Use

Key Points
- Sam Altman called claims that a ChatGPT query uses 17 gallons of water “totally fake.”
- He also said the idea that a query consumes the equivalent of 1.5 iPhone‑battery charges is inaccurate.
- Altman acknowledged past water use from evaporative cooling in data centers but noted modern facilities no longer rely on it.
- He emphasized that the real concern is AI’s total energy consumption, not per‑query usage.
- Altman urged a swift shift to nuclear, wind, and solar power to meet AI’s growing energy needs.
- There is currently no legal mandate for tech firms to disclose precise energy or water usage figures.
- Independent scientists are attempting to study AI’s environmental impact without official data.
- Data‑center operations have been linked to rising electricity prices.
- Altman argued that comparing the energy used to train a model with the energy a human uses to answer a single question is unfair.
- He suggested AI may already match human energy efficiency for inference when measured against the lifetime energy humans expend to learn.
OpenAI chief executive Sam Altman responded to criticism of artificial‑intelligence energy and water consumption, labeling recent claims as unfounded. He said reports that a single ChatGPT query consumes 17 gallons of water or 1.5 iPhone‑battery charges are “totally fake” and “unfair.” While acknowledging that data‑center cooling once relied on evaporative methods, Altman emphasized the broader need for clean power, noting that the industry should shift quickly toward nuclear, wind, and solar sources. He also highlighted the lack of legal disclosure requirements and the difficulty of measuring AI’s environmental footprint.
OpenAI’s Leader Addresses Environmental Criticism
Sam Altman, chief executive of OpenAI, recently spoke about growing public concern over the environmental impact of artificial‑intelligence systems. In a public interview, Altman directly challenged widely circulated figures that suggest a single ChatGPT query uses a large amount of water or electricity.
Altman described the claim that a query consumes “17 gallons of water” as “completely untrue” and “totally insane,” stressing that the figure bears no connection to reality. He also refuted the notion that a single request draws the equivalent of “1.5 iPhone battery charges,” stating there is “no way it’s anything close to that much.”
While dismissing these specific metrics, Altman acknowledged that AI’s overall energy consumption is a legitimate concern. He noted that data‑center cooling historically relied on evaporative methods, which required significant water use, but modern facilities have largely moved away from that approach. He argued that the real issue lies in the total energy demand of AI technologies worldwide, not the per‑query usage.
Altman called for a rapid transition to low‑carbon power sources, saying the world must “move towards nuclear or wind and solar very quickly.” He highlighted that there is currently no legal requirement for technology companies to disclose the exact amounts of energy and water they consume, leaving independent scientists to estimate the impact on their own.
He also referenced broader industry trends, noting that data‑center operations have been linked to rising electricity prices. In his view, comparisons that pit the energy required to train a model against the energy a human uses to answer a single question are “unfair.”
Altman drew a parallel between AI and human learning, pointing out that humans spend “like 20 years of life and all of the food you eat during that time” to become knowledgeable. He suggested that when measured against this benchmark, AI may already have achieved comparable energy efficiency for inference tasks.
Overall, Altman’s remarks aim to shift the conversation from sensational per‑query statistics to a more nuanced understanding of AI’s total energy footprint and the importance of clean energy adoption across the sector.