New AI Data Centers Could Emit More CO₂ Than Morocco, Report Finds

Key Points
- Wired reports 11 U.S. AI data centers could emit 129 million tons of CO₂ annually.
- Emission potential exceeds Morocco's total greenhouse‑gas output in 2024.
- Facilities are linked to OpenAI, Meta, Microsoft and xAI.
- Companies plan dedicated natural‑gas power plants to bypass grid limits.
- Microsoft's project alone may emit over 11.5 million tons per year.
- xAI's turbines in Memphis and Southaven could each add 6.4 million tons per year.
- Models assume full‑capacity operation; real output may be lower but still high.
- Local communities could face air‑quality and health impacts.
- The report highlights a growing clash between AI growth and climate goals.

A Wired investigation reveals that 11 gas-powered AI data centers under construction or announced across the United States, tied to OpenAI, Meta, Microsoft and xAI, are designed to run on dedicated natural-gas power plants that sidestep grid constraints. If built and operated as modeled, the sites could release roughly 129 million metric tons of carbon dioxide each year, more than the entire nation of Morocco emitted in 2024, raising fresh concerns about the climate impact of the AI boom.
The report identifies projects already announced or under construction in Texas, New Mexico, Ohio, Wisconsin and several other states. Microsoft, for example, is eyeing a natural-gas facility that alone could generate more than 11.5 million tons of greenhouse gases annually, surpassing Jamaica's total emissions. xAI's planned turbines in Memphis, Tennessee, and Southaven, Mississippi, are each projected to add roughly 6.4 million tons of CO₂ equivalent per year.
Companies are pursuing these on-site power solutions to avoid bottlenecks in the existing electrical grid, which struggles to meet the enormous energy demand of AI workloads. By building their own gas-fueled stations, they aim to secure a reliable, high-capacity power supply for training large language models and other compute-intensive tasks.
Wired’s emissions estimates are based on models that assume the power plants run continuously at full capacity. Even if real-world operation trims that figure by a third, the output would still represent a considerable carbon burden. The potential environmental impact also extends beyond global metrics: local communities near the proposed plants may face air-quality degradation and other health risks associated with sustained natural-gas combustion.
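The scale of that caveat can be sketched with simple arithmetic. The 129-million-ton figure comes from the report; the one-third reduction below is an illustrative assumption, not a reported estimate.

```python
# Back-of-the-envelope sketch: how a lower real-world capacity factor
# would scale the headline emissions figure from the report.
FULL_CAPACITY_MT = 129.0  # modeled annual CO2, million metric tons (from the report)
TRIM_FRACTION = 1 / 3     # hypothetical reduction if plants run below full capacity

trimmed_mt = FULL_CAPACITY_MT * (1 - TRIM_FRACTION)
print(f"Trimmed estimate: {trimmed_mt:.0f} million tons/year")
# prints "Trimmed estimate: 86 million tons/year"
```

Even under that optimistic assumption, the trimmed total would still exceed the annual emissions of many entire countries.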
Industry observers note that while AI can drive efficiencies in other sectors, its own energy footprint is growing faster than many sustainability initiatives can offset. The findings arrive amid broader scrutiny of tech firms’ climate commitments and could prompt regulators, investors and consumers to question the viability of a data‑center model that leans heavily on fossil fuels.
For now, the path to greener AI remains uncertain. Some experts argue that the sector could pivot to renewable energy sources or improve cooling efficiency, but the current trajectory points to a substantial increase in emissions unless policy or market forces intervene.