AI’s Unquenchable Thirst for Water Could Rival That of Some Nations | Unpublished
Source Feed: Walrus
Author: Christopher Pollon
Publication Date: October 1, 2025 - 06:29

AI’s Unquenchable Thirst for Water Could Rival That of Some Nations


About halfway through Microsoft vice chair and president Brad Smith’s keynote address to Web Summit Vancouver in May, as he expounded on how AI might benefit humanity, things took an abrupt turn. Responsible AI, he said, “means, for us, building data centres . . . that are powered by carbon free energy, like the renewal of Three Mile Island . . .”

The four iconic cylindrical towers of Pennsylvania’s Three Mile Island nuclear plant, the site of America’s worst nuclear accident, appeared on a giant screen above the standing-room-only crowd. Most of the tech enthusiasts in the audience looked too young to remember the partial meltdown and ensuing panic in 1979—which was a major contributor to the decline in nuclear construction in the 1980s and ’90s.

What had been up to that point a slick presentation about Microsoft’s vision for an AI-enabled future, a technical upheaval Smith likened to the harnessing of electricity, unexpectedly became a reckoning with AI’s voracious appetite for that very power.

Smith had little choice but to acknowledge it. The computational power required by large-scale AI models demands staggering amounts of energy. This year alone, his company will invest $80 billion (US) to roll out AI and data centre infrastructure across North America and forty countries around the world. Microsoft is not alone: Apple will invest $500 billion (US) in AI and other infrastructure over the next four years—almost double what it cost to mount the entire Apollo space program; and Alphabet, Google’s parent company, has announced it will spend $75 billion (US) on AI infrastructure this year.

The electricity to power all of this could account for up to 12 percent of all American energy consumption by 2028. Microsoft is investing in a mothballed nuclear plant at the site of a past disaster, while Meta is helping another nuclear power plant remain operational until 2047. Other tech giants are investing in new modular reactors, because the build-out of their AI investments will demand a reordering of energy systems in North America and beyond.

Given the scope of the investments being made now, what will the environmental impacts of AI be moving forward? It’s a question that’s hard to answer because the biggest tech companies are not sharing enough data for us to collectively plan ahead. But one thing is clear: when it comes to clean energy and fresh water, it will demand hard choices between the needs of humanity and those of machines.

Every time you hit “Enter” on a search with ChatGPT, the most popular AI search service on earth, with around 700 million weekly users, your request for words or images is routed to one of the world’s roughly 7,000 data centres (up from 3,600 in 2015, according to Bloomberg), including about 3,000 in the US alone. Once there, pre-trained AI models interpret and, through a process called inference, create a response.

An AI data centre is home to all the computer hardware required to train AI models and produce the search results—including servers employing graphics processing units, which require enormous 24/7/365 flows of electricity to operate and, by extension, fresh water to cool.

It has been widely reported that a single request to ChatGPT requires nearly ten times more energy than a Google search. The only problem with this statistic, says Sasha Luccioni, AI and climate lead at Hugging Face, the world’s biggest open-source AI platform, which advocates for greater democratization of the technology, is that it is not based on reality. It’s a poorly executed guesstimate, she says, based on an off-the-cuff remark by Alphabet chairman John Hennessy during a 2023 interview with Reuters. The electricity usage referenced for the single Google search comparison, she says, was taken from a 2009 blog post by the company.

“So we don’t have any numbers, and people pick what they can and then use it out of context,” says Luccioni, who co-authored a recent study looking at what she calls “urban legends” around AI resource use. Regarding the estimates about water usage, “none of it is really anchored in actual information,” she says.

Once your AI query reaches the data centre, what happens there becomes a closely held proprietary secret of the owners of the so-called “closed models” most of us use: OpenAI’s ChatGPT, Google’s Gemini, Anthropic’s Claude, and more. “These companies face few incentives to release this information, and so far, they have not,” reads a May 2025 MIT Technology Review investigation that attempted to quantify the energy use of routine AI requests but instead found this information lives in a black box.

“While many model developers release estimates of the power consumption and carbon emissions from the final training runs for their latest models, there is comparatively little transparency into the impact of model development, hardware manufacturing, and total water usage throughout,” reads another study co-led by the Allen Institute for AI.

Even the number of data centres set up for AI inferencing is unknown, reported MIT Technology Review, because information on such facilities is now guarded tightly by companies.

What is clear, however, is that as AI becomes more integrated into all aspects of our work, health, and social systems, its energy and water footprint will only grow. Without transparency about the energy and water footprint of AI, says Luccioni, it’s impossible to manage and plan for impacts and prioritize the use of critical resources, like the capacity of our electrical grids, drinking water supply amid drought, and collective responses to climate change.

Luccioni describes a December 2024 conference hosted by the International Energy Agency, which brought big tech companies into the same room as energy grid operators from across the world. To do their jobs, the latter, Luccioni says, need to know what electrical demand will be. “Data centres will use energy 24/7,” she says, but electricity providers “don’t get any information, so they don’t know how many data centres are going up.” It’s kind of just forced upon them. They can’t plan ahead. “And so they were panicking a little, being like, ‘We need numbers, we need at least some projections, we need to know where we’re going. We need to know how we have to build out our capacity, because otherwise we can’t respond to demand.’”

Why all this information is so critical was underscored early this summer, when a heat wave struck much of eastern North America, driving humidex temperatures into the mid-forties. Electricity providers struggled to keep up with demand from air conditioning, which, during bouts of increasingly common extreme summer heat, keeps many people alive. As utilities warned of impending rolling blackouts, tech companies were racing to plan a mass expansion of energy-sucking data centres.

According to Alex Hanna, co-author of The AI Con: How to Fight Big Tech’s Hype and Create the Future We Want, and a former member of Google’s Ethical AI team whose work included flagging environmental impacts, the rise of AI is driving the creation of a massive, resource-hungry computing infrastructure, perhaps the largest to date. “There needs to be a scaling up of data centres that support the training of large language models and large language model inference,” she says. “This infrastructure is computationally intensive, and [thus] more energy intensive.”

The source of all that electricity depends on where you live or, more exactly, where the data centre that handles your interaction operates. Coal currently accounts for about 30 percent—the largest source—of electricity for data centres globally, according to a 2025 report by the IEA. Renewables, including wind, solar, and hydro, account for about 27 percent, followed by natural gas and nuclear. (Natural gas accounts for over 40 percent of all US data-centre electricity.)

“The United States and China are by far the largest datacenter markets today,” reads the IEA report. “In both countries, most of the electricity consumed by datacenters is produced from fossil fuels, which also meet most of the increase to 2030.”

Concerns about power insecurity for data centres coming online have spawned the resurgence in nuclear power in the US—Microsoft has agreed to purchase twenty years of Three Mile Island nuclear power once the site comes back online in 2028—while demand is delaying the retirement of multiple emissions-spewing coal-powered plants.

In addition to Microsoft’s ambitions, companies like Meta, Amazon, and Google have joined a pledge to triple the world’s nuclear capacity by 2050. But new nuclear capacity will take years to come online, costing untold billions of dollars and producing streams of toxic waste that the world has not yet figured out what to do with.

In the shorter term, because data centres must have 24/7 access to electricity, and brownouts from overtaxed utilities and grids are not an option, data centres typically build their own backup power as a fail-safe measure—often employing natural gas or diesel generators.

Elon Musk’s xAI shone a spotlight on the Wild West of backup data-centre power systems about a year ago, when it installed dozens of portable methane gas generators at a big data centre in Memphis. Up to thirty-five generators were on site without a permit—until members of a poor, predominantly Black community downwind rose up in response to the emissions.

In June, an intent-to-sue notice was sent to xAI from the National Association for the Advancement of Colored People, or NAACP, a civil rights group, for failing to get necessary permits or pollution controls for the turbines. However, the closure of the Environmental Protection Agency’s scientific research arm that analyzes environmental hazards could make it difficult for the case to proceed. By August, Time reported Musk was planning an even bigger data centre just a few miles away—with sixty-six more natural gas turbines already delivered to the site.

AI is also a glutton for groundwater and water from rivers and lakes. And while at least some energy and carbon emissions information is released by tech companies regarding the training of their AI models, even if it isn’t entirely reliable, far less is known about water withdrawals and consumption for data centres.

That’s why Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside, co-authored a study that, for the first time, estimated the water footprint from running artificial intelligence queries. The study showed that, in 2023, using GPT-3 to produce ten to fifty medium-size responses, of roughly 150 to 300 words each, consumed half a litre of water.

If that sounds unspectacular—it works out to roughly 10 to 50 millilitres of water per response—consider this: Ren cites their research estimating that by 2027, water withdrawal alone from global AI demand could be six times the total annual water withdrawal of Denmark, or half the UK’s.

Google’s 2025 Environmental Report shows the company’s overall water consumption increased 28 percent just from 2023 to 2024, including in Henderson, Nevada, near water-parched Las Vegas, where more than 200 million gallons of water were consumed last year. (US data centres are expected to follow similar general water consumption trajectories, consuming two to four times more water in 2028 than in 2023.)

Most AI data centres consume fresh water, both directly and indirectly, through evaporation. Cooling towers, one of the ways thermoelectric power plants are cooled, convert water into vapour that is released into the atmosphere. The computer hardware in data centres also generates huge amounts of heat, and its cooling systems are commonly connected to cooling towers that consume water the same way.

“The [data centre] cooling tower is an open loop,” explained Ren in a UC Riverside press release, “and that’s where the water will evaporate and remove the heat from the data center into the environment.”

AI’s water footprint also includes the ultra-pure water required to manufacture AI chips and servers, which often gets polluted in the process, with limited recycling afterward, depending on the manufacturing plant.

This “secret water footprint,” as the study calls it, is playing out amid “severe freshwater scarcity crisis, worsened extended droughts, and quickly aging public water infrastructure.” If AI’s “escalating” water footprint is not addressed, the authors warn of impending “social conflicts as freshwater resources suitable for human use are extremely limited and unevenly distributed.”

To make AI less “thirsty,” a phrase they use in the study title, Ren and his collaborators recommend that water consumption tracking and reporting happen across the AI value chain—in everything from chip manufacturing to on-site data-centre cooling to so-called scope-two water consumption: the water used to generate the electricity itself, with fossil fuel combustion being the most common process in countries including the US and China.

On the latter, Meta has started to include scope-two water consumption in its energy reports. “Regulations are useful but probably won’t be realistic in the shorter term,” Ren says of the US. “The industry leaders such as Meta and Apple (which conducted a water footprint analysis in March 2025) can set examples for water footprint reporting.”

As AI expands, Big Tech has become more secretive in recent years—about power usage, water draw, even the data fuelling their models. That, says Luccioni, is a problem we can’t afford to ignore. “I feel like it’s been a race to the bottom since ChatGPT came out,” she says. “Big Tech companies have released less and less information about the models, because it’s now considered proprietary.”

Canada has attempted, and failed up to this point, to create federal legislation governing its AI industry, which includes about 240 data centres—with Alberta as a current hotspot. The Artificial Intelligence and Data Act was tabled in Parliament in June 2022. It included a regulatory framework requiring developers and other players to comply with rules for risk assessment and mitigation, record keeping and disclosure of key system information, and fines and criminal charges for noncompliance. The legislation did not become law as Parliament was prorogued early this year.

South of the border, as this story was being written, a proposed US federal provision that would have prevented state governments from enacting any legislation restricting AI companies for ten years was narrowly defeated.

The one bright side to the current US federal aversion to regulating AI, Luccioni hopes, is that it’s a wake-up call to the rest of the world, particularly Europe. The European Union’s AI Act—made into law in August 2024—includes new energy consumption and transparency regulations that will apply to companies developing or using AI technologies. Under the act, for example, a company developing a general-purpose AI model must create and maintain a breakdown of the energy consumption of the model.

The same approach is anathema to the current US federal leadership—in a country that has near-complete dominance of the sector and is home to all of the top ten AI companies. “It’s pretty bleak geopolitically, but maybe it’s going to be a kind of kick in the butt that we need in order to actually make regulation in the EU, Canada, and Asia,” says Luccioni.

The post AI’s Unquenchable Thirst for Water Could Rival That of Some Nations first appeared on The Walrus.
