Data centers don't harm water access at all, anywhere
A lot of journalism on AI's water impacts is misleading
There has been a lot of concern about the environmental impacts of data centers on local communities. In the last year, you may have seen headlines like these:
Surely, this means data centers are a significant water issue for local communities?
To see if this is real, I’ll focus on a specific claim: data centers’ water demands raise the household costs of water where they are built. I won’t focus on commercial or industrial water prices1. I also won’t focus on electric costs, which sometimes do seem to rise, or pollution, which in specific cases seems pretty bad.
After investigating this a lot, I’ve come to the conclusion that data centers have not raised household water bills at all, anywhere. They are consuming a ridiculously tiny amount of the nation’s water. The evidence is surprisingly clear and simple, the whole story of how America deals with water is ridiculously interesting, and each headline above is wrong for simple, obvious reasons that I’ll explain. First I’ll go over the situation with AI and water, and then go into why these headlines are either drastically misleading or (in the case of the New York Times) a pretty straightforward lie. Stories like this are everywhere: a headline with a framing that makes it seem like AI is causing a water shortage, but when you look closer, the shortage always turns out to be caused by something else.
No city or town governments have reported that any residential increases in water prices have been caused by data centers. Obviously government reports may not tell the whole story. But no matter where I look for plausible places data centers would have raised residential water prices, I cannot find anything. Nowhere in the articles I screenshotted above is any evidence provided at all, either.
Assumptions
First, I’m going to assume the main reason people are worried about AI’s water use is its consumptive use of freshwater. “Consumptive use” = the water it withdraws from a local ecosystem and causes to leave that ecosystem, via evaporation. Water simply moving through the data center and being reused doesn’t count as “consumed” until it evaporates and creates demand to withdraw more water from the local system.
Some assumptions about numbers I’m making here:
The total consumptive use of freshwater in America every day is ~132 billion gallons.
The total consumptive use of freshwater from data centers across all of America, including both onsite and offsite (nearby power plant) water use, was 200-250 million gallons per day in 2023.2 This is 0.15-0.19% of America’s daily water consumption. We don’t have more updated numbers.
The total consumptive freshwater use for the average American lifestyle is ~420 gallons per day.3
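To make these proportions concrete, here’s a quick back-of-the-envelope sketch in Python using only the figures in the list above (nothing here is new data, just the stated assumptions):

```python
# Rough check of the proportions implied by the assumptions above (gallons per day).
US_CONSUMPTIVE_USE_GPD = 132e9        # total US consumptive freshwater use
DATA_CENTER_USE_GPD = (200e6, 250e6)  # data centers, onsite + offsite, 2023 range

for dc in DATA_CENTER_USE_GPD:
    print(f"{dc / 1e6:.0f}M gal/day is {dc / US_CONSUMPTIVE_USE_GPD:.2%} of US consumptive use")
# Prints roughly 0.15% and 0.19%, the range quoted above.
```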
I’ve had to really dig for a lot of the water statistics I’ve found here. There’s a good chance some of them are wrong. If you think they are, and have good sources with updated numbers, please let me know. I’ve tried my best to piece the picture together, but it remains pretty unclear.
Likely places data centers would have increased costs
Here are a few places I looked:
The county with the most data centers in the country
is Loudoun County, Virginia. It’s the home of “data center alley”: the largest concentration of data centers in the world. This link4 has the household water rates of every county in Virginia as of 2024. Loudoun’s water costs are pretty low by Virginia’s standards. It’s currently expecting a slight rate increase, but will still keep it low by state standards.
The utility lists the causes of the rate increase:
A key element of providing clean drinking water and protecting the environment is ensuring that Loudoun Water’s infrastructure is maintained and in good working condition. This requires a significant capital investment every year. The proposed rate increases are primarily driven by:
Significant capital program (system replacement and expansion)
Actual cost escalation (ex., power and chemicals) is considerably higher than prior forecasting
Recent rate increases have been well below the cost increases mentioned above (rate increase for 2021, 2022 and 2023 were 3% annually)
Funding the repair and replacement reserve
Increases in purchased water, purchased sewer services and other operating costs.
It seems like a large part of the cost increase is coming from an infrastructure upgrade, but the county does also seem to be using more water. Is this because of the data centers, and is this causing part of the rise in household water costs?
Water for commercial vs. residential use is priced differently to keep the markets separate. So while data centers could have indirectly contributed to household costs, if the system wasn’t able to pump enough water and household rates went up, the scarcity on the household side would mainly have been caused by other household water demand rising. From 2020-2025, Loudoun County’s population increased by 8%.
Loudoun County has a much more detailed report on rate increases. It lists the following reasons the rates have gone up:
Rising wholesale water purchase costs from nearby Fairfax County (+42% since 2021). Those increases are due to Fairfax Water’s own operating and capital expenses (PFAS treatment upgrades, chemical costs, inflation).
Rising wastewater treatment charges from DC Water (+36% since 2021). Driven by DC Water’s systemwide costs, not Loudoun’s data centers.
Spikes in power costs (+50%) and chemical costs (+40%). These are costs of running Loudoun’s plants. They rise with inflation, not with who’s buying the water.
Ongoing repair and replacement of aging infrastructure. This is the closest link to growth. Loudoun expands its system to meet all new demand: homes, businesses, and data centers. But the study is clear: growth-related expansion is funded through availability charges on new connections, not by existing household customers (“growth pays for growth”).
Capital improvements.
None of those listed drivers are attributed to data centers specifically. Data centers do appear in the report (as big reclaimed water users and as non-residential customers) but their costs are largely handled in separate rate categories and availability charges.
So this first one doesn’t really look like data centers made water costs go up. It makes sense. “Data center alley” has existed for a while, so the county’s already thought a lot about how to govern them.
The place where data centers use the highest percentage of local water
is The Dalles, Oregon, where 29% of all city water goes to a local Google data center. Water costs stayed flat until 2025, then rose a bit to fund a general upgrade to the water system. Some assumed this upgrade was purely to support the data center, but the city government responded to the question of whether the upgrade was motivated by the data center with this:
No. The replacement of the water treatment plant and water transmission lines, the two largest and most expensive projects, are needed because the existing facilities are at the end of their useful life. It is now more expensive to rehabilitate and expand these facilities than it is to replace them. The water treatment plant is 75 years old and has experienced significant deterioration of its concrete structures. The water transmission pipelines, which are about 80 years old and made of unlined steel, are deteriorated and leaking. These projects would be needed even if the City did not have Google as a water customer.
The city used tax revenue from Google to fund the majority of the upgrade. It estimated that without the revenue from the data center, the cost of the upgrade would have raised local water bills by 23%. Instead, households only paid 7.3% more for the upgrade. So in the place with the highest percentage of water going to data centers, the data center slightly lowered water bills.
The county with the most water stress where data centers have been built
is Maricopa County in Arizona (home to Phoenix). The county is in extremely high baseline water stress (one of the highest in America), and has major data center builds (Google in Mesa; Microsoft in Goodyear).
Circle of Blue, a nonprofit research organization that seems generally trusted, estimates that data centers in Maricopa County will use 905 million gallons of water in 2025.
For context, Maricopa County golf courses use 29 billion gallons of water each year.
In total, the county uses 2.13 billion gallons of water every day, or 777 billion gallons every year. Data centers make up 0.12% of the county’s water use. Golf courses make up 3.8%.
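A quick sketch of where those percentages come from, using only the county figures cited above:

```python
# Maricopa County proportions (all figures in gallons, from the estimates cited above).
county_use_per_year = 2.13e9 * 365   # ~777 billion gallons/year
data_centers_2025 = 905e6            # Circle of Blue estimate for 2025
golf_courses = 29e9                  # annual golf course use

print(f"Data centers: {data_centers_2025 / county_use_per_year:.2%} of county water use")  # ~0.12%
print(f"Golf courses: {golf_courses / county_use_per_year:.2%} of county water use")       # ~3.7-3.8%
```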
News organizations list inflation, infrastructure improvements, and the continuing drought as the main reasons for rising water costs there. Technically, if the drought is responsible for higher water costs, then data centers are responsible for some proportion of that, since they’re competing for the scarce resource of water. But because commercial and residential rates are different, data centers should only be raising the water costs of other businesses in the area.
Commercial buildings pay significantly higher rates for water in Maricopa County (and basically everywhere). Politicians in general know that voters get really, really mad when costs rise, and so they want the markets for commercial vs. household water to be separate.
Let’s forget the commercial/residential split and assume all water use in the county is equally responsible for the cost increases, and there are no other causes of the cost increases at all (ignoring inflation and infrastructure projects).
It looks like household water bills there will rise by an additional ~$180 per year. If data centers are as responsible as their proportion of the county’s water use, they would be responsible for about 20 cents of that increase over the year, or roughly 1.7 cents per month. This is the absolute worst increase that’s possible, assuming no rising water costs are due to infrastructure maintenance or upgrades, and the household/commercial split doesn’t exist.
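Here’s that worst-case arithmetic spelled out. This is a deliberately pessimistic sketch: it pretends the entire ~$180 increase is driven purely by water demand and splits the blame by share of use:

```python
# Worst-case attribution: assign the entire bill increase proportionally to all water users.
annual_bill_increase = 180.0                  # assumed yearly rise in a household water bill, $
data_center_share = 905e6 / (2.13e9 * 365)    # ~0.12% of county water use (from above)

dc_portion = annual_bill_increase * data_center_share
print(f"${dc_portion:.2f} per household per year")      # ~$0.21
print(f"{dc_portion / 12 * 100:.1f} cents per month")   # ~1.7 cents
```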
Taxes on data centers raised $863 million each year for state and local governments in Arizona, so for every $0.20 in costs all data centers impose on county residents each year, they give back $115 to state and local governments in tax revenue. On net, this definitely doesn’t look like it’s sapping resources from the local community.
This is the most extreme example I can come up with, and it still seems to round to zero. I will concede that 20 cents per year for one of the most lucrative sources of tax revenue is technically a “nonzero” increase in household water costs. The numbers here are uncertain enough that this could be a place where data centers are impacting local water access, but with what we have right now, it seems to round to zero.
The place with the highest concentration of very new data centers
… is Loudoun County again. There’s been a common framing that data centers are being built in poorer communities that don’t have the resources to resist them. This seems to be true in some places (xAI’s Colossus especially seems to be causing air pollution near poor communities in Memphis, but also isn’t affecting local water costs), but Loudoun County is the richest county per capita in the United States. I think that if the residents had significant objections to data centers being built nearby, they have the resources to coordinate to stop them. The county government has an interesting FAQ on how they manage data centers.
So why aren’t household water costs rising anywhere?
Data centers are not using that much water
Data centers in America had a consumptive water use of 200-250 million gallons of water per day in 2023 (including offsite water used in nearby power plants to generate the electricity). If we just look at the consumptive water use in the data centers themselves, it was more like 48 million gallons of water per day.
It’s surprisingly hard to find good estimates of the US’s total consumptive water use. This paper finds it’s around 132 billion gallons per day. This implies all data centers consumed about 0.15-0.19% of the total freshwater we consumed in 2023. 1/500th of our freshwater supply. Data centers themselves used just 0.04%. Obviously not nothing, but I think putting this number in context makes it seem way less extreme. I repeat this point a lot, but Americans spend half their waking lives online. Everything we do online interacts with and uses energy and water in data centers. It’s kind of a miracle that something we spend 50% of our time using only consumes 0.19% of our water.
In 2023 all data centers in America collectively consumed as much water as the lifestyles of the residents of Paterson, New Jersey. AI uses ~15% of energy in data centers globally. It seems unlikely AI in America specifically is even half of the water footprint of Paterson. If we just include the water used in data centers themselves, this drops to the water footprint of the population of Albany, New York.
Water economics
America is good at water economics. Our water management has a lot of ways to keep rates low for consumers. This (and AI’s very low total use of water) is the main reason it hasn’t affected water prices at all.
Household and commercial water prices are different everywhere to keep the markets separate so commercial buildings aren’t competing with homeowners. Data centers only compete with other businesses for water, like any other industry.
In low water scarcity areas, water isn’t zero sum. More people buying water doesn’t lead to higher prices; it gives the utility more money to spend on drawing more water and improving infrastructure. It’s the same reason grocery prices don’t go up when more people move to a town. More people shop at the grocery store, which allows the grocery store to invest more in getting food, and they make a profit they can use to upgrade other services, so on net more people buying from a store often makes food prices fall, not rise. Studies have found that utilities pumping more water, on average, causes prices to fall, not rise.
The only time water costs rise in low water stress areas when a large new consumer arrives is when the consumer demands so much water that utilities are forced to do major rapid upgrades to their systems, and the consumer doesn’t pay for those upgrades. In every example I can find of data centers requiring water system upgrades, the companies that own the data centers are the main source of revenue for the system upgrade.
The main water issue in American small towns isn’t the supply of water, it’s aging water infrastructure that doesn’t serve a large or rich enough tax base to get the money to upgrade. Old infrastructure makes water more expensive. It can also be dangerous (lead pipes etc.). Small town water costs are often higher, not lower, than cities, due to economies of scale. This wouldn’t happen if water costs simply rose when more water is used. Data centers moving into small towns often provide utilities with enough revenue to upgrade their old systems and make water more, not less, accessible for everyone else.
In high water scarcity areas, city and state leaders have already thought a lot about water management. They can regulate data centers the same ways they regulate any other industries. Here water is more zero sum, but data centers just end up raising the cost of water for other private businesses, not for homes. Data centers are subject to the economics of water in high scarcity areas, and often rely more on air cooling rather than water cooling because the ratio of electric costs to water costs is lower.
This seems fine if we think of data centers as any other industry. Lots of industries in America use water. AI is using a tiny fraction compared to most, and generating way, way more revenue per gallon of water consumed than most. Where water is scarce, AI data centers should be able to bid against other commercial and industrial businesses for it.
In general, if there’s a public resource like water, it’s considered the job of the utility and government to set rates to reflect its scarcity. Blaming a private business like a data center for using too much water seems kind of like blaming private customers for buying too much food from a grocery store. It’s the grocery store’s responsibility to set prices to reflect the relative scarcity of and demand for different products. If people are buying too much food from the store and there’s not enough money to restock, that’s the fault of the store, not the individuals. Private businesses shouldn’t be expected to monitor the exact state of local water to decide how much is ethical to buy from the utility. It’s the utility’s job to limit demand by setting prices higher if they actually think the company is going to harm the local water system. If utilities set prices high enough, data centers adjust by switching to different types of cooling systems that use less or no water. The market is (like in most places) the way that data centers receive information about the relative scarcity of water. In most conversations about AI and water, the responsibility for water management is oddly shifted to the private company in a way we don’t do for any other industry.
Politicians especially have strong motivations to keep household water prices low. Voters get mad when utility costs rise.
There are many cases of data centers being built, providing lots of tax revenue for the town and water utility, and the locals benefiting from improved water systems. Critics often read this as “buying off” local communities, but there are many instances where these water upgrades just would not have happened otherwise. It’s hard not to see it as a net improvement for the community. If you believe it’s possible for large companies using water to just make reasonable deals with local governments to mutually benefit, these all look like positive-sum trades for everyone involved.
Here are specific examples:
The Dalles, Oregon - Fees paid by Google fund essential upgrades to the water system.
Council Bluffs, Iowa - Google pays for expanded water treatment plant.
Quincy, Washington - Quincy and Microsoft built the Quincy Water Reuse Utility (QWRU) to recycle cooling water, reducing reliance on local potable groundwater; Microsoft contributed major funding (about $31 million) and guaranteed project financing via loans/bonds repaid through rates. These improvements increase regional water resilience beyond the data center itself.
Goodyear, Arizona - In siting its data centers, Microsoft agreed to invest roughly $40–42 million to expand the city’s wastewater capacity—utility infrastructure the city highlights as part of the development agreement and that increases system capacity for the community.
Umatilla/Hermiston, Oregon - Working with local leaders, AWS helped stand up pipelines and practices to reuse data-center cooling water for agriculture, returning up to ~96% of cooling water to local farmers at no charge. That 96% is from AWS itself, not sure if it’s correct.
I could go on like this for a while. Maybe you think every one of these is some trick by big tech to buy off communities, but all I’m seeing here is an improvement in local water systems without any examples of equivalent harm elsewhere.
In general, the US has a lot of freshwater.
America has among the cheapest water costs of any nation in the world.
Because we’re also the richest nation in the world, our water costs are incredibly low as a percentage of per capita income.
AI as a normal industry
If a steel plant or college or amusement park were built in a small town, it would be normal for it to use a lot of the town’s water. We wouldn’t be shocked if the main industry in the town were using a sizable amount of the water there. If it provided a lot of tax revenue for the town and was not otherwise harming it, I think we would see this as a positive, and wouldn’t talk in an alarmed way about the specific percentages of the town’s water the industry was using. We should think of AI like we do any other industry. A data center drops into a town, draws water in the same way a factory or college would, the local system adapts to it quickly, and residents benefit from the tax revenue. The industry could provide enough water payments or tax revenue to improve the town’s water supplies in the first place.
In Texas, data centers paid an estimated $3.21 billion in taxes to state and local governments in 2024. There is no exact figure on total data center water use in Texas in 2024, but there are forecasts that they will use ~50 billion gallons of water in 2025. So excluding the price data centers pay for the water and energy they use, data centers are paying about $0.064 in tax revenue per gallon of water they use. The cost of water in Texas varies between $0.005 and $0.015 per gallon. Even if data centers caused water prices to quadruple, they would still ultimately be paying more back to local communities than the communities lost in additional water prices. This pattern holds everywhere data centers are built. They add huge amounts of tax revenue to local and state economies that seem to more than balance out any negative water externalities.
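As a sanity check on that per-gallon figure, here’s the arithmetic using the cited estimates:

```python
# Texas data centers: tax revenue per gallon of water, from the cited estimates.
tax_revenue_2024 = 3.21e9     # state + local taxes paid by data centers, $
water_use_forecast = 50e9     # forecast data center water use, gallons/year

print(f"${tax_revenue_2024 / water_use_forecast:.3f} of tax revenue per gallon")  # ~$0.064
# For comparison, Texas retail water prices run roughly $0.005 to $0.015 per gallon.
```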
Many people have a strong aversion to using physical resources on digital systems, and think of any water spent on digital stuff as a waste. I think if we just looked past the question of whether something is physical or digital and otherwise treated AI as a normal industry, it actually uses the least water per dollar of revenue it generates.
Here are some other industries that use a lot of water, and their total revenue vs. how much water they use. I’ll measure this in dollars of revenue earned per thousand gallons of water consumed. Here “onsite” means the water used in the data center itself, and “offsite” means the water used to generate the electricity at nearby power plants.5
Agriculture - $19.35/1000 gal
Electric power – $312.35/1000 gal
Data centers (onsite + offsite water) - $1,579/1000 gal
Bottled water - $1,703/1000 gal
Data centers (onsite water only) - $20,722/1000 gal
Why is journalism on AI water impacts so consistently bad?
Many negative news stories of AI’s water use are wildly misleading in very simple ways
I was motivated to write this after noticing that many long ominous articles on AI impacts on water never actually gave any evidence that local household water costs had risen anywhere. They were making a few other misleading choices as well.
Every popular article about how AI’s water use is bad for the environment in the last year has had a wildly misleading framing.
The Economic Times: Texans are showering less because of AI
Take this one from the Economic Times, it circulated a lot:
The article clarifies that this is 463 million gallons of water spread over 2 years, or about 640,000 gallons of water per day. Texas consumes 13 billion gallons of water per day. So all data centers added 0.005% to Texas’s water demands.
0.005% of Texas’s population is 1,600. Imagine a headline that said “1,600 people moved to Texas. Now, residents are being asked to take shorter showers.”
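The arithmetic behind that comparison, as a quick sketch (the ~31 million Texas population figure here is my own rough assumption):

```python
# Putting the Economic Times figure in proportion.
dc_gal_per_day = 463e6 / (2 * 365)   # 463M gallons spread over two years
texas_gal_per_day = 13e9             # statewide daily consumption

share = dc_gal_per_day / texas_gal_per_day
print(f"{share:.3%} of Texas's daily water use")                  # ~0.005%
print(f"Same share of ~31M Texans: ~{share * 31e6:,.0f} people")  # ~1,500
```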
Many iterations of the same article appeared:
San Antonio data centers guzzled 463 million gallons of water as area faced drought
Data Centers in Texas Used 463 Million Gallons of Water, Residents Told to Take Shorter Showers
One article corrected for the much larger uptick of data centers in 2025:
50 billion gallons per year is a lot more! That’s more like 1.1% of Texas’s water use. Nowhere in this article does it share that proportion. It seems pretty normal for a state as large as Texas to have a 1% fluctuation in its water demand.
The New York Times: Data centers are guzzling up water and preventing home building
From the New York Times:
The subtitle says: “In the race to develop artificial intelligence, tech giants are building data centers that guzzle up water. That has led to problems for people who live nearby.”
Reading it, you would have to assume that the main data center in the story is guzzling up the local water in the way other data centers use water.
In the article, residents describe how their wells dried up because residue from the construction of the data center added sediment to the local water system. The data center had not been turned on yet. Water was not being used to cool the chips. This was a construction problem that could have happened with any large building. It had nothing to do with the data center draining the water to cool its chips. The data center was not even built to draw groundwater at all; it relies on the local municipal water system.
The residents were clearly wronged by Meta here and deserve compensation. But this is not an example of a data center’s water demand harming a local population. While the article itself is relatively clear on this, the subtitle says otherwise!
The rest of the article is also full of statistics that seem somewhat misleading when you look at them closely.
Water troubles similar to Newton County’s are also playing out in other data center hot spots, including Texas, Arizona, Louisiana and the United Arab Emirates. Around Phoenix, some homebuilders have paused construction because of droughts exacerbated by data centers.
The term “exacerbated” is doing a lot of work here. If there is a drought happening, and a data center is using literally any water, then in some very technical sense that data center is “exacerbating” the drought. But in no single one of these cases did data centers seem to actually raise the local cost of water at all. We already saw in Phoenix that data centers were only using 0.12% of the county water. It would be odd if that was what caused home builders to pause.
The article goes on with some ominous predictions about Georgia’s water use around the data center, but so far residents have not seen their water bills rise at all. We’re good at water economics! You wouldn’t know that at all from reading this article.
I think the main story being a construction issue, while the title associates it with some problem specific to data centers, is pretty similar to a news story reporting on loud sounds from the construction of a building that happens to be a bank, under the title “Many banks are known for their incredible noise pollution. Some residents found out the hard way.” That would leave you with an incorrect understanding of banks.
Contra the subtitle, data centers “guzzling up water” in the sense of “using the water for cooling” has not led to any problems, anywhere, for the people who live nearby. The subtitle is a lie.
CNET’s long very vague report on AI and water
This same story was later referenced by a long article on AI water use at CNET, here with a wildly misleading framing:
The developer, 1778 Rich Pike, is hoping to build a 34-building data center campus on 1,000 acres that spans Clifton and Covington townships, according to Ejk and local reports. That 1,000 acres includes two watersheds, the Lehigh River and the Roaring Brook, Ejk says, adding that the developer's attorney has said each building would have its own well to supply the water needed. "Everybody in Clifton is on a well, so the concern was the drain of their water aquifers, because if there's that kind of demand for 34 more wells, you're going to drain everybody's wells," Ejk says. "And then what do they do?"
Ejk, a retired school principal and former Clifton Township supervisor, says her top concerns regarding the data center campus include environmental factors, impacts on water quality or water depletion in the area, and negative effects on the residents who live there.
Her fears are in line with what others who live near data centers have reported experiencing. According to a New York Times article in July, after construction kicked off on a Meta data center in Social Circle, Georgia, neighbors said wells began to dry up, disrupting their water source.
There’s no mention anywhere in the article that the data center in Georgia was not using the well water for normal operations.
Bloomberg: AI is draining water from areas that need it most
Here’s a popular Bloomberg story from May. It shows this graphic:
Red dots indicate data centers built in areas with high or extremely high water stress. My first thought as someone who lives in Washington DC was “Sorry, what?”
Northern Virginia is a high water stress area?
I cannot find any information online about Northern Virginia being a high water stress area. It seems to be considered low to medium. Correct me if I’m wrong. Best I could do was this quote from the Financial Times:
Virginia has suffered several record breaking dry-spells in recent years, as well as a “high impact” drought in 2023, according to the US National Integrated Drought Information System. Much of the state, including the northern area where the four counties are located, is suffering from abnormally dry conditions, according to the US Drought Monitor. But following recent rain, the Virginia Department of Environmental Quality on Friday lifted drought advisories across much of the state, though drought warnings and watches are still in effect for some regions.
Back to the map. There were some numbers shared in a related article by one of the same authors. But readers were left without a sense of proportion of what percentage of our water all these data centers are using.
AI’s total consumptive water use is equal to the water consumption of the lifestyles of everyone in Paterson, New Jersey. This graphic is effectively spreading the water costs of the population of Paterson across the whole country, and drawing a lot of scary red dots.
Even the title chart can send the wrong message.
I think for a lot of people, stories about AI are their first time hearing about data centers. But the vast majority of data centers exist to support the internet in general, not AI.
Simply showing the number of data centers doesn’t show the impact of AI specifically, or how much power data centers are drawing. Power roughly correlates with water, because the more energy is used in data center computers, the more they need to be cooled, and the more water is needed to do that. Here’s a graph showing the power demand of all data centers, and how much of that demand AI makes up.
Obviously there’s been a big uptick in power draw since 2019, but AI is still a small fraction of total data center power draw. I think Goldman Sachs underestimated AI’s power draw here; experts think it’s more like ~15% of total power used in data centers. But it’s important to understand that the vast majority of that original scary red data center graph isn’t AI specifically.
AI is going to be a large part of the very large data center buildout that’s currently underway, but it’s important to understand that up until this point most of those data centers on the graph were just the buildout of the internet.
One more note, circling back again to Maricopa County.
The county is a gigantic city built in the middle of a desert. For as long as it’s existed, it’s been under high water stress. Everyone living there is aware of this. The entire region is (I say this approvingly) a monument to man’s arrogance.
The only reason anyone can live in Phoenix in the first place is that we have done lots of ridiculous massive projects to move huge amounts of water to the area from elsewhere.
This is an area where environmentalism and equity come apart. I’d like residents of Phoenix to have access to reliable water supplies, but I don’t think this is the most environmentalist move. I think the most environmentalist move would probably be to encourage people to leave the Phoenix area in the first place and live somewhere that doesn’t need to spend over twice as much energy as the national average pumping water. I have to bite the bullet here and say that between environmentalism and equity, I’d rather choose equity and not raise people’s water prices much, even though they’ve chosen to live in the middle of a desert.
It seems inconsistent to think that it’s wrong for environmentalist reasons to build data centers near Phoenix that increase the city’s water use by 0.1%, but it’s not wrong for Phoenix to exist in the first place. If it’s bad for the environment to build data centers in the area at all, Phoenix’s low water bills themselves seem definitionally bad for the environment too. I think you can be on team “Keep Phoenix’s water bills low, and build data centers there” or team “Neither the data centers nor Phoenix should be built there, we need to raise residents’ water bills to reflect this fact” but those are the only options. I’m on team build the data centers and help out the residents of Phoenix.
4 common misleading ways of reporting AI water usage statistics
Comparing AI to households without clarifying how small a part of our individual water footprint our households are
Many articles choose to report AI’s water use this way:
“AI is now using as much as (large number) of homes.”
Take this quote from Newsweek:
In 2025, data centers across the state are projected to use 49 billion gallons of water, enough to supply millions of households, primarily for cooling massive banks of servers that power generative AI and cloud computing.
That sounds bad! The water to supply millions of homes sounds like a significant chunk of the total water used in America.
The vast majority (~93%) of our individual total consumption of freshwater resources does not happen in our homes; it happens in the production of the food we eat. Experts seem to disagree on exactly what percentage of our freshwater consumption happens in our homes, but it’s pretty small. Most estimates seem to land around 1%. So if you just look at the tiny tiny part of our water footprint that we use in our homes, data centers use a lot of those tiny amounts. But if you look at the average American’s total consumptive water footprint of ~1600 L/day, 49 billion gallons per year is about 300,000 people’s worth of water. That’s about 1% of the population of Texas. The entire data center industry (both for AI and the internet) using as much water as 1% of its population just doesn’t seem as shocking.
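Here’s roughly how that people-equivalent falls out of the numbers (a sketch; the ~1600 L/day footprint is the figure used above):

```python
# Converting 49 billion gallons/year into people-equivalents of total water footprint.
LITERS_PER_GALLON = 3.785
dc_liters_per_day = 49e9 / 365 * LITERS_PER_GALLON   # ~508 million L/day
footprint_per_person = 1600                          # total consumptive footprint, L/day

people = dc_liters_per_day / footprint_per_person
print(f"~{people:,.0f} people's worth of water footprint")  # ~318,000, roughly 1% of Texas
```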
Referencing the “hidden, true water costs” that AI companies are not telling you, without sharing what those very easily accessible costs are
A move that I complained about in my last post is that a lot of articles will imply that AI companies are hiding the “true, real” water costs of data centers by only reporting the “onsite” water use (the water used by the data center) and not the “offsite” water use (the water used in nearby power plants to generate the electricity). Reporting both onsite and offsite water costs has become standard in reporting AI’s total water impact.
Many authors leave their readers hanging about what these “true costs” are. They’ll report a minuscule amount of water used in a data center, and it’s obvious to the reader that it’s too small to care about, but then the author will add “but the true cost is much higher” and leave the reader hanging, to infer that the true cost might matter.
We actually have a pretty simple way of estimating what the additional water cost of offsite generation is. Data centers on average use 0.48 L of water to cool their systems for every kWh of energy they use, and the power plants that provide data centers energy average 4.52 L/kWh. So to get a rough estimate:
If you know the onsite water used in the data center, multiply it by 10.4 to get the onsite + offsite water.
If you know the onsite energy used, multiply it by 5.00 L/kWh to get the onsite + offsite water used.
Obviously scaling up a number by a factor of 10 is a lot, but it often still isn’t very much in absolute terms. Going from 5 drops for a prompt to 50 drops of water is a lot relatively, but in absolute terms it’s a change from 0.00004% of your daily water footprint to 0.0004%. Journalists should make these magnitudes clear instead of leaving their readers hanging.
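Those two rules of thumb, written out. The 0.48 and 4.52 L/kWh figures are the averages quoted above; the 10.4x and 5.00 L/kWh values follow directly from them:

```python
# Rules of thumb for estimating total (onsite + offsite) water from the averages above.
ONSITE_L_PER_KWH = 0.48    # average cooling water used in the data center itself
OFFSITE_L_PER_KWH = 4.52   # average water used by the power plants supplying it

TOTAL_L_PER_KWH = ONSITE_L_PER_KWH + OFFSITE_L_PER_KWH   # 5.00 L/kWh
ONSITE_TO_TOTAL = TOTAL_L_PER_KWH / ONSITE_L_PER_KWH     # ~10.4x

def total_from_onsite_water(onsite_liters: float) -> float:
    """Estimate onsite + offsite water from onsite water alone."""
    return onsite_liters * ONSITE_TO_TOTAL

def total_from_energy(kwh: float) -> float:
    """Estimate onsite + offsite water from energy used in the data center."""
    return kwh * TOTAL_L_PER_KWH
```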
This talking point can be doubly deceptive if you only look at the proportion
Let’s say there are 2 data centers in a town (I’ll call them Poseidon and Enki) drawing from the same power source. The local town’s electricity costs 4 L of water per kWh to generate.
The Poseidon data center is pretty wasteful with its cooling water. It spends 2 L of water on cooling for every kWh it uses on computing, way above the national average of 0.48L/kWh. So if you add the onsite and offsite water usage, Poseidon uses 6 L of water per kWh.
The Enki data center finds a trick to be way more efficient with its cooling water. It drops its water use down to 0.1L/kWh. Well below the national average. So if you add its onsite and offsite water usage, it uses 4.1 L per kWh without using any more energy.
Obviously, the Enki data center is much better for the local water supply.
Both data centers are asked by the town to release a report on how much water they’re using. They both choose to only report on the water they’re actually using in the data center itself.
Suddenly, a local newspaper shares an expose: both data centers are secretly using more water than they reported, but Enki’s secret, real water use is 41x its reported water costs.
While Poseidon’s is only 3x its reported water costs.
Here, Enki looks much more dishonest than Poseidon. If readers only saw this proportion, they would probably be left thinking that Enki is much worse for the local water supply. But this is wrong! Enki’s much better. The reason the proportions are so different is that Enki’s managed to make its use of water so efficient compared to the nearby power plant.
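The whole hypothetical in a few lines (every number here is invented for the example):

```python
# The made-up Poseidon/Enki example, showing how the "true cost" multiplier misleads.
GRID_L_PER_KWH = 4.0   # water to generate the town's electricity

for name, onsite in [("Poseidon", 2.0), ("Enki", 0.1)]:
    total = onsite + GRID_L_PER_KWH
    print(f"{name}: reports {onsite} L/kWh, really {total} L/kWh "
          f"({total / onsite:.0f}x its reported figure)")
# Poseidon: 3x. Enki: 41x. Enki's multiplier looks worse, but it uses far less water.
```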
I think something like this often happens with data center water reporting.
When I wrote about a case of Google’s “secret, real water cost” actually not being very much water, a lot of people messaged me to say Google still looks really dishonest here, because the secret cost is 10x its stated water costs once you add the offsite costs. A way of reframing this is to say that Google’s made its data centers so water efficient that they’re now only using about 1/10th as much water onsite per kWh as the water required to generate that energy. This seems good! We should frame this as Google solidly optimizing its water use.
Take this quote from a recent article titled “Tech companies rarely reveal exactly how much water their data centers use, research shows”:
Sustainability reports offer a valuable glimpse into data center water use. But because the reports are voluntary, different companies report different statistics in ways that make them hard to combine or compare. Importantly, these disclosures do not consistently include the indirect water consumption from their electricity use, which the Lawrence Berkeley Lab estimated was 12 times greater than the direct use for cooling in 2023. Our estimates highlighting specific water consumption reports are all related to cooling.
The article should have mentioned that this means data centers have made their water use so efficient that basically the only water they’re using at all is in the nearby power plant, not in the data centers themselves. But framing it the original way makes it look like the AI labs are hiding a massive secret cost from local communities, which I guess is a more exciting story.
Vague gestures at data centers “straining local water systems” or “exacerbating drought” without clarifying what the actual harms are
If you use literally any water in any area with a drought, you’re in some sense “straining the local water system” and “exacerbating the drought.” Both of these tell us basically nothing meaningful about how bad a data center is for a local water system. If an article doesn’t come with any clarification at all about what the actual expected harms are, I would be extremely wary of this language. In basically every example I can find where it’s used, the data centers are adding minuscule amounts of water demand to the point that they’re probably not changing the behavior of any individuals or businesses in the area.
Simply listing very large numbers without any comparisons to similar industries and processes
This is the great singular sin of bad climate communication. The second you see it, you should assume it’s misleading. Simply reporting “millions of gallons of water” without context gives you no information. Generating the power our digital clocks draw uses millions of gallons of water, but digital clocks aren’t causing a water crisis. Whenever you see an article cite a huge amount of water with no comparisons at all to anything to give you a proper sense of proportion, ask a chatbot to contextualize the number for you.
The future
Obviously this could all change in the future. Data centers are growing rapidly. I don’t know how to forecast how much data centers will impact water use over the next few years. Experts have tried, but it’s all pretty murky. For now, it doesn’t seem like data centers are affecting households’ access to water.
Further reading
The Lawrence Berkeley National Laboratory’s report on data center energy and water use in 2024 is the most comprehensive document we have on AI and water right now.
Brian Potter’s recent piece on water and update on data center water use.
Hannah Ritchie has some recent great stuff on data centers and chatbots
Matt Yglesias’s piece on data centers and water
Friend of the blog SE Gyges has a great breakdown of the single most popular statistic about ChatGPT that’s also a lie: it uses a bottle of water per email.
I think here we should treat AI like any other private industry. Private companies who want to use water need to buy it from the utility, and they might need to compete with each other if water is scarce by paying a little more for it. If data centers are competing with households for water, that seems bad, but if data centers are competing with golf courses for water use, I’d really rather the water go to the data center. Most places have separate water rates for residential vs. commercial/industrial water use, and they do this to keep the water markets somewhat separate so competition between firms doesn’t end up affecting household water. Data centers should be allowed to compete with other private businesses.
This is very hard to correctly estimate. I agree with Brian Potter’s reasoning here.
From this paper. It’s very very hard to find good stats on this and they seem to disagree with each other. I’m using this as the middle of the range I found. Most of this water use is used in agriculture to grow our food.
Page 7-9
Will type up how I got these soon.