Using ChatGPT is not bad for the environment - a cheat sheet
The numbers clearly show that discouraging people from using chatbots is a pointless distraction for the climate movement
This post will be a cheat sheet for answering every environmental objection to using ChatGPT. I’ve broken it up so you can skip around and only read sections relevant or interesting to you. If you think I’m getting anything wrong I’d be excited to update this with the most accurate numbers. Please let me know in the comments or at AndyMasley@gmail.com.
Intro
The question this post is trying to answer is “Should I boycott ChatGPT or limit how much I use it for the sake of the climate?” and the answer is a resounding and conclusive “No.” The numbers are completely clear and final, and anyone telling you otherwise is deeply mistaken. All counter-points fail for simple obvious reasons.
You can use ChatGPT as much as you like without worrying that you’re doing any harm to the planet. Worrying about your personal use of ChatGPT is wasted time that you could spend on the serious problems of climate change instead.
This post is not about the broader climate impacts of AI beyond chatbots, or about whether AI is bad for other reasons (copyright, hallucinations, job loss, risks from advanced AI, etc.). I’m not especially “pro” or “anti” AI. I’m writing this because people who care about the climate and want to help the environment are getting distracted by a non-issue.
I’m not an authority on AI and energy use. I cite all my sources and claims and defer to what seems like the expert consensus where it exists. I’m a fan of linking citations with hypertext instead of footnotes, so my sources are all in the writing itself instead of at the bottom. I have a physics degree and taught physics for seven years, so I do know a lot about where and how energy is used.
We can divide concerns about ChatGPT’s environmental impact into two categories:
Personal use: How much ChatGPT increases your personal environmental footprint.
Global use: How much ChatGPT is harming the planet as a whole.
I’ll write a bunch of responses to the most common objections in each category.
Throughout this post I’ll assume the average ChatGPT query uses 0.3 Wh of energy, about the same as a Google search used in 2009. Here’s a summary of why 0.3 Wh is the most reasonable guess right now. It seems like image generators use ~1.22 Wh per prompt (with large error bars), so everything I say here also applies to AI images.
I’m collecting all my responses to critiques of this post here, and corrections here. If you think I’m leaving out any important climate costs of a prompt in this post, I’ve tried to cover what those costs could be here, and show that they don’t add much.
Contents
This post in a nutshell
Using AI emits the same tiny amounts of CO2 as every other normal thing we do online, and way less than most offline things we do. Even when you include “hidden costs” like training, the emissions from making hardware, energy used in cooling, and AI chips idling between prompts, the carbon cost of an average chatbot prompt adds up to less than 1/100,000th of the average American’s daily emissions. Water is similar. Everything we do uses a lot of water. Most electricity is generated using water, and most of the way AI “uses” water is actually just in generating the electricity it uses. The average American uses ~50,000 times as much water every day as the full cost of an AI prompt. The actual amount of water used per prompt in data centers themselves is vanishingly small. AI prompts have similarly small energy and water costs to things like internet searches and online videos and music streaming.
Some people think tiny parts of our emissions “add up” when a lot of people do them. They add up in an absolute sense, but they don’t add up to be a larger relative part of our overall emissions. If AI chatbots are just a 100,000th of your personal emissions, they are likely to be around a 100,000th of global emissions as well. We should mostly focus on systematic change over personal lifestyle changes, but if we do want to do personal lifestyle changes, we should prioritize cutting things that are actually significant parts of our personal emissions. That’s the only way we could reduce significant amounts of global emissions too.
The reason AI is rapidly using more energy is that AI is suddenly being used by more people, not that AI stands out as using a lot of energy per person using it. It’s like if the internet had been invented a second time and people were rapidly coming online.
The reason AI data centers use a lot of energy is that they are built to collect huge amounts of individually tiny computer tasks in a single physical place. This makes them more energy-efficient than other ways of doing the same things with computers. If we’re going to do things with computers, we should prefer that data centers manage a lot of it. Every time you interact with the internet, you’re using a data center in the same way you use any other computer. Globally, the average person uses the internet for 7 hours a day, but data centers only use 0.23% of the world’s energy. It’s a miracle of optimization that something we spend half our waking lives on can use less than a 200th of our energy. Computers in general have been ridiculously optimized to use as little energy as possible, so we should assume that the things we do on them will not be significant parts of our carbon footprints.
Data centers do put more strain on local grids than some other types of buildings, for the same reason a stadium puts more strain on a grid than a coffee shop: the stadium is serving way more people at once. Data centers are building-sized computers that tens of thousands of people are using at any one time. The reason they stand out is that they gather a large amount of aggregate energy demand into a tiny place, not that they’re using a lot of energy per user. While data centers are not environmentally perfect, many stories about their impacts on local grids are wildly overblown. In the equation (Total Energy) = (Energy per Prompt) x (Number of Prompts), energy per prompt is low, but the number of prompts in a data center is extremely high, so the total energy they use is high. This means that your personal use of AI is adding extremely tiny amounts of energy demand, and of all the things you can cut to reduce your emissions, it’s one of the very least promising.
Deciding that you’re going to stop using AI for the sake of the climate is like going around your home and randomly unscrewing a single LED bulb, or pausing your microwave a few seconds early to save the planet. It’s so small that it’s a meaningless distraction.
The vast majority of AI’s effects on the environment will come from how it’s used, not from what happens in data centers. Amazon and Google Maps both have big impacts on the climate. Amazon might help or hurt a lot, and Google Maps optimizes a lot of car trips, but also might encourage more driving. But no one in debates about Amazon or Google’s climate impact says “The most important issue is the energy costs of running this website in data centers.” That would be crazy, because the websites are tools that cause people’s behavior to change, which leads to much larger changes in the physical world. If you’re concerned about AI’s impacts on the climate, the main question should be how using AI can help or hurt the climate, not the (tiny) costs of running AI in the first place.
Personal use
A ChatGPT prompt uses too much energy/water
Energy
A ChatGPT prompt uses 0.3 Watt-hours (Wh). This is enough energy to:
Leave a single incandescent light bulb on for 18 seconds.
Leave a wireless router on for 3 minutes.
Play a gaming console for 6 seconds.
Run a vacuum cleaner for 1 second.
Run a microwave for 1 second.
Run a toaster for 0.8 seconds.
Brew coffee for 10 seconds.
You can look up how much 0.3 Wh costs in your area. In DC where I live it’s $0.000051 (a two-hundredth of a cent). Think about how much your energy bill would have to increase before you noticed “Oh I’m using more energy. I should really try to reduce it for the sake of the climate.” What multiple of $0.000051 would that happen at? That can tell you roughly how many ChatGPT searches it’s okay for you to do.
If you were running ChatGPT’s servers in your home, to raise your energy bill by 1 dollar, you would need to send 19,600 prompts. One prompt every single second for 5 hours.
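If you want to check this arithmetic yourself, here’s a quick sketch. The per-prompt energy is the 0.3 Wh assumption from above; the DC rate of roughly $0.17/kWh is back-calculated from the $0.000051 figure, and your local price will differ:

```python
# A sketch of the arithmetic above, using the post's assumptions:
# 0.3 Wh per prompt, and an electricity rate of ~$0.17 per kWh
# (back-calculated from the DC figure; rates vary by region).
WH_PER_PROMPT = 0.3
DOLLARS_PER_KWH = 0.17

cost_per_prompt = (WH_PER_PROMPT / 1000) * DOLLARS_PER_KWH
prompts_per_dollar = 1 / cost_per_prompt

print(f"cost per prompt:    ${cost_per_prompt:.6f}")
print(f"prompts per dollar: {prompts_per_dollar:,.0f}")
```

Swap in your own rate for `DOLLARS_PER_KWH` to get your personal number.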
Emissions
Of course, in conversations about the climate, we don’t actually care directly about the energy. We care about the emissions. We need to be careful to include every possible way prompting ChatGPT can cause emissions. These are:
✅ The energy cost of an AI chip generating an answer to your prompt
✅ The energy of an idling AI chip between prompts
✅ The energy cost of cooling the AI chips
✅ Other energy overhead in the data center
✅ The fact that data centers use energy that’s 48% more carbon intensive than average
✅ The emissions from training the model in the first place (dividing emissions from training by the number of prompts the model receives)
✅ The embodied carbon of the AI chip (the emissions that were caused by making the hardware)
✅ The energy used to transmit the information from the data center to your device.
When you add all these up, the full CO2 emissions caused by a ChatGPT prompt come out to 0.28 g of CO2. For context, this is the same amount of CO2 emitted by:
Driving a sedan at a consistent speed for 4 feet
Using a laptop for 1 minute. If you’re reading this on a laptop and spend 20 minutes reading the full post, you will have used as much energy as 20 ChatGPT prompts. ChatGPT could write this blog post using less energy than you use to read it!
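These comparisons can be reproduced with ballpark figures. The sedan emissions, laptop wattage, and grid intensity below are my own rough assumptions, not numbers from this post’s sources:

```python
# Ballpark reproduction of the comparisons above. Assumed values:
# 0.28 g CO2 per prompt (the post's all-in figure), a sedan emitting
# ~400 g CO2 per mile, a ~50 W laptop, and a ~400 g CO2/kWh grid.
G_PER_PROMPT = 0.28
SEDAN_G_PER_MILE = 400.0
LAPTOP_WATTS = 50.0
GRID_G_PER_KWH = 400.0

# Distance a sedan drives before emitting one prompt's worth of CO2:
sedan_feet = G_PER_PROMPT / SEDAN_G_PER_MILE * 5280

# CO2 from one minute of laptop use on this assumed grid:
laptop_g_per_minute = LAPTOP_WATTS / 1000 / 60 * GRID_G_PER_KWH

print(f"sedan distance: {sedan_feet:.1f} feet")
print(f"laptop, 1 min:  {laptop_g_per_minute:.2f} g CO2")
```

Under these assumptions the sedan covers a few feet and the laptop minute lands in the same ballpark as one prompt, matching the claims above.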
Because this is so low, encouraging people to stop using ChatGPT is basically never going to have any impact on their individual emissions. If individual emissions are what you’re worried about, ChatGPT is hopeless as a way of lowering them. It’s like seeing people who are spending too much money, and saying they should buy one fewer gum ball per month:

By being vegan, I have as much climate impact as not prompting ChatGPT 1,000,000 times each year (the water impact is even bigger). I don’t think I’m going to come close to prompting ChatGPT 1,000,000 times in my life, so each year I effectively stop more than a person’s entire lifetime of ChatGPT prompts with a single lifestyle change. If I choose not to take a flight to Europe, I save 10 million ChatGPT prompts. This is like stopping more than 100 people from prompting ChatGPT for their entire lives. Preventing ChatGPT prompts is a hopelessly useless lever for the climate movement to try to pull. We have so many tools at our disposal to make the climate better. Why make everyone feel guilty over something that won’t have any impact?
The average American emits about 100,000 times as much CO2 each day as a single ChatGPT prompt. If each of these dots is one ChatGPT prompt, all the dots together are how much you emit in one day.
I still find, even after showing this, that some people think using literally any additional energy is bad, because “every bit matters.” The thing is that our energy use changes a lot day to day, just like the money we spend changes day to day. If I started spending an additional penny per month, I wouldn’t notice, because there would be other days where I’d randomly spend way more or fewer pennies on other things. If I looked at a graph of my spending, the penny would be drowned out in the random noise of my other decisions. ChatGPT prompts are like this. They use so little energy that they get drowned out in the random ways we change our energy use day to day. If you looked at a graph of my energy footprint before and after using ChatGPT, you wouldn’t notice any change at all.
We have limited hours in the day, and different choices for how we spend our time. If prompting ChatGPT 100 times and reading its responses takes up hours of my time that I could have spent playing a video game or watching Netflix or driving my car, then using it actually prevents me from emitting way more, because those other things use way more energy per hour. Printing a physical book uses 5,000 Wh, so even just sitting down and reading a book you bought for 6 hours (using 833 Wh per hour) is going to use more energy per minute than ChatGPT, unless you prompt ChatGPT more than about 2,800 times per hour, or roughly once every 1.3 seconds for the full hour. Switching to ChatGPT from another activity is almost always going to decrease the total energy I use every day. This isn’t an argument that you should only use ChatGPT! It’s often worth it to spend more energy. But people sitting and using ChatGPT are often using way less energy per minute than almost anyone else in the world.
Water
I think a lot of people don’t realize how much water we each use every day.
Almost all electricity generation involves heating water to create steam to spin a turbine. The American energy grid uses 58,000,000,000,000 gallons of water every year. That’s enough to cover the entire surface area of Pennsylvania in Olympic swimming pools.
In America it takes about 2 gallons of water to produce 1 kWh of electricity. The average American uses around 30 kWh per day, so they use 60 gallons of water per day just to generate their electricity. They also use around 100 gallons of water directly each day, so 160 gallons in total.
When I hear people say “50 ChatGPT prompts use a whole bottle of water!” I think they’re internally comparing this to the few times a year they buy a bottle of water. That makes ChatGPT’s water use seem like a lot. They’re not comparing it to the 1200 bottles of water they use every single day in their ordinary lives.
A ChatGPT prompt running on GPT-3 used between 10 and 25 mL of water if you include the water cost of training, the water cost of generating the electricity used, and the water used by the data center to cool the equipment. There’s a common misconception that ChatGPT uses a whole bottle of water per email. This claim traces back to a single quote in the Washington Post from one scientist, with no explanation of how the figure was derived. Every investigation into this makes it look effectively impossible for a few ChatGPT prompts to use a whole bottle of water.
Water used in data centers is usually a direct function of how much energy a process uses. Because it seems like the energy per prompt decreased from GPT-3 to GPT-4 and 4o, I’ll use the 10-25 mL as a reasonable upper bound for how much water prompts are currently using. New data from Google implies that each prompt might only use ~2 mL of water.
This means that every single day, the average American uses enough water for 24,000-61,000 ChatGPT prompts.
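Here’s the arithmetic behind that range, using the figures above (2 gallons of water per kWh, 30 kWh/day, 100 gallons of direct use, and 10-25 mL per prompt):

```python
# Sketch of the comparison above, using the post's figures: ~2 gallons
# of water per kWh generated, ~30 kWh/day of electricity, ~100 gallons
# of direct daily water use, and 10-25 mL of water per prompt.
ML_PER_GALLON = 3785.4

daily_gallons = 30 * 2 + 100          # 160 gallons per day in total
daily_ml = daily_gallons * ML_PER_GALLON

prompts_at_25ml = daily_ml / 25       # if a prompt costs 25 mL
prompts_at_10ml = daily_ml / 10       # if a prompt costs 10 mL

print(f"{prompts_at_25ml:,.0f} to {prompts_at_10ml:,.0f} prompts per day")
```

That works out to roughly 24,000-61,000 prompts’ worth of water per day, and far more if the ~2 mL Google figure turns out to be right.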
Each dot in this image represents one ChatGPT prompt’s worth of water. All the dots together represent how much water you use in one day in your everyday life:
If you want to prompt ChatGPT 40 times, you can just stop your shower 1 second early. If you normally take a 5 minute shower, set a timer for 299 seconds instead, and you’ll have saved enough water to justify 40 ChatGPT prompts.
If you want to reduce your water footprint, avoiding ChatGPT will never make a dent.
Everything else we do online uses water in data centers too. ChatGPT seems to have been singled out because it uses a little more water per use than Google, but it doesn’t look bad relative to other normal online things we do.

ChatGPT is bad relative to other things we do (it’s ten times as bad as a Google search)
If you multiply an extremely small value by 10, it can still be so small that it shouldn’t factor into your decisions.
If you were being billed $0.0005 per month for energy for an activity, and then suddenly it began to cost $0.005 per month, how much would that change your plans?
A digital clock uses one million times more power (1W) than an analog watch (1µW). “Using a digital clock instead of a watch is one million times as harmful to the climate” is correct, but misleading. The energy digital clocks use rounds to zero compared to travel, food, and heat and air conditioning. Climate guilt about digital clocks would be misplaced.
The relationship between Google and ChatGPT is similar to watches and clocks. One uses more energy than the other, but both round to zero.
When was the last time you heard a climate scientist say we should avoid using Google for the environment? This would sound strange. It would sound strange if I said “Ugh, my friend did over 100 Google searches today. She clearly doesn’t care about the climate.” Google doesn’t add to our energy budget at all. Assuming a Google search uses 0.03 Wh, it would take 300,000 Google searches to increase your monthly energy use by 1%. It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.
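The 300,000 figure falls straight out of the assumptions above (0.03 Wh per search, and roughly 30 kWh/day, or ~900 kWh/month, of household electricity):

```python
# The Google search claim, spelled out. Assumptions: 0.03 Wh per
# search and ~30 kWh/day (~900 kWh/month) of household electricity.
monthly_wh = 30 * 30 * 1000        # 900,000 Wh per month
one_percent_wh = monthly_wh / 100  # 9,000 Wh

searches_needed = one_percent_wh / 0.03
print(f"{searches_needed:,.0f} searches to add 1% to monthly use")
```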
What matters for your individual carbon budget is total emissions. Increasing the emissions of a specific activity by 10 times is only bad if that meaningfully contributes to your total emissions. If the original value is extremely small, this doesn’t matter.
It’s as if you were trying to save money and had a few options for where to cut:
You buy a gum ball once a month for $0.01. Suddenly their price jumps to $0.10 per gum ball.
You have a fancy meal out for $50 once a week to keep up with a friend. The restaurant host likes you because you come so often, so she lowers the price to $40.
It’s very unlikely that spending an additional $0.10 per month is ever going to matter for your budget. Spending any mental energy on the gum ball is going to be a waste of time for your budget, even though its cost was multiplied by 10. The meal out is making a sizable dent in your budget. Even though it decreased in cost, cutting that meal and finding something different to do with your friend is important if you’re trying to save money. What matters is the total money spent and the value you got for it, not how much individual activities increased or decreased relative to some other arbitrary point.
Google and ChatGPT are like the gum ball. If a friend were worried about their finances, but spent any time talking about foregoing a gum ball each month, you would correctly say they had been distracted by a cost that rounds to zero. You should say the same to friends worried about ChatGPT. They should be able to enjoy something that’s very close to free. What matters for the climate is the total energy we use, just like what matters for our budget is how much we spend in total. The climate doesn’t react to hyper specific categories of activities, like search or AI prompts.
If you’re an average American, each ChatGPT prompt increases your daily energy use (not including the energy you use in your car) by 0.001%. It takes about 1,000 ChatGPT prompts to increase your daily energy use by 1%. If you did 1,000 ChatGPT prompts in one day and felt bad about the increased energy use, you could remove an equal amount of energy from your daily use by:
Running a clothes dryer for 6 fewer minutes.
Running an air conditioner for 18 fewer minutes.
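The equivalences above can be sketched in a few lines. The dryer and air conditioner wattages (~3 kW and ~1 kW) are typical-appliance assumptions consistent with the figures in this post:

```python
# Sketch of the equivalences above. Assumptions: 0.3 Wh per prompt,
# ~30 kWh/day of household electricity, a ~3 kW clothes dryer,
# and a ~1 kW air conditioner.
prompt_wh = 1000 * 0.3                      # 1,000 prompts -> 300 Wh
pct_of_daily_use = prompt_wh / 30_000 * 100

dryer_minutes = prompt_wh / 3000 * 60       # dryer time to skip
ac_minutes = prompt_wh / 1000 * 60          # AC time to skip

print(f"{pct_of_daily_use:.0f}% of daily use; "
      f"dryer: {dryer_minutes:.0f} min, AC: {ac_minutes:.0f} min")
```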
ChatGPT uses enough energy that you should be very careful with how you use it. Don’t use it as a search engine or a calculator or just to goof around
There are costs that are just so incredibly small that it does not matter if you take a few more. Imagine that someone told you that they had perfectly timed their microwave down to the second for each meal. They knew that their vegan chicken nuggies needed exactly 3 minutes and 42 seconds. Any longer and they would be wasting energy. That would be admirable, but a very tiny thing that would take a lot of extra effort relative to how much it helps the climate (basically not at all). If someone else just set the microwave to 4 minutes, this would use so little extra energy that it wouldn’t be bad for the climate at all.
This would also be 20 ChatGPT prompts’ worth of extra energy.
I sometimes hear people say that you should “think carefully about your ChatGPT use” and “be environmentally aware while using it” because of its environmental impact, and not just use it for simple things you could use other services for, like a simple search or a calculator or just making jokes. This sounds a lot like scolding the person setting their microwave to 4 minutes instead of 3:42. It’s misunderstanding just how little energy is involved.
I regularly use the Google search bar as a calculator. I’m too lazy to click on the calculator app on my computer. The search bar is right there. This adds a tiny tiny bit of energy cost, but it’s not enough that I should ever worry.
Suppose you gave yourself an energy budget for goofy ChatGPT prompts. Every year, you’re allowed to use it for 1,000 goofy things (a calculator, making funny text, a simple search you could have used Google for). At the end, all those prompts together would have used the same amount of energy as running a clothes dryer a single time for six minutes. This would increase your energy budget by 0.003%. This is not enough to worry about. If you feel like it, please goof around on ChatGPT.
There are other hidden costs that you’re not including here
There are a ton of different ways we could add to the cost of a ChatGPT prompt by considering things like how much it “normalizes” using AI, or how much it encourages the data center buildout overall. Because this is such a long deep dive, I’ve written it as a separate post that you can read here.
Global use
Data centers use so much energy that in some places coal plants are re-opening to support them! This is a sign that AI is using a lot of energy per prompt
This point requires a lot of details on what data centers are, how they work, and how they manage AI prompts. It will be the longest section here.
The reasons why data centers are using a lot of energy in the places they are built are that:
AI as a whole is being used a lot.
Data centers concentrate huge amounts of individually very tiny computer tasks in one place. Concentrating these tasks makes data centers way more energy-efficient. For AI data centers, in the equation
total energy = (energy per prompt) x (number of prompts)
the energy per prompt is very small, but the number of prompts is extremely large. Data centers manage AI responses from people all over the world, concentrated in individual buildings.
Data centers put very concentrated demand on local grids without increasing global emissions much, but the reason it’s concentrated is to make the data center maximally energy-efficient. This very concentrated demand sometimes incentivizes more carbon-intensive energy sources.
We’ll explore each in a lot of detail in this section. If you want, you can skip to the summary here.
The reason AI’s energy costs are rising so much is that it’s being used so much, not that each prompt uses a lot of energy
AI's energy use has exploded because AI usage has exploded.
ChatGPT grew faster than any previous major app.
It’s now the 5th most popular website with 5.7 billion monthly visits, beating out Wikipedia, Reddit, and Amazon. It’s processing 2.5 billion daily prompts. 51% of professional developers surveyed by Stack Overflow use AI daily, and only 15% report that they don’t use it or plan to. 20-30% of code at Microsoft is now AI-written. AI is also embedded into many ways we use the internet, the clearest example being Google search’s AI summary.
Generative AI may be the fastest-adopted technology ever.
Critics point out that this doesn't account for the intensity of use (which is lower than this graph would suggest), but AI still ranks among the most rapidly adopted technologies.
In the equation Total energy = (Number of prompts) x (energy per prompt), the energy per prompt is low, but the number of prompts is so high that the total energy is high too. Next, we’ll see why data centers put so much strain on local grids, even though globally they use very little energy relative to how much we interact with them.
What is a data center?
A data center is a facility that stores a large number of computers, and infrastructure for power, cooling, and networking them. It’s effectively a building-sized computer. Data centers run most things you do online. Websites, cloud storage, email, and video streaming are all hosted and stored in data centers. You are interacting with data centers most of the time you’re online.
AI models are trained and run in data centers. When people talk about the environmental effects of AI, they are often talking about AI activity in data centers.
Data centers use lots of energy for massive amounts of computing. Similar to fans in personal computers, data centers require cooling systems to deal with the heat generated when they run. The most energy-efficient way to cool them is often running cool water near the servers to absorb the heat. The warmed water is then either evaporated, or cooled down and circulated again.
Learn more about data centers here.
The central environmental paradox of data centers
The environmental paradox of data centers is that they put so much strain on the surrounding energy grid because of, not despite, the fact that they have been made so uniquely energy-efficient. This is due to the economies of scale that come with running so many servers in one building.
Data centers are very efficient
Power Usage Effectiveness (PUE) measures data center efficiency: total energy input divided by energy delivered to servers.
PUE = (Energy the data center takes in) / (Energy the data center delivers to servers)
A perfectly efficient data center would have a PUE of 1. The energy coming in would perfectly equal the energy used by the chips. No data center can actually achieve a PUE of 1, due to lighting, air conditioning, and heat loss.
PUE does not measure water use, or the carbon intensity of the electricity. Data centers use a separate WUE measure for water.
In practice, hyperscale data centers often achieve ~1.1. The industry average is ~1.56.
For context, power lines typically lose ~5% of energy during transmission, meaning they have an energy input-to-output ratio of 1.05. Google has achieved PUE below 1.05 at some of their data centers. This means Google has optimized their data center operations so efficiently that they waste less energy on non-computing infrastructure (cooling, lighting, etc.) than the average power line loses just moving electricity from point A to point B. In other words, their data centers are more energy-efficient at delivering power specifically to computing chips than the electrical grid is at simply transmitting power.
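To make the metric concrete, here’s PUE as a function, using the industry-average and hyperscale figures quoted above as illustrative inputs:

```python
# PUE in code: total energy the facility draws divided by the energy
# that actually reaches the IT equipment. Values are illustrative,
# taken from the figures quoted in this post.
def pue(facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness; 1.0 would mean zero overhead."""
    return facility_kwh / it_kwh

industry_average = pue(1.56, 1.0)   # ~1.56, the industry average
hyperscale = pue(1.10, 1.0)         # ~1.1, optimized hyperscalers

# Share of total energy lost to overhead (cooling, lighting, etc.):
overhead_share = 1 - 1 / industry_average
print(f"overhead at industry-average PUE: {overhead_share:.0%}")
```

At the industry-average PUE of 1.56, about 36% of the energy entering the building never reaches a chip; at 1.1, that drops to about 9%.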
Many hyperscaler data centers are achieving very low PUEs because they are optimized at every level of design and operation. Because the companies using them are also running them, they have a financial incentive to optimize the data center’s energy use.
Larger data centers benefit from economies of scale: shorter transmission distances, reduced losses, and optimization opportunities unavailable to smaller facilities. There are other benefits to clustering AI chips together as well, but the energy benefits alone are significant. Larger data centers operate more efficiently.
Data center power demand can get very large. For context, 30 MW of power (on the far right of the graph) is enough to power a town of about 60,000 people. xAI’s Colossus is the largest AI supercomputer, with a maximum power capacity of 280 MW, the same as a city with half a million people.
Because larger data centers are more energy-efficient, if you want your AI prompt to use as little energy as possible, you should (all else equal) prefer it to be processed in a very large data center.
In general, data centers are the most energy-efficient way to do the large-scale computing that services like the internet and AI require, and companies already have a financial incentive to make them efficient. Holding the amount of computation constant, data centers are the most energy-efficient method available.
Data center energy and water demands are high relative to most commercial buildings
Data centers have some of the most concentrated energy demands of any type of building. They also require constant reliable power. Their computers run 24/7, and because a single large data center is handling computing for so many people, even a brief power outage can be uniquely bad for the business. Data centers aspire to “five 9’s of reliability”: the center should function normally for 99.999% of the time. As a result, they strain local grid capacity.
This strain can lead to greater reliance on fossil fuels. This occurs through the concept of marginal emissions. When electricity demand rises, utilities must activate additional power sources. The "marginal" source (the next plant brought online) is often a fossil fuel "peaker plant" (a power plant called upon during times of high electricity demand to supply power to the grid, often using natural gas turbines that can start up quickly). These plants are designed to ramp up quickly but are often less efficient and more polluting per unit energy than the average grid mix.
A data center's constant, high demand reduces the grid's flexibility and increases the baseline load, making it more likely that these high-polluting marginal sources are used. Furthermore, the rapid growth of data center demand is outpacing the deployment of new renewable energy and transmission infrastructure. This has led utilities in several states, including Georgia, Kansas, and Virginia, to delay the retirement of older fossil fuel plants or propose new natural gas plants to ensure grid reliability.
So when a large new constant load like a data center is added, it can increase the amount of fossil generation needed at the margin, making the power they consume more carbon-intensive than the grid average. Data centers also maintain backup generators (often diesel) in case of outages.
The paradox
To recap:
Data centers are so uniquely energy-efficient, and become more efficient the more computing they manage (due to economies of scale), that companies are incentivized to do gigantic amounts of computing in them, and keep them running 24/7. All else equal, if you want an AI tool to use the least energy possible, you should prefer it to be run in the biggest data center possible.
Because they concentrate so much computing inside, and need constant reliable power, even though each computation is energy-efficient, overall data centers have the highest concentrations of energy demand of basically any commercial buildings.
Local energy and water grids were not designed with these hyper-concentrated, very constant demands in mind, so data centers can sometimes strain the capacity of local grids. This sometimes increases fossil fuel dependence, because fossil fuel plants can provide more consistent power than renewables.
This demand puts a strain on the surrounding grid and sometimes favors the use of fossil fuels, even though (and actually because) what’s happening in the data center itself is so uniquely energy-efficient.
The central paradox: data centers strain local grids precisely because they've been made so efficient.
Failing to understand this can lead to bad ideas for how to solve the growing problem of AI emissions. While it may be possible to make small improvements in data center efficiency, it’s unlikely that this would reduce emissions nearly as much as making the grid around data centers greener or enforcing stricter environmental regulations around how the data center generates back-up energy.
What should we expect to see based on the paradox of data centers?
Locally, we should expect data centers to have much higher rates of energy and water use than basically any other commercial or industrial buildings, and to put some strain on local grids as a result.
Globally, data centers should be a small part of energy and water use (and emissions), because they’re uniquely efficient with the resources they use.
The data confirms this. Individual data centers can grow so large that they take on the energy demands of whole cities, but globally all data centers on Earth (supporting the entire internet in addition to AI) use 1.5% of global electricity, and only 0.23% of global energy. Data centers are managing and hosting the global internet. The average person on Earth spends 6 hours and 40 minutes every day on the internet, so they’re spending 40% of their waking lives interacting with data centers. Data center efficiency means that something we all spend almost half our lives interacting with only uses 0.23% of our global energy.
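That 40% figure follows from simple arithmetic. Here's a quick sketch; the assumed sleep time (~7 h 20 min, giving ~16.7 waking hours) is my assumption, not a figure from the source:

```python
# How 6 hours 40 minutes per day online becomes ~40% of waking life.
# The waking-hours figure is an assumption, not from the source.
online_hours = 6 + 40 / 60          # 6 h 40 min online per day
waking_hours = 24 - 7.33            # assuming ~7 h 20 min of sleep
share_awake_online = online_hours / waking_hours

print(f"{share_awake_online:.0%}")  # ~40%
```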
The national microwave example
One of the most important facts about climate change is that where emissions happen is counterintuitive and often hidden from us. Using a computer for hours doesn’t add nearly as much to your daily carbon footprint as eating a burger, but the computer feels more energy-intensive than your lunch.
The climate does not react to which industry emits or which specific buildings emit. It only reacts to the total CO2 in the air.
The national microwave
If data centers didn’t exist, we would need to rely on our personal computers to run everything we do online. When you logged into YouTube, you’d run YouTube software on your own computer like it was a video game, and save all videos you upload on your own computer. Every time someone else wanted to watch your videos, they would need to connect to your computer like other players on a video game can connect to a game you’re running. The magic of data centers is that by piling huge amounts of computing in one place, you can make all that computing more energy-efficient than it would be otherwise, and more accessible for everyone involved.
Other things don’t work this way. If I want to microwave something, I have to own my own microwave. I can’t send things off to some centralized microwave somewhere else.
What if I could?
Let’s say there were a single national microwave. When you needed food heated up, you would teleport the food to the national microwave, in the same way you effectively teleport information from your computer to data centers.
This microwave would need to be massive to fit all microwaved meals Americans are making at any given time.
I estimate that all microwaves in America use ~25 GWh of electricity every day. This means that the big national microwave would be using as much electricity every day as Seattle. An entire new city’s worth of electricity demand added to the grid, just for heating food!
What about the water? Well, the average water consumption per kWh in American power plants is 4.35 L, so this microwave would be consuming 110,000,000 liters of water every single day. 45 Olympic pools of water every day.
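These numbers can be checked with a quick back-of-envelope script. The 25 GWh/day and 4.35 L/kWh figures come from the text; the Olympic pool volume (~2.5 million liters) is my assumption:

```python
# Back-of-envelope check of the national microwave's water use.
DAILY_MICROWAVE_KWH = 25e6      # 25 GWh expressed in kWh (from the text)
WATER_L_PER_KWH = 4.35          # avg consumptive water per kWh generated
OLYMPIC_POOL_L = 2.5e6          # approximate Olympic pool volume (assumed)

daily_water_l = DAILY_MICROWAVE_KWH * WATER_L_PER_KWH
pools = daily_water_l / OLYMPIC_POOL_L

print(f"{daily_water_l:,.0f} L/day")   # 108,750,000 L/day, ~110 million
print(f"~{pools:.0f} Olympic pools")   # ~44 pools; the post rounds to 45
```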
I think if the big national microwave existed, it would be drawing a lot of scorn. “This seems so wasteful. What’s wrong with just using an oven?” People would see it as sapping a whole city’s worth of energy. There would be articles about how it’s sapping the local grid and using as much energy as 800,000 households.
But this microwave does exist. It’s just hidden, broken up into hundreds of millions of little pieces of itself scattered across America. The impacts are just as real as if it existed in one place. The climate does not care about where emissions happen. It just responds to total emissions. All microwaves in America are emitting just as much as the big national microwave would. But they don’t exist in one place as a single evil building people can get mad at. The big national microwave is invisible.
Suppose someone says “Tech bros just reinvented the oven, only this time it’s destroying the planet.” Would we want people to stop using the big microwave and switch to their ovens?
Well, ovens use way more energy to cook the same food. Getting people to stop using the big microwave and switch to ovens would actually cause way more emissions in total, probably around 10 times as much. Those emissions would just be dispersed across the country, so people would feel better about them, because they couldn’t see them. People really like when they can’t actually see the effects they’re having on the climate, but those effects are still real. Shutting down the big national microwave would be a huge environmental mistake, even though it looks much more evil than everyone’s individual ovens.
These are three mistakes I think people would make if we had the big national microwave:
They would only see it as a single, inert building using a ton of energy, and wouldn’t compare its benefits per unit emissions to other buildings. They wouldn’t consider how many more people were interacting with it, or divide the energy cost by the number of people using it. It would make sense that a building managing all microwave needs for all America would be using more energy than a nearby toy store, because it would be serving a lot more people. The toy store is actually way more inefficient with its energy, but comparing it to the toy store would be ridiculous. If it sounds ridiculous to say “All American microwaves are using more energy than this one toy store, people should all throw away their microwaves!” then it should sound equally ridiculous to condemn the big national microwave for the same reason.
They would want to shut it down and have everyone switch to things that seem more “normal” like using an oven, even though those normal things are mostly way worse for the climate. The normal things don’t have a big evil building of their own to represent how much energy they’re really using, so unlike the microwave, their emissions are invisible, even though they’re much greater in total.
They would only see the building as using a lot of energy relative to everyday people (800,000 households!) and wouldn’t consider it in the context of climate overall, or how big a problem it is compared to most other things we do. They would prioritize optimizing the microwave over much easier climate wins, like shutting down coal plants or electrifying cars.
People also make these three mistakes when thinking about data centers. When you see a climate argument against data centers, check whether the same argument would apply to the national microwave but not to regular microwaves. If it would, that tells you the person is really objecting to the emissions being visible, not to them being large.
Data centers as the national microwave
Data centers are weird buildings. They concentrate more power demand in a smaller space than basically any other type of building. They also, at any one moment, have hundreds of thousands of people from around the world interacting with them.
From the outside, data centers look boring and inert.
But in some sense they’re the most active physical objects we’ve ever built. They’re building-sized computers constantly running gigantic amounts of equations. Microsoft’s 2023 GPT-4 training cluster reportedly had ~25,000 A100 GPUs in a single build. This amount of chips could perform around 10^18 (a million trillion) addition or multiplication calculations per second. All humans who have ever lived, doing one calculation per second for 24 hours per day, could together do that many calculations in about 100 days.
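The "about 100 days" claim is easy to sanity-check. The 10^18 ops/sec figure is from the text; the ~117 billion humans who have ever lived is my assumption (a common demographic estimate):

```python
# Sanity check on the "all humans who have ever lived" comparison.
CLUSTER_OPS_PER_SEC = 1e18   # from the text
HUMANS_EVER = 117e9          # assumed; a common demographic estimate
SECONDS_PER_DAY = 86_400

# Days for all of humanity, at one calculation per second each,
# to match what the cluster does in one second.
days_needed = CLUSTER_OPS_PER_SEC / HUMANS_EVER / SECONDS_PER_DAY
print(round(days_needed))  # ~99 days
```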
Each data center is more of a massive very general tool than a normal building.
Like the national microwave, hundreds of thousands of people interact with a data center at once. People effectively teleport computer tasks to some far-off building that deals with them much more efficiently and reliably. When you read stories about data centers using huge amounts of power, you should think “This is because tens of thousands of people are interacting with the building at once and using it as a tool, like a gigantic national calculator.” You might think the way the data center’s being used is bad or wasteful, but you should still hold in your head that this is just a very, very concentrated result of tens of thousands of small individual decisions, like the microwave is.
Data centers are uniquely energy-efficient. Basically all the energy they take in is delivered directly to incredibly optimized computer tasks. They’re the most energy-efficient way to do a lot of computing. Since computing can give us valuable information about the world, it seems good to be able to build these huge clusters where computing can happen more efficiently than anywhere else.
All global data centers were responsible for 0.48% of emissions in 2024. On this chart of where global emissions were happening (in 2016, more recent data’s hard to find) all data centers would be about the size of the “rail” line.
Obviously this is not nothing, but take a minute to look through everything else in the infographic. If you hadn’t heard of data centers before, and saw them listed as a tiny stripe (they would be a small sliver of the “commercial” stripe), would something emitting 0.48% of global emissions stand out to you as something that required special attention? How many conversations in the last year have you seen about the emissions of cement, or landfills, or paper & pulp production?
I think a part of the reason you haven’t seen more discussions of the latter is that they don’t have big evil buildings of their own. There’s no big centralized paper production mill that’s using huge amounts of energy and water people can get mad about, even though paper production overall is emitting significantly more than data centers right now. People are less upset about those emissions because they’re invisible.
Identifying climate bad guys by just looking at what seems like it’s a bad guy is a terrible way to think about the problem. It makes it easy to become tricked by industries hiding or normalizing their emissions.
Of course, unlike many of these other industries, data center energy use is growing.
The IEA forecasts that data center emissions will peak in 2030.
This is what their emissions look like now, and will look like in 2030, compared to other industries and emissions overall. I’ve split them up into the IEA’s implied forecasts for AI specifically and non-AI applications:
Again, not nothing, but the 2030 numbers are the very most emissions data centers are forecast to ever cause in one year.
Unlike most other things on this list, data centers can help us in basically every other area of climate. Having very general tools to do immense numbers of calculations and optimization tasks is useful for tackling other problems. Of all the areas I’d want to cut for the climate, having giant warehouse-sized calculators for these tasks seems like one of the last places to cut. Even though data centers are newer, they’ll probably be much more useful for solving the climate crisis than most other industries.
But these other industries don’t have big evil buildings of their own. They have much more effectively hidden their emissions from everyday people, or normalized them so people treat them as permanent unchangeable background parts of the world. If we want to think seriously about climate, we need to be willing to slice through those illusions and think of the whole world as more malleable. When we do that, data centers stop standing out. They take their place among other very energy-optimized industries that serve a lot of people and emit relatively tiny amounts per user, and the main climate villains (things like cars, livestock, and bad energy management in buildings) become more apparent.
More than anything else, it’s the energy grid itself that needs to change, rather than the relatively small individual ways we use energy. 1.5% of electricity in the world is being used on computing in data centers. The problem isn’t that this number should move from 1.5 to 0% (computing is incredibly valuable!), it’s that 100% of electricity should be green in the first place.
None of this is to say that data center emissions don’t matter at all, only that they’re taking up a suspiciously large part of the climate conversation relative to how much they actually emit, or how beneficial it would be to stop building them.
If cars or livestock or energy waste had their own big evil buildings, it would become much more obvious how big of a climate disaster each is. It’s our job as people thinking about climate to make the invisible visible.
It is famously difficult to predict how AI progress will play out, but most expert predictions of AI’s net effects on the climate lean in the direction that it will on net prevent more emissions than data centers cause. The International Energy Agency says:
The adoption of existing AI applications in end-use sectors could lead to 1400 Mt of CO2 emissions reductions in 2035 in the Widespread Adoption Case. This does not include any breakthrough discoveries that may emerge thanks to AI in the next decade. These potential emissions reductions, if realized, would be three times larger than the total data centre emissions in the Lift-off Case, and four times larger than those in the Base Case.
So in a world where we adopt AI in more parts of the economy faster, it looks like on net it will prevent more total emissions. If we achieve the high adoption situation the IEA describes, it will on net reduce global emissions by 1100 Mt of CO2 each year. That’s the current total yearly emissions of all global shipping and aviation. Seems good!
Summary
So what does this mean for your individual AI prompts?
Stories of AI data centers using a lot of local resources are due to them concentrating global demand for AI into a few very dense buildings, not each individual prompt using a lot of energy. In the equation Total energy = (Energy per prompt) x (Number of prompts), it’s the number of prompts that’s large, not the energy per prompt.
Usually, the larger the data center, the more energy-efficient it is. Computing in data centers is much more energy-efficient than on your computer.
The fact that AI prompts are run in a big data center makes them no worse for the climate than if you ran them on your personal computer, because the climate does not respond to where emissions happen, only total emissions. Data centers draw from energy sources that are more carbon intensive than normal, but the emissions from your individual prompts still round to zero compared to basically every other way you spend your time. Reducing your individual AI prompts will have no meaningful impact on your emissions.
Many people have trouble visualizing the aggregate results of doing everyday activities. If there were a single national microwave, it would use as much energy every day as Seattle. Data centers make this aggregate energy use visible, but the aggregate energy of most other ways we spend our time is invisible. Data centers only look like they’re using way more energy because we can’t directly see all the energy of other things we do gathered together into specific buildings.
If you are worried about the local environmental impacts of data centers, the way to solve them will be regulation from local governments, not individual consumer boycotts. Data centers concentrate so many prompts in one place that even large numbers of people boycotting AI wouldn’t change how much an individual data center emits.
Everything you do online involves interacting with data centers. We shouldn’t be surprised that buildings we spend half our time interacting with are using a noticeable amount of energy. It’s still incredibly small compared to most other things we do.
Globally, forecasts for how much energy data centers will use into 2030 and beyond seem pretty underwhelming, especially when you consider that everyone on Earth is spending half their time interacting with them.
Data centers are not inert buildings that just burn through energy and resources for no reason. They’re building-sized computers that have been uniquely optimized to do complex computing tasks. Of all the ways that we can spend energy as a society, they seem more useful than many other options.
Data centers harm local water access
This one’s actually just false. I have a much longer deep dive on this here. There is nowhere in America where data center operational use of water has increased household water prices at all. Why is this?
Data centers are not using that much water
Data centers in America had a consumptive water use of 200-250 million gallons of water per day in 2023 (including offsite water used by the power plants that generate their electricity). If we just look at the consumptive water use in the data centers themselves, it was more like 48 million gallons of water per day.
It’s surprisingly hard to find good estimates of the US’s total consumptive water use. This paper finds it’s around 132 billion gallons per day. This implies all data centers consumed about 0.15-0.19% of the total freshwater we consumed in 2023. 1/500th of our freshwater supply. Data centers themselves used just 0.04%. Obviously not nothing, but I think putting this number in context makes it seem way less extreme. I repeat this point a lot, but Americans spend half their waking lives online. Everything we do online interacts with and uses energy and water in data centers. It’s kind of a miracle that something we spend 50% of our time using only consumes 0.19% of our water.
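The percentages follow directly from the daily figures. A quick check, treating the on-site number as ~48 million gallons per day (consistent with the 0.04% share):

```python
# Data centers' share of US consumptive freshwater use, 2023 figures.
US_CONSUMPTIVE_GAL_PER_DAY = 132e9   # total US consumptive use per day

estimates = {
    "total, low (incl. power plants)": 200e6,
    "total, high (incl. power plants)": 250e6,
    "on-site only": 48e6,
}
for label, gal_per_day in estimates.items():
    print(f"{label}: {gal_per_day / US_CONSUMPTIVE_GAL_PER_DAY:.2%}")
# total, low: 0.15%; total, high: 0.19%; on-site only: 0.04%
```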
In 2023 all data centers in America collectively consumed as much water as the lifestyles of the residents of Paterson, New Jersey. AI uses ~15% of energy in data centers globally. It seems unlikely AI in America specifically is even half of the water footprint of Paterson. If we just include the water used in data centers themselves, this drops to the water footprint of the population of Albany, New York.
Water economics
America is good at water economics. Our water management has a lot of ways to keep rates low for consumers. This (and AI’s very low total use of water) is the main reason it hasn’t affected water prices at all.
Household and commercial water prices are different everywhere to keep the markets separate so commercial buildings aren’t competing with homeowners. Data centers only compete with other businesses for water, like any other industry.
In low water scarcity areas, water isn’t zero sum. More people buying water doesn’t lead to higher prices, it gives the utility more money to spend on drawing more water and improving infrastructure. It’s the same reason grocery prices don’t go up when more people move to a town. More people shop at the grocery store, which allows the grocery store to invest more in getting food, and they make a profit they can use to upgrade other services, so on net more people buying from a store often makes food prices fall, not rise. Studies have found that utilities pumping more water, on average, causes prices to fall, not rise.
The only time water costs rise in low water stress areas when a large new consumer arrives is when the consumer demands so much water that utilities are forced to do major rapid upgrades to their systems, and the consumer doesn’t pay for those upgrades. In every example I can find of data centers requiring water system upgrades, the companies that own the data centers are the main source of the revenue used to pay for them.
The main water issue in American small towns isn’t the supply of water, it’s aging water infrastructure that doesn’t serve a large or rich enough tax base to get the money to upgrade. Old infrastructure makes water more expensive. It can also be dangerous (lead pipes etc.). Small town water costs are often higher, not lower, than cities, due to economies of scale. This wouldn’t happen if water costs simply rose when more water is used. Data centers moving into small towns often provide utilities with enough revenue to upgrade their old systems and make water more, not less, accessible for everyone else.
In high water scarcity areas, city and state leaders have already thought a lot about water management. They can regulate data centers the same ways they regulate any other industries. Here water is more zero sum, but data centers just end up raising the cost of water for other private businesses, not for homes. Data centers are subject to the economics of water in high scarcity areas, and often rely more on air cooling rather than water cooling because the ratio of electric costs to water costs is lower.
This seems fine if we think of data centers as any other industry. Lots of industries in America use water. AI is using a tiny fraction compared to most, and generating way, way more revenue per gallon of water consumed than most. Where water is scarce, AI data centers should be able to bid against other commercial and industrial businesses for it.
In general, if there’s a public resource like water, it’s considered the job of the utility and government to set rates to reflect its scarcity. Blaming a private business like a data center for using too much water seems kind of like blaming private customers for buying too much food from a grocery store. It’s the grocery store’s responsibility to set prices to reflect the relative scarcity of and demand for different products. If people are buying too much food from the store and there’s not enough money to restock, that’s the fault of the store, not the individuals. Private businesses shouldn’t be expected to monitor the exact state of local water to decide how much is ethical to buy from the utility. It’s the utility’s job to limit demand by setting prices higher if they actually think the company is going to harm the local water system. If utilities set prices high enough, data centers adjust by switching to different types of cooling systems that use less or no water. The market is (like in most places) the way that data centers receive information about the relative scarcity of water. In most conversations about AI and water, the responsibility for water management is oddly shifted to the private company in a way we don’t do for any other industry.
Politicians especially have strong motivations to keep household water prices low. Voters get mad when utility costs rise.
There are many cases of data centers being built, providing lots of tax revenue for the town and water utility, and the locals benefiting from improved water systems. Critics often read this as “buying off” local communities, but there are many instances where these water upgrades just would not have happened otherwise. It’s hard not to see it as a net improvement for the community. If you believe it’s possible for large companies using water to just make reasonable deals with local governments to mutually benefit, these all look like positive-sum trades for everyone involved.
Here are specific examples:
The Dalles, Oregon - Fees paid by Google fund essential upgrades to water system.
Council Bluffs, Iowa - Google pays for expanded water treatment plant.
Quincy, Washington - Quincy and Microsoft built the Quincy Water Reuse Utility (QWRU) to recycle cooling water, reducing reliance on local potable groundwater; Microsoft contributed major funding (about $31 million) and guaranteed project financing via loans/bonds repaid through rates. These improvements increase regional water resilience beyond the data center itself.
Goodyear, Arizona - In siting its data centers, Microsoft agreed to invest roughly $40–42 million to expand the city’s wastewater capacity—utility infrastructure the city highlights as part of the development agreement and that increases system capacity for the community.
Umatilla/Hermiston, Oregon - Working with local leaders, AWS helped stand up pipelines and practices to reuse data-center cooling water for agriculture, returning up to ~96% of cooling water to local farmers at no charge. That 96% figure comes from AWS itself; I’m not sure if it’s correct.
I could go on like this for a while. Maybe you think every one of these is some trick by big tech to buy off communities, but all I’m seeing here is an improvement in local water systems, without any examples of equivalent harm elsewhere.
In general, the US has a lot of freshwater.
America has among the cheapest water costs of any nation in the world.
Because we’re also the richest nation in the world, our water costs are incredibly low as a percentage of per capita income.
AI as a normal industry
If a steel plant or college or amusement park were built in a small town, it would be normal for it to use a lot of the town’s water. We wouldn’t be shocked if the main industry in the town were using a sizable amount of the water there. If it provided a lot of tax revenue for the town and was not otherwise harming it, I think we would see this as a positive, and wouldn’t talk in an alarmed way about the specific percentages of the town’s water the industry was using. We should think of AI like we do any other industry. A data center drops into a town, draws water in the same way a factory or college would, the local system adapts to it quickly, and residents benefit from the tax revenue. The industry could provide enough water payments or tax revenue to improve the town’s water supplies in the first place.
In Texas, data centers paid an estimated $3.21 billion in taxes to state and local governments in 2024. There is no exact figure on total data center water use in Texas in 2024, but there are forecasts that they will use ~50 billion gallons of water in 2025. So excluding the price data centers pay for the water and energy they use, data centers are paying about $0.064 in tax revenue per gallon of water they use. The cost of water in Texas varies between $0.005 and $0.015 per gallon. Even if data centers caused water prices to quadruple, they would still ultimately be paying more back to local communities than the communities lost in additional water prices. This pattern holds everywhere data centers are built. They add huge amounts of tax revenue to local and state economies that seem to more than balance out any negative water externalities.
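The per-gallon arithmetic checks out. A sketch using only the figures above:

```python
# Texas data-center tax revenue per gallon of water, from the text.
TAX_REVENUE_USD = 3.21e9     # 2024 state and local taxes paid
WATER_GALLONS = 50e9         # forecast water use in gallons

tax_per_gallon = TAX_REVENUE_USD / WATER_GALLONS
print(f"${tax_per_gallon:.3f}/gallon")  # $0.064/gallon

# Even a quadrupled water price ($0.02-$0.06/gal) stays below
# the tax revenue generated per gallon.
for price_per_gallon in (0.005, 0.015):
    assert price_per_gallon * 4 < tax_per_gallon
```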
ChatGPT may not raise your own carbon footprint much, but it will be very bad for the environment if everyone starts using it
Things that are tiny parts of our personal emissions are tiny parts of global emissions
I’m going to make an obvious point that I worry is glossed over in these conversations.
Suppose this is a single person’s emissions. The red represents the emissions of a tiny activity they do.
Some might say that this small red square will add up if a lot of people do it. What would several people doing this look like?
We can combine them together into a single clump of total emissions.
Gathering the red dots together:
You may notice that now the ratio of the collective red activity to all the other collective emissions is the same for the group as the individual person.
This obviously makes simple mathematical sense, but I worry that this basic intuition is lost on a lot of people when they say things like “tiny things add up.” Tiny things add up absolutely, but they don’t add up relatively. They often remain the same tiny percentage of total emissions as they are for individual emissions. An activity that’s 1/10,000th of your personal emissions is likely to also be roughly 1/10,000th of global emissions. It’s only as promising a way to reduce global CO2 emissions as it is for reducing your personal CO2. If you’re trying to rally everyone to stop an activity that wouldn’t really raise your personal emissions at all, this is a sign that you’re wasting your and others’ time, attention, and effort on a drastically ineffective thing to do in the name of stopping climate change.
AI fits this pattern perfectly. All data centers emitted 180 Mt CO2 in 2024. AI likely used about 15% of data center electricity, so it emitted around 30 Mt. Most of the emissions from AI come from electricity used in data centers, not from physical construction of AI hardware, so its total emissions weren’t much more than this.
Globally, the world emitted 41,600 Mt CO2 in 2024. AI was responsible for 0.07% of emissions last year. This includes every single instance of AI. Not just chatbots, but literally every instance of deep learning (there are a lot of these). Prompting chatbots 100 times during the day would add about 0.007% to your daily emissions. The reason global AI is 10x higher as a percentage is a combination of three things: you interact with many other AI applications throughout the day without realizing it, Americans emit way more than the global average, and the average global citizen isn’t interacting with AI (or other things) as much. You’ll notice that when you see AI compared to total emissions, its proportion of total emissions looks very similar whether you’re looking at an individual person using chatbots a lot or the world as a whole.
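The 0.07% share can be reproduced directly from the stated totals:

```python
# AI's share of global emissions in 2024, from the figures above.
AI_EMISSIONS_MT = 30         # ~15% of data centers' 180 Mt, rounded up
GLOBAL_EMISSIONS_MT = 41_600

ai_share = AI_EMISSIONS_MT / GLOBAL_EMISSIONS_MT
print(f"{ai_share:.2%}")  # 0.07%
```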
This shouldn’t surprise us: tiny things on an individual level don’t add up to huge things on a global level; they usually add together to cause the same proportion of the globe’s emissions as they cause of our own personal emissions. In this sense, tiny things don’t add up. They’re as promising a solution for climate at the global level as they are for your personal carbon footprint. Tiny things have equally tiny potential for actually moving the needle on how much we emit.
Making the list
What if you could make a list of demands for every behavior change you’d want everyone to make? The list would be ordered by which actions would help the climate the most if everyone did them. You’d want everyone to know about the first few and act on them. Maybe you could get the first 5 across to a wide audience. After that it starts to get more uncertain.
What would be at the top of the list? What would you prioritize? For me, we’d start with these three:
Find any ways you can push for converting the grid to green energy (I’d recommend Clean Air Task Force to get up to speed on where you can be most useful).
Donate to effective climate charities.
Vote for politicians who will take significant positive actions on climate.1
There’s a big power gap between these and any specific lifestyle changes you could make. Lifestyle changes are nothing in comparison to big systematic changes to our energy grids. The differences are not small. They’re gigantic. Here’s an example I used before to illustrate the point:
Suppose there are 3 people who each want to have an impact on the climate: Kate, Bob, and Freddy. They each independently choose their own ways of impacting the climate. All seem really noble and self-sacrificial. Kate joins a committee of 500 people working for a year to keep a nuclear power plant open for another 10 years. Any impact they have will be divided by 500 people. Bob goes vegan for a year. Freddy has a debilitating ChatGPT addiction and prompts it 2,000 times per day. About one prompt every 30 seconds, every waking hour. He quits for a year.
Try to form an idea in your head of how their climate impact compares to each other. It’s not immediately obvious.
After 1 year, Kate has prevented 70,000 tonnes of CO2 from being emitted. Bob has prevented 0.4 tonnes. Freddy has prevented 0.2.
This means that Kate had as much effect as 175,000 Bobs, or 350,000 Freddys.
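In code, the comparison is just two divisions, using the figures as given in the example:

```python
# Comparing the three interventions from the example above.
KATE_TONNES = 70_000    # from keeping the nuclear plant open
BOB_TONNES = 0.4        # one year vegan
FREDDY_TONNES = 0.2     # quitting 2,000 ChatGPT prompts/day for a year

print(round(KATE_TONNES / BOB_TONNES))     # 175000 Bobs
print(round(KATE_TONNES / FREDDY_TONNES))  # 350000 Freddys
```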
Here’s how many people would need to go vegan for a year to match Kate’s impact:
I worry that when people sneer at quantifying climate interventions, they don’t realize how gigantic the differences are in what we can do.
To use a very blunt analogy, if we’re in a war against climate change, individual contributions to changes to the energy grid are like nuclear bombs, and individual lifestyle changes are like sticks of dynamite. It often seems ridiculous to me to even mention them in the same breath.
But let’s say you want to get those most important lifestyle changes across to people anyway. What would be most important? I’d push for relatively simple changes that aren’t especially disruptive to people’s lives but still cut huge amounts of carbon; the things most people can stick to that will actually make some kind of noticeable dent.
Buying green energy from your grid or installing a private solar panel.
Buying an electric car.
Skipping a flight if there’s another way of traveling.
Try making your own list like this. Think about what 20 interventions for the climate you would most want to communicate to the average person. Try to think about how many the average person could actually be convinced to follow. Where would you get the most climate impact for your effort?
Where would AI fit into this?
These are things the average American does every day that emit more than prompting ChatGPT 100 times (28 g CO2, and yes, including training and the embodied emissions of hardware):
Making coffee
Running an AC for 10 minutes
Running a fridge normally
Taking a 1 minute hot shower
Using hot water for 1 minute to rinse dishes
Ironing clothes
Microwaving food
Cooking on an electric stovetop for 5 minutes
Leaving 8 LED bulbs on for an hour
Running a humidifier for a few hours
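If you want to check items on this list yourself, the conversion is simple. In the sketch below, the appliance wattages and the 0.4 kg CO2/kWh grid intensity are my own rough assumptions, not figures from the post:

```python
GRID_KG_PER_KWH = 0.4  # assumed average US grid carbon intensity

def grams_co2(watts, minutes):
    """Grams of CO2 from running a device, via the assumed grid intensity."""
    kwh = watts * minutes / 60 / 1000
    return kwh * GRID_KG_PER_KWH * 1000

# For comparison: 100 ChatGPT prompts, per the post, are ~28 g
# (including training and embodied emissions).
print(round(grams_co2(1100, 4)))    # microwaving food for ~4 min: ~29 g
print(round(grams_co2(1500, 10)))   # running a 1.5 kW AC for 10 min: ~100 g
print(round(grams_co2(8 * 9, 60)))  # eight 9 W LED bulbs for an hour: ~29 g
```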
All of these things would be way, way, way down my list of the most important things for Americans to cut for the climate. The average person will be willing to adjust their lives in only a limited number of ways, and we can get only a limited amount of messaging to them before they focus on other things. Only if I knew that I could literally get a majority of people to change their lives in hundreds of other ways for the climate would I then consider suggesting “being mindful of your AI prompts.” As of right now, the vast majority of Americans wouldn’t consider paying more than an additional $40/month in taxes to combat climate change. They are not ready to make hundreds of additional lifestyle changes, and if we get to a future of abundant green energy, we won’t need to ask for these changes anyway.
Not many people are actually thinking about climate change much
One of the reasons it’s so important to get climate messaging right is that for a majority of people who hear it, it will be one of the only times they think about climate change at all. Only 23% of Americans say they have thought about climate change a lot.
65% say they rarely or never discuss climate change with friends
A majority of Americans report that their friends and family don’t believe it’s especially important to put effort into stopping climate change.
And 44% don’t believe the actions of individuals will have any effect on the climate.
Unfortunately, this does make it seem like climate messaging is somewhat zero-sum: one message people hear may crowd out others. Each message will often be one of the only times the listener considers climate change. This means we need to communicate especially important information when we do recommend lifestyle changes. Sacrificing this rare important opportunity on a message that will cut literally millions of times fewer emissions than other possible lifestyle changes seems like a decision to crowd a very limited field with something that won’t help at all. Advertising extremely tiny cuts as significant is effectively a form of kneecapping real climate action. I see people going around warning about the impacts of personal AI use as irresponsible, in the same way I think it’s irresponsible to imply that you can have significant climate impact if you randomly unplug your fridge every now and then. Messages like this are crowding out serious climate conversations for the sake of something that will never be a significant part of our emissions. The competition to reach everyday people is already extremely crowded, and people waste valuable opportunities when they throw out innumerate recommendations that won’t save us from the worst effects of climate change even if all 8 billion people alive ruthlessly followed them.
Collective action
Climate is a collective action problem.
We all have strong incentives to use more fossil fuels. Fossil fuels are, unfortunately, miraculous. It’s a miracle there’s a liquid you can spend a few minutes filling your two-ton metal vehicle with that will singlehandedly blast it hundreds of miles. There’s no reason these had to exist, but they do, and because they’re so miraculous it’s very hard to make green replacements for them, so those replacements cost more money. Life’s easier if you personally don’t worry about climate change and just do what you want.
Your individual emissions will, on their own, literally never matter for the climate. You could fly in a private jet every day of your life, and your emissions wouldn’t make a dent in the amount that’s actually going to cause tipping points for the climate. No one in a hundred years will notice any difference in the weather based on your entire lifetime emissions.
However, if a lot of people don’t change their emissions, climate change will make life significantly worse. Larger and larger catastrophes will happen.
This is a classic collective action problem: Everyone would be much better off if we all took some action, but everyone individually has a strong incentive not to, and each person’s contribution won’t matter unless a lot of people do it. Thus, the only way we can really encourage people to cut their emissions is if we reliably show that many other people are doing the same, so there will in fact be collective payoff for individual sacrifices.
Because climate is a collective action problem, we should keep a few things in mind:
People have strong reasons to emit. Asking them to change their behavior for climate is already a lot. We can’t presume they’ll have infinite willingness to change. If they did, we would have solved climate change already.
We need to find simple rules that everyone can know everyone else is trying to follow. If they’re too complex, citizens will lose hope that others are doing their part of the collective action problem, and they themselves will be less likely to play along. These simple rules also need to be things that will actually reliably reduce emissions a lot. The rule “pepper your daily life with random impulses of guilt about some tiny emissions but not others. Cut based on ominous news stories rather than actual numbers” seems like a terrible simple rule to give people as a way of solving the collective action problem of climate, yet that seems to be a very popular message in a lot of spaces. We should discourage this.
For more on collective action problems I’d recommend reading about the prisoner’s dilemma. We need to remember in climate communication that we are not merely building big cool coalitions of the virtuous in-the-know people, we are asking everyone to take scary leaps into potentially more difficult lives to solve a problem that will only really be solved if many other people take the same leap. This is hard! It’s why climate change is so difficult to fix. We need to think about this as threading a needle. People who run around hyping people up about tiny emissions merely because they “add up” to still very tiny proportions of global emissions are not taking this problem seriously.
ChatGPT uses as much energy as 20,000 households
This seems big if you don’t consider how many people use ChatGPT. It’s the most downloaded app in the world. Every day 400 million people send 1 billion ChatGPT prompts. The most downloaded app using as much energy as Barnstable, Massachusetts isn’t surprising. Fortnite uses roughly 400,000 households’ worth of energy. YouTube uses roughly 2,000,000. ChatGPT’s total energy use is small compared to common internet staples.
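Working backwards from the 20,000-households figure gives a per-prompt number consistent with the rest of this post. The 30 kWh/day figure for a typical US household is my own assumption here:

```python
# Inverting the "20,000 households" figure into an implied per-prompt cost.
households = 20_000
home_kwh_per_day = 30          # assumed typical US household usage
daily_prompts = 1_000_000_000  # the post's prompts-per-day figure

wh_per_prompt = households * home_kwh_per_day * 1_000 / daily_prompts
print(wh_per_prompt)  # 0.6 Wh per prompt
```

That implied ~0.6 Wh is about double the ~0.3 Wh inference-only estimate used elsewhere in this post, which is plausible once training, idle overhead, and non-chat workloads are folded in.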
You should spend about as much time worrying about the global climate impact of ChatGPT as you do about the climate impact of Barnstable, Massachusetts.
What’s a reasonable amount of energy for an app with a billion daily uses to consume? Maybe you think all of ChatGPT should use no more energy than a single US home. A typical US home uses about 30 kWh of electricity per day. To get ChatGPT’s energy this low, each prompt would have to use 30 kWh / 1,000,000,000 prompts = 0.00003 Wh. Basically nothing else we do, and no other internet service we use, consumes so little energy. For context, that’s the energy a single LED bulb uses in 1/40th of a second. It makes sense that, because ChatGPT is getting a billion daily prompts, it’s using the energy of a small town.
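Here's that single-home-cap arithmetic as a quick script. The 4 W bulb wattage is my own assumption for the comparison:

```python
# What each prompt could use if all of ChatGPT were capped at one US home's
# energy. Prompt volume and home usage are the figures quoted above.
daily_prompts = 1_000_000_000
home_wh_per_day = 30 * 1_000  # 30 kWh

wh_per_prompt_cap = home_wh_per_day / daily_prompts
print(wh_per_prompt_cap)  # 3e-05 Wh per prompt

# How long an LED bulb takes to use that much energy (4 W bulb assumed):
led_watts = 4
seconds = wh_per_prompt_cap / led_watts * 3600
print(round(seconds, 3))  # ≈ 0.027 s, about 1/40th of a second
```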
Training an AI model uses too much energy
Training GPT-4 used 50 GWh of energy. Like the 20,000 households point, this number looks ridiculously large if you don’t consider how many people are using ChatGPT.
Training costs vary relative to inference costs, but training consumes roughly 40% of all the energy AI models in America use. Adding that ~40% on top of the per-prompt inference cost brings each prompt to ~0.4 Wh rather than ~0.3 Wh.
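As a sanity check on that arithmetic (treating the ~40% as a markup on top of inference energy, which is how the post's numbers work out):

```python
# Folding training into the per-prompt cost as a ~40% markup on inference.
inference_wh = 0.3       # the post's upper-bound inference estimate per prompt
training_markup = 0.40   # training energy relative to inference
total_wh = inference_wh * (1 + training_markup)
print(round(total_wh, 2))  # 0.42, i.e. the ~0.4 Wh figure
```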
Training an AI model is like any other large project companies do. It’s more like a lot of people working together to design a new product than it is like running a normal computer program. The correct comparison to training models is to other large projects we as a society do. When you compare training models to other regular things, training stands out less. Here’s data on the very largest AI model ever trained as of September 2025:
Making AI chips emits too much carbon
The most recent comprehensive study of the “embodied” or “scope 3” cost of AI (the physical manufacture of AI equipment itself) implies that on average, 96% of the carbon emissions and resource use of a chip that’s used for ChatGPT happen while the chip is being used during training and responding to prompts, not in the chip’s manufacturing or recycling. This implies that the carbon cost of producing and recycling the chips effectively rounds to zero if you’re looking at the cost-per-query.
This makes sense. AI chips are designed to have energy running through them 24/7. It’s kind of like if you made a wire, and measured all the carbon emitted when generating all the electricity that ran through the wire over its entire lifespan as part of the wire’s emissions. You’d expect most of the wire’s emissions to come from this electricity, not from producing the wire itself. In the same way, the actual use of the chip should be pretty energy intensive relative to building it.
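To see why the embodied cost rounds to zero per query, here's a rough sketch. The 0.28 g per-prompt figure is the post's ~28 g per 100 prompts, and the 96/4 split is the study's average; applying the chip-level split to the per-prompt figure is my own approximation:

```python
# Splitting a prompt's footprint into operational vs embodied slices.
per_prompt_g = 0.28       # g CO2 per prompt, incl. training and embodied
operational_share = 0.96  # study's average use-phase share of lifetime impact
embodied_g = per_prompt_g * (1 - operational_share)
print(round(embodied_g, 4))  # ≈ 0.0112 g per prompt from manufacturing/recycling
```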
The buildout of digital infrastructure that makes AI use so little energy in the first place is itself bad for the environment
Maybe we should include the full carbon costs of the digital infrastructure and supply chains required to support AI, not just the data centers themselves. By using AI, you’re complicit in the rapid buildout of data centers and global electronics supply chains that support it, not just the individual energy cost of your prompt.
Something strange about comparisons of the full infrastructure costs of AI to other things we do is that they often fail to account for the infrastructure costs of those other things.
A recent popular blog post was titled “Why Saying ‘AI Uses the Energy of Nine Seconds of Television’ is Like Spraying Dispersant Over an Oil Slick.” The author’s main point is that each individual AI prompt is able to use so little energy only because of this vast and expanding background buildout of AI infrastructure, so just reporting (as I do) that an AI prompt only uses as much energy as a few seconds of a microwave is hiding the more ominous reason why it’s able to be so cheap in the first place. By using AI, you’re complicit in some way in that infrastructure buildout.
This criticism would make more sense to me if everything else in society didn’t also have a vast sprawling physical infrastructure supporting it. “9 seconds of TV” has huge networks of electronics systems supporting it, as well as crazy amounts of money and people-hours going into making the most entertaining TV, and lavish (often wasteful) lifestyles enabled by the profits from TV. Obviously, TV advertising also encourages people to buy more stuff from other complex supply chains.
If you make a comparison like this:
One-off cost of an AI prompt + the full infrastructure supporting AI ←→ 9 seconds of TV
then it’s easy to make AI seem like the bigger problem, but if you make what I think is the correct comparison instead:
One-off cost of an AI prompt + the full infrastructure supporting AI ←→ 9 seconds of TV + the full infrastructure supporting TV
Then I suspect the infrastructure costs of AI and TV will roughly cancel each other out, and you might as well just make the original comparison:
One-off cost of an AI prompt ←→ 9 seconds of TV
This is why I think it’s reasonable to make this comparison.
We don’t really hold this standard for anything else we talk about. I can say “Your sedan emits about 320 g of CO2 for every mile you drive” and I don’t think that’s deceptive, even though the sedan is relying on a vast road infrastructure that costs 69 million tonnes of CO2 each year in America alone just to maintain, the sedan itself has “embodied carbon costs” from manufacturing it that I’m not including, and driving a car normalizes the behavior for other people. I think people understand that these additional costs exist when they talk about how much cars emit per mile, and I think they also understand these costs exist when we talk about how much AI emits per prompt.
AI infrastructure is being built out faster than most other infrastructure, but the growth rate of something’s emissions doesn’t on its own tell us much about how bad it is for the climate. The carbon emissions from the global supply chain of Labubus recently began to rapidly increase, but Labubus are going to remain such a tiny part of global emissions that this doesn’t matter. What matters is the total amount of emissions and how much value we’re getting from them.
AI and electronics will obviously emit way way more in total than Labubus. However, the IEA expects the data center buildout to, on net, significantly decrease emissions overall, because AI will be optimizing so many other processes in society and making green energy and smart grid tech more viable. They project that AI alone will prevent 4 g of carbon emissions for every 1 g all global data centers emit (for both the internet and AI). So I could end up saying something strange like “Every ChatGPT prompt encourages the data center buildout, which is good because forecasts imply that the buildout will on net lower global emissions, so every ChatGPT prompt you send decreases emissions by x amount.” We’ve ascended into a level of abstraction that I think is goofy. Things get goofy when you try to cram the responsibility for every possible outcome of AI into the individual cost of an AI prompt. Your predictions about this “total, hidden” cost will mostly depend on the decisions people make in the future about how and where to use AI, not on ChatGPT’s individual in-the-moment impact itself. Seems goofy!
This goofiness is why I think it makes more sense to just limit what we say about the climate impacts of individual prompts to those prompts themselves, and leave broader forecasts of AI’s total climate impact in the future for separate conversations.
Other objections
This is all a gimmick anyway. Why not just use Google? ChatGPT doesn’t give better information
A lot of conversations about the climate impact of ChatGPT quickly turn back to its value as a service. “ChatGPT is a plagiarism machine that just produces slop. It’s glorified autocomplete” etc.
I think it’s so important to snap everyday people out of being distracted by this that you shouldn’t even try to convince people that ChatGPT is useful. That’s too much of a context switch that can get bogged down in minutiae. I do think AI is useful (I explain why here) but that’s a separate point.
The better answer is that even if ChatGPT were completely useless, there are a lot of other useless things we do that use a lot more energy.
Every additional second you spend showering uses enough water for 40 ChatGPT prompts. It’s okay to accidentally go one second over the optimal shower time (or minutes, even!). That’s in some way “useless” because it’s not helping you meet your goal of getting clean, but it’s such a small amount of water that you shouldn’t stress over it.
I have about 40 tiny hanging LED lights in my room:
Each of these individually is “useless.” If I unscrewed just one of them, I don’t think anyone would notice. Each one of these uses a ChatGPT search’s worth of energy once every 20 minutes. Together they’re using as much energy as 130 ChatGPT searches every hour, and using a gallon of water every 12 hours.
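The lights arithmetic is easy to reproduce. The 0.3 Wh-per-search figure below is the estimate used throughout this post:

```python
# The hanging-lights arithmetic: one search's worth of energy (0.3 Wh)
# every 20 minutes implies roughly 0.9 W per light.
wh_per_search = 0.3
n_lights = 40
light_watts = wh_per_search / (20 / 60)  # ≈ 0.9 W per light
hourly_wh = n_lights * light_watts       # ≈ 36 Wh for all 40 lights
searches_per_hour = hourly_wh / wh_per_search
print(round(light_watts, 2))     # 0.9
print(round(searches_per_hour))  # 120, close to the ~130 quoted above
```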
My roommates haven’t knocked on my door and said “Sorry Andy, you can’t use those. They’re making the energy bill go up too much” because the lights add about $0.40 to the energy bill each month.
Should I use these even though I don’t need them? They’re just decorative and I could be fine with only a lamp. Even though they use hundreds of ChatGPT searches’ worth of energy each month, they don’t contribute to the climate crisis at all, and they make me happy. That should be the end of the argument. People seem to like using ChatGPT. Who cares whether it’s objectively valuable? People like playing Fortnite too. It’s no worse for the environment than my hanging lights. Let them use it if they like. Worry about the things that actually matter.
I’ve been vegan for 10 years, live in a big city with roommates, walk to work every day, and rarely fly. Even though I use ChatGPT daily, my environmental footprint is less than half the average American’s. After all that, it would seem silly for me to feel guilty about either my LED lights or ChatGPT.
You can find things in your life that use similar energy to ChatGPT and make similar comparisons. Sometimes we like to do silly meaningless things that use a tiny amount of energy. If the climate movement wants to make its members feel guilty about that, it will fail.
If you think there are reasons why ChatGPT is not just useless but actively harmful (copyright, hallucinations, job loss, risks from advanced AI, etc.) make the case directly without adding incorrect climate statistics. There are a lot of issues I care about that I want people to have more clarity on. I think the way we treat chickens in factory farms is a massive moral catastrophe. I could add “and each chicken has some environmental cost” every time I talk about it. That would be technically true, but chickens aren’t especially harmful to the environment, and it dilutes the message I’m actually trying to get across. If you try to smuggle in a lot of unconvincing additional reasons why something’s bad, it undermines your otherwise strong case. Environmental objections to ChatGPT often dilute other serious criticisms of the technology. Focus and clarity help AI critics’ case.
Don’t trust some random Substack post over scientific research
This one came up a lot in replies to my last post.
I’m not claiming to have discovered anything new in what I post here. I’m just gathering what seems like the consensus on how much environmental impact ChatGPT has, trying to get a bird’s eye view of how it compares to all the other things we do, and coming away with what I think is a pretty strong case that it’s not bad for the environment. Nowhere in my post do I go against the scientific consensus on climate or the environment or AI. You should see this post as basically a long written comment with a lot of links. I’m just some guy who’s noticed that these numbers don’t make sense to worry about if you just compare them to everything else we do.
I think these claims are all pretty straightforward and easy for anyone to check and verify. It seems kind of identical to saying something like “The amount Americans spend overall on Cheerios is $400 million, which seems high without context, but the amount that Cheerios contribute to individual food budgets is low, so they’re not a big problem.” These are numbers and ideas that anyone can double check. If you think I’m getting anything wrong, it should all be pretty easy to demonstrate. Saying “Don’t trust this Substack post on AI and emissions” feels kind of like saying “Don’t trust this Substack post on where Cheerios fit into your budget.” It overestimates how much trust is involved. Just check the numbers yourself!
These energy and water numbers are all based on guesswork
I go into a lot more detail here about why I’m using the specific numbers in this post. We have more and more data on chatbots coming in all the time. Here’s the best summary of what we know right now.
Trying to figure out how much energy the average ChatGPT search uses is extremely difficult, because we’re dividing one very large uncertain number (total energy used) by another (total prompts). How then should we think about ChatGPT’s energy demands when we know almost nothing certain about it?
The people who believe that ChatGPT is uniquely bad for the environment are also basing their numbers on guesswork. If we can’t know anything about ChatGPT because the numbers are too uncertain, it doesn’t make sense to single it out as being uniquely bad for the environment. We just don’t know! Whenever people try to guess at the general energy and water cost of using ChatGPT, the numbers consistently fall into a rough range with an upper bound for the average prompt’s energy at about 0.3 Wh, so that’s what I’m running with. Maybe all these guesses are wrong, but we have just as much reason to believe they’re higher than the true cost of ChatGPT as we do that they’re lower.
If we can say anything at all about ChatGPT’s energy use, everything in this post is in line with our best guesses.
If we can’t say anything at all about ChatGPT’s energy use, you should be just as skeptical of claims that it’s bad for the environment, because its critics are also basing their claims on nothing but guesswork.
Saying “ChatGPT is uniquely bad for the environment” and then also adding “And you can’t disagree with me because all numbers involved are based on guesswork. No one knows anything” is a pretty obvious double standard. If it’s all guesswork, no one can make any strong claims about ChatGPT and the environment. If we truly know nothing about it, it seems reasonable to assume it’s in line with every other normal thing we do online.
This post is “whataboutism.” Just because some things emit more doesn’t mean ChatGPT isn’t bad for the environment
This one also comes up a lot in conversations about AI and the environment.
“Whataboutism” is a bad rhetorical trick where instead of responding directly to an accusation or criticism, you just launch a different accusation or criticism at someone else to deflect. Kids do this a lot.
“Clean your room, it’s a mess!”
“My sister’s room is messier!”
Some people said my original post is whataboutism because they read me as saying “ChatGPT is bad for the environment? Well meat is bad for the environment too!”
That is not what I’m trying to say.
Literally everything we do uses energy. The way we generate energy often involves emitting CO2. It is not possible for everyone to stop emitting right this second, because if they did billions of people would die. We don’t have the green energy infrastructure to give 8 billion people the energy they need to survive, even in rich countries.
This means that by some definitions, literally everything we do is “bad for the environment” because it uses scarce energy and causes CO2 emissions.
This leaves us in a weird place. If emitting any CO2 means something is bad for the environment, literally everything we do is bad for the environment. The term stops being useful in telling us what we should do. If a definition takes away any method for the climate movement to decide what actions to take, that seems bad.
Suppose we want to figure out how to live given that everything is “bad for the environment.” An obvious move would be to figure out what’s better for the environment than other things.
“If you’re choosing between driving a Hummer and riding a bike, the Hummer is worse for the environment than the bike, so you should choose the bike” seems like a reasonable statement. Even if everything is bad for the environment, we can still say some things are better or worse.
The problem is that every time you make this comparison, you can be accused of whataboutism.
“Oh, you’re saying bikes are fine just because Hummers are worse? Biking emits a whole kilogram of CO2 for every 30 miles! It’s bad for the environment. Comparing bikes to Hummers is whataboutism.”
Maybe when people say “whataboutism” they don’t mean specific comparisons in the same general activity (modes of transportation). They mean comparisons of activities that aren’t substitutes for each other (eating meat vs. biking). Using whataboutism in this way still gives strange results.
If people were worried about biking because it emits too much, and I said “look, of all the different things in your lifestyle you could change to help the climate, you really shouldn’t worry about biking because its emissions are just so so so low. Eat less meat or fly less or use green heating or switch to renewable energy or advocate for systematic change” that could also be called whataboutism. “Oh, biking is fine just because it’s not as bad as eating meat? Biking in America emits 300,000,000 kg of CO2 each year. That’s as much as 30,000 households. It’s an environmental disaster, just like eating meat. This is classic whataboutism.”
It seems like some people are stuck in this mode where:
Everything is by definition bad for the environment.
Doing any general comparisons of different lifestyle interventions for the climate is whataboutism.
Therefore, there is no legitimate way to decide what we should and should not cut for the climate, other than reducing emissions of activities that are very directly comparable to each other, like biking and driving, or watches and clocks, or Google and ChatGPT.
My answer to them is that this way of thinking will not help people maximally reduce emissions. If people feel similarly bad about digital clocks as they do about intercontinental flights, they won’t make good decisions about the climate. Call that whataboutism if you want, but I think the climate crisis demands these kinds of comparisons.
To help people find how to emit less, I’d change the definition of what it means to be “bad for the environment.”
I think of something as being “bad for the environment” not when it emits CO2 at all, but when it emits above a threshold where, if everyone did it, it would be hard or impossible to avoid the worst impacts of climate change before we as a planet transition to 100% green energy and achieve a climate equilibrium where the temperature stops rising. People riding bikes emit CO2, but everyone riding bikes (even looking at global use) would be easily possible in a world where we transitioned to green energy before hitting dangerous climate tipping points. Everyone using Google and ChatGPT would also be extremely easy in a world where we avoid dangerous climate tipping points, because their emissions are so low. Everyone eating meat for every meal or using internal combustion engine cars would not allow us to avoid dangerous climate tipping points, so they’re bad for the environment.
Under this revised definition, it’s whataboutism to say “eating meat isn’t bad because people drive,” but it’s not whataboutism to say “Google isn’t bad because its emissions are so drastically low compared to everything else we do,” and it’s not whataboutism to say the same about ChatGPT.
Focusing on individual prompts is greenwashing, it’s designed to distract you from how much the models use in total
I wrote this post because a lot of people were worried about the individual per-prompt energy of AI. A lot of environmentalist critics of AI at the time were constantly saying things like “We don’t know the true cost per prompt” and “labs aren’t being transparent with us” and implying people shouldn’t use chatbots frivolously because the energy cost was so high. I wrote this post only to say “Well, if that’s your concern, this is so tiny it doesn’t matter.” At the end I say we should focus less on individual emissions and more on systematic change to our energy grids.
Since then, a lot of people have taken the opposite line: Focusing on individual prompts is a distraction. It might even be an intentional trick by AI labs, in the same way oil companies would rather have you focus on your individual emissions rather than systematic changes to the energy grid. Maybe posts like this contribute to that greenwashing.
This has been a sudden weird lurch in the conversation. I really wish people who are suddenly concerned about too much focus on individual prompt emissions would have been more vocal about this last year when everyone was freaking out about how much energy each prompt uses. I didn’t see these people swooping in to say “Focusing on individual prompts is a distraction” then.
I’m not trying to “greenwash” AI. The question of whether AI as a whole is environmentally wasteful is separate from the question of whether your individual prompts add much to your own carbon footprint. The purpose of this post is not to say “Don’t worry about how much energy AI is using in total” (I share more of my thoughts on that question here). It’s only to say “If the individual carbon emissions of your prompts are what you’re worried about, they’re ridiculously small.” I’m responding to a concern that existed a lot at the beginning of 2025. Now (editing this in August) concern has shifted more to total emissions, but a lot of people still think individual prompts use a lot! I always thought the main question was total energy use anyway, so it’s been nice to see this shift.
Every Watt-hour of energy matters
This sounds nice and noble, like a way of taking climate change maximally seriously, but it’s actually an extremely bad way to think. It leaves the climate movement open to becoming distracted when we have extremely limited time, and lots of ways we can completely fail to have a large impact.
The climate movement quantifying where it’s actually going to have the most impact is really really really really really really really really really really important.
Kate, Bob, and Freddy
Suppose there are 3 people who each want to have an impact on the climate: Kate, Bob, and Freddy. They each independently choose their own ways of impacting the climate. All seem really noble and self-sacrificial. Kate joins a committee of 500 people working for a year to keep a nuclear power plant open for another 10 years. Any impact they have will be divided by 500 people. Bob goes vegan for a year. Freddy has a debilitating ChatGPT addiction and prompts it 2,000 times per day, about one prompt every 30 seconds of every waking hour. He quits for a year.
Try to form an idea in your head of how their climate impact compares to each other. It’s not immediately obvious.
After 1 year, Kate has prevented 70,000 tonnes of CO2 from being emitted. Bob has prevented 0.4 tonnes. Freddy prevented 0.2.
This means that Kate had as much effect as 175,000 Bobs, or 350,000 Freddys.
Here’s how many people would need to go vegan for a year to match Kate’s impact:
I worry that when people sneer at quantifying climate interventions, they don’t realize how gigantic the differences are in what we can do.
Choices for where to work on climate to help the most don’t take too long to make. You can just do some research and see for yourself where the most promising places to work are. When you do that, you don’t double or triple your impact; you multiply it by the population of an entire city. It’s as if you create an entire new city of climate activists just by taking a little extra time to look around for the best places to help.
If we could convince 50,000 people to just stop and look around for the most impactful things they could do for climate in the way Kate did, it would have the same impact as convincing the entire world to go vegan.
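That claim checks out against the post's own per-person numbers. The world population of 8 billion is my own round figure:

```python
# 50,000 Kate-style efforts vs. the entire world going vegan for a year.
kates_t = 50_000 * 70_000            # 3.5 billion tonnes CO2
world_vegan_t = 8_000_000_000 * 0.4  # 3.2 billion tonnes CO2
print(kates_t, world_vegan_t)  # same ballpark: 3.5e9 vs 3.2e9 tonnes
```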
The slogan “Every Watt-hour matters” implies something really dangerous. It implies that because Bob and Freddy were making any adjustments in their lifestyles that reduced emissions at all, they were contributing to end the climate crisis. Compared to Kate, their efforts rounded to zero. They became distracted by super low impact methods for an entire year, when they could have had as much impact as entire cities of people. Going around and telling people who care about the climate that they should use their limited time and energy to worry about ChatGPT is a bad distraction from actions that can do literally millions of times as much good.
The climate movement desperately needs millions of Kates, now. We need to get the message across that what you do for the climate can have millions of times as much impact if you just stop for a moment to think about what's actually going to make a difference. What we absolutely do not need is messaging that says "No matter what you're worried about, if it uses energy, that's a legitimate thing to put time and thought into for the sake of the climate." We don't have time for that.
We’re running out of time
When I first became interested in climate in 2007, the dream was still alive that we could avoid 1.5 degrees of warming by 2100. That dream is effectively dead. We lost. We didn’t hit the deadline. We’re almost definitely going to hit 1.5 degrees of warming. This doesn’t mean we should give up. The future victims of climate change are owed our hard work. We can still avoid 2 degrees of warming by 2100 if we act aggressively. In my time following the climate movement I’ve seen the tragedy of our great collective failure to hit our first target.
This graph shows what global CO2 emissions need to do to avoid 2 degrees of warming:
The sooner we start making significant cuts to global emissions, the better. I don’t know how you can look at a graph like this and say “The climate movement should spread messages about focusing on cutting any activity, no matter how few Watt-hours it uses. Its members should spend their time and energy on anything that uses energy.” There are too many ways to have a ton of impact (and too many ways to miss them) to justify thinking this way.
It makes much more sense to say "We are basically in a very fast-moving battle against an incredibly complicated impersonal enemy who doesn't care at all about the personal virtue of each of our soldiers. We need to deploy each person we have on our side to where they can do the absolute most good for the climate. Just having them all go off and focus on their own vibey thing, as long as it cuts Watt-hours at all, isn't going to cut it. We've already had some massive losses and are on track for way more."
Toward David MacKay thought
The text that’s influenced how I think about climate communication more than any other is Sustainable Energy - Without the Hot Air by David MacKay. If you read it, you’ll recognize that I was trying my best to imitate MacKay’s style in my last two posts. This quote is long but worth reading in full. It was written 16 years ago but is just as applicable today:
This heated debate is fundamentally about numbers. How much energy could each source deliver, at what economic and social cost, and with what risks? But actual numbers are rarely mentioned. In public debates, people just say “Nuclear is a money pit” or “We have a huge amount of wave and wind.” The trouble with this sort of language is that it’s not sufficient to know that something is huge: we need to know how the one “huge” compares with another “huge,” namely our huge energy consumption. To make this comparison, we need numbers, not adjectives.
Where numbers are used, their meaning is often obfuscated by enormousness. Numbers are chosen to impress, to score points in arguments, rather than to inform. “Los Angeles residents drive 142 million miles – the distance from Earth to Mars – every single day.” “Each year, 27 million acres of tropical rainforest are destroyed.” “14 billion pounds of trash are dumped into the sea every year.” “British people throw away 2.6 billion slices of bread per year.” “The waste paper buried each year in the UK could fill 103448 double-decker buses.”
If all the ineffective ideas for solving the energy crisis were laid end to end, they would reach to the moon and back.... I digress.
The result of this lack of meaningful numbers and facts? We are inundated with a flood of crazy innumerate codswallop. The BBC doles out advice on how we can do our bit to save the planet – for example “switch off your mobile phone charger when it’s not in use;” if anyone objects that mobile phone chargers are not actually our number one form of energy consumption, the mantra “every little bit helps” is wheeled out. Every little bit helps? A more realistic mantra is:
if everyone does a little, we’ll achieve only a little.
“Everyone should do their part and avoid prompting ChatGPT” being argued 15 years after this was written makes me want to throw my hands up and say “Look at my climate movement dawg, I’m never getting below 2 degrees of warming by 2100.”
Comparing Watt-hours to money
The median American salary is $61,984 per year. The average American uses 10,700 kWh of energy per year. Dividing one by the other, spending 1 Watt-hour from your energy budget is like spending about $0.006 from your literal budget (it's the same ratio). So if you're budgeting your energy, and you're the average American, adding 1 Watt-hour to your energy budget is like adding about half a cent to your money budget. Hearing climate communicators say "every Watt-hour matters" sounds kind of like someone who is worried about the national debt saying "Americans should save more, every half a cent matters!" That's obviously hyperbole designed to make a point. It's not literally true that people worried about the national debt should be worried about every individual time they spend a single penny, and the same is true for the climate movement and Watt-hours. Watt-hours are the pennies of our energy budgets.
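The dollars-per-Watt-hour ratio above is just one division, so it's easy to verify:

```python
# Checking the "a Watt-hour is like half a cent" analogy.
median_salary_usd = 61_984          # per year
avg_energy_wh = 10_700 * 1_000      # 10,700 kWh per year, in Wh

usd_per_wh = median_salary_usd / avg_energy_wh
print(f"1 Wh of the average energy budget ~ ${usd_per_wh:.4f} "
      f"({usd_per_wh * 100:.2f} cents)")
```

One Watt-hour comes out to about 0.58 cents, so "half a cent" is, if anything, slightly generous to the Watt-hour.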
Some other useful intuitions in conversations
AI companies don’t want to give you free energy
Energy costs money. AI companies don’t want to give too much energy to you for free. That would be like giving you lots of free money.
Almost all ChatGPT users are free users. ChatGPT has 400 million weekly users, but only 11 million paid subscribers. This means that ~97% of ChatGPT's users are being given free energy by OpenAI for their prompts. Google search now has an AI chatbot built in and is completely free. If either of these were giving you control over a significant amount of energy, this would basically be a massive free giveaway from the AI companies to billions of people. OpenAI and other AI labs have extremely strong incentives to make their models as energy-efficient as possible.
It’s 0.3 Wh!!!!
In a lot of these conversations, I have a very strong urge to grab the other person’s shoulders and say “This is 0.3 Wh of energy we’re talking about!!!! We agree that’s the number! 0.3 Wh!!!!!! That’s so small!!!! Don’t you know this?!?!?! What happened to the climate movement????? All my climate friends used to know what 0.3 Wh meant!!! AAAAAAHHHHH!!!!!”
This would not be very mature, so instead I post 9,000-word blog posts to let off steam.
It’s hard to get across to people just how strange it is to be standing across from another adult and see them speak so apocalyptically about 0.3 Wh of energy. I made this list of other things 0.3 Wh can do.
I have a similar reaction to the 10x a Google search point. When someone says “ChatGPT uses 10x as much energy as a Google search” I’m sometimes tempted to just say “Yes… 10 Google searches.” and just let that hang. Imagine going back to 2020 and saying “Oh man, I thought my buddy cared about the climate, but I just found out he… oh man I can’t bring myself to say it… he searched Google TEN times today.”
When I first started reading about this, I assumed people worried about climate thought that AI was using way more energy than the companies were telling them, but it seems like everyone on both sides of the debate agrees 0.3 Wh is a reasonable upper bound guess. It’s very strange to have such a strong disagreement about such a clear amount of energy that the climate movement would have never worried about with any previous technology.
This is so so strange
Continuing from the last point, my motivation for posting about this hasn’t been that I think ChatGPT’s amazing and everyone needs to try it. It’s mostly that it feels so strange to look around and see such a strong, popular reaction to ChatGPT that’s so inconsistent with how people think about every other area of their lives.
The only thing I can think to compare it to is if a lot of new addictive phone games were coming out that everyone liked and played, but suddenly one came out (let's call it Wizard Clash 7) and everyone started talking about how we need to boycott it because it's bad for the environment. When you looked into it, you saw that there was no noticeable difference between Wizard Clash 7 and any previous phone games. It used 10x more power than some, but 100x less power than others. There was nothing that made it stand out, but suddenly everyone was talking in apocalyptic language about how much energy it was using. When you brought up "Well, this is just a phone game. It doesn't really seem different from anything else we do on our phones, and it's actually on the lower end of energy use for most phone activities, never mind everything else we do that's much worse for the climate. It doesn't stand out. Why freak out?" you wouldn't get a direct reply. Instead you'd be told things like "You're clearly a shill for Wizard Clash 7. Every additional bit of energy matters for the climate. If everyone plays Wizard Clash 7, that will use as much energy as thousands of households. We're already in a climate crisis. It's burning the planet. You're not a scientist, you can't just make these claims about how Wizard Clash 7 is fine." and the conversation would end there. Any additional point you made would receive a comment bringing up five unrelated points about other climate impacts of Wizard Clash 7 that, if you looked at them directly, also didn't make sense to worry about, but that the person would kind of bounce around like they were speaking in free jazz. Reality never gets acknowledged; instead everything just gets answered with new bad associations.
Being around a lot of adults freaking out over 0.3 Wh feels like I’m in a dream reality. It has the logic of a bad dream. Everyone is suddenly fixating on this absurd concept or rule that you can’t get a grasp of, and scolding you for not seeing the same thing. Posting long blog posts is my attempt to get out of the weird dream reality this discourse has created.
AI’s effect on climate will mostly depend on how it’s used, not on what happens in data centers
We should expect AI to be like the internet
Different sectors of the economy have different levels of potential to offset their own emissions. Global cement production does not seem likely to be helpful with reducing emissions, so the carbon it emits won’t be “cancelled out” by the way we use it. In comparison, both the internet and AI seem to have a lot of potential for optimizing the ways we use energy in other sectors, to the point that it seems likely that the vast majority of their effects on the climate will come from how they’re used, not from the energy they directly use in data centers.
Cement is inert. It can be easy to just look at the physical building of the data center and think it’s simple and inert too. But each data center is in some sense a large, general tool that organizations and individuals can use to optimize the physical world around them.
The climate effects of the internet are not primarily caused by data centers
Consider Amazon. There’s an ongoing debate about Amazon’s net effects on climate change. Amazon optimizes package deliveries, but maybe causes people to consume more in total, and there’s a debate about how the emissions of the supply chains Amazon uses compare to emissions of other things we buy.
But no one in the environmental debate about Amazon has said “The most important question is how much energy its website is using in data centers.”
That’s because the energy data centers use is tiny relative to their effects on economic and personal behavior. The data centers that host Amazon’s website are using tiny amounts of energy compared to the material changes the Amazon website causes in the real world. AI is obviously using more energy in data centers, but as we have seen, this is mainly a result of AI being used so much, not each individual prompt using a significant amount of energy. AI has the potential to have large effects on the physical world per unit energy it uses in data centers.
Another example is Google Maps. Maps uses hardly any energy in data centers and has optimized billions of car trips to be as physically short as possible. It seems certain that the emissions Maps has saved completely dwarf the emissions it's caused by using energy in data centers. Google reported that just five of its apps (including Maps) prevented ~26 million tonnes of CO2e in 2024. That alone is 14% of all current data center emissions.
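Those two numbers together imply a figure for total data center emissions, which is worth deriving as a sanity check:

```python
# Back-deriving the global data-center total implied by the text:
# 26 Mt CO2e prevented is stated to be 14% of current data center emissions.
saved_mt = 26           # Mt CO2e prevented by five Google apps (2024)
share_of_dc = 0.14      # stated share of current data center emissions

implied_total_mt = saved_mt / share_of_dc
print(f"Implied global data-center emissions: ~{implied_total_mt:.0f} Mt CO2e")
```

That implied total of roughly 186 Mt CO2e is in the same ballpark as published estimates of global data center emissions, so the two figures are at least internally consistent.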
This shouldn’t surprise us. Cars emit a lot and aren’t especially optimized. Finding ways to optimize driving (which both the internet and AI are useful for) should yield a lot of low-hanging fruit for reducing emissions.
AI is similar
Another example from Google’s list of apps helping with the environment is called Green Light. It uses a deep-learning model to pick up on traffic trends to optimize traffic light timing. Huge amounts of emissions come from cars idling in cities waiting for a light to turn green, and accelerating after having to stop. Think about all the times you’ve been stopped at a red light without any cars coming on the other road. If the traffic light had AI built into it, it could detect this and prevent your car from idling, or detect that your car was coming (with no cars coming in a perpendicular direction) and keep the light green until you passed, preventing you from having to spend emissions on idling and accelerating. The emissions from the electricity a traffic light AI used in a data center would be dwarfed by the emissions it would prevent in individual cars.
Deep learning is an incredibly general technology. The only relevant comparison to deep learning is all other computer programs. All computer programs before deep learning helped us in situations where we could write down clear, logical, step-by-step rules and algorithms. Deep learning can help us in situations where we don't know the exact rules and heuristics, but can identify the answers we want. This is incredibly useful for saving energy and preventing emissions. Because it's so general, it's likely that AI will also incentivize new emissions as new industries take off using it as a tool.
A lot of people are unfamiliar with deep-learning tools beyond chatbots, and might be unaware of just how much promise they have to both lower and raise emissions. Importantly, these emissions effects will not only come from one-off scientific experiments with individual deep learning models, they will often come from deep learning models being deployed across the economy in a way that would require lots of data centers to support. There are too many examples to exhaustively run through, so I’ll just recommend some resources for learning more about each topic.
Ways AI use could cut emissions
AI's potential to reduce emissions spans most sectors of the economy, from fundamental materials science to everyday transportation. The main benefit of AI is that it can optimize complex systems that humans struggle to manage effectively or identify clear algorithmic rules for dealing with, often finding efficiency gains that translate directly into emissions reductions.
These are some of the most promising areas for deep learning to positively impact the climate. Each has the potential to offset at minimum hundreds of megatons of CO2 annually:
Materials science: The discovery of better materials to use in renewable energy tech like solar panels and batteries. Even small efficiency gains here have huge climate payoffs.
Electricity grid optimization: Smart grid technology is going to be critical for building out green energy, since many renewable sources (especially solar and wind) are intermittent and benefit from a grid that can respond rapidly and intelligently to sudden changes in energy supply. Deep learning has a lot of potential to benefit smart grid technology, and AI can provide better weather forecasts for smart grids to respond to.
Building energy use optimization: Energy used in buildings is responsible for 26% of all greenhouse gas emissions. Even slightly more optimal heating and cooling systems would help a lot.
Transit: For flights, deep-learning-guided contrail avoidance can significantly reduce plane emissions. Urban traffic signal optimization can reduce local air pollution for pedestrians as well as emissions.
There are lots of deep dives on ways AI can potentially benefit the environment you can explore.
Ways AI use could raise emissions
A main danger for the climate in the way we use AI is Jevons’ Paradox: the pattern where making a process more resource-efficient lowers its cost for consumers, causing more total resources to be consumed as a result. You will notice that many of the ways AI’s use could raise emissions are instances of Jevons’ Paradox. It reduces the energy and resource cost of a service, people use it way more as a result, and the total emissions rise even as the process gets more optimized and efficient.
Jevons’ Paradox is a universal problem in thinking about the relationship between economics and climate change. It is not an iron law, and there are many examples of making processes more efficient that reduce total emissions, but we need to consider it as a big possibility. Just making any one process more efficient with AI is not guaranteed to help the climate, and in specific instances it can hurt it.
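A toy numeric example makes the mechanism concrete. The numbers here are made up purely for illustration:

```python
# Jevons' Paradox in miniature: a 10x efficiency gain cuts energy per
# query, but if usage grows 15x in response, total energy still rises.
energy_per_query_before, queries_before = 1.0, 100     # arbitrary units
energy_per_query_after,  queries_after  = 0.1, 1_500   # 10x efficiency, 15x usage

total_before = energy_per_query_before * queries_before
total_after = energy_per_query_after * queries_after
print(f"total before: {total_before:.0f}, after: {total_after:.0f} -> "
      f"{'rose' if total_after > total_before else 'fell'}")
```

The crossover point is simple: total consumption rises whenever demand grows by a larger factor than efficiency improves.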
These are some of the most likely ways AI usage could increase emissions:
Autonomous vehicles: Many models of autonomous vehicle adoption predict that they could on net significantly raise transit energy usage, because they will make driving significantly more accessible.
Deep-learning enabled advertising: Could significantly increase frivolous consumption that isn’t actually beneficial to people.
Rebound effects in energy systems: AI optimization that reduces costs or increases efficiency can lead to increased usage that offsets the gains. For example, AI-optimized HVAC systems might make climate control so cheap that people use it more liberally, or AI-enhanced manufacturing efficiency might lead to increased production volumes.
Financial trading and speculation: AI-driven high-frequency trading and cryptocurrency activities can increase market volatility and fuel speculative bubbles that drive resource-intensive economic activity and boom-bust cycles with associated carbon costs.
There are also lots of deep dives on ways AI could increase emissions you can explore.
We should be focused on systematic change over individual lifestyles
All the changes we can make in our personal consumption choices are nothing compared to what we can do if we contribute to making the energy grid green. The current AI debate feels like we’ve forgotten that lesson. After years of progress in addressing systemic issues over personal lifestyle changes, it’s as if everyone suddenly started obsessing over whether the digital clocks in our bedrooms use too much energy and began condemning them as a major problem. It’s sad to see the climate movement get distracted. We have gigantic problems and real enemies to deal with. ChatGPT isn’t one of them.
To be 100% clear, the broader climate, energy, and water impacts of AI are very real and worth worrying about. Some readers have jumped from my title to say “He thinks AI isn’t an environmental problem? This is propaganda. AI is a massive growing part of our energy grid.” This post is not meant to debunk climate concerns about AI. It’s only meant to debunk climate concerns about chatbot use (and, as I note in the intro, image generation).
This doesn’t include the water used in our food system. There are crazy numbers out there implying that American agriculture uses 80% of the country’s fresh water, so most of our water footprint might be in the food we eat, and it could be as high as 2,000 gallons per day (300,000 ChatGPT prompts per day). These numbers are so ambiguous I don’t feel comfortable citing them in the piece, but it seems likely that our diets add a huge amount to our water footprint that isn’t shown in my visuals.
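As a rough check of the equivalence in that parenthetical (assuming ~3,785 mL per US gallon, and taking the prompt count at face value):

```python
# Deriving the per-prompt water figure implied by
# "2,000 gallons per day (300,000 ChatGPT prompts per day)".
gallons_per_day = 2_000
prompts_per_day = 300_000
ml_per_gallon = 3_785.41            # US gallon in mL

ml_per_prompt = gallons_per_day * ml_per_gallon / prompts_per_day
print(f"Implied water per prompt: ~{ml_per_prompt:.0f} mL")
```

That works out to roughly 25 mL per prompt, so the two figures in the parenthetical are consistent with each other.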