Why using ChatGPT is not bad for the environment - a cheat sheet
The numbers clearly show this is a pointless distraction for the climate movement
My post on why ChatGPT is not bad for the environment got a lot of readers. It's 9,000 words and written to be read from beginning to end, which is a lot to drop on someone. This post will be a cheat sheet for that post, framed around conversations you might have. I've broken it up so you can skip around and read only the sections that are relevant or interesting to you. I also add some new arguments and intuitions. If you think I'm getting anything wrong I'd be excited to update this with the most accurate numbers. Please let me know in the comments or at AndyMasley@gmail.com.
Intro
The question this post is trying to answer is “Should I boycott ChatGPT or limit how much I use it for the sake of the climate?” and the answer is a resounding and conclusive “No.”
It’s not bad for the environment if you or any number of people use ChatGPT, Gemini, Claude, Grok, or other large language model (LLM) chatbots. You can use ChatGPT as much as you like without worrying that you’re doing any harm to the planet. Worrying about your personal use of ChatGPT is wasted time that you could spend on the serious problems of climate change instead.
This post is not about the broader climate impacts of AI beyond chatbots[1], or about whether AI is bad for other reasons (copyright, hallucinations, job loss, risks from advanced AI, etc.). I'm not especially "pro" or "anti" AI. The reason I'm writing this is mainly that the climate movement is getting distracted by a non-issue.
I’m not an authority on AI and energy use. I cite all my sources and claims and defer to what seems like the expert consensus where it exists. I have a physics degree and taught physics for seven years, so I do know a lot about where and how energy is used.
If I say anything here without a source or explanation, it’s cited or explained in the other post. I want this post to be easy to read without getting bogged down in the minutia of where I got the numbers. I’m a fan of linking citations with hypertext instead of footnotes, so my sources are all in the writing itself instead of at the bottom.
We can divide concerns about ChatGPT’s environmental impact into two categories:
Personal use: How much ChatGPT increases your personal environmental footprint.
Global use: How much ChatGPT is harming the planet as a whole.
I’ll write a bunch of responses to the most common objections in each category, this time much simpler and pared down so you can use them IRL.
Throughout this post I’ll assume the average ChatGPT query uses 3 Watt-hours (Wh) of energy, which is 10x as much as a Google search. This statistic is likely wrong. ChatGPT’s energy use is probably lower according to EpochAI. Google’s might be lower too, or maybe higher now that they’re incorporating AI into every search. We’re a little in the dark on this, but we can set a reasonable range. It’s hard for me to find a statistic that implies the average ChatGPT prompt uses more than 3 Wh, so I’ll stick with this as an upper bound to be charitable to ChatGPT’s critics. I explain my justification for making this assumption here.
It seems like image generators also use less than 3 Wh per prompt (with large error bars), so everything I say here also applies to AI images. AI video DOES seem bad for the environment, explained here. I’d skip generating AI videos for now.
I’m collecting all my responses to critiques of this post here, and corrections here.
Personal use
A ChatGPT prompt uses too much energy/water
Energy
ChatGPT uses 3 Wh. This is enough energy to:
Leave a single incandescent light bulb on for 3 minutes.
Leave a wireless router on for 30 minutes.
Play a gaming console for 1 minute.
Run a vacuum cleaner for 10 seconds.
Run a microwave for 10 seconds.
Run a toaster for 8 seconds.
Brew coffee for 10 seconds.
Use a laptop for 3 minutes. ChatGPT could write this post using less energy than your laptop uses over the time you read it.
You can look up how much 3 Wh costs in your area. In DC where I live it’s $0.00051. Think about how much your energy bill would have to increase before you noticed “Oh I’m using more energy. I should really try to reduce it for the sake of the climate.” What multiple of $0.00051 would that happen at? That can tell you roughly how many ChatGPT searches it’s okay for you to do.
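If you want to check these equivalences and the cost figure yourself, here's a minimal sketch in Python. The appliance wattages and the ~$0.17/kWh electricity rate are my own rough assumptions, not figures from the post, so treat the outputs as ballpark numbers:

```python
# Rough sanity check: how long can typical appliances run on 3 Wh?
# Wattages below are assumed "typical" values, not measured figures.
PROMPT_WH = 3.0

appliances_watts = {
    "incandescent bulb": 60,    # ~3 minutes
    "wireless router": 6,       # ~30 minutes
    "gaming console": 180,      # ~1 minute
    "vacuum cleaner": 1100,     # ~10 seconds
    "microwave": 1100,          # ~10 seconds
    "toaster": 1350,            # ~8 seconds
    "coffee maker": 1100,       # ~10 seconds
    "laptop": 60,               # ~3 minutes
}

for name, watts in appliances_watts.items():
    seconds = PROMPT_WH / watts * 3600
    print(f"{name:18s} {seconds:6.0f} s on one prompt's energy")

# Cost of 3 Wh at an assumed residential rate of ~$0.17/kWh
rate_per_kwh = 0.17
print(f"cost per prompt: ${PROMPT_WH / 1000 * rate_per_kwh:.5f}")  # ~$0.00051
```

Swap in your own appliance wattages or local electricity rate to get equivalences for your own situation.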
Because this is so low, encouraging people to stop using ChatGPT is basically never going to have any impact on their individual emissions. If individual emissions are what you’re worried about, ChatGPT is hopeless as a way of lowering them. It’s like seeing people who are spending too much money, and saying they should buy one fewer gum ball per month:
By being vegan, I have as much climate impact as not prompting ChatGPT 400,000 times each year (the water impact is even bigger). I don't think I'm going to come close to prompting ChatGPT 400,000 times in my life, so each year I effectively stop more than a person's entire lifetime of ChatGPT searches with a single lifestyle change. If I choose not to take a flight to Europe, I save 3,500,000 ChatGPT searches. This is like stopping more than 7 people from searching ChatGPT for their entire lives. Preventing ChatGPT searches is a hopelessly useless lever for the climate movement to try to pull. We have so many tools at our disposal to make the climate better. Why make everyone feel guilt over something that won't have any impact?
The average American uses about 10,000 times as much energy each day as a single ChatGPT prompt. If each of these dots is one ChatGPT prompt, all the dots together are how much energy you use in one day.
I still find, even after showing this, that some people think using literally any additional energy is bad, because “every bit matters.” The thing is that our energy use changes a lot day to day, just like the money we spend changes day to day. If I started spending an additional penny per month, I wouldn’t notice, because there would be other days where I’d randomly spend way more or fewer pennies on other things. If I looked at a graph of my spending, the penny would be drowned out in the random noise of my other decisions. ChatGPT searches are like this. They use so little energy that they get drowned out in the random ways we change our energy use day to day. If you looked at a graph of my energy footprint before and after using ChatGPT, you wouldn’t notice any change at all.
We have limited hours in the day, and different choices for how we spend our time. If prompting ChatGPT 100 times and reading its responses takes up hours of my time that I could have spent playing a video game or watching Netflix or driving my car, then using it actually prevents me from emitting way more, because those other things use way more energy per hour. Printing a physical book uses 5,000 Wh, so even just sitting down and reading a book you bought over 6 hours (amortizing to 833 Wh per hour) is going to use more energy per minute than ChatGPT, unless you prompt ChatGPT 278 times per hour, or about once every 13 seconds for a full hour. Switching to ChatGPT from another activity is almost always going to decrease the total energy I use every day. This isn't an argument that you should only use ChatGPT! It's often worth it to spend more energy. But people sitting and using ChatGPT are often using way less energy per minute than almost anyone else in the world.
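A quick check of that book arithmetic, using only the post's own numbers (5,000 Wh per printed book, a 6-hour read, 3 Wh per prompt):

```python
# Amortizing the embodied energy of a printed book over a 6-hour read.
BOOK_WH = 5000
READ_HOURS = 6
PROMPT_WH = 3

wh_per_hour_reading = BOOK_WH / READ_HOURS          # ~833 Wh per hour of reading
prompts_to_match = wh_per_hour_reading / PROMPT_WH  # ~278 prompts per hour
seconds_between = 3600 / prompts_to_match           # ~13 seconds per prompt

print(round(wh_per_hour_reading), round(prompts_to_match), round(seconds_between))
```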
Water
I think a lot of people don’t realize how much water we each use every day.
Almost all electricity generation involves heating water to create steam to spin a turbine. The American energy grid uses 58,000,000,000,000 gallons of water every year. That’s enough to cover the entire surface area of Pennsylvania in Olympic swimming pools.
In America it takes about 2 gallons of water to produce 1 kWh of electricity. The average American uses around 30 kWh per day, so they use 60 gallons of water per day just to generate their electricity. They also use around 100 gallons of water directly each day, so 160 gallons in total.[2]
When I hear people say “50 ChatGPT searches use a whole bottle of water!” I think they’re internally comparing this to the few times a year they buy a bottle of water. That makes ChatGPT’s water use seem like a lot. They’re not comparing it to the 1200 bottles of water they use every single day in their ordinary lives.
A ChatGPT prompt running on GPT-3 used between 10-25 mL of water if you include the water cost of training, the water cost of generating the electricity used, and the water used by the data center to cool the equipment. Water used in data centers is usually a direct function of how much energy a process uses. Because it seems like the energy per prompt decreased from GPT-3 to GPT-4 and 4o, I’ll use the 10-25 mL as a reasonable upper bound for how much water prompts are currently using.
This means that every single day, the average American uses enough water for 24,000-61,000 ChatGPT prompts.
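A minimal sketch of that water math, using the post's figures; the 500 mL bottle size is my assumption:

```python
# Daily water footprint of an average American vs. ChatGPT prompts,
# using the post's numbers: 2 gal/kWh, 30 kWh/day, 100 gal of direct use,
# and 10-25 mL of water per prompt.
GAL_TO_ML = 3785.4

electricity_water_gal = 2 * 30                 # 60 gallons for electricity
direct_water_gal = 100                         # direct household use
total_gal = electricity_water_gal + direct_water_gal   # 160 gallons/day

bottles = total_gal * GAL_TO_ML / 500          # ~1,200 half-liter bottles
prompts_low = total_gal * GAL_TO_ML / 25       # ~24,000 prompts at 25 mL each
prompts_high = total_gal * GAL_TO_ML / 10      # ~61,000 prompts at 10 mL each

print(total_gal, round(bottles), round(prompts_low), round(prompts_high))
```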
Each dot in this image represents one ChatGPT prompt’s worth of water. All the dots together represent how much water you use in one day in your everyday life:
If you want to prompt ChatGPT 40 times, you can just stop your shower 1 second early. If you normally take a 5 minute shower, set a timer for 299 seconds instead, and you’ll have saved enough water to justify 40 ChatGPT prompts.
If you want to reduce your water footprint, avoiding ChatGPT will never make a dent.
Everything else we do online uses water in data centers too. ChatGPT seems to have been singled out because it uses a little more water per use than Google, but it doesn’t look bad relative to other normal online things we do.

ChatGPT is bad relative to other things we do (it’s ten times as bad as a Google search)
If you multiply an extremely small value by 10, it can still be so small that it shouldn’t factor into your decisions.
If you were being billed $0.0005 per month for energy for an activity, and then suddenly it began to cost $0.005 per month, how much would that change your plans?
A digital clock uses one million times more power (1 W) than an analog watch (1 µW). "Using a digital clock instead of a watch is one million times as harmful to the climate" is correct, but misleading. The energy digital clocks use rounds to zero compared to travel, food, and heat and air conditioning. Climate guilt about digital clocks would be misplaced.
The relationship between Google and ChatGPT is similar to watches and clocks. One uses more energy than the other, but both round to zero.
When was the last time you heard a climate scientist say we should avoid using Google for the environment? This would sound strange. It would sound strange if I said “Ugh, my friend did over 100 Google searches today. She clearly doesn’t care about the climate.” Google doesn’t add to our energy budget at all. Assuming a Google search uses 0.3 Wh, it would take 30,000 Google searches to increase your monthly energy use by 1%. It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.
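Here's that arithmetic spelled out, assuming the post's ~30 kWh per day of personal energy use (so roughly 900 kWh per month):

```python
# How many 0.3 Wh Google searches does it take to raise a typical
# American's monthly energy use by 1%?
GOOGLE_WH = 0.3
monthly_kwh = 30 * 30                        # ~900 kWh per month
one_percent_wh = monthly_kwh * 1000 * 0.01   # 9,000 Wh
searches = one_percent_wh / GOOGLE_WH        # 30,000 searches
print(round(searches))
```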
What matters for your individual carbon budget is total emissions. Increasing the emissions of a specific activity by 10 times is only bad if that meaningfully contributes to your total emissions. If the original value is extremely small, this doesn’t matter.
It’s as if you were trying to save money and had a few options for where to cut:
You buy a gum ball once a month for $0.01. Suddenly their price jumps to $0.10 per gum ball.
You have a fancy meal out for $50 once a week to keep up with a friend. The restaurant host likes you because you come so often, so she lowers the price to $40.
It’s very unlikely that spending an additional $0.10 per month is ever going to matter for your budget. Spending any mental energy on the gum ball is going to be a waste of time for your budget, even though its cost was multiplied by 10. The meal out is making a sizable dent in your budget. Even though it decreased in cost, cutting that meal and finding something different to do with your friend is important if you’re trying to save money. What matters is the total money spent and the value you got for it, not how much individual activities increased or decreased relative to some other arbitrary point.
Google and ChatGPT are like the gum ball. If a friend were worried about their finances, but spent any time talking about forgoing a gum ball each month, you would correctly say they had been distracted by a cost that rounds to zero. They should be able to enjoy something that's very close to free. The same is true for our energy budget. What matters for the climate is the total energy we use, just like what matters for our budget is how much we spend in total. The climate doesn't react to hyper-specific categories of activities, like search or AI prompts. What matters is our total CO2 emissions. If an extremely small slice of your energy budget suddenly jumps up by a factor of 10, it is unlikely to meaningfully change your total.
If you’re an average American, each ChatGPT prompt increases your daily energy use (not including the energy you use in your car) by 0.01%. It takes about 100 ChatGPT prompts to increase your daily energy use by 1%. If you did 100 ChatGPT prompts in 1 day and feel bad about the increased energy, you could remove an equal amount of energy from your daily use by:
Running a clothes dryer for 6 fewer minutes.
Running an air conditioner for 18 fewer minutes.
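A minimal check of those two equivalences; the ~3,000 W dryer and ~1,000 W air conditioner wattages are my assumptions:

```python
# Energy of 100 prompts vs. a clothes dryer and an air conditioner.
PROMPT_WH = 3
hundred_prompts_wh = 100 * PROMPT_WH             # 300 Wh, ~1% of a 30 kWh day

dryer_minutes = hundred_prompts_wh / 3000 * 60   # ~6 minutes of a ~3,000 W dryer
ac_minutes = hundred_prompts_wh / 1000 * 60      # ~18 minutes of a ~1,000 W AC
print(dryer_minutes, ac_minutes)
```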
ChatGPT uses enough energy that you should be very careful with how you use it. Don’t use it as a search engine or a calculator or just to goof around
There are costs that are just so incredibly small that it does not matter if you take a few more. Imagine that someone told you that they had perfectly timed their microwave down to the second for each meal. They knew that their vegan chicken nuggies needed exactly 3 minutes and 42 seconds. Any longer and they would be wasting energy. That would be admirable, but a very tiny thing that would take a lot of extra effort relative to how much it helps the climate (basically not at all). If someone else just set the microwave to 4 minutes, this would use so little extra energy that it wouldn’t be bad for the climate at all.
This would also be two ChatGPT searches' worth of extra energy.
I sometimes hear people say that you should “think carefully about your ChatGPT use” and “be environmentally aware while using it” because of its environmental impact, and not just use it for simple things you could use other services for, like a simple search or a calculator or just making jokes. This sounds a lot like scolding the person setting their microwave to 4 minutes instead of 3:42. It’s misunderstanding just how little energy is involved.
I regularly use the Google search bar as a calculator. I’m too lazy to click on the calculator app on my computer. The search bar is right there. This adds a tiny tiny bit of energy cost, but it’s not enough that I should ever worry.
Suppose you gave yourself an energy budget for goofy ChatGPT prompts. Every year, you're allowed to use it for 1,000 goofy things (a calculator, making funny text, a simple search you could have used Google for). At the end, all those prompts together would have used the same amount of energy as running a single clothes dryer a single time for half an hour. This would increase your yearly energy use by about 0.03%. This is not enough to worry about. If you feel like it, please goof around on ChatGPT.
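A quick check of that 0.03% figure, assuming the post's ~30 kWh of personal energy use per day:

```python
# 1,000 "goofy" prompts per year vs. an average American's yearly energy use.
PROMPT_WH = 3
goofy_kwh = 1000 * PROMPT_WH / 1000      # 3 kWh per year of goofy prompts
yearly_kwh = 30 * 365                    # ~10,950 kWh per year at 30 kWh/day
print(f"{goofy_kwh / yearly_kwh:.4%}")   # ~0.03%
```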
Global use
Data centers are an environmental disaster. This shows that ChatGPT as a whole is using too much energy and we should boycott it
I think the main way people get tripped up about ChatGPT is that they hear ominous stories about the (very real!) environmental costs of AI, like AI raising energy demand so much that it's straining the energy grid or causing coal plants to reopen, and assume ChatGPT's to blame. They hear about AI data centers rapidly growing, look around, and see that everyone's using ChatGPT, and assume there must be some connection. They think that even if all those previously cited numbers about individual use being small are true, they must still add up to something really bad because of all these bad stories.
The mistake they're making is simple: ChatGPT and other AI chatbots are extremely, extremely small parts of AI's energy demand. Even if everyone stopped using all AI chatbots, AI's energy demand wouldn't change in a noticeable way at all. The data implies that all chatbots combined are using at most 1-3% of the energy used on AI. Meanwhile, 400,000,000 people are using ChatGPT every week and sending 1 billion prompts every day. If we convinced every one of those people to stop using ChatGPT forever (along with Gemini, Claude, Grok, and other chatbots), AI energy and water use would still be 97-99% of its current value.
One fun way to build your intuition on this is to compare how many homes a coal plant can power vs. how many homes’ worth of energy ChatGPT is using. Reported energy use implies that ChatGPT consumes about as much energy as 20,000 American homes. An average US coal plant generates enough energy for 80,000 American homes every day. This means that even if OpenAI decided to power every one of its billion ChatGPT queries per day entirely on coal, all those queries together would only need one quarter of a single coal plant. ChatGPT is not the reason coal plants are reopening to power AI data centers.
The services using 97-99% of AI's energy budget are (roughly in order):
Recommender Systems - Content recommendation engines and personalization models used by streaming platforms, e-commerce sites, social media feeds, and online advertising networks.
Enterprise Analytics & Predictive AI - AI used in business and enterprise settings for data analytics, forecasting, and decision support.
Search & Ad Targeting - The machine learning algorithms behind web search engines and online advertising networks.
Computer Vision - AI tasks involving image and video analysis.
Voice and Audio AI - AI systems that process spoken language or audio signals.
There are a lot of valid concerns about data center environmental impact. This recent Bloomberg piece is a great example. But when you read these pieces, don’t jump to the conclusion that individuals using AI chatbots are causing this growth. The numbers just don’t imply that.
ChatGPT may not raise your own carbon footprint much, but it will be very bad for the environment if everyone starts using it
It's true that if everyone used ChatGPT more, total emissions would rise by an amount that sounds large compared to any one person's footprint.
This is true for everything we do.
Suppose I used this same logic when buying a digital clock.
If everyone in America bought one additional digital clock, we’d need 8,000 Olympic swimming pools of water each year to power them. I could say “You individually buying that clock may not matter much, but if you add up what everyone’s doing, it’s a huge impact. We can’t afford an additional 8,000 Olympic swimming pools of water!”
There's a subtle mistake in the thinking people do when they talk this way, and I need some pictures to illustrate it. People look at a small-emission activity and scale it up: "If every person on Earth did this, global emissions would jump a lot."
They then compare that “if everyone did it” number to the individual footprint of another habit, such as driving a few miles. In the picture below, they’d compare the large red square on the right to the small purple square on the left.
The big mistake is comparing the global impact of ChatGPT (the red square on the right) to the individual impact of their other activities (the purple square on the left). Comparing "ChatGPT is using as much energy as 20,000 households per day" to "I drive a few miles a day" makes ChatGPT seem like the bigger problem. But everything else we do also creates far more emissions when everyone does it, in rough proportion to how much it emits in our individual lives (driving in America uses more energy than all American homes combined, the equivalent of roughly 400,000,000 households' electricity).
In the above image, it's clear that the red square on the right isn't as big of a problem as the purple square on the right. It's a mistake to compare the red square on the right to the small purple square on the left. That's not the correct comparison. That bad comparison happens all the time in ChatGPT energy discourse.
If we’re looking for individual emissions to cut, we need to compare how they each contribute to our total individual emissions. If we’re looking for global emissions to cut, we need to compare them to other global trends. Crossing these categories creates a lot of confusion. Articles that say “ChatGPT is using 20,000 US households’ worth of energy” sound concerning because they’re comparing global ChatGPT use to your personal life. They’re comparing the large red square on the right to the tiny purple square on the left. If instead they compared ChatGPT to other global services, and said “ChatGPT is now using 1% as much energy as YouTube” or better, “ChatGPT is using 0.005% as much energy as American cars” it would be clear which parts of our lives are actually contributing to climate change.
Obviously there’s a connection between our individual and global emissions. Global emissions are the sum of everything we as individuals do in aggregate. This is convenient, because it means the places we emit the most in our individual lives usually mirror the places where the world is emitting the most as a whole. Because ChatGPT is such an insignificant part of our own energy budget, it’s an insignificant part of the world’s energy budget as well.
If everyone in the world stopped using ChatGPT, this would save around 3 GWh per day. If everyone in the world who owns a microwave committed to using their microwaves for 10 fewer seconds every day, this would also save around 3 GWh per day.
Imagine you met someone campaigning to convince everyone to decrease their microwave usage by 10 seconds per day. Your first reaction would be “There must be much more effective things to campaign for that would have a lot more climate impact. This must be really small compared to global energy demand.” You’d be correct. Campaigning to stop ChatGPT is exactly like campaigning to use microwaves for 10 fewer seconds.
Global energy consumption in 2023 was around 500,000 GWh per day. That means that ChatGPT's global 3 GWh per day is using 0.0006% of Earth's energy demand. This is an even smaller proportion than ChatGPT in our personal energy budget.
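The arithmetic, using the post's 1 billion prompts per day at 3 Wh each and ~500,000 GWh of global daily energy consumption:

```python
# ChatGPT's share of global daily energy use, using the post's figures.
prompts_per_day = 1e9
chatgpt_gwh_per_day = prompts_per_day * 3 / 1e9           # 3 GWh per day
world_gwh_per_day = 500_000
print(f"{chatgpt_gwh_per_day / world_gwh_per_day:.4%}")   # 0.0006%
```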
If one dot represents ChatGPT’s total energy use every day, all the dots together represent the world’s total daily energy use. Imagine that someone came to you and said it was really urgent that a billion people change their behavior and stop using ChatGPT so that we could eliminate exactly one of these dots. How seriously would you take them?
Climate change is a collective action problem. What we do as individuals does affect the climate in aggregate, so we need a way of deciding what in our lives to focus on changing. We should focus on changing things that will actually matter, like flying less, buying green electricity, changing our diets, and (much more important than the last three combined) working for systematic change in our energy systems to transition to green energy.
ChatGPT uses as much energy as 20,000 households
This seems big if you don't consider how many people use ChatGPT. It's the most downloaded app in the world. Every day 400 million people send 1 billion ChatGPT prompts. The most downloaded app using as much energy as Barnstable, Massachusetts isn't surprising. Fortnite uses roughly 400,000 households' worth of energy. YouTube uses roughly 2,000,000. ChatGPT's total energy use is small compared to common internet staples.
You should spend about as much time worrying about the global climate impact of ChatGPT as you do about the climate impact of Barnstable, Massachusetts.
Training an AI model uses too much energy
Training GPT-4 used 50 GWh of energy. Like the 20,000 households point, this number looks ridiculously large if you don’t consider how many people are using ChatGPT.
The numbers here are very uncertain, but my best guess based on available data is that since GPT-4 was trained, it answered around 50 billion prompts before it was mostly replaced with GPT-4o. GPT-4 and other models were used for a lot more than ChatGPT (Notion, Grammarly, Jasper, AirTable, Khan Academy, Duolingo, GitHub Copilot), but to be charitable let's assume it was only used for chatbots. Dividing 50 GWh by 50 billion prompts gives us 1 Wh per prompt. This means that including the cost of training the model (and assuming each prompt is using 3 Wh) raises the energy cost per prompt by 33 percent, from the equivalent of 10 Google searches to 13. That's not nothing, but it's not a huge increase per prompt.
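Here's that amortization spelled out, using the post's 50 GWh training figure, 50 billion prompts, 3 Wh per prompt, and 0.3 Wh per Google search:

```python
# Amortizing GPT-4's reported ~50 GWh training run over ~50 billion prompts.
training_wh = 50e9        # 50 GWh expressed in Wh
prompts = 50e9
training_wh_per_prompt = training_wh / prompts      # 1 Wh per prompt

per_prompt_wh = 3
total_wh = per_prompt_wh + training_wh_per_prompt   # 4 Wh per prompt
increase = training_wh_per_prompt / per_prompt_wh   # ~33%
google_equiv = total_wh / 0.3                       # ~13 Google searches
print(total_wh, f"{increase:.0%}", round(google_equiv))
```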
There are a lot more AI models being trained, collectively using a lot of energy. The only reasonable way to judge how bad this is seems to be to amortize the cost of training over how many prompts we can expect the specific model to handle. For evaluating ChatGPT, we'd look at models used for ChatGPT (GPT-4 etc.) and amortize their cost across expected uses. The same goes for Gemini, Grok, etc. This is somewhat similar to comparing the costs of different shirts. If one shirt cost $40, but would last for 80 wears, it effectively costs $0.50 per wear. If another cost $20, but only lasted for 10 wears, it effectively costs $2. Even though the up-front cost of the second shirt is lower, it ends up being more expensive once you amortize the cost across each wear. Amortizing the cost of training and including it in the total energy and water cost of the model seems like the best way to think about whether that up-front cost is worth it. For any popular model I've looked at, once the training cost was amortized it shrank to a small portion of the overall energy cost of a prompt.
Making AI chips emits too much carbon
This is the best study I can find on the embodied carbon in the physical materials that make up an LLM. It estimates the embodied carbon of that hardware over the course of the model's life and finds that it's about 22% of the model's total emissions. Mapping that onto a ChatGPT search, we can set an upper bound for the cost per query: 3 Wh per query, plus 0.3 Wh for the amortized cost of training, divided by 0.78 so that the embodied emissions of the AI chips make up 22% of the total (equivalent to multiplying by about 1.28), which leaves us with roughly 4 Wh as an upper bound for the energy-equivalent cost per ChatGPT search. Including embodied emissions of the AI data centers doesn't significantly change the proportions of any of the numbers I've shared.
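The arithmetic, using the post's 3 Wh per prompt, 0.3 Wh of amortized training, and the study's ~22% embodied-hardware share:

```python
# Upper-bound per-prompt energy once training and embodied hardware are included.
operational_wh = 3 + 0.3                 # prompt energy + amortized training
total_wh = operational_wh / (1 - 0.22)   # embodied hardware = ~22% of the total
print(round(total_wh, 1))                # ~4.2 Wh, i.e. roughly 4 Wh
```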
An important note is that many comparisons between ChatGPT and other activities don't take those things' embodied emissions into account. A back-of-the-envelope calculation implies the average LED light bulb takes about 40 kWh to manufacture. This means you'd have to have the LED bulb on constantly for half a year before the embodied emissions of the bulb dropped below 50% of its total emissions. If we're going to include the carbon costs of physically making chips in our comparisons, we need to also include the costs of making everything we're comparing ChatGPT to.
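A quick check of the bulb claim; the ~10 W bulb wattage is my assumption, while the 40 kWh manufacturing estimate is the post's:

```python
# How long does an LED bulb have to run before its use-phase energy
# matches the ~40 kWh manufacturing estimate?
embodied_kwh = 40
bulb_watts = 10                            # assumed typical LED bulb
hours = embodied_kwh * 1000 / bulb_watts   # 4,000 hours
print(hours, round(hours / 24))            # ~167 days, about half a year
```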
Other objections
This is all a gimmick anyway. Why not just use Google? ChatGPT doesn’t give better information
A lot of conversations about the climate impact of ChatGPT quickly turn back to its value as a service. “ChatGPT is a plagiarism machine that just produces slop. It’s glorified autocomplete” etc.
I think it’s so important to snap the climate movement out of being distracted by this that you shouldn’t even try to convince people that ChatGPT is useful. That’s too much of a context switch that can get bogged down in minutia. I do think AI is useful (I explain how I use it here) but that’s a separate point.
The better answer is that even if ChatGPT were completely useless, there are a lot of other useless things we do that use a lot more energy.
Every additional second you spend showering uses enough water for 40 ChatGPT prompts. It’s okay to accidentally go one second over the optimal shower time (or minutes, even!). That’s in some way “useless” because it’s not helping you meet your goal of getting clean, but it’s such a small amount of water that you shouldn’t stress over it.
I have about 40 tiny hanging LED lights in my room:
Each of these individually is “useless.” If I unscrewed just one of them, I don’t think anyone would notice. Each one of these uses a ChatGPT search’s worth of energy once every 3 hours. Together they’re using as much energy as 13 ChatGPT searches every hour, and using a gallon of water every 12 hours.
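A minimal check of those string-light numbers. The ~1 W per light follows from the post's "3 Wh every 3 hours"; the 2 gallons of water per kWh is the figure used earlier in the post:

```python
# 40 small hanging lights, each drawing ~1 W (3 Wh every 3 hours).
lights = 40
watts_per_light = 3 / 3                          # 1 W per light
total_watts = lights * watts_per_light           # 40 W in total

prompts_per_hour = total_watts / 3               # ~13 prompts' worth per hour
water_gal_per_12h = total_watts * 12 / 1000 * 2  # ~1 gallon at 2 gal/kWh
print(round(prompts_per_hour), round(water_gal_per_12h, 2))
```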
My roommates haven’t knocked on my door and said “Sorry Andy, you can’t use those. They’re making the energy bill go up too much” because the lights add about $0.40 to the energy bill each month.
Should I use these even though I don’t need them? They’re just decorative and I could be fine with only a lamp. Even though they use hundreds of ChatGPT searches’ worth of energy each month, they don’t contribute to the climate crisis at all, and they make me happy. That should be the end of the argument. People seem to like using ChatGPT. Who cares whether it’s objectively valuable? People like playing Fortnite too. It’s no worse for the environment than my hanging lights. Let them use it if they like. Worry about the things that actually matter.
I’ve been vegan for 10 years, live in a big city with roommates, walk to work every day, and rarely fly. Even though I use ChatGPT daily, my environmental footprint is less than half the average American’s. After all that, it would seem silly for me to feel guilty about either my LED lights or ChatGPT.
You can find things in your life that use similar energy to ChatGPT and make similar comparisons. Sometimes we like to do silly meaningless things that use a tiny amount of energy. If the climate movement wants to make its members feel guilty about that, it will fail.
If you think there are reasons why ChatGPT is not just useless but actively harmful (copyright, hallucinations, job loss, risks from advanced AI, etc.) make the case directly without adding incorrect climate statistics. There are a lot of issues I care about that I want people to have more clarity on. I think the way we treat chickens in factory farms is a massive moral catastrophe. I could add “and each chicken has some environmental cost” every time I talk about it. That would be technically true, but chickens aren’t especially harmful to the environment, and it dilutes the message I’m actually trying to get across. If you try to smuggle in a lot of unconvincing additional reasons why something’s bad, it undermines your otherwise strong case. Environmental objections to ChatGPT often dilute other serious criticisms of the technology. Focus and clarity help AI critics’ case.
Don’t trust some random Substack post over scientific research
This one came up a lot in replies to my last post.
I’m not claiming to have discovered anything new in what I post here. I’m just gathering what seems like the consensus on how much environmental impact ChatGPT has, trying to get a bird’s eye view of how it compares to all the other things we do, and coming away with what I think is a pretty strong case that it’s not bad for the environment. Nowhere in my post do I go against the scientific consensus on climate or the environment or AI. You should see this post as basically a long written comment with a lot of links. I’m just some guy who’s noticed that these numbers don’t make sense to worry about if you just compare them to everything else we do.
These energy and water numbers are all based on guesswork
I go into a lot more detail here about why I’m using the energy numbers I chose.
Trying to figure out how much energy the average ChatGPT search uses is extremely difficult, because we're dividing one very large, uncertain number (total energy used) by another (total prompts). How then should we think about ChatGPT's energy demands when we know almost nothing certain about it?
The people who believe that ChatGPT is uniquely bad for the environment are also basing their numbers on guesswork. If we can’t know anything about ChatGPT because the numbers are too uncertain, it doesn’t make sense to single it out as being uniquely bad for the environment. We just don’t know! Whenever people try to guess at the general energy and water cost of using ChatGPT, the numbers consistently fall into a rough range with an upper bound for the average prompt’s energy at about 3 Wh, so that’s what I’m running with. Maybe all these guesses are wrong, but we have just as much reason to believe they’re higher than the true cost of ChatGPT as we do that they’re lower.
If we can say anything at all about ChatGPT’s energy use, everything in this post is in line with our best guesses.
If we can’t say anything at all about ChatGPT’s energy use, you should be just as skeptical of claims that it’s bad for the environment, because its critics are also basing their claims on nothing but guesswork.
Saying “ChatGPT is uniquely bad for the environment” and then also adding “And you can’t disagree with me because all numbers involved are based on guesswork. No one knows anything” is a pretty obvious double standard. If it’s all guesswork, no one can make any strong claims about ChatGPT and the environment. If we truly know nothing about it, it seems reasonable to assume it’s in line with every other normal thing we do online.
This post is “whataboutism.” ChatGPT is still bad for the environment because it emits at all
This one also comes up a lot in conversations about AI and the environment.
“Whataboutism” is a bad rhetorical trick where instead of responding directly to an accusation or criticism, you just launch a different accusation or criticism at someone else to deflect. Kids do this a lot.
“Clean your room, it’s a mess!”
“My sister’s room is messier!”
Some people said my original post is whataboutism because they read me as saying “ChatGPT is bad for the environment? Well meat is bad for the environment too!”
That is not what I’m trying to say.
Literally everything we do uses energy. The way we generate energy often involves emitting CO2. It is not possible for everyone to stop emitting right this second, because if they did billions of people would die. We don’t have the green energy infrastructure to give 8 billion people the energy they need to survive, even in rich countries.
This means that by some definitions, literally everything we do is “bad for the environment” because it uses scarce energy and causes CO2 emissions.
This leaves us in a weird place. If emitting any CO2 means something is bad for the environment, literally everything we do is bad for the environment. The term stops being useful in telling us what we should do. If a definition takes away any method for the climate movement to decide what actions to take, that seems bad.
Suppose we want to figure out how to live given that everything is “bad for the environment.” An obvious move would be to figure out what’s better for the environment than other things.
“If you’re choosing between driving a Hummer and riding a bike, the Hummer is worse for the environment than the bike, so you should choose the bike” seems like a reasonable statement. Even if everything is bad for the environment, we can still say some things are better or worse.
The problem is that every time you make this comparison, you can be accused of whataboutism.
“Oh, you’re saying bikes are fine just because Hummers are worse? Biking emits a whole kilogram of CO2 for every 30 miles! It’s bad for the environment. Comparing bikes to Hummers is whataboutism.”
Maybe when people say “whataboutism” they don’t mean specific comparisons in the same general activity (modes of transportation). They mean comparisons of activities that aren’t substitutes for each other (eating meat vs. biking). Using whataboutism in this way still gives strange results.
If people were worried about biking because it emits too much, and I said “look, of all the different things in your lifestyle you could change to help the climate, you really shouldn’t worry about biking because its emissions are just so so so low. Eat less meat or fly less or use green heating or switch to renewable energy or advocate for systematic change” that could also be called whataboutism. “Oh, biking is fine just because it’s not as bad as eating meat? Biking in America emits 300,000,000 kg of CO2 each year. That’s as much as 30,000 households. It’s an environmental disaster, just like eating meat. This is classic whataboutism.”
It seems like some people are stuck in this mode where:
Everything is by definition bad for the environment.
Doing any general comparisons of different lifestyle interventions for the climate is whataboutism.
Therefore, there is no legitimate way to decide what we should and should not cut for the climate, other than reducing emissions of activities that are very directly comparable to each other, like biking and driving, or watches and clocks, or Google and ChatGPT.
My answer to them is that this way of thinking will not help people maximally reduce emissions. If people feel similarly bad about digital clocks as they do about intercontinental flights, they won’t make good decisions about the climate. Call that whataboutism if you want, but I think the climate crisis demands these kinds of comparisons.
To help people find how to emit less, I’d change the definition of what it means to be “bad for the environment.”
I think of something as being "bad for the environment" not when it emits CO2 at all, but when it emits above a threshold where, if everyone did it, it would be hard or impossible to avoid the worst impacts of climate change before we as a planet transition to 100% green energy and achieve a climate equilibrium where the temperature stops rising. People riding bikes emit CO2, but everyone riding bikes (even looking at global use) would be easily possible in a world where we transitioned to green energy before hitting dangerous climate tipping points. Everyone using Google and ChatGPT would also be extremely easy in a world where we avoid dangerous climate tipping points, because their emissions are so low. Everyone eating meat for every meal or using internal combustion engine cars would not allow us to avoid dangerous climate tipping points, so they're bad for the environment.
Under this revised definition, it’s whataboutism to say “eating meat isn’t bad because people drive,” but it’s not whataboutism to say “Google isn’t bad because its emissions are so drastically low compared to everything else we do,” and it’s not whataboutism to say the same about ChatGPT.
Some other useful intuitions in conversations
AI companies don’t want to give you free energy
Energy costs money. AI companies don’t want to give too much energy to you for free. That would be like giving you lots of free money.
Almost all ChatGPT users are free users. ChatGPT has 400 million weekly users, but only 11 million paid users. This means that roughly 97% of ChatGPT's users are being given free energy by OpenAI for their prompts. Google search now has an AI chatbot built in and is completely free. If either of these were giving you control over a significant amount of energy, this would basically be a massive free giveaway from the AI companies to billions of people. OpenAI and other AI labs have extremely strong incentives to make their models as energy efficient as possible.
It’s 3 Wh!!!!
In a lot of these conversations, I have a very strong urge to grab the other person’s shoulders and say “This is 3 Wh of energy we’re talking about!!!! We agree that’s the number! 3 Wh!!!!!! That’s so small!!!! Don’t you know this?!?!?! What happened to the climate movement????? All my climate friends used to know what 3 Wh meant!!! AAAAAAHHHHH!!!!!”
This would not be very mature, so instead I post 9,000 word blog posts to let off the steam.
It’s hard to get across to people just how strange it is to be standing across from another adult and see them speak so apocalyptically about 3 Wh of energy. I made this list of other things 3 Wh can do.
I have a similar reaction to the 10x a Google search point. When someone says “ChatGPT uses 10x as much energy as a Google search” I’m sometimes tempted to just say “Yes… 10 Google searches.” and just let that hang. Imagine going back to 2020 and saying “Oh man, I thought my buddy cared about the climate, but I just found out he… oh man I can’t bring myself to say it… he searched Google TEN times today.”
When I first started reading about this, I assumed people worried about climate thought that AI was using way more energy than the companies were telling them, but it seems like everyone on both sides of the debate agrees 3 Wh is a reasonable upper bound guess. It’s very strange to have such a strong disagreement about such a clear amount of energy that the climate movement would have never worried about with any previous technology.
This is so so strange
Continuing from the last point, my motivation for posting about this hasn’t been that I think ChatGPT’s amazing and everyone needs to try it. It’s mostly that it feels so strange to look around and see such a strong, popular reaction to ChatGPT that’s so inconsistent with how people think about every other area of their lives.
The only thing I can think to compare it to is if a lot of new addictive phone games were coming out that everyone liked and played, but suddenly one came out (let's call it Wizard Clash 7) and everyone started talking about how we need to boycott it because it's bad for the environment. When you looked into it, you saw that there was no noticeable difference between Wizard Clash 7 and any previous phone games. It used 10x more power than some, but 100x less power than others. There was nothing that made it stand out, but suddenly everyone was talking in apocalyptic language about how much energy it was using. When you brought up "Well, this is just a phone game. It doesn't really seem different from anything else we do on our phones, and it's actually on the lower end of energy use for most phone activities, never mind everything else we do that's much worse for the climate. It doesn't stand out. Why freak out?", instead of getting a direct reply you'd be told things like "You're clearly a shill for Wizard Clash 7. Every additional bit of energy matters for the climate. If everyone plays Wizard Clash 7, that will use as much energy as thousands of households. We're already in a climate crisis. It's burning the planet. You're not a scientist, you can't just make these claims about how Wizard Clash 7 is fine." And the conversation would end there. Any additional point you made would receive a comment bringing up five unrelated points about other climate impacts of Wizard Clash 7 that, if you looked at them directly, also didn't make sense to worry about, but that the person would kind of bounce around like they were speaking in free jazz. Reality never gets acknowledged; instead everything just gets answered with new bad associations.
Being around a lot of adults freaking out over 3 Wh feels like I’m in a dream reality. It has the logic of a bad dream. Everyone is suddenly fixating on this absurd concept or rule that you can’t get a grasp of, and scolding you for not seeing the same thing. Posting long blog posts is my attempt to get out of the weird dream reality this discourse has created.
We should be focused on systematic change over individual lifestyles
All the changes we can make in our personal consumption choices are nothing compared to what we can do if we contribute to making the energy grid green. The current AI debate feels like we've forgotten that lesson. After years of progress in prioritizing systemic change over personal lifestyle choices, it's as if everyone suddenly started obsessing over whether the digital clocks in our bedrooms use too much energy and began condemning them as a major problem. It's sad to see the climate movement get distracted. We have gigantic problems and real enemies to deal with. ChatGPT isn't one of them.
[1] To be 100% clear, the broader climate, energy, and water impacts of AI are very real and worth worrying about. Some readers have jumped from my title to say "He thinks AI isn't an environmental problem? This is propaganda. AI is a massive growing part of our energy grid." This post is not meant to debunk climate concerns about AI. It's only meant to debunk climate concerns about chatbot use (and, as I note in the intro, image generation).
[2] This doesn't include the water used in our food system. There are crazy numbers out there implying that American agriculture uses 80% of the country's fresh water, so most of our water footprint might be in the food we eat, and it could be as high as 2,000 gallons per day (300,000 ChatGPT prompts per day). These numbers are so ambiguous I don't feel comfortable citing them in the piece, but it seems likely that our diets add a huge amount to our water footprint that isn't shown in my visuals.