Using ChatGPT is not bad for the environment
And a plea to think seriously about climate change without getting distracted
If you don’t have time to read this post, these four graphs give most of the argument:
Contents
Why write this?
I’m not usually interested in writing simple debunking posts, but I regularly talk and read about the debate around emissions associated with AI and it’s completely clear to me that one side is getting it entirely wrong and spreading misleading ideas. These ideas have become so widespread that I run into them constantly, but I haven’t found a good summary explaining why they’re wrong, so I’m putting one together.
At the last few parties I’ve been to I’ve offhandedly mentioned that I use ChatGPT, and at each one someone I don’t know has said something like “Ew… you use ChatGPT? Don’t you know how terrible that is for the planet? And it just produces slop.” I’ve also seen a lot of popular Twitter posts (many above 100,000 likes) very confidently announcing that it is bad to use AI because it’s burning the planet. Common points made in these conversations and posts are:
Each ChatGPT search emits 10 times as much as a Google search.
A ChatGPT search uses 500 mL of water.
ChatGPT as a whole emits as much CO2 as 20,000 US households per day. It uses 200 Olympic swimming pools’ worth of water each day.
Training an AI model emits as much as 200 plane flights from New York to San Francisco.
The one incorrect claim in this list is the 500 mL of water point. It’s a misunderstanding of an original report which said that 500 mL of water are used for every 20-50 ChatGPT searches, not every single search. The other claims in this list are true, but they paint a drastically inaccurate picture of the emissions produced by ChatGPT and other large language models (LLMs) and of how those emissions compare to emissions from other activities. These are not minor errors: they reflect a fundamental misunderstanding of where energy is actually used, and they risk distracting the climate movement.
One of the most important shifts in talking about climate has been the collective realization that individual actions like recycling pale in comparison to the urgent need to transition the energy sector to renewables. The current AI debate feels like we’ve forgotten that lesson. After years of progress in addressing systemic issues over personal lifestyle changes, it’s as if everyone suddenly started obsessing over whether the digital clocks in our bedrooms use too much energy and began condemning them as a major problem.
Separately, LLMs have been an unbelievable life improvement for me. I’ve found that most people who haven’t actually played around with them much don’t know how powerful they’ve become or how useful they can be in your everyday life. They’re the first piece of new technology in a long time that I’ve become insistent that absolutely everyone try. If you’re not using them because you’re concerned about the environmental impact, I think that you’ve been misled into missing out on one of the most useful (and scientifically interesting) new pieces of technology in my lifetime. If people in the climate movement stop using them they will lose a lot of potential value and ability to learn quickly. This would be a shame!
On a meta level, there’s a background assumption about how one is supposed to think about climate change that I’ve become exhausted by, and that the AI emissions conversation is awash in. The bad assumption is:
To think and behave well about the climate you need to identify a few bad individual actors/institutions and mostly hate them and not use their products. Do not worry about numbers or complex trade-offs or other aspects of your own lifestyle too much. Identify the bad guys and act accordingly.
Climate change is too complex, important, and interesting as a problem to operate using this rule. When people complain to me about AI emissions I usually interpret them as saying “I’m a good person who has done my part and identified a bad guy. If you don’t hate the bad guy too, you’re suspicious.” This is a mind-killing way of thinking. I’m using this post partly to demonstrate how I’d prefer to think about climate instead: we coldly look at the numbers, the institutions, and actors who we can actually collectively influence, and we respond based on where we will actually have the most positive effect on the future, not based on who we happen to be giving status to in the process. I’m not inclined to give status to AI companies. A lot of my job is making people worry more about AI in other areas. What I want is for people to actually react to the realities of climate change. If you’re worried at all about your own use of AI contributing to climate change, you have been tricked into constructing monsters in your head and you need to snap out of it.
How should we think about the ethics of emissions?
Here are some assumptions that will guide the rest of this post:
You are trying to reduce your individual emissions
If you’re not trying to reduce your emissions, you’re not worried about the climate impact of individual LLM use anyway. I’ll assume that you are interested in reducing your emissions and will write about whether LLMs are acceptable to use.
There’s a case to be made that people who care about climate change should spend much less time worrying about how to reduce their individual emissions and much more time thinking about how to bring about systematic change to make our energy systems better (the effects you as an individual can have on our energy system often completely dwarf the effects you can have via your individual consumption choices) but this is a topic for another post.
The optimal amount of CO2 you emit is not zero
Our energy system is so reliant on fossil fuels that individuals cannot eliminate all their personal emissions. Immediately stopping all global CO2 emissions would cause billions of deaths. We need to phase out emissions gradually by transitioning to renewables and making trade-offs in energy use. If everyone concerned about climate change adopted a zero-emissions lifestyle today, many of them would die. The rest would lose access to most of modern society, leaving them powerless to influence energy systems. Climate deniers would take over society. Individual zero-emissions living isn’t feasible right now.
In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives. We should not cut an activity based solely on its emissions.
The average children’s hospital emits more CO2 per day than the average cruise ship. If we followed the rule “Cut the highest emitters first” we’d prioritize cutting hospitals over cruise ships—which is clearly a bad idea. Reducing emissions requires weighing the value of something against its emissions, not blindly cutting based on CO2 output alone. We should ask questions like “Can we achieve the same outcome with lower emissions?” or “Is this activity necessary?” But the rule “Find the highest emitting thing in a group of activities and cut it” doesn’t work.
In this post, I’ll compare LLM use to other activities and resources of similar usefulness. If you believe LLMs are entirely useless, then we should stop using them—but I’m convinced they are useful. Part of this post will explain why.
It is extremely bad to distract the climate movement with debates about inconsequential levels of emissions
If climate change is an emergency that requires lots of people working collectively to fix in limited time, we cannot afford to get distracted by focusing too much of our effort and thinking on extremely small levels of emissions. The climate movement has seen a lot of progress and success in shifting its focus away from individual actions like turning off lights when leaving a room to big systematic changes like building smart grid infrastructure or funding renewable tech. Even if you are only focused on lifestyle changes, it is best to focus on the most impactful lifestyle changes for climate. It would be much better for climate activists to spend all their time focused on helping people switch to green heating than encouraging people to hang dry their clothes:
If the climate movement should not focus its efforts on getting individual people to hang dry their clothes, it should definitely not focus on convincing people not to use ChatGPT:
There are other environmental concerns besides emissions
Another common concern about LLMs is their water use. This matters even though it’s not a direct cause of climate change. I’ll address that in the second part of the post. There might be other concerns as well (the supply chains involved in constructing data centers in the first place) but from what I can tell other environmental concerns also apply to basically all computers and phones, and I don’t see many people saying that we need to immediately stop using our computers and phones for the sake of the climate. If you think there are other bad environmental results of LLMs that I’m missing in this post, I’d be excited to hear about them in the comments!
It’s impossible to get very precise measurements of exactly how much energy individuals using extremely large complex systems are consuming
Any statistics about the energy consumption of individual internet activities have large error bars, because the internet is so gigantic and its energy use is spread across so many devices. Any source I’ve used has arrived at these numbers by dividing one very large uncertain number by another. I’ve tried my best to report numbers as they exist in public data, but you should assume there are significant error bars in either direction. The proportions matter more than the very specific numbers.
Are LLMs useful?
If LLMs are not useful at all, any emissions no matter how minute are not worth the trade-off, so we should stop using them. This post depends on LLMs being at least a little useful, so I’m going to make the case here.
I think my best argument for why LLMs are useful is to just have you play around with Claude or ChatGPT and try asking it difficult factual questions you’ve been trying to get answers to. Experiment with the prompts you give it and see if asking very specific questions with requests about how you’d like the answer to be framed (bullet-points, textbook-like paragraph) gets you what you want. Try uploading a complicated text that you’re trying to understand and use the prompt “Can you summarize this and define any terms that would be unfamiliar to a novice in the field?” Try asking it for help with a complicated technical problem you’re dealing with at work.
If you’d like testimonials from other people you can read people’s accounts of how they use LLMs. Here’s a good one. Here’s another. This article is a great introduction to just how much current LLMs can do.
LLMs are not perfect. If they were, the world would be very strange. Human-level intelligence existing on computers would lead to some strange things happening. Google isn’t perfect either, and yet most people get a lot of value out of using it. Receiving bad or incorrect responses from an LLM is to be expected. The technology is attempting to recreate a high level conversation with an expert in any and every domain of human knowledge. We should expect it to occasionally fail.
I personally find LLMs much more useful as a tool for learning than most of what exists on the internet outside of high quality specific articles. Most content on the internet isn’t the Stanford Encyclopedia of Philosophy, or Wikipedia. If I want to understand a new topic, it’s often much more useful for me to read a ChatGPT summary than watch an hour of some of the best YouTube content about it. I can ask very specific clarifying questions about a topic that it would take a long time to dig around the internet to find.
Emissions
What’s the right way to think about LLM emissions? A suspicious move that a lot of claims about LLMs make is to compare them to physical real-world objects and their emissions. When talking about global use of ChatGPT, there are a lot of comparisons to cars, planes, and households. Another suspicious move is to compare them to regular online activities that don’t normally come up in conversations about the climate (when was the last time you heard a climate scientist bring up Google searches as a significant cause of CO2 emissions?). The reason these comparisons are suspicious is that most people are lacking three key intuitions:
The incredibly small scales involved in individual LLM use
How many people are actually using LLMs
How much energy other ordinary online activities use
Without these intuitions, it is easy to make any statistic about AI seem like a ridiculous catastrophe. Let’s explore each one.
The incredibly small scales involved in individual LLM use
“A ChatGPT question uses 10 times as much energy as a Google search”
It is true that a ChatGPT question uses 10x as much energy as a Google search. How much energy is this? A good first question to ask: when was the last time you heard a climate scientist bring up Google search as a significant source of emissions? If someone told you that they had done 1000 Google searches in a day, would your first thought be that the climate impact must be terrible? Probably not.
The average Google search uses 0.3 Watt-hours (Wh) of energy. The average ChatGPT question uses 3 Wh, so if you choose to use ChatGPT over Google, you are using an additional 2.7 Wh of energy.
How concerned should you be about spending 2.7 Wh? 2.7 Wh is enough to drive a sedan at a consistent speed for 15 feet.
In Washington DC where I live, the household cost of 2.7 Wh is $0.000432.
Sitting down to watch 1 hour of Netflix has the same impact on the climate as asking ChatGPT 300 questions in 1 hour. I suspect that if I announced at a party that I had asked ChatGPT 300 questions in 1 hour I might get accused of hating the Earth, but if I announced that I had watched an hour of Netflix or that I drove 0.8 miles in my sedan the reaction would be a little different. It would be strange if we were having a big national conversation about limiting YouTube watching or never buying books or avoiding uploading more than 30 photos to social media at once or limiting ourselves to 1 email per day for the sake of the climate. If this were happening, climate scientists would correctly say that the public is getting bogged down in minutiae and not focusing on the big real ways we need to act on climate. Getting worried about whether you should use LLMs is as much of a distraction from the real issues involved with climate change as worrying about whether you should stop the YouTube video you’re watching 12 seconds early for the sake of the Earth.
Let’s take an extreme case and imagine that the reason you don’t want to use LLMs is that if everyone used LLMs over Google for every search, this would use too much energy. There are 8,500,000,000 Google searches per day. Let’s imagine that we replaced every single Google search with a ChatGPT search. That takes us from a daily energy use of 2,550,000,000 Watt-hours (Wh) to 25,500,000,000 Wh, or an additional 22,950,000,000 Wh, or 23 gigawatt-hours (GWh). The daily global energy demand from the internet is 2,200 GWh, so this would increase the daily global energy demand of the internet by 1%. A global switch from Google to ChatGPT would therefore be about the same as increasing the global population using the internet by 1%. If you heard that next year 1% more people would have access to the internet around the world, how concerned would that make you for the climate? Last year the actual growth rate of internet users was 3.4%.
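If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch in Python using the same round numbers as above (8.5 billion daily Google searches, 0.3 Wh per Google search, 3 Wh per ChatGPT question, 2,200 GWh of daily internet energy demand). The inputs are rough, so treat the output as a ballpark rather than a precise result:
```python
# Back-of-the-envelope: what if every Google search became a ChatGPT question?
# All inputs are the rough figures quoted in this post; real values have wide error bars.

GOOGLE_SEARCHES_PER_DAY = 8_500_000_000
WH_PER_GOOGLE_SEARCH = 0.3       # watt-hours per Google search
WH_PER_CHATGPT_QUESTION = 3.0    # watt-hours per ChatGPT question
INTERNET_DAILY_DEMAND_GWH = 2_200

google_daily_gwh = GOOGLE_SEARCHES_PER_DAY * WH_PER_GOOGLE_SEARCH / 1e9
chatgpt_daily_gwh = GOOGLE_SEARCHES_PER_DAY * WH_PER_CHATGPT_QUESTION / 1e9
extra_gwh = chatgpt_daily_gwh - google_daily_gwh

print(f"Google searches today:    {google_daily_gwh:.1f} GWh/day")
print(f"All ChatGPT instead:      {chatgpt_daily_gwh:.1f} GWh/day")
print(f"Additional energy needed: {extra_gwh:.1f} GWh/day")
print(f"Share of daily internet demand: {extra_gwh / INTERNET_DAILY_DEMAND_GWH:.1%}")
# Prints roughly 2.6, 25.5, and 23.0 GWh/day, and about a 1.0% share.
```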
In my experience using ChatGPT is much more useful than a Google search to the point that I’d rather use it than search Google ten times anyway. I can often find things I’m looking for much faster with a single ChatGPT search than multiple Google searches. Here’s a search I did asking it to summarize what we know about the current and future energy sources used for American data centers. It also saves me a lot of valuable time compared to searching Google ten times.
How many people are actually using LLMs
A lot of complaints about the total use of LLMs do not make sense when you consider the number of people using them. In considering LLM use, we can’t just look at their total emissions. We need to consider how many people are using the product. Someone could correctly point out that Google as a company produces way more emissions than a Hummer, but this is silly because Google has billions of users and the Hummer has one, and Google is very efficient with the energy consumed by each user.
Here are some examples to illustrate the point:
“ChatGPT uses the same energy as 20,000 American households.”
ChatGPT as of the time of writing has 300,000,000 daily users and 1,000,000,000 daily messages answered. Let’s imagine that you can snap your fingers and create one additional American household, with all its energy demands and environmental impact. This American household is special. The people in the household have one hobby: spending all their time writing very detailed responses to emails. They enjoy doing this and never stop, and they’re so good at it that they have 15,000 people emailing them every day, each person sending on average 3.3 emails for a total of 50,000 emails per day, or 1 email every 2 seconds 24 hours per day. People seem to find their replies useful, because the rate of use just keeps going up over time. Would you choose to snap your fingers and create this household, even though it will have the climate impacts of one additional normal American household? Seems like a clearly good trade-off. What if you had the option to do that a second time, so now 50,000 more messages could be answered by a second household every day? Again, this seems worth the emissions. If you keep snapping your fingers until you meet the demand for their message replies, you would have created 20,000 new American households and have 1 billion messages answered per day. 20,000 American households is about the size of the Massachusetts city of Barnstable:
If one additional version of Barnstable Massachusetts appeared in America, how much would that make you worry about the climate? This would be an increase in America’s population of 0.015%. What if you found out that everyone who lived in the new town spent every waking moment sending paragraphs of extremely useful specific text about any and all human knowledge to the world and kept getting demands for more? Of all the places and institutions in America to cut emissions, should we start by preventing that town from growing?
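If you want to see how the thought experiment’s numbers hang together, here is the same arithmetic as a small Python sketch, using the post’s round figures (1 billion messages per day, 300 million daily users, and a hypothetical household answering 50,000 messages per day):
```python
# How many message-answering "households" would it take to serve ChatGPT's daily traffic?
DAILY_MESSAGES = 1_000_000_000
DAILY_USERS = 300_000_000
MESSAGES_PER_HOUSEHOLD_PER_DAY = 50_000  # the hypothetical household from the thought experiment

households_needed = DAILY_MESSAGES / MESSAGES_PER_HOUSEHOLD_PER_DAY
messages_per_user = DAILY_MESSAGES / DAILY_USERS
seconds_between_replies = 24 * 60 * 60 / MESSAGES_PER_HOUSEHOLD_PER_DAY

print(f"Households needed:         {households_needed:,.0f}")        # ~20,000
print(f"Messages per user per day: {messages_per_user:.1f}")         # ~3.3
print(f"Seconds between replies:   {seconds_between_replies:.1f}")   # ~1.7, i.e. roughly one every 2 seconds
```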
“Training an AI model emits as much as 200 plane flights from New York to San Francisco”
This number only really applies to our largest AI models, like GPT-4. GPT-4’s energy use in training was equivalent to about 200 plane flights from New York to San Francisco. Was this worth it?
To understand this debate, it’s really helpful to understand what it means to actually train an AI model. Writing that up would take too much time and isn’t the focus of this post, so I asked ChatGPT to describe the training process in detail. Here’s its explanation. What’s important to understand about training a model like GPT-4 is
It’s a one-time cost. Once you have the model trained, you can tweak it, but it’s ready to be used. You don’t have to keep training it afterward at anywhere near the same energy cost.
It’s incredibly technologically complex. Training GPT-4 required 2 × 10²⁵ floating point operations (simple calculations like addition, subtraction, multiplication, and division). That is millions of times as many calculations as there are grains of sand on Earth. OpenAI had to wire together 25,000 state-of-the-art GPUs specially designed for AI to perform these calculations over a period of 100 days. We should expect that this process is somewhat energy intensive.
It gave us a model that can give extremely long, detailed, consistent responses to very specific questions about basically all human knowledge. This is not nothing.
It’s rarely this large and energy intensive. There are only a few AI models as large as GPT-4. A lot of the AI applications you see are using the results of training GPT-4 rather than training their own models.
It's helpful to think about whether getting rid of "200 flights from New York to San Francisco" would really move the needle on climate. There are about 630 flights between New York and San Francisco every week. If OpenAI didn't train GPT-4, that would be about the same as there being no flights between New York and San Francisco for about 2 days. That's not 2 days per week. It's 2 days total. Even if GPT-4 had to be retrained every year (and remember, it doesn't) that is less than 1% of the emissions from flights between these two specific American cities. How much of our collective effort is it worth to stop this?
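As a quick sanity check on that comparison, here is the arithmetic in Python, assuming the roughly 630 weekly New York-San Francisco flights mentioned above:
```python
# How big is "200 NYC-SF flights" relative to the route's normal traffic?
TRAINING_FLIGHT_EQUIVALENT = 200   # the post's figure for training GPT-4
FLIGHTS_PER_WEEK = 630             # approximate NYC-SF flights per week

flights_per_day = FLIGHTS_PER_WEEK / 7
flights_per_year = FLIGHTS_PER_WEEK * 52

print(f"Days of NYC-SF traffic equal to training GPT-4: {TRAINING_FLIGHT_EQUIVALENT / flights_per_day:.1f}")
print(f"Share of one year's NYC-SF flights: {TRAINING_FLIGHT_EQUIVALENT / flights_per_year:.1%}")
# Prints about 2.2 days and about 0.6% of a year's flights on this one route.
```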
200 planes can carry about 35,000 people. About 20 times that number of people fly from around the country to Coachella each year. There aren’t 20 AI models of equal size to GPT-4, so for the same carbon cost we could either cease all progress in advanced AI for a decade or choose not to run Coachella for 1 year so people don’t fly to it. This does not seem worth it.
To put a more specific number on the energy it took to train GPT-4, it’s about 60 GWh. GPT-4 was trained to answer questions, so to consider the energy cost we need to consider how many searches we have gotten out of that training energy use. I think of the training cost like the up-front price of a shirt, which you should weigh against how often you’ll wear it. If a shirt costs $40 and is well-made so that it will survive 60 washes, and another shirt is $20 but is poorly made so it will only survive 10 washes, then even though the first shirt is initially more expensive, it actually costs $0.67 per wear, while the second shirt costs $2 per wear, so in some meaningful way the first shirt is actually cheaper after you make the initial investment. In the same way, the training can look expensive in terms of energy if you don’t factor in just how many users and searches GPT-4 will handle.
A very rough estimate using publicly available data says that there have been about 200 billion ChatGPT searches so far. This means that if we include the cost of training in the total energy cost of searching ChatGPT, we add 60 GWh / 200,000,000,000 searches = 0.3 Wh/search to the 3 Wh each search already uses, for a total of 3.3 Wh/search. The training cost distributed over each search adds 0.3 Watt-hours of energy, so it increases the total energy cost of a ChatGPT search by 1 Google search’s worth of energy. This does not seem significant. Consider now that ChatGPT is just one thing GPT-4 is being used for; other things include:
Duolingo
Khan Academy
Be My Eyes
GitHub Copilot X
Once you factor in just how much use it’s getting, the energy cost of training GPT-4 looks incredibly cheap, in the same way that the more initially expensive shirt is overall cheaper than the second.
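Here is that amortization as a short sketch, using the same rough figures as above (60 GWh to train GPT-4, about 200 billion ChatGPT queries served so far, and 3 Wh per query at answer time):
```python
# Spreading GPT-4's one-time training energy across the queries it has answered so far.
TRAINING_ENERGY_GWH = 60
TOTAL_QUERIES_SO_FAR = 200_000_000_000
WH_PER_QUERY = 3.0   # energy to answer one query

training_wh_per_query = TRAINING_ENERGY_GWH * 1e9 / TOTAL_QUERIES_SO_FAR
total_wh_per_query = WH_PER_QUERY + training_wh_per_query

print(f"Training energy per query: {training_wh_per_query:.2f} Wh")   # ~0.30 Wh, about one Google search
print(f"Total energy per query:    {total_wh_per_query:.2f} Wh")      # ~3.30 Wh
print(f"Increase from training:    {training_wh_per_query / WH_PER_QUERY:.0%}")  # ~10%
```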
Other online activities’ emissions
When someone throws a statistic at you with a large number about a very popular product, you should be careful about how well you actually understand the magnitudes involved. We’re not really built for thinking about large numbers like this, so the best we can do is compare them to similar situations to give us more context. The internet is ridiculously large, complex, and used by almost everyone, so we should expect that it uses a large portion of our total energy. Anything widely used on the internet is going to come with eye-popping numbers about its energy use. If we just look at those numbers in a vacuum it is easy to make anything look like a climate emergency.
ChatGPT uses as much energy as 20,000 households, but Netflix reported using 450 GWh of energy last year which is equivalent to 40,000 households. Netflix’s estimate only includes its data center use, which is only 5% of the total energy cost of streaming, so Netflix’s actual energy use is closer to 800,000 households. This is just one streaming site. In total, video streaming accounted for 1.5% of all global electricity use, or 375,000 GWh, or the yearly energy use of 33,000,000 households. ChatGPT uses the same energy as Barnstable Massachusetts, while video streaming uses the same energy per year as all of New England, New York State, New Jersey, and Pennsylvania combined. Video streaming is using 1600x as much energy as ChatGPT, but we don’t hear about it as much because it’s a much more normal part of everyday life. 20,000 households can sound like a crazy number when you compare it to your individual life, but it’s incredibly small by the standards of internet energy use.
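These household conversions are easy to reproduce. Here is a sketch using the per-household figure implied by the post’s own numbers (450 GWh ≈ 40,000 households, i.e. roughly 11,250 kWh per household per year); the exact household average varies by source, so treat the outputs as rough:
```python
# Converting annual energy figures into "American households' worth" of electricity.
KWH_PER_US_HOUSEHOLD_PER_YEAR = 11_250   # implied by 450 GWh ~= 40,000 households

def households_equivalent(gwh_per_year: float) -> float:
    """Return how many average US households use this much electricity in a year."""
    return gwh_per_year * 1e6 / KWH_PER_US_HOUSEHOLD_PER_YEAR

netflix_data_centers_gwh = 450
netflix_total_gwh = netflix_data_centers_gwh / 0.05   # data centers are ~5% of streaming's total energy
all_video_streaming_gwh = 375_000                     # ~1.5% of global electricity use

print(f"Netflix data centers only:   {households_equivalent(netflix_data_centers_gwh):,.0f} households")
print(f"Netflix including streaming: {households_equivalent(netflix_total_gwh):,.0f} households")
print(f"All video streaming:         {households_equivalent(all_video_streaming_gwh):,.0f} households")
# Prints roughly 40,000, 800,000, and 33,000,000 households.
```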
Here’s how many American households worth of energy different online activities use globally, all back of the envelope calculations I did with available info, plus an equivalent American city using the same energy. I factored in both the energy used in data centers and the energy used on each individual device. There are large error bars but the rough proportions are correct.
11,000 households - Barre, VT - Google Maps
20,000 households - Barnstable MA - ChatGPT
23,000 households - Bozeman, MT - Fortnite
150,000 households - Cleveland, OH - Zoom
200,000 households - Worcester, MA - Spotify
800,000 households - Houston, TX - Netflix
1,000,000 households - Chicago, IL - YouTube
Does this mean that we should stop using Spotify or video streaming? No. Remember the rule that we shouldn’t just default to cutting the biggest emitters without considering both the value of the product and how many people are using it. Each individual Spotify stream uses a tiny amount of energy. The reason it’s such a big part of our energy budget is that a lot of people use Spotify! What matters when considering what to reduce is the energy used compared to the amount of value produced, and other options to get the same service. The energy involved in streaming a Spotify song is much much less than the energy required to physically produce and distribute music CDs, cassettes, and records. Replacing energy-intensive physical processes with digital options is part of the reason the energy consumption per American citizen has gone down by 22% since its peak in 1979.
If people are going to listen to music, we should prefer that they do it via streaming rather than buying physical objects. Just saying that Spotify is using the same energy as a city the size of Worcester, Massachusetts without considering the number of users, the benefits they’re getting from the service, or how energy efficient other options for listening to music would be is extremely misleading. Pointing out that ChatGPT uses the same energy as 20,000 households without adding any other details is just as misleading.
Water use
Why do LLMs use water? Where does it go after?
Here’s ChatGPT’s explanation of why and how AI data centers use water and where it goes after. In a nutshell, AI data centers:
Draw from local water supplies.
Use water to cool the GPUs doing the calculations (in the same way your laptop fan cools your laptop when it overheats).
Evaporate the water after, or drain it back into local supplies.
Something to note about LLM water use: while much of the water is evaporated and leaves the specific water source, data centers create significantly less water pollution per gallon of water used than many other sectors, especially agriculture. The impact of AI data centers on local water sources is obviously important to think about, and how sustainable they are mostly depends on how fragile the water source is. Good water management policies should help factor in which water sources are most threatened and how to protect them.
How to morally weigh different types of water use (data centers evaporating it vs. agriculture polluting it) seems very difficult. The ecosystems affected are too complex to try to put exact numbers on how bad one is compared to the other. I will say that intuitively a data center that extracts from a local water source but evaporates the water unpolluted back into the broader local water system seems bad for very specific local sources, but not really bad for our overall access to water, so the whole concern might be really overblown and wouldn’t matter at all if we just built data centers exclusively around stable water supplies. I’m open to being wrong here and would be excited to get more thoughts in the comments. Simply reporting “Data centers use X amount of water” without clarifying whether the water is evaporated, or returned to the local water supply polluted or unpolluted seems so vague that it’s a bad statistic without more context.
How much water do LLMs use?
20-50 searches on ChatGPT use the same amount of water as a normal water bottle (0.5 L).
This means that it takes about 300 ChatGPT queries to hit 1 gallon of water.
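That conversion is just unit arithmetic; here is a sketch (one US gallon is about 3.785 liters):
```python
# From "0.5 L per 20-50 searches" to "queries per gallon".
LITERS_PER_GALLON = 3.785
for searches_per_half_liter in (20, 50):
    liters_per_query = 0.5 / searches_per_half_liter
    print(f"At {liters_per_query * 1000:.0f} mL per query: "
          f"{LITERS_PER_GALLON / liters_per_query:,.0f} queries per gallon")
# Prints about 151 and 379 queries per gallon, so a few hundred queries per gallon is the right ballpark.
```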
The amount of water used by LLMs can seem like a lot. It is always shocking to realize that our internet activities actually do have significant real-world physical impacts. The issue with how AI water use is talked about is that conversations often don’t compare the water use of AI to other ways water gets used.
All online activity relies on data centers, and data centers use water for cooling, so everything that you do online uses water. The conversation about LLMs often presents the water they use as ridiculous without giving any context for how much water other online activity uses. It’s actually pretty easy to calculate a rough estimate for how much water different online activities use, because data centers typically use about 1.8 liters of water per kWh of energy consumed. This number includes both the water used by the data center itself and the water used in generating the electricity used. Here’s the water used in a bunch of different things you do on the internet in milliliters:
10 mL - Sending an email
10 mL - Posting a photo on social media
20 mL - One online bank transaction
30 mL - Asking ChatGPT a question
40 mL - Downloading a phone app
170 mL - E-commerce purchase (browsing and checkout)
250 mL - 1 hour listening to streaming music
260 mL - 1 hour using GPS navigation
430 mL - 1 hour browsing social media
860 mL - Uploading a 1GB file to cloud storage
1720 mL - 1 hour Zoom call
2580 mL - 10 minute 4K video
After the recent California wildfires I scrolled by several social media posts with over 1 million views each saying something like “People are STILL using ChatGPT as California BURNS.” They should have focused more on the people watching Fantastic Places in 4k 60FPS HDR Dolby Vision (4K Video).
Should the climate movement start demanding that everyone stop listening to Spotify? Would that be a good use of our time?
What about the water cost of training GPT-4? So far I’ve only included the cost of individual queries. A rough estimate based on available info says GPT-4 took about 250 million gallons of water, or about 1 billion liters. Taking from the assumption above that ChatGPT has received about 200 billion queries so far, the training water cost adds 0.005 L of water to the 0.030 L cost of the search, so if we include the training cost the water use per search goes up by about 16%. That’s still not as water intensive as downloading an app on your phone or 10 minutes of streaming music. Remember that ChatGPT is just one function that GPT-4 is used for, so the actual water cost of training per ChatGPT search is even lower.
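Here is that water amortization as a sketch, again using the post’s rough numbers (about 1 billion liters to train GPT-4, roughly 200 billion queries served, and about 30 mL of water per query):
```python
# Spreading GPT-4's training water use across the queries it has answered so far.
TRAINING_WATER_LITERS = 1_000_000_000
TOTAL_QUERIES_SO_FAR = 200_000_000_000
LITERS_PER_QUERY = 0.030   # water per query at answer time

training_water_per_query = TRAINING_WATER_LITERS / TOTAL_QUERIES_SO_FAR
total_water_per_query = LITERS_PER_QUERY + training_water_per_query

print(f"Training water per query: {training_water_per_query * 1000:.1f} mL")   # ~5 mL
print(f"Total water per query:    {total_water_per_query * 1000:.1f} mL")      # ~35 mL
print(f"Increase from training:   {training_water_per_query / LITERS_PER_QUERY:.0%}")  # prints 17%, i.e. the ~16% above
```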
Animal agriculture uses orders of magnitude more water than data centers. If I wanted to reduce my water use by 600 gallons, I could:
Skip sending 200,000 ChatGPT queries, or 50 queries every single day for a decade.
Skip listening to ~2 hours of streaming music every single day for a decade.
Skip 1 burger.
A common criticism of the above graph is that the water cows consume is already contained in the grass they eat, while the water used in data centers is drawn from local sources that are often already water-stressed. This is not correct, for a few reasons:
About 20% of US beef cows are in Texas, Colorado, and California. Each state has experienced significant strains on its water resources. Approximately 20% of U.S. data centers draw water from moderately to highly stressed watersheds, particularly in the western regions of the country, so on this it seems like both industries have roughly 20% of their total activity concentrated in places where they may be harming local watersheds.
Data centers pollute local water supplies significantly less than animal agriculture.
If you are trying to reduce your water consumption, eliminating personal use of ChatGPT is like thinking about where in your life you can cut your emissions most effectively and beginning by getting rid of your digital alarm clock.
A back of the envelope calculation tells me that the ratio of water use of 1 ChatGPT search compared to 1 burger is the same ratio as the energy use of a 1 mile drive in a sedan compared to the energy used by driving the world’s largest cruise ship for 60 miles.
If your friend were about to pilot their personal cruise ship, the largest ever built, solo for 60 miles, but decided to walk the 1 mile to the dock instead of driving because they were “concerned about the climate impact of driving,” how seriously would you take them? The situation is the same with water use and LLMs. There are problems that so completely dwarf individual LLM water use that it does not make sense for the climate movement to focus on individual LLM use at all.
The people who are trying to tell you that your personal use of ChatGPT is bad for the environment are just fundamentally confused about where water (and energy) is being used. This is such a widespread misconception that you should politely but firmly let them know that they’re wrong.
Thanks for writing this, Andy! A point worth sharing here is that the environmental critique of LLMs seems to have been "transferred" from the same critique of blockchain technology and NFTs. An environmental critique of NFTs is, as far as I know, valid — selling 10 NFTs does about as much harm, measured in carbon emissions, as switching to a hybrid car does good. What may have happened is that two coalitions that were arguing with each other about one technology simply reused the same arguments when a newer, reputedly-energy-intensive technology came around.
This hypothesis is not my own, but it strikes me as extremely plausible. I couldn't see how otherwise critics of AI could have anchored on this argument, when there are many other perfectly valid arguments about the downsides of LLMs!
Nice post!
Could you give exact sources for the numbers in the "Water consumed by ChatGPT vs other activities" graph? I prefer to personally verify claims in infographics before resharing them, but sources like "US Census Bureau" and "UNEP" put out a _lot_ of data so it's not obvious which one of their reports I should look at.