86 Comments

Thanks for writing this, Andy! A point worth sharing here is that the environmental critique of LLMs seems to have been "transferred" from the same critique of blockchain technology and NFTs. An environmental critique of NFTs is, as far as I know, valid — selling 10 NFTs does about as much harm, measured in carbon emissions, as switching to a hybrid car does good. What may have happened is that two coalitions that were arguing with each other about one technology simply reused the same arguments when a newer, reputedly-energy-intensive technology came around.

This hypothesis is not my own, but it strikes me as extremely plausible. I couldn't otherwise see how critics of AI could have anchored on this argument, when there are many other perfectly valid arguments about the downsides of LLMs!

FWIW the critique was actually equally incorrect and immaterial as applied to blockchain technology (selling 10 NFTs does not result in as much carbon emissions as switching to a hybrid car saves), and so the animus is likely instead a dislike of the types of people who pursue both of these technologies rather than an object-level critique of them.

I have heard exactly that whenever this discussion comes up, something along the lines of: "I can recall similar defences being put forward to explain away the energy demands of Blockchain a couple of years ago", so I don't believe you now.

Great article, but this also contains some zombie facts about older models that consumed way more energy - the costs keep coming down over time. https://simonwillison.net/2024/Dec/31/llms-in-2024/#the-environmental-impact-got-much-much-worse

Yup, appreciate you pointing this out! I knew the stats were overestimates and thought about addressing that, but with the limited space I have I figured it'd be more convincing if I used the highest estimates we have for ChatGPT's energy and water use. I might add an addendum at the end, but I'm worried the intro's already very long.

I think this point does qualitatively change some of the sections, in a counterintuitive, fun-for-a-dinner-party way that's maybe worth its own separate call-out.

e.g. if the costs of ChatGPT have reduced more than 10x since 2023 (seems likely), then switching all usage of Google to ChatGPT would actually be a net positive.

Even if the main point is that it doesn't really matter at scale, it's probably a snappier rebuttal.
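
A rough back-of-envelope sketch of that comparison. The ~0.3 Wh per Google search figure and the 10x reduction are assumptions for illustration, not numbers from the post (only the ~3 Wh ChatGPT estimate comes from it):

```python
# Rough sketch of the "ChatGPT vs Google search" point above.
# Assumptions, not measurements: ~3 Wh per ChatGPT query is the post's high
# 2023-era estimate; ~0.3 Wh per Google search is the commonly cited figure;
# the 10x efficiency gain is the hypothetical from this comment.
chatgpt_wh_2023 = 3.0   # Wh per ChatGPT query, high-end 2023 estimate
google_wh = 0.3         # Wh per Google search, commonly cited figure
cost_reduction = 10     # hypothetical efficiency improvement since 2023

chatgpt_wh_now = chatgpt_wh_2023 / cost_reduction
print(f"ChatGPT after a {cost_reduction}x reduction: {chatgpt_wh_now:.2f} Wh/query")
print(f"Google search:                               {google_wh:.2f} Wh/query")
# Under these assumptions the two are roughly at parity, which is why a >10x
# reduction would make "switch Google usage to ChatGPT" a wash or a net win.
```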

Yup, I agree. Conversations like this are pretty hard because I could see myself getting accused of believing the data pushed by large AI companies, so I'm trying to give environmentalist critics of AI as firm a ground as possible to just look at the arguments themselves without introducing new things to question. I figured just going with the numbers people are already using would be most useful there. I might write a follow-up with notes about how those new numbers would affect things.

I would be surprised if a ChatGPT search were using LESS energy than a Google search. Will be interested in the numbers that come out for this.

Nice to see independent numerical analysis. BTW, it may be worth noting that the ChatGPT power numbers you use come from an April 2023 paper (Li et al, arXiv:2304.03271) that estimates water/power usage based off of GPT-3 (175B dense model) figures published in *2021*. There was a newer paper that directly measured power usage on a Llama 65B (on V100/A100 hardware) that showed a 14X better efficiency (Samsi et al, arXiv:2310.03003).

Since I've been running my own tests, I decided to see what this looked like using a similar direct-testing methodology in 2025. On the latest vLLM w/ Llama 3.3 70B FP8, my results were 120X more efficient than the commonly cited Li et al numbers.

For those interested in a full analysis/table with all the citations (including my full testing results) see this o1 chat:

https://chatgpt.com/share/678b55bb-336c-8012-97cc-b94f70919daa
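
For readers curious what a direct measurement like this involves, here is a minimal sketch of the usual approach (measure throughput under load, multiply GPU power by generation time). Every figure below is a hypothetical placeholder, not a result from the comment above:

```python
# Minimal sketch of a direct per-reply energy estimate for a served LLM.
# All numbers are hypothetical placeholders for illustration only.
num_gpus = 4               # accelerators serving the model (assumed)
gpu_power_w = 700          # board power per GPU in watts (assumed)
tokens_per_second = 2500   # aggregate throughput measured under batching (assumed)
tokens_per_reply = 500     # assumed average completion length

energy_per_token_wh = (num_gpus * gpu_power_w) / tokens_per_second / 3600
energy_per_reply_wh = energy_per_token_wh * tokens_per_reply
print(f"~{energy_per_reply_wh:.2f} Wh per reply under these assumptions")
```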

Yup, I knew ChatGPT is actually probably more energy efficient. The post is already very long and I was worried I'd lose readers if I kept adding to it, so I tried to show that even with the worst estimates for how much energy it's using, it's still ridiculous to spend time worrying about how much energy your searches are using.

Thanks for the write-up!

Although I feel like your overarching point is valid, I do have some doubts about the numbers:

- The source about 10 seconds of video actually mentions that streaming video consumes 77 Wh per hour, so 2.7 Wh would be equivalent to just over 2 minutes (see the quick check after this list).

- The source about sending emails and Tiktok does not mention the energy consumption of emails at all.

- Both of these URLs (and a few others) have a UTM tag of chatgpt. Are these sources hallucinated from ChatGPT?
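
Quick arithmetic check of the first point, using only the numbers stated in the comment:

```python
# Check: how many minutes of streaming does 2.7 Wh correspond to,
# given the cited 77 Wh per hour of video streaming?
streaming_wh_per_hour = 77
target_wh = 2.7

minutes = target_wh / streaming_wh_per_hour * 60
print(f"{target_wh} Wh ≈ {minutes:.1f} minutes of streaming")  # ≈ 2.1 minutes
```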

Will edit those, thank you! I used ChatGPT for searching for sources but had inspected them individually; the article was long enough that some math errors probably slipped in.

Thanks for writing this, Andy! I found myself writing and sharing a similar, though less thorough, version of the same critique among my friends.

Glad you did it so well! And hooray for the shoutout on Hard Fork :)

I found that some of the people I know feel it important to acknowledge, alongside this kind of valuable critique, that the people most affected by climate change, and those affected directly by the fires, have every right to feel like the world is burning, to look for something to blame, and to ask for individual actions toward fixing things. And climate change affects people differentially, with those most affected often feeling, and being, the most disempowered. I think railing against AI feels good because it is aimed at hugely powerful and rich companies, and it is companies like these (such as oil and gas companies) whose greed and active lobbying, deflection, and disinformation have gotten us to this point of climate change and disaster.

Love that you put this together. Great stuff!

Something that I think is worth pointing out and maybe reframing around when doing the usage comparisons is that the energy consumption of screens generally is more impactful than almost anything we do on them.

This is a little dated, but the BBC did in-depth studies of energy use across their broadcast and streaming distribution chain in 2020 and …

“For every platform, we found that the devices in our audiences' homes used more energy in total than in our distribution chain. Overall, the home equipment (including mobile phones) accounted for over 90% of the total energy use.”

https://www.bbc.co.uk/rd/blog/2020-09-sustainability-video-energy-streaming-broadcast

My concerns are within the domain of computing. I agree with the implicit argument of the article - eating meat and driving cars doesn't really work "at scale."

But back to the domain of computing: how do I correlate these numbers with the shocking growth in data center power consumption? Is it just a coincidence that Microsoft's and Google's power consumption in their data centers has jumped dramatically - that Google has now abandoned many of its climate goals?

If they are correlated, it seems that training and serving LLMs have an outsized cost in cloud computing that isn't reflected on these graphs.

AI is being used for a lot more than ChatGPT and is going to grow to a huge share of overall computing costs in the next few years. My point in this post is only to say that your individual ChatGPT searches, and even the aggregate of everyone using ChatGPT, aren't worth worrying about for the sake of the climate, not that AI more broadly isn't going to use a lot of energy.

Oh I see you already discussed this more thoroughly in another comment. You also answered

> The goal of this post is just to say "If you as an individual are worried about how much your use of ChatGPT is harming the environment, that's silly and you shouldn't worry."

Indeed, one hamburger doesn't harm the environment. But that just externalizes the actual cost of the systemic hamburger supply/demand. This is a common sleight-of-hand that most economists use in their modeling. The logical fallacy is that these costs do re-emerge and it's usually up to the individual to deal with the burden of the debt (through taxes, drinking bad water, etc...).

At some point we need to decide on exactly how we're measuring bad environmental impact. If we're distributing the impact over the total effect on the economy and all the price signals you're sending, then other things like flying and driving would also rise quite a bit in their measured emissions. Basically my claim is 1) No matter how we measure it, individually using ChatGPT is going to have about as bad of an effect as using a digital clock, and 2) This is a bad way to think about it anyway and we should focus most of our limited time and energy on systemic change (transitioning to renewables, smart grids, etc.) rather than adding up our individual daily activities to minimize emissions.

Thanks for the thoughtful replies.

We'll probably never get to the point of deciding how exactly we're measuring bad environmental impact. But you and I will probably converge here: shifting the burden of environmental action to the individual has been a problematic misdirection.

Sure - we should all turn the lights out when we leave the room. That will save "the energy cost" for our ChatGPT query later. ;)

In reality, environmental activism and change needs to be focused on the corporate, state, and international level. In the specific case of large language models, I'd like to see major penalties for building data centers in hot places that lack ample water.

If I can be more radical for a moment - I'd also like to see specific cost tiers for energy consumption by data centers with a high number of GPU clusters. Generally, I'd like the venture capitalists to pay for the cost of building rather than the taxpayers. If they don't agree, then it seems fair that the taxpayers should directly enjoy the benefits from the projected revenue downstream. People forget that taxpayers also pay for risks and moonshots.

Yup agree with everything here!

The comments are interesting so far: from my understanding, they're celebrating that "AI isn't bad," or at least not that bad. Well ...

... another "easy" metric is datacenter count x energy consumption. Please show me how that is going down and I'll shut up.

Because until the 3rd Summer of AI, datacenter consumption and growth were slow, or even pointing down per center. Systems got more and more energy efficient thanks to optimization of hardware and virtualization.

But now it's 30 to 50% up in energy and CO2 since 2020 for Microsoft, Google, and so on. Why the f** would they need atomic energy so badly?

Also, the current speculation on datacenters *needed* in the future for anything AI is a business - much like the oil industry - that will create demand one way or another. Either demand is high and will pay out. Or ... by overprovisioning, the cost will drop so low that AI becomes dirt cheap for a while, to put it in anything right down to your buttons. WIN WIN.

On top of that, future AI - unless there is a transformer miracle - will need so much more than a few hundred tokens per task: AI agent systems need constant and multiplied compute time to function.

But hey, what do I know, I only see the big picture and read the financial statements of AI & datacenter companies. Maybe it's all a bubble. Maybe you should read the environment and risk sections, too.

I tried to make it clear that the post is about whether individual people should feel bad about how much their personal ChatGPT use is harming the environment, not AI as a broader industry. There's a lot happening in AI beyond ChatGPT and it's too complicated to write about in a blog post. The goal of this post is just to say "If you as an individual are worried about how much your use of ChatGPT is harming the environment, that's silly and you shouldn't worry."

Thanks for the clarification. It wasn't actually needed, but since you said "that's silly and you shouldn't worry," here's a takeaway:

This is concerning, because of the mindset of the many. And I know, changing the world is really difficult to sell.

Anyway, it's the same vibe of "no, just one drive with the car is not that bad" vs. having a car at all (I don't - and yes, some people need a car due to location, job, etc. - so when we design cities and structures to require cars, isn't that stupid?).

Similar to roads, not rails: we won't have much of a choice but to use AI in nearly every interaction in the future. Don't believe me? I'll set a timer for one year and I'll be back.

So in the grand scheme, using AI for just about everything (looking at Google enabling AI tools at scale and unconditionally on Gmail and Google Drive in the US) is just like turning up the water heat one degree - no one's getting boiled ... yet. Since one prompt is not so bad and won't change the planet, let's use more!

Anyways, I'm in my fifties and have seen the world, so I'll probably die before this all boils humanity to hell. Kinda glass-half-full vibe.

Bad for the kids though. And Bitcoin and NFTs won't save them, that's for sure - which was my primary irk when they were mentioned in the comments, and what got me to write my own reply in the first place, not your text. It actually helped me with my next talk about climate and AI. And so: thanks!

I'm not sure if you read the full post but I went into a lot more detail here. It's not the same mindset as driving because one drive actually emits quite a bit and collectively driving is a gigantic part of global emissions. ChatGPT collectively is about the same level of global emissions as digital clocks. If someone concerned about climate were spending more than a second thinking about the emissions of their digital clock, I'd tell them that they need to focus on real problems instead. Happy this was helpful for your talk!
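
A hedged sketch of the digital-clock framing, with assumed figures (the clock wattage is a guess; the 3 Wh/query is the post's high-end estimate):

```python
# Hedged illustration of the "digital clock" comparison. Assumed figures only.
clock_power_w = 2            # a small plugged-in LED clock, assumed ~2 W continuous draw
chatgpt_wh_per_query = 3     # the post's high-end per-query estimate

clock_wh_per_day = clock_power_w * 24
queries_equivalent = clock_wh_per_day / chatgpt_wh_per_query
print(f"A ~{clock_power_w} W clock uses {clock_wh_per_day} Wh/day, "
      f"about {queries_equivalent:.0f} ChatGPT queries' worth")
```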

It's a multi-faceted, multi-level issue. You talk about the one tool, the single use of AI; I (have to) see the system and its whole impact. One doesn't exist without the other. And the "small scale" creates demand for the other. That's just the way it is. You're not wrong, you're just, in my opinion, micro centered, and I happen to be macro.

Men like Sam Altman, Elon Musk, et al. will use the simple tools to make examples for the large scale, and "demand" for (more) automation - a.k.a. "wouldn't it be nice if ChatGPT could read ALL my Mails and ALL my Messages and ALL my Schedules," etc. etc. - will be the next steps.

In RL I was recently tasked with setting up AI "systems" at 1000+ user scale, so I made the effort to calculate what has impact and makes sense. And being in the EU means you have to calculate environmental impact - which is a great requirement - for your environmental footprint. If only any AI company would be so kind as to offer the calculation or data. But they either don't know themselves or it's part of the secrecy around model optimization and quantification ... no idea.

Thank you very much for this. Every time I tried to crunch the numbers, even being generous in estimates, they did not add up to the claims of "AI burning the rainforest"... but seeing it all in one place is even more effective.

Nice post!

Could you give exact sources for the numbers in the "Water consumed by ChatGPT vs other activities" graph? I prefer to personally verify claims in infographics before resharing them, but sources like "US Census Bureau" and "UNEP" put out a _lot_ of data so it's not obvious which one of their reports I should look at.

That's the one graph I didn't actually make and can't find a reliable source for who originally made it. Here's the burger statistic: https://www.latimes.com/food/dailydish/la-dd-gallons-of-water-to-make-a-burger-20140124-story.html#:~:text=Just%20to%20get%20a%20sense,requires%20660%20gallons%20of%20water. And the water used per ChatGPT search statistic: https://www.govtech.com/question-of-the-day/how-much-water-does-chatgpt-drink-for-every-20-questions-it-answers
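
For anyone who wants to redo the water comparison from those two sources, a rough sketch (the ~500 mL per 20 questions figure is the one commonly quoted alongside the second link; treat both numbers as rough estimates):

```python
# Rough water comparison behind the burger-vs-ChatGPT graph. Figures are the
# commonly quoted ones and should be treated as rough estimates.
burger_gallons = 660                    # gallons of water per burger (LA Times figure)
burger_liters = burger_gallons * 3.785  # ~2,500 liters
water_per_20_questions_l = 0.5          # ~500 mL per 20 questions (commonly quoted)

questions_per_burger = burger_liters / (water_per_20_questions_l / 20)
print(f"One burger ≈ the water footprint of ~{questions_per_burger:,.0f} ChatGPT questions")
```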

Great, thanks!

I wanted to write a post like this until I read this post! excellent writing :D

arguably, in the medium-term (next ~5 years), AI will be very good for the climate because:

1. it'll make climate scientists and engineers more efficient by helping them rapidly ideate, test hypotheses, and find information. it can democratize knowledge, so more people can contribute to climate policy and research.

2. it'll incentivize massive capital expenditure by the world's largest tech corporations with net 0 carbon commitments who are investing hundreds of billions into clean energy infrastructure to help power AI data centers. this creates economies of scale, driving down the cost of nuclear, solar, batteries, transformers, and more - all critical for a green transition. tech accounted for 68% of all clean energy funding in 2024.

3. optimization. like AlphaFold, AI can test millions of combinations of materials for better solar panels or batteries, or it can help optimize electricity usage and reduce the carbon intensity of industrial operations. the energy company Vistra used AI models to save ~1.6M tons of carbon annually - equivalent to taking 348k cars off the road or removing the carbon footprint of New Haven, CT. AI optimization of HVAC systems can reduce GHG emissions by ~40%. Google - though admittedly biased - made a report with BCG showing that AI can mitigate 5-10% of global GHG emissions by 2030. that's the emissions of all of Europe.

what other technologies have the potential for such positive climate impacts at such a low carbon cost?

it's disappointing that the hyper-online left and TikTok creators have started seeing AI tools as a climate bogeyman. holding this view requires a lack of numeracy and blindness to the data

Most likely we’ll use up efficiency gains by just doing more of it. Have you read about Jevons paradox in AI? https://arxiv.org/abs/2501.16548v1
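
A toy illustration of the rebound effect being referenced, with invented numbers:

```python
# Toy Jevons-style rebound: per-query efficiency improves, but total energy can
# still rise if usage grows faster than efficiency. All numbers are invented.
old_wh_per_query = 3.0     # the post's high-end per-query estimate
efficiency_gain = 100      # hypothetical: queries become 100x cheaper
usage_growth = 500         # hypothetical: usage grows 500x as AI gets embedded everywhere

change_factor = (old_wh_per_query / efficiency_gain) * usage_growth / old_wh_per_query
print(f"Total energy changes by a factor of {change_factor:.1f}")  # 5.0x under these guesses
```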

I have - and I think this is a good point. thanks for the paper reference! seems interesting

but imo the efficiency benefits will exceed the increase in consumption. admittedly I don't have strong evidence for that view. even if not, the potential benefits of AI actually helping us solve the scientific & engineering problems of climate change hold no matter our level of consumption, and could be far more impactful than efficiency gains

Hahaha so you’re accusing the left of data blindness but don’t have evidence for your view? ;) Jevons paradox is not something that only applies to AI. Historically, when we’ve made efficiency gains in any process we started doing more of it / produced more. I’m not that optimistic we won’t do the same wrt the things you mention in your comment.

This post provides very helpful framing at the level of individual choice. For that I am grateful.

Still, it is probably helpful to unpack how AI and computing in general drive energy usage in aggregate. Our data centers could account for 6.7-12% of total US energy consumption by 2028 (source below), up from about 4.4% in 2023. The rate of data center energy consumption is only accelerating. That’s not nothing for one of the largest electricity-consuming countries on Earth. Given solving climate change requires a multifaceted approach, it makes sense to focus on ways to address data centers’ energy demand and carbon footprint, even as we focus on the even more important work of transitioning to cleaner forms of transportation, agriculture, etc. https://eta-publications.lbl.gov/sites/default/files/2024-12/lbnl-2024-united-states-data-center-energy-usage-report.pdf

Yup agree

Been trying to explain this to people for over a year, always wanted to create an article like this, find all the data points, the graphs and show it. You sir are a LEGEND! <3

will be using this in every keynote (and of course give you full credits <3)

So happy it was useful! And yeah, I was surprised this didn't exist, which is why I ended up writing it. I was also looking for something like it, and it was surprisingly easy to find and compare all the numbers!

Congrats on the Hard Fork podcast call out of this article: https://www.nytimes.com/2025/01/17/podcasts/hardfork-tiktok-rednote-environment.html

Yeah that was a wild surprise!

Very interesting. "average ChatGPT question uses 3 Wh" - is this for 4o? How would this change if using o1 pro?

So I can't actually find consistent data on this and from what I can tell the 3 Wh stat might actually be a big overestimate. The post is pretty long already so I thought about adding a section saying "By the way ChatGPT might be even less bad than this" but decided to just assume that the environmental critics are correct and run with their numbers. I might consider adding a note about this in another post, not sure what the right move is.

I don't have a strong view but I've been thinking about what it would take to get an o1 query answered. If it takes like a few minutes, using x H100s or whatever ...
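
A hedged back-of-envelope along those lines (every number below is a guess, and real serving shares the hardware across many concurrent requests, which is exactly why the error bars are huge):

```python
# Guess-level estimate for a long "reasoning" style query. None of these figures
# come from OpenAI; they are placeholders to show the shape of the calculation.
num_gpus = 8              # GPUs assumed to be serving the model
gpu_power_w = 700         # rough board power for an H100-class GPU
minutes = 3               # hypothetical wall-clock time spent generating
concurrent_queries = 32   # batching: the node is shared across many requests

node_wh = num_gpus * gpu_power_w * (minutes / 60)
per_query_wh = node_wh / concurrent_queries
print(f"Whole node for {minutes} min: {node_wh:.0f} Wh; per-query share: {per_query_wh:.1f} Wh")
```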

Yeah I'd like to circle back on this for the post once there are clearer statistics produced by other people. I'm worried if I just dig into that myself the numbers will have error bars too large to be useful.

Since you seem open to updating the graphs, could you please color "Asking chatgpt a question" on the "online activities" graph a different color (probably red for both the text and bar)? The way substack crushes images by default renders the text much less legible, and needing to expand the image misses the at-a-glance purpose of a tl;dr.

Beyond that, though, thank you so much for giving me something to link to whenever someone trots out this dumb argument.

Great point thanks! Will have time to fix it this weekend.
