The Energy Cost of Curiosity: What I Discovered About AI and Climate
Another episode of the "Arguments with Algorithms" series, this time exploring the carbon costs of AI
Welcome to the wonderful world of AI's climate impact, where your innocent request to "make this email sound more professional" burns roughly the same amount of energy as leaving your laptop plugged in for an hour. Surprise!
The AI Gold Rush
Here's some of what I learnt: training GPT-3 consumed approximately 1,287 MWh of energy, equivalent to the annual energy consumption of around 120 average American homes. Training GPT-4 is estimated to have taken 50 times more electricity than GPT-3, which would put it closer to 6,000+ homes. Considering that we're in the middle of an AI gold rush where every tech company is racing to build the next big thing and believes we are inches away from AGI, spare a thought for the polar bears and ice caps.
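If you want to sanity-check those numbers, here's the back-of-the-envelope version. The ~10,700 kWh a year for an average American home is my own assumption (roughly the commonly reported US average); everything else follows from the estimates above.

```python
# Back-of-the-envelope check on the training figures above.
# Assumes an average US home uses ~10,700 kWh per year (my ballpark, not from the papers).

GPT3_TRAINING_MWH = 1_287          # reported training energy for GPT-3
KWH_PER_US_HOME_PER_YEAR = 10_700  # assumed average annual household consumption

homes_gpt3 = GPT3_TRAINING_MWH * 1_000 / KWH_PER_US_HOME_PER_YEAR
homes_gpt4 = homes_gpt3 * 50       # using the rough "50x GPT-3" estimate

print(f"GPT-3 training ~ {homes_gpt3:.0f} homes powered for a year")   # ~120
print(f"GPT-4 training ~ {homes_gpt4:.0f} homes powered for a year")   # ~6,000+
```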
At this point, we're so used to having an AI window open on our laptops for every conceivable thing. When the internet first arrived, I'm not sure we made an explicit connection between the technology and its climate effects. Perhaps some people thought more deeply about the compute costs of individual computers, but the impact of a digital network infrastructure was really brought to our attention (or at least to my attention) by the more recent rise of cryptocurrency (all that token mining) and the proliferation of data centers.
Geography Matters
The geography of your prompts apparently matters more than you think. Who would have thunk it? Ask your AI assistant a question that gets processed in fossil-fuel-heavy Virginia? That's like taking a gas-guzzling road trip. Same question handled by hydroelectric-powered servers in Sweden? SO much better for the polar bears! Sweden generates around 98% of its electricity from renewables and nuclear sources as of 2023.
The specific mix breaks down into three main sources (I've sketched what this mix means in carbon terms right after the list):
Hydroelectric Power - This provides around 41% of electricity production. Sweden has abundant rivers and waterfalls, which have been extensively developed for hydroelectric generation.
Nuclear Power - Nuclear provides about 29% of Sweden's electricity from three plants with six reactors.
Wind Power - This is the fastest-growing piece. Wind made up just 2% in 2010 but now provides nearly 25% of total electricity generation. Sweden's long coastline and northern location provide excellent wind resources.
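To make the Virginia-versus-Sweden point concrete, here's a rough sketch of how a generation mix turns into grams of CO2 per kWh. The lifecycle emission factors are ballpark medians of the sort the IPCC reports, and the "coal-heavy" mix is a made-up comparison grid, so treat the output as illustrative rather than authoritative.

```python
# Rough grid carbon intensity from a generation mix (illustrative only).
# Emission factors are approximate lifecycle medians in g CO2-eq per kWh.
EMISSION_FACTORS = {
    "hydro": 24,
    "nuclear": 12,
    "wind": 11,
    "other_renewables": 40,
    "gas": 490,
    "coal": 820,
}

def grid_intensity(mix: dict[str, float]) -> float:
    """Weighted-average grams of CO2 per kWh for a mix whose shares sum to ~1."""
    return sum(share * EMISSION_FACTORS[source] for source, share in mix.items())

sweden_2023 = {"hydro": 0.41, "nuclear": 0.29, "wind": 0.25, "other_renewables": 0.05}
coal_heavy = {"coal": 0.55, "gas": 0.35, "wind": 0.10}  # hypothetical fossil-heavy grid

print(f"Sweden-style grid: ~{grid_intensity(sweden_2023):.0f} g CO2/kWh")   # ~18
print(f"Coal-heavy grid:   ~{grid_intensity(coal_heavy):.0f} g CO2/kWh")    # ~620
```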
So how do we estimate this...
So this brings us to my latest project: Wattif AI, which estimates the energy burned by every AI prompt (what if... get it?).
Try it out! I don't mean to dissuade you from using AI at all - this entire Substack is devoted to my adventures with, and thoughts about, AI!
But think of it as having an environmental conscience that pops up to remind you, "Hey, that request to generate 47 variations of your LinkedIn headline? That just cost about as much energy as boiling water for your morning coffee." I exaggerate, and the app isn't going to give you stats quite like that... but you get my meaning. Think of the polar bears!
The real value isn't in the precise numbers. Interactions with AIs like ChatGPT could consume roughly 10 times more electricity than a standard Google search, but the estimates for individual queries vary wildly depending on who's doing the math and which research paper they're citing. The value is in making people think twice before asking AI to generate a meme for their social media every single day of the week.
Big Disclaimers
Big disclosure up front - estimating AI energy consumption is basically an educated guessing game wrapped in good intentions.
The most accurate way to understand how much energy an AI model uses is to directly measure the electricity used by the server handling the request, but I don't have access to Google's or OpenAI's electricity bills. So instead, Wattif AI relies on academic research, industry estimates, and a healthy dose of Claude Code running some mathematical approximations.
I've listed the sources and papers I used at the end, if you're inclined to read up on this.
Long story short, Wattif AI runs on a system that takes a user's prompt, estimates the energy consumption based on the research, factors in the server location, and spits out something that resembles reality more than complete fiction. BUT IT'S NOT PERFECT. IT'S A FREE APP GET OFF MY BACK YOU PSYCHOS!
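For the curious, here's a minimal sketch of the flavour of arithmetic involved. This is not Wattif AI's actual code: the per-token and per-query figures come from the papers listed at the end, the token counts are made up, and the grid intensities echo the rough numbers from the Sweden sketch above.

```python
# Illustrative per-query estimate, not Wattif AI's real implementation.
JOULES_PER_TOKEN = 3.5       # ~3-4 J/token measured for LLaMA 65B inference
WH_PER_QUERY_BASELINE = 0.3  # Epoch AI's estimate for a typical GPT-4o query

def query_energy_wh(prompt_tokens: int, response_tokens: int) -> float:
    """Energy for one query in watt-hours, built up from per-token measurements."""
    joules = (prompt_tokens + response_tokens) * JOULES_PER_TOKEN
    return joules / 3_600  # 3,600 joules in a watt-hour

def query_co2_grams(wh: float, grid_g_co2_per_kwh: float) -> float:
    """Emissions for that energy on a given grid, in grams of CO2."""
    return (wh / 1_000) * grid_g_co2_per_kwh

wh = query_energy_wh(prompt_tokens=200, response_tokens=500)
print(f"~{wh:.2f} Wh per query (vs the {WH_PER_QUERY_BASELINE} Wh GPT-4o baseline)")  # ~0.68 Wh
print(f"~{query_co2_grams(wh, 18):.3f} g CO2 on a Swedish-style grid")
print(f"~{query_co2_grams(wh, 620):.2f} g CO2 on a coal-heavy grid")
```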
How AI can help solve climate change
AI is already doing incredible work optimizing renewable energy, predicting weather patterns, and making supply chains more efficient. I am actually very interested in hearing about startups that are in this space. If that’s you - hi! Let’s chat.
We're at this fascinating inflection point where we need AI to help solve climate change, but we also need to solve AI's climate problem.
The solution isn't to abandon AI and go back to reviewing contracts by hand (I have been forever scarred by the M&A diligence reviews of my early days as a first-year associate). It's about building awareness, demanding transparency and, thinking out loud here, pushing for cleaner and more sustainable sources of energy.
I'm hoping a silly project like Wattif AI will be a small but crucial step towards that awareness. Maybe, like nutrition labels on food, we start putting energy labels on AI. Is that wild?
The climate impact of AI isn't going away, but neither is AI itself. The question is whether we'll figure out how to make this relationship sustainable, or whether future generations will look back at our AI usage the same way we look at people who used to think cigarettes were healthy.
You can check out the website at https://ninjalawyer.github.io/climateAI/ and plant a tree or something. :)
📚 Research Sources & Methodology
Here's the research I found that actually seemed credible:
How much energy does ChatGPT use?
Epoch AI - February 2025
Latest research estimates ChatGPT (GPT-4o) consumes approximately 0.3 watt-hours per query, significantly lower than previous estimates. I used this as my baseline for modern efficient models.
We did the math on AI's energy footprint
MIT Technology Review - May 2025
Comprehensive analysis confirming ~0.3 watt-hours per ChatGPT message, with detailed methodology and industry validation. This helped me validate the Epoch AI findings.
Estimating the Carbon Footprint of BLOOM
Luccioni et al. - 2022
Measured ~4 Wh of energy consumption per query for BLOOM-176B, a model comparable in size to GPT-3. I obviously used AI to understand this paper. Why can't people dumb it down?
From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference
Samsi et al. - 2023
Measured 3-4 joules per token for LLaMA 65B model inference, providing crucial per-token energy measurements. I used this for token-based calculations when per-query data wasn't available.
Goldman Sachs AI Infrastructure Report
Goldman Sachs Research - 2024
ChatGPT queries require nearly 10 times as much electricity as Google search queries, highlighting AI's energy intensity. Helped the app understand relative impact.
The Environmental Impact of ChatGPT
Earth.Org - May 2024
ChatGPT emits approximately 8.4 tons of CO2 annually, more than twice an individual's average carbon footprint. Used for context about overall impact.
AI brings soaring emissions for Google and Microsoft
NPR Climate Coverage - July 2024
Major tech companies report significant increases in emissions due to AI deployment, with datacenter energy demands growing rapidly. Industry trend validation.
Generative AI's Environmental Impact Explained
MIT News - January 2025
ChatGPT queries consume about 5 times more electricity than simple web searches, but actual usage patterns matter significantly. Context for search comparisons.
The growing energy footprint of artificial intelligence
Alex de Vries (Joule) - 2023
Comprehensive analysis of AI's energy growth trajectory, including training and inference costs across different model sizes. Helped me understand scaling relationships.