An early estimate suggested that ChatGPT generates 8.4 tons of CO2 annually—double an average person’s output. But there’s a twist.
The initial calculations assumed ChatGPT ran on just 16 GPUs, but the reality is a staggering 30,000 units, if not more. The true carbon cost could far exceed those early estimates.
Training ChatGPT might use as much water as manufacturing 370 BMWs or 320 Teslas. And that’s only the beginning. Operational energy demands for data centers suggest that by 2027, ChatGPT’s electricity consumption could rival that of entire nations like Sweden, Argentina, or the Netherlands.
To put this into perspective, we crunched the numbers. With the help of a CO2 emissions calculator, each message you send to ChatGPT produces approximately 4.32 grams of CO2. Doesn’t sound like much, right?
AI’s Invisible Exhaust: The Cost of Using ChatGPT
Before we proceed, a disclaimer: our statistic is an oversimplified calculation based on public data. If you want to see our homework, we'll share it towards the end of the article.
For now, let’s dive deeper into what 4.32g of CO2 emissions per query means.
A single query in a conversation won’t shift the needle. But interacting with ChatGPT is usually a to-and-fro affair.
Here’s what those CO2 emissions look like in query equivalents:
- 15 queries = watching one hour of videos
- 16 queries = boiling one kettle
- 20-50 queries = 500ml of water consumed
- 139 queries = one load of laundry washed at 86 degrees Fahrenheit, then dried on a clothesline
- 92,593 queries = a round-trip flight from San Francisco to Seattle (according to this calculator)
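If you want to sanity-check those equivalences, they fall straight out of the 4.32g-per-query figure. Here's a minimal sketch; the per-activity footprints are our assumed inputs (illustrative values implied by the list above, not authoritative measurements):

```python
# Convert activity CO2 footprints into ChatGPT-query equivalents,
# assuming 4.32g of CO2 per query (our estimate, derived later in the article).
CO2_PER_QUERY_G = 4.32

# Assumed per-activity footprints in grams of CO2 (illustrative, not measured).
activities_g = {
    "one hour of video": 65,
    "boiling one kettle": 70,
    "one cold wash, line-dried": 600,
    "SF-Seattle round-trip flight": 400_000,
}

for activity, grams in activities_g.items():
    print(f"{activity}: ~{grams / CO2_PER_QUERY_G:.0f} queries")
```

Dividing each footprint by 4.32g reproduces the query counts in the list above.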
Now, a single person wouldn’t be able to get through those numbers in a day. But given that the site averaged 50 million unique visits per day in July, the query count ramps up quickly. If each unique visit resulted in 10 queries on average, that’s 500 million queries a day, or roughly 15 billion queries each month.
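Here's that aggregate math as a quick sketch, using the assumptions above (50 million daily visits, 10 queries per visit, a 30-day month):

```python
# Scale the per-query footprint (4.32g) up to site-wide traffic.
# Assumptions: 50 million unique visits/day, 10 queries/visit, 30-day month.
CO2_PER_QUERY_G = 4.32
visits_per_day = 50_000_000
queries_per_visit = 10

queries_per_month = visits_per_day * queries_per_visit * 30
monthly_tonnes = queries_per_month * CO2_PER_QUERY_G / 1_000_000  # grams -> tonnes

print(f"{queries_per_month:,} queries/month")        # 15,000,000,000
print(f"~{monthly_tonnes:,.0f} tonnes of CO2/month")
```

That works out to tens of thousands of tonnes of CO2 per month under these assumptions.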
The numbers are dizzying, but you can see how easy it is to ramp up the carbon footprint of ChatGPT as more people use it.
How we calculated the information
If you’re wondering how we arrived at our CO2 figure, here’s the step-by-step breakdown:
- OpenAI uses Microsoft Azure cloud infrastructure
- According to Nvidia, OpenAI runs ChatGPT on its enterprise-grade A100 GPUs
- When run continuously over 24 hours, the emissions calculator estimated that each GPU emits 1.44kg of CO2 per day
While the exact number hasn’t been disclosed, reports estimate that ChatGPT runs on around 30,000 GPUs. This is a conservative figure, and the true number of GPUs deployed to keep the generative AI tool running is likely higher.
- If we assume a minimum of 30,000 GPUs are in use, that means 43,200kg CO2 is being emitted daily.
- We know that ChatGPT received 10 million queries per day during launch week in November 2022. The figure is likely much higher at the time of writing, but without official numbers, we’ll use this as our baseline.
- 43,200kg of CO2 ÷ 10 million queries = 4.32g of CO2 per query
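In code, the whole chain of reasoning above fits in a few lines (all inputs are the public estimates just listed):

```python
# Back-of-envelope estimate of CO2 per ChatGPT query, using the figures above:
# 1.44kg of CO2 per A100 GPU per day, 30,000 GPUs, 10 million queries/day.
co2_per_gpu_kg_per_day = 1.44
num_gpus = 30_000
queries_per_day = 10_000_000

daily_co2_kg = co2_per_gpu_kg_per_day * num_gpus          # ~43,200 kg/day
co2_per_query_g = daily_co2_kg * 1_000 / queries_per_day  # kg -> g, per query

print(f"{daily_co2_kg:,.0f} kg of CO2 per day")  # 43,200
print(f"{co2_per_query_g:.2f} g per query")      # 4.32
```

Every input here is an estimate, so the 4.32g result should be read as an order-of-magnitude figure rather than a precise measurement.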
Not all doom and gloom
It’s not all cloudy skies on the horizon, though. The future could be brighter, with talk of OpenAI developing their own energy-efficient chips, as well as continuous advancements in hardware and AI optimization.
OpenAI has also shown they’re doing their part to lower their carbon footprint, opting for Microsoft Azure, a carbon-neutral cloud platform.
They were also intentional about using the A100 GPUs, which are 5x more energy efficient than CPU systems for generative AI applications.