Altman recently shared a concrete figure for the energy and water consumption of ChatGPT queries. According to his blog post, each query to ChatGPT consumes about 0.34 Wh of electricity (0.00034 kWh) and about 0.000085 gallons of water: the equivalent of what a high-efficiency lightbulb uses in a couple of minutes, and roughly one-fifteenth of a teaspoon.
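As a quick sanity check, the two everyday comparisons can be reproduced from the raw figures. The 10 W draw for an efficient LED bulb is my own assumption, not from the post:

```python
# Back-of-the-envelope check of OpenAI's per-query figures (values from the post).
WH_PER_QUERY = 0.34            # watt-hours of electricity per query
GALLONS_PER_QUERY = 0.000085   # gallons of water per query

# A high-efficiency LED bulb draws roughly 10 W (assumption, not from the post).
LED_WATTS = 10
minutes_of_led = WH_PER_QUERY / LED_WATTS * 60

# One US gallon is ~3,785.41 mL; one US teaspoon is ~4.93 mL.
ML_PER_GALLON = 3785.41
ML_PER_TEASPOON = 4.92892
teaspoons = GALLONS_PER_QUERY * ML_PER_GALLON / ML_PER_TEASPOON

print(round(minutes_of_led, 1))   # about 2 minutes of LED light
print(round(1 / teaspoons))       # about one-fifteenth of a teaspoon
```

Both comparisons check out: roughly two minutes of lightbulb use and about 1/15 of a teaspoon of water.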
This is the first time OpenAI has publicly shared such data, and it adds an important data point to ongoing debates about the environmental impact of large AI systems. The announcement sparked widespread discussion, both supportive and skeptical. In this post I analyze the claim and unpack reactions on social media to look at the arguments on both sides.
What Supports the 0.34 Wh Claim?
Let's look at the arguments that lend credibility to OpenAI's number.
1. Independent estimates align with OpenAI's number
A key reason some consider the figure credible is that it aligns closely with earlier third-party estimates. In 2025, the research institute Epoch AI estimated that a single query to GPT-4o consumes roughly 0.0003 kWh of energy, closely matching OpenAI's own figure. This assumes GPT-4o uses a mixture-of-experts architecture with 100 billion active parameters and a typical response length of 500 tokens. However, they do not account for anything beyond the energy consumption of the GPU servers, and they do not incorporate power usage effectiveness (PUE), as is otherwise customary.
A recent academic study by Jegham et al. (2025) estimates that GPT-4.1 nano uses 0.000454 kWh, o3 uses 0.0039 kWh, and GPT-4.5 uses 0.030 kWh for long prompts (roughly 7,000 words of input and 1,000 words of output).
The agreement between these estimates and OpenAI's data point suggests that OpenAI's figure falls within a reasonable range, at least when focusing solely on the stage where the model responds to a prompt (known as "inference").
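Putting the figures quoted above side by side (all converted to watt-hours) makes the spread easier to see; only the numbers already cited are used:

```python
# Per-query energy estimates mentioned above, normalized to watt-hours.
estimates_wh = {
    "OpenAI (ChatGPT, average query)": 0.34,
    "Epoch AI (GPT-4o, ~500 output tokens)": 0.3,
    "Jegham et al. (GPT-4.1 nano, long prompt)": 0.454,
    "Jegham et al. (o3, long prompt)": 3.9,
    "Jegham et al. (GPT-4.5, long prompt)": 30.0,
}

baseline = estimates_wh["OpenAI (ChatGPT, average query)"]
for name, wh in estimates_wh.items():
    print(f"{name}: {wh} Wh ({wh / baseline:.1f}x OpenAI's figure)")
```

The short-prompt estimates cluster near OpenAI's number, while long prompts to larger models land one to two orders of magnitude higher, which is worth keeping in mind when interpreting a single "average" figure.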

2. OpenAI's number might be plausible at the hardware level
It has been reported that OpenAI serves 1 billion queries per day. Let's consider the math behind how ChatGPT could serve that volume. If this is true, and the energy per query is 0.34 Wh, then total daily energy would be around 340 megawatt-hours, according to an industry expert. He speculates that this could mean OpenAI can support ChatGPT with about 3,200 servers (assuming Nvidia DGX A100). If 3,200 servers have to handle 1 billion daily queries, then each server must handle around 4.5 prompts per second. If we assume one instance of ChatGPT's underlying LLM is deployed on each server, and that the average prompt results in 500 output tokens (roughly 375 words, per OpenAI's rule of thumb), then each server would need to generate 2,250 tokens per second. Is that realistic?
Stojkovic et al. (2024) were able to achieve a throughput of 6,000 tokens per second from Llama-2-70b on an Nvidia DGX H100 server with 8 H100 GPUs.
However, Jegham et al. (2025) found that three different OpenAI models generated between 75 and 200 tokens per second on average. It is, however, unclear how they arrived at this.
So it seems we cannot reject the idea that 3,200 servers could handle 1 billion daily queries.
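The arithmetic behind that argument can be reproduced in a few lines. All inputs come from the post; the gap between the even-spread rate and the expert's ~4.5 prompts/s is presumably headroom for peak-hour load, which is my assumption, not something the post states:

```python
# Reproducing the hardware-level arithmetic above (all inputs from the post).
queries_per_day = 1_000_000_000
wh_per_query = 0.34
servers = 3_200             # the industry expert's DGX A100 estimate
tokens_per_prompt = 500     # typical response length assumed above

daily_mwh = queries_per_day * wh_per_query / 1_000_000   # Wh -> MWh
print(daily_mwh)            # 340 MWh per day

# Spread evenly over 86,400 seconds, each server sees ~3.6 prompts/s;
# the expert's ~4.5/s is somewhat higher, presumably to allow for
# peak-hour load (my assumption, not stated in the post).
prompts_per_server_per_sec = queries_per_day / servers / 86_400
tokens_per_server_per_sec = prompts_per_server_per_sec * tokens_per_prompt
print(round(prompts_per_server_per_sec, 1))   # ~3.6 prompts/s
print(round(tokens_per_server_per_sec))       # ~1,800 tokens/s (2,250 at 4.5 prompts/s)
```

Either way, the required per-server throughput sits well below the 6,000 tokens/s that Stojkovic et al. measured on a DGX H100, which is why the 3,200-server scenario cannot be ruled out.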
Why some experts are skeptical
Despite the supporting evidence, many remain cautious or critical of the 0.34 Wh figure, raising several key concerns. Let's take a look at these.
1. OpenAI's number might miss major parts of the system
I believe the number only includes the energy used by the GPU servers themselves, and not the rest of the infrastructure, such as data storage, cooling systems, networking equipment, firewalls, electricity conversion losses, or backup systems. This is a common limitation in energy reporting across tech companies.
For instance, Meta has also reported GPU-only energy numbers in the past. But in real-world data centers, GPU power is only part of the full picture.
2. Server estimates seem low compared to industry reports
Some commentators, such as GreenOps advocate Mark Butcher, argue that 3,200 GPU servers seems far too low to support all of ChatGPT's users, especially if you factor in global usage, high availability, and applications beyond casual chat (like coding or image analysis).
Other reports suggest that OpenAI uses tens or even hundreds of thousands of GPUs for inference. If that's true, total energy use could be much higher than what the 0.34 Wh/query number implies.
3. Lack of detail raises questions
Critics, e.g. David Mytton, also point out that OpenAI's statement lacks basic context. For instance:
- What exactly is an "average" query? A single question, or a full conversation?
- Does this figure apply to just one model (e.g., GPT-3.5, GPT-4o) or an average across several?
- Does it include newer, more complex tasks like multimodal input (e.g., analyzing PDFs or generating images)?
- Is the water usage figure direct (used for cooling servers) or indirect (from electricity sources like hydropower)?
- What about carbon emissions? That depends heavily on the location and energy mix.
Without answers to these questions, it's hard to know how much trust to place in the number, or how to compare it to other AI systems.
Perspectives
Is big tech finally listening to our prayers?
OpenAI's disclosure comes in the wake of Nvidia's release of data about the embodied emissions of its GPUs, and Google's blog post about the life-cycle emissions of their TPU hardware. This could suggest that the companies are finally responding to the many calls that have been made for more transparency. Are we witnessing the dawn of a new era? Or is Sam Altman just playing tricks on us because it is in his financial interest to downplay the climate impact of his company? I'll leave that question as a thought experiment for the reader.
Inference vs. training
Historically, the numbers we have seen estimated and reported about AI's energy consumption have related to the energy used to train AI models. And while the training stage can be very energy intensive, over time, serving billions of queries (inference) can actually use more total energy than training the model in the first place. My own estimates suggest that training GPT-4 may have used around 50-60 million kWh of electricity. With 0.34 Wh per query and 1 billion daily queries, the energy used to answer user queries would surpass the energy use of the training stage after 150-200 days. This lends credibility to the idea that inference energy is worth measuring closely.
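The break-even claim follows directly from the figures above; a minimal sketch, using my 50-60 million kWh training estimate and the post's per-query and volume numbers:

```python
# When does cumulative inference energy overtake training energy?
training_kwh_range = (50e6, 60e6)   # author's GPT-4 training estimate (kWh)
kwh_per_query = 0.34 / 1000         # 0.34 Wh converted to kWh
queries_per_day = 1_000_000_000

daily_inference_kwh = kwh_per_query * queries_per_day   # 340,000 kWh/day
breakeven_days = [round(t / daily_inference_kwh) for t in training_kwh_range]
print(breakeven_days)   # [147, 176], i.e. roughly 150-200 days
```

At that pace, inference overtakes training in roughly five to six months, after which inference dominates the model's lifetime energy use.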
Conclusion: A welcome first step, but far from the full picture
Just as we thought the debate about OpenAI's energy use had gotten stale, the notoriously closed company stirs it up with the disclosure of this figure. Many are excited that OpenAI has now entered the debate about the energy and water use of its products, and hope that this is the first step toward greater transparency about the resource draw and climate impact of big tech. On the other hand, many are skeptical of OpenAI's figure, and for good reason: it was disclosed as a parenthetical in a blog post about an entirely different topic, and no context was given at all, as detailed above.
Though we might be witnessing a shift toward more transparency, we still need a lot of information from OpenAI in order to critically assess its 0.34 Wh figure. Until then, it should be taken not just with a grain of salt, but with a handful.
That's it! I hope you enjoyed the story. Let me know what you think!
Follow me for more on AI and sustainability, and feel free to follow me on LinkedIn.