The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.
A daily energy use of 45 GWh is enormous. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, and 45 GWh spread over 24 hours is an average draw of roughly 1.9 GW, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the continuous output of one to two nuclear reactors, an amount that could be enough to power a small country.
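For what it’s worth, here is that arithmetic as a quick back-of-the-envelope script. The 18 Wh/query, 2.5 billion requests/day, and 1 to 1.6 GW per reactor figures are the ones quoted above; everything else is just unit conversion.

```python
# Back-of-the-envelope check of the quoted figures.
WH_PER_QUERY = 18          # URI AI lab estimate for GPT-5, Wh per query (quoted above)
QUERIES_PER_DAY = 2.5e9    # ChatGPT's reported daily request volume (quoted above)

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY   # 4.5e10 Wh
daily_gwh = daily_wh / 1e9                  # 45 GWh per day

avg_draw_gw = daily_gwh / 24                # ~1.9 GW of average continuous draw

reactor_gw = (1.0, 1.6)                     # typical output per reactor, GW (quoted above)
reactors_needed = [avg_draw_gw / r for r in reactor_gw]

print(f"{daily_gwh:.0f} GWh/day is about {avg_draw_gw:.2f} GW of average draw")
print(f"that is roughly {reactors_needed[1]:.1f} to {reactors_needed[0]:.1f} reactors' worth of output")
```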
That doesn’t seem right. By my calculations it should be like 5¢. Can you show your work?
Depends on your electric rates, of course. The gotcha in this statement is “per thousand requests,” which cranks up the energy from 40 watt-hours to 40 kilowatt-hours. Say you’ve got “affordable” electricity at 12.5 cents per kilowatt-hour: 40 kWh × $0.125/kWh = $5.
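Spelled out with units, using the same numbers as above (40 Wh per request, a thousand requests, 12.5 ¢/kWh):

```python
# Cost per thousand requests, with units written out
# (40 Wh/request and 12.5 cents/kWh are the figures from the comment above).
wh_per_request = 40
requests = 1_000
rate_usd_per_kwh = 0.125                          # "affordable" electricity, $/kWh

energy_kwh = wh_per_request * requests / 1_000    # 40,000 Wh = 40 kWh
cost_usd = energy_kwh * rate_usd_per_kwh          # 40 kWh * $0.125/kWh

print(f"{energy_kwh:.0f} kWh per thousand requests -> ${cost_usd:.2f}")
# prints: 40 kWh per thousand requests -> $5.00
```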