New Study Reveals ChatGPT Uses Less Energy Than Previously Estimated
A new study has challenged previous estimates of ChatGPT’s power consumption, revealing that the AI chatbot consumes significantly less energy than earlier reports suggested. However, as AI models become more complex and widely used, global energy demands may still rise substantially.
Updated Energy Consumption Estimates
A recent Epoch AI analysis found that a typical ChatGPT query consumes around 0.3 watt-hours, far lower than the commonly cited 3 watt-hours per query. This earlier estimate implied that ChatGPT was 10 times more power-hungry than a Google search.
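The arithmetic behind these figures can be checked with a quick back-of-envelope calculation. The ~0.3 watt-hours per Google search is not stated in the article; it is the assumption implied by the "10 times a Google search" claim.

```python
# Back-of-envelope comparison of per-query energy estimates.
# GOOGLE_SEARCH_WH is an assumption implied by the "10x" claim,
# not a figure reported in the article.

OLD_CHATGPT_WH = 3.0                     # older, widely cited estimate
NEW_CHATGPT_WH = 0.3                     # Epoch AI's updated estimate
GOOGLE_SEARCH_WH = OLD_CHATGPT_WH / 10   # implied by the 10x claim

old_ratio = OLD_CHATGPT_WH / GOOGLE_SEARCH_WH
new_ratio = NEW_CHATGPT_WH / GOOGLE_SEARCH_WH
print(f"Old estimate: {old_ratio:.0f}x a Google search")  # 10x
print(f"New estimate: {new_ratio:.0f}x a Google search")  # 1x

# Hypothetical usage: 1,000 queries a day for a year, in kWh.
annual_kwh = NEW_CHATGPT_WH * 1000 * 365 / 1000
print(f"1,000 queries/day for a year: {annual_kwh:.1f} kWh")  # 109.5 kWh
```

Under the updated estimate, a ChatGPT query lands in roughly the same range as the commonly assumed cost of a web search, which is why the revision matters.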
Expert Insights
According to Joshua You, a data analyst at Epoch AI, ChatGPT’s power consumption is lower than that of everyday activities such as using household appliances or driving a car. However, he emphasized that these figures remain approximations, as OpenAI has not disclosed full details of ChatGPT’s energy consumption.
AI’s Growing Infrastructure & Environmental Concerns
Despite improvements in AI hardware efficiency, concerns about AI’s rising energy consumption continue. Recently, over 100 organizations signed an open letter urging AI companies and regulators to prevent AI data centers from straining power grids or increasing dependence on fossil fuels.
Outdated Estimates & Advancements in AI Hardware
The widely circulated 3-watt-hour estimate originated from an older study, which assumed OpenAI was using outdated chips. However, advancements in AI hardware have led to more energy-efficient processing, reducing power consumption per query.
Future AI Energy Demands: A Growing Challenge
A Rand Corporation report predicts that by 2027, AI data centers could consume nearly all of California’s 2022 power capacity (68 GW). By 2030, training a highly advanced AI model may require as much electricity as eight nuclear reactors (8 GW).
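A rough scale check puts these projections in perspective. The ~1 GW output of a typical nuclear reactor and the 90-day training duration below are my assumptions for illustration, not figures from the Rand report.

```python
# Scale check on the Rand figures above. REACTOR_GW and the 90-day
# run length are assumptions for illustration; treat results as
# rough orders of magnitude.

CALIFORNIA_2022_GW = 68.0  # California's 2022 power capacity (per Rand)
TRAINING_GW = 8.0          # projected draw for one advanced training run
REACTOR_GW = 1.0           # assumed output of a typical reactor

print(f"Training draw ~= {TRAINING_GW / REACTOR_GW:.0f} reactors")
print(f"One run would draw {TRAINING_GW / CALIFORNIA_2022_GW:.0%} "
      f"of California's 2022 capacity")

# Energy if that 8 GW draw were sustained for a hypothetical 90-day run:
hours = 90 * 24
energy_twh = TRAINING_GW * hours / 1000  # GW x h = GWh; /1000 -> TWh
print(f"90-day run at 8 GW: {energy_twh:.2f} TWh")  # 17.28 TWh
```

Note that the report's figures describe power (GW, a rate of draw), so the total electricity consumed depends on how long that draw is sustained.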
According to Reuters, OpenAI and its partners—including Microsoft, SoftBank, Oracle, and MGX—plan to invest billions of dollars in AI infrastructure expansion. However, as AI moves toward reasoning models, which spend more time processing information, energy consumption may increase further.
Balancing AI Growth With Energy Efficiency
To address energy concerns, OpenAI has begun developing power-efficient AI models, such as GPT-4o-mini. In his TechCrunch interview, Joshua You also suggested that users concerned about energy consumption should consider:
- Limiting AI usage
- Choosing smaller, energy-efficient models
The Future of AI & Energy Consumption
While improvements in AI efficiency continue, the global adoption of AI may counteract these gains, leading to higher energy demands in the coming years. The challenge ahead is to balance AI innovation with sustainable energy solutions.