
Study Reveals ChatGPT's Energy Consumption Lower Than Previously Estimated
A recent study conducted by Epoch AI, a nonprofit AI research institute, has revealed that the energy consumption of OpenAI's ChatGPT is significantly lower than previously estimated. The study found that a typical ChatGPT query using OpenAI's latest model, GPT-4o, consumes approximately 0.3 watt-hours of electricity. This figure is substantially lower than the earlier estimate of 3 watt-hours per query, which was based on outdated assumptions and less efficient hardware.
Joshua You, a data analyst at Epoch AI, explained that the previous estimates relied on older research that assumed less efficient chips. The new analysis draws on current data and more transparent assumptions, and indicates that a ChatGPT query consumes less energy than many household appliances do in everyday use.
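The scale of these figures is easy to check with back-of-envelope arithmetic. The sketch below compares daily query energy under the old and revised estimates against a household appliance; the query count and the 10 W LED bulb are illustrative assumptions, not figures from the study.

```python
# Back-of-envelope comparison of ChatGPT query energy vs. a household
# appliance. Per-query figures are from the article; the query count
# and bulb wattage are assumed for illustration.

WH_PER_QUERY_NEW = 0.3   # Epoch AI's revised estimate for a GPT-4o query
WH_PER_QUERY_OLD = 3.0   # earlier, widely cited estimate

QUERIES_PER_DAY = 100    # assumed heavy user
LED_BULB_WATTS = 10      # assumed 10 W LED bulb
BULB_HOURS = 1.0         # one hour of light

daily_wh_new = WH_PER_QUERY_NEW * QUERIES_PER_DAY   # 30.0 Wh
daily_wh_old = WH_PER_QUERY_OLD * QUERIES_PER_DAY   # 300.0 Wh
bulb_wh = LED_BULB_WATTS * BULB_HOURS               # 10.0 Wh

print(f"Revised estimate: {daily_wh_new:.1f} Wh/day for {QUERIES_PER_DAY} queries")
print(f"Old estimate:     {daily_wh_old:.1f} Wh/day")
print(f"10 W LED bulb for 1 h: {bulb_wh:.1f} Wh")
```

Even at 100 queries per day, the revised estimate puts daily usage at roughly the energy of running a small LED bulb for a few hours; the old 3 Wh figure implied ten times that.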
Despite the lower energy consumption figures, You noted that the 0.3 watt-hour estimate is still an approximation, as OpenAI has not disclosed detailed energy consumption data. The analysis also does not account for additional energy costs from features like image generation or processing long input queries, which could increase energy usage.
Looking ahead, You anticipates that energy consumption may rise as AI technology advances and models become more complex. OpenAI and its partners are planning significant investments in AI data centers, which are expected to increase electricity demand. For users concerned about energy consumption, You suggests reducing usage frequency or choosing models with lower computational demands.