The GPT-5 Paradox: Brilliant AI, But at What Environmental Cost?


OpenAI’s latest creation, GPT-5, is making headlines for its advanced capabilities, but a growing chorus of experts is raising a critical concern: its potentially massive energy consumption. The company has remained silent on the model’s resource usage, and critics argue that GPT-5’s enhanced abilities, such as creating websites and answering PhD-level questions, come with a steep and unprecedented environmental cost. This lack of transparency from a leading AI developer raises serious questions about the industry’s commitment to sustainability.
A key indicator of this trend is the energy required for a single query. Researchers at the University of Rhode Island’s AI lab found that generating a medium-length response of about 1,000 tokens with GPT-5 can consume an average of 18 watt-hours, a dramatic increase from previous models. To put this into perspective, 18 watt-hours is equivalent to a 60-watt incandescent light bulb burning for 18 minutes. Given that a service like ChatGPT handles billions of requests daily, the aggregate consumption could be staggering, potentially reaching the daily electricity demand of millions of homes.
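The aggregate arithmetic above can be sketched in a few lines. The 18 Wh per-query figure comes from the article; the daily query volume and the average household consumption used below are illustrative assumptions, not figures from the source:

```python
# Back-of-the-envelope estimate of aggregate AI energy use.
# 18 Wh per medium-length query is the article's figure; the query
# volume and household consumption below are assumed for illustration.
WH_PER_QUERY = 18                 # watt-hours per medium-length query (from article)
QUERIES_PER_DAY = 2.5e9           # assumed daily request volume
HOUSEHOLD_KWH_PER_DAY = 29        # assumed average US household daily usage

total_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # Wh -> kWh
households = total_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"Total: {total_kwh / 1e6:.1f} GWh per day")
print(f"Roughly {households / 1e6:.1f} million households' daily demand")
```

Under these assumptions the total works out to about 45 GWh per day, on the order of the daily electricity demand of over a million homes, which is the scale the article describes.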
The increase in energy use is directly linked to the model’s size and complexity. Experts believe GPT-5 is significantly larger than its predecessors, with a greater number of parameters. This aligns with a study by French AI company Mistral, which found a strong correlation between a model’s size and its energy consumption. A model 10 times bigger, the study concluded, will have an impact that is an order of magnitude larger. This principle seems to be holding true for GPT-5, with some experts suggesting its resource use could be “orders of magnitude higher” than even GPT-3.
Compounding the issue is the new model’s architecture. While it does use a “mixture-of-experts” system to improve efficiency, its reasoning capabilities and ability to handle video and images likely counteract these gains. The “reasoning mode,” which involves the model computing for a longer time before generating a response, could make its resource footprint several times greater than text-only operations. This combination of size, complexity, and advanced features paints a clear picture of an AI system with a voracious appetite for power, leading to urgent calls for greater transparency from OpenAI and the broader AI industry.
