AI

Sam Altman reveals energy usage of ChatGPT

There has been an ongoing debate about how much energy ChatGPT uses and its effect on the climate. Now Sam Altman, the CEO of OpenAI, has addressed concerns about ChatGPT’s energy usage, stating that people shouldn’t be worried about it.

According to Altman, an average ChatGPT query uses approximately 0.34 watt-hours of electricity. To put this into perspective, he compared it to roughly one second of an oven’s usage or a couple of minutes of a high-efficiency lightbulb’s usage.
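As a rough sanity check on those comparisons, the short sketch below converts 0.34 watt-hours into running time for an oven and an LED bulb. The appliance wattages (about 1.2 kW for an oven, about 10 W for a high-efficiency bulb) are assumed typical values, not figures from Altman's post.

```python
# Back-of-envelope check of Altman's comparison.
# Assumed appliance powers (not from the article): ~1.2 kW oven, ~10 W LED bulb.

QUERY_WH = 0.34      # reported energy per ChatGPT query, in watt-hours
OVEN_W = 1200        # assumed oven power draw, in watts
BULB_W = 10          # assumed high-efficiency LED bulb power, in watts

oven_seconds = QUERY_WH / OVEN_W * 3600   # Wh -> seconds of oven use
bulb_minutes = QUERY_WH / BULB_W * 60     # Wh -> minutes of bulb use

print(f"Oven: {oven_seconds:.1f} s")      # ~1.0 second
print(f"Bulb: {bulb_minutes:.1f} min")    # ~2.0 minutes
```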

In terms of water consumption, Altman stated that a single ChatGPT query uses about 0.000085 gallons of water, which is roughly equivalent to one-fifteenth of a teaspoon. This water is primarily used for cooling the servers in the data centers that power the AI models.
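Converting that gallon figure into teaspoons (a US gallon is 768 teaspoons) shows it does land close to the one-fifteenth of a teaspoon Altman cites:

```python
# Unit-conversion check of the water figure.

QUERY_GALLONS = 0.000085   # reported water use per ChatGPT query
TSP_PER_GALLON = 768       # teaspoons in a US gallon

teaspoons = QUERY_GALLONS * TSP_PER_GALLON
print(f"{teaspoons:.3f} teaspoons per query")  # ~0.065, close to 1/15 (~0.067)
```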

While these individual figures might seem negligible in isolation, ChatGPT has over 400 million weekly users, many of whom likely send multiple prompts daily. When scaled up to billions of queries per day across all AI tools (including Google Gemini and Anthropic’s Claude), the total energy and water consumption becomes significant.
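To give a rough sense of that scaling, the sketch below multiplies Altman's per-query figures by a hypothetical one billion queries per day. OpenAI has not published a daily query count, so that volume is an assumption used purely for illustration.

```python
# Hypothetical scaling sketch. The daily query count below is assumed,
# not a published figure; only the per-query numbers come from Altman.

QUERY_WH = 0.34                 # watt-hours per query (Altman's figure)
QUERY_GALLONS = 0.000085        # gallons of water per query (Altman's figure)
DAILY_QUERIES = 1_000_000_000   # assumed volume, for illustration only

energy_mwh_per_day = DAILY_QUERIES * QUERY_WH / 1_000_000   # Wh -> MWh
water_gal_per_day = DAILY_QUERIES * QUERY_GALLONS

print(f"Energy: ~{energy_mwh_per_day:,.0f} MWh/day")     # ~340 MWh/day
print(f"Water:  ~{water_gal_per_day:,.0f} gallons/day")  # ~85,000 gallons/day
```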

Altman’s statement also touches on his broader vision for AI, suggesting that the “cost of intelligence should eventually converge to near the cost of electricity.” This implies that as AI advancements continue, operational expenses will be driven primarily by energy prices. He foresees a future where both intelligence and energy become “wildly abundant” in the 2030s, which he believes will be fundamental for human progress.

Despite Altman’s attempt to ease concerns, the debate around AI’s environmental footprint continues. Projections from earlier this year suggest that global AI operations could consume more electricity than Bitcoin mining by the end of 2025. There have also been reports, like one from MIT Technology Review, that a five-second AI video can use as much energy as a microwave running for an hour or more.

While the consumption of an individual query may be low, the sheer scale of AI usage still calls for more sustainable practices in how these technologies are deployed.