Sam Altman Says Training a Human Costs More Energy Than ChatGPT — He’s Serious
The OpenAI CEO just compared your entire childhood to a GPU cluster. At an AI summit in India. With a straight face.
AI data centers are projected to consume 90 TWh of electricity by 2026 — roughly the annual power consumption of Belgium. Altman’s defense? “It takes 20 years of life and all of the food you eat before you get smart.”
Between you and me, when a CEO starts comparing server farms to human evolution, that’s the tell. That’s the moment you know the energy bills got ugly enough to need a PR strategy.
Why He Said This in India, Specifically
This wasn’t random. Altman was at an AI summit in India — a country where:
- Electricity costs are a top-of-mind issue for both government and voters
- Data center buildouts are accelerating fast
- Local pushback against power-hungry tech infrastructure is growing
- OpenAI is actively expanding its user base there (hundreds of millions of potential users)
The play → get ahead of the energy narrative before Indian regulators or the media turn it into a problem. It's the same playbook he's run in every market: show up, charm the room, reframe the question before anyone pins you down on the actual watts.