Sam Altman Defends AI's Massive Energy Use, Compares Model Training to 'Training a Human' Over 20 Years
OpenAI CEO Sam Altman pushed back against growing concerns over artificial intelligence's voracious electricity consumption during a recent interview in India, arguing that critics unfairly focus on the power needed to train large models while overlooking the resources required to develop human intelligence.
"It also takes a lot of energy to train a human," Altman said Feb. 21, 2026, during an onstage discussion at the India AI Impact Summit hosted by The Indian Express. "It takes like 20 years of life, and all of the food you eat during that time before you get smart."

The remark, which drew laughter from the audience, came as Altman addressed questions about AI's environmental footprint amid exploding demand for the data centers powering systems like ChatGPT. He described comparisons that weigh a model's full training cost against the cost of a single query as "always unfair," emphasizing that once a model is trained, individual inferences — or responses — become highly efficient.
"The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way," he added.
Altman's comments arrived against a backdrop of intensifying scrutiny of the tech industry's power hunger. Data centers, fueled by AI workloads, are projected to consume far more electricity in coming years, with some forecasts suggesting AI could drive significant increases in global demand. The International Energy Agency has noted that data center electricity use has risen rapidly, accounting for about 1.5% of worldwide consumption in recent years, and is expected to grow sharply.
OpenAI and partners have invested heavily in infrastructure, including massive GPU clusters for training successive GPT models. Critics argue this scale strains grids, contributes to carbon emissions if powered by fossil fuels and exacerbates local energy challenges in regions hosting facilities.
Altman conceded that total energy consumption from widespread AI adoption is a legitimate issue. "Not per query, but in total — because the world is using so much AI ... and we need to move towards nuclear, or wind, or solar very quickly," he said, advocating for a rapid shift to cleaner energy sources to support the industry's growth.
He was more dismissive of claims about water usage, calling viral assertions that each ChatGPT query consumes gallons of water "completely untrue, totally insane," with "no connection to reality." Altman explained that older data centers relied on evaporative cooling, which used significant water, but that modern facilities have largely abandoned the method. He cited his own June 2025 blog post estimating an average ChatGPT request at about 0.34 watt-hours of electricity — roughly what an oven uses in a second — and minimal water, around one-fifteenth of a teaspoon for cooling.
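The oven comparison is easy to sanity-check. The following back-of-envelope sketch uses the 0.34 Wh figure from the article; the oven's power draw is an assumed typical value, not something the article states:

```python
# Back-of-envelope check of the quoted per-query energy figure.
WH_PER_QUERY = 0.34        # Altman's June 2025 estimate, from the article
JOULES_PER_WH = 3600.0     # 1 watt-hour = 3600 joules
OVEN_WATTS = 1200.0        # assumed draw of a typical electric oven

query_joules = WH_PER_QUERY * JOULES_PER_WH   # energy per query in joules
oven_seconds = query_joules / OVEN_WATTS      # equivalent oven runtime

print(f"One query ~ {query_joules:.0f} J, "
      f"about {oven_seconds:.1f} s of oven use")
```

At roughly 1,200 J per query, a 1.2 kW oven burns through the same energy in about a second, so the comparison holds under these assumptions.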
The human-training analogy extended beyond individual development. Altman referenced the broader evolutionary and civilizational costs: "It took like the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever to produce you."
The quip sparked mixed reactions online and in media. Supporters viewed it as a clever reframing of the debate, highlighting that intelligence — whether biological or artificial — demands upfront investment for long-term efficiency. Detractors called it tone-deaf or evasive, arguing it sidesteps accountability for AI's accelerating environmental impact compared to human baselines. Social media posts on platforms like Reddit and X mocked the "meat computer" comparison, with some noting the human brain's remarkable efficiency — orders of magnitude better than current AI hardware for certain tasks.
Environmental groups and researchers have long flagged AI's resource demands. Reports indicate training a single large language model can emit carbon equivalent to hundreds of transatlantic flights, while inference at scale across billions of queries adds up. Altman has previously acknowledged these realities, including in public writings, but maintains that innovation will outpace problems if energy transitions accelerate.
The comments align with OpenAI's broader push for advanced AI amid competition from Google, Anthropic, Meta and others. The company continues scaling models like potential successors to GPT-4o and o1, requiring ever-larger compute resources. Partnerships with Microsoft provide Azure infrastructure, but Altman has stressed the need for breakthroughs in energy production, including nuclear, to sustain progress.
As AI integrates deeper into daily life — from productivity tools to creative applications — debates over sustainability intensify. Altman's defense underscores the industry's view: the transformative benefits of AI justify resource use, provided infrastructure evolves responsibly.
Critics counter that equating biological upbringing with silicon-based training glosses over choices in energy sourcing and efficiency. Humans consume energy continuously, trained or not, while AI's costs concentrate in bursts of training followed by low per-query use.
For now, the analogy has fueled discourse on AI ethics, environmental responsibility and the true cost of intelligence. With demand for compute showing no signs of slowing, Altman's call for nuclear, wind and solar expansion highlights a shared challenge: powering the future without derailing planetary goals.
Whether the human-training parallel resonates or rankles, it spotlights the stakes as AI races forward.
© Copyright 2026 IBTimes AU. All rights reserved.