OpenAI CEO Defends AI Energy Use, Claims Humans Cost More to "Train"
OpenAI CEO Sam Altman pushed back against concerns about artificial intelligence's environmental footprint on Monday, arguing that raising a human child to adulthood consumes far more resources than training an AI model—a comparison that drew both laughter and scrutiny at an industry summit in India.
Speaking at the India AI Impact Summit on February 24, Altman dismissed questions about ChatGPT's water consumption as "completely untrue, totally insane," while acknowledging that concerns about electricity usage were "fair." The exchange highlights a growing tension as finance leaders weigh the infrastructure costs of AI adoption against promised productivity gains.
When pressed on energy requirements, Altman offered an unconventional defense: humans themselves are resource-intensive to develop. "It also takes a lot of energy to train a human," he told the audience, according to video posted by The Indian Express. "It takes, like, 20 years of life, and all of the food you eat during that time before you get smart."
He extended the argument further, noting that modern humans represent the culmination of evolutionary learning spanning "100 billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science or whatever to produce you."
The OpenAI chief argued that fair comparisons should measure the energy an AI uses to answer a single query against a human performing the same task—not the upfront training costs. "Probably, AI has already caught up on an energy efficiency basis measured that way," he claimed.
On water usage specifically, Altman said data centers powering ChatGPT have largely moved away from "evaporative cooling" systems that consume significant water to prevent server overheating, though he provided no specific figures on current consumption levels.
The CEO did concede that AI's power demands require urgent infrastructure changes. "We need to move toward nuclear, or wind, or solar [energy] very quickly," he said.
For context on the scale involved, Altman stated in a June 2025 blog post that each ChatGPT query consumes approximately 0.34 watt-hours of electricity—roughly equivalent to running an oven for one second. However, he published that figure before OpenAI released its GPT-5 model and subsequent upgrades, and energy consumption varies significantly based on query complexity, with image generation requiring substantially more power than simple text responses.
The remarks come as CFOs and finance leaders grapple with the true cost of AI deployment. While vendors tout efficiency gains, the infrastructure requirements—from specialized chips to cooling systems to electricity contracts—represent capital expenditures that don't appear in software-as-a-service pricing models. Altman's comparison to human workers may resonate with executives evaluating headcount reduction, but it sidesteps the question of whether organizations will face unexpected utility bills or data center capacity constraints as AI usage scales.
The CEO's defensive posture also signals a shift in the AI industry's messaging. Early pitches focused on capability and speed; now, as models proliferate and power grids strain, executives are being asked to justify the environmental cost of automation—a question that will likely intensify as regulators and investors demand clearer sustainability metrics.