Tech Explained: Sam Altman says AI may be energy-intensive, but humans consume a lot of energy, too. Here is a simplified look at the OpenAI chief's latest comments and what they mean for users.
Speaking at an Indian Express event in New Delhi, Altman said public debate often fixates on the electricity used to train large AI models while ignoring the cost of human intelligence. “People talk about how much energy it takes to train an AI model,” he said. “But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” He later invoked human evolution itself, noting that “the 100 billion people that have ever lived” and learned “not to get eaten by predators” were part of the process that produced modern human capabilities.
During the session, Altman dismissed viral claims that each ChatGPT query requires the same energy as charging a smartphone, saying they were “totally fake.” He likewise dismissed as misinformation reports that each ChatGPT query uses about 17–18 gallons of water. Citing his own numbers from a previous blog post, he said an average ChatGPT request uses around 0.34 watt‑hours of electricity and roughly 0.000085 gallons of water, or about one‑fifteenth of a teaspoon. That water goes to data‑centre cooling, which environmental groups have flagged for its impact on local water supplies.
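Altman's gallons-to-teaspoons conversion can be sanity-checked with simple arithmetic. The only assumption below is the standard conversion of 1 US gallon = 768 US teaspoons:

```python
# Sanity-check Altman's water figure: 0.000085 gallons per query
# is stated to be about one-fifteenth of a teaspoon.
GALLONS_PER_QUERY = 0.000085
TEASPOONS_PER_GALLON = 768  # standard conversion: 1 US gallon = 768 US tsp

teaspoons = GALLONS_PER_QUERY * TEASPOONS_PER_GALLON
print(f"{teaspoons:.4f} teaspoons per query")        # ~0.0653
print(f"about 1/{1 / teaspoons:.0f} of a teaspoon")  # ~1/15
```

The result, roughly 0.065 teaspoons, is consistent with the "one-fifteenth of a teaspoon" figure he cited.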
Altman said AI is a resource-intensive technology, and framed it as part of a broader transition to abundant clean energy.
However, Altman maintained that the answers to AI's energy questions remain on Earth, calling the idea of putting data centres in space “ridiculous” for now, in a veiled dig at xAI’s Elon Musk. He said the economics and logistics just don’t make sense yet. The comment comes just weeks after Musk announced a merger of SpaceX and xAI to build orbital computing facilities for his AI models.
“If you just do the rough math of launch costs relative to the cost of power we can do on Earth, just say nothing of how you’re gonna fix a broken GPU in space, we are not there yet,” said Altman.
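The “rough math” Altman gestures at can be sketched as a back-of-envelope comparison. Every figure below is an illustrative assumption, not a number from the article: launch cost of about $3,000/kg to low Earth orbit (roughly Falcon 9-class pricing), space solar arrays at about 150 W per kg, industrial electricity at $0.08/kWh, and a five-year horizon:

```python
# Back-of-envelope comparison: powering 1 kW of compute in orbit vs on Earth.
# All figures are illustrative assumptions, not numbers cited by Altman.
LAUNCH_COST_PER_KG = 3_000    # USD/kg to LEO (assumed, Falcon 9-class)
SOLAR_SPECIFIC_POWER = 150    # watts of array per kg of array mass (assumed)
EARTH_PRICE_PER_KWH = 0.08    # USD/kWh industrial electricity (assumed)
YEARS = 5

# Launch cost just to put enough solar array mass for 1 kW into orbit
array_mass_kg = 1_000 / SOLAR_SPECIFIC_POWER            # ~6.7 kg
space_launch_cost = array_mass_kg * LAUNCH_COST_PER_KG  # ~$20,000

# Cost of buying 1 kW continuously from the grid for the same period
earth_energy_cost = 1 * 24 * 365 * YEARS * EARTH_PRICE_PER_KWH  # ~$3,500

print(f"Launch cost for 1 kW of space solar: ${space_launch_cost:,.0f}")
print(f"5 years of 1 kW grid power on Earth: ${earth_energy_cost:,.0f}")
```

Under these assumptions, the launch bill alone runs several times the total cost of the equivalent grid power, before accounting for hardware, radiators for cooling, or the broken-GPU problem Altman mentions, which is the shape of the argument he is making.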
Critics countered that the OpenAI chief’s comments underplay local impacts such as higher electricity prices, water stress around data‑centre clusters, and the opportunity cost of diverting large chunks of grid capacity to AI instead of other uses.
Zoho founder Sridhar Vembu pushed back on X, rejecting a world where “we equate a piece of technology to a human being”. “I work hard as a technologist to see a world where we don’t allow technology to dominate our lives, instead it should quietly recede into the background,” Vembu wrote.
Some also see Altman’s stance that a “significant fraction of Earth’s power should go to AI”, along with his push for multi‑gigawatt AI data centres, as symbolic of an aggressive pro‑growth tech mindset, one that will increasingly collide with climate concerns.
