Tech Explained: Sam Altman says AI may be energy-intensive, but humans consume a lot of energy, too
OpenAI CEO Sam Altman doubled down on his argument that artificial intelligence (AI) should not be singled out for its energy and water footprint, since humans also consume a lot of energy and resources to become “smart”.

Speaking at an Indian Express event in New Delhi, Altman said public debate often fixates on the electricity used to train large AI models while ignoring the cost of human intelligence. “People talk about how much energy it takes to train an AI model,” he said. “But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” He later invoked human evolution itself, noting that “the 100 billion people that have ever lived” and learned “not to get eaten by predators” were part of the process that produced modern human capabilities.

During the session, Altman dismissed viral claims that each ChatGPT query requires the same energy as charging a smartphone, saying they were “totally fake.” He also dismissed as misinformation reports that each ChatGPT query uses about 17-18 gallons of water. Citing his own numbers from a previous blog post, he said an average ChatGPT request uses around 0.34 watt‑hours of electricity and roughly 0.000085 gallons of water, or about one‑fifteenth of a teaspoon. The water is used for data‑centre cooling, which environmental groups have flagged for its impact on local water supplies.
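The per-query figures Altman cited can be sanity-checked with simple arithmetic. A minimal sketch, assuming the exact US conversion of 768 teaspoons per gallon and an illustrative ~15 Wh smartphone battery (the battery figure is an assumption, not from the article):

```python
# Sanity check on the per-query figures Altman cited from his blog post.
WATER_PER_QUERY_GAL = 0.000085   # gallons of water per ChatGPT query
ENERGY_PER_QUERY_WH = 0.34       # watt-hours per ChatGPT query

# 1 US gallon = 768 US teaspoons (exact conversion)
teaspoons = WATER_PER_QUERY_GAL * 768
print(f"Water per query: {teaspoons:.4f} tsp (~1/{1/teaspoons:.0f} of a teaspoon)")

# Assumed typical smartphone battery energy (~15 Wh); used only to
# illustrate why the "one query = one phone charge" claim fails.
PHONE_BATTERY_WH = 15.0
print(f"Queries per full phone charge: {PHONE_BATTERY_WH / ENERGY_PER_QUERY_WH:.0f}")
```

On these assumed numbers, 0.000085 gallons works out to roughly a fifteenth of a teaspoon, matching the figure in the article, and one phone charge would cover dozens of queries.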

Altman acknowledged that AI is a resource-intensive technology, but framed it as part of a broader transition to abundant clean energy.

However, Altman reiterated that the answers to AI’s energy questions remain on Earth, calling the idea of putting data centres in space “ridiculous” for now, in a veiled dig at xAI’s Elon Musk. He said the economics and logistics simply don’t make sense yet. The comment comes just weeks after Musk announced a merger of SpaceX and xAI to build orbital computing facilities for his AI models.

“If you just do the rough math of launch costs relative to the cost of power we can do on Earth, just say nothing of how you’re gonna fix a broken GPU in space, we are not there yet,” said Altman.