OpenAI’s Sam Altman Defends AI’s Energy Costs: ‘It Also Takes a Lot of Energy to Train a Human’

Altman says AI has “probably” caught up in energy efficiency, a claim that appears to be untrue.


OpenAI CEO Sam Altman went to bat for AI’s energy costs with an interesting comparison: raising a human.

In an interview with Anant Goenka from Express Adda, Altman acknowledged the amount of energy it takes to run AI, a common criticism as its popularity grows.

“People talk about how much energy it takes to train an AI model relative to how much it takes a human to do one inference inquiry,” Altman said. “But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.”

He went on to say that AI has “probably caught up” in energy efficiency. He did not provide a citation.


According to the MIT Technology Review, the energy costs may seem small but add up quickly; the outlet estimates that by 2028, “more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.”

Elsewhere, Altman slammed Elon Musk, whose rival AI company is behind Grok, for proposing data centers in space.

“I honestly think the idea with the current landscape of putting data centers in space is ridiculous,” he said.

"We are not there yet," Altman continued. "There will come a time. Space is great for a lot of things. Orbital data centers are not something that's going to matter at scale this decade."

Unsurprisingly, people online slammed Altman’s comments as callous.
