alt.hn

2/22/2026 at 7:17:25 PM

Training a Human Takes 20 Years of Food

https://www.news18.com/world/training-a-human-takes-20-years-of-food-sam-altman-on-how-much-power-ai-consumes-ws-kl-9922309.html

by Aldipower

2/22/2026 at 8:29:58 PM

Yes it does. It's kind of a fixed cost though, since we're going to feed and educate our youth anyway, unless Sam Altman would have those people starve to death.

by mitthrowaway2

2/23/2026 at 5:04:42 PM

I think you're almost onto something with how these people think...

by asacrowflies

2/22/2026 at 11:23:01 PM

Has there ever been a time when a wealthy person could run their mouth like this without any fear of an angry mob tearing their limbs off? Maybe right before the French Revolution?

by lisp2240

2/23/2026 at 1:27:06 AM

I don't think my parents and grandparents spent their lives working towards a future where grifters like Altman could take everything for themselves.

by Gibbon1

2/23/2026 at 12:31:17 AM

In the Epstein files they talk about how to rid the world of poor people

by b3ing

2/23/2026 at 4:15:26 AM

A lie: rich people know that the only reason they're rich is because of poor people.

by harddrivereque

2/23/2026 at 5:03:49 PM

That implies they're intelligent and self-aware. The Epstein files prove this to be untrue.

by asacrowflies

2/23/2026 at 12:37:25 AM

[dead]

by cindyllm

2/23/2026 at 4:26:34 AM

I get the feeling that this guy has never been punched in the mouth. Otherwise he might be more careful with what he says.

by robbbed

2/23/2026 at 5:00:38 AM

You can remove "in the mouth".

by random_duck

2/22/2026 at 8:14:43 PM

This comparison only works if you assume scaling keeps paying off. Sara Hooker's research (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5877662) shows that compact models now outperform massive predecessors, and that scaling laws only predict pre-training loss, not downstream performance. If marginal returns on compute are falling (https://philippdubach.com/posts/the-most-expensive-assumptio...), "energy per query" hides the real problem: a trillion dollars of infrastructure built on the bet that they won't.

by 7777777phil

2/22/2026 at 10:10:46 PM

You are the carbon they want to reduce

by p0w3n3d

2/22/2026 at 11:04:37 PM

I asked ChatGPT to do some napkin math, and it seems like on average it would take a human 13.75 million kcal worth of food in those 20 years.

by TrackerFF

2/23/2026 at 1:03:29 AM

That's 58 GWh, but considering each food calorie actually requires 5-10 calories of input energy (mostly oil), let's say 290 GWh.

I couldn't find much on training AI models. Apparently GPT-3 used 1.3 GWh for training. So maybe ~10 GWh for newer models?

So... let's stop training humans I guess.

by eulgro

2/23/2026 at 4:31:14 AM

13.75 million kcal is 0.01598 GWh, not 58 GWh. So with the 5x multiplier, that's 0.08 GWh for a human.
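The correction checks out; a minimal sketch of the unit conversion in Python (the constants are standard SI conversion factors, the 5x food-production multiplier is the thread's own assumption):

```python
# Napkin math from the thread: convert 13.75 million kcal to GWh.
KCAL_TO_J = 4184.0   # 1 food calorie (kcal) = 4184 joules
J_PER_GWH = 3.6e12   # 1 GWh = 3.6e12 joules

food_kcal = 13.75e6  # ~1,880 kcal/day over 20 years
food_gwh = food_kcal * KCAL_TO_J / J_PER_GWH
print(round(food_gwh, 5))      # 0.01598 GWh, not 58 GWh

# Applying the thread's assumed 5x input-energy multiplier for food production:
print(round(food_gwh * 5, 2))  # 0.08 GWh
```

The original 58 GWh figure looks like a gigajoule/gigawatt-hour mix-up: 13.75 million kcal is about 57.5 GJ, and a GWh is 3,600 times larger than a GJ.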

by cinnamonteal

2/23/2026 at 12:39:18 AM

If I were one of his family, I'd advise him to hire someone to stop him from saying things like this.

by zipping1549

2/23/2026 at 6:11:13 AM

Training an LLM takes petabytes of theft.

by brnt

2/22/2026 at 9:38:38 PM

Now compare our waste and what might be extracted, psycho.

by bravetraveler

2/22/2026 at 8:12:43 PM

This is what I expect from a mid marketing team.....not a supposed visionary thought leader (/s).....

This is completely fallacious thinking that I assume is meant to manipulate people who don't think deeply about the implications and procession of ideas that lead to such obviously disingenuous intellectual dishonesty....

Waste heat....is not the same as a biologically closed loop in which microbes, bacteria, mycelia, plants, and animals all work in concert....

My food becomes fertilizer....his waste becomes nothing of utility (unless they have amazing efficiencies that defy what we know about physics...)

by kderbyma

2/22/2026 at 9:17:52 PM

It also just doesn't make sense. Like, we train a human and that takes 20 years of food.

To train an LLM, it needed a collection of roughly 800 GiB of data (The Pile). To generate that pile, you needed millions to billions of humans. So did training the LLM now suddenly take 20.000.000 billion years of food, or are we not allowed to make the same shitty comparison?

by 878654Tom

2/22/2026 at 10:39:29 PM

Oh so sad … 20 years the corporate overlords have to wait for their minions to be ready…

/s

by la64710