2/24/2026 at 3:10:38 PM
This article is dripping with the same kind of cringey techno-engineering naivete you find in Hollywood movies. The author is totally lost in the sauce of complex surface-level analyses mixed with romantic ideals of human exceptionalism, and completely blind to the deeper abstractions and common undergirding systems that expertise in computation would reveal (systems that don't care about emotional concepts). The takeaway seems to be "Only meat brains can be conscious because I can feel it and computers aren't made of meat" — which is basically the plot of every human/robot movie for the last 80 years.
by WarmWash
2/24/2026 at 5:23:11 PM
The interesting version of the argument isn't about substrate: it's about motivation. Present the trolley problem to GPT-4 and it gives you a philosophy-survey answer.
Present it to a human and their palms sweat. The gap isn't computation, it's that humans are value-making machines shaped by millions of years of selection pressure.
Pollan lands on the wrong argument (biology vs. silicon) when the real one is: where do the values come from, and can they emerge without a reproductive lineage that stakes survival on getting them right?
by adamzwasserman
2/25/2026 at 4:05:00 AM
I refer to this as moral grounding. I'm not sure I would call it a requirement for consciousness, but knowing that most beings with general intelligence (humans) have a form of it similar to my own does make it easier to sleep at night.
by judahmeek
2/24/2026 at 10:48:42 PM
[dead]
by ath3nd
2/25/2026 at 10:10:36 AM
I'd hazard a counter-prediction that we'll have AI seeming pretty conscious within a decade. People will say it's not real, in the same way they said self-driving cars would never work even as the cars were driving around, but it will become hard to argue against when you can hang out with real examples.
by tim333