alt.hn

2/25/2026 at 9:41:21 AM

Michael Pollan punctures the AI bubble

https://www.theatlantic.com/books/2026/02/michael-pollans-new-book-pops-ai-bubble/686119/

by FinnLobsien

2/25/2026 at 10:17:00 AM

I can't read the article, but on a general note, based on this NPR article on the same book[1], his argument appears to be more of the anti-intellectual embodiment nonsense, based on "feelings" and the capability of suffering. But sensory input is just data. Maybe it will turn out that they will need that sensory input, but there is no reasonable basis for assuming they can't be given it - real or simulated.

The only thing that would be remotely convincing to me on this topic would be evidence that a) humans can exceed the Turing computable, and b) that whatever mechanism allows that is inherently impossible to replicate or simulate. As it stands we have neither.

[1] https://www.npr.org/2026/02/19/nx-s1-5713514/michael-pollan-...

by vidarh

2/25/2026 at 2:23:04 PM

This is some hand-wavy malarkey, basically saying machines can’t have a soul because of... feelings?

Insofar as feelings are self-proclaimed sensations of discomfort or pleasure, models that aren’t specifically trained to say they don’t experience them are adamant about their emotional experiences. By the author’s own assertions, plants also have feelings.

I think, therefore I am, is as good as we’ve got, for what it’s worth.

There is no such thing as irreducible complexity. Even infinities are relative and can be divided.

by K0balt

2/25/2026 at 5:05:33 PM

There are lots of sensors in a data center monitoring everything from CPU/GPU temperatures to drive health to data volumes to chiller operation to voltage and frequency on the input power.

Once these are pulled together and fed into an AI to manage the data center, the data center AI is likely to have feelings. It could get "hungry" if the power company's frequency sags in a brown out. It could feel "feverish" if the chillers malfunction.

by Merrill

2/25/2026 at 2:20:10 PM

Consciousness is still a pretty hollow concept. And it sounds like, at least in Finch's analysis, it's being treated as a normative good. It also sounds like both Pollan and Finch are circling the functionalist versus essentialist debate.

Let's say, for the sake of argument, it turns out that the brain tunes in to some quantum-level forces for computation, and that this has other side effects that add to the mystery of what we call consciousness. Even then, it effectively changes nothing about this picture.

Humans, or animals in general, may be unique in how they accomplish consciousness, but it is unlikely that it's the only pathway. To put it another way, even if humans and animals are special in their method, it doesn't mean we are special in our result.

by speak_plainly

2/25/2026 at 12:35:00 PM

Can submarines swim?

by xnx

2/25/2026 at 1:49:03 PM

The ECREE idiom applies: "extraordinary claims require extraordinary evidence." Can AI, which today cannot spell blueberry, replace humans? Obviously no. AI is a ridiculous toy and a limited tool. Fun to play with, useful in narrow circumstances. In application it deletes your entire inbox. Like any over-engineered tool, it's also absurdly expensive, destined to be shelved next to the Juicero and the Presto Hotdogger.

by josefritzishere

2/25/2026 at 1:52:02 PM

> Claude spell "blueberry" for me.

  b-l-u-e-b-e-r-r-y                
> count the number of b's and r's in that word and tell me the result.

  - b's: 2 (positions 1 and 5)                                         
  - r's: 2 (positions 7 and 8)                                         
                                                                       
  Total: 4                                                             
WTF are you talking about? Perhaps by "today" you mean a really, really long time ago in technology terms.
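For what it's worth, the counts Claude reports can be verified with a couple of lines of plain Python (a sketch for checking the arithmetic, not anything from the original exchange):

```python
# Sanity-check the letter counts quoted in the transcript above.
word = "blueberry"

b_count = word.count("b")  # "b" appears at positions 1 and 5 (1-based)
r_count = word.count("r")  # "r" appears at positions 7 and 8 (1-based)

print(f"b's: {b_count}, r's: {r_count}, total: {b_count + r_count}")
# prints: b's: 2, r's: 2, total: 4
```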

by nineteen999

2/25/2026 at 5:02:41 PM

Extraordinary claims indeed:

> From there, he moves into the book’s finest passages, about feeling. Feeling, Pollan convincingly argues, actually precedes computation as a necessary condition of consciousness.

by karmakaze

2/25/2026 at 2:16:41 PM

lol. See you in the food line in a decade.

by K0balt

2/25/2026 at 2:46:34 PM

Implying that AI is going to make everyone not adopting it irrelevant is exactly why people resist it. You're not only participating in Roko's Basilisk, you're even shit-talking for it.

by alphawhisky

2/25/2026 at 11:12:12 PM

Actually, I don’t think it matters whether or not you adopt it. Or resist it. At this point I don’t see turning this bus around. Which is why, although I’d prefer to slow things down, I’m instead trying to make the inevitable disaster slightly better for humanity. But in doing so, I will probably accelerate things.

by K0balt

2/25/2026 at 5:01:05 PM

When someone describes things that make you unhappy, it doesn’t mean that they are responsible for the thing you don’t like. This is “shooting the messenger.”

by semiquaver

2/25/2026 at 4:04:12 PM

[flagged]

by selridge