4/26/2025 at 7:26:27 AM
An AI doesn't need to think in the same way humans think. It just needs to achieve results that are better than, or at least equal to, humans'. The same question was asked of chess "AI" in the past—that chess AI isn't thinking, it's "just" searching through all possibilities, etc. And yet, the result is that no human can beat chess AIs nowadays.
by chii
4/26/2025 at 2:43:17 PM
That an LLM does not need to think to produce the output we want seems fairly uncontroversial. However, a statement like “LLMs may think, just not in the same way humans think, to produce the output we want” is problematic. “The same way humans think” is the only kind of “think” that matters, for all intents and purposes. If we cannot define what it specifically is—because it loops us immediately back to the definition of consciousness et al.—the most precise definition of it will have to be along the lines of “the sort of thing that goes on in human minds”.
by strogonoff
4/26/2025 at 9:27:30 AM
"The question of whether computers can think is about as interesting as the question whether submarines can swim" - Dijkstra.by Scarblac