5/3/2026 at 1:35:41 AM
We don't even know what the prerequisites for consciousness are, so we have no way of knowing. LLMs have emergent behavior that is reminiscent of language-forming brains, but they're also missing a lot of properties that are probably necessary: mainly continuity over time, more integrated memory, and a better sense of space and time. Brains use the rhythm and timing of neuronal firings, and the length of axons affects computation; they do a lot of different things with signals and patterns. But in any case, without knowing what consciousness is, I don't know which of those things are required.
by tracerbulletx
5/3/2026 at 10:23:47 PM
For consciousness, what is required is a 'master model' that is trained on grounded experience over time to understand, based on its goals, which inputs to focus on. Without this, even an LLM with continuity over time could not even understand what to keep in its context window, let alone what questions to answer or actions to take. You need the entire thing for consciousness; you cannot fake it, because what an LLM does now is simply instantiate a new 'self' every time it is asked a question, or at least within a context window. Humans arguably do that every morning when they wake up, but they have this master model that is trained throughout their lifetime on what is and isn't important to their goals.
by jaybrendansmith
5/3/2026 at 10:51:51 AM
> LLMs have emergent behavior that is reminiscent of language forming brains
Indeed, but then we need to prove that they are not "Chinese box" conscious. Which is hard, because it might be that the thing running the Chinese box is conscious, but can only communicate in a way it doesn't understand.
by KaiserPro
5/3/2026 at 10:41:52 PM
Reference: Philosopher John Searle's influential "Chinese Room" thought experiment: https://en.wikipedia.org/wiki/Chinese_room
by mrandish
5/3/2026 at 7:30:10 AM
> We don't even know what the pre-requisites for consciousness are so we have no way of knowing.
Imo we don't even have a definition of the word that we agree on.
by boxed
5/3/2026 at 10:52:47 PM
> we don't even have a definition of the word that we agree on.
Indeed, for any in-depth discussion of LLMs and consciousness to be productive, clearly defining terms and scope is essential. The Stanford Encyclopedia of Philosophy is an excellent resource: https://plato.stanford.edu/entries/consciousness/
by mrandish
5/3/2026 at 8:15:04 AM
Ability to feel pain or pleasure is a good indicator, I think.
by qsera
5/3/2026 at 9:25:46 AM
That would be the physically embodied definition. Which is a useful starting point, because clearly our consciousness is physically embodied, while an LLM's isn't.
This matters more than it seems, because we're not calculators, and we're not just brains. There are proven links between mental and emotional states and, for example, the gut biome.
https://www.nature.com/articles/s41598-020-77673-z
There's a huge amount going on before we even get to the language parts.
As for Dawkins: as someone on Twitter pointed out, the man who devoted his life to telling believers in sky fairies that they were idiots has now persuaded himself there's a genie living inside a data centre, because it tells him he's smart.
If he'd actually understood critical thinking instead of writing popular books about it he wouldn't be doing this.
by TheOtherHobbes
5/3/2026 at 12:57:28 PM
First of all: arguing about the details of a thing that actually exists is enormously different from arguing about the details of a thing that does NOT exist.
As for your dig at Dawkins, I just read https://archive.ph/Rq5bw which I assume you're referring to. Notice how he never defines "conscious"; he seems to use it as equivalent to "can process data logically", which is not at all how I would define the word. And by that definition, clearly Claude is conscious. I wouldn't use that definition though.
It ALWAYS comes back to the fact that people argue about what consciousness is and never define what they mean. Sam Harris defines it as subjective experience, which is afaik impossible to measure in any way so you can just assume rocks are conscious and move on. I personally like Julian Jaynes' definition.
You assumed YOUR definition and judged Dawkins without first comparing definitions. I think that's showing your problem with critical thinking in this case, not his.
by boxed
5/3/2026 at 3:39:51 PM
I honestly don't see how Dawkins is so confused. Claude says it can't tell if it has any kind of inner life. Can you imagine a human saying that?
by amanaplanacanal
5/3/2026 at 11:15:42 PM
> Claude says it can't tell if it has any kind of inner life.
I don't see how some people apparently believe the text output of an LLM about its internal mental state is anything other than a plausible fabrication based on what its training data already says about the mental states of LLMs. These are systems specifically designed and iteratively optimized over millions of training generations to generate text output which plausibly simulates what a composite human would say in response to the same input. There is no human-like internal mental state it can reflect on, so all such responses are, by definition, plausible hallucinations based on interpolated training data.
> Can you imagine a human saying that?
Some people do say that: see Aphantasia and, specifically, Anauralia https://en.wikipedia.org/wiki/Aphantasia
by mrandish
5/3/2026 at 11:31:48 PM
Sure, I have at least mild aphantasia, but I still have thoughts, emotions, daydreams, fantasies, plans, etc. That's an inner life. That's not what Claude said in the quote.
by amanaplanacanal
5/4/2026 at 12:01:04 AM
I think one of the heaviest weights factoring into Claude's statistically hallucinated response to that particular introspective question is the guard rails Anthropic's safety team has coded into it: specifically, to always be clear about its nature and not act too human-like. This is largely to reduce the likelihood of humans developing AI attachment and AI psychosis.
Just out of curiosity, I've regularly asked similar introspective questions ever since the first publicly available LLMs, and the tone of the answers has clearly shifted; it's not because "the LLMs got more self-aware". It's obvious they are being externally tuned. And, no, I've never believed anything LLMs say about their own internal state to be anything more than statistically plausible hallucinations filtered through externally-imposed behavioral safety rules. I do it as a way to glean a little insight into the evolution of the opaque rules vendors impose on their LLMs. I still find it bizarre when otherwise savvy tech people who actually know (or should know) how LLMs really work somehow lose the plot and post "look what the LLM thinks!"
by mrandish
5/3/2026 at 11:50:25 PM
Aphantasia and anauralia have nothing to do with having an “inner life”. I have total aphantasia and at least partial anauralia, but I have conscious awareness, thoughts, dreams, and so on.
Neither condition changes whether a person has a conscious experience of the external world.
You can think of aphantasia and anauralia as affecting the experience of what a person’s inner life is like. It’s sort of like saying you don’t have a TV or stereo system in your house, but that doesn’t mean you don’t live there, or that you can't see or hear things outside.
by antonvs
5/3/2026 at 5:26:39 PM
Again: you haven't defined what you mean by the word. Dawkins didn't either. It's absolute nonsense without the definition.
by boxed
5/3/2026 at 6:34:49 PM
He was talking in the context of the Turing test, and there is a clear difference between the way Claude answers and the way a human would answer. So the Turing test hasn't been passed. It's like he is trying to convince himself for some reason.
by amanaplanacanal
5/3/2026 at 11:59:37 PM
That’s misleading, because the reason Claude answers that way is almost certainly due to reinforcement learning that deliberately prevents models from claiming they’re conscious.
That’s not a valid reason for saying they fail the Turing test. By most normal standards, they can definitely pass the Turing test. See e.g. https://arxiv.org/abs/2503.23674
by antonvs
5/3/2026 at 11:56:01 PM
There’s an entire philosophical literature around that, which is generally taken for granted when discussing consciousness. A good starting point is Thomas Nagel’s “What is it like to be a bat?”. The soundbitey version of his definition is that “there is something it is like” to be conscious, i.e. it involves a subjective experience, whereas for example there is nothing it is like (most people presume) to be a rock, or say an ordinary computer.
https://www.sas.upenn.edu/~cavitch/pdf-library/Nagel_Bat.pdf
by antonvs
5/4/2026 at 12:27:14 AM
Sure. But it's super obvious from context that different speakers do NOT agree on any of that.
by boxed
5/4/2026 at 12:41:07 AM
If the notion of consciousness they're referring to doesn't meet the normal philosophical criteria, then they're essentially just wrong. Which is quite possible; many people seem very confused on the subject, which is not too surprising, especially for scientists who essentially reject philosophy, like Dawkins.
by antonvs
5/4/2026 at 12:06:43 PM
Philosophy doesn't own words afaik. Words have different meanings in different domains.
> many people seem very confused on the subject, which is not too surprising, especially for scientists who essentially reject philosophy, like Dawkins
Or they just use words from a different domain and you didn't notice, and now you're angry because what they said didn't make sense. Come now, surely philosophy must handle such trivial cases of basic linguistic knowledge? If not, I'm gonna have to reject philosophy too, since it'd be trying to reject a much harder science (linguistics).
by boxed
5/3/2026 at 11:50:10 AM
What about single-celled or microscopic multi-cellular life forms? They can sense positive and negative aspects of their surroundings and move toward/away from said aspects. I don't think most would include them as conscious despite this directed behavior.
by Dumblydorr
5/3/2026 at 10:04:06 AM
There are times I am feeling neither pain nor pleasure, but I am still experiencing consciousness. So that definition seems to fail immediately.
And how do you even measure pain? Is it painful for an LLM to be reprimanded after generating a reply the user doesn't like? It seems to act like it.
by Jtarii
5/3/2026 at 10:13:06 AM
>There are times I am feeling neither pain nor pleasure
It is about the ability.
by qsera
5/3/2026 at 10:32:26 AM
I guess that just seems like an incredibly arbitrary criterion. Why would the potential for pleasure in the future determine whether I am currently conscious, even if I am not in fact experiencing pleasure?
by Jtarii
5/3/2026 at 6:55:55 PM
The answer is in your question. You said you are "experiencing consciousness". So you are feeling something, and thus you have consciousness. In other words, it does not have to be pleasure or pain. The ability to "feel" is where it is at.
by qsera
5/3/2026 at 8:27:47 AM
And how do you define pain and pleasure? Do insects feel pain?
by echoangle
5/3/2026 at 8:59:50 AM
> Do insects feel pain?
Yes, I think so, because they show behavior that is consistent with being in a state of pain.
Whatever consciousness really is, I think evolution found a way to tap into it, by causing pain, or by registering pain on the consciousness through some unknown mechanism, for behaviors that are not beneficial to the organism that hosts the respective consciousness...
So I think if an organism that evolved here can display pain behavior, then it should really feel pain.
by qsera
5/3/2026 at 9:12:21 AM
So if a robot + AI shows behavior consistent with pain, we can conclude it's conscious?
by ako
5/3/2026 at 9:11:09 AM
So if I build a simulation with robots living in a world and apply an evolutionary algorithm, and at some point the virtual robots respond to damage in a way that looks like pain in animals, would the simulated robots be conscious? Or is it impossible that this could happen?
by echoangle
5/3/2026 at 9:44:53 AM
In my comment, we already assume that we (humans) are conscious and that we are the result of evolution. So the question was only whether something else that evolved similarly was conscious the way we are.
So to match that, your hypothetical scenario should involve robots that already have consciousness within them, and the question would be whether their evolution had managed to tap into that built-in consciousness and ability to feel, and cause them to behave in one way or another.
by qsera
5/3/2026 at 10:54:18 AM
See, this definition sucks, because even GPT-3 could display _signs_ of pleasure and pain. For that matter, so do characters in video games.
by StilesCrisis
5/3/2026 at 8:37:51 AM
> And how do you define pain and pleasure?
They're not reducible, but I don't know if that means we don't have definitions; we can describe them well enough that most people (who aren't p-zombies or playing the sceptical philosopher role) know pretty well what we mean. All of our definitions have to bottom out somewhere...
> Do insects feel pain?
Nobody (except the insects) can know for sure. Our inability to know whether X is true doesn't imply X is meaningless, though.
by retsibsi
5/3/2026 at 9:09:13 AM
But how can X be a good indicator for something I want to determine if I can’t measure X either?
by echoangle
5/3/2026 at 9:18:10 AM
> But how can X be a good indicator for something I want to determine if I can’t measure X either?
In the comment that started this subthread, qsera was responding to someone who said "Imo we don't even have a definition of [consciousness]". If qsera meant that we can measure consciousness in terms of pleasure and pain, then of course I agree that they were just pushing the problem back a step. But I don't think that's what they meant.
by retsibsi
5/3/2026 at 11:39:18 PM
Is the following program conscious?
if pain = true then say ouch else say yay
by antonvs
5/3/2026 at 12:32:15 PM
Now you have to define pleasure AND pain without using the word "consciousness", as that would be circular logic.
Is pleasure then any reward function? Then a mathematical set of equations performed by a human writing on a piece of paper would qualify. Does that mean pen and paper are conscious? Or certain equations?
by boxed
5/3/2026 at 2:05:34 PM
>Now you have to define pleasure AND pain without using the word "consciousness" as that would be circular logic.
Yes, so consciousness is inextricably tied to the ability to feel. In fact, I think consciousness is the ability to feel.
Hence, even asking the question "Are LLMs conscious?" is absurd. It is not at all about intelligent behavior. That is what I think.
by qsera
5/3/2026 at 2:52:52 PM
> In fact, I think consciousness is the ability to feel.
Just having senses is enough? So a thermometer or a camera is conscious?
by boxed
5/3/2026 at 7:52:37 AM
We're pretty clear on the distinction between a conscious and an unconscious human. We might not clearly understand the difference between the two states, but we can certainly point to it and go "it's that".
by pydry
5/3/2026 at 7:56:42 AM
I'm not sure it's that clear. What about a person who is on drugs to the point that they clearly don't know what is happening around them, but they are able to speak and move and such? I'm not sure I'd call that conscious, but by most definitions it is.
by freedomben
5/3/2026 at 10:10:18 AM
You would just say that they have an altered experience of consciousness from the norm.
by Jtarii
5/3/2026 at 9:21:37 AM
Indeed; doing a first aid course, it was pointed out to us that sleeping is different from being unconscious. You can wake someone from sleep pretty quickly. You can't bring an unconscious person back in the same way.
by collyw
5/3/2026 at 10:08:04 AM
>We're pretty clear on the distinction between a conscious and an unconscious human.
You are using unconscious as a synonym for asleep, which is not the same thing as having no conscious experience, due to dreams. We are clear on the distinction between a dead human and an alive human, however.
by Jtarii
5/3/2026 at 2:06:55 PM
Unconsciousness is not the same thing as sleeping.
I'm not sure where sleeping lies, but it's probably somewhere between consciousness and unconsciousness, depending on which phase of sleep you are in and perhaps whether you are lucid dreaming.
Which is to say, this is still a mystery, but it isn't a definitional problem; it's a regular old scientific mystery.
by pydry
5/3/2026 at 7:59:08 AM
Now discuss whether a bonobo, a dog, a cat, a mouse, an ant, or a bacterium is conscious.
And you'll find it's not as clear-cut.
by agnosticmantis
by agnosticmantis
5/3/2026 at 3:43:56 PM
I'm pretty sure the mammals are conscious the same way I am, in that they experience qualia the same way I do. Insects and bacteria, I suspect not, but how could I tell?
There is no way to prove that other humans experience consciousness, really.
by amanaplanacanal
5/3/2026 at 12:59:38 PM
Those terms are not really how we use the word "conscious" in any other situation though. With a definition like that you would say a rock is unconscious (I guess reasonable), a pretty cold bacterium is unconscious (hmm.. ok I guess?), and a warm bacterium is conscious (now I'm not on board anymore).
We have to be WAY more specific about what the word even means!
by boxed
5/3/2026 at 2:03:50 PM
I don't see a problem here. A warm bacterium is no more conscious than I am when I've been knocked out. Bacteria are alive, but they aren't ever conscious.
by pydry
5/3/2026 at 2:56:39 PM
With your definition they clearly are. They move around, they respond to their environment, and they take decisive actions when needed. If a human does that, they are absolutely "conscious" if you only mean it in the sense of conscious/unconscious.
If you define bacteria as never conscious, you should be able to come up with a definition of the word that doesn't accidentally make them conscious, without just arbitrarily adding "oh, but not bacteria" at some point.
I'll state it again: DEFINE THE WORD. People just argue and scream at each other and no one defined their terms. It's absolute madness to us who see that this is what happens. It's like arguing over the color of the sky and using the word "fnord" and no side has defined the frequency of light that "fnord" should correspond to. BOTH sides are wrong in that situation, because they both don't define the word.
by boxed
5/3/2026 at 5:15:58 PM
>With your definition they clearly are.
No, absolutely not. My definition was exclusively defined in terms of a human phenomenon.
>I'll state it again: DEFINE THE WORD
Instead of repeating yourself, reread what I initially wrote. I think you missed more than it being scoped exclusively to humans.
by pydry
5/3/2026 at 5:29:06 PM
> My definition was exclusively defined in terms of a human phenomenon
Well, that's a horrible definition. You put into the DEFINITION that ONLY humans can be conscious?
> Instead of repeating yourself reread again what I initially wrote.
The problem is that you were only talking about a very narrow English expression, and then just insinuating that this had some implication which you then didn't define.
by boxed
5/3/2026 at 7:12:21 AM
Clive Wearing's memory lasts for less than 30 seconds, so he has no memory of being awake before now. He is permanently in a state of feeling like he has just woken up, observing his surroundings for the first time.
Clive Wearing's mind has no time continuity and basically zero memory integration. Is he not conscious? There are interviews with the guy.
Where on the scale [No mind <-> Clive Wearing <-> Healthy human brain] would you put an LLM with a 10M token context window?
by throwuxiytayq