> Well, my personal opinion is with the rise of neural nets we've basically proven that "consciousness" is an illusion and there is nothing there to find.
I keep seeing this claim being made but I never understand what people mean by it. Do you mean that the colors we see, the sounds we hear, the tastes, smells, feels, emotions, dreams, inner dialog are all illusions? Isn't an illusion an experience? You're saying that experience itself is an illusion and there is nothing to experience.
I can't make sense of that. At any rate, I see no reason to suppose LLMs have experiences. They don't have bodies, so what would they be experiencing? When you say an LLM is identical to a person, I can't make good sense of that either. There's a thousand things people do that language models don't. Just the simple fact that I have to eat on a regular basis to survive is meaningful in a way that it can't be for a language model.
If an LLM generates text about preparing a certain meal because it's hungry, I know that's not true in a way it can be true of a human. So right away, there are reasons we say things that go beyond the statistical black-box reasoning of an LLM. They don't have any bodies to attend to.
I agree, it almost seems like some kind of coping mechanism for the fact that, after all this ability to get computers to generate art and coherent sentences, we're still completely none the wiser about understanding objective reality and consciousness, or even about knowing how to really enjoy the gift of having the experience of consciousness. So instead people create these kinds of "cop-outs".
Mechanical drawing machines have existed forever. I loved, loved, loved them when I was a kid, and I used to hang the generated images on my wall. Never once did I look at those machines, which could draw some pretty freaking awesome abstract art, and think to myself, "well that's it, the veil of consciousness is so thin now, it's all an illusion", or "the machine is conscious".
As impressive as some of these models are at generating art, they are still drawing machines. They display the same amount of consciousness as a mechanical drawing machine.
I saw someone on Twitter ask ChatGPT-4 to draw a normal image. You know what it drew? A picture of a suburban neighborhood. Why would a conscious drawing machine do that?
The "it's an illusion" part is a piece of rhetorically toxic language that usually comes up in these discussions, as its a bit provocative. But it's equally anthropocentric to say that someone without a body can't have a conscious experience. When you're dreaming your body is shut off - but you can still be conscious (lucid dreaming or not). You can even have conscious experiences without most of your brain - when you hit your little toe on something, you have a few seconds of terror and pain that surely doesn't require most of your brain areas to experience. In fact, you can argue you won't even need your brain. Is that not a conscious experience? (I'm not really trying to argue against you, I just find this boundary interesting between what you'd call conscious experience and not)