
No, this is totally missing the point again.

It is not a question of what the sensor detects. It is a question of how it is possible for there to be an experience when sensing occurs.

Your introspection is simply pointing out the likely nature of what you experience, and I actually agree (tentatively) with the idea that most of our conscious experience is rooted in a self-reflecting system. But none of that has any bearing on how there can be any experience at all.



What you call “experience” is, for me, just the sensing of internal information processing, of internal representations. This may need some dedicated introspection to fully realize. You’re making a distinction which I believe is a mirage. It’s just a special attribution we make in our minds to those inner perceptions. If you look closely, it vanishes.

Think about it: How do you know that you have what you call an “experience”? It’s because you perceive the having of the experience. So, at some point, the perception of having an “experience” is something that enters as an input into your cognitive process, and you match it to some mental models you have about such inputs.

I adjusted my mental model to think of those “qualia” perceptions as being the sensing of parts of the internal workings of my brain. It’s a side-effect of all the processing that is going on, if you will, and of the likely fact that the sensing of some subset of the processing steps is being fed back as input into the cognitive processing.


> "just sensing of ..."

We know that physical systems can sense phenomena in the world. We doubt that they experience anything when they do so. Even if we create a Hofstadterian "strange loop" so that we sense our own sensing, that does not give rise to "experience".

I concede that it could be a matter of scale, but a photosensor that knows that it glows yellow when it senses red does not have an experience.


I dispute that "experience" is something that requires a special explanation. The feeling of what you call "experience" is just something that you sense inside your brain. It is just content, data, like thoughts and other perceptions are as well. You are thinking about it. You are thinking about how you feel about it. You have perceptions about your thinking of how you feel about it. Is your cognition able to process anything that isn't (mere) information/data? I don't believe mine is. What would that even mean?

Yes, a photosensor doesn't have that kind of sensation, and that's because there is nothing going on inside of it that would correspond to that sensation (no neural correlate, so to speak). In the human brain there is enough going on (the complexity and quantity are staggering) that our range of inner experience is easily representable by it. And "experiencing" merely means that we are processing those representations in a way that enters our cognition, like "mere" perceptions do as well.

It seems that you are thinking that perceptions lead to experience, and then nothing else. My view is that perceptions lead to internal processing that in turn is itself perceived by our cognition, in addition to the original perceptions. And it's the texture of this perceived processing that we call "experiencing". That is, it doesn't stop at the "experience" step, because then we wouldn't know anything about it. Instead, the information about the experience then enters our cognitive processing as a subsequent step. And that's how we don't perceive just "red" full stop, but also an associated "experience" of the red. But again, we wouldn't know about that experience if that wasn't information that enters our mind. And I see no puzzle about such information entering our mind.


    It seems that you are thinking that perceptions lead 
    to experience, and then nothing else.

    My view is that perceptions lead to internal processing 
    that in turn is itself perceived by our cognition, in 
    addition to the original perceptions
By this definition, which I don't necessarily disagree with, wouldn't a single level of reflection/introspection have to qualify as consciousness?

If we decide that a simple light sensor isn't "conscious" because it's a pure input/output machine with no machinery that could reasonably be described as "internal processing", then what about a Roomba?


I wouldn’t describe it as a single level, because there is some recursiveness involved, and different parts of an experience may involve a different “strength” (like individual tangled weeds have different strengths) or a different number of levels. I think of it as having an organic shape.

I also don’t think consciousness is black and white; there is certainly a spectrum. To us, qualia have complex, multi-faceted and multi-layered textures, in line with what we can cognitively perceive and process, and they trigger just-as-complex associations and emotions. It seems to me that this richness is a large part of what makes consciousness wondrous.

I don’t think a Roomba is observing its own internal processing. A program that debugs/profiles its own execution and uses those data as inputs to its main functions, or a neural network that feeds back the changing weights of its edges as additional inputs for itself, would maybe come closer to raising that question.
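
To make that concrete, here is a minimal toy sketch (all names hypothetical, purely illustrative) of the first idea: a program whose output depends not only on its external input but also on measurements of its own previous processing, which it feeds back to itself each step.

```python
import time

class SelfMonitoringAgent:
    """Toy agent whose processing takes its own execution metrics as input."""

    def __init__(self):
        # "Introspective" state: measurements of the agent's own prior processing.
        self.introspection = {"last_duration": 0.0, "steps": 0}

    def step(self, external_input):
        start = time.perf_counter()
        # The input to process() includes a reading of the agent's own
        # internal state, not just the external signal.
        output = self.process(external_input, dict(self.introspection))
        # Feed metrics about this step back in, for use on the next step.
        self.introspection["last_duration"] = time.perf_counter() - start
        self.introspection["steps"] += 1
        return output

    def process(self, external, internal):
        # Stand-in for real computation: the result depends on both the
        # external input and the sensed internal state.
        return external * (1 + internal["steps"])

agent = SelfMonitoringAgent()
print(agent.step(2))  # step 0: 2 * (1 + 0) = 2
print(agent.step(2))  # step 1: 2 * (1 + 1) = 4
```

Of course, nothing here suggests such a loop is conscious; it only illustrates the structural difference from a pure input/output device like a light sensor.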

As an analogy, does a simple light sensor qualify as “seeing”? “Seeing” on the level of a mammal, with object and motion recognition and building an internal spatial model of what it sees, along with predicting what will happen in the next moments based on what it sees, is certainly on a whole other level.



