
I want to open a side thread about your definitions or descriptions of what "consciousness" is. After reading all the comments, I think that could be pretty interesting, and there's a lot of knowledge hidden here that we could pull together.

Some things that I understood, in my words:

- consciousness is probably not reducible to smaller, non-conscious parts of which it is composed. You could maybe say it is intrinsically holistic

- consciousness entails being aware of or observing qualities that hard science tells us things don't actually have (green vs. the wavelength of light); but "being aware of" and "observing" are so closely related to consciousness that this may not be very informative

- consciousness can't be detected from the outside for now, and probably can't be in principle, given the structure of the process. It is "inner" in a very peculiar sense (everything else is outer and can't get in, except as representation)



The fact that a conscious mind loses capabilities when brain damage occurs shows quite clearly that consciousness as a process is reducible to smaller, non-conscious parts, though.

There's also an inherent problem in assuming that the human experience of, say, "green" is consistent. What I actually see when I see the colour green only needs to be consistent with the physical behaviour of light. Whether any two people really see colours the same way is highly questionable.


I think there's a consensus that you don't assume the human experience of "green" is consistent, only that people have such an experience. We can possibly try to "align" those experiences through communication and by referring to a shared real world. An interesting experiment scenario for that is communication between a person with common trichromatic sight, a person with a tetrachromatic retina, and someone with partial color blindness: their experiences of "green" are not only inconsistent but likely incompatible, with no way to align them.
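
A toy numerical sketch of why those experiences resist alignment (everything in it is made up for illustration; the Gaussian cone sensitivities are not real colorimetry): the same physical light collapses into response vectors of different dimensionality for each observer, and there is no canonical mapping between a 2-, 3-, and 4-dimensional representation.

    # Toy sketch, not real colorimetry: hypothetical Gaussian cone sensitivities,
    # just to show that one and the same light ends up as internal vectors of
    # different dimensionality for different observers.
    import numpy as np

    wavelengths = np.linspace(400, 700, 301)   # visible range, nm
    dw = wavelengths[1] - wavelengths[0]

    def cone(peak_nm, width_nm=40.0):
        """Hypothetical Gaussian sensitivity curve for one cone type."""
        return np.exp(-((wavelengths - peak_nm) ** 2) / (2 * width_nm ** 2))

    observers = {
        "partial color blindness (2 cone types)": [cone(440), cone(560)],
        "trichromat (3 cone types)": [cone(440), cone(530), cone(560)],
        "tetrachromat (4 cone types)": [cone(440), cone(500), cone(530), cone(560)],
    }

    # One and the same "green-ish" light: a narrow band around 520 nm.
    spectrum = np.exp(-((wavelengths - 520) ** 2) / (2 * 15.0 ** 2))

    for name, cones in observers.items():
        # The observer's internal representation is just this response vector;
        # vectors of different dimensionality have no canonical mapping between them.
        response = [round(float(np.sum(spectrum * c) * dw), 1) for c in cones]
        print(name, response)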


> The fact that a conscious mind loses capabilities when brain damage occurs shows quite clearly that consciousness as a process is reducible to smaller, non-conscious parts, though.

https://en.m.wikipedia.org/wiki/Necessity_and_sufficiency


> The fact that a conscious mind loses capabilities when brain damage occurs shows quite clearly that consciousness as a process is reducible to smaller, non-conscious parts, though.

This does not follow.


Ability to perceive your own thoughts. Access to your own debug logs.

I think this is distinct from the "ability to perceive green, which doesn't exist". A neural network trained to distinguish green in the output of a spectroscope will perceive green without ever knowing that anything like a "neural network" or "thoughts" exists.
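
For concreteness, here is a minimal sketch of that idea (everything in it is an illustrative assumption: the spectrometer binning, the wavelength band labelled "green", and the single-unit network): a logistic unit trained on raw spectrometer samples ends up with a workable notion of "green" while containing no representation of itself or of "thoughts".

    # Minimal sketch: a single logistic unit learns "green or not?" from raw
    # spectrometer bins. All details (binning, labels, training setup) are
    # illustrative assumptions, not a claim about any particular system.
    import numpy as np

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(400, 700, 61)   # coarse spectrometer bins, nm

    def random_spectrum():
        """A random narrow-band light source; returns (spectrum, peak wavelength)."""
        peak = rng.uniform(400, 700)
        spec = np.exp(-((wavelengths - peak) ** 2) / (2 * 20.0 ** 2))
        return spec, peak

    # Labelled dataset: call a light "green" if its peak falls roughly in 495-570 nm.
    X, y = [], []
    for _ in range(2000):
        spec, peak = random_spectrum()
        X.append(spec)
        y.append(1.0 if 495 <= peak <= 570 else 0.0)
    X, y = np.array(X), np.array(y)

    # One logistic unit trained by full-batch gradient descent.
    w = np.zeros(len(wavelengths))
    b = 0.0
    lr = 0.5
    for _ in range(2000):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability of "green"
        grad = p - y                              # gradient of the cross-entropy loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()

    test_spec, test_peak = random_spectrum()
    p_green = 1.0 / (1.0 + np.exp(-(test_spec @ w + b)))
    print(f"peak {test_peak:.0f} nm -> P(green) = {p_green:.2f}")

The learned weights encode "green" purely as a pattern over spectrometer bins; nothing in the model refers to the model itself.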

Also, this probably can be detected from the outside via debugging. What cannot be detected may be what distinguishes a hypothetical "philosophical zombie" from a truly conscious human, but I don't think anything like a philosophical zombie can exist: if it is physically identical to a human, it will also think identically.

As a next step, you may observe humans around you and realize that the thoughts you perceive seem to be running inside the head of one of these humans (which you will call "me"). However, I don't think knowing what you look like from the outside is necessary for consciousness.



