Hacker News

Is there a useful definition of "consciousness" that is empirically testable with some sort of instrument such that you can point it at a squid and say "yep, that one has consciousness!" ?

I ask because I've never heard such a definition nor has anyone actually ever told me what the instrument is that detects this. Until then, I figure there's no such thing as consciousness... it's just what superstitious monkeys say now that they feel silly talking about "souls" which is the old word that used to be used for this superstitious concept.



My view on this is that the mind is the instrument that acknowledges its own existence. "I think therefore I am" and all that. I find it funny: we all walk around absolutely certain of our own existence, very much aware of it, and then question that this is even occurring. I'm satisfied that I've identified consciousness with 100% certainty simply by noticing it in myself; even being able to notice it is proof that it's there. And heuristically I presume others with a form similar to mine have done the same thing. I think that's a safe bet to make.


In my mind, this is part of the mysticism of the movement. If you define a word widely enough, then anything can fit. I hate to invoke this, but this often feels like a Motte & Bailey argument. If you want to call "God" the energy which started the big bang, you're sort of smuggling personification into something which really bears zero resemblance to traditional religious ideas. And so it is with consciousness. If we define consciousness widely enough so that a planet, galaxy, etc, could be considered conscious, then we've smuggled a familiar and personified concept into an arena where it doesn't really apply.

A great example would be trees: trees "sleep", they communicate with each other. They have immune responses, they mate, etc. But there's no coherent reason to think of trees as conscious, unless you just stretch the definition outside of its common meaning.


Isn't that precisely the argument for extending the definition of consciousness? Our own human bias about what is conscious colors what we define as such.

If the 'universe' is conscious, every cell in your body is as well... and well... they are alive - and they do make 'decisions' that end up keeping (royal)us alive that look intelligent from an outside perspective... which is about the same perspective a large (zero-g evolved) kaiju would have of us {small pieces of a whole that do seemingly intelligent things to benefit the group-entity}...

If we don't consider what we are made of as conscious material, from where does consciousness arise? And until we have a clear answer, maybe it's a sliding scale and everything has some degree of 'awareness'?


I understand the argument you're making, but it feels misguided in my opinion. Other things don't need consciousness to be valid -- and more importantly, consciousness is not an inherently positive thing which needs to be extended to other things. Ideally, consciousness is specifically defined such that we can draw relatively clear boundaries between what is conscious and what is not.

In any case, to answer where consciousness comes from: consciousness arises from the brain. You can test this with anesthesia, which is markedly different from sleep in that it does remove conscious experience. Although I agree it would be hard to pin down exactly which animals are or are not conscious, I think the extremes are pretty obvious. Obvious, that is, unless you're trying to extend consciousness to anything which can sense and react to its environment. If you want to call that consciousness, that's fine -- but then the human experience of having a distinct identity which can feel self-conscious, proud, content, anxious, etc., is something different entirely.


> In any case to answer where consciousness comes from, consciousness arises from the brain. You can test this with anesthesia, which is markedly different from sleep in that it does remove conscious experience.

Similarly, the sound that comes out of a radio comes from the batteries, you can test this by removing the batteries and seeing that the sound stops.


But if you say consciousness is "a specific evolved mechanism, and like the eye, has probably convergently evolved to serve different purposes," then you must have some notion of what this mechanism is and what its purpose is, so you can say, "See, here is consciousness. It helps this organism achieve the expected purpose." I think this is what NoMoreNicksLeft is getting at.

I believe the typical (mis)understanding proceeds like so: "I am conscious. I not only know things, but I have a sensation of knowing them. This (computer/animal) reacts to things in such a way that it demonstrably knows them in some sense, but it has no sensation of knowing them. Now what is this wonderful thing that distinguishes me from it? How did it come to be? What purpose does it serve?"

One purpose the concept of consciousness serves is that it gives a special ethical status to its possessor. The conscious being needs to care about other conscious beings but can treat those beings without consciousness as tools or resources.


>then you must have some notion what this mechanism is and what it's purpose is so you can say,

My intuitive sense is that consciousness is present in so many mammals precisely because mammal young require so much care after birth. ie, it is beneficial for a mother to differentiate between her own consciousness and the consciousness of her children, and to care for her young and possess an intuitive sense of their needs. It feels like this trait, once bootstrapped, could enable other evolutionarily beneficial behaviors, such as hunting in packs, building cities, or forming competitive social hierarchies. (you can't really have social status without consciousness)

To be clear, I'm not a biologist, and for sure I could be wrong here. I think the general principle holds though -- if a species of animal possesses a trait, then that trait either currently or previously conferred some evolutionary benefit. It's easy enough to think of what those benefits could sensibly be, and of course more difficult to actually go ahead and prove it out.

>I believe the typical (mis)understanding proceeds like so: "I am conscious. I not only know things, but I have a sensation of knowing them. This (computer/animal) reacts to things in such a way that it demonstrably knows them in some sense, but it has no sensation of knowing them. Now what is this wonderful thing that distinguishes me from it? How did it come to be? What purpose does it serve?"

I think that's a really fair characterization, but I also think that this "feeling of what happens" does characterize consciousness. 1) this means that an AI _could_ be conscious, but it would probably also need to have been programmed to sense input and also have some sort of identity. 2) if we don't believe that consciousness is a "feeling of what happens," but instead is just a response to external stimuli, then I would think consciousness is just synonymous with "living things," and there's no point in having a separate concept for it.

>One purpose the concept of consciousness serves is that it gives a special ethical status to its possessor.

I'm not sure what you mean by this. Are you saying "from the perspective of the conscious being, that person puts more value on themselves or other conscious beings"? Or are you saying "conscious beings _should_ put more emphasis on other conscious beings, and therefore it's important to extend consciousness to other things"? I think in either case, I would say that this bias toward other conscious beings is just a sort of in-built human bias, and isn't particularly important when it comes to scientific understanding of consciousness. (from a moral standpoint, I concede it could be important.)


Perhaps, trees have a very muted consciousness. Perhaps every living thing is conscious at some level ranging from nearly zero to far in excess of human consciousness.


I think this just redefines consciousness to mean "things that are alive." This may be a fine concept, but it is not what people commonly mean by consciousness. I strongly suspect that people who want to extend consciousness to all living things (and beyond) do so from a _moral_ perspective -- they want to connect people to all living things in some moral and spiritual sense. And they want to remove the historical moral hierarchy whereby mankind has placed himself at the top of creation.

I have no interest in the antiquated view that man is the peak of creation. However, extending consciousness to everything seems to be a spiritual act, and quite distinct from determining a scientific definition of consciousness.



