
What we have is:

For all consciousnesses that exist, they arose from a (complex) physical system.

What you are positing is, effectively:

Any physical system (of sufficient complexity) could potentially become conscious.

This is a classic logical error. It's basically the same error as "Socrates is a man. All men are mortal. Therefore, all men are Socrates."
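
To make the shapes of the two claims explicit, here is a rough first-order sketch (the predicate names are my own shorthand, not anything anyone in the thread committed to):

  Observed: ∀x [Conscious(x) → ∃y (Physical(y) ∧ Complex(y) ∧ AroseFrom(x, y))]
  Posited:  ∀y [(Physical(y) ∧ Complex(y)) → ◇∃x (Conscious(x) ∧ AroseFrom(x, y))]

The posited claim quantifies over the other side of the conditional (and adds a "possibly"), so it is not entailed by the observed one; that is the sense in which it reads like a converse error.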



It's actually not. The "sufficient complexity" part is doing all the heavy lifting: if a proposed physical system is not conscious, well, clearly it's not sufficiently complex.

Am I saying LLMs are sufficiently complex? No.

But can they be in principle? I mean, maybe? We'll have to see, right?

Saying something is possible is definitely an easier epistemic position to hold than saying something is impossible, I'll tell you that.


No, I'm not saying that. I'm saying that we can't trivially rule out the possibility that any sufficiently complex physical system is conscious.

You might think this is obvious, and I agree. Yet there are many people who argue that computers can never be conscious.



