
> I agree that mimicking feeling behavior isn't the same as having feelings

The problem is that, as an external observer, you can only observe behavior. A machine retracts its arm because its sensor detects pressure. How do you know whether it feels pain? Another entity's mind is terra incognita.



My answer to this kind of thought experiment is that you can only know by asking the thing doing the experiencing. Is the robot feeling pain? I dunno, ask it. This is, I think, why we don't attribute "feels pain" to cows or fish or the other animals we consume. Or at least we say "that thing does not feel pain the way my child feels pain," which lets us treat them not as fellow Earthlings but as lower life forms. No, cows do not have civilizations, and I am a fervent meat-eater. But this is my roundabout way of saying that the first "truly human" (i.e. generalized/strong) artificial intelligence will, first and foremost, have an immaculate ability to communicate with humans on every level. Without that perfected capability, it will always be just a smart robot to other human beings.


The problem is that I do think cows feel pain (I thought most people did? I'm not a meat eater, though), and I don't care how smart the cow is (unless intelligence is somehow required to appreciate pain). A cow-level AI might feel pain yet be unable to communicate it.



