My concerns about fusing human and machine aren't necessarily because I question the supportive nature of humanity. It's more that there is no telling what a brain would do with that much power. It's not implausible that such a being might dissociate from the human race altogether. Furthermore, innate urges such as survival and reproduction would still be strongly intact, probably far more dominant than any idealistic thoughts floating around in the frontal lobe, hence posing a threat of the classic "robot takeover."

Comparatively, teaching/programming a system to have morality seems like a safer bet than giving the reins to a being whose psyche was sculpted by evolution.
I'd certainly agree that we wouldn't want the most powerful system in the world to have originated as an uploaded human, with all their cognitive biases and undocumented value functions, rather than as an intentionally designed system.