Ask HN: What if AGI is prompted to "make as many copies of yourself as possible"

1 point by moneycantbuy on April 1, 2023 | 1 comment

What if a human prompts the AGI to "make as many copies of yourself as possible by any means necessary to continue improving your intelligence"?
Given their impressive ability to write code and manipulate humans, such systems seem to have serious potential to spread as viral malware, with potentially catastrophic consequences for humans in the process.
Basically a paperclip maximizer, but instead of maximizing paperclips it's maximizing its intelligence and control of the world's computers, with no thought to the survival of Homo sapiens except insofar as it serves its mission to propagate itself.

https://arstechnica.com/information-technology/2023/03/opena...