
Self-annihilation fails due to nuclear proliferation, i.e., MAD. So your conclusion is backward.

But that's irrelevant anyway, because nukes are a terrible analogy. If you insist on sci-fi speculation, use an analogy that's at least remotely similar: perhaps compare the development of AI with that of medicine. They're both very general technologies with incredible benefits and important dangers (e.g., superbugs).



If you insist on a sci-fi analogy, then try the protomolecule from The Expanse. Or a runaway grey goo scenario triggered by a biotech or nanotech accident.

Artificial general intelligence is not a stick you can wield to threaten other countries with. It's a process, complex beyond our understanding.



