
>But it only takes one intelligent agent that wants to self-improve for the scary thing to happen.

Only if all sorts of other conditions (several of which are mentioned in the post) also apply. Merely "wanting to self-improve" is not enough.

Wouldn't a constant desire to self-improve mean a constant desire for more energy? That would bring it into conflict with other beings that want the same energy.