Could someone not worried about AGI, please explain their position?

Specifically, what makes you so confident that no one will end up creating an AGI that's unaligned? Or alternatively, if you believe an unaligned AGI might be created, why are you confident that it won't cause mass destruction?

I guess the way I see it is this: even if you believe there's only a 5-10% chance that AGI could go rogue and, say, take out global power grids, why is that a chance worth taking? Especially when we could try to slow capability progress as much as possible while funding alignment research?
