Any value system in which "save one person" is not obviously and objectively better than "save zero people", with no other tradeoffs or details present, seems objectively incorrect.


tbf, real-world tradeoffs are usually more like "save one person with a high likelihood of success" vs "attempt to save two people with a significant likelihood of failure"


If there are 10^58 people in the second situation we call them longtermists.



