Your second paragraph is basically what I'm saying, but with the extension that we only actually care about reasoning when we're in these kinds of asymmetric situations. The asymmetry isn't about the other reasoner, though; it's about the problem. By definition, we only have to reason through something if we can't predict (don't already know) the answer.

I think it's important for all of us to understand that if we build a machine to do valuable reasoning, we cannot know a priori what it will tell us or what it will do.
