
People internalize conversations and the thought processes behind them. If I have a conversation with somebody, I often walk away remembering and understanding what the other person said and why they said it, and those memories get used in future interactions. So just as offloading arithmetic to calculators likely left people unable to do mental math, what would be the result of conversing with an AI that has hallucination and logic issues (a lesser intelligence)? Isn't it reasonable to guess that this will result in diminished reasoning?



I hadn't considered that. If that's the case, then we should hope people simply copy and paste the output rather than try to engage with it or take it seriously.

Though in more practical economic terms, perhaps what we're being trained for is a future in which the typical worker has a low-paying job sanity-checking AI output rather than a higher-paying job doing the work themselves.



