Is AI like other technologies, though? Most technologies require a learning curve that tends to grow as the technology develops and adds features. They become "skills" in themselves. They are tools to be used, not users of tools themselves.

AI seems like the opposite to me. It is the technology that is "the learning curve" in the long term. Its whole point, long term, is to emulate learning/intelligence - it is trying to be the worker, not the worker's tool (whether it succeeds or not is another story). The industry seems to treat it as just another tech/tool you need experience/training in, and I wonder whether that is the right approach long term.

Many people (including myself) will be wondering whether learning to use "AI" is really just an accessibility/interface problem. My time is valuable: do the productivity gains (which may only last a year or so before things change again) outweigh the cost of learning it and of building tools/wrappers/etc.? Everyone will have a different answer to that question based on their current tradeoffs.

I ask the question: if I don't need it right now (code is only 10-20% of my job, for example), why bother learning it when future AI will require even less intelligence/learning to use?
