
You're assuming that non-technical people have the ability to describe exactly what it is that they want. Until they can do that, or until LLMs can generate and gather requirements, we're very safe.


In fact, that's exactly what programming is. Learning the syntax of a programming language is easy, telling a computer precisely what needs to be done is hard.

The reason we have programming languages is not that we can't use more intuitive metaphors; "no code" programming is about as old as code itself. There are graphical tools, tools that attempt to replicate natural language, etc. And yet, most programming tasks use code, because code is the best way we've found to express ideas in enough detail that computers can execute them without supervision.
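To make the precision point concrete, here's a hypothetical sketch of how even a trivial informal requirement, "sort the names", admits multiple precise interpretations that produce different results (the names here are made up for illustration):

```python
# One informal requirement, several precise meanings.
names = ["alice", "Bob", "Émile"]

# Interpretation 1: raw Unicode codepoint order
# (uppercase letters sort before lowercase ones).
by_codepoint = sorted(names)
print(by_codepoint)  # ['Bob', 'alice', 'Émile']

# Interpretation 2: case-insensitive order.
by_case_insensitive = sorted(names, key=str.lower)
print(by_case_insensitive)  # ['alice', 'Bob', 'Émile']
```

Neither answer is wrong; the requirement simply never said which one was wanted. Code forces that choice to be made explicitly, which is exactly the detail natural-language specs tend to leave out.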

So using ChatGPT to write code effectively is also programming, and without the mindset of a programmer, you won't get far. It may act as a force multiplier, making programmers more productive, like compilers and other non-AI tools, but I don't expect proficient programmers to be out of a job.



