I don't know for sure whether an LLM could do it. But in theory you could build a system that sends chat messages asking for clarification and, once it has accumulated enough context, translates vague or poorly thought-out "requirements" into ones that actually make sense.
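Something like this loop, as a rough sketch. Note that `call_llm`, the prompt wording, and the `QUESTION:`/`REQUIREMENT:` convention are all made up for illustration; `call_llm` is a stand-in for whatever chat-completion API you'd actually wire up:

```python
def call_llm(messages: list[dict]) -> str:
    """Placeholder for a real LLM chat API; replace with your provider's call."""
    raise NotImplementedError("wire up an actual chat-completion endpoint here")


def clarify_requirement(raw_requirement: str, max_rounds: int = 3) -> str:
    """Ask the user clarifying questions until the requirement can be rewritten."""
    messages = [
        {"role": "system", "content": (
            "You refine vague software requirements. If the requirement is "
            "ambiguous, reply with exactly one clarifying question prefixed "
            "with 'QUESTION:'. Otherwise reply with the rewritten requirement "
            "prefixed with 'REQUIREMENT:'."
        )},
        {"role": "user", "content": raw_requirement},
    ]
    for _ in range(max_rounds):
        reply = call_llm(messages)
        if reply.startswith("REQUIREMENT:"):
            return reply.removeprefix("REQUIREMENT:").strip()
        # The model asked a question: relay it to the user, feed the answer back.
        answer = input(reply.removeprefix("QUESTION:").strip() + " ")
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": answer})
    # Out of rounds: ask for a best-effort rewrite with the context gathered so far.
    messages.append({"role": "user", "content": "Rewrite the requirement as best you can."})
    return call_llm(messages).removeprefix("REQUIREMENT:").strip()
```

The interesting part is the accumulation: every question-and-answer round stays in `messages`, so the final rewrite is grounded in the whole exchange rather than the original one-liner.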
In five years or so the capabilities may be pretty amazing.
Definitely agreed! I'm both nervous and excited. I fear for my livelihood, but if we keep making even a fraction of the progress we've made in the last year, the next 5 years are going to be wild!