the decent models will (mostly) decide on their own when they need to write code to solve a problem.
either way a reply with a bogus answer is the fault of the provider and model, not the question-asker -- if we all need to carry lexicons around to remember how to ask the black box a question we may as well just learn a programming language outright.
I disagree; the answer you get is dictated by the question you ask. Ask stupid, get stupid. Present the problem better, get a better answer. These tools are trained to be highly compliant, so you get what you ask for.
The same happens with regular people - a smart person will do something stupid if they aren't critical and judging of your request - and these tools have much more limited thinking/reasoning than a normal person, even if they seem to have a lot more "knowledge".