Hacker News

> Using this tool to generate code in a problem area you are not qualified to double-check and validate yourself is dangerous.

I would like this message to be amplified as much as possible. Never write code you do not understand. I am excited about Copilot, but also wary of the programming culture these tools will bring in. Businesses, especially body-shopping companies, will want to deliver as much as possible using tools in this category, and will end up shipping code with disastrous edge cases.




Isn't "Code you don't understand" the definition of AI/ML?


Zing! But it depends on the algorithm. Some aren't that complicated to understand, like linear regression. Others, like DNNs, are basically impossible. But with ML you're at least always testing the code you don't understand in the process of training the parameters. That's better than the minimum effort when using Copilot code. And many will make just that minimum effort and release untested code they don't understand.
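On the interpretability point: a linear regression's learned parameters map directly onto input features, so you can read the model's logic straight off its coefficients. A minimal sketch with NumPy (the synthetic data and ground-truth weights are illustrative):

```python
import numpy as np

# Synthetic data: target depends linearly on two features (illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 5.0  # known ground truth, no noise

# Fit by ordinary least squares: solve for [w0, w1, bias].
A = np.hstack([X, np.ones((100, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Every learned parameter is directly inspectable -- no black box.
print(coef)  # ~ [3.0, -2.0, 5.0]
```

A DNN fit to the same data would offer no such readable mapping from weights to behavior, which is exactly the contrast being drawn above.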


Well, I think this overestimates people outside the HN echo chamber again. Most senior ML people we see in big corps have no clue what they are doing: they just fiddle with knobs until it works. They would not be able to explain anything: copy code/model, change parameters, train until convergence, test for overfitting. When AutoML was emerging I hoped they would be fired (as I do not think they are doing useful work), but no: those corps still have trouble hiring more of them.
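The knob-fiddling workflow described above is essentially an exhaustive hyperparameter search: try every combination, keep whichever scores best, with no understanding of why the winner wins. A minimal sketch (the scoring function, parameter names, and grid values are all illustrative stand-ins for a real train/validate cycle):

```python
import itertools

def train_and_score(lr, depth):
    # Hypothetical stand-in for "train until convergence, test for
    # overfitting"; by construction it peaks at lr=0.1, depth=4.
    return 1.0 - abs(lr - 0.1) - 0.05 * abs(depth - 4)

grid = {"lr": [0.001, 0.01, 0.1, 1.0], "depth": [2, 4, 8]}

# "Fiddle with knobs until it works": keep the best-scoring combination.
best = max(
    itertools.product(grid["lr"], grid["depth"]),
    key=lambda combo: train_and_score(*combo),
)
print(best)  # (0.1, 4)
```

This is the part AutoML automates, which is the basis of the comment's point that little human understanding is involved.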


I'd say that's "Code you (should) understand doing things you can't understand (and possibly can't audit)."

The art and practice of programming hasn't changed much over the last 50 years. 50 years from now, though, it will be utterly unrecognizable.


> Isn't "Code you don't understand" the definition of AI/ML?

We don't need to understand the process to evaluate the output in this case. Bad code is bad code no matter who/what wrote it.


No. You could use Copilot to generate code you do understand and double-check it before committing. It’s similar to just copying and pasting from Stack Overflow.


I think there are a scary number of programmers (this is their job and they get hired; often they are already seniors) who cannot explain what they copied or even wrote themselves. I have asked people with five or more years of job experience why they wrote something, and I get "because it works". Sometimes I have trouble seeing why it works, and usually that means there are indeed those disastrous edge cases. Copilot will make this worse, and has the potential to make it far worse.
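As a concrete illustration of "because it works" hiding an edge case: a hypothetical copied snippet that passes casual testing but fails silently on realistic input (the CSV example is mine, not from the thread):

```python
import csv
import io

def naive_parse(line):
    # "It works" on the inputs the author happened to try.
    return line.split(",")

def proper_parse(line):
    # The stdlib csv module handles quoting correctly.
    return next(csv.reader(io.StringIO(line)))

print(naive_parse("a,b,c"))               # ['a', 'b', 'c'] -- looks fine
print(naive_parse('x,"hello, world",y'))  # silently wrong: splits inside quotes
print(proper_parse('x,"hello, world",y')) # ['x', 'hello, world', 'y']
```

The naive version produces no error, just wrong data, which is exactly the kind of bug that survives when nobody can explain why the code works.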



