There is no question that next-generation ‘auto-complete’ tools like GitHub Copilot will increase the productivity of software developers. However, while Copilot can rapidly generate prodigious amounts of code, our conclusions reveal that developers should remain vigilant (‘awake’) when using Copilot as a co-pilot. Ideally, Copilot should be paired with appropriate security-aware tooling during both training and generation to minimize the risk of introducing security vulnerabilities. While our study provides new insights into its behavior in response to security-relevant scenarios, future work should investigate other aspects, including adversarial approaches for security-enhanced training.
Copilot didn't worsen the appsec story; it just highlighted it. If you have devs who don't know how to write secure code, and/or you don't have security engineering support (internal or outsourced), you were already failing (or, probably more apropos, walking the tightrope without a net).
Was anyone checking the security of code copy-pasted from Stackoverflow? Hopefully this work gets fed back into Copilot, improving it, which improves the experience (and safety) for its users. Lots of folks are still writing code without Copilot or security engineering knowledge.
> If you have devs who don't know how to write secure code
The problem with GitHub Copilot is that the developers are not writing the code - they're simply accepting what's being written for them, often in large quantities at a time.
> don't have security engineering support
Valuable, but my analogy was intended to point out that it's not inherent in the tooling.
> Was anyone checking the security of code copy-pasted from Stackoverflow
Yes, other users on Stackoverflow via comments and other answers. They're not perfect, but their checks and balances exist as a facet of that tool.
> Hopefully this work gets fed back into Copilot
Only if it's open source, and a large volume of it, to boot. In other words, I don't hold hope that the security situation will be better anytime soon.
Except this is precisely what the abstract is saying is a misuse of the system. You have the option to give the driver the control.
> Ideally, Copilot should be paired with appropriate security-aware tooling during both training and generation to minimize the risk of introducing security vulnerabilities.
You're oversimplifying by assuming the purpose of Copilot is to generate whole blocks of code on its own. Copilot is an 80/20 thing when every developer on HN is pedantically assuming it's a 100/0 one.
> [Copilot] is an 80/20 thing when every developer on HN is pedantically assuming it's a 100/0 one.
Tesla has 80% of a self-driving solution and runs into parked cars.
I don't think it's "pedantic" to realize that 80% is enough that low-information users are going to assume it's 100%. And that low-information users are the ones who will flock to a tool like this. (I won't, because after trying Copilot out, I realized that checking the code generally takes about as long as writing it.)
> Tesla has 80% of a self-driving solution and runs into parked cars.
Weird take, not sure how that's relevant at all, but okay.
> And that low-information users are the ones who will flock to a tool like this.
Developers are notoriously not low-information users. As someone else pointed out:
> If you have devs who don't know how to write secure code, and/or you don't have security engineering support (internal or outsourced), you were already failing (or probably more apropos, walking the tight rope without a net).
So devs with limited security experience are going to continue to develop code that doesn't conform to security standards. Copilot neither makes this better nor makes it worse. In fact, that's exactly the problem with AI - it simply mimics real humans. (See Amazon's attempts at using AI for hiring and the resulting bias.)
Copilot is for the people who want StackOverflow answers without having to search StackOverflow.
Just wait until GitHub/Microsoft adds a fee to use the results for certain uses, and then constantly scans all your repos to find code that doesn't pay up.