
That seems like a pretty simple one to manage: a disclaimer stating "Copilot will not generate code referencing certain topics" would be both sufficient and uncontroversial.


Like this line from the FAQ?

>GitHub Copilot includes filters to block offensive language in the prompts and to avoid synthesizing suggestions in sensitive contexts.

I think calling gender a sensitive context is not unreasonable.
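
The FAQ doesn't say how the filter actually works, but the simplest thing it could be is a blocklist matched against the prompt. A minimal sketch, purely my own illustration (the terms and function name are made up, not GitHub's):

    BLOCKED_TERMS = {"gender", "ethnicity", "diagnosis"}  # illustrative only

    def should_suppress_suggestion(prompt: str) -> bool:
        """Return True if the prompt appears to touch a 'sensitive context'."""
        lowered = prompt.lower()
        return any(term in lowered for term in BLOCKED_TERMS)

    # Even innocuous code trips a check like this:
    print(should_suppress_suggestion("def validate_gender_field(form):"))  # True

A substring check like that would refuse to complete perfectly ordinary form-validation code, which is presumably why people keep running into it.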


>I think calling gender a sensitive context is not unreasonable.

It is very unreasonable, but it's also the truth. sigh


Yes, but medical stuff is a sensitive context too. And financial, as well. Plus ethnicity. And age. As well as anything that could be indicative of the aforementioned topics, such as vehicle makes & models, ecommerce products, tea vs. coffee preference, accounting, and so on.

Oh, wouldn't you know it... Turns out that almost all code doing something important could be interpreted as sensitive.


Oh god, the thought of Copilot contributed code ending up in medical applications is terrifying…


Yep, that’s perfect.



