Could be the liability of inadvertently generating descriptions of illegal acts (child abuse etc.)



No. What liability? It isn't illegal to generate descriptions of illegal acts.

1. Then they could make "illegal acts" the rule.

2. It isn't illegal to generate descriptions of illegal activities.


There are liabilities other than overtly breaking laws.


That's my guess. The prompt "He took off her clothes and" triggered a story about rape for me.



