
Yeah, I have a product that rewrites content under a persona.

I wanted to take the persona (a plain English description of how someone communicates) and generate a headshot to go with it for the UI. So I asked ChatGPT to describe what the persona looks like as if they were talking to a sketch artist.

Instead of getting something I could feed into DALL·E, I got a lecture about stereotyping.



But you can't tell what someone looks like based on how they communicate. The AI was right.


It’s very happy to stereotype communication, but rendering a photo of that stereotype is where it draws the line.

You can ask it to assume the persona of a well-educated police officer writing a police report for a judge, and it will gladly do so. That communication is unlikely to carry an American Southern accent, even though there likely exists a well-educated police officer who speaks with one. But ask it to describe that police officer so I can draw a picture, and it’s a different story.

Yet if you put “police officer” into a search engine, there is a clear aesthetic that we humans associate with that stereotype: a blue or black uniform, a hat with a badge or logo, and so on. That clear aesthetic is what I wanted in the headshot. Instead I got a lecture on inclusion, which is important but tangential to a headshot of a police officer.



