Hacker News

[flagged]


So you actually _wanted_ images that perpetuate the biases of the world?


Unfortunately, the method OpenAI may be using to reduce bias (by adding words to the prompt unknown to the user) is a naive approach that can affect images unexpectedly and outside of the domain OpenAI intended: https://twitter.com/rzhang88/status/1549472829304741888
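Based on that community reverse-engineering (the linked tweet), a naive version of this prompt-injection approach might look like the sketch below. All names, word lists, and the trigger heuristic are hypothetical; the actual terms and logic OpenAI uses are not public.

```python
import random

# Hypothetical word lists; OpenAI's real lists are not public.
DESCRIPTORS = ["female", "male", "Black", "White", "Asian", "Hispanic"]
PERSON_WORDS = {"person", "man", "woman", "doctor", "ceo", "builder"}

def augment_prompt(prompt: str, rng: random.Random) -> str:
    """Naively append a demographic word to prompts that mention a person.

    This mirrors the suspected approach: the user never sees the appended
    word, but it shifts the image model's output distribution.
    """
    words = set(prompt.lower().split())
    # Only trigger when a person is mentioned and no descriptor is present.
    if words & PERSON_WORDS and not words & {d.lower() for d in DESCRIPTORS}:
        return f"{prompt} {rng.choice(DESCRIPTORS)}"
    return prompt  # prompts without people pass through unchanged

rng = random.Random(0)
print(augment_prompt("a doctor holding a sign", rng))
```

Because the appended word becomes part of the prompt itself, it can leak into the image in unintended ways, which is the side effect the tweet demonstrates with "holding a sign that says" prompts, where the hidden word ends up rendered on the sign.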

I have also seen some cases where the bias correction may not be working at all, so who knows. And that's why transparency is important.


What a fascinating hack. I mean, yeah, naive and simplistic and doesn't really do anything interesting with the model itself, but props to the person who was given the "make this more diverse" instruction and said "okay, what's the simplest thing that could possibly work? What if I just append some races and genders onto the end of the query string, would that mostly work?" and then it did! Was it a GOOD idea? Maybe not. But I appreciate the optimization.


This sounds like something that could backfire very badly on certain prompts. "person eating a watermelon" for example.


Yes, I did. I want it to show the world as it is, not as people want it to be.


So you want the world to be the way it is?


Reread what I said: I WANT THE DALLE GENERATOR TO SHOW THE WORLD AS IT IS NOT AS PEOPLE WISH IT WAS.


Reread what I said, try engaging more of your brain this time.


How do you remove bias as long as humans are in the loop? Aren't they just swapping one bias for their own?


I thought the same thing, but I think the commenter is making a joke; I could be wrong.

I think they are suggesting that things like this (neural nets etc) work using bias, and by removing "bias" the developers are making the product worse.

It's a very sh!t comment if it's not a joke.


Just to be sure. Does "OC" here mean Original Comment?


Typo, now fixed.


Reducing bias this way means altering the data, instead of letting the end user simply choose an appropriate image generated from a clean data set.



