Hacker News

It does provide information. Regardless of whether they use a post-inference filter, we now know that the model itself was trained on and can produce NSFW content. Compare this to SD3, which produces a noise pattern if you request naked bodies.

(Also, you can download the model itself to check its local behaviour without extra filters. Unfortunately I don't have time to do it right now, but I'd love to know.)



Right, that (the black bars) gives no info on how the model works. Thus, you'd love to "know more". ;)

The rest is groping for a reason to make "the model is censored" [when really a classifier made the POST response return a black image instead of boobs] into something sensical.
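For what it's worth, the distinction the thread is arguing about is simple to sketch: a post-inference filter sits entirely outside the model, classifying the finished image and swapping it for a black placeholder if flagged. This is a hypothetical illustration (function and variable names are made up, not any vendor's actual API), not how any particular service is known to work:

```python
# Hedged sketch of a post-inference safety filter: the model generates
# freely, then a separate classifier decides whether to replace the
# output with a black image. All names here are illustrative.

def apply_safety_filter(image, nsfw_score, threshold=0.5):
    """Return an all-black image of the same dimensions if flagged."""
    if nsfw_score >= threshold:
        height, width = len(image), len(image[0])
        return [[(0, 0, 0)] * width for _ in range(height)]
    return image

# A 2x2 "image" of white pixels, represented as rows of RGB tuples.
img = [[(255, 255, 255)] * 2 for _ in range(2)]
blocked = apply_safety_filter(img, nsfw_score=0.9)
passed = apply_safety_filter(img, nsfw_score=0.1)
```

The point upthread is that such a filter tells you nothing about the weights: the black bars prove only that the classifier fired, while the model underneath may well have produced the content.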



