
I’ll take the contrarian view. I don’t care if content is generated by a human or by an AI. I care about the quality of the content, and in many cases the human currently does a better job.

I would like a search engine algorithm that penalizes low quality content. The ones we currently have do a piss poor job of that.



> I would like a search engine algorithm that penalizes low quality content. The ones we currently have do a piss poor job of that.

Without knowing the full dataset that got trimmed down to the search results you see, how do you evaluate the effectiveness?


You’re asking a fair question, but I think you’re approaching it from more of an engineering mindset than the person you’re responding to is using.

A brilliant algorithm that filters out a huge amount of AI slop is still frustrating to the user if any highly ranked AI slop remains. You still click it, immediately recognize what it is, and wonder why the algorithm couldn’t figure that out when you did so that quickly.

It’s like complaining to a waiter that there’s a fly in your soup, and the waiter can’t understand why you’re upset, because there were many more flies in the soup before they brought it to the table and they managed to remove almost all of them.
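To make that concrete with a toy sketch (hypothetical numbers, not anything a real search engine reports): what the user experiences is closer to precision among the top-ranked results than to how much spam the filter removed overall. The function name and data here are made up purely for illustration.

```python
# Minimal sketch (hypothetical figures): a filter can remove 99% of spam
# overall and the first results page can still be dominated by spam,
# because the user only ever sees the top of the ranking.

def precision_at_k(ranked_results, k=10):
    """Fraction of the top-k results that are not spam."""
    top = ranked_results[:k]
    return sum(1 for r in top if not r["is_spam"]) / len(top)

# Pretend the index held 100,000 spam pages and the filter caught 99,000 of
# them -- 99% recall on spam. The 1,000 that slipped through happen to be
# SEO-optimized, so they rank highly.
ranked = [{"url": f"spam-{i}", "is_spam": True} for i in range(4)] + \
         [{"url": f"good-{i}", "is_spam": False} for i in range(6)]

print(precision_at_k(ranked))  # 0.6 -- the user still sees 4 spam links up top
```

The filter's overall effectiveness and the quality of the page the user actually looks at are just different metrics, which is why the complaint stands even if the filtering is mostly working.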


It doesn’t matter how much it filters out if the top results are still spam.

I barely use Google anymore. Mostly just when I know the website I want, but not the URL.



