
I really dislike this as OpenAI has spent the past months signing sweetheart deals with any publisher willing to sell their content for training data.

It ties everything to their platform and returns a regurgitation of prioritized content without indicating any sort of sponsorship.

SEO will be replaced by cold hard cash, favors, and backroom deals




> SEO will be replaced by cold hard cash, favors, and backroom deals

Maybe it's my pessimistic nature, but it's garbage either way to me - backroom deals in your scenario, or the SEO-gameified garbage we currently have.


Can't believe unfettered greed and self-interest would ruin something like this.


Cold hard cash, favors, and backroom deals have been the modus operandi of this leadership team for over a decade now; it's the only song they know.


At least with "SEO-gameified garbage" the little guy has a chance to compete by learning the SEO game.


> SEO will be replaced by cold hard cash, favors, and backroom deals

Maybe this reflects my biases, but isn't that what SEO has been from the get-go? Like, from the moment someone had the idea that they could influence search engine results in their favor and charge money for those services, SEO has been purely negative for internet users simply trying to find the most fitting results for their query.


Well, there's SEO and then there's SEO. Some of it is just common-sense stuff to aid search engines a bit, and that benefits everyone. And then there's SEO which is all the bullshittery you're referring to.

For well over a decade, the best SEO trick has been to write helpful, useful content.

Your small independent blog can become a top Google hit without too much effort. This is kind of neat.


If no one else does it soon I'll probably do it myself: we're long overdue for the ad-block of LLM output. I want a browser plugin that nukes it at the DOM, and I don't care how many false positives it has.
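To make this concrete, here's a rough sketch of what the extension's content script could look like, in TypeScript. The scoreLLMLikelihood heuristic is entirely made up (that's the hard part the replies get into); everything around it is standard extension boilerplate.

    // content-script.ts - sketch of a "nuke LLM output at the DOM" extension.
    // scoreLLMLikelihood is a hypothetical placeholder, not a real detector.

    // Returns a probability in [0, 1] that the text is LLM-generated.
    // A real version would call a trained classifier; this one just counts stock phrases.
    function scoreLLMLikelihood(text: string): number {
      const tells = ["as an ai language model", "it's important to note", "in conclusion,"];
      const hits = tells.filter((t) => text.toLowerCase().includes(t)).length;
      return Math.min(1, hits / 2);
    }

    const THRESHOLD = 0.5; // deliberately aggressive: false positives are acceptable here

    function nukeSuspectNodes(root: ParentNode): void {
      for (const el of Array.from(root.querySelectorAll<HTMLElement>("p, li, article"))) {
        const text = el.innerText ?? "";
        if (text.length > 200 && scoreLLMLikelihood(text) >= THRESHOLD) {
          el.style.display = "none"; // hide it at the DOM, adblock-style
        }
      }
    }

    nukeSuspectNodes(document);

    // Re-scan as the page mutates (infinite scroll, client-side rendering).
    new MutationObserver(() => nukeSuspectNodes(document)).observe(document.body, {
      childList: true,
      subtree: true,
    });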


You can't detect LLM output at any reasonable accuracy. You'd have both false positives and false negatives all over the place. If you solved that part on its own, it would be a SOTA method.


This is a dangerous falsehood. OpenAI's since-cancelled polygraph had a 9% false positive rate and a 26% true positive rate. If I can lose a quarter of the toxic bytes and only need to enable JavaScript on one site in ten? Count me in!

I want more false positives.

https://openai.com/index/new-ai-classifier-for-indicating-ai...


Then don't use any website - 100% false positives. But seriously, it's a 9% rate for the specific models of the time. It's a cat and mouse game, and any fine-tuning or new release will throw it off. Also, they don't say which 9% were misclassified, but I suspect it's the most important ones - the well-written articles. If I see a dumb tweet with a typo, it's unlikely to have come from an LLM (and if it did, who cares), but a well-written long-form article may have been slightly edited with an LLM and get caught. The 9% is not evenly distributed.


It was a cat and mouse game before, spam always is. The inevitable reality that spam is a slog of a war isn’t a good argument for giving up.

I don’t know the current meta on LLM vs LLM detector, but if I had to pick one job or the other, I’d rather train a binary classifier to detect a giant randomized nucleus sampling decoder thing than fool a binary classifier with said Markov process thing.
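Just to pin down what I mean by a binary classifier: the toy version is logistic regression over a handful of text features. This sketch (TypeScript, with invented features and no real training data) is nowhere near a usable detector, but it's the shape of the thing:

    // detector.ts - toy logistic regression for "LLM or not", purely illustrative.

    type Sample = { text: string; isLLM: boolean };

    // Crude features: bias term, scaled average sentence length, vocabulary diversity.
    function features(text: string): number[] {
      const words = text.toLowerCase().split(/\s+/).filter(Boolean);
      const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
      const avgSentenceLen = words.length / Math.max(1, sentences.length);
      const typeTokenRatio = new Set(words).size / Math.max(1, words.length);
      return [1, avgSentenceLen / 30, typeTokenRatio];
    }

    const sigmoid = (z: number) => 1 / (1 + Math.exp(-z));
    const dot = (a: number[], b: number[]) => a.reduce((acc, ai, i) => acc + ai * b[i], 0);

    // Plain stochastic gradient descent on the logistic loss.
    function train(samples: Sample[], epochs = 500, lr = 0.1): number[] {
      const w = new Array(features(samples[0].text).length).fill(0);
      for (let e = 0; e < epochs; e++) {
        for (const s of samples) {
          const x = features(s.text);
          const err = (s.isLLM ? 1 : 0) - sigmoid(dot(w, x));
          for (let i = 0; i < w.length; i++) w[i] += lr * err * x[i];
        }
      }
      return w;
    }

    // Probability that a given text is LLM-generated, per the toy model.
    const predict = (w: number[], text: string) => sigmoid(dot(w, features(text)));

A real detector obviously needs far better features (or a fine-tuned model) and a big labelled corpus, and the generator side can always adapt - that's the cat and mouse part.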

Please don’t advocate for giving up on spam, that affects us all.


> If no one else does it soon I'll probably do it myself: we're long overdue for the ad-block of LLM output. I want a browser plugin that nukes it at the DOM, and I don't care how many false positives it has.

Well, if you don't care how many false positives it has, just block everything. But there's no even remotely reliable way to detect LLM output if it isn't deliberately watermarked to facilitate that, so you aren't going to get anything that is actually good at that.


> SEO will be replaced by cold hard cash, favors, and backroom deals

The fact that SEO has to exist in the first place is evidence of a search engine mafia.


As long as sama is running things we'll be seeing this. He's trying to grow as large as possible, for more leverage.


I honestly think that will be an improvement. SEO is enshittification: it degrades the quality of the product. I would rather pay a couple bucks for something good and vetted.



