Maybe the exact match is in metadata or in hyperlink anchors pointing to the page. More generally, each search hits thousands of machines, so there will always be some approximate behavior. It's not just about what you are searching for and how, but also about what everyone else is searching for at the same time. The most deterministic aspect of it all may be the latency budgets on the backends and indices. You could loosen those, but then costs, failures, and abandonment rates would go up too.
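To make the latency-budget trade-off concrete, here is a minimal scatter-gather sketch in plain Python. The shard objects and their search() method are hypothetical, and this is not a claim about how Google actually implements it; it just shows how a hard per-query deadline makes results approximate: whatever a shard returns within the budget is used, and anything slower is silently dropped, so the same query can return different results depending on cluster load.

    import concurrent.futures

    def scatter_gather(shards, query, budget_s=0.05):
        # Fan the query out to every index shard in parallel.
        # "shards" and shard.search() are assumed, illustrative interfaces.
        pool = concurrent.futures.ThreadPoolExecutor(max_workers=len(shards))
        futures = [pool.submit(shard.search, query) for shard in shards]

        # Keep only the answers that arrive within the latency budget;
        # shards that miss the deadline are dropped, so the result set
        # is approximate and varies with load on the cluster.
        done, _late = concurrent.futures.wait(futures, timeout=budget_s)
        pool.shutdown(wait=False, cancel_futures=True)  # Python 3.9+

        results = []
        for f in done:
            results.extend(f.result())
        return results

Raising budget_s makes results more complete and more repeatable, but every query then holds resources longer, which is the cost/failure/abandonment trade-off described above.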


It used to work as expected.

It works as expected in competitors with far fewer resources.

The reason they don't fix it is either that they aren't competent enough to do it or that they don't want to give me correct results.

There is no need to make up excuses for a company the size of Google.

Edit: At this point I should be happy. The sooner Google destroys itself, the sooner real competition gets a chance and the web can heal.


If you think that teams who find bugs in CPUs, or who understand some aspects of them better than the engineers at Intel and AMD, aren't competent enough to fix verbatim search, I'm not sure what to tell you. It's not about excuses; it's about trade-offs that you might not agree with. For more nuance and history, see e.g. https://research.google.com/people/jeff/Stanford-DL-Nov-2010...

If you believe that this or similar issues will destroy Google, I guess we'll see.


So it is not about incompetence but about choosing to provide a service that is vastly inferior to what they used to deliver on hardware a fraction as powerful as what they have today.

If what you write is correct, they actually want the results I get on Google to be almost as bad as the ones I get on Bing.

I'll take your word for it.



