
Is it, though? I reckon it's commonly accepted, and I'm sure there are cases that can be highlighted, just as it's possible to make the case that some trickle-down policies work. But in the end, plural forms and synonyms do not equate to the original word.


You can just surround a word with double quotes if you need exact matches. It's nice having synonyms otherwise.


It hasn't worked reliably in Google since somewhere around 2012, AFAIK.

Edit: Downvote all you want, but this is documented.


I use it all the time. My biggest issue is that I frequently get zero results with it on specific searches. It may not always find a match, but I don't get returned results that omit the quoted portion.


As I wrote above, it doesn't work reliably.

And as I think everyone on HN knows, "my Google" isn't "your Google".

Even if it works reliably for you, that doesn't mean it does for me and many others.


The problem is that you're assuming search is deterministic.


> The problem is that you're assuming search is deterministic.

I am not assuming search is deterministic.

But even in a generally non-deterministic system, some things should always stay true.

For example: when double quotes mean "exact match", any result that doesn't contain an exact match should not be shown.

I also understand that a webpage might have changed since it was indexed, but I find it hard to believe that all the false matches I've wasted time on over the years come down to websites changing between the time they were indexed and the time I visited them.
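The invariant described above can be sketched as a verbatim post-filter over a fuzzy first-stage retrieval: any candidate whose indexed text lacks the quoted phrase is dropped before results are shown. This is a minimal illustration, not Google's actual pipeline; the result structure and field names are hypothetical.

```python
def verbatim_filter(results, phrase):
    """Keep only candidates whose indexed text contains the
    quoted phrase exactly (case-sensitive in this sketch)."""
    return [r for r in results if phrase in r["text"]]

# Hypothetical candidate set returned by a fuzzy first-stage retrieval.
candidates = [
    {"url": "a.example", "text": "the quick brown fox jumps"},
    {"url": "b.example", "text": "a quick red fox"},  # fuzzy match only
]

exact = verbatim_filter(candidates, "quick brown")
# Only a.example survives; b.example matched fuzzily but not verbatim.
```

Of course, the commenters below point out why a real system diverges from this: the match may live in metadata or anchor text rather than the page body, and the page may have changed since indexing.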


Maybe the exact match is in metadata or in hyperlink anchors pointing to the page. More generally, each search hits thousands of machines, and there will always be approximate behavior. It's not just about what you are searching and how, but also what others are searching. The most deterministic aspect of it all might be the latency budgets on the backends and indices. You could tweak those, but then costs, failures, and abandonment rates would go up, too.


It used to work as expected before.

It works as expected in competitors with far, far fewer resources.

The reason they don't fix it is either that they aren't competent enough to do it or that they don't want to give me correct results.

There is no need to make up excuses for a company the size of Google.

Edit: at this point I should be happy. The faster Google destroys itself, the faster real competition can get a chance and the web can heal.


If you think that teams who find bugs in CPUs or understand some aspects of them better than the engineers at Intel and AMD aren't competent enough to fix verbatim search, I'm not sure what to tell you. It's not about excuses, it's about trade-offs that you might not agree with. For more nuance and history, see e.g. https://research.google.com/people/jeff/Stanford-DL-Nov-2010...

If you believe that this or similar issues will destroy Google, I guess we'll see.


So it is not about incompetence but about wanting to provide a service that is vastly inferior to what they used to offer on hardware a fraction as powerful as what they have today.

If what you write is correct, they actually want my results on Google to be almost as bad as the ones I get on Bing.

I'll take your word for it.

