Maybe the exact match is in metadata or in hyperlink anchors pointing to the page. More generally, each search will hit thousands of machines and there will always be approximate behavior. It's not just about what you are searching and how, but also what others are searching. The most deterministic aspect of it all might be the latency budgets on the backends and indices. You could tweak those, but then costs, failures and abandonment rates would go up, too.
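To make the latency-budget point concrete, here's a toy scatter-gather sketch (purely illustrative, nothing like the real stack; shard count, timings and the budget are made up): the query fans out to many index shards, anything that misses the deadline is dropped, so the "same" query can return different results on different runs. Tighten the budget and you lose more shards; loosen it and cost and abandonment climb.

```python
# Toy illustration of a per-query latency budget making results approximate.
# All names and numbers here are invented for the example.
import asyncio
import random

LATENCY_BUDGET = 0.120  # seconds; hypothetical per-query deadline

async def query_shard(shard_id: int, query: str) -> list[str]:
    # Simulate variable backend latency; a slow shard may miss the deadline.
    await asyncio.sleep(random.uniform(0.01, 0.25))
    return [f"shard{shard_id}:doc{i}" for i in range(3)]

async def search(query: str, num_shards: int = 20) -> list[str]:
    tasks = [asyncio.create_task(query_shard(i, query)) for i in range(num_shards)]
    done, pending = await asyncio.wait(tasks, timeout=LATENCY_BUDGET)
    for t in pending:
        t.cancel()  # shards that blew the budget contribute nothing
    results: list[str] = []
    for t in done:
        results.extend(t.result())
    # Which shards made the cut varies run to run, so the result set
    # is approximate by construction.
    return results

if __name__ == "__main__":
    hits = asyncio.run(search("verbatim query"))
    print(f"{len(hits)} hits from shards that answered in time")
```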
If you think that teams who find bugs in CPUs or understand some aspects of them better than the engineers at Intel and AMD aren't competent enough to fix verbatim search, I'm not sure what to tell you. It's not about excuses, it's about trade-offs that you might not agree with. For more nuance and history, see e.g. https://research.google.com/people/jeff/Stanford-DL-Nov-2010...
If you believe that this or similar issues will destroy Google, I guess we'll see.
So it is not about incompetence but about choosing to provide a service that is vastly inferior to what they used to deliver on hardware a fraction as powerful as what they have today.
If what you write is correct, they actually want my results on Google to be almost as bad as the ones I get on Bing.