Just seems like a way to lose a lawsuit. All it takes is one client with a disability the system can't understand, or a demonstration that it's biased against a particular accent.
The customer could blame the service provider — especially when the provider stresses how unbiased it aims to be. But negligence on the customer's part is an issue too. I agree with you: bias in ML systems is real, applying them to more facets of life is tricky, and snake oil can be tasty. Still, this does surface issues that call for standards and formal assessments of bias.