This has come up in cases where, for example, machine learning (or even simple heuristics) was used to sort candidates and the algorithm was discovered to be discriminating based on features like name or zip code, which in the US correlate heavily with race and therefore cannot be used as discriminators (the court does not accept "Well, Your Honor, technically we weren't discriminating against race, we were discriminating against people named 'Jaqualin'...").
IIUC, precedent is that it is incumbent upon the organization using machine learning to confirm that its system hasn't come up with a novel proxy for one of the protected classes and isn't using that proxy to violate discrimination protections.
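To make the "proxy" problem concrete, here's a minimal sketch of one common (if rough) screen for this kind of thing, based on the EEOC's "four-fifths rule": compare each group's selection rate to the best-off group's, and flag any ratio under 0.8 for investigation. The group labels, sample data, and function names here are all illustrative, not any particular company's audit.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs, selected is bool."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratios(decisions):
    """Ratio of each group's selection rate to the highest group's rate.

    Under the four-fifths rule of thumb, a ratio below 0.8 is a red flag
    that the model may have learned a proxy (name, zip code, ...) for a
    protected class -- it doesn't prove intent, it prompts investigation.
    """
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical audit data: (group, model said "advance to interview")
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

for group, ratio in adverse_impact_ratios(decisions).items():
    flag = "  <-- below 0.8, investigate" if ratio < 0.8 else ""
    print(f"{group}: impact ratio {ratio:.2f}{flag}")
```

Note this checks outcomes, not features: the whole point of the precedent described above is that "we never fed it race" is no defense if the model reconstructed race from a proxy, so the audit has to look at how the decisions actually fall across groups.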