Comment by londons_explore 5 days ago
> Two people who are identical except for their nationality face the same probability of a false positive
It would be immoral to disadvantage one nationality over another. But we also cannot disadvantage one age group over another. Or one gender over another. Or one hair colour over another. Or one brand of car over another.
So if we update this statement:
> Two people who are identical except for any set of properties face the same probability of a false positive.
With that new constraint, I don't believe it is possible to construct a model which outperforms a data-less coin flip.
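A minimal sketch of that claim, using a hypothetical population with made-up numbers: if the model's output must be identical for any two applicants who differ in *any* property, it cannot depend on the feature vector at all, so the best it can do is return one constant score for everyone — which makes ranking fraudulent applicants above honest ones impossible.

```python
import random

random.seed(0)

# Hypothetical population: each applicant has arbitrary properties and a
# true fraud label drawn at an assumed base rate (illustrative only).
BASE_RATE = 0.1
applicants = [
    {"features": (random.random(), random.random()),
     "fraud": random.random() < BASE_RATE}
    for _ in range(10_000)
]

def constrained_model(features):
    # The constraint "identical output for people differing in ANY property"
    # forces the model to ignore its input entirely: it must return the
    # same score for every possible feature vector.
    return BASE_RATE  # best constant guess is the population base rate

# Every applicant receives exactly the same score, so the model carries no
# information about who is fraudulent -- no better than a data-less guess.
scores = {constrained_model(a["features"]) for a in applicants}
print(len(scores))  # a single constant score for the whole population
```

The point is structural rather than statistical: the constraint, taken literally over all properties, leaves no input left for the model to condition on.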
I think you made too big a jump by treating all properties the same, as if the only way to make the system fair were to make it entirely blind to the applicant.
We tend to distinguish between ascribed and achieved characteristics. It is considered unethical to discriminate based on things a person has no control over, such as their nationality, gender, age or natural hair color.
However, things like a car brand depend entirely on one's own actions, and if there's a meaningful, statistically significant correlation between owning a Maserati and fraudulently applying for welfare, I'm not entirely sure it would be unethical to consider such a factor.
It also depends on what a false positive means for the person in question. Fairness (like most things social) is not binary: while outright rejections can be very unfair, additional scrutiny is less so, even though it is still not fair (it causes delays and extra stress). When things are working normally, I believe there's a sort of unspoken social agreement (ever-changing, of course, as times and circumstances evolve) on where the balance lies between fairness and the amount of abuse that can be tolerated.