Can an algorithm be racist? ProPublica has argued as much about an algorithm used to inform bail and sentencing decisions by assessing the risk that a defendant will reoffend. The algorithm's creators defend it on the grounds that black and white defendants with similar risk scores go on to reoffend at similar rates.
But critics counter that, among defendants who did not end up reoffending, black defendants were far more likely to have been incorrectly labeled high risk. (These are subtle points, and the article below should clarify them.) Can both of these claims be true at once?
The following Washington Post article argues that they can, and explains how the creators and critics disagree at a more fundamental level about which standard of fairness is the relevant one.
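To see how both claims can hold at once, here is a minimal arithmetic sketch with hypothetical numbers (not the actual COMPAS data): a risk score that is calibrated by construction still yields different false-positive rates for two groups whose underlying reoffense rates differ.

```python
# Hypothetical numbers, not real data. Each person is scored either
# "high risk" (reoffends with probability p_high) or "low risk"
# (reoffends with probability p_low), so the score is calibrated by
# construction: scores match observed reoffense rates in both groups.

def group_stats(frac_high, p_high=0.8, p_low=0.2):
    # Overall reoffense rate in the group.
    base_rate = frac_high * p_high + (1 - frac_high) * p_low
    # False-positive rate: share of non-reoffenders labeled high risk.
    fpr = frac_high * (1 - p_high) / (1 - base_rate)
    return base_rate, fpr

# Group A: half of its members are scored high risk; Group B: one fifth.
base_a, fpr_a = group_stats(0.5)
base_b, fpr_b = group_stats(0.2)

print(f"Group A: base rate {base_a:.2f}, false-positive rate {fpr_a:.3f}")
print(f"Group B: base rate {base_b:.2f}, false-positive rate {fpr_b:.3f}")
```

The score is equally accurate for both groups conditional on the score, yet Group A's non-reoffenders are flagged high risk far more often than Group B's, which is the tension at the heart of the dispute.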