Algorithms should have made courts more fair. What went wrong?

bob0921

Wise, Aged Ars Veteran
155
How did the predictions of the algorithms match reality? That question is the only fair test of their accuracy. How many of the white people predicted to be safe to leave at home committed some infraction? It is certainly possible for artificial intelligence techniques to be racially biased; the algorithms are only as good as the data they are trained on. But it is also possible for there to be a real correlation between the color of a person's skin and some kind of behavioral outcome. The fact that the results of the algorithm had some correlation with race does not prove that they were wrong.
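The check the commenter describes, comparing each group's predicted risk against observed outcomes, can be sketched roughly as below. The data, group labels, and function names are all hypothetical, invented for illustration; a real audit would use the tool's actual scores and follow-up records.

```python
# Hypothetical records: (group, flagged_high_risk, reoffended).
# Illustrative values only -- not real data.
records = [
    ("A", True, True), ("A", True, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", False, False), ("B", False, True), ("B", True, True),
]

def rates(group):
    rows = [r for r in records if r[0] == group]
    flagged = [r for r in rows if r[1]]
    # Calibration: of those flagged high risk, how many actually reoffended?
    precision = sum(r[2] for r in flagged) / len(flagged)
    # False positive rate: of those who did NOT reoffend, how many were flagged?
    innocent = [r for r in rows if not r[2]]
    fpr = sum(r[1] for r in innocent) / len(innocent)
    return precision, fpr

for g in ("A", "B"):
    print(g, rates(g))
```

Two groups can have equally accurate (well-calibrated) predictions while one group's non-reoffenders are flagged far more often, which is why "did the predictions match reality" and "were the errors distributed fairly" are separate questions.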

The article also only looked at race correlation. It would have been interesting to look at income/poverty correlation, and also to break that out by urban and rural. Many of these "race" issues can often be correlated with poverty. Urban blacks being disproportionately poor can make a class issue look like a race issue to those who want to find racism everywhere. Urban vs. rural is a different matter: in areas where everyone knows everyone, there is more inherent trust.