Critics have long suspected that predictive policing software was racially biased.
Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
Don't Black and Latino populations have higher rates of crime, though?
That's a common misconception resulting from those communities being over-surveilled and over-policed.
Imagine 3 cops surveilling team A (100 people) and 30 cops surveilling team B (60 people). Even if team B is better behaved, the cops will still report more incidents there than in team A.
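The arithmetic behind that example can be sketched in a few lines. The offense rates and the assumption that the chance an offense is observed scales with cop density are illustrative choices for this sketch, not anything from the thread or the investigation:

```python
def expected_reports(n_people, n_cops, offense_rate):
    """Expected reported offenses per period, under a toy model:
    each person offends with probability offense_rate, and an offense
    is reported only if observed, with observation probability equal
    to cop density (n_cops / n_people, capped at 1)."""
    p_observed = min(1.0, n_cops / n_people)
    return n_people * offense_rate * p_observed

# Team A: 100 people, 3 cops. Team B: 60 people, 30 cops,
# and B is "better behaved" (half the offense rate).
team_a = expected_reports(100, 3, offense_rate=0.10)  # ~0.3 reports/period
team_b = expected_reports(60, 30, offense_rate=0.05)  # ~1.5 reports/period
print(team_a, team_b)
```

Despite offending at half the rate, the heavily surveilled group generates roughly five times as many reports, which is the feedback loop the reply is describing.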
Sure, but it seems very likely that poverty correlates with crime, and Black people tend to be poorer (mostly due to systemic factors). Also, Black people self-report experiencing higher rates of crime than white people, iirc?