Critics have long suspected that predictive policing software was racially biased. Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
Sure, but it seems very likely that poverty correlates with crime, and Black people tend to be poorer (mostly due to systemic factors). Also, Black people self-report experiencing higher rates of crime than white people, iirc?