Critics have long suspected that predictive policing software was racially biased. Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
Sure, but it seems very likely that poverty correlates with crime, and Black people tend to be poorer (due to mostly systemic stuff). Also, Black people self-report having experienced higher amounts of crime than white people, iirc?
But my point is that if you run a program that's like "oh hey, I notice Black communities seem to have way more crime," this isn't bias, this is accurate, and the answer isn't to... stop policing crime-ridden areas (Black communities probably don't want that, especially-