Critics have long suspected that predictive policing software was racially biased. Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
Sure, but it seems very likely that poverty correlates with crime, and black people tend to be poorer (mostly due to systemic factors). Also, black people self-report experiencing higher rates of crime than white people, iirc?
I mean, there's a cyclical effect; oppressed communities tend to *become* more dangerous (e.g., black men getting locked up for drug crimes, so lots of black kids grow up without fathers). That causes more crime, which causes more policing, which then justifies the stereotypes
But my point is that if you run a program that says "oh hey, I notice black communities seem to have way more crime," that isn't bias, it's accuracy, and the answer isn't to... stop policing crime-ridden areas (black communities probably don't want that, especially-