Critics have long suspected that predictive policing software was racially biased.
Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
Don't Black and Latino populations have higher rates of crime, though?
That's a common misconception resulting from those communities being over-surveilled and over-policed.
Imagine 3 cops surveilling team A (100 people) and 30 cops surveilling team B (60 people). Even if team B is better behaved, the cops will still report more incidents there than in team A.
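The arithmetic behind that thought experiment can be sketched with a toy model. The cop and population counts come from the example above; the offense rates and per-officer detection probability are made-up assumptions, purely for illustration:

```python
# Toy model of observation bias: the group watched more heavily
# generates more *reported* incidents even with a lower true rate.
# Only the cop/population counts come from the example; everything
# else is an illustrative assumption.

def expected_reports(population, true_offense_rate, n_cops, p_detect_per_cop=0.1):
    """Expected number of reported incidents.

    Each offense is independently spotted by each officer with
    probability p_detect_per_cop, so the chance it gets reported
    at all is 1 - (1 - p)^n_cops.
    """
    offenses = population * true_offense_rate
    detection_prob = 1 - (1 - p_detect_per_cop) ** n_cops
    return offenses * detection_prob

# Team A: 100 people, 3 cops. Team B: 60 people, 30 cops,
# with a *lower* assumed true offense rate (3% vs 5%).
reports_a = expected_reports(100, 0.05, 3)   # ~1.36 expected reports
reports_b = expected_reports(60, 0.03, 30)   # ~1.72 expected reports

print(f"Team A reports: {reports_a:.2f}")
print(f"Team B reports: {reports_b:.2f}")  # more reports despite fewer true offenses
```

Under these (assumed) numbers, team B commits fewer offenses in absolute terms, yet generates more reports, simply because almost every offense there gets seen.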
Sure, but it seems very likely that poverty correlates with crime, and Black people tend to be poorer (due mostly to systemic stuff). Also, Black people self-report having experienced more crime than white people, iirc?
Yes; in other words, crime prediction software doesn't predict, it perpetuates injustices while coating them in an illusion of machinic objectivity.
The result will be the "Black and Latino people commit more crimes" narrative being reinforced, police brutality going up, and poverty along with it.
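That self-reinforcing dynamic can be illustrated with a deterministic toy loop (not any actual vendor's method; every number here is an assumption): reports scale with how many officers are watching, and patrols are reallocated each round in proportion to last round's reports. Even with identical true offense rates in both areas, the initial patrol imbalance reproduces itself indefinitely, so the data keeps "confirming" it:

```python
# Toy feedback loop: two areas with IDENTICAL true offense rates,
# but an unequal initial patrol allocation. Reports scale linearly
# with the number of officers watching; patrols are then reallocated
# in proportion to reports. All parameters are illustrative assumptions.

def run_feedback_loop(rounds=10, total_cops=20, cops=(12.0, 8.0),
                      population=(100, 100), offense_rate=(0.05, 0.05),
                      catch_rate_per_cop=0.02):
    cops = list(cops)
    for _ in range(rounds):
        # Each officer catches a fixed fraction of the area's offenses
        # (linear model, no saturation).
        reports = [population[i] * offense_rate[i] * catch_rate_per_cop * cops[i]
                   for i in range(2)]
        total = sum(reports)
        cops = [total_cops * r / total for r in reports]
    return cops

final = run_feedback_loop()
print([round(c, 2) for c in final])  # stays ~[12.0, 8.0]: the data mirrors the patrols
```

Under this linear model the imbalance is self-perpetuating rather than self-correcting: the "higher-crime" label on area 0 is an artifact of where officers were sent, not of any difference in behavior.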
I mean, there's a cyclical effect; oppressed communities tend to *become* more dangerous (like Black men getting locked up for drug crimes and then lots of Black kids growing up without fathers). This causes more crime, which then causes more policing, which justifies stereotypes.
But my point is that if you run a program that's like "oh hey, I notice Black communities seem to have way more crime," this isn't bias, this is accurate, and the answer isn't to... stop policing crime-ridden areas (Black communities probably don't want that, especially if the claim that Black people are more likely to report crime to the police is true); it's to do stuff like change drug-law sentencing, the stuff that caused the inequalities in the first place.