Critics have long suspected that predictive policing software was racially biased. Today, we have the answer: we analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1
I mean there's a cyclical effect; oppressed communities tend to *become* more dangerous (like Black men getting locked up for drug crimes and then lots of Black kids growing up without fathers). This causes more crime, which then causes more policing, which justifies stereotypes