the problem here is in how we read what the engine is doing and what that leads us to do
-
-
-
if we feed it arrest data and it says "oh lotta black people" and we read from that "black people are criminals" that is *our error*
-
because it didn't tell us a lot of black people are criminals, it told us a lot of black people *got arrested*
-
if we read these systems that poorly, they're also going to tell us that a number of forms of disability make people violent threats
-
because disabled people are routinely murdered by police for failing to comply
-
tl;dr our inability to reason precisely is going to make us fuck with people we shouldn't, such news, very change, wow
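[ed: the proxy-label point above, that arrest data measures arrests rather than offending, can be sketched with a toy simulation. All numbers and group labels here are hypothetical, chosen only to show that biased enforcement alone produces skewed arrest rates.]

```python
import random

random.seed(0)

# Two hypothetical groups with the SAME true offense rate,
# but group B is policed twice as heavily, so an offense there
# is twice as likely to end in a recorded arrest.
TRUE_OFFENSE_RATE = 0.05
ARREST_GIVEN_OFFENSE = {"A": 0.2, "B": 0.4}  # biased enforcement, made up

def arrest_rate(group: str, n: int = 100_000) -> float:
    """Fraction of group members with a recorded arrest."""
    arrests = 0
    for _ in range(n):
        offended = random.random() < TRUE_OFFENSE_RATE
        if offended and random.random() < ARREST_GIVEN_OFFENSE[group]:
            arrests += 1
    return arrests / n

rate_a = arrest_rate("A")
rate_b = arrest_rate("B")

# Group B's recorded arrest rate comes out roughly double group A's,
# even though the underlying offense rates are identical by construction.
# A model trained on these labels learns P(arrest), not P(offense).
print(rate_a, rate_b)
```

Reading B's higher arrest rate as higher criminality is exactly the reader-side error the thread describes: the data faithfully records arrests, and the misreading happens downstream.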
-
In order: if black arrest rates reflect systemic bias rather than actual criminality, one would expect a corresponding bias...
-
...toward higher white victimization reports. In fact, we see this in sexual violence--but not in nonsexual violence or property crime
-
"yeaaaah I'm really good need AIs to be considered systematically racist to continue rent seeking"
-
The whole thread is a wonderland of derp.