Well, there were scary stories about that back then. It didn’t totally happen that way, but we’ve actually had versions of it ruin lives—an academic mistakenly placed on the no-fly list had their career destroyed.
-
The whole point of unconscious bias is that problems can be in plain sight and people still can’t see them. The Guardian’s inclusion group set up an unpaid internship program and never once considered whom that excluded. That’s not an auditability problem. It’s an awareness problem.
-
Sounds like something a well-designed ML algorithm could surface.
-
Formal/explicit rules can be better than human judgment for those facing discrimination (goes to Steve's point). Hotel booking/Uber are easier for non-white people than Airbnb/taxis. But ML is a different twist: opaque like humans, yet not, at the moment, deployed to help the less powerful. 2/2
-
This Tweet is unavailable.
-
We have a handle on why and how the NYT news side behaves, as well as the op-ed page. They even write editorials explaining their reasoning (which you can further analyze), and we have fields of study on why and how institutional power operates. We’re not at all there for ML.
