If you've ever worked at a tech company and have had to make (or watch your peers make) the wrenching decisions that go into content moderation, watching the slow-motion train wreck at @TeamYouTube feels like a strange mix of catharsis and schadenfreude
The promise of automation was that by freeing humans from the rote work, they could focus on the more difficult cases. That came with its own tradeoffs, of course, but I don't think it compounded the kinds of problems YouTube is facing with their moderation.
-
-
I haven't seen a platform with automated deletion of hateful/racist etc. comments that would allow humans to work on higher-order problems. They're just... ignored.
-
The compounding issue was more about edge cases for Klassy.