If you've ever worked at a tech company and had to make (or watch your peers make) the wrenching decisions behind content moderation, watching the slow-motion train wreck at @TeamYouTube feels like a strange mix of catharsis and schadenfreude
-
Replying to @fredbenenson @TeamYouTube
Do you mean, having seen how easily people under pressure lose all perspective... yes
-
But do you also mean cases where algorithms are developed to classify human behavior, or are you leaving those out of this discussion?
-
Maybe? The idea was to build so that the humans would be left to make the hard decisions.
-
That has assumptions baked in that compound the problems of moderation. Not saying it wasn’t worth trying.
-
The promise of automation was that by taking over the rote work, it would free humans to focus on the more difficult cases. That came with its own tradeoffs, of course, but I don't think it compounded the kinds of problems YouTube is facing with their moderation.
-
I haven’t seen a platform where automated deletion of hateful/racist etc. comments actually freed humans to work on higher-order problems. They’re just... ignored.
-
The compounding issue was more about edge cases for Klassy.
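The triage model the thread keeps circling back to can be sketched in a few lines. This is purely illustrative: the scoring function, thresholds, and "blocklist" are hypothetical placeholders, not how Klassy or any real platform actually worked.

```python
# Sketch of "automation handles rote cases, humans handle edge cases".
# score_comment and the thresholds below are made-up stand-ins.

def score_comment(text: str) -> float:
    """Toy scorer: fraction of words on a (hypothetical) blocklist."""
    blocklist = {"slur1", "slur2"}  # placeholder terms
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in blocklist for w in words) / len(words)

def triage(comment: str, auto_remove_at: float = 0.5, review_at: float = 0.1) -> str:
    """Route a comment: clear-cut cases are automated, edge cases go to humans."""
    s = score_comment(comment)
    if s >= auto_remove_at:
        return "auto_remove"   # rote case: automation acts alone
    if s >= review_at:
        return "human_review"  # edge case: queued for a person
    return "allow"

print(triage("slur1 slur2"))                       # auto_remove
print(triage("mostly fine slur1 here and there"))  # human_review
print(triage("a perfectly normal comment"))        # allow
```

The compounding problem described above lives in the middle branch: everything interesting lands in `human_review`, and if nobody works that queue, the edge cases are simply ignored.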