Actually there are two elephants, and they both stink. pic.twitter.com/J28GFs05ZZ
Elephant #1. There is too much content for them to moderate by themselves. Last week I sat across from a senior FB employee who spent the entire time saying, "Hey, if you guys tell us when you find this content, we'll remove it," as if this is reasonable.
No. You make a profit from selling millions of cars, you either a) fit them with brakes right out of the shop or b) admit you should have and that now you don't know where all the cars are.
This outrageous corporate line attempts to reframe us - businesses, interest groups, consumers - as the ones who aren't playing ball. Instead, how about: "This is out of our control. We really struggle to find and remove specific types of content, and we sure as hell can't stop someone from livestreaming an execution, even though that has already happened*. To effectively moderate every post we'd have to employ ~250,000 people, and we can't do that. So please can you help us?"
There are thousands of smart people with the time and resources to do just that. Also, you're not exactly short of money yourselves.
Elephant #2. There is a major misalignment of incentives. Try finding footage of a Premier League goal on YouTube, Twitter or Facebook shortly after a game. It's hard. So why is there so much violence, hate, racism and misogyny? pic.twitter.com/U6WpZWbalB
Cynical view: it's about $$$. Happily, more people look for goals than for ISIS magazines; less happily, there's more money in the Premier League than in counter-extremism. Practical view: maybe sports footage is simply easier for machine-learning algorithms to detect.
Either way, we need a transparent conversation about what's technically possible. Then the hive mind can get to work on filling the gaps for you.
We know you can't fix this on your own, but if you don't ask for help, it's over to the regulators. And some of them still think this is all a series of tubes. pic.twitter.com/a9ZKlSYF2S
As an aside: regulation means financial penalties, but it won't fix the problem. If FB had no way of finding every Christchurch attack video, how exactly do we think Ofcom are going to find them? We will be back to square #1, as above: there is an unpoliceable volume of awful content.
All social media companies have a strategic business imperative - not to mention a social obligation - to stop spinning and start openly collaborating to solve this problem. *And just disable Facebook Live.