Not only can people disagree about what the companies should do (apart from what US law requires of them), but the same person can often be of two minds about it: “Take responsibility for what you’re spreading!” vs. “smh FB thinks it’s God and took down a perfectly legal post.”
The old prevailing framework for thinking about this was one about rights and values. It’s what led Twitter’s former GC to describe the company as the free speech wing of the free speech party. Not moderating stuff was a choice reflecting particular free speech values.
Twitter doesn’t say that anymore. And the framework has shifted. It’s now less about rights and abstract values and more about what I’d call a public health framework. Today, the harms of misinformation or incitement are foregrounded, and platforms don’t just ban or allow, they ...
... affirmatively recommend — see the debate about the paths YouTube can lead people down as one video autoplays after another at YouTube’s discretion.
Or consider this graph from Mark Zuckerberg’s recent post on content moderation. He observed that content near but not over the edge of what FB allows gets more engagement, and suggested that that should be fixed — by FB’s exacting a penalty on its appearance in others’ feeds.
pic.twitter.com/m9MEFXBY0u
FB and Twitter are making these decisions all the time, as they must for feeds that are shaped other than straight-up chronologically. They’re fraught decisions, and little-noticed ones. It’s not as if FB puts a virality score next to each post as the author submits it.
That leads to anxiety and even paranoia about whether posts are being “shadow banned,” left to linger unshared by the platform without being formally deleted. I can understand why FB would start an independent review board to decide appeals on what stays and what goes.
A board like that is kind of a third framework: not rights, not public health, but one of process: design an independent jury of sorts, to act in a process that recognizes that speech decisions of this magnitude shouldn’t be made by a company alone, but inevitably will get made.
There is a huge legitimacy deficit for the companies wielding so much power in the realm of human discourse. The categories of “platform” and “publisher” no longer offer much guidance on bridging it. Are FB and Twitter platforms or publishers? The answer is “yes.”
The really hard questions to answer are ones around what we want the speech policies to be, and who we’d actually trust to enforce them at scale.