I actually think that's a big statement. They're now essentially saying that their literal company mission, "make the world more open and connected," was partly wrong.
FB is probably already tacitly doing it. Their fact-checker program is probably actually a bunch of humans training a large ML system to uprank or downrank news sources.
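(The human-in-the-loop part is easy to picture. Here's a minimal toy sketch of how fact-checker verdicts could become per-source ranking signals; the sources, numbers, and scoring rule are all invented for illustration, not anything FB has disclosed.)

```python
# Hypothetical sketch: human fact-checker verdicts aggregated into
# per-source trust scores that uprank or downrank a source's posts.
# All names and numbers below are invented.
from collections import defaultdict

# Each record: (source, verdict) where True = rated false/misleading.
fact_checks = [
    ("plausible-news.example", False),
    ("plausible-news.example", False),
    ("viral-clickbait.example", True),
    ("viral-clickbait.example", True),
    ("viral-clickbait.example", False),
]

def source_trust_scores(checks, prior_hits=1, prior_total=2):
    """Smoothed fraction of a source's checked articles rated accurate."""
    accurate = defaultdict(int)
    total = defaultdict(int)
    for source, rated_false in checks:
        total[source] += 1
        accurate[source] += 0 if rated_false else 1
    return {
        s: (accurate[s] + prior_hits) / (total[s] + prior_total)
        for s in total
    }

def rank_adjustment(base_score, trust):
    """Scale a post's base ranking score by its source's trust score."""
    return base_score * trust

trust = source_trust_scores(fact_checks)
for source, t in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{source}: trust={t:.2f}, adjusted score={rank_adjustment(1.0, t):.2f}")
```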
-
I think you're right. But it's interesting that many who seem to have zero trust in FB and Zuck encourage them to introduce all kinds of possible biases when determining who gets heard and who gets muted. (Not saying that's you.)
-
No unbiased version of the system is possible. If it's directed at maximizing engagement, it will incentivize certain behaviors. If it's directed at something else, it will promote that kind of behavior instead.
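(That point is easy to show concretely. In this toy sketch, the posts, predictions, and weights are all invented; the exact same predictions produce opposite rankings depending on which objective you pick.)

```python
# Hypothetical sketch: the same feed ranked under two different objectives.
# The choice of objective, not the data, decides what gets promoted.
posts = [
    # (title, predicted_clicks, predicted_reports, predicted_informativeness)
    ("Outrage bait headline", 0.90, 0.30, 0.10),
    ("Local news update", 0.40, 0.02, 0.70),
    ("Long-form explainer", 0.25, 0.01, 0.95),
]

def engagement_score(p):
    _, clicks, _, _ = p
    return clicks  # optimize purely for predicted engagement

def quality_score(p, report_penalty=2.0):
    _, clicks, reports, info = p
    # Reward informativeness, lightly weight clicks, penalize user reports.
    return 0.3 * clicks + info - report_penalty * reports

for name, scorer in [("engagement", engagement_score), ("quality", quality_score)]:
    ranked = sorted(posts, key=scorer, reverse=True)
    print(name, "->", [title for title, *_ in ranked])
# engagement -> outrage bait first; quality -> long-form explainer first
```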