A new paper has been making the rounds with the intriguing claim that YouTube has a *de-radicalizing* influence. https://arxiv.org/abs/1912.11211 Having read the paper, I wanted to call it wrong, but that would give the paper too much credit, because it is not even wrong. Let me explain.
The key is that the user’s beliefs, preferences, and behavior shift over time, and the algorithm both learns and encourages this, nudging the user gradually. But this study didn’t analyze real users. So the crucial question becomes: what model of user behavior did they use?
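To make that feedback loop concrete, here is a deliberately toy sketch in Python. It is not the paper's method and not a model of YouTube's actual recommender; every number and update rule is invented purely to illustrate the dynamic the thread describes: a stateful user and a learning algorithm nudging each other, step by step.

```python
import random

# Deliberately toy model of the dynamic described above: a user whose taste
# shifts over time, and a recommender that both learns that taste and nudges
# it. All quantities here are invented for illustration only.

random.seed(0)

user_taste = 0.1     # hypothetical "extremeness" the user currently prefers, in [0, 1]
algo_estimate = 0.1  # the recommender's running estimate of that taste

for step in range(50):
    # The recommender serves a slate centered on its estimate, with the
    # spread skewed slightly toward more extreme items.
    slate = [min(1.0, max(0.0, algo_estimate + random.uniform(-0.05, 0.15)))
             for _ in range(10)]

    # The user picks the item closest to their current taste (with a slight
    # pull toward novelty).
    clicked = min(slate, key=lambda item: abs(item - (user_taste + 0.02)))

    # Both sides adapt: the recommender updates toward what was clicked,
    # and the user's taste drifts a little toward what they watched.
    algo_estimate += 0.3 * (clicked - algo_estimate)
    user_taste += 0.1 * (clicked - user_taste)

print(f"user taste after 50 steps: {user_taste:.2f}")
print(f"recommender estimate after 50 steps: {algo_estimate:.2f}")
```

The only point of the toy is that the drift lives in the repeated interaction over time; any measurement that lacks a user with state has nothing that can drift.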
The answer: they didn’t! They reached their sweeping conclusions by analyzing YouTube *without logging in*, based on sidebar recommendations for a sample of channels (not even the user’s home page because, again, there’s no user). Whatever they measured, it’s not radicalization.
Sidenote: the first author has been on a diatribe about the media, even in the thread introducing the paper. It doesn’t undermine the paper by itself, but given that they disingenuously exclude how radicalization might actually work, it… raises questions. https://twitter.com/mark_ledwich/status/1210743217982803970
Others have pointed out many more limitations of the paper, including the fact that it claims to refute years of allegations of radicalization using late-2019 measurements. Sure, but that’s a bit like pointing out typos in the article that announced "Dewey Defeats Truman".
Incidentally, I spent about a year studying YouTube radicalization with several students. We dismissed simplistic research designs (like the one in the paper) by about week 2, and realized that the phenomenon results from users, the algorithm, and video creators adapting to each other.
Let’s not forget: the peddlers of extreme content adversarially navigate YouTube’s algorithm, optimizing the clickbaitiness of their video thumbnails and titles, while reputable sources attempt to maintain some semblance of impartiality. (None of this is modeled in the paper.)
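The creator side of that adaptation can be sketched just as crudely. Below is another made-up toy in the same spirit as the one above: a channel A/B-tests title framings and keeps whichever draws more clicks, and if (in this invented click model) more sensational framings get clicked more often, the channel ratchets toward clickbait. None of the numbers come from the paper or from YouTube data.

```python
import random

# Toy sketch of a creator adversarially tuning presentation for clicks.
# The click-through model below is invented; it only encodes the assumption
# that, in this toy, more sensational framings attract more clicks.

random.seed(1)

def toy_ctr(sensationalism):
    """Hypothetical click-through rate that rises with sensationalism."""
    return 0.02 + 0.08 * sensationalism

framing = 0.2  # current "sensationalism" of titles/thumbnails, in [0, 1]

for upload in range(20):
    # Try the current framing against a slightly more extreme variant.
    candidates = [framing, min(1.0, framing + random.uniform(0.0, 0.1))]
    # Simulate 1,000 impressions per variant and keep the one with more clicks.
    clicks = [sum(random.random() < toy_ctr(candidate) for _ in range(1000))
              for candidate in candidates]
    framing = candidates[clicks.index(max(clicks))]

print(f"sensationalism after 20 uploads: {framing:.2f}")
```

Reputable outlets that refuse to play this game stand still while the optimizers keep moving, which is yet another dynamic a one-shot crawl of recommendations cannot see.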
After tussling with these complexities, my students and I ended up with nothing publishable because we realized that there’s no good way for external researchers to quantitatively study radicalization. I think YouTube can study it internally, but only in a very limited way.
If you’re wondering how such a widely discussed problem has attracted so little scientific study before this paper, that’s exactly why. Many have tried, but chose to say nothing rather than publish meaningless results, leaving the field open for authors with lower standards.
In our data-driven world, the claim that we don’t have a good way to study something quantitatively may sound shocking. The reality is even worse: in many cases we don’t even have the vocabulary to ask meaningful quantitative *questions* about complex socio-technical systems.
Consider the paper’s definition of radicalization: "YouTube’s algorithm [exposes users] to more extreme content than they would otherwise." Savvy readers are probably screaming: There is no "otherwise"! There is no YouTube without the algorithm! There is no neutral!
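Spelled out in my notation (not the paper's), that definition is asking for a causal contrast roughly like this:

\[
\text{radicalization effect} \;=\; \mathbb{E}\big[\text{extremeness of consumption} \mid \text{algorithmic recommendations}\big] \;-\; \mathbb{E}\big[\text{extremeness of consumption} \mid \text{``otherwise''}\big]
\]

The second term is the problem: it refers to a counterfactual YouTube with no recommender, a platform that does not exist and cannot be observed. The quantity isn't merely hard to estimate from logged-out crawls; without a baseline, it is undefined.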
That’s the note on which I’d like to end: a plea to consider that the available quantitative methods can’t answer everything. And I want to thank the journalists who’ve been doing the next best thing — telling the stories of people led down a rabbit hole by YouTube’s algorithm.