A new paper has been making the rounds with the intriguing claim that YouTube has a *de-radicalizing* influence. https://arxiv.org/abs/1912.11211 Having read the paper, I wanted to call it wrong, but that would give the paper too much credit, because it is not even wrong. Let me explain.
After tussling with these complexities, my students and I ended up with nothing publishable because we realized that there’s no good way for external researchers to quantitatively study radicalization. I think YouTube can study it internally, but only in a very limited way.
If you’re wondering how such a widely discussed problem has attracted so little scientific study before this paper, that’s exactly why. Many have tried, but chose to say nothing rather than publish meaningless results, leaving the field open for authors with lower standards.
In our data-driven world, the claim that we don’t have a good way to study something quantitatively may sound shocking. The reality is even worse: in many cases we don’t even have the vocabulary to ask meaningful quantitative *questions* about complex socio-technical systems.
Consider the paper’s definition of radicalization: "YouTube’s algorithm [exposes users] to more extreme content than they would otherwise." Savvy readers are probably screaming: There is no "otherwise"! There is no YouTube without the algorithm! There is no neutral!
That’s the note on which I’d like to end: a plea to consider that the available quantitative methods can’t answer everything. And I want to thank the journalists who’ve been doing the next best thing — telling the stories of people led down a rabbit hole by YouTube’s algorithm.
And I'm guessing the paper, since it didn't even create a simulated user, neglects to note that even users with clear leftist viewing preferences are bombarded with Nazi material in their recommendations.
Exactly. And there are a wide array of 'view bots' which promote content by increasing the numbers of watchers of videos. And creators are partly responding to the views of the 'view bots,' as well. If certain videos get more views, creators make more content like that.
that was true years ago. Mainstream channels are full of clickbait
what reputable sources are there with a decent degree of impartiality? Seems like everyone is biased in some direction
The problem is not dishonest reporting/studying. The problem is that our available means of study are biased. Example: if you use your eyes, your data is biased towards visible light, even if you mean to work fairly. To do better, read methods and results for discussions of bias.