If YouTube radicalises people doesn't the fault lie with the "rabbit hole" algorithm that encourages people to view more extreme videos with every click? Individual content creators aren't responsible for the methods YouTube uses to keep people on its site to sell more ads.
-
Could it be that people simply pay more attention to the extreme recommendations? If you see three recommendations and one is extreme while the other two are ordinary ones you ignore, that doesn’t make all of the recommendations extreme
-
No, it’s that people who already made up their minds about the existence of some “extreme” rabbit hole algorithm look for evidence that fits their preconceptions & ignore everything that doesn’t. There’s no way to tell which vid is more “extreme” just by looking at thumbnails.
-
Not that the word “extreme” has any meaning. That’s just a word people throw around as if we all understood and were in perfect agreement about what exactly constitutes “extreme”.
-
I'd say that a video about how the holocaust isn't real is more "extreme" than a cooking show, so yeah, the word does have some meaning
-
Except that the vast majority of people that get called “extreme” these days don’t deny the holocaust. The word is utter bullshit.
- 1 more reply
New conversation -
-
You'd still have to think about the ethics and effects of using a circle algorithm and decide if that's a reasonable choice for the platform. Algorithm complexity is irrelevant. It is possible that a circle algorithm would be even "worse" due to external social factors...
-
Exactly. Which is why the issue is obviously not with an algorithm, but with the content. If all content is allowed, then providing instant, worldwide distribution and efficient discovery of said content is obviously going to increase awareness. It'd be surprising if YouTube *didn't* produce an increase in political extremism, solely based on historical analysis of what happens societally when the ease of information publishing increases.
-
agree, but isn't the issue also with the algorithm then? if the algorithm works on known content & produces a bad outcome, it's a bad algorithm even if it's simple
-
No, the point is that the higher order "issue" is increased information movement. YouTube's algorithms/platform should be expected to increase political turbulence, historically speaking.
-
But that's just of a piece with the trend of the internet in general. To point fingers at YouTube because it's a big platform is missing the macro point. The genie isn't going back in the bottle.
-
Agreed that access to all this information should increase turbulence overall! But if different “watch next” algorithms can produce more or less turbulence, and we have preferences between them, then we can say the current one is bad imo
-
It depends on what you mean by 'bad'. The line between shouting fire in a theater and provocative political speech can't be perfectly defined, and with info increase, the job of defining 'good' or 'bad' filtering becomes essentially impossible, which is Jon's point.
- 4 more replies
New conversation -
-
This Tweet is unavailable
-
the argument isn't that the algorithm is designed to radicalize you, it's designed to keep you watching for as long as possible, so it just floods your recommendations with more of the same, creating an echo chamber
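(A minimal sketch of the "more of the same" loop described above, assuming a toy recommender that only counts topics; the function and field names are hypothetical and this is not YouTube's actual system.)

```python
# Toy sketch, not YouTube's algorithm: recommend whatever topic the viewer
# has already watched most, and watch the recommendations narrow.
from collections import Counter

def watch_next(history, candidates):
    """Pick the candidate whose topic the viewer has watched most often."""
    topic_counts = Counter(video["topic"] for video in history)
    # Optimizing purely for expected watch time tends to collapse into
    # "recommend more of whatever kept this viewer watching before".
    return max(candidates, key=lambda v: topic_counts[v["topic"]])

history = [{"topic": "politics"}, {"topic": "politics"}, {"topic": "cooking"}]
candidates = [{"id": 1, "topic": "politics"},
              {"id": 2, "topic": "cooking"},
              {"id": 3, "topic": "gardening"}]

# Each accepted recommendation feeds back into the history, so the same
# topic keeps winning -- the echo-chamber effect the tweet describes.
for _ in range(3):
    pick = watch_next(history, candidates)
    history.append({"topic": pick["topic"]})
    print(pick["id"], pick["topic"])
```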
-
I think extreme or fundamentalist views, by definition, stand on narrower ground and seem simpler, thus easier to spread than complex ideas. I guess YouTube mimics the yellow press in that way by giving people whatever makes them click the fastest.
-
And I would guess you don't need anything too complex to solve for click speed. So, more circles and pick from overlapping areas?
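(A hedged sketch of the "circles and overlapping areas" idea from the reply above: treat each interest as a set of video ids and prefer videos in the intersections. The circle names and ids are invented for illustration, not any real platform data structure.)

```python
# Illustrative only: "circles" here are hand-made interest sets.
circles = {
    "cooking":  {"v1", "v2", "v3"},
    "history":  {"v3", "v4", "v5"},
    "politics": {"v5", "v6", "v7"},
}

def recommend(viewer_circles, k=2):
    """Prefer videos in the overlap of the viewer's circles, falling back
    to the union if the overlap is empty."""
    sets = [circles[name] for name in viewer_circles]
    overlap = set.intersection(*sets)
    pool = overlap if overlap else set.union(*sets)
    return sorted(pool)[:k]

print(recommend(["cooking", "history"]))   # ['v3'] -- the overlapping area
print(recommend(["cooking", "politics"]))  # empty overlap, falls back to union
```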
End of conversation