If YouTube radicalises people doesn't the fault lie with the "rabbit hole" algorithm that encourages people to view more extreme videos with every click? Individual content creators aren't responsible for the methods YouTube uses to keep people on its site to sell more ads.
-
-
Few of the people supposedly thinking very hard about bias in YouTube's algorithms seem to have noticed this. The algorithm *may* be diabolically tuned to push people to extremes, but it may not, and if the evidence is also consistent with a simple circle algorithm, well, come on folks.
-
Could it be that people simply pay more attention to the extreme recommendations? If you see three recommendations and one is extreme while the other two are ordinary ones you ignore, that doesn't make all of the recommendations extreme.
-
No, it's that people who have already made up their minds about the existence of some "extreme" rabbit-hole algorithm look for evidence that fits their preconceptions & ignore everything that doesn't. There's no way to tell which video is more "extreme" just by looking at thumbnails.
-
Not that the word “extreme” has any meaning. That’s just a word people throw around as if we all understood and were in perfect agreement about what exactly constitutes “extreme”.
-
I'd say that a video claiming the Holocaust isn't real is more "extreme" than a cooking show, so yeah, the word does have some meaning.
-
Except that the vast majority of people who get called "extreme" these days don't deny the Holocaust. The word is utter bullshit.
New conversation -
-
-
The simplest possible algorithm is random sampling, not something adaptive.
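To make the contrast concrete, here is a minimal sketch of what "simplest possible" means: a recommender that samples uniformly at random, ignoring watch history, next to a toy adaptive one that conditions on the last video watched. The catalog, video IDs, and function names are all hypothetical, purely for illustration.

```python
import random

# Hypothetical catalog of video IDs -- illustrative only.
CATALOG = ["cooking-101", "cooking-desserts", "history-doc",
           "diy-repair", "news-clip", "travel-vlog"]

def recommend_random(catalog, k=3, seed=None):
    """Non-adaptive baseline: sample k videos uniformly at random.
    Watch history plays no role at all."""
    rng = random.Random(seed)
    return rng.sample(catalog, k)

def recommend_adaptive(history, catalog, k=3):
    """Toy 'more of the same' recommender: prefer unseen videos that
    share a topic prefix with the last watched one (illustrative)."""
    last = history[-1] if history else ""
    topic = last.split("-")[0]
    similar = [v for v in catalog if v.startswith(topic) and v not in history]
    rest = [v for v in catalog if v not in similar and v not in history]
    return (similar + rest)[:k]
```

The point of the baseline is that any claim about the algorithm "pushing" people somewhere has to beat this null model: random sampling produces some extreme recommendations too, just by virtue of what is in the catalog.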
-