YouTube, the Great Radicalizer

From The New York Times

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

More here.


52 Responses to YouTube, the Great Radicalizer

  1. maria perdomo June 15, 2018 at 5:15 pm #

    I have to admit that I do not agree with Zeynep Tufekci’s premise, and I would like to explain my reasons. In her article, the author claims that YouTube recommends more extreme content to those who watch far milder mainstream videos, in order to radicalize viewers. This statement is only partially true: YouTube may indeed recommend much more extreme content, but not deliberately or with the intention of radicalizing viewers. YouTube simply links content through shared keywords, and such videos, regardless of their extreme or radical ideas, happen to share keywords. This is one of the basic principles of any search engine, any advertising tool, and, more generally, of any site whose content is tagged with corresponding keywords. By the same logic, I believe YouTube also recommends milder mainstream videos to those who search for more extreme content, since they share the same keywords.
    On the other hand, Zeynep Tufekci’s article still raises important issues. It makes me think that YouTube can be involuntarily used as a “radicalizer” by third parties who deliberately add popular keywords to their videos in order to gain more viewers and followers. In that situation, YouTube serves as the platform, and keywords serve as the tools, for promoting harmful social movements and subcultures. For instance, terrorists can tag videos promoting terrorism with keywords such as “Islam” or “religion” and related words and phrases that seem neutral at first glance. Zeynep Tufekci therefore seems to be addressing the right issue; however, her article might be read less as a criticism of YouTube than as a warning about security issues on the platform.
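    The keyword-linking mechanism described in this comment can be illustrated with a toy Python sketch. The video IDs and tags below are invented for illustration only; this is not YouTube’s actual data or algorithm, just the basic idea that two videos become “related” when their tag sets overlap, regardless of how moderate or extreme their content is:

```python
# Toy illustration of keyword-based linking (all IDs and tags invented).
videos = {
    "campaign_speech": {"politics", "election", "rally"},
    "extremist_rant": {"politics", "rally", "conspiracy"},
    "cooking_show": {"food", "recipes"},
}

def related(video_id, videos):
    """Return other videos sharing at least one keyword with video_id."""
    tags = videos[video_id]
    return sorted(v for v, t in videos.items() if v != video_id and tags & t)

print(related("campaign_speech", videos))  # → ['extremist_rant']
```

    Note that a mainstream video and an extreme one end up linked purely because they share neutral-sounding tags, which is exactly the point the comment makes.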

  2. KC June 15, 2018 at 10:54 pm #

    I think the author of this article somewhat missed the mark when it comes to the dangers of internet algorithms. She describes how the YouTube algorithm recommends more and more extreme versions of the content a user is watching, then concludes that YouTube may be trying to radicalize its users. As much as I love a good conspiracy, I don’t see how this is accurate. First, Google (YouTube’s owner) is a business, and I don’t see a financial incentive to turn “joggers into ultramarathon runners.” Next, as far as politics goes, there is a lot of political content on YouTube. The company is obviously left-leaning, as evidenced by its repeated censorship controversies involving right-wing YouTube channels. So why would YouTube recommend more extreme right-wing content to someone consuming, say, “vanilla” Republican media? It seems more likely that the YouTube algorithm crawls the site for videos that share similar tags, titles, and overlapping audiences. Much like Amazon’s “Customers who bought this item also bought…” section, YouTube recommends videos based on groups of viewers who watch like-minded content. So if many of the users consuming a certain moderate video also watched a more extreme one, that extreme video is going to be recommended.
    I’m not sure whether the author knows this, but Google’s reach extends across most of the web. It handles an enormous majority of online searches, and tens of millions of websites use Google Analytics for tracking. So chances are that when you visit a website, information about your visit is sent to Google so that it can target more specific ads at you. It seems reasonable to conclude that it not only targets more specific ads but also recommends more specific content. In this way, it makes sense that someone watching a video about vegetarianism will also get recommendations for videos about veganism, because the two topics are closely related, have similar audiences, and overlap heavily across the web. For example, visiting a popular plant-based science website to read about vegetarian diets will send packets of data to Google’s analytics servers. Google can’t read your mind, but it can see that you visited that site and read articles about plant-based nutrition, and infer that you may want to see videos about veganism.
    In an age of instant gratification and sensory overload, people want to see bigger and better things all the time. We consume more media than ever before, and our appetite for bigger, better, and perhaps more extreme content keeps growing. It seems far-fetched to say that YouTube purposefully recommends more extreme topics in order to radicalize users. It makes more sense that YouTube receives your browsing information, cross-references your interests with those of people who watch similar things, and then recommends something else you may like based on that information.
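    The “users who watched this also watched…” mechanism this comment describes can be sketched as a simple co-occurrence count. The usernames and video IDs below are made up, and real recommender systems are far more elaborate; this is only a minimal model of the idea that a video gets recommended because it co-occurs in other viewers’ histories:

```python
from collections import Counter

# Hypothetical watch histories: user -> set of watched video IDs (all invented).
histories = {
    "u1": {"jogging_tips", "couch_to_5k", "ultramarathon_doc"},
    "u2": {"jogging_tips", "ultramarathon_doc"},
    "u3": {"jogging_tips", "couch_to_5k"},
    "u4": {"cooking_basics"},
}

def recommend(video, histories, k=2):
    """Recommend the videos most often co-watched with `video`."""
    co_counts = Counter()
    for watched in histories.values():
        if video in watched:
            for other in watched - {video}:
                co_counts[other] += 1
    return [v for v, _ in co_counts.most_common(k)]

print(sorted(recommend("jogging_tips", histories)))
# → ['couch_to_5k', 'ultramarathon_doc']
```

    Under this model, if the audience for a moderate video overlaps with the audience for a more extreme one, the extreme video surfaces in the recommendations with no intent involved, which is the comment’s central claim.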
