YouTube, the Great Radicalizer

From the New York Times.

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
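The escalation pattern the author describes can be illustrated with a toy model. This is purely a hypothetical sketch (the video titles, "intensity" scores, and the selection rule are all made up for illustration, not YouTube's actual system): if predicted engagement rises with intensity, a recommender that greedily chases engagement will keep serving something a step more extreme than what was just watched.

```python
# Toy model of an engagement-driven recommender (hypothetical, NOT
# YouTube's real algorithm). Each video carries an invented
# "intensity" score; the recommender always serves the next video
# up the intensity ladder, so a session steadily escalates.

def recommend_next(current_intensity, catalog):
    """Return the least intense video that is still MORE intense
    than what the user just watched, or None if there is nothing
    further up the ladder."""
    candidates = [v for v in catalog if v["intensity"] > current_intensity]
    if not candidates:
        return None
    return min(candidates, key=lambda v: v["intensity"])

catalog = [
    {"title": "jogging tips", "intensity": 1},
    {"title": "marathon training", "intensity": 2},
    {"title": "ultramarathon documentary", "intensity": 3},
]

session, intensity = [], 0
while True:
    pick = recommend_next(intensity, catalog)
    if pick is None:
        break
    session.append(pick["title"])
    intensity = pick["intensity"]
# The session walks from jogging to marathons to ultramarathons --
# the jogging-to-ultramarathon drift the article describes.
```

In this sketch the session never moves back toward the mainstream, which is the "never hard core enough" dynamic in one loop.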

More here.



  1. This article brought up numerous interesting points about how we humans consume news and how that process plays into our primal natures. First and foremost, the article argues that people are always compelled to “get behind the curtain” and know an issue more deeply. Human beings have a psychological tendency to want closure and often will jump to conclusions unnecessarily in order to reach it: “In our rush for definition, we tend to produce fewer hypotheses and search less thoroughly for information.” The pressure to know, however, depends largely on our environment. When something is stressful or can affect us, we feel a particularly urgent need to determine the truth (or whatever appears truthful most quickly). It would seem that YouTube leveraged this human need in order to raise its viewership and, consequently, earn more money from advertisements on those videos. To me, this seems to be a very unethical practice, especially given the quite hostile political climate we are currently a part of. YouTube’s blatant use of “click-bait” video recommendations is clearly a ploy to get more views. Yes, YouTube is a business, and it is understandable that it does what it can to attract a larger audience; however, context is important. Given the highly volatile political dialogue being had, it strikes me as highly inappropriate to use quasi-extremist recommendations to garner those views. Especially in the era of “fake news” and the ease with which it is propagated, continuing such a strategy is not only unethical but also potentially dangerous. You never know who will end up viewing those videos and what they will do with what they “learned.”

  2. Youtube has come to the forefront of online streaming media. Starting as a streaming service that allowed people to share their ideas, creations, and comedic genius, it has transformed into a political platform for politicians and celebrities alike. In the article the writer gives a perfect example of how Youtube, now owned by Google, uses an algorithm to suggest videos that fit your profile. This shows that Google tracks your information and is able to infer what kind of content you are most likely to watch. The more time you spend on Youtube, the more money Youtube makes; therefore, suggesting videos is a smart idea when it comes to making money. Youtube is serving up videos we find attractive in the same way fast-food chains serve up sugary and fatty foods. We know it’s not good for us, but somehow we are still attracted to that food or that video. Conspiracy videos, moreover, are not good for the public: they create speculation that does not even seem remotely correct.
    With our current President taking advantage of social media, the radicalization of people is growing. With video suggestions, Youtube can actually radicalize someone, whether on the far left or the far right of the political spectrum. Videos in support of Holocaust denial or of a communist party are a few examples of Youtube’s power to push people toward radicalization. Terrorist organizations are now infiltrating social media in an effort to attract new recruits and militia members to their groups.
    This example just shows how much power Google has when it comes to attracting people. With ads that suit our preferences, Google and Facebook have built profiles on each of us without our even realizing it. Our buying habits, the websites we visit most, and the people we communicate with are all traceable with simple technologies developed by these big companies. We are being watched everywhere we go, online and offline. Youtube suggesting videos is simple compared to what else is going on, on and off the internet.

  3. The author raises a few interesting points in the article. She mentions that during the 2016 presidential election campaign, when she watched videos about Donald Trump on Youtube, Youtube started to recommend a lot of white supremacist rants and Holocaust denials. At first she thought Youtube was trying to promote Donald Trump’s campaign. However, when she created a new Youtube account and started to watch videos about Hillary Clinton and Bernie Sanders, Youtube directed her to leftish conspiratorial content instead. Youtube is owned by Google, and Google uses people’s online behavior to rack up ad sales. Since Youtube is free to everyone, its only source of income is money from advertisers. According to Guillaume Chaslot, a former Google engineer who worked on the recommender algorithm while at Youtube, Youtube uses people’s viewing records to promote relevant, often radical content in order to keep people on the site watching more ads. Advertisers can now target Youtube ads based on people’s watch history. For example, if a user watches a lot of car-review videos, Youtube will show a lot of car-company ads.
    In the future, Google is going to let advertisers target Youtube ads based on people’s search histories. Advertisers will gain a whole new view of the customer mindset and can focus more tightly on their target market. On the other hand, Youtube will gain more income, because advertisers are willing to pay more for accurate targeting information. However, Google selling people’s information to advertisers is wrong, because people will become more radicalized and violent. For example, if someone was just trying to watch some mainstream news about gun violence, Youtube will keep recommending videos on school shootings and anti-gun protests. Since Youtube keeps recommending anti-gun-violence videos, that person may become an anti-gun protestor. Youtube is making political decisions for people. If Youtube keeps using its recommender algorithm, in the future people will be either far left or far right; the country is already so divided, and Youtube will just make it even more divided.
    I agree with Nicolas’s comment. Youtube uses a human need to raise its viewership and provide more income for Google. The more videos people watch, the more exposure the advertisements on Youtube get.

  4. Youtube is one of the most popular forms of social media in the world today. With the popularity of the site there will inevitably be advertisements, and the way Youtube advertises is pretty smart when you think about it. When you have a trend of videos that you have been watching, Youtube starts to recommend videos for you. On those videos are ads directed toward your psychographics based on your searches. This is useful in advertising, but sometimes it can lead you to the wrong sort of videos, which in this case were the Trump and white-supremacy videos. Youtube is owned by Google, and Google is a left-wing company (because of course everything needs to be about politics), so to seize an opportunity to determine which videos a viewer is interested in, it needs to determine the viewer’s psychographics. In this case, Youtube assumes that if you watch Trump videos, you are interested in white supremacy. When the politics of the site start to show themselves, that can turn viewers off significantly, especially on something as divisive as opinions of Donald Trump.

    Typically people want to keep their entertainment and their politics separate, but since the media now try their hardest to turn everyone against the president, it is impossible not to have politics shoved in your face. If the advertisements and recommended videos shown were not as politically inspired, then the social media community would feel more at ease and more comfortable saying the things they want to say, as long as they are not threatening. With sites like Youtube now using their advertisements and video recommendations this way, it is almost as if they guilt-trip people into watching the recommended video and thinking, “Is this what I am supporting?” It is a clever way to do things, but someone who has been hearing about media bias and looking into these things for a while can spot the political intentions right away and immediately be turned off. This is something Youtube needs to be careful with, as the placement of ads and videos reflects the demographics and psychographics of the individual watching. It can be a very creative marketing strategy, but to make it better? Keep political bias out of the viewer’s attention.

  5. This article was an interesting read, and given the current state of events regarding politics and the spread of information, the findings the author presents are very alarming.

    As someone who uses Youtube myself, I can confirm this disturbing increase in how “hardcore” the videos recommended to a user become. An individual could look up previously uploaded streams of a famous Youtube gamer, for example, and be recommended videos by other users who delve deep into the history of that same Youtuber, either with false information or with a dark past the individual might have already put behind him or her. Regarding the author’s findings on the leftist or far-right videos, the content described in this article can prove far more harmful to the user than the average Youtube video.

    The author mentions that the Youtube algorithm caters to the human desire to “look behind the curtain,” or discover hidden truths. The issue with looking for the truth in Youtube videos is that people are less likely to fact-check the content they see. Referring back to the New York Times article “Our Hackable Political Future,” the influence that online videos have on our perceptions of public figures and politics could drastically grow more powerful. Videos spreading misinformation through false facts and figures could become edited videos of actors portraying a particular figure, making it harder to distinguish whether or not the video presented is credible. This would worsen the spread of false information or “fake news,” as many people would be even less likely to fact-check a video that is edited so well they do not even realize that the person being represented is not actually in the video.

    It is important for viewers not to trust all information on the internet, and to check many sources before coming to a conclusion on a specific topic. These videos spreading false or extreme points of view do not leave room for viewers to formulate their own opinions through their own research. This Youtube algorithm is essentially taking away a viewer’s ability to think for themselves when forming an opinion on a specific issue, all for greater revenue. While it is important for individuals to be able to express themselves online, whether through the content they create or the content they support, it is equally important to give individuals some breathing room to conduct their own research and formulate their own opinions, instead of being pushed to extremes.

  6. As with most social media platforms, YouTube began as a way to connect with others and share ideas with friends. It wasn’t taken as a serious channel for the widespread distribution of information. Of course, this has all changed in the past decade or so, and these platforms have become a major source of news for the world’s population. In America in particular, 67% of people use social media as their main source of news, making these platforms the most popular news source in the country.
    This dependence on social media as a main source of news is particularly troublesome because at the base of all of these platforms is a company trying to increase its profits. Unlike news outlets like NBC, BBC, or CNN, these sites have no required fact-checking regulations. Anyone can post a video without documenting their sources. This is particularly evident in the algorithm used by YouTube, which takes topics you have looked at and auto-plays videos of increasing extremism. It’s human nature to desire more information on topics we’re interested in and to uncover things we didn’t previously know. This characteristic of YouTube’s algorithm is particularly concerning with regard to adolescents. It has been shown that children are most impressionable from the ages of six to fourteen, and it is those later in the age bracket who could suffer the most from YouTube’s extremist algorithm. Fifty-six percent of children aged twelve to fifteen use YouTube at least once a week; these kids are at the age where they are aware of politics and the news and are generally beginning to shape their own views. If they were to try to gather information about hot political topics from YouTube, they would be shown increasingly radical viewpoints at an age when they would be susceptible to these views without having developed them organically on their own.
    Politics in America particularly is already so polarized that a potentially growing number of radicalized left- and right-wingers could be quite detrimental. There are already so many major topics that the government cannot seem to tackle in today’s world (gun laws and healthcare, for example) that if the up-and-coming generation (and future legislators) is a large group of opposing radicals, nothing will be achieved and social unrest will commence.


  7. This article brings up something that tends to hide in the background and in the long privacy policies we scroll through quickly to accept, usually until something happens: how companies track you and collect tons of your information, and the effects their targeted marketing has on society as a whole. The author, Zeynep Tufekci, writes about how she was preparing an article about Donald Trump’s appeal to his voter base during the 2016 presidential election and consulted YouTube videos of his rallies to confirm some statements. Soon she noticed that YouTube began suggesting extreme, far-right white supremacist and Holocaust-denial videos. Out of curiosity, she tested what YouTube would do if she watched Hillary Clinton videos instead. YouTube recommended videos about government conspiracies, such as the existence of secret government agencies and the claim that the U.S. government was behind 9/11. YouTube might have connected these videos to Donald Trump because that is what the public, especially leftists, post and comment about Trump on the internet, and Google’s data collection might show that people who watched a rally then go search for the extreme, since, again, it’s what a lot of internet users put out there. Not every Donald Trump supporter is a white supremacist. She also brings up how Google makes money and its overall business model. Unless you subscribe to YouTube Red or TV, YouTube, like many of Google’s services, is free and paid for primarily by ads. All this can be problematic where some people can be radicalized, or even simply annoyed. But someone merely curious about what radicals think and believe after some event on the news, without being a radical themselves, could be radicalized through their own voluntary research without YouTube or Google recommending anything in the first place. There are also other ways one can be radicalized, whether far right or far left.
Censoring the radicalizing content would violate the posters’ freedom of speech, so long as it is not obscene. Additionally, it would be very difficult to stop the public from posting or publishing content like that in the first place. Spreading falsified information, hate, ignorance, and the like is a problem, but what to do about it is an issue with no simple solution. Also, in a way, the company is right: it is only giving you what you want. If over time your tastes change and adjust, you will go on Google or YouTube and search deeper into your interest anyway; and with the internet being a vast collection of information, it is inevitable that one could come across the extreme end of one’s interest.

  8. In the wake of the shooting in Parkland, there has been a dramatic increase in gun control awareness across the nation. On March 14th, there was a walkout in schools all across America to honor the victims of the massacre at Stoneman Douglas High School in Parkland, Florida. Out of the debate over the walkout also came the Walk Up Not Out campaign, in which students are encouraged to interact with other students they normally do not socialize with. This campaign aims to stop bullying in schools across the country, as well as to prevent another mass shooting by a student of a school they attended. Nikolas Cruz was said to be a “troubled” student and was expelled from Stoneman Douglas the year prior. Other students claimed Cruz had always shown peculiar behaviors, which led to him being bullied by classmates, but nothing will ever justify the heinous act he committed.
    The complaint being discussed is that more students participated in the walkouts for the sake of cutting class for 17 minutes than did the honorable thing and observed a 17-minute moment of silence for the victims of Cruz’s heartless actions. Due to YouTube’s algorithms, more videos about children “faking” their moment of silence were shown on users’ pages in place of actual footage of students walking out of their schools. The argument is that the alt-right was gaming the algorithms behind YouTube’s search results. One researcher complained that more of the related videos appearing on her YouTube account were either pro-Trump videos or videos supporting Republican beliefs, rather than portrayals of the Democratic candidates who ran in the previous election. Could an algorithm be to blame for this? According to approval ratings, Trump is not the preferred President of the United States, yet he still won the 2016 election. Politics aside, there is also a chance that more videos were simply uploaded about the misuse of the March 14th walkout protest, in addition to more videos supporting the Walk Up Not Out campaign.
    Families of the Parkland victims encouraged students not to walk out of school, even though the walkout was intended as a protest of gun violence in American schools. A father of one of the victims encouraged students to pay more attention to peers who are often overlooked, to smile at someone they never socialize with, to be proactive in their community, and to speak up for what they believe in, rather than walk out and deprive themselves of classroom time. He also encouraged students to be more appreciative of what they have and more respectful of others, especially their teachers, since one day a teacher could be the one to save their life.
    Walking up instead of out has the potential to make a greater impact than simply walking out of school in protest. Walking up can also raise mental health awareness in children and teens, given that the rate of mental illness among today’s youth is climbing.


  9. I think this article is a bunch of bologna and just propaganda against YouTube and Google for some odd reason. If I were to take a guess, based on the statements in the article that ask “what keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with – or to incendiary content in general,” and that “YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are ‘crisis actors’ masquerading as victims,” I would say this propaganda is directed toward controlling and monitoring what people say and do on the internet, which is not allowed and is against our constitutional rights as Americans. Whoever is spreading this fear might possibly be behind the scenes of the gun control debate as well. This is just an idea, though.
    Also, why did the government repeal net neutrality? I don’t have all the answers to this question, but consider the propaganda claiming that “human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life.” This is a quote from the article we just read. Maybe you didn’t see it as suspicious when you first read it, but now, with the context I’ve provided, don’t you think it is unnatural to monitor someone’s behavior? This is laughable. You can’t just take away our freedom of privacy because there are people with guns (who shouldn’t have had access to a weapon in the first place) shooting people. By the way, if you really want to get technical, “the devil is in the details”: the TPP, the Trans-Pacific Partnership, dedicated an entire chapter to internet-based commerce, under which all countries in the agreement would have to abide by certain rules and guidelines in order to protect citizens from the misuse of data, discrimination, potential trade barriers, censorship, and more. Just look it up.

  10. YouTube is the second most popular website on the internet, number one being the flagship site of its owner, Google. YouTube has over 30 million visitors each day and 300 hours of video uploaded every minute, making it at this point not merely a force to be reckoned with but THE force to be reckoned with in terms of the number of people it can reach and affect all at once. That ability to influence cannot be overstated, because if YouTube does something wrong, it can very easily cause serious fallout. This is no longer hypothetical, either: there have been serious catastrophes, which many casual visitors will be completely unaware of, as a direct result of YouTube’s apparent “radicalization” algorithm. As the article points out, YouTube is very clearly pushing large numbers of its users to more and more extreme versions of whatever they were originally watching. It does this for the simple reason that those types of videos keep people engaged, and the more people are engaged, the more money YouTube makes. The key point is not just that YouTube makes a lot of money off that algorithm but how it actually makes money. Just like regular TV, YouTube makes money by bidding off ad space to prospective advertisers; the more views videos get, the higher YouTube can start the bidding.
    Put that together with YouTube’s “radicalization” algorithm and you get ads for Coke and Amazon running on extreme right- and left-wing conspiracy propaganda and hate speech. This, for obvious reasons, made advertisers very unhappy, as no one wants their products linked to those sorts of messages or crowds. This led to what’s called the “Adpocalypse,” of which there were actually three separate instances in which large numbers of advertisers pulled their advertisements from YouTube to avoid being associated with “Not Advertiser Friendly” content. That, of course, sent YouTube into a panic trying to fix the problem before it could get any worse; this failed spectacularly. To try to ensure advertisements were not running on “Not Advertiser Friendly” content, YouTube ramped up its “demonetization” algorithm, which pulls ads from anything unsavory. On the surface that seems like a pretty reasonable fix, except that the algorithm was particularly bad at deciding which videos were and were not advertiser friendly.
    So things like public service messages about suicide prevention, war reporting, and educational history all had their advertisements pulled, meaning they were no longer providing their creators with any income. This adversely affected a huge portion of the creator population, leading many of them to lose significant amounts of income that was often just barely providing their livelihood. This left YouTube between a rock and a hard place: on one hand, those radical videos get tons of views because of its algorithm; on the other, advertisers don’t want to advertise on those videos, leaving the content creators to get screwed over in the process. YouTube today is still trying to deal with the fallout of the “Adpocalypse,” and at this point it hasn’t made any concrete improvements to its algorithms or appeal processes, leaving everyone in a state of limbo.
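The "views drive the starting bid" revenue model described above can be sketched as a toy auction. This is an illustrative assumption, not YouTube's actual pricing: the advertiser names, bid amounts, and the rule that the reserve price scales with expected views are all invented, and the winner pays the larger of the reserve and the second-highest bid (a standard second-price design).

```python
# Hypothetical sketch of the ad-space bidding idea: a reserve
# (minimum) price scales with a video's expected views, and the
# winner pays max(reserve, second-highest bid). All numbers and
# the pricing rule are illustrative, not YouTube's real system.

def run_auction(bids, expected_views, reserve_per_1k_views=2.0):
    """Second-price auction with a view-based reserve.

    bids: {advertiser_name: total_dollar_bid}
    Returns (winner, price), or (None, 0.0) if no bid clears
    the reserve.
    """
    reserve = reserve_per_1k_views * expected_views / 1000
    qualified = {name: bid for name, bid in bids.items() if bid >= reserve}
    if not qualified:
        return None, 0.0
    ranked = sorted(qualified.items(), key=lambda kv: kv[1], reverse=True)
    winner, _top_bid = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else reserve
    return winner, max(reserve, runner_up)

# A video expected to draw 2M views sets a $4,000 reserve here,
# so more views mean a higher starting price, as the comment notes.
winner, price = run_auction({"Coke": 5000.0, "Amazon": 4200.0},
                            expected_views=2_000_000)
```

The sketch also shows why the "Adpocalypse" hurt: if every qualified bidder withdraws, `run_auction` returns no winner and the slot earns nothing.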

  11. This article actually highlights something that has been a big fear of mine ever since I attended a TED talk on the subject; you can find the talk, unsurprisingly, on YouTube. The fact that YouTube can influence what we see is quite scary given the current political climate. Political ideologies are becoming increasingly polarized and extreme in the United States, and this is only contributing to the problem. Seeing as YouTube is also one of the only means by which we can watch so many videos online, it has a greater responsibility to run a well-regulated website. This article also does a good job of highlighting how this is part of a larger issue. Companies are increasingly going to have to look at the intent of their content, especially companies that provide information to the general public. This is a role that companies such as Facebook, YouTube, and even the internet service providers have never been in before. Government may be required to step in and make sure that these companies are acting in a responsible manner. If a reasonable solution is not reached regarding how we should apply algorithms, it could undermine our current political system to the point of collapse.
    The article highlights that Google, the parent company of YouTube, really doesn’t seem to care what it plays as long as it is gathering large amounts of advertising revenue. In my opinion, the irresponsibility of companies in this realm should be treated the same way as if they were polluting the environment. This is too important to go unchecked.

  12. As a millennial who likes to watch “Last Week Tonight with John Oliver” and other entertaining and educational videos online, I enjoy what Youtube offers. I have a chance to watch a trailer for an upcoming movie or the newest of Buzzfeed’s “Unsolved Mysteries” videos. However, I am not surprised to read how Youtube’s algorithm matches us with videos most likely to elicit some sort of response from us. Facebook had a similar issue with algorithms that matched viewers with false news appealing to their political preferences. Although Facebook is now attempting to correct its algorithm to surface fewer, more trustworthy news stories, we are once again reminded of the problems that social media pose for many of us. Information now travels fast to our phones and Facebook feeds, and our constant exposure to it is causing many negative symptoms such as insomnia, social anxiety, and depression.
    At the same time, dangers loom in the dark corners of the internet, where people with radical thoughts can post their messages for the world to see. One past example is the ISIS recruiting videos on social media, through which the terrorist group managed to convince many Westerners to join with its propaganda videos and messages. Another example is the recent school shooting in Parkland: many are concerned that media coverage of school shooters will encourage copycat crimes. Of course, the internet also allows people to vent and express their true opinions on certain matters, even when those attitudes are unpopular. Platforms such as Breitbart and Alex Jones’ radio show often represent the extreme right view, while CNN tends to lean heavily toward the leftist agenda. Simply put, social media platforms may allow free access to information and opinions, but some of those opinions and pieces of information are either potentially dangerous or misleading. Therefore, people need to be more cautious and wary when they are online.
    As the internet provides so much content, scammers are also prowling to make a few bucks from unwitting participants. One of the most common job frauds involves work-from-home jobs. Someone may contact a job seeker via email or text message, saying the person has an opportunity to work as a personal assistant. Interestingly, many students at Seton Hall University received an email notification about such a personal assistant position. This is another reason to practice good judgment and precaution whenever someone makes an offer that sounds too good to be true.
    People often characterize the internet age as a time of boundless opportunities. However, we should also realize the dangers that come along with the internet. If we do not take responsibility to be vigilant about the content that we see on the internet, we may end up in a bad place.

  13. Simply put, YouTube has changed the world. There is absolutely no denying that because of YouTube we live in the society we have now; without the platform we would not have Justin Bieber, Kony 2012, Gangnam Style, Charlie Bit My Finger, and millions more. Personally, I watch YouTube more than I do television, and that has held true for a number of years now. Ever since I first discovered YouTube in the 5th grade watching stop-motion Lego videos, there has never been a week I have not visited the site. For me, YouTube has transformed from pure entertainment into a secondary source of knowledge and educational substitutes for my classes.
    I have watched countless hours, gone “click-happy” on YouTube, and often conducted my own “tests” and “studies” of where the platform will take me. If this is the route I want to take on a particular late-night YouTube session, I begin by choosing a video from either the recommendations or the Popular page; I then choose something of particular interest, obscurity, or intellectual substance. From there, I let YouTube do its magic. The algorithm the New York Times article touches upon is absolutely incredible. Essentially it keeps our eyes on the screen, feeding us “sugary, fatty foods,” and keeps the revenue flowing. It is absolutely genius on YouTube’s behalf and is the reason for its survival. By the time I realize what happened, I have already watched about 10 videos and two and a half hours have passed, purely off YouTube’s algorithm, which is backed by parent company Google. The “dynamic duo” could dominate the video-content-sharing game forever with the continued push toward stronger AI and algorithmic intelligence. Amazon’s recent acquisition of Twitch, however, is something that should alarm YouTube.
    Yes, the New York Times article talks about politics and extremism in the video recommendations, but what it does not touch on is the growing eSports and gaming genre. This is, by far, the future of the world. I, like a select few, believe this will be the fastest-growing sport in America within the next few years, and YouTube paved the path. It was among the first to offer video livestreaming, which has grown into a multibillion-dollar industry. People have made an astronomical amount of money from YouTube; the reason lies in the algorithms and the “behind the scenes” work that happens within the platform.
    Personally, I do not see a problem with YouTube’s business model, particularly its algorithm-driven, content-affiliated recommendations. It is simply using human curiosity to pave the way for the next video.
    This link sums up just how important YouTube has been to the world in the past 13 years:

  14. The cyber world has become the platform toward which individuals have migrated their research, offering more than sufficient tools for deep exploration of the incessant topics of the modern world. Trapped in the ever-growing mists of the technological era, we are pressed upon by the forces behind our laptops throughout our itinerant searches; the network operates almost like a set of monopolies, with specific companies housing the most popular means of each category of research. Within this monopolistic landscape, YouTube represents the information and entertainment stream, making almost any movie, song, or piece of educational data available within its seemingly infinite videos. With a user base exceeding 1.3 billion individuals, almost 300 hours of content are uploaded to YouTube every minute. It is an inexhaustible technological engine, powering not only the interests of its many consumers but also providing an online informational tome from which almost any subject can be acquired. But it is not just the open entertainment platform that makes YouTube such a widely utilized source; it is the recommendation ability, as the article displays, that draws users further into either their topic of research or the shadows of a similar interest. The recommendation algorithm, as YouTube explained at an ACM conference on recommender systems, is run largely by Google Brain, a deep-learning artificial intelligence system that combines open-ended machine learning with systematic engineering. With such software, individuals can now explore ever more interesting videos, in an endless loop of AI-based suggestion.
As soon as you decide to make an account on the platform, you have signed up for a preference-tracking program; any video you watch or like is embedded into your “channel’s” DNA, until all of your interests appear on your monitor, likely having started from a single video. Additionally, many individuals on YouTube have started to receive income from the company, such as organizations that have made accounts or individuals with a specific realm of expertise, making the available videos that much more appealing and authentic to the common eye. Branching off from its original roots, which have been evolving for over a decade, YouTube now represents a media powerhouse, controlling not only the attention of much of the web’s population, or at least those who visit the site, but also a growing revenue system; a corporate scheme leeching onto its consumers from what would seem a platform of pure visual enjoyment. The advertisements placed on YouTube create a direct, systematic approach to building profit from the company’s website. What used to be a full and everlasting stream of educational resources, especially with the recommendation system intact, is now transforming into an advertisement factory, polluting any video that attracts enough views to generate income. With millions of diverse categories to delve into in the YouTube atmosphere, the site has become an informational cyber encyclopedia, where any interest you have can be pursued within the various videos it offers. Any additional knowledge you may have to contribute to a topic can be added in the comments section under the video.
Personally, much like the author of this article, I have been in situations where my Business Law research on YouTube has grown into a more elaborate study of the subject, as I watched more and more videos that were essentially variations on the original source. From this, more educational opportunity is generated, despite the occasional off-topic suggestion. Consumers are constantly immersed in their entertainment or educational research before exploring the next suggested video. This growing corporation has become an interconnected bridge where personal preference and discussion can emerge, within what may seem a manipulative advertising scheme.


  15. It seems as if you are never “hardcore” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
    This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.
    What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with, or to incendiary content in general. Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.
    Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.
    The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos. It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content.
In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended. Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.
    YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.
    What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
    Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt, and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes, and misinformation.
In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.
    This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.
    This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.
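The escalation dynamic the article describes, a recommender that maximizes watch time drifting toward ever more extreme content, can be sketched in a toy simulation. The engagement model below is a deliberately simplified assumption for illustration, not YouTube's actual system:

```python
# Toy simulation: if a recommender always picks the candidate with the
# highest predicted watch time, and slightly more extreme content holds
# attention slightly longer (an assumed model, not YouTube's), a session
# drifts steadily toward the extremes.
import random

random.seed(0)

def predicted_watch_time(extremeness):
    # Assumed engagement model: watch time rises with extremeness,
    # plus a little prediction noise.
    return extremeness + random.uniform(-0.05, 0.05)

def next_video(current_extremeness):
    # Candidate videos cluster around the current video's extremeness,
    # on a 0.0 (mainstream) to 1.0 (extreme) scale.
    candidates = [
        min(1.0, max(0.0, current_extremeness + random.uniform(-0.1, 0.2)))
        for _ in range(10)
    ]
    # The recommender greedily picks the candidate it predicts will be
    # watched longest.
    return max(candidates, key=predicted_watch_time)

level = 0.1  # start with mainstream content
for step in range(20):
    level = next_video(level)
print(round(level, 2))  # after 20 autoplays, the session is far more extreme
```

The point of the sketch is that no one hard-codes "recommend extremism"; greedy watch-time maximization plus a mild engagement bias is enough to produce the drift.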

  16. As someone who watches a lot of YouTube, this article really scares me. For some reason I have never been suspicious of the videos recommended to me, but it makes a lot of sense. There have been more than a few times when the video I started with led me down weird paths I never thought I would find. I always thought the algorithm was a little off, sometimes giving me videos I had never searched for or topics I had never even considered looking into.
    At first this all seems harmless, until you start realizing the consequences this could have for younger kids, and even for people on the fence about certain topics. People look to YouTube as a source of information because video evidence seems to provide an unbiased view. However, when kids search for topics related to the news, they will be brought to radical videos that shape their minds and never allow them to form opinions of their own. I know exactly how convincing YouTube videos can be, and I have learned to watch them while keeping a completely open mind, not allowing any single argument to convince me. Kids, on the other hand, are much more vulnerable to persuasion and will not be able to spot information that may be false. And the fact that Chromebooks from Google, the owner of YouTube, make up more than 50% of the pre-college laptop education market cannot be a good step toward solving this problem.
    This is not the only recent problem YouTube has had with its algorithms. Many people are upset because the algorithms used to determine how much money YouTube creators make are decreasing everyone’s revenue, and YouTube has done very little to explain the trend. It does not surprise me that YouTube is getting caught up in more controversy, and I would not be surprised to see something like this happen again.
    In terms of presidential elections, this could obviously cause major rifts throughout the country. Everyone is aware of the argument that Russia helped Trump win the campaign, and there has been massive backlash because of it. Now that a former YouTube employee has come out saying we are more likely to end up on a pro-Trump video than a pro-Clinton one, there will no doubt be uproar. We just want the unbiased, mainstream content we search for, without an algorithm deciding what we want to watch for us. Until YouTube acknowledges what it is doing and tries to fix it, we will have to be careful about exactly which videos we watch. Right now they are firing employees who try to make changes so they can keep pushing whatever agenda they have. I sincerely hope more people do research and make it available to the public, because the only way to fix this problem is to spread awareness.

  17. I believe it would be unfair to single out YouTube, because it is not the only place where young people now get their information. Sites and apps such as Instagram and Facebook have the same amount of power, because their search algorithms end up leading you in the same direction YouTube’s does. As you like more and more pictures on Instagram, you get deeper and deeper into the rabbit hole of posts on your search page, which can lead you in a far-left or far-right direction. Facebook can lead you in the same direction, as it suggests certain pages when you search for something. So it is not just YouTube shaping the information we consume as young people; what YouTube does with videos, we can find on Instagram as well. However, I do agree that these social media outlets have too much control and could be better regulated than they are now. YouTube has always been mysterious about its search algorithm, and this is not the first time people have speculated that it leads them down a certain path. It is a kind of advertising in real time: as we go deeper and deeper, it brings us to these outcomes. We could definitely regulate YouTube better, now that people have substituted it for TV. As Adam Facella said in his comment, “A young person would not sit down and watch the news to get political information based on the news channel. But, they will watch a 2 minute video if it is ‘recommended’ for them.” As we did in the past with TV, we can regulate what is shown on YouTube. On TV we have regulated which advertisements are shown, especially to younger generations: companies can no longer target children with ads for things like junk food, and instead have to advertise healthier foods. This created positive change, since childhood obesity is a growing problem that people are trying to fix.
Children can be easily manipulated, and if they are going through YouTube watching radical videos, it will eventually affect them if they believe the information.

    Some type of legislation has to be made to regulate ads and YouTube videos. An easier fix than involving legislation would be for YouTube to change its search and suggestion algorithms so that videos are actually related to what you searched for. At the very least, putting an age restriction on these radical suggestions would help, so that children are not exposed to these videos. Someone who is 16 or 18 years of age can handle this type of information far better than a 10-year-old. Financially, YouTube or Google would lose some ad revenue, but if it ends up stopping the radicalization of children, that is a better alternative. Places such as Norway and Quebec already have legislation saying companies cannot advertise to children under the age of 12, so I believe it would not be too much to ask YouTube to change its algorithm for kids under a certain age.

  18. YouTube is one of the great wonders of modern technology and AI. The Google-owned platform is both a great divider and a great source of community for billions of people worldwide. As someone who uses YouTube every day as a source of information, I find its positive feedback loop unmistakable. At times the loop is exhausting and eventually causes me to leave the platform altogether. I have followed politics most of my life, and YouTube serves as a way to get snippets of the most important news without having to endure the hours of debates and opinion-sharing rampant on most large news networks. Many people like YouTube because it is easier to find only the information one is searching for, but this of course also caters to people’s natural tendency toward confirmation bias. In a sense, when YouTube executives defend their actions by stating that they are only trying to give people what they want, they are not incorrect; yet people might desire more varied information if it were easily offered to them. For example, I tend to binge-watch a single type of content for hours and become frustrated when other content becomes hard to find.

    My experience on YouTube would be more enjoyable if it were easier to switch from one kind of content to another without having to subscribe to every channel. It makes sense for YouTube to want its consumers to subscribe to as many creators as possible, but oftentimes I just want to watch a single video to research a topic or better understand an opinion without giving a creator a new subscriber. I have definitely had experiences where I started watching a single video about politics and ended up on extremist creators’ pages where uncomfortable positions are treated as matters of fact. The business model YouTube uses is faulty because it fails to acknowledge that many people do not wish to be avid watchers of one kind of content, but prefer to dabble in different genres and have a well-rounded set of interests. By improving the algorithm, YouTube might make more money and be less divisive. At the end of the day, YouTube is not responsible for the content its creators make, but it should be conscious of how its algorithm can quickly push a moderate consumer of information toward questionable and inflammatory content.

  19. Whether intentional or not, I believe that YouTube’s algorithms can be dangerous, especially when it comes to politics. Obviously anyone can throw their opinions into a video and post it, which in itself is not bad at all. However, like many other repositories of information, this can easily become an echo chamber for those who only listen to what they want to hear. When it comes to news, it is in my opinion imperative that both sides of a story are told and all aspects assessed. Having YouTube queue up similar, or even more radical, videos one after another is a recipe for disaster. Take the writer’s example: say they had been on the fence about their political stance and continued watching queued videos. YouTube would not tell both sides of the story, only feed viewers related video after related video. As a source of news, this is very dangerous. And the harsh reality is that for many young, impressionable children, this is their only form of news. Of the 1.3 billion YouTube users, 14% are under 18. Most children do not turn on their televisions to view the daily news, so YouTube is their platform for current world events. It is important that they receive unbiased accounts of the stories. Therefore, I think it is important for YouTube to adjust its algorithms, at the very least for political videos. And with regard to nonpolitical topics, some diversity would be welcome.

  20. In today’s society, social media platforms are becoming an integral part of people’s everyday lives. With sites such as Facebook and Twitter, people have instantaneous access to information and developing news from around the world. However, a trend that has become very controversial recently is the presence of recommender engines designed to “cater” to consumers and internet users. Many of these use an online data system, known as “cookies,” to base recommendations on a user’s prior activity on another site. Browsers and search platforms such as Google Chrome and Safari originally supported this for convenience and as a way for corporations and companies to promote their businesses online. For example, if a person shops online and purchases an item, ads from the store they purchased from start appearing on their social media sites. Many of these advertisements offer discounts, incentives, or new arrivals that would appeal to the shopper and entice them to buy from the site again. Another example is reading news articles online: once a person reads an article, ads for other articles that might interest them begin appearing on their social media sites. More recently, though, one controversy surrounds the video streaming site YouTube, where it has been claimed that, after you watch a video, YouTube recommends more videos on the same topic, including more extreme takes on it. For many, the main issue at hand is whether this infringes upon personal privacy: in a “Big Brother” sense, an unknown person or computer system is collecting data on a person based on how they use the internet. It is, essentially, a form of surveillance.
    From my perspective, there are both positives and negatives to these recommendation engines. On the positive side, they have the potential to catch people who have committed, or are planning, criminal activity. Recently, for example, police and law enforcement agencies were able to stop people who were planning school shootings because of the profiles built from those people’s web activity. Many other criminals, such as sexual predators and child pornography distributors, have been caught because of the data these engines collect. They have truly proved useful for providing police with data and information that led to crackdowns on online criminal activity. For many other people, however, this creates a sense of fear. Many are becoming fearful that, even on their own phones and computers, the content they look up is being monitored. It also creates the fear that the government or internet provider collects information without notice, without people even knowing who is monitoring them.
    In my personal experience, I find these recommendation engines very useful. Whenever I shop online or watch videos on YouTube, they provide a convenient way of finding videos or items that appeal to me or relate to what I just watched. For example, I once had to do a project on the Beatles. When I went on YouTube to look up documentaries or songs, more recommended videos would immediately appear. This helped me greatly in developing my report, as these recommended videos brought different perspectives on the band’s work, conveniently provided, which I otherwise would not have found.

  21. Social media is influencing our lives like never before. Even though we talk a lot about how Amazon, Facebook and Google have information about us, we rarely mention how YouTube is molding an entire generation’s thoughts and ideas. Ten years ago, most children in the 5-to-10 age range spent their free time watching children’s TV channels like Cartoon Network, Discovery Kids or Nickelodeon. Now, the same age range spends much more time on YouTube, watching content that can vary across anything we can think of. The revolution YouTube caused in the way we see, understand and interact with new content has created the so-called YouTuber profession. These are people who work by creating content and uploading it to YouTube on a regular basis, and the ones who succeed are making a lot of money, both from advertising deals and from YouTube’s “AdSense” program, where advertisements are shown before the actual video starts. The algorithm basically bases itself on the number of views a video or channel is getting, and then inserts the most expensive ads on the videos with the most views; YouTubers receive a paycheck for each video they upload that carries ads. One example: the Brazilian channel “Felipe Neto” posts videos every day, averaging 3.5 million views per video. These videos usually receive big ads, since this is Brazil’s second-largest channel, with 20 million subscribers, which also puts it among the top 10 biggest channels on YouTube. On today’s YouTube, 3.5 million views generate approximately 2,500 US dollars. Since the channel posts videos every day, it can make almost 75,000 dollars a month, not even counting the advertising deals already embedded in the videos. As mentioned above, YouTube generates an enormous amount of money, and thus it can be quite a resource for people; but most of them are using it the wrong way, uploading content that is not exactly beneficial to mainstream audiences.
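The revenue figures quoted above can be sanity-checked with a quick back-of-the-envelope script. The $2,500-per-3.5-million-views rate is the comment's own estimate, not an official AdSense number:

```python
# Back-of-the-envelope estimate of a daily-upload channel's ad revenue,
# using the figures quoted in the comment (assumed, not official rates).
views_per_video = 3_500_000
revenue_per_video = 2_500            # USD per video, as quoted

# Implied RPM (revenue per 1,000 views)
rpm = revenue_per_video / views_per_video * 1000

# One upload per day, roughly 30 per month
videos_per_month = 30
monthly_revenue = revenue_per_video * videos_per_month

print(f"Implied RPM: ${rpm:.2f} per 1,000 views")      # about $0.71
print(f"Monthly ad revenue: ${monthly_revenue:,}")     # 75,000 USD
```

The implied rate of roughly $0.71 per 1,000 views, times thirty daily uploads, reproduces the "almost 75,000 dollars a month" figure in the comment.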

  22. We live in a time when social media has become a viable source for millennials and the younger generation to receive news and updates about what is happening in the world around them. In this article, YouTube is the social media platform under focus, used as a news source for the 2016 presidential election. Beyond the ridiculous fact that social media is being used instead of an actual news show to deliver election information, the algorithm coded into the YouTube site automatically queues up videos that push users toward the extremes of whichever side of politics they lean toward, based on the videos they watch. In this case, the author watched a Donald Trump rally video and found “white supremacist rants, Holocaust denials and other disturbing content” queued up after it. Although queuing similar videos together makes sense, the fact that the YouTube algorithm pushes users to such extreme lengths in their choice of videos is unsettling.
    As a user of YouTube, I enjoy the fact that similar videos follow the one I am currently watching. Much like data-mining-driven marketing strategies, the YouTube algorithm is doing exactly what it should be doing. Since Google earns more profit from YouTube the longer a user stays on the site, an algorithm that goes to extremes is perfect for the organization, as well as for holding the user’s attention. Because YouTube manages to keep my attention by queuing similar videos one after another, I believe the algorithm is fine the way it is.

  23. YouTube was founded as a wonderland of free expression for content creators. I have enjoyed content on the site and followed channels for years now, and it has become a platform that I love. As of late, it has done everything within its power to test that love. A short while ago YouTube came under fire for its advertising algorithms. Ads are the only way content creators can make money from their videos on YouTube, and the site’s tightening of its algorithms denied monetization to many of my favorite creators. Now YouTube is in trouble again over its algorithms, but this time the concern surrounds its recommendations. I decided to observe my own recommendation tendencies and compare them to the content of this article. I started with a Daily Show video that was recommended at the top of my home page. The next recommended video was from Stephen Colbert on the same topic. Next came a Bernie Sanders interview at SXSW, followed by multiple SXSW interviews leading me to Jon Stewart. This took me to a string of George Carlin clips that eventually led to an analytical video on the “Genius of George Carlin,” which finally brought me to a conspiracy video made by the same channel. The chain was long, and it would have taken me almost an entire day to watch every video, but it did eventually lead to something radical. All of it started with the Trump joke of the day from Trevor Noah.
    The recommendation algorithm having a bias toward radical content surprises me, given YouTube’s handling of the previously mentioned “Adpocalypse.” To convince advertisers not to pull their backing, YouTube categorized non-family-friendly content as non-monetizable. This tendency to recommend radical content seems contradictory to the advertiser-friendly environment YouTube is trying to create. Ultimately, I do not see an intentional conspiracy to promote radical content here. Google has always tried to milk every cent of advertiser revenue it can from the platform. Its rush to censor offensive content after a threat of lost revenue simply does not leave room for the practices it is being accused of in recommending content.
    Assuming there is intent, or at least a tendency, for YouTube to recommend radical content, I do not find the issue all that concerning. All social media platforms will inevitably push radical content. In a massive public forum where everyone is given a voice, only the loudest will be heard. As this article notes, our viewing tendencies can be likened to our diets: we naturally crave sugar and fat, just as we crave controversy. As a result, it would not be surprising for platforms to begin pushing content that stokes that interest. Perhaps I am giving YouTube too much credit and that is exactly what it has done. Even so, I think people willing to believe the content of misleading and radical videos and posts would naturally find them eventually. For every “Sandy Hook Was A Hoax” video filled with lies and anger, there are plenty of level-headed informational outlets debunking the false claims. The internet holds more information than we could encounter in our entire lifetimes. There are plenty of ways to challenge and verify the things we see online, and our own views. I believe it is the duty of self-respecting thinkers to broaden their informational nets as much as possible. If they choose to take the information in every recommended YouTube video as gospel, I see that as their fault just as much as, if not more than, the platform’s.
    That is not to say that every individual on YouTube can be expected to be a self-respecting thinker. I am speaking specifically of the large population of children who enjoy content on the website. Parents can try their hardest to monitor what their kids take in, but ultimately kids will find a way to branch out. YouTube offers a separate kids’ page that is closely monitored for offensive content, but that does not mean plenty of kids are not running free on the unrestricted pages. After all, all you need to do is check a box claiming that you are 18. A tendency to promote radical content is definitely concerning if it can corrupt the impressionable minds of children, but I am more concerned with the supposedly advertiser-friendly content that kids already flock to. Recently Logan Paul, who has a massive following of almost 17 million subscribers on YouTube, most of them children, came under fire for showing graphic footage of a dead body found in a Japanese forest. He mocked the body and ultimately made light of suicide for his underage audience. The point I am trying to make with this example is that kids are already seeing content they should not. Making a great fuss over the content YouTube recommends seems pointless to me.

  24. As someone who uses YouTube often and loves to find funny and entertaining videos on the site, I find this article to be something of an eye-opening revelation. YouTube’s recommendation bar is one of the most ingenious inventions ever: a little bar pops up next to the video you are watching, filled with an assortment of related videos. The reason I call this ingenious is that, as stated in the article, YouTube makes its money from advertising, and the longer people stay on the site, the more money it makes. Thanks to the related-videos bar, watching one video sometimes turns into watching 10 or 15, and spending 5 minutes on YouTube turns into an hour. YouTube has figured out how to catch our attention and keep us hooked, constantly wanting more.
    While I have always known this, what I did not realize was that the recommendations bar offers videos much more radical than the one currently being watched. As stated in the article, when the author watched videos of Trump rallies, videos about the KKK popped up in the recommendations bar. While this is obviously a very dangerous problem, I wonder if it is already too late to be fixed. While the radical videos are bad enough, as I stated before, the recommendations bar can turn a 5-minute YouTube session into an hour-long one. What happens, especially to younger people on YouTube, is that they start off by watching one political video and then spiral into watching dozens. The problem is that all of these political videos are coming from the same viewpoint. These people are getting brainwashed by these videos and are completely missing the other side of the story.
    What this does is help make the deep political divide in this country even deeper. With the recent election, our country is already more divided than it has ever been. People refuse to see other people’s points of view and only want to see what they think is right. With the current algorithm, viewers of YouTube will only see the videos that fall under their biases and will never see the other side, thus contributing to the divisive society we have today.

  25. YouTube has been a great source of information for students, teachers, researchers, and many others for years. Recently, YouTube implemented an “autoplay” feature that automatically plays a video similar to the one you’re watching. It has been found that through this feature, ideas are radicalized, as ever more extreme videos are recommended after each one you watch. This enhances the human impulse to “dig deeper” into theories, information, and ideas. While this broadens our knowledge of certain topics, it can also hurt how we form opinions on other topics.
    The article explains the heightened need humans have to consume deeper and more outrageous material through the consumption of radical videos on YouTube. In the example used, the author found political opinions to be a factor in her searches, where YouTube would ultimately point her to pro-Trump themed videos. YouTube’s use of the autoplay feature mixes politics with consumers’ desire for entertainment, which cannot be good. Although there isn’t a direct parallel by definition, YouTube is essentially functioning as propaganda in a much broader sense. Google is a specialist in advertising, and this is just a new form of advertisement for it to utilize. From a business and political standpoint, this is a genius way to steer consumers into liking and consuming something specific. However, from a personal standpoint, this feature will only disrupt how we function as humans, as it forces us toward something we didn’t ask for.
    As a YouTube watcher myself, I can definitely confirm that the autoplay feature leads to a more radicalized experience of watching a simple video. When doing research for my TID, I needed to look up a video on how steel is manufactured in the United States. Since the enforcement of American-made steel, and therefore “America First” policies, are mainly Republican concepts, the videos recommended for me after watching the steel manufacturing video got more and more Republican-based. Although the following videos didn’t change my political opinion, I definitely think a feature such as this could do so for other people. Individuals who dig tirelessly into a subject—such as me with conspiracy theories—can develop deeper and deeper understandings and opinions in favor of a certain side. YouTube is only furthering the enforcement of ideas onto its users through this feature.

  26. YouTube is one of the most popular sites that many people go and view on almost a daily basis. Because it houses videos about many different topics and ideas, there is something for everyone on YouTube. YouTube does have some content that is not for everyone, and the average person would probably not be found searching YouTube for white supremacist content. It was interesting to see that the author tried the other side of the spectrum to see what content someone more liberal would get. YouTube uses algorithms to decide what kind of content you would want to watch based on what you have previously watched. This makes sense because YouTube is owned by Google, which is famous for tracking the information you search for and tailoring your experience to it. Even the advertisements you see on YouTube are dependent on your Google search history or what websites you have visited in the past. I notice it, and sometimes I will search things just to play around and see what happens.
    For example, if I go to Bloomingdale’s website and look for shoes, then go to a mattress company’s website, then go to YouTube, most likely I will get an advertisement for something similar to what I have searched in the past. Other companies have admitted to tracking what you buy or look for, like Target, which sent a teenage girl congratulations on her new baby when nobody but her knew she was pregnant. Some may say that could be an invasion of privacy, which may well be true, but we all know about it and still use the internet and Google. But I am not sure that YouTube is radicalizing school children, because the children would have to go click on content like that to get recommendations for political videos. Though YouTube does have a politics section when you open it for the first time, if one does not click on it, then one will not fall into the black hole of watching strictly political videos. I know that I like to keep my entertainment and politics separate, so I know never to click on anything political on YouTube unless I want to keep seeing political videos. YouTube was created to spread ideas that others might share; though a lot of the children using the Chromebooks provided by schools are able to shape their own views, they can do that on Google by themselves without the use of YouTube. While YouTube is not censored, some of the information users are fed might be incorrect, but then, some news outlets are not always truthful either. The flip side is that younger people are learning how they can take action on things such as the National School Walkout that happened recently in the wake of the school shooting in Parkland, Florida. Without YouTube and other social media, the students would not have known what to do or that it was even happening.

  27. YouTube has become one of the most popular social media platforms there is to this day. The main reason for that might be its ability to keep consumers watching with the recommendation algorithm. Just as stated in the article “YouTube, the Great Radicalizer,” the author points out our human desire to look “behind the curtain,” to continue watching something related to the video that has just ended. This algorithm can be very dangerous because it can lead any viewer in whatever direction it chooses. The author also points out that regardless of whether you were pro-Clinton or pro-Trump leading up to the election, you were a lot more likely to be recommended a pro-Trump video. With many people turning to YouTube for information, the algorithm could have been strong enough to be the reason Trump won the election in the first place. Taking it one step further, the extreme right-wing videos can lead you into Holocaust denials and white supremacy.
    If this is the path the recommendation algorithm will take you on, then YouTube can become more than just a social media platform for sharing ideas. I agree with Mark’s comment that YouTube can be used as a tool for terrorist groups to recruit people or to manipulate viewers into believing conspiracies. The programming of the algorithm could easily shape the world that we live in today. For example, as stated in the article, the recommended videos after watching some Trump videos would be white supremacy videos, as if YouTube would like the viewer to believe that Donald Trump is a white supremacist as well. The algorithm could also lead people into places where there is misinformation, hoaxes, and lies. The author points out that it is similar to how we crave sugary and fatty foods, and YouTube is the restaurant that has an unlimited amount of them and will feed everyone whatever they are craving at that moment. The consumer would constantly be craving the next meal without worrying about the consequences until it is too late. Even though YouTube may have started as a way to share videos, it could now be used to radicalize the public into whatever it wants them to believe and see.

  28. Was there ever a time when you wanted to conduct research for a big project or essay, then found an interesting but irrelevant topic and decided to read about that instead? Well, that has happened to me on many occasions. One minute you could be looking up a historical event, the next a famous person’s background. That could also lead to reading endless articles based on Google searches and watching funny YouTube videos as well. You could even share those with your best friends if you find funny images. These are all examples of social media’s influence. For things like research, YouTube is an excellent resource, as there are all sorts of opinions on endless topics. When it comes to entertainment, it can become both a distraction and a negative influence, as many controversies have emerged just from people uploading things that involve some of the most sensitive topics to date.
    Even though it is extremely difficult, I feel that it is important that we try our best not to take the opinions and decisions of other content creators to heart (especially in the world of politics), as they are normal people who make mistakes every day just like everyone else in the world.

  29. This article from The New York Times wonderfully dove into the realm of YouTube, showcasing some things that I myself have been thinking about for years and feel are very important to bring to light. That is, how successful YouTube has become at creating these simple wormhole-type traps that people can get sucked into, watching senseless videos for hours and hours with little effort involved. The topic of these videos can be absolutely anything; however, it is always the case that as the user gets deeper into this “wormhole,” the videos on the specific topic get more and more extreme. We see an extreme viewpoint on a certain topic from someone who may not even be qualified to speak on such issues. Still, the idea of getting some sort of exclusive scoop, something that might seemingly give someone an intellectual edge over other people, is very attractive to a lot of people. This is what comes as a result of these more radical YouTube videos. Whether the speaker is right or wrong, because the account might have some sort of following, some people may take their words as seriously as gospel and use them to form their own extreme opinions on various topics.
    In a world where we use the internet to express ourselves and give ourselves brief sources of entertainment, media platforms such as YouTube have quickly become the wild west of media. Beyond the loveable cat videos and music video parodies lie several cult-like communities of users who express different views, all of which sway toward one extreme side of the pendulum. With YouTube being able to make anyone famous, fame usually falls to the most outrageous characters saying the most outrageous things that keep getting those all-important clicks on their videos. This creates one large silliness contest, similar to President Trump’s “my button is bigger than your button” when referring to North Korea’s Kim Jong-un. Yet it is those clicks that make Google such a substantial amount of money, and they are the reason it uses the algorithms it does when recommending specific content to its users. If the model is making money, and the videos are within the guidelines of YouTube’s content policy, then why would they stop what they are doing? Furthermore, what is the result when things suddenly get increasingly political and every shocking proclamation coming from a vlogger’s parents’ basement gets liked and shared to the next person in the “wormhole”? From what has been noted in this article, it seems you get exactly the wrong kind of thought-provoking conspiracies, detailing only the most extreme views from either side of any spectrum. And what is more thought-provoking: what if you were only searching for one video as a joke, and YOU got sucked into the “wormhole”?

  30. YouTube is an interesting platform: over a relatively short period of time, it is able to identify the types of videos we watch and tailor a profile specifically based on our browsing history. The findings of this article are alarming to me. Our suggestions should not be so extremely polar that they do not even provide us the viewpoints of the opposing side. The fact that the website autoplays videos that are set to agree with us is an unhealthy practice and, in my opinion, closes us off to the opinions of others as we bury ourselves deeper and deeper in evidence for the side we agree with. Regardless of what the topics are, be it politics, diet, or anything in between, YouTube and countless other browsers and applications carefully categorize us and the materials we view in order to suggest information that matches our preferences.

    As we have mentioned on countless occasions, if you do not pay for something, you are not the customer. YouTube, like many other platforms, collects data on individuals, which it analyzes and sells to large companies, mainly for advertising purposes. With the information collected, they can create images of the consumer, which can be used to pigeonhole content and lead individuals to radical schools of thought. Any given individual could be ignorant of the opposing side of any topic simply because, when doing their research, the information was not provided to them. The internet should not conform to the individual; I feel it should stand on its own. Everyone uses the internet, everyone uses YouTube, and I believe it is unethical for this to happen regularly. I further believe that each individual should receive a variety of information when doing research so we can each have a clearer view of what the truth is regarding any given topic. When you ask a conservative and a liberal to search the same words in the same web browser, their search results are drastically different, and this should not be the case. Our browsing history should not influence our search results.

  31. After reading this article, in my opinion, the biggest concern raised was the issue of privacy. As mentioned in the article, Google sells YouTube data to companies that are interested. This raises many privacy issues that, personally, I am not comfortable with. First, I would like to know what information is used and where. The uncertainty of where my data is going is a little scary. I can’t help but think of George Orwell’s 1984 when discussing privacy issues: no one in that society had any privacy because of government control, and at the end of the day, the information from Google and YouTube eventually does get into the hands of the government. When reading the book, it was frightening to realize how relevant it is to present-day society. In addition, I do agree that YouTube does a great job of recommending videos based on previously watched ones, and I too find myself watching one video after another for quite a while, but that in itself is questionable too. I believe the lack of knowledge about the algorithms that these companies use can be very dangerous. Overall, YouTube is changing the way that information is accessed and interpreted, but this can have major consequences if one is not careful about what one searches.

  32. Artificial intelligence is exactly what it sounds like: a manmade network of knowledge that is being applied in our daily lives more than we think. The psychology behind the recommendations that pop up is nothing short of a preview of what our future holds. Google is one of the largest and most successful corporations on this earth, and just like anything tied to success, once you have a little, all you want is more. In order for a company to grow, it needs to expand its market. Google’s recommendation algorithm is an engine of growth for the company. It keeps users interested in the things going on and the items they enjoy or are attracted to. The fact that Google owns YouTube allows you to connect the dots and conclude that it is using this psychological algorithm in its other platforms. However, at what point do these recommendations not apply to the user? Are users led somewhere they do not want to be? Well, it is actually more often than not that some of these recommendations are nonsense. The New York Times says, “human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt, and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt, and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes, and misinformation.” This leads me to my position on artificial intelligence: the growth is good and its expansion will only continue, but there is an invasion of privacy when it comes to some recommendations. If I want to watch another video after I just finished one, I would like to be the judge of what it is.

  33. With the vast array of content available on YouTube, such content will have representatives from all parts of the political spectrum. When an individual views a video, the subsequently recommended videos follow the YouTube algorithm, which suggests videos with similar content. That is why when one watches a video, such as the 2016 Donald Trump rallies, more videos of similar or more radical content will be presented to the viewer. That is why YouTube seems to “radicalize” viewers: the site is merely presenting more content that the viewer may be interested in. The manner in which a viewer of these videos may be radicalized is by following a path of confirmation bias.
    In following this path of confirmation bias, it is possible for one’s prejudices to be justified and for beliefs one was unsure were true to be supported. The underlying issue with this system is the lack of variety in the content; exposure to variety is, essentially, how one frees oneself of this radicalization. YouTube’s algorithm does not allow for that, and on a much larger scale, Google and other companies that mine data will continuously present one with related and ever more fervent content. The intent may or may not be this malicious, but it nevertheless raises the question of how the content we view online affects viewers and their view of the world. One must make a concerted effort to expose oneself to a diverse set of information outlets to have a more worldly view. It is easy to fall victim to this system of perspective molding, but to ensure a method of forming a set of beliefs divorced from the influence of radical sources, one has to be open to all parts of the spectrum. In the case of YouTube, one has to watch videos from different sources.

  34. About twenty years ago, David Foster Wallace wrote the book Infinite Jest. Today, almost no one who has started it has finished it. However, the book does bring a good point to the table: the Internet can be a dangerous place.

    The premise of the book is that some groups of people—patients from a halfway house, students from a tennis academy, and wheelchair assassins from Quebec—are on a search for a movie called Infinite Jest. The movie is described as so captivating that anyone who watches it will become so absorbed that they will watch it on repeat until they have died of starvation or dehydration. This is an exaggeration of a mentality that describes YouTube users today.

    The article above has shown YouTube to be a great “radicalizer” by means of the autoplay algorithm. While it is true that a lot of people, especially students in a middle school classroom with a tech-non-savvy teacher, find this feature annoying, it is also true that either this algorithm is irresponsible, or YouTube’s userbase is largely radical.

    The situation is similar to a Catch-22: YouTube’s community is radical, possibly because the site has a radical algorithm, possibly because the community is radical itself. It’s a vicious cycle.

    The examples cited in the New York Times article are far from exaggerated. Right-wing sentiment has been escalating since the rise of Donald Trump, and even high-school freshmen have been getting involved. Reddit communities such as r/The_Donald and r/CringeAnarchy testify to the existence of radical conservatives.

    Meanwhile, left-wing politics have become more focused on directly opposing the traditions of their opposing counterparts. In an effort to counteract anti-LGBT sentiments, the left wing has been coming up with labels for gender identities that shouldn’t really count, such as “vapogender”, a gender that “sort of feels like smoke”.

    This is problematic, not because the right wing is angry or the left wing is angrier. The issue with radical Internet is that it is easy to access and forces impressionable young people to think they have to conform to a specific category.

    I think about conformity a lot, myself, and how counterculture forms. In the sixties, there were hippies. In the 2000s, there were variants of punks. While I’m all for nonconformity (you could have seen me in a skirt last semester or in the future with hotter weather), I find it ironic how radical nonconformity itself is a form of conformity.

    People who stand with the radical “alt-right” think their independent journalists are a good way to avoid fake news, but end up with even more fake news filtered by bias or wording. The “social justice warriors” think they’re standing up against oppression but end up oppressing the privileged majorities themselves. Gluten-free eaters miss the point of eating gluten. Vegans mock others for their own diets.

    The bottom line is, all this radicalization on the Internet is exactly what David Foster Wallace expected. If he had lived ten more years, he could have convinced people to think a little differently.

    Instead we have a video-streaming site that’s perpetuating toxic ideals working with a data-mining social media site that’s forcing people to stop thinking for themselves.

  35. YouTube has become one of, if not the, largest website for watching videos on virtually anything you could imagine. The site clearly draws people in with the intention of keeping them there for hours on end by bringing up more and more videos that relate to the person’s original search. I am sure that almost every person can say they have gone on YouTube for one video and ended up watching something completely different an hour or so later. The site is addicting because there is an endless number of videos that you can come across with each search. You may see the title of a video in the sidebar of recommended videos that strikes your attention, so you click on it and watch it. This pattern continues over and over until you begin to forget what you were even looking for. This is exactly what makes YouTube so successful and adds to the success of Google.
    I don’t think I use YouTube religiously enough to have noticed that the recommendations I receive become more extreme as I watch more. The author of the article discusses how she experimented with a bunch of different searches, all of which eventually led to more intense videos. This is apparently one of YouTube’s methods of keeping users logged on and active on the site in order to make more money. The company does not really have the desires of its users in mind, but rather the large profit it aims to make from those users.
    I believe that the internet is one of the main reasons why politics and current events are so blown up in everyone’s faces in today’s day and age. People tend to form opinions based on everything they see online, but a lot of that information is not true. Just as the article discusses, searching videos of President Trump or Hillary Clinton may eventually bring you to videos of conspiracy theories or extreme viewpoints that are probably not at all truthful. Many videos on YouTube definitely have the potential to brainwash individuals or even be dangerous for some to watch, but they are kept on the internet regardless.
    Watching YouTube videos can be just as much of an addiction to someone as checking Facebook or Instagram is because the owners are giving viewers exactly what they want. They keep users interested by bringing up more and more videos that should intrigue the individual based on their evaluated preferences. People believe that YouTube is a reputable source and that they are investing their time into something positive, but in reality, the company is too money hungry and is contributing to the radical viewpoints of individuals in our society today.

  36. YouTube is a rising platform, and it is genuinely useful for watching videos, whether for studying or just for entertainment. On the other hand, YouTube can be a very dangerous platform for children. YouTube sometimes decides to play videos by itself. If a young person is watching Donald Trump and then a KKK protest shows up, that is scary. YouTube, and the internet generally, has so much power that it is now shaping your political views even when you are not trying to focus on that. YouTube should get rid of the autoplay feature. This article stated that conspiracy theories started playing; this is a dangerous thing for young minds and could shape how they behave as they grow up. There are a lot of crazy people out there in the world today, and platforms such as YouTube give kids access to learn from them.
    In my opinion, social media is getting out of control. It’s like second nature to most people, including me. When you need to learn to do something, you think, “I will google instructions.” If you want to watch a video on how to do something, you will probably search for it on YouTube. But there is a very big downside to this, and I am afraid for the future. When I have children, I will have to limit what platforms they can watch and what they can look at on the internet. I do not want my seven-year-old son watching conspiracy theories or KKK members walking down a street. It is a scary thing that everyone has access to these videos. Either that, or YouTube must learn how to manage its content and not let these terrible things pop up so easily.

  37. I would think I spend a lot of time on YouTube, either for school or for leisure. I have watched presidential debates, news stories, and other things like that on YouTube, but I haven’t noticed a biased next-video selection. My YouTube knows what I like to watch, so my autoplay choices are usually the normal things I watch. However, YouTube does have suggestions related to the video you’re currently watching. The article says that the author was watching President Trump talk at one of his rallies; then the autoplay feature recommended some extreme right-wing videos. I feel like they are recommended because that is usually what is associated with Trump and those who stand with him. I do not believe Google is trying to sway you in any direction. She also stated that when she watched Clinton’s or Sanders’s rallies, it recommended extreme left-wing videos. It’s the same concept: no matter what party the people you look up on YouTube are from, you’re going to get suggestions about whatever those parties are associated with. I watch comedians play video games, and my autoplay usually suggests another video by the same channel or by other people who also play video games. Google is trying to appeal to what you like to watch. I don’t think they are trying to do anything dubious.
    The article points out that YouTube came under fire for recommending a conspiracy theory regarding the survivors of the Parkland shooting. My main argument is that if you are watching a news story about Parkland, or a school shooting in general, there are going to be videos with words in the title and tags similar to what you are watching. Anyone can post something on YouTube; whether you agree with it or not, it is free speech. If it meets YouTube’s Code of Conduct and stays up, it’s fair game. You are entitled to your opinion not to like the videos about the conspiracy theories. You can dislike the video, you can flag the video if you feel that strongly about it, or you could just not watch the video. They have the right to make and upload that video even if you and most others don’t agree with it. There are many conspiracy-theory videos about the moon landing, JFK’s assassination, and other touchy subjects, along with the devastating Parkland shooting. You. Do. Not. Have. To. Watch. Those. Videos.

  38. YouTube has evolved into one of the biggest forms of entertainment with the advancement of technology. YouTube has recorded an estimated 1.5 billion monthly users; it has become the biggest platform for reaching human users. YouTube and other big-data websites like Google have to adapt and create better user-facing algorithms that accommodate the user. That is why I am not surprised when YouTube tries to learn and adapt to a personal profile by suggesting like-minded videos. When YouTube’s algorithm queues up videos, it suggests other videos based on past preferences to compile a list of videos the user might be interested in. I do not believe it’s based on political views; the website is trying to learn or understand the interests of the individual user. This makes sense business-wise because the whole goal of the algorithm is to keep the user on the website. That is how YouTube creates its revenue: through personalized advertisements based on the user profile, and by selling data.
    So when dealing with politics, I believe YouTube recommends videos based on the person’s political views. Let’s say someone is a Republican; YouTube will analyze that and provide Republican-based videos to fit the person’s needs. If YouTube were to recommend Democratic videos, it would make the person less interested, and therefore they would not watch the video. I don’t believe YouTube is intentionally trying to push people to believe only one side of the spectrum when it comes to politics. It’s purely a business tactic to try to get as many views as possible.
    Even though YouTube has countless videos on its platform, it has a lot of radical videos that most people do not agree with. I feel that if a video is extremely radical and/or offensive, it should not be promoted or recommended for people to watch. The YouTube algorithm does make sense as a way to grab as many views as possible, but it also leads people to watch radical ideologies from both ends of the spectrum. Maybe the algorithm needs to be adjusted or improved to reduce how quickly these radical videos start appearing in people’s feeds. Especially now, with our current government situation, a lot of radical videos are being posted. YouTube needs to work on its profiling system a little better so people can watch reliable political coverage and not radical views that spread hate through our nation.
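    The comment above describes a preference-based ranking loop: the site profiles what you have watched and surfaces whatever matches that profile most strongly. A toy sketch of that idea is below; all names and data are hypothetical, and YouTube's actual system is proprietary and vastly more complex, so this is only an illustration of the general tag-matching technique.

    ```python
    from collections import Counter

    def recommend(watch_history, catalog, k=3):
        """Rank catalog videos by how well their tags match past viewing.

        watch_history: list of watched videos, each a set of topic tags.
        catalog: dict mapping video title -> set of topic tags.
        Returns the k titles whose tags best match the viewing profile.
        """
        # Preference profile: how often each tag appears in the history.
        profile = Counter(tag for tags in watch_history for tag in tags)
        # Score each candidate by the summed weight of its matching tags.
        scores = {title: sum(profile[t] for t in tags)
                  for title, tags in catalog.items()}
        return sorted(scores, key=scores.get, reverse=True)[:k]

    # Hypothetical data: a viewer who has watched two political videos.
    history = [{"politics", "rally"}, {"politics", "debate"}]
    catalog = {
        "cooking tips": {"food"},
        "campaign speech": {"politics", "rally"},
        "town hall": {"politics", "debate"},
    }
    print(recommend(history, catalog, k=2))
    # → ['campaign speech', 'town hall']
    ```

    Note how the unrelated video never surfaces: the more the history leans one way, the more the rankings do too, which is exactly the feedback loop the commenters worry about.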

  39. This article offers some insight into what we should already know about Internet sites: they are free for our use, and there are billions of people on them every minute of every day. The author ran an experiment in which she viewed some political YouTube videos and watched as related videos surfaced alongside the originals. She then tried looking at something different, and the same thing happened. We must realize that these companies are publicly traded and that they are businesses. If I look up something I want to buy in my email, similar items appear on my Facebook page. How are the two connected? I don't know, but I do know that they are connected. YouTube has been known to radicalize certain groups. Facebook is in a great deal of trouble right now for giving private information to various political groups. The pop-up ads are as bad as robocalls. We have no control over the calls, even when we get on the Do Not Call list; they find another way around it, and no amount of reporting to the authorities can stop it. We can't stop the ads or anything else, because the shareholders are making their money from this, and the underhanded way they do it is the price we have to pay for "free" access.
    The Wall Street Journal has investigated these practices and has evidence that personal information is being shared and that we are being steered toward related videos on YouTube that we may not want to see. These companies are obviously interconnected and are trying to grow their business, and the viewers are paying one way or another. We like using these research tools and probably will not stop. The trouble with Facebook now demands legislation of some kind; the entire Internet needs some controls. But how can that be done? It is almost impossible to control, and most people don't really care because they like it so much.
    It's disheartening to know that everything I do on the Internet is seen by someone, somewhere, but I am not breaking any laws or running for elected office. Most people are like me and will continue to enjoy the benefits of these sites, but hopefully we will be a little more careful about what we post.

    What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
    Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.
    This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.

  40. In today's society, YouTube is one of the most popular sites, one that many people visit on a daily basis. YouTube offers a variety of videos, from news to sports, with at least something for everyone to enjoy. YouTube has content that won't interest everyone, and most people will not search for something that doesn't appeal to them. Some people turn to YouTube for information because they want to see current events and watch how other people react to them. YouTube is owned by Google, and the more people visit YouTube, the more money Google makes from the number of viewers using the site each day. Google helps YouTube grow the popularity of its videos and serves consumers a variety of ads that might interest them. These ads help YouTube and other companies increase revenue as more people view YouTube, and help those companies attract more consumers to their businesses. YouTube has become a helpful site for many people, serving videos based on their interests and providing information that answers their questions on a particular topic.

    Watching YouTube videos can be addictive for some people, much like using Snapchat or Facebook, because the platform serves viewers the videos it expects them to watch. The goal for YouTube is to satisfy consumers by providing videos based on their preferences and to keep them engaged with ads and other current events. For example, the author of this article mentioned how, before the last presidential election, she watched videos of Trump rallies and was then served white supremacist rants and other disturbing content alongside them. Even though hosting videos on a wide variety of topics helps viewers find something of interest, it can also have a negative impact, because people might disagree with a topic or feel frustrated by it. As people click on videos, they will either become interested and watch more similar videos or react negatively to what they see. Similar to YouTube, people can also find videos on Facebook, depending on what others post and how they react to a particular event. YouTube either satisfies consumers' needs with the videos it provides on a topic or provokes a negative response with a video that doesn't interest them at all.

    From my point of view, I believe that YouTube's popularity has been growing each month as it gives viewers a variety of videos to keep them interested on a daily basis. I find YouTube very useful because it helps us answer questions we might have, whether for school or about a current event, and it provides entertainment to those who need it. As I watch more YouTube videos, the recommendations keep getting more extreme each time because of the volume of reactions, positive or negative, that people give to them. The author of this article mentioned experimenting with different searches and finding that each one led to more intense videos. I realize that YouTube has made current events and politics more engaging because of the reactions these topics draw, even though some of the information isn't true. YouTube has the right to host videos that might not be to someone's taste, and people can flag a video if it is too offensive to them. Overall, I believe that YouTube helps a variety of people watch videos that interest them, whether on current events or a particular topic, and that it is a site that helps companies increase revenue from the videos they provide to consumers.

  41. As someone who is an avid YouTube user, I have gone down many rabbit holes that led me somewhere nowhere close to where I started. I always suspected that this was part of YouTube's algorithm, and this article definitely confirms my suspicions. It is not surprising: a company that makes money by keeping people watching videos has noticed trends and capitalized on them. From a business standpoint it makes sense, and they are doing what they can to maximize revenue.

    However, Google is one of the world's largest companies and has a huge impact on people's everyday lives. A company so vast in its influence should be aware of the social implications of its products. The world, or at least America, has become increasingly polarized in the past few years. I understand that they may be giving us what we want, but through its recommendations, YouTube may be giving us something we did not know we wanted. This is another instance of the question of whether society shapes the media or the media shapes society. YouTube has the power to direct viewers to whatever content it pleases, but since radical opinions are what grab people's attention, why would it keep recommendations neutral?

    Personally, I am not angry about the way YouTube has set up its algorithm. I use the video platform almost daily, and it has been a great way to learn new things and watch some of my favorite content. However, I am an adult and can discern between right and wrong, and already have my own set of beliefs and morals. For children, who make up a large portion of viewers, the radical or extreme recommendations can become consequential. Young children, or even teens, who do not yet have a strong set of values, become susceptible to a skewed worldview by being recommended videos that can be controversial, extreme, and politically charged. For me, it is interesting and inconsequential to watch radical content to help expand my knowledge of people’s opinions on certain topics, but for children who do not know any opinions of their own yet, the videos can help shape how they think.

    At the end of the day, YouTube is a media platform like any other and can promote whatever content makes it the most money. However, it has to be aware of the effect it may have on the population. Whether intentionally or not, YouTube is surely one of the factors contributing to a divided nation today.

  42. Over the last few years we have seen advertisers all throughout the social media world and all over websites. What is striking is that they surface things they think you would like based on your previous searches. One site that has really used previous activity to guess what you would like is YouTube. In this article the author, Tufekci, explains how videos she watched on one account led YouTube to suggest videos it thought she would like regarding the 2016 presidential election. What she saw was that YouTube served up very radical video suggestions based on a single video she had watched. She made several accounts and watched different videos from both sides of the presidential election, and whichever side she chose, YouTube suggested videos supporting the far right or the far left. So it makes us wonder what they are trying to do.
    When I read this article, it made me feel like YouTube was trying to divide people into two different camps. If I watch videos that support Donald Trump, it does not mean I support the far right, nor do I want to watch videos that support it; I couldn't care less about it. The same goes the other way: if I watched videos that supported Hillary, it would not mean I support the far left. So I ask myself, what is YouTube trying to do? Do they believe that if you like one candidate, it automatically means you support that candidate's extreme wing? It does not mean that to me. So it seems to me that YouTube, with respect to this past campaign, may have been trying to push people toward the far left or the far right. This is what makes me really uncomfortable: I do not actually know what these sites are trying to do.

  43. I have personally encountered the point this article addresses. I like to watch makeup videos on YouTube, and I have noticed that watching them leads YouTube to recommend other makeup and beauty videos. In a way this is good, because it surfaces everything I am interested in. However, it also makes me feel a bit unsafe. It feels as if no part of the internet is actually secure, as if everything you do online is always being watched. I found it a bit creepy when I realized that recommendations pop up based on what you are already watching.

    After reading Jesse's comment, I realized that advertisements for whatever I am looking at on the internet pop up on Facebook. For example, if I am looking at a pair of Nike sneakers, they will appear on my Facebook page. I find this crazy, and I think it is both good and bad: in a way, your social media and YouTube accounts are personalized to what you like, but it still feels a bit creepy.

  44. In 2018, YouTube is one of the most commonly used streaming sites. Its content spans all topics, industries, and fields; today, almost anything can be found on YouTube. There are music videos, makeup tutorials, church services, therapy sessions: anything that can be recorded on video is most likely on this site.

    In the past year or so, YouTube added a new feature to its site called "Autoplay." As soon as one video ends, another plays automatically. The next video is chosen by YouTube and tends to have very similar content to the one before. For example, if I'm listening to a hip-hop song on YouTube, another hip-hop song will play afterward, most likely by the same recording artist.

    The article speaks about the divide between pro-Trump and pro-Clinton videos on YouTube. The writer explained how, when watching pro-Trump videos, the longer she allowed Autoplay to run, the more right-wing the videos became. When watching pro-Clinton videos, the videos became more left-wing, though they somehow swayed toward a more rightist perspective after a period of time. This makes me question whether there is a hidden agenda. By exposing us to videos that negate our originally sought-out intentions, is there a political reform YouTube is trying to install? Many YouTube users are young and curious, aiming to confirm their opinions through the insight of YouTube content producers. There is a strong chance that as the videos become more rightist or leftist, so do their opinions. This also makes me question the possibility of an outside source paying YouTube to maintain this algorithm and always sway toward a more rightist opinion. It makes me believe that someone is aiming to sway the political perspectives of future generations.

  45. YouTube is a platform where anyone can share videos. The one point in this article I can wholeheartedly agree with is that YouTube wants to keep people watching. Other than that, I have many problems with it. Even the title, "YouTube, the Great Radicalizer," is ridiculous; it is clearly hyperbolic. I wouldn't attempt to guess how many videos are available to watch on YouTube, but I would say there are most likely multiple videos on any topic imaginable. People have the ability to watch any video they want. While it is true that YouTube recommends videos and has an autoplay feature, it does not force you to watch those similar videos. At some point people need to be held accountable for themselves. The article likens YouTube to a restaurant that serves sugary and fatty foods; if that is what the customer orders, why wouldn't they serve it? People should be able to regulate what they consume. If YouTube controlled and handpicked what was appropriate for people to consume, people would be outraged. YouTube cannot be put at fault for people consuming what they want. On occasion, I enjoy watching Alex Jones. Jones and his show, Infowars, are very far-right voices; they create and subscribe to many conspiracy theories. Watching this show has not convinced me of one. No one should get all of their information from YouTube. If someone can be convinced that the Holocaust wasn't real by a few videos, then the problem isn't YouTube. "What we are witnessing is the computational exploitation of a natural human desire." This is a ridiculous statement. People choose to watch videos on YouTube; they are not being exploited. Absolving people of any fault is much more dangerous than anything YouTube does.

  46. I have to admit that I do not agree with Zeynep Tufekci's supposition, and I would like to explain my reasons. In her article, the author claims that YouTube recommends more extreme content to those who watch much less extreme mainstream videos in order to radicalize viewers. This statement is only partially true: YouTube may indeed recommend much more extreme content, but not deliberately or with an intention to radicalize. YouTube simply links content through related keywords, and such videos, regardless of their extreme or radical ideas, share keywords. This is one of the basic principles of any search engine, any advertising tool, and generally of any site whose content is tagged with corresponding keywords. By the same logic, I believe YouTube also recommends less extreme mainstream videos to those who search for more extreme content, since they share the same keywords.
    On the other hand, Zeynep Tufekci's article still raises important issues. It makes me think that YouTube can be involuntarily used as a "radicalizer" by third parties who deliberately include popular keywords in their videos in order to gain more viewers and followers. In this situation, YouTube can be used as a platform, and keywords can be used as tools, for the propaganda of harmful social movements and subcultures. For instance, terrorists can attach keywords like "Islam" or "religion," and related words and phrases that seem neutral at first sight, to videos promoting terrorism. Tufekci therefore seems to address the right issue; however, her article might be directed not at criticizing YouTube but rather at warning about cybersecurity issues.
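    The keyword-linking mechanism this comment describes can be illustrated with a minimal sketch. This is only a hypothetical model, not YouTube's actual system; the function, catalog, and tags are all invented for illustration. It shows how ranking purely by shared keywords can place an extreme video next to a mainstream one whenever they happen to share tags:

```python
# Hypothetical sketch: rank videos as "related" purely by keyword overlap,
# ignoring how extreme their content is. All data here is made up.

def related_by_keywords(video_tags, catalog):
    """Rank catalog videos by how many tags they share with the watched video."""
    watched = set(video_tags)
    scored = []
    for title, tags in catalog.items():
        overlap = len(watched & set(tags))
        if overlap:
            scored.append((overlap, title))
    # Most shared keywords first
    return [title for overlap, title in sorted(scored, reverse=True)]

catalog = {
    "Mainstream news clip": {"politics", "election", "news"},
    "Fringe conspiracy video": {"politics", "election", "secret"},
    "Cooking tutorial": {"food", "recipe"},
}
print(related_by_keywords({"politics", "election"}, catalog))
```

    In this toy example, the fringe video surfaces alongside the mainstream clip simply because both are tagged "politics" and "election", which is exactly the involuntary-radicalizer scenario the comment warns about.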

  47. I think the author of this article kind of missed the mark when it comes to the dangers of internet algorithms. She mentions how the YouTube algorithm recommends more and more extreme versions of the content the user is watching, and then concludes that YouTube may be trying to radicalize its users. As much as I love a good conspiracy, I don’t see how this is accurate. First, Google (YouTube) is a business, and I don’t see a financial incentive to turn “joggers into ultramarathon runners”. Next, as far as politics goes, there is a lot of political content on YouTube. The company is obviously left-leaning, as evidenced by its countless censorship scandals against right-wing YouTube channels. So then why would YouTube recommend more extreme right-wing content to someone consuming, say, “vanilla” Republican media? It seems as though the YouTube algorithm crawls the site for videos that share similar tags, titles, and users watching the same videos. Similar to Amazon’s “Consumers who liked this product also bought…” section, YouTube recommends videos based on groups of media consumers that watch like-minded content. So if a lot of the users consuming a certain moderate video also watched a more extreme video, that extreme video is going to be recommended.
    I'm not sure if the author knows this, but Google owns most of the internet. It handles an enormous majority of online searches, and tens of millions of websites use Google Analytics for tracking. So chances are that when you visit a website, information about your visit is sent to Google so it can target more specific ads to you. It seems reasonable to conclude that it not only targets more specific ads but also recommends more specific content. In this way, it makes sense that someone watching a video about vegetarianism will also get recommendations for videos about veganism, because the two topics are closely related, have similar audiences, and have a lot of crossover all around the web. For example, visiting a popular plant-based nutrition site for information about vegetarian diets will send packets of data to Google's analytics servers. Since they aren't in your brain, they will see that you visited that site and looked at articles about plant-based nutrition, and assume you may want to see videos about veganism.
    In an age of instant gratification and sensory overload, people want to see bigger and better things all the time. We consume more media than ever before, and eventually our desires for bigger and better, or perhaps, more extreme content continue to grow. It seems a bit far-fetched to say that YouTube is purposefully recommending more extreme topics to users in order to radicalize them. Instead, it makes more sense that YouTube is receiving your browsing information and cross-referencing your interests with those who are interested in similar things. Then, YouTube recommends you something else you may like based on that information.
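    The cross-referencing this comment describes, recommending what like-minded viewers also watched, resembles item-to-item collaborative filtering. The sketch below is a hypothetical illustration of that idea, not YouTube's actual algorithm; the viewing histories and video titles are invented. It counts how often other videos are co-watched with a given one and recommends the most frequent:

```python
from collections import Counter

# Illustrative sketch of item-to-item recommendation from co-watch data,
# in the spirit of Amazon's "customers who liked this also bought".
# Not YouTube's real system; all histories below are made up.

def recommend(watched_video, histories, top_n=2):
    """Recommend the videos most often co-watched with `watched_video`."""
    co_watch = Counter()
    for history in histories:
        if watched_video in history:
            for other in history:
                if other != watched_video:
                    co_watch[other] += 1
    return [video for video, _ in co_watch.most_common(top_n)]

# Toy viewing histories: users who watch a moderate video often also
# watch a more extreme one, so the extreme one rises to the top.
histories = [
    ["moderate politics", "extreme politics"],
    ["moderate politics", "extreme politics", "cat video"],
    ["moderate politics", "cooking show"],
]
print(recommend("moderate politics", histories))
```

    Even with no notion of ideology anywhere in the code, the extreme video is recommended first simply because it is the most common co-watch, which matches the comment's point that the pattern can emerge from audience overlap rather than intent.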
