Inside The Bizarre World of Internet Trolls and Propagandists

from TED

Journalist Andrew Marantz spent three years embedded in the world of internet trolls and social media propagandists, seeking out the people who are propelling fringe talking points into the heart of conversation online and trying to understand how they’re making their ideas spread. Go down the rabbit hole of online propaganda and misinformation — and learn how we can start to make the internet less toxic.



6 Responses to Inside The Bizarre World of Internet Trolls and Propagandists

  1. Sean Distelcamp September 12, 2019 at 9:07 pm #

    The top suggestion that Andrew Marantz makes for curbing propaganda and misinformation on social media is that the companies must change their algorithms. This would certainly stop the average person from stumbling across a hateful community by chance, and there would be fewer cases like the woman who went from being moderate to attending white supremacist rallies in a month, all because she felt like an outcast. However, without a serious reconsideration of how we view free speech and the internet, these communities will still exist. Websites like Twitter, Facebook, Reddit, and 4chan all have substantial alt-right and other hate group communities. They also have vastly different algorithms for showing content. It’s true that you can change what Facebook allows on the site, or how Twitter promotes outrage, but 4chan has no systems in place to promote posts. Everything on 4chan is anonymous and equal, and that website is probably the most racist and hateful place on the internet. Conversely, Reddit has many admins and moderators who actively ban users, posts, and entire subreddits for hateful content. This certainly is better, but they simply cannot keep up. People still find ways to spread misinformation and outrage while either carefully maneuvering within the site’s rules or evading the watch of the admins. Almost everywhere on the internet that allows people to comment or post has a fringe group spreading hate and misinformation regardless of the rules and search algorithms, or lack thereof.
    The culture and social norms of the internet are just different from any other format. People can be almost completely anonymous if they want to, and they can connect with people who share their views, no matter how insane, offensive, or obscure those views may be. Before the internet, there were certainly people who did not trust doctors or were unsure about getting vaccinated, but now these people can connect to affirm each other’s beliefs while simultaneously excluding and straw-manning any opposing opinions. Movements such as flat-earthers and anti-vaxxers have come seemingly out of nowhere from these echo chambers. Giving a website the power to decide that flat earth articles should be deleted because they are misinformation seems like a fine solution. The problem, though, is that the “right” answers and the correct information are not always obvious. Facebook and other sites already have so much power over what we see and the information we digest. If they are allowed to police posts even more and decide what qualifies as misinformation, we are giving private corporations, which prioritize their own interests, control over most of America’s main sources of news and information.

  2. Joseph Antonucci September 14, 2019 at 2:57 am #

    Watching all of this was a real chore, and I should receive a medal for doing so. I’ll obviously be in the minority on this, but this guy is fear-mongering, plain and simple. He’s using terms such as “propaganda” to brand “fringe” viewpoints that don’t conform to the mainstream narrative as nonsensical or misleading. The only one engaging in propaganda in this dynamic is the man speaking in the video. He is trying to get the audience to cling even more tightly to the mainstream narrative and completely cast aside views that challenge that narrative.

    The conversation being had here is not “should these people be allowed to have and express their views, just like everyone else?” That’s what the conversation SHOULD be. Instead you have these numskulls pondering how we can “make the internet less toxic.” What does that mean? “Toxic” viewpoints are any viewpoints that oppose the mainstream narrative. For example, opposing mass immigration. If you hold this belief, you should be ostracized from society, kicked off social media, and fired from your job, because you are a bad person and you must hate everyone who does not look and talk like you. The reality is that there are plenty of substantive arguments for why allowing a ton of people into ANY country is going to cause some problems, and in most cases this is not a viewpoint that festers in spooky online “propaganda” message boards; it’s grounded in common sense and history.

    Andrew is trying to convince you that Internet memes cause people to become radicalized (again, this is a buzzword term and holds no real weight), and his proposed solution is to suppress the evil scary spooky fascist white supremacist racist sexist blah, blah blah… in other words, cut off free speech.

    To directly address his remedies:

    1) “Be a Smart Skeptic”
    Andrew condescendingly speaks to white teens who question such things as “male privilege” or “white privilege” and tells them they’re being a jerk for doing so. He also casually throws in a “flat earth” meme before saying that general skepticism and independent knowledge gathering are still good things.

    Response:
    a. His argument here is that you should be skeptical and open minded, but not TOO skeptical and open minded to the point where you question the mainstream. Doing so means you’re “toxic” or, as he said, “being a jerk.” He’s encouraging skepticism, but he’s also saying you shouldn’t question certain things, and if you do you’re a crappy person.
    b. He also sneakily takes such things as white privilege and male privilege (zero evidence that either of these things exist systemically) and lumps them in with the flat earth thing. He’s taking something completely outlandish that few people believe, the flat earth theory, and lumping those people in with anyone who questions things such as white/male privilege in an attempt to discredit them as “conspiracy theorists.”

    2) “Free Speech is just a starting point”
    Andrew claims to be pro free speech, and then says some people should not have Twitter accounts.

    Response:
    He says he supports free speech, except for people with bad political opinions who should be censored from the Internet… Free speech for some, not for all; free speech for anyone conforming to the mainstream, not for the “fringe” people.

    3) “Make decency cool again”
    Andrew talks here about how social networks should be “fixed” to prevent the spread of “hateful” information. He says this should be done via “algorithm” changes.

    Response:
    His solution is algorithm manipulation, which is something already practiced by most (if not all) social media platforms. The trending section of Twitter has been manipulated on many occasions before my very eyes, as one hashtag with hundreds of thousands of mentions is removed from the trending page, then replaced with something totally different, having only a quarter of the mentions of the first hashtag. Google manipulates search suggestions and results.

    I find it ironic how many people complain about the power of big tech monopolies and the influence they exert, yet many of these people are totally content letting these same companies have full control over what information is and is not accessible, or deciding who should be allowed to speak and who should not.

    Also worth noting: the key to preventing most or all future “white supremacist terrorists” (statistically a non-issue) from committing heinous acts is very simple. Anyone who has read the shooter “manifestos” that have circulated is easily able to determine what the proper course of action is. That is, unfortunately, a conversation for another day.

  3. Kathleen Watts September 20, 2019 at 8:47 pm #

    One thing that is always brought up in conversations about human nature is that people will always want to feel like they’re a part of something. Sometimes that means joining a climate change activist Facebook group, and sometimes that means following a man to northwestern Guyana and drinking poisoned Kool-Aid. When people feel alone, they will seek human interaction wherever they can find it. Some people, like myself, went through a period in high school where the people they hung out with weren’t people their parents would like to meet. Like Marantz says, many of these internet trolls really are just looking for a space where they belong, and often they find belonging in groups that target those who made them feel left out. Hitler, for example, targeted the Jews because he truly thought they were why he didn’t prosper in life. I admit he’s an extreme example, but the propagandizers generally follow the same playbook. A lot of white supremacists believe that non-whites are the reason why they can’t crawl out of their mother’s basement. This, in a way, empowers them to spread their hate speech, because they can blame someone other than themselves for their downfalls, and the people in the groups they subscribe to enable and emulate this response. Moving forward, in response to Marantz’s first suggestion to stop this cycle from repeating, contrarianism truly is a huge issue in society. Devil’s advocates often try to engage the conversation in a productive way. Contrarians, on the other hand, purposefully disagree with most beliefs to feel intelligent. As someone once said about people who believe in the flat earth, this belief makes them think that they know something other people don’t, and that makes them feel special and smart. On the contrary, they are some of the most uneducated, close-minded people on the globe. Furthermore, on his second point about free speech, contrarians and white supremacists are pro-free speech only when it suits them.
    I watched a documentary about Charlottesville, and one of the Nazi sympathizers claimed that the counter-protests were infringing on the free speech of the racist rally set to take place. In reality, however, if these people actually cared about free speech in a true and intellectual manner, they would respect the counter-protesters’ right to free speech. They don’t truly care about free speech; they want their speech to be more important than other people’s. On his third point, social media has absolutely incentivized these people. Businesses need to take more accountability for how much influence their sites have, not only in enabling these people, but also in the effect their words have on other users. All in all, society as a whole needs to move towards a safer, less provocative internet culture. At the end of the day, we control what we put on the internet.

  4. Corinne Roonan September 21, 2019 at 2:44 pm #

    This TED talk leaves me conflicted. Quite honestly, it does not take a genius to understand that everyone is going to do what they want regardless of new algorithms or different bans on social media posts; if someone wants to post something hateful to hurt other people, they are going to find a way to do so. By the same token, advocating for decency in our interactions with other people is an extremely positive undertaking. We cannot rely on that to fix these issues, though. In reality, the speaker is aiming at changing human nature. He wants to change human nature in a way that takes out our selfish nature and replaces it with a loving and caring one. That is impossible and a waste of time.
    Human beings are human beings; we all have desires that are selfish. Even behind most selfless acts there is a selfish motivation, no matter how much we wish to hide it. Without an incentive, humans would rarely do anything. That is not to say this is a negative thing; there are reasons for everything, and acknowledging them as a part of our nature is the best way to attempt to move past them. Within myself, I can recognize selfish tendencies and try to combat them. Many people attempt to do this every day, and to an extent this separates the “good” from the “bad” of us. The issue is that no one can control this recognition within someone else. Trying to control other people’s selfish tendencies is a pointless endeavor that only leads to frustration. In that same way, changing social media algorithms and policies can only do so much.
    In a perfect world, imagine every single social media platform decided to ban offensive and hurtful posts from reaching the eye of the public. Those people who exist with an unending hole of hate in their heart are going to either find ways around these policies or they are going to create their own social media platform where they can post their hate. No matter what, they are going to find a way to succeed in their mission of hurting the masses.
    What can we do? Stop being bothered by senseless posts from idiotic people on the internet. Unless a direct threat exists, I do not understand the need for such regulations. Of course, my experiences are different from the experiences of others, but in my limited lived experience, online posts are just that: online posts. They display no immediate threat (usually). Situations where there is a larger-scale threat need to be addressed by authorities, not by social media platforms.

  5. Ryan Geschickter November 7, 2019 at 10:19 pm #

    After reading this article, I was amazed at just how much misinformation and propaganda exists all over the internet as well as social media. Personally, I have come across many trolls in the comment sections of many social media apps who are a complete and utter nuisance. While some are very funny, some can be very offensive to much of the public. Propagandists are also an issue because there is always a view they have to share to be ‘better’ than the next person. In today’s world we are also met with ‘bots’, which have been taking the internet, as well as the press, by storm all over social media. Bots matter because of their role in elections and politics, as they usually stand for whichever candidate benefits them. Many of these ‘bots’ spread misinformation that many people think is believable, because so many bots repeating the same stories make them sound believable. To combat the spread of misinformation and propaganda, social media companies should incorporate some type of security measure. While the security doesn’t have to be heavy-handed or infringe on people’s rights, it should filter out certain words, phrases, or even types of media. There should be a program that filters the media in people’s posts to make sure no bots or propagandists are allowed to post offensive material. The internet and social media should be places where everyone feels safe and comfortable. When there are so many types of issues on the internet, solving them through security measures should be a last resort, but it would overall make them almost impossible to bypass. Having these security measures would make everyone, both old and young, feel safe on social media and would keep trolls and other types of negativity away from it.
    Overall, there are many solutions to this problem, but remaining positive with trolls and other propagandists is the best medicine for an issue like this.

  6. DeVante M. November 7, 2019 at 10:56 pm #

    This TED Talk was very interesting, to say the least. The conversation is very fitting in today’s culture. For pretty much as long as the internet has existed, internet trolls and propagandists have roamed through it. They are out to spread misinformation and also harass innocent (and not so innocent) people. Now, I (or whoever else is reading this) may never be under the direct attack of a propagandist, but that doesn’t mean we won’t see it at all. On social media especially, this issue arises. Not only are there trolls on social media, but these trolls have what is called a “burner account.” These burner accounts are run by real people, but they are not their main accounts. A burner is simply an account used to harass other people; because the person’s name is not associated with it, they don’t face any consequences. Sometimes people use burner accounts to protect their real image. Take Kevin Durant, for example. A couple of years ago he was caught defending himself with another account. He was doing this because random people were suggesting reasons why he left his former basketball team. Kevin Durant used a fake account to say bad things about his former teammates, so no one would know it was him and he could say whatever he wanted without consequences. But he clearly didn’t do a great job, because shortly after, it was discovered that the burner account was indeed him. This came to light when he commented something thinking he was back on his main account, but he was still on his burner account. Now, this clearly doesn’t seem like a huge deal to many people, but it is surely frowned upon. This situation wasn’t anything too crazy, but there are many people out there whose sole purpose is to “troll” other people. Trolls will stop at nothing until they know they’ve won. How do they win, you may ask? They win when they know they’ve got your attention and have disrupted your life in some way.
    These people will harass you and at times try to bring your name down. In my experience, the best thing you can do to avoid trolls is just that: simply do not acknowledge their existence. They want a reaction from whoever they are trying to bother. If Kevin Durant had followed this advice, he probably wouldn’t have lost so much respect from the basketball community.
