Inside The Bizarre World of Internet Trolls and Propagandists

from TED

Journalist Andrew Marantz spent three years embedded in the world of internet trolls and social media propagandists, seeking out the people who are propelling fringe talking points into the heart of conversation online and trying to understand how they’re making their ideas spread. Go down the rabbit hole of online propaganda and misinformation — and learn how we can start to make the internet less toxic.

More here.


2 Responses to Inside The Bizarre World of Internet Trolls and Propagandists

  1. Sean Distelcamp September 12, 2019 at 9:07 pm #

    The top suggestion that Andrew Marantz makes for curbing propaganda and misinformation on social media is that the companies must change their algorithms. This would certainly stop the average person from stumbling across a hateful community by chance, and there would be fewer cases like the woman who went from being moderate to attending white supremacist rallies in a month, all because she felt like an outcast. However, without a serious reconsideration of how we view free speech and the internet, these communities will still exist. Websites like Twitter, Facebook, Reddit, and 4chan all have substantial alt-right and other hate group communities. They also have vastly different algorithms for showing content. It’s true that you can change what Facebook allows on the site, or how Twitter promotes outrage, but 4chan has no systems in place to promote posts. Everything on 4chan is anonymous and equal, and that website is probably the most racist and hateful place on the internet. Conversely, Reddit has many admins and moderators who actively ban users, posts, and entire subreddits for hateful content. This certainly is better, but they simply cannot keep up. People still find ways to spread misinformation and outrage while either carefully maneuvering within the site’s rules or evading the watch of the admins. Almost everywhere on the internet that allows people to comment or post has a fringe group spreading hate and misinformation regardless of the rules and search algorithms, or lack thereof.
    The culture and the social norms of the internet are just different from those of any other format. People can be almost completely anonymous if they want to, and they can connect with people who share their views, no matter how insane, offensive, or obscure those views may be. Before the internet, there were certainly people who did not trust doctors or who were unsure about getting vaccinated, but now these people can connect to affirm each other’s beliefs while simultaneously excluding and straw-manning any opposing opinions. Movements such as flat-earthers or anti-vaxxers have come seemingly out of nowhere from these echo chambers. Giving a website the power to decide that flat earth articles should be deleted because they are misinformation seems like a fine solution. The problem, though, is that the “right” answers and the correct information are not always obvious. Facebook and other sites already have so much power over what we see and the information we digest. If they are allowed to police posts even more and decide what qualifies as misinformation, we are giving private corporations, which prioritize their own interests, control over most of America’s main sources of news and information.

  2. Joseph Antonucci September 14, 2019 at 2:57 am #

    Watching all of this was a real chore, and I should receive a medal for doing so. I’ll obviously be in the minority on this, but this guy is fear-mongering, plain and simple. He’s using terms such as “propaganda” to brand “fringe” viewpoints that don’t conform to the mainstream narrative as nonsensical or misleading. The only one engaging in propaganda in this dynamic is the man speaking in the video. He is trying to get the audience to cling even more tightly to the mainstream narrative and completely cast aside views that challenge that narrative.

    The conversation being had here is not “should these people be allowed to have and express their views, just like everyone else?” That’s what the conversation SHOULD be. Instead you have these numskulls pondering how we can “make the internet less toxic.” What does that mean? “Toxic” viewpoints are any viewpoints that oppose the mainstream narrative. For example, opposing mass immigration. If you hold this belief, you should be ostracized from society, kicked off social media, and fired from your job, because you are a bad person and you must hate everyone that does not look and talk like you. The reality is that there are plenty of substantive arguments for why allowing a ton of people into ANY country is going to cause some problems, and this is not a viewpoint that festers in spooky online message “propaganda” boards in most cases, it’s grounded in common sense and history.

    Andrew is trying to convince you that Internet memes cause people to become radicalized (again, this is a buzzword term and holds no real weight), and his proposed solution is to suppress the evil scary spooky fascist white supremacist racist sexist blah, blah blah… in other words, cut off free speech.

    To directly address his remedies:

    1) “Be a Smart Skeptic”
    Andrew condescendingly speaks to white teens who question such things as “male privilege” or “white privilege” and tells them they’re being a jerk for doing so. He also casually throws in a “flat earth” meme before saying that general skepticism and independent knowledge gathering is still a good thing.

    Response:
    a. His argument here is that you should be skeptical and open-minded, but not TOO skeptical and open-minded, to the point where you question the mainstream. Doing so means you’re “toxic” or, as he said, “being a jerk.” He’s encouraging skepticism, but he’s also saying you shouldn’t question certain things, and that if you do, you’re a crappy person.
    b. He also sneakily takes such things as white privilege and male privilege (zero evidence that either of these things exists systemically) and lumps them in with the flat earth thing. He’s taking something completely outlandish that few people believe, the flat earth theory, and lumping those people in with anyone who questions things such as white/male privilege, in an attempt to discredit them as “conspiracy theorists.”

    2) “Free Speech is just a starting point”
    Andrew claims to be pro free speech, and then says some people should not have Twitter accounts.

    Response:
    He says he supports free speech, except for people with bad political opinions who should be censored from the Internet… Free speech for some, not for all; free speech for anyone conforming to the mainstream, not for the “fringe” people.

    3) “Make decency cool again”
    Andrew talks here about how social networks should be “fixed” to prevent the spread of “hateful” information. He says this should be done via “algorithm” changes.

    Response:
    His solution is algorithm manipulation, which is something already practiced by most (if not all) social media platforms. The trending section of Twitter has been manipulated on many occasions before my very eyes, as one hashtag with hundreds of thousands of mentions is removed from the trending page and replaced with something totally different that has only a quarter of the mentions of the first hashtag. Google manipulates search suggestions and results as well.

    I find it ironic how many people complain about the power of big tech monopolies and the influence they exert, yet many of these people are totally content letting these same companies have full control over what information is and is not accessible, or deciding who should be allowed to speak and who should not.

    Also worth noting: the key to preventing most or all future “white supremacist terrorists” (statistically a non-issue) from committing heinous acts is very simple. Anyone who has read the shooter “manifestos” that have circulated is easily able to determine what the proper course of action is. That is, unfortunately, a conversation for another day.
