Support more videos like this at patreon.com/rebecca!
For many years, I’ve been critical of attempts to suppress hate speech, especially on the part of governments. In Germany and Austria, for instance, it’s illegal to deny that the Holocaust happened. Back in 2006, I went on record to say that I didn’t think David Irving should be imprisoned by Austria for denying the Holocaust, despite the fact that he’s wrong, his lies are actively harmful and encourage anti-Semitism, and he’s a vile human being who I’d be quite happy to see punched in the face, repeatedly.
I was of the opinion at the time that these dangerous viewpoints need to be voiced if they exist, so that historians and scientists can actively combat them with the truth and so that all people can see what kind of ignorant bigotry is still out there.
I still think that argument has merits, but I’ve definitely softened my stance in the decade since, as Nazism rears its ugly head once again here in the United States. And I’ve pretty much always been of the opinion that aside from governments, private individuals and businesses have every right to censor whatever speech they want to.
That includes Reddit, where overall bigoted and offensive “edgy” speech tends to run rampant. Two years ago, the company decided to try to put a dent in that by banning some of the most obviously disgusting subreddits, with the most popular being r/fatpeoplehate and r/coontown. It’s pretty obvious from the names who the targets were: fat people (usually women) in the former and black people in the latter.
Of course, once these popular subreddits were banned, there were still plenty of smaller subreddits with similar viewpoints that continued to exist under the radar. So what happened when the ban was put in place? Did those users just relocate to other subreddits, to continue the same hate speech?
That’s the question that researchers at the Georgia Institute of Technology recently asked, and then tried to answer with science! One of the fun things about evaluating social media is that there’s so much juicy data to sort through. All you need is a smart question or two and a good algorithm for sorting through it.

In this case, the researchers built their algorithm by identifying key words used in all posts made in r/fatpeoplehate and r/coontown in 2015. They compared posts made in the two hate subreddits to posts made throughout the rest of Reddit, automatically highlighting the most frequently used unique words. These included slurs and also words tangentially related to bigoted arguments, like “IQ” and “hispanics” for r/coontown. The list of words was then manually pared down to just the most bigoted slurs. The researchers could then evaluate Reddit for instances of those words — it’s not perfect, since you can be racist without ever uttering a slur, and of course you can use slurs in an academic way to discuss the slur itself. But when you have such a large data set, it works for noting larger patterns.
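To make that two-step process concrete, here’s a minimal sketch of the general idea: find words that show up far more often in a target community than in a background sample, then measure how often a hand-curated lexicon appears in posts. This is an illustrative stand-in using a simple smoothed frequency ratio, not the paper’s actual term-selection method, and all function names here are my own inventions.

```python
from collections import Counter

def distinctive_terms(target_posts, background_posts, top_n=10):
    """Rank words that are unusually frequent in the target corpus
    relative to a background corpus. A simple smoothed frequency
    ratio stands in for the study's real term-selection technique."""
    def word_freqs(posts):
        counts = Counter(w for post in posts for w in post.lower().split())
        return counts, sum(counts.values())

    t_counts, t_total = word_freqs(target_posts)
    b_counts, b_total = word_freqs(background_posts)

    scores = {}
    for word, count in t_counts.items():
        p_target = count / t_total
        # Add-one smoothing so words absent from the background don't divide by zero
        p_background = (b_counts[word] + 1) / (b_total + len(t_counts))
        scores[word] = p_target / p_background
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

def usage_rate(posts, lexicon):
    """Fraction of posts containing at least one term from the
    (manually pared-down) lexicon."""
    lexicon = {w.lower() for w in lexicon}
    hits = sum(1 for post in posts if lexicon & set(post.lower().split()))
    return hits / len(posts) if posts else 0.0
```

The ranked list from `distinctive_terms` would then be manually reviewed, as the researchers did, and `usage_rate` applied to users’ posts before and after the ban to track changes in hate-speech frequency.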
Here are the patterns they saw: over 40% of FPH and CT users deleted or abandoned their accounts, compared to 20 to 30% of Reddit users in a matching control group. Among those who continued posting to other subreddits, their hate speech decreased by 80 to 90%. About 1,500 subreddits received influxes of FPH and CT “migrants” after the ban. However, the hate speech in those subs was mostly unaffected, meaning those “migrants” didn’t come in spewing the same bigotry they had been posting in their hate subs.
The researchers conclude that the ban “worked,” in that Reddit decreased the overall amount of hate speech without spreading the “infection” that was FPH and CT.
But the researchers also point out that the ban most likely made these users someone else’s problem. Many of them went to other websites where they could congregate and share the same bigoted ideas they were batting around on Reddit, meaning that the ban didn’t “work” to make the Internet, in general, a safer or better place.
That said, it’s worth remembering the other research we have on the danger of “echo chambers” in radicalizing people with violent and bigoted ideas, like ISIS and white nationalists. Taking away one prominent echo chamber, where participants have ample opportunity to lure in new gullible minds, may make a real difference after all.