
Is YouTube Winning the Fight Against Conspiracy Theorists?

This post contains a video, which you can also view here. To support more videos like this, head to patreon.com/rebecca!

Transcript:

Ah, YouTube. My blessing. My curse. The possible cause of the radicalization and dumbing down of a significant number of right-wing conspiracy theorists.

A new study from scientists at UC Berkeley has found that despite promises to do better, YouTube still has a problem with recommending conspiracy theory videos way, way too often to average viewers, trapping them in an echo chamber where a percentage of people will become radicalized.

The researchers spent more than a year looking at eight million recommendations. Those are the little thumbnails you see off to the right of your screen, at least until YouTube’s next redesign, after which point I will be pointing at nothing. Or if you’re watching this on Skepchick or my Patreon, I guess I’m just pointing at nothing, too. Look, you’ve seen a video on YouTube before, stop acting dumb. You know what I’m talking about.

The researchers found that back at the end of 2018, YouTube was recommending conspiracy videos up to 10% of the time. In the spring of 2019, the company promised to fix its shitty algorithm to stop pushing audiences to watch this garbage so frequently. Sure enough, recommendations dropped to about 3% by June. Mission accomplished, right? Well, it was a good start, but over the rest of 2019 the recommendations rose again and have fluctuated pretty wildly ever since.

The researchers point out that YouTube didn’t do all it could to remove misinformation, because of which topics the company decided to count as pseudoscience. Recommendations are down for things like flat-earthers and 9/11 conspiracy theorists, but they continue to flourish for misinformation about climate change and evolution, and even QAnon and Pizzagate conspiracies, which the researchers note “were described by the FBI as very likely to motivate some domestic extremists to commit criminal, sometimes violent activity.”

So YouTube has made a bit of an effort, but they could make much more of an effort. I should note that this research is far from perfect — for instance, by necessity all the data on recommendations had to come from a logged-out account, but a huge part of YouTube’s algorithm is based on collecting information about each unique user. That’s part of the researchers’ overall point, though: the public can’t be assured that YouTube is actually doing all they can to stop the radicalization and weaponization of their platform if they continue to hide how their algorithm works and make it as hard as possible for researchers to study it.

I want to point out that as a skeptic, I’m actually really at risk of these changes negatively affecting me. It’s great if YouTube’s algorithm is recommending conspiracy theory videos less, but it’s not great if it can’t distinguish between a video that promotes a conspiracy and one that refutes or questions it. For instance, I’ve made three videos discussing the science and myths of COVID-19, or coronavirus, and all three have been demonetized, which also means they are being recommended less. 

And back in 2018, YouTube completely removed my video debunking the idea of “crisis actors,” in which conspiracy theorists claim that people are only pretending to be dead in the wake of terrorist attacks or mass shootings. The company claimed that my video included “predatory behavior, stalking, threats, harassment, bullying, or intimidation,” though I hadn’t even mentioned any conspiracy theorists specifically. They claimed that the video had been flagged for review (with no explanation of whether that was due to user reports or an algorithm) and then deleted after review, which implied that a human watched it. That could not possibly have been true, but my only hope was to appeal the decision and wait weeks for them to reinstate the video, which they eventually did.

Honestly, if they really do have humans on hand to actually review videos before they’re removed, demonetized, or limited in recommendations, I would be fully on board with them increasing the number of topics they crack down on. But the problem is that we are then expecting YouTube to have higher standards than, for instance, the History Channel, a mainstream cable network that regularly gives a platform to the idea that ancient Egyptians were too fucking stupid to build the pyramids so it must have been aliens. And when it comes to climate change, the President of the United States thinks it’s a Chinese hoax, so we’d be expecting YouTube to censor him if he uploaded a video on it. And don’t pretend that it’s far-fetched that Donald Trump would upload a conspiracy theory video to YouTube. The only thing stopping him is the fact that he’s too fucking stupid to figure out how to do it.

So really, I’m glad that this report backs up what YouTube has said: that they managed to reduce the overall number of conspiracy theory videos people are watching. I hope they keep working on it and at least make an effort to help more researchers learn about what’s happening on the platform, before we have yet another QAnon mass murderer in the news.

Rebecca Watson

Rebecca is a writer, speaker, YouTube personality, and unrepentant science nerd. In addition to founding and continuing to run Skepchick, she hosts Quiz-o-Tron, a monthly science-themed quiz show and podcast that pits comedians against nerds. There is an asteroid named in her honor. Twitter @rebeccawatson Mastodon mstdn.social/@rebeccawatson Instagram @actuallyrebeccawatson TikTok @actuallyrebeccawatson YouTube @rebeccawatson BlueSky @rebeccawatson.bsky.social
