Support more videos like this at patreon.com/rebecca!
A common theme of my videos is “how easily can pseudoscience spread on social media,” with the usual answer being “very, very easily.” The follow-up question is usually “what can we do about it,” and the answer there is “well, it’s complicated, but also it doesn’t really matter unless the companies who run social networks actually step up to do something about it.”
In that vein, today I present you with a tale of two social networks. The worst of sites, of course, is Facebook. They are literally heralding the downfall of humanity, as the garbage dump where pseudoscience, xenophobia, misogyny, and straight up neo-Nazism go to propagate. Mark Zuckerberg is constantly trying to reassure us little people that he’s working on the problem, even while the problem keeps showing itself to be bigger and bigger, like an iceberg that is actually just the hat worn by a Lovecraftian horror from the dark depths.
Most recently, Zuckerberg held a public conversation in which he announced that Facebook was looking at using human fact-checkers to try to stop the flow of misinformation on the network — the problem being that they just need a lot of them, and so he was basically talking about crowdsourcing fact-checking.
For comment, The Guardian turned to former Snopes editor Brooke Binkowski (who, for the record, I think is great). She said, “You can’t apply an open-source model to factchecking and journalism. You have to have experts. You can’t just have Joe Schmo who thinks that the New York Times is a liberal rag, just because Trump says it’s the enemy of the people.”
I love Binkowski and hate Zuckerberg, but I do have to say that there is evidence that crowd-sourced fact-checking can work. I’ve spoken before about studies showing that in some cases humans are good at telling fact from fiction, like when sharing critical information on social networks in the aftermath of earthquakes. Of course, in other cases it’s the opposite — humans just spread whatever information they have handy, true or not. It’s tricky, and what it usually comes down to is the social network itself: how is it set up? How does it subtly or obviously help people share what’s true and discourage them from sharing misinformation?
Take Wikipedia, for example. That’s an organization that somehow manages to motivate unpaid volunteers who are not necessarily experts in all the fields they’re writing about, and incentivizes them to be as accurate as possible.
Or take Reddit. As a whole, Reddit is built to spread misinformation, because misinformation is flashy and exciting and fun while the truth usually isn’t, and whatever is flashy and exciting and fun on Reddit gets upvoted, and boring things are lost forever. With no other moderation in place, it’s pure unadulterated garbage. But in some subreddits, volunteer (and often amateur) admins make all the difference. r/science is generally really good, and they use a heavy hand in deleting anything that isn’t on topic or backed up with evidence.
So no, you can’t just say “let’s crowdsource fact-checking” and have it magically work through the wisdom of crowds or some other pop-sci bullshit you read about in Reader’s Digest. (Does that still exist? I used to love it.) You need to have a well-designed algorithm helping things out. You need to give the people the tools they need. You need to incentivize the truth and disincentivize lies. And to do all that, you need to care about it. And guess what Mark Zuckerberg cares about? Yep. Money. There’s a reason Wikipedia has to beg for donations while Mark Zuckerberg is worth $61 billion. Misinformation makes money. Fact checking? It’s not gonna happen.
On the other hand, there’s Pinterest. If you don’t have a girlfriend getting married soon, you probably forgot Pinterest exists, but I assure you that 250 million people are currently using it (compared to about a billion on Facebook, just in case you were wondering). And they just enacted a really good measure. Pinterest realized that the majority of posts on their site referencing vaccines were spreading dangerous misinformation about them, claiming that they cause autism or other diseases (which they do not). We are currently seeing more and more outbreaks of vaccine-preventable diseases like measles, both in Washington State and in the Philippines, where 189 people have died. It’s extremely important that we not just educate people about the safety of vaccines, but that we stop the spread of misinformation about them.
So Pinterest decided to stop returning any results for anyone searching for vaccines, positive or negative. Why not keep the positive ones? Well, they probably have a hard time distinguishing those using an algorithm, an especially difficult task considering that the network is image-based. And since the majority were negative, they’ve just nixed them all while they work out a better way to separate the pseudoscience from the science.
It’s not a perfect solution, but it’s a huge step that other social networks haven’t taken. Why? Probably because you’re risking a bunch of “Tiger Moms” getting angry and leaving for another network. But Pinterest has decided to make public health a priority over money. Apparently they’ve had a public health initiative in place since 2017, when they took steps to stop people from posting fake cancer cures, like those pushed by essential oil sellers. That’s huge — maybe it’s a stereotype, but I feel that essential oil cancer scammers and Pinterest have the same target market, so Pinterest is taking a big risk in alienating those people. But that overlap in target market is also why it’s so important for them to do the right thing here, and they have. It’s really cool, actually, and it makes me want to start using Pinterest. I mean, when I get married next time or whatever.
So yeah, good job Pinterest! And also good on them for now pointing at Facebook and asking them when they’re going to do something similar. Zuck? We’re waiting! Any minute now, I’m sure.