Well, season 2022 of Earth has taken quite the turn from “global pandemic” to “land war in Europe” as Vladimir Putin did the thing that most experts agreed would be extremely stupid, and invaded Ukraine. It’s interesting that this absolutely horrific move has pretty much united the world against Putin, including a significant portion of Russians (though a less significant portion of conservative Americans). It’s like…okay, let’s set aside our feelings about vaccine mandates and just all agree that Putin is a pathetic bully who should be beaten to death in the streets.
Now, don’t worry, I promise I will not be one of those talking heads who suddenly becomes an expert on geopolitics. I think it’s pretty obvious who is in the wrong here, and I think that this is one of those cases where certain sanctions (like kicking Russia off SWIFT) can be effective, but all I can really do is trust the experts and throw money at humanitarian organizations like Doctors Without Borders.
But while I’m not an expert on, you know, war, I am something of an expert on misinformation, and that becomes a very big deal every time there’s a big, scary, breaking live event. Waaaay back in the 90s we had to get our news once a day from bland newscasters with nice teeth, and for really important breaking news sometimes the bland people would interrupt Power Rangers.
The benefit of that system was that it gave everyone time to sort out what was really happening, and one or two trusted sources from which to get that information. The drawback was that we only got one perspective, and the delay meant that the news couldn’t be used to convey important information to the people affected.
Social media changed all that: now we get news literally as it happens, which can be crucial for helping people in the midst of an emergency, and it’s great that we get various perspectives from people on the ground, but the drawback is that it’s much, much easier to spread misinformation, either purposely or accidentally.
There’s been loads of research done to determine how misinformation spreads on social media, whether it’s a real problem, and how to stop it. I’ve talked about some of this before: data shows that corrections to viral but wrong tweets rarely reach the same size audience, like during the Occupy Wall Street protests when NBC tweeted that the NYPD closed the airspace over the crowds. The NYPD quickly corrected them, and within the hour NBC issued a tweet retracting the information. The incorrect tweet reached more than 200 retweets every 10 minutes, while the correction barely reached half that.
But other data suggests that in breaking situations, people tend to be pretty good at telling fact from fiction. During the Chilean earthquake of 2010, researchers looking at viral tweets found that “false rumors tend to be questioned much more than confirmed truths, which we consider a very positive result. As an application, given that detecting when a tweet is asking for information should be possible to do with state-of-the-art text classifiers, microblogging platforms could for instance warn people that many other users are questioning the information item they are reading. This would provide signals for users to determine how much to trust a certain piece of information.” And of course, Twitter DID institute just such a thing, tagging potential misinformation during COVID.
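To make the researchers’ idea concrete: the signal they describe could be as simple as measuring what fraction of replies to a claim are questioning it. Here’s a toy sketch of that in Python — the keyword heuristic, function names, and threshold are all my own invented illustrations, not the study’s classifier or any platform’s actual system, which would use a trained model rather than a regex.

```python
import re

# Crude stand-in for a "is this tweet questioning a claim?" classifier.
# A real system would use a trained text classifier; this is illustrative only.
QUESTION_MARKERS = re.compile(
    r"\?|is (this|that|it) (true|real|confirmed)|source|citation|debunk",
    re.IGNORECASE,
)

def looks_like_questioning(reply: str) -> bool:
    """Heuristic check: does this reply appear to question the original claim?"""
    return bool(QUESTION_MARKERS.search(reply))

def questioning_ratio(replies: list[str]) -> float:
    """Fraction of replies that question the claim (0.0 if there are no replies)."""
    if not replies:
        return 0.0
    return sum(looks_like_questioning(r) for r in replies) / len(replies)

# Hypothetical replies to a viral claim:
replies = [
    "Is this real? Source?",
    "Wow, amazing!",
    "I think this has been debunked already",
    "Sharing with everyone I know",
]

ratio = questioning_ratio(replies)
if ratio > 0.4:  # arbitrary threshold for this sketch
    print(f"Warning: {ratio:.0%} of replies question this claim")
```

The point isn’t the specific keywords — it’s that “how many people are pushing back on this?” is a measurable quantity a platform could surface to readers, which is roughly what Twitter’s misinformation labels ended up doing.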
But even careful people can share misinformation – for instance, the outrageously inspiring story from Snake Island, where Ukrainian border guards were threatened by a Russian warship to surrender or else be fired on. In audio that was shared widely, you can hear them telling the warship to go fuck itself. Then, according to Ukrainian officials, they were all killed. That was reported on by mainstream news sources, but this week the State Border Guard Service of Ukraine revealed that the guards may actually have been taken alive as hostages. They revealed this, fittingly, in a post on Facebook, in which they described Russian propagandists spreading misinformation about Ukrainian forces surrendering, which makes the situation even more difficult to understand. When even the officials aren’t sure what’s happening, how can the average person a million miles away understand it?
Personally, when I read about the Snake Island story, I saw that it was in several mainstream articles, then I listened to the audio, and I don’t speak Russian or Ukrainian but a lot of people who do confirmed that that’s what was being said, so I shared it. I only shared the audio exchange because I wasn’t positive they’d all been killed (though I believed it, and I still believe they may have been killed).
And when I shared it, I did think “well, on the off-chance it’s wrong at least it’s wrong in the ‘right’ direction: propaganda to inspire people to support the Ukrainians.” Not that I’m champing at the bit to share any misinformation, but there IS room for propaganda in war. It’s important. I’ve seen a lot of people mocking the number of memes flying around, and sure, Putin isn’t going to give up because he saw a mean meme about him. But propaganda does work: to inspire people on the ground to fight back against an overwhelming enemy, to convince bystanders to offer economic support, to exhaust the enemy and erode their support, and of course to sometimes drastically change the course of a war. Consider the famous example of British propagandists spreading the rumor that their pilots had superhuman eyesight due to eating carrots, in order to cover the fact that they had radar.
The “Ghost of Kyiv” rumor may fall into that category: is there really a Ukrainian fighter pilot who is shooting down Russian planes at an incredible rate? Yes, the photos and videos that made their way around social media weren’t real – they were clips from a combat flight simulator video game – but the story MAY be true. Whether it’s true or not, though, the stories are an unqualified good for Ukraine’s psychological warfare. The idea inspires Ukrainians, fascinates foreigners who want to help, and might terrify the Russian troops who are sent into Kyiv.
In World War I, the “Red Baron” was a real fighter who really shot down something like 80 Allied planes, and he became Germany’s single biggest celebrity. The power of that inspirational propaganda made him “worth as much…as three divisions” according to a German general. In terms of staying under the radar, so to speak, it was probably not the smartest thing to paint his biplane bright red, but as a symbol it was outrageously effective. I mean, until he was shot down and killed by a bunch of people who could see him very clearly in the sky.
Don’t get me wrong, I’m not arguing in favor of knowingly sharing propaganda. But I do think it’s worth understanding that not all misinformation is equal, and I think it’s interesting to be experiencing a time when certain bits of misinformation can actually be…good?
That said, it’s better to double-check everything before you share it. I’d rather not share anything than accidentally share something that is wrong and could be damaging to the people who are currently most at risk on the ground in Ukraine. I highly recommend you do a few things before you smash that retweet button on a breaking news report or photo or video: first, ask why you’re sharing it and whether you need to at all. Then ask if it’s too good to be true, if it confirms your existing worldview a little too perfectly. Then ask if it’s being reported from several different sources, or if all the outlets are relying on the same single source. Mainstream news gets things wrong a lot, but if established outlets are all reporting the same thing, there’s a better chance that it’s true. And finally, there are loads of people out there who are committed to investigating and debunking things in real time, like Snopes, who just investigated the viral Ukrainian road sign supposedly reading “Go fuck yourself” that was shared on Facebook by the official State Agency of Automobile Roads. The sign was digitally altered as a joke to catch the eye of Ukrainians and urge them to tear down and alter road signs to confuse invading Russians. In other words, weaponized misinformation: the good kind.
It’s inevitable that people will be fooled by propaganda, that people will knowingly share misinformation, and that the “fog of war” will prevent even careful investigators from teasing fact from fiction in every circumstance. But you can help by not being part of the problem, and not sharing dodgy info on social media.