How the Pelosi Attack Conspiracy Theory Spread: Do People Believe It?
This post contains a video, which you can also view here. To support more videos like this, head to patreon.com/rebecca!
Transcript:
Look guys, I am extremely tired. Maybe it’s because I just spent a weekend in Vegas. Maybe it’s because I came home and stopped all use of caffeine and alcohol, “for my health”. Or maybe it’s because a conservative conspiracy theory convinced a guy to try to kidnap the Speaker of the House of Representatives and instead of conservatives being disgusted by this they just made up MORE CONSPIRACY THEORIES to make it seem like not a big deal and I am SO. TIRED.
Okay, so in the early morning of Friday, October 28, a 42-year-old man broke a window to enter the home of Nancy Pelosi, Speaker of the House and 2nd in line of succession for the Presidency. He woke her sleeping husband and said, “Are you Paul Pelosi? Where’s Nancy? Where’s Nancy?” Paul Pelosi informed him that Nancy was out of town for a few days and managed to call the cops, and police say that as they arrived, the would-be kidnapper started bashing Paul Pelosi with a hammer. They arrested him and took him away.
That’s…terrifying, actually! Like…okay, cards on the table, I am politically opposed to Nancy Pelosi. I am a progressive and she is a centrist who has in the past spent considerable time opposing progressives. I say this not to look like a saint for condemning her would-be abductor, but to simply provide an example of how a reasonable person disagrees with a reasonable politician: I think Pelosi should have held Trump’s feet to the fire before giving him nearly $5 billion for the border. But she’s also done good work pushing through efforts like the Affordable Care Act, and I understand that her senior position probably sometimes requires her to be more conservative than I’d like.
And despite my general dislike of Pelosi, one thing I won’t be doing is 1) laughing at the story of her 82-year-old husband having his skull cracked by a hammer-wielding maniac or 2) making up and/or spreading conspiracy theories that the attacker was in some illicit relationship with him.
Things are always chaotic in the hours following a breaking news story, which is why personally I try to only publicly comment on them if I can do so with facts that are reasonable and backed up by solid sources. From the jump on Friday it seemed to be public knowledge that there was an attacker who broke in looking for Nancy Pelosi and attacked her husband when he couldn’t find her. I immediately assumed that this was politically motivated, by which I mean this was a far-right kook, because what are the chances?? But I kept my mouth shut and waited for more info.
And more info made that all crystal clear: David DePape admitted to first responders that he was sick of the “lies coming out of Washington D.C.,” the filing said. “I didn’t really want to hurt him, but you know this was a suicide mission. I’m not going to stand here and do nothing even if it cost me my life.”
He also said he had other targets, “including a local professor as well as several prominent state and federal politicians and members of their families.”
He’s been in the news before, having been a “father figure” to a locally famous San Francisco nudist activist who is also a prominent 9/11 truther. According to the LA Times, “DePape followed a number of conservative creators online, including Tim Pool, Glenn Beck, DailyWire+ and the Epoch Times.”
DePape also has a blog called Frenly Frens, a name taken from a meme popular amongst neo-Nazis, ironic or otherwise. He used it to post typical far-right nonsense: Pizzagate, casual antisemitism, “COVID isn’t real,” “women are bad,” etc. He even says Gamergate was his introduction to this world of bullshit.
It all builds into a very clear picture: a “wacky” guy who is obviously prone to conspiracy theories falls into the far-right echo chamber, where he becomes radicalized. He hears nonstop about how evil Nancy Pelosi is, how she was central to “stealing” the election from Donald Trump, and so he decides to go do something about it. But just like with COVID, where no amount of evidence that it’s real, that it kills, that masks work, or that vaccines work can convince some people, reality simply does not matter here. Loads of conspiracy theorists are already claiming the blog is faked because they don’t understand the difference between when a site is published and when it is archived by archive.org.
And DePape’s history of dabbling in left-wing conspiracy theories has led many conservatives to pretend that all of that other obvious right-wing stuff is just made up or wildly exaggerated. So if he isn’t a far-right conspiracy theorist, who is he? Well, obviously, because Pelosi is a liberal in San Francisco, it’s clear the attacker is Paul Pelosi’s secret gay lover.
Yeah.
The New York Times has a very nice timeline of how prominent Republican leaders spread this weird, unfounded, homophobic conspiracy theory: first, a local SF Fox affiliate station falsely reported that DePape was found in his underwear (which, even if it were true, would still not come close to suggesting a consensual relationship with the 82-year-old husband of the Speaker of the House). Regardless, that kicked off a tsunami of homophobic speculation, which was then boosted by new King of Twitter Elon Musk, Senator Ted Cruz, Representative Marjorie Taylor Greene, Fox News’s Jesse Watters and Tucker Carlson, and many others.
All of this happened around the same time that I happened upon a Psychology Today post by Dylan Selterman, Ph.D., about how “Experiments show that people share content on social media that they know is false. People care about their political tribe and popularity more than accuracy. Those who share misinformation aren’t gullible. They’re social actors.”
The recent study Selterman uses to support this “found evidence that when sharing content on social media, people will sacrifice accuracy in exchange for social standing. This was true even for content that was not political. For instance, participants were asked if they were willing to share a story touting a debunked conspiracy theory that the TWA 800 plane crash was caused by a missile launch followed by a government cover-up (this is not a salient issue that divides liberals and conservatives). While most of the participants said they would never share such fake news, over 40% of participants were willing to do so, provided that there was an incentive. In part, they were doing it for the “likes,” or to be socially recognized. They were also likely to share misinformation when the researchers offered raffle tickets in exchange for the number of likes that they received. Participants seemed to realize that they would get more engagement if they shared conspiracy theories and that this would pay off for them.”
And that’s true! I found that study, published in the January 2023 issue of the Journal of Experimental Social Psychology. That’s right, hello from the future. It sucks here, same as back when you are. Don’t get your hopes up.
Anyway, I’m happy to say this is a solid study! It was pre-registered, meaning the researchers committed to their hypotheses and analyses in advance instead of just looking at a bunch of data and picking out whatever looked good. The researchers built a little mini social network and recruited people to pick which “posts” they would share, including conspiracy theories and real news stories that were both rated by a 3rd party. So a subject would see either “The Astros won the World Series” (rated 6 out of 7 on the Truthiness Scale) or “Birds aren’t real” (rated 2 out of 7 on the Truthiness Scale). Even though people know that birds ARE real, they could be incentivized to share that one instead – not just through money or some other physical reward, but with “likes” and “shares” from, well, bots, basically. It’s not a real social network.
That’s interesting, but there’s a pretty obvious drawback to all this (as the study authors themselves note): the subjects know this isn’t the real world, and they know there will be no repercussions for sharing misinformation. No shaming, no loss of respect, no “what is wrong with you, I raised you better than that” from their mom, nothing. So we can’t say that this is how people will behave in the real world, but we can say that this gives us a look at one aspect of the spread of conspiracy theories on social media: regardless of truth or even belief, if you reward people for sharing misinformation, some people will share misinformation. Even if the reward is a thumbs up, a fave, a like, an upvote, whatever. And even if those rewards come from bots.
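If you want to see how that incentive mechanism works, here’s a toy simulation in Python. To be clear, this is my own illustrative sketch, not the researchers’ actual code or materials: the posts reuse the example above, but the “likes,” weights, and sharing threshold are all numbers I made up for the demo.

```python
import random

# Toy model of the incentive effect described above. Each simulated user
# weighs a post's truthiness against the "likes" they expect it to earn,
# and the bots in this little world reward conspiracy content more heavily.

random.seed(1)

POSTS = {
    "The Astros won the World Series": 6,  # truthiness rating (real news)
    "Birds aren't real": 2,                # truthiness rating (conspiracy)
}

def expected_likes(truthiness):
    """Engagement a post earns; here, bots inflate low-truthiness posts."""
    return 15 if truthiness <= 3 else 5

def share_rate(truthiness, incentive, n_users=10_000):
    """Fraction of simulated users who choose to share the post."""
    shared = 0
    for _ in range(n_users):
        accuracy_weight = random.uniform(0.5, 1.5)   # how much this user cares about truth
        reward_weight = incentive * random.random()  # how much they care about likes
        utility = accuracy_weight * truthiness + reward_weight * expected_likes(truthiness)
        if utility > 7:  # arbitrary sharing threshold
            shared += 1
    return shared / n_users

for incentive in (0.0, 0.5, 1.0):
    for text, truthiness in POSTS.items():
        print(f"incentive={incentive}: {share_rate(truthiness, incentive):.0%} shared {text!r}")
```

The specific numbers are meaningless; the shape of the result is the point. With no engagement reward, essentially nobody in the simulation shares “Birds aren’t real,” and as the reward grows, the share rate for the false post climbs even though every simulated user “knows” it’s less true.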
And of course, we already know that this is true. We KNOW that SOME people will share misinformation for rewards even when they know it’s untrue and even when they know it’s dangerous. We have clear evidence that people like Alex Jones, Charlie Kirk, and Fox News grifters KNOW they’re telling lies but they do it anyway, because we reward them with money and attention and power. Like, come on, even I don’t think Tucker Carlson is so stupid that he thinks 82-year-old Paul Pelosi was secretly paying for a 42-year-old rentboy who looks like this. But it helps him cash those checks.
So it’s not surprising that the same thing happens on a smaller scale: little baby Tucker Carlsons will share an obvious lie because it gets them likes, retweets, upvotes, attention.
All of that sounds reasonable to me, though back on Psychology Today, Selterman wraps up by saying:
“These results challenge some of the widely held notions that people share misinformation on social media because they are gullible. On the contrary, most people who share falsehoods do so because they believe it provides them a social benefit—to gain attention, status, popularity, and respect in one’s community, or to help one’s political tribe defeat another political tribe. These motivations have little to do with people’s naivete or poor reasoning skills. This is why I don’t share the view that misinformation on the internet is an enormous problem that threatens democracy. I think at best, it’s a nuisance on par with similar types of misinformation found in supermarket tabloids, which have been around for more than a century.”
Uh, no? Just to be clear, this study did NOT show that most people share falsehoods because of the social benefit. It actually showed that most people don’t share falsehoods at all, and among those who do, some were more likely to share them if they were rewarded, in an environment in which they would suffer no negative consequences. In reality, there ARE PLENTY of people who are naive and have poor reasoning skills. It takes all kinds to spread a conspiracy theory, and that includes both the nefarious lie-pushers AND the credulous believers who lap it up. And yeah, I don’t think you can really compare the “stop the steal” conspiracy theory, for instance, to the Weekly World News. As far as I know, no one ever broke into someone’s house and cracked someone’s skull with a hammer because they thought Bat Boy was hiding inside.
No, let’s leave the last word to the authors of the study, who came to the opposite conclusion from Selterman:
“The spread of conspiracy theories has filled the online environment with misleading information that severely hinders our ability to make effective decisions and address global crises. To effectively curb the spread of these conspiracy theories, it is crucial to not only educate people on how to discern true news from conspiracy theories, but also to regulate and improve the social environment in which people share information (Serra-Garcia & Gneezy, 2021). It may be hard to suppress people’s motive to generate “likes” as they interact with others on social media. However, it may be feasible to reduce the social engagement rewards people derive from sharing conspiracy theories by regulating bots—and providing greater social rewards for sharing true news.”
Well, that sounds like a good idea to me. If it sounds good to you, please go ahead and LIKE COMMENT AND SUBSCRIBE, tell that algorithm what you like.