Study: Why Wikipedia is the Last Good Website

This post contains a video, which you can also view here.

Thanks to a recent issue of Wikipedia’s Signpost newsletter, I learned about a really interesting study on Wikipedia itself. I often talk about how social media networks and other websites fail to moderate users, leading to a proliferation of sexist, racist, homophobic, transphobic, or other hateful content. Reddit, for instance, has an upvote/downvote system that rewards “edgy” humor and the bland, toxic thoughts of the average 12-year-old unless the subreddit is moderated by people who are serious about their jobs. Their…unpaid jobs.

So it’s interesting that while Reddit, Twitter, and Facebook continue to circle the drain on their inevitable descent to the sewer, Wikipedia has only seemed to get better. It is also completely written and edited by random internet users, so why doesn’t it end up looking like the Nazi graffiti on dive bar toilet stalls?

Well, back in March a political scientist set out to answer that question. Well, sort of–that’s the question that interests me, but as a political scientist, Sverrir Steinsson wanted to challenge an existing political maxim that says that stable institutions only change dramatically due to external factors. He demonstrated that Wikipedia, in contrast, evolved over the years due almost entirely to internal change: namely, users leaving because they didn’t get their way.

So first, how does Wikipedia work? You might know that any anonymous user can make an account and change an article, and that other users can then revert that change. But the way that Wikipedia prevents this from creating an eternal edit war on every “controversial” page is through the use of several dispute settlement mechanisms. First, there are three main “rules” that can narrow down which edit is the preferred one: content must not be original research but supported by previous research, it must be verifiable so that readers can always check the sources of everything, and it must be presented from a “neutral point of view,” which is going to be important later. If there’s still a dispute, users can appeal to a larger group of editors to weigh in, or escalate to a smaller group of elected administrators on the “Administrators’ Noticeboard” or arbitrators on the “Arbitration Committee.” Those higher-level users can make final decisions, and that’s where we see positive change come about on Wikipedia.

Steinsson focuses on the “neutral point of view” rule, because it’s quite vague and leads to the most disputes: it means “representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic.” But let’s take homeopathy: does neutral point of view mean saying that some people think it’s real medicine and other people don’t? That IS neutral, but it’s also misleading, because no serious researcher or doctor in the world thinks that homeopathy is real medicine. A more accurate statement would be “homeopathy is pseudoscience,” but advocates for homeopathy would argue that that’s not “neutral.”

And by the way, the last time I mentioned homeopathy in a video I got a few requests for an overview video, so if you’re wondering what the deal is with homeopathy, I made a whole video about it that you can go watch right now!

It just so happens that the homeopathy Wikipedia page is one of 63 articles Steinsson reviewed for their focus on “pseudoscience, conspiracy theories, extremism, and fringe rhetoric in public discourse.”

He carefully examined the history of each article over time, and at the end of each year he rated each article on how it approached the idea of neutrality when it came to fringe positions:

1. “Fringe normalization: The fringe position/entity is normalized and legitimized. There is an absence of criticism.”

2. “Teach the controversy: The fringe position/entity is presented as a matter of active scientific or political dispute (A says X, B says Y).”

3. “False balance: The lead places emphasis on the expertise, credibility, evidence, and arguments of the anti-fringe side (e.g., ‘some scientists say,’ ‘some medical organizations say’), but the pro-fringe side still gets space to rebut.”

4. “Identification of the fringe view: The lead places emphasis on the legitimacy and the overwhelming numbers that compose the anti-fringe side (e.g., ‘scientific consensus,’ ‘the scientific community’), but space is still given to the pro-fringe side.”

5. “Proactive fringe-busting: Space is only given to the anti-fringe side whose position is stated as fact in Wikipedia’s own voice. The evidence that supports the anti-fringe position is presented, whereas the flaws of the pro-fringe perspective are outlined.”

So, here’s an example of what happened to the homeopathy page over time: it was created in 2001, and from then until 2006 it was described as a “controversial system of alternative medicine,” which Steinsson categorizes as “teach the controversy.” 

In 2006, context was added to say there was a “lack of convincing scientific evidence supporting its efficacy,” that it had been “regarded as pseudoscience,” and that in the words of a 1998 medical review it was a “placebo therapy at best and quackery at worst.” That’s “false balance.” Better, but not great.

In 2013 it was changed to read “the scientific community regards homeopathy as a sham” and “homeopathy is considered a pseudoscience.” So close! They’ve identified the fringe view as fringe.

Finally, in 2015 it was changed to read “Homeopathy is a pseudoscience.” Proactive fringe busting stated as fact in Wikipedia’s own voice.

There were similar evolutions for pages like vaccine hesitancy, conversion therapy, race and intelligence, and the “Lost Cause of the Confederacy,” the white supremacist myth that the Civil War wasn’t really about slavery.

To understand why this happened, Steinsson looked at the editors who were making the edits. He categorized them as either pro-fringe or anti-fringe, and then followed them to see what happened to them over time, and whether they continued to be active on the site long term. What he found was that in Wikipedia’s earliest years, disputes between pro-fringe and anti-fringe users tended to resolve in favor of the antis, which caused a knock-on effect in which the pro-fringe editors got frustrated and either left the site voluntarily or behaved in ways that got them sanctioned.

Over the years, this led to a power imbalance in favor of the anti-fringe editors, who stuck around and used their longer editing history to get elected as admins. They were then able to institute new guidelines to do things like require better sources for topics like medicine, creating a hierarchy of sources that further disadvantaged the pro-fringe camp.

So again, he explains all this in terms of political institutions, but I was interested in it first as a critical thinker who has supported fellow critical thinkers who have prioritized editing out pro-fringe content on Wikipedia since the early 2000s. And second, I’m interested in it as a case study on how other sites might go about stopping extremists from proliferating. The bad news is that in this case, the success is due to immediate, early action that tilts things in favor of rationality, so it’s probably too late for Reddit or Facebook. It’s certainly too late for Twitter, I mean my god.

But other sites, like Bluesky and Mastodon and whatever comes next, might be able to learn from this. It reminds me of this story from Michael B. Tager, originally posted on Twitter, where he no longer posts because of, well, the Nazis:

I was at a shitty crustpunk bar once getting an after-work beer. One of those shitholes where the bartenders clearly hate you. So the bartender and I were ignoring one another when someone sits next to me and he immediately says, “no. get out.”

And the dude next to me says, “hey i’m not doing anything, i’m a paying customer.” and the bartender reaches under the counter for a bat or something and says, “out. now.” and the dude leaves, kind of yelling. And he was dressed in a punk uniform, I noticed

Anyway, I asked what that was about and the bartender was like, “you didn’t see his vest but it was all nazi shit. Iron crosses and stuff. You get to recognize them.” 

And i was like, ohok and he continues.

“you have to nip it in the bud immediately. These guys come in and it’s always a nice, polite one. And you serve them because you don’t want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too.

And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it’s too late because they’re entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down.”

Good on Wikipedia for kicking out the Nazis, the conspiracy theorists, and the pseudoscientists immediately and shutting them down. Let’s hope future websites take note.

Rebecca Watson

Rebecca is a writer, speaker, YouTube personality, and unrepentant science nerd. In addition to founding and continuing to run Skepchick, she hosts Quiz-o-Tron, a monthly science-themed quiz show and podcast that pits comedians against nerds. There is an asteroid named in her honor. Twitter @rebeccawatson Mastodon Instagram @actuallyrebeccawatson TikTok @actuallyrebeccawatson YouTube @rebeccawatson BlueSky
