Wikipedia’s not as biased as you might think

All viewpoints welcome. Image: Reuters/Jim Young

The internet is as open as people make it.

Often, people limit their Facebook and Twitter circles to like-minded people and follow only certain subreddits, blogs, and news sites, creating an echo chamber of sorts. In a sea of biased content, Wikipedia is one of the few online outlets that strives for neutrality. After 15 years in operation, it’s starting to see results.

Researchers at Harvard Business School evaluated almost 4,000 articles in Wikipedia’s online database against the corresponding entries in Encyclopaedia Britannica to compare their biases. They focused on English-language articles about US politics, especially controversial topics, that appeared in both outlets in 2012.

An encyclopedia that anyone can edit, on any topic, hardly sounds like it should converge on neutrality. “That is just not a recipe for coming to a conclusion,” Shane Greenstein, one of the study’s authors, said in an interview. “We were surprised that Wikipedia had not flailed, had not fallen apart in the last several years.”

Greenstein and his co-author Feng Zhu categorized each article as “blue” or “red.” Drawing on political-science research, they identified phrases that are idiosyncratic to each party: Democrats were more likely to use phrases such as “war in Iraq,” “civil rights,” and “trade deficit,” while Republicans favored phrases such as “economic growth,” “illegal immigration,” and “border security.”
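As a rough illustration of how this kind of phrase-based slant scoring can work, here is a minimal Python sketch. The phrase lists are taken from the examples above; the raw-count scoring rule, the `slant_score` name, and the [-1, 1] scale are illustrative assumptions, not the study’s actual index, which weights phrases by how diagnostic they are of each party.

```python
# Illustrative sketch of phrase-based slant scoring (not the study's method).
# Phrase lists come from the article's examples; the scoring rule is assumed.

DEMOCRAT_PHRASES = ["war in iraq", "civil rights", "trade deficit"]
REPUBLICAN_PHRASES = ["economic growth", "illegal immigration", "border security"]

def slant_score(text: str) -> float:
    """Return a score in [-1, 1]: negative leans blue (Democratic),
    positive leans red (Republican)."""
    text = text.lower()
    blue = sum(text.count(p) for p in DEMOCRAT_PHRASES)
    red = sum(text.count(p) for p in REPUBLICAN_PHRASES)
    total = blue + red
    if total == 0:
        return 0.0  # no partisan phrases found: treat the article as neutral
    return (red - blue) / total

# Example: one "blue" phrase and one "red" phrase cancel out, scoring 0.0.
print(slant_score("Coverage of the war in Iraq shaped debates on economic growth."))
```

Under a scheme like this, an article’s score moves toward one pole as it accumulates more of that party’s characteristic phrases, which is the intuition behind labeling articles “blue” or “red.”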

In its initial years, Wikipedia’s crowdsourced articles were tinted distinctly blue, slanting toward Democratic views and displaying greater bias than Britannica. “Wikipedia content, it’s true, starts out as a little bit of a loud Democrat, as I sometimes joke,” said Greenstein. However, with more revisions and more volunteer moderators on the platform, the bias wore away. In fact, the most-revised quartile of the Wikipedia sample showed no remaining difference in slant or bias from its offline counterpart.

“In comparison to expert-based knowledge, collective intelligence does not aggravate the bias of online content when articles are substantially revised,” the authors wrote in the paper. “This is consistent with a best-case scenario in which contributors with different ideologies appear to engage in fruitful online conversations with each other, in contrast to findings from offline settings.”

More surprisingly, the authors found that the 2.8 million registered volunteer editors who were reviewing the articles also became less biased over time. “You can ask questions like ‘do editors with red tendencies tend to go to red articles or blue articles?’” Greenstein said. “You find a prevalence of opposites attract, and that was striking.” The researchers even inferred the political leanings of a number of anonymous editors from their IP locations, and the trend held.

Wikipedia was not surprised by the study’s findings. “When you have that space of discourse, people who are dedicated to knowledge will come to more balanced perspectives over time,” Juliet Barbara, a representative at the site’s parent organization, the Wikimedia Foundation, tells Quartz. That’s not to say there aren’t vicious edit wars, but more often, both sides eventually come together.

To keep hateful discourse off the platform, the online encyclopedia lays down ground rules. “Wikipedia does not agree on ‘one single truth’ on a topic,” Barbara said. She believes Wikipedia users are dedicated individuals who make a point of detecting bias and rooting it out. Another equalizing rule: unsourced opinion can’t stand on the platform; everything has to be backed up by reliable sources.

Today, Wikipedia is less overtly blue or red and instead looks purple with “a slight blue leaning to it,” says Greenstein. Though he hasn’t done such an analysis, he says New York Times editorials would look somewhat similar.