Gizmodo’s recent revelation that Facebook has been biasing its trending news feed against conservative news stories led to a brief firestorm. At the height of the controversy, Republican Senator John Thune of South Dakota threatened to hold hearings on the matter. It all quickly came to nothing: Facebook founder Mark Zuckerberg made nice with conservative leaders, Thune’s people talked to Zuckerberg’s people, and that was that.
Case closed, right? I don’t think so. There are at least five ways Facebook could influence voting in the US presidential election this fall, and that’s just on its Facebook.com social media platform. The platform has about a billion active users worldwide, including 72% of adults in the US, so if these methods are employed, they will have a significant impact on our elections. Beyond that, the company also owns Instagram, WhatsApp, and about 50 other companies through which it might be able to exert considerable additional influence. For now, though, let’s stick with Facebook. Here are the five methods:
1) The trending box. Gizmodo quoted an anonymous whistle-blower who used to work on the “curator” team that decided which stories would run in the “trending box” (upper right on the Facebook page—see figure) and which would not. The whistle-blower said not only that the team was systematically removing conservative-oriented stories from the feed for this box but also that he believed the team’s choices were being used to train an algorithm to do what the team was doing. If the latter is true, that’s scary. He didn’t say he had any evidence that the team’s actions were being directed by Facebook executives. The biased choices the team was making could simply have been a natural consequence of how team members were recruited: mainly from Ivy League universities in the Northeast, a demographic that’s typical of Silicon Valley companies. (It helps explain the liberal-leaning tendencies of the high-tech industry.)
How could a trending box shift votes—or, for that matter, shift opinions about lots of things? An increasing body of research, including my own, has shown that when people are undecided about an issue, a trusted source of information can easily tip them one way or the other. People in some demographic groups are especially susceptible to this kind of manipulation. A biased feed could potentially shift far more people than an openly biased source like Fox News because people expect a feed on a platform like Facebook to be impartial. The more people trust the feed, the greater its influence, in part because a feed is a gateway: it links people to other material, some of which they read at length. When links to certain material are omitted from a feed, fewer people read it, and its influence is diminished.
According to the company, the trending box shows you “a list of topics and hashtags that have recently spiked in popularity on Facebook,” which might imply that it is generated entirely by an algorithm. Gizmodo’s revelations show that there is also a significant human element to the selection process, which means potential for bias and manipulation.
2) The center news feed. Facebook’s main news box, which appears in the center of the page, can exert even more influence than the trending box, in part because Facebook sometimes pins one item to the top of the feed, thus increasing its impact dramatically. Do biased humans play any role in these selections? If the selections are entirely controlled by an algorithm, was that algorithm trained by biased humans? Are employees tinkering with the algorithm to tilt it one way or another? We don’t really know.
What we do know is that Facebook has become one of the most influential news sources in the US. The 2016 Edelman Trust Barometer, which measures the level of trust people have in both businesses and governments, continues to rate the tech industry as the most trusted of all industries by a wide margin, with 74% of people saying they trust tech “to do what is right.” Online news repositories like Facebook’s news feeds are trusted even more than newspapers.
3) The search bar. We don’t think of Facebook as a search engine, but it does have a search bar at the top of the page. Although people mainly use it to find other Facebook members, recently Facebook’s news feed has included a Facebook-sponsored video encouraging users to search for “Election 2016” in the search bar. When you do so, the news feed populates with election-related material, the order of which is entirely under Facebook’s control (see above). This is creepy.
Both the news and trending boxes can shift votes and opinions, but we don’t yet know by how much. What we do know is that the other two methods Facebook can use to shift votes will likely have a much larger effect on voters than the news and trending boxes, and these other methods are stealthy as hell. These supercharged methods are:
4) Selective “go out and vote” reminders. In 2012, researchers at the University of California, San Diego, working with Facebook personnel, published a study showing that when Facebook sent “go out and vote” reminders to 60 million of its users on Election Day in 2010, the reminders caused about 340,000 more people to vote that day than otherwise would have. In 2014, Harvard law professor Jonathan Zittrain published an article in the New Republic pointing out that if Facebook chose to send those reminders only to people who favored one candidate or party, that could easily flip an election—the equivalent, said Zittrain, of “digital gerrymandering.” If Facebook chose to do this in November, how would anyone know? This kind of hypothetical shenanigan remains completely legal—not, as far as I can tell, because anyone thinks it should be legal, but simply because laws and regulations haven’t caught up with technology.
5) Selective voter registration reminders. In recent months, primary by primary, Facebook has been pinning “register to vote” reminders to the top of its center news feed. The registration reminder in the figure (above) is the one Facebook had been running in California in advance of the June 7th primary there. What if Facebook is sending those reminders only to people who favor one particular candidate or party? About a third of America’s 218 million eligible voters are currently not registered to vote; that’s roughly 72 million people, about 72% of whom (some 52 million) might be Facebook users. What percentage of those 52 million people might Facebook be able to prod to register to vote over a period of months before an election? That’s one of the things my colleagues and I are now in the process of quantifying.
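For anyone who wants to double-check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. It simply restates the estimates above (218 million eligible voters, about a third unregistered, 72% of them plausibly on Facebook); the variable names are mine, and nothing here is measured data.

```python
# Back-of-the-envelope arithmetic for the registration figures quoted above.
# Inputs are the estimates from the text; nothing here is measured data.
eligible_voters = 218_000_000                    # eligible US voters
unregistered = eligible_voters / 3               # about a third are not registered (~72.7 million)
unregistered_on_facebook = unregistered * 0.72   # ~52.3 million, rounded to 52 million in the text

print(f"Unregistered eligible voters: {unregistered / 1e6:.1f} million")
print(f"Of those, plausibly on Facebook: {unregistered_on_facebook / 1e6:.1f} million")
```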
If Facebook is sending out registration reminders selectively right now, how would we know? What’s more, since all five of these manipulations are ephemeral—flashing before someone’s eyes, having an impact, and then disappearing forever—they leave no paper trail. All five methods are both legal and invisible, so why would Facebook executives not use them? In fact, if one presidential candidate is better for the company than the other, don’t Facebook executives have a fiduciary responsibility to their shareholders to use them?
It’s little more than an educated guess at this point, but based on our research on the Search Engine Manipulation Effect (SEME), I believe the Facebook.com platform—without the help of Instagram and the other platforms Facebook controls—could shift at least 700,000 votes in the November 2016 election—possibly far more. If, as expected, 140 million people vote and the election is very close (so, say, roughly 70 million people vote for each candidate), Facebook could theoretically shift 0.5% of the voters, creating a win margin of 1%, or 1.4 million votes (two times 700,000), for the candidate it prefers.
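To make that margin arithmetic concrete, here is the same calculation as a short Python sketch. The 140 million turnout, the even split, and the hypothetical 700,000-vote shift are the assumptions stated above; everything else follows from them.

```python
# Illustration of the vote-shift arithmetic above: moving 700,000 voters
# from one candidate to the other in a 140-million-vote election.
total_voters = 140_000_000
shifted = 700_000                              # hypothetical number of voters shifted

candidate_a = total_voters / 2 + shifted       # 70.7 million votes
candidate_b = total_voters / 2 - shifted       # 69.3 million votes

margin_votes = candidate_a - candidate_b       # 1.4 million (two times 700,000)
margin_share = margin_votes / total_voters     # 1% of the total vote
shift_share = shifted / total_voters           # 0.5% of the total vote

print(f"Shifting {shift_share:.1%} of voters yields a "
      f"{margin_votes / 1e6:.1f} million-vote win margin ({margin_share:.0%}).")
```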
All of these manipulations work best if one can distinguish the Trump fans from the Clinton fans from the “swing” voters. Can Facebook do that? You bet. Facebook knows exactly who is supporting whom because of the information people post on their Facebook pages, the messages they send, the trending and news stories they click—and, thanks to recent business acquisitions, the messages people send using WhatsApp and the memes they like on Instagram. Facebook, following in Google’s footsteps, makes almost all of its money from targeted advertising, and, yes, you can even purchase targeted ads on Facebook that will reach particular political groups. (That option doesn’t bother me, by the way, because it’s competitive. Competitive activities are self-correcting, but there is no corrective for Facebook’s own manipulations.)
None of this would mean much, of course, if we had reason to believe that Facebook had no favorites in the upcoming presidential race, but we know just the opposite. Zuckerberg is on record making negative statements about the presumptive Republican nominee, Donald Trump, and newly released donation data show a dramatic preference by Facebook employees for the presumptive Democratic nominee, Hillary Clinton.
But, wait, didn’t we all just read articles reporting that Mark Zuckerberg had pledged to make Facebook content less biased and more impartial? Sure, but two weeks later, he also pledged to EU officials that they could use Facebook for propaganda purposes, supposedly to fight terrorism. This is what happens when powerful new industries are entirely unregulated; they do whatever the hell they want.
I think Secretary Clinton would be an exceptional president, but I also believe a free and fair election is the cornerstone of democracy. When major corporations are able to shift large numbers of votes invisibly, democracy becomes little more than a joke.