Donald Trump’s campaign spent nearly as much money on Facebook ($56 million) as on TV ($68 million) to defeat Hillary Clinton in the 2016 presidential election.
“I wouldn’t have come aboard, even for Trump, if I hadn’t known they were building this massive Facebook and data engine,” Trump campaign CEO Stephen Bannon, who also heads the right-wing media site Breitbart, told Bloomberg before the election, explaining the campaign’s plans to drive down voter participation with anti-Clinton memes. “Facebook is what propelled Breitbart to a massive audience. We know its power.”
Trump won the election in a squeaker. Yet, if you ask Facebook CEO Mark Zuckerberg about his platform’s power, well, he’s not so familiar with it—particularly when it comes to accusations about fake news stories posted on Facebook.
“Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way—I think is a pretty crazy idea,” Zuckerberg said two days after the election. “Voters make decisions based on their lived experience.”
Does the CEO of this social media platform know how much his users’ lived experience is affected by Facebook? He may not. (The average Facebook user spends 50 minutes a day on Facebook’s platforms, according to internal company data.) The effects go beyond fake news appearing in one’s newsfeed; many also question whether the company’s affinity-matching services are siloing American users too much into like-minded networks.
Zuckerberg’s strategy is to avoid categorizing Facebook as a media company. Instead, he insists his company is a tech enterprise, which means he doesn’t need to be responsible for the content on his platform. As a media company, Facebook would need to develop ways to actively police that content, or risk libel lawsuits, free-speech claims, and demands for fairness. Zuckerberg seems to be in denial here: his company is a primary distribution platform for digital media, and it makes its money through advertising.
Zuckerberg’s denial becomes a particular problem with the issue of fake news. Many online activists spread false news stories on Facebook. Even ostensibly objective digital news sites create inflammatory posts designed to spread on Facebook that may not be true. Zuckerberg claims that these news stories don’t reach people, writing in a Nov. 12 Facebook post that “of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes.”
We’ve contacted Facebook to ask how Zuckerberg came up with that figure, or whether the word “authentic” is doing a lot of work in place of terms like “accurate” or “true.” If fake news is shared by a user who believes it, is that authentic?
Moreover, by claiming that fake news on his platform doesn’t affect users, Zuckerberg is also undercutting the argument for the efficacy of ads on the platform, as Hillary Clinton’s digital director noted on Twitter.
What, after all, are advertisements but fake news? Yet making paid promotions—whether advertisements or fake news—more obvious to users may make it more difficult for Facebook to earn revenue from them. And attempting to police “fake news” on a user-driven website promises to be an exasperating, if not impossible, task.
It’s no simple dilemma for Zuckerberg, or for Facebook fans who want to use and improve the platform. But one thing is for sure: you can’t create a world-changing platform for social interaction, then claim it had little to do with a cataclysmic shift in society.