Facebook is struggling to defend the integrity of its influential “trending” topics—and new documents leaked to the Guardian could make that a lot tougher.
Earlier this week, a report in Gizmodo alleged that the company had systematically suppressed conservative stories and had allowed human news curators to “inject” topics into the feed that weren’t trending organically.
The allegations touched a nerve. Facebook has an outsized influence on what stories people see and read each day. The site draws more than 1 billion daily users and is by far the No. 1 source of readers for media outlets. Facebook is also paying prominent news organizations such as the New York Times to publish to its live video player. Shortly after Gizmodo’s report broke, Senate Republicans launched an inquiry, demanding Facebook respond to questions on its news practices by May 24. “Facebook trending” began trending on Facebook.
The company, meanwhile, issued a denial. “Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we’ve designed our tools to make that technically not feasible,” Tom Stocky, Facebook’s head of trending topics, wrote in a Facebook post. And on the alleged “injection” practices: “We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so.”
Leaked documents published by the Guardian on Thursday (May 12) indicate otherwise. The 21-page guide describes in detail Facebook’s guidelines for identifying, vetting, and labeling content in its trending section. It lays out when workers may “blacklist” subjects (“duplicate topic,” “doesn’t represent real-world event”) and the step-by-step process through which topics are approved. It also provides insight into the value judgments human curators make daily when they assign an “importance level” to items in the queue.
According to the documents, Facebook assigns each trending topic one of four importance levels: “Normal” (“the default importance”), “National Story” (“if it is among the 1-3 top stories of the day”), “Major Story” (“if it is THE top story of the day”), and “Nuclear” (“Reserved for the truly ‘Holy S**t’ stories that happen maybe 1-3 times a year”). Within the national category, Facebook relies heavily on the editorial judgments of just 10 news sites: BBC News, CNN, Fox News, the Guardian, NBC News, the New York Times, USA Today, the Wall Street Journal, the Washington Post, and Yahoo or Yahoo News.
In another section, “injecting topics,” Facebook gives two scenarios in which this is acceptable:
The editorial team CAN inject a topic to replace another topic(s) already appearing in the review tool (in the same scope) to consolidate a story/clean-up appearances.
The editorial team CAN inject a newsworthy topic that is not appearing in the review tool but is appearing in the demo tool (in the corresponding scope).
The trending “demo tool” finds stories based on Facebook conversations, a company spokesman told Quartz.
Facebook’s spokesman said in an email that the document published by the Guardian “appears to be an older version” of the guidelines. The company also passed along a statement from vice president of global operations Justin Osofsky reiterating that Facebook “does not allow or advise reviewers to systematically discriminate against sources of any political origin, period.” Facebook has “at no time sought to weight any one view point over another, and in fact our guidelines are designed with the intent to make sure we do not do so,” Osofsky says.
The leaked documents underscore that any story-picking process comes with limitations, whether stories are chosen by algorithms written by humans, by humans executing algorithmic steps, or by a team deciding which topics merit “nuclear” status. Any time humans are involved, biases are inevitable.
That’s not necessarily a bad thing—the Guardian reports that Facebook “backed away from a pure-algorithm approach” in 2014 after it was criticized for not featuring enough coverage of political unrest in Ferguson, Missouri. And who knows what a purely algorithmic trending section might look like? As Ben Thompson points out at Stratechery, Facebook’s news feed algorithm “is arguably doing more damage to our politics than the most biased human editor ever could.”
What matters more is how Facebook is allowing these trending topics to be perceived. As we wrote in Quartz earlier this week, readers accept that stories published by traditional media outlets will contain inherent editorial biases, but they’ve been told to expect otherwise from Facebook. Trending topics are supposed to be a uniquely legitimate snapshot of what people at any given moment actually want to read—an algorithmic reflection of our collective consciousness. Any human involvement—regardless of its purpose—diminishes that.