Don’t trust Facebook to give it to you straight on politics

Except maybe conservative news.

Facebook’s influential trending news feed is looking less and less organic.

A new report in Gizmodo alleges that Facebook workers charged with curating the company’s trending news topics “routinely suppressed news stories of interest to conservative readers.” Facebook buried stories that trended organically among users on Mitt Romney, Rand Paul, and the Conservative Political Action Conference, among others, according to Gizmodo’s unnamed source, a former journalist who used to work on trending news at Facebook.

In addition to suppressing conservative news, Facebook allegedly boosted topics that managers felt weren’t receiving adequate attention from users, a method known at the company as “injection.”

When users weren’t reading stories that management viewed as important, several former workers said, curators were told to put them in the trending news feed anyway. Several former curators described using something called an “injection tool” to push topics into the trending module that weren’t organically being shared or discussed enough to warrant inclusion—putting the headlines in front of thousands of readers rather than allowing stories to surface on their own. In some cases, after a topic was injected, it actually became the number one trending news topic on Facebook.

The report is the latest indication that, contrary to the image it has crafted, Facebook is less a neutral algorithmic arbiter of the news than a media platform with its own liberal-leaning biases.

Facebook didn’t respond to a request for comment.

Last week, Gizmodo reported on the small group of “news curators” Facebook employs to vet and edit its trending topics. That story noted that Facebook’s workers were “told to select articles from a list of preferred media outlets that included sites like the New York Times, Time, Variety,” but would “regularly avoid sites like World Star Hip Hop, The Blaze, and Breitbart.” It followed a report from April that Facebook employees had asked chief executive Mark Zuckerberg whether they should try to stop a Donald Trump presidency.

Facebook has worked hard to cultivate the impression that its trending queue is mostly algorithmic. On the help section of its website, Facebook explains that “[t]rending shows you topics that have recently become popular on Facebook … based on a number of factors including engagement, timeliness, Pages you’ve liked and your location.” An article published by Re/code last summer noted that while “an actual human being” approves trending topics and writes their brief descriptions, those people “don’t get to pick what Facebook adds to the trending section. That’s done automatically by the algorithm.”

The implications of Facebook’s pseudo-journalistic practices are profound. Facebook is far and away the No. 1 source of readers for news outlets. On mobile in particular, referrals from Facebook drive an outsized share of traffic.

The combination of Facebook’s vast reach—more than 1 billion people use it each day—and its insistence that trending topics are organic can make its news items seem to be legitimate reflections of what people want to read. We accept that the stories in traditional news publications are arranged and edited, but the topics that trend on Facebook are supposed to somehow be a reflection of our collective consciousness.

If trending topics include Black Lives Matter, or the Syrian refugee crisis, or SpaceX’s rocket landing, it’s supposed to be because some critical mass of the Facebook community cares, not because a group of editors or reporters decided they should. If conservative news is conspicuously absent, it’s supposed to be because a critical mass of Facebook users isn’t interested.

That Facebook is pulling the strings shouldn’t be surprising. But if the company is truly manipulating what people see and regard as “trending,” Facebook should be honest about that. What we see might not change, but how we think about it will.