The giant American tech companies currently dominating social networking and internet searches will soon have to think critically, if they aren't already, about their role in the media landscape. Equally important, the size and influence of these companies mean they must also come to terms with how that influence can be balanced, through transparency and accountability.
Facebook, however, has yet to acknowledge that any such accountability is needed, especially when it comes to the company's relationship with journalism.
Andy Mitchell, Facebook’s director of news and global media partnerships, was a featured speaker at the International Journalism Festival in Perugia earlier in April. Each month, 1.4 billion people use Facebook. In the US, 30% of adults get their news via Facebook (27% in the UK), and 88% of US millennials do so (71% in Italy). That makes Mitchell one of the most powerful directors of news distribution on the planet, if not the most powerful.
Mitchell’s talk was generally straightforward, with the exception of one glaring and important omission.
Facebook, according to Mitchell, wants to improve the “experience” (this word cropped up a lot) of people getting their news on mobile. Links to clunky news sites load slowly, and the social media behemoth is talking to major sites (such as The New York Times and Buzzfeed) about embedding their journalism directly. Nor did he deny that most of the available statistics underline just how much people like getting their news on Facebook.
This was all fascinating, but there wasn’t any mention of how Facebook sees and handles its role as a news gatekeeper, influencing both the detail and flow of what people watch and read.
The issue didn’t come up until the very end of Mitchell’s session, when a Scandinavian audience member asked him about Facebook effectively censoring news material linked from his organization’s site. A similar query followed from an Italian student in attendance. Mitchell batted both questions away without addressing either directly.
I asked Mitchell whether he thought Facebook was in any way accountable to its community for the integrity of its news feed. Mitchell, by now appearing frustrated, repeated that Facebook wanted people to have a “great experience,” that the feed gives them “what they’re interested in,” and that Facebook’s feed should be “complementary” to other news sources. In short, he didn’t come close to answering the question.
To continue to evade these topics is condescending, to say the least. Facebook is not—and knows quite well that it is not—a neutral machine passing on news. Its algorithm chooses what people see, it has “community standards” that material must meet, and it has to operate within the laws of many countries.
Shaping the news
To imply, as Mitchell did, that Facebook doesn’t have journalistic responsibilities is false. And, at least in the long run, it won’t work. Facebook is a private company that has grown, and made billions, by very successfully keeping more people on its site for longer and longer periods of time.
I can imagine that the idea of extra responsibilities, ones that might distract from that mission, must seem like quite a nuisance.
Indeed, Google once took a similar line. Its executives would sit in newspaper offices and claim, with perfectly straight faces, that Google was not a media company. As this stance gradually became more and more absurd, Google grew up and began to discuss its own power in the media.
Each day in Perugia was a reminder that Facebook is making news decisions (usually via its algorithms) every hour. Rasmus Kleis Nielsen, for example, mentioned in a presentation a dispute over material that temporarily blocked news stories from Berlingske, a Danish media company, from showing up on Facebook (at issue was a picture of some hippies in the 1960s frolicking nude in the sea). An editor for the Turkish daily Milliyet similarly reminded me that Facebook has strict rules about how Kurdish flags are displayed on its feed in Turkey.
My blog post about the talk with Andy Mitchell resulted in a lot of feedback from readers—clearly I had touched some kind of nerve. Jay Rosen of New York University, who first wrote powerfully on this subject in 2014, also commented on the topic in a long post to his Facebook page on Apr. 21. Rosen credited Facebook with caring about news but asked it to stop pretending that its handling of news content is of no interest to anyone but itself. He wrote:
Facebook has to start recognizing that our questions are real. We are not suggesting that it “edits” the news feed in the same way that a newspaper editor once edited the front page. We are also not suggesting that algorithms work in the same way that journalism once operated—with elites deciding what is and is not news. It’s a different way. And that’s why we’re asking about it.
Sooner or later, Facebook’s state of denial will have to end.