Facebook is getting rid of the red flags it places on articles to signal that they are fake news. As it turns out, the company said in a blog post, putting red flags on fake news stories actually encourages people to click on them.
Instead, Facebook said, it will place links to “related” articles under contentious posts, pointing readers to other, more trustworthy news sources reporting on the same topic. Last year, Facebook contracted with a number of fact-checking organizations in an effort to reduce the amount of fake news circulating on its site. Those groups have been flagging fake news on Facebook with, well, red flags.
“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs—the opposite effect to what we intended,” writes product manager Tessa Lyons in a post.
Facebook has been engaged in a year-long effort to rid the site of content that looks like news but is posted by organizations trying to generate ad revenue or sway political outcomes with false information. The company is also under fire for selling ads to Russian organizations that posted untrue information in an effort to influence the 2016 US presidential election. It hired Snopes, PolitiFact, and the Associated Press, among others, to flag fake stories; those organizations will continue reviewing posts.
Facebook designers wrote in a post on Medium that it was hard for users to find out why something was disputed, since the explanation took several clicks to reach. The process of flagging content was also cumbersome and slow, because it required two external fact-checkers to review each post. Additionally, they wrote, the designation offered no nuance: it applied only to posts rated outright false, not, for instance, to those that were partially false or unproven.
The “related articles” effort, which Facebook started testing earlier this year, will tell you in a nice, non-judgmental way that there is more to the story. Posts with contentious content will include context: articles on the same topic from other sources, including from the fact-checkers. If you try to share such a post, a message will pop up suggesting that you check out the other sources before you do.
Facebook has been persistent in its fight against misinformation. Aside from showing related articles and putting red flags on stories, the company has tried prioritizing user comments that call news posts “fake.” That effort resulted in legitimate news being labeled as untrue.