In a conversation with media executives following his upbeat speech at Facebook’s annual developer conference, Mark Zuckerberg acknowledged that a legal provision helped enable the rise of his company, but added that things are very different now than when that law was enacted 20 years ago.
Referring to Section 230 of the US Communications Decency Act, which shields tech companies from civil liability for content shared on their platforms, Zuckerberg said he “wouldn’t have been able to start Facebook” without it. That’s because Facebook lacked the resources to screen user-posted content for things such as libel, pornography, or hate speech, and instead came to rely on users to flag it after it was already on the site.
That “made a lot of sense in a world where it was not possible to understand the meaning of content at large scale,” said Zuckerberg, speaking on the record at the annual Off the Record conference of media executives organized by Jessica Lessin of the Information, Ben Smith of BuzzFeed, and Kevin Delaney of Quartz. But now, Zuckerberg believes, times have changed. Artificial intelligence and other technologies allow internet services to find harmful and offensive content automatically and remove it immediately.
The Communications Decency Act is now threatened by reform efforts. One of them, the controversial FOSTA, which holds websites accountable for user content that promotes or enables sex trafficking, was recently signed into law.
“I think we’re going to need to make sure that whatever the new framework is that the world moves to doesn’t stop the next companies from getting built,” Zuckerberg said in the meeting. “But we’re in a different place, right?” In any case, Facebook plans to have tens of thousands of workers screening for things such as hate speech, which AI still struggles to recognize.
Zuckerberg has been repeating, like a mantra in recent weeks, that Facebook has to take a “broader view of its responsibility.” He explained in the meeting with journalists that his company will have to become more accountable for how the tools it offers are being used. In the last decade, his company was governed in a “community-driven” way, where violations on the site had to be flagged by users, and the company would deal with them reactively. “I just think that that’s not going to fly going forward. I think we have a responsibility to do a lot more.”
One of the ways Facebook is trying to be more responsible is by protecting elections from manipulation by bad actors. The company now frequently touts its ad transparency effort, which requires all political and issue advertisers to verify their identities.
“A lot of the debate that we had internally was: we’re essentially going to be losing money on running political ads because we’re hiring so many people to be able to make sure that we’re not taking bad money, that the cost is going to be greater than the money that we make,” Zuckerberg said. He said Facebook decided to continue accepting political ads despite the likelihood it would lose money on them because “this is an important part of democracy.”
Zuckerberg said he was committed to supporting journalism and making sure people can get trustworthy news on the platform, and that the company aims to “build a sense of common ground and not polarize.”
He underlined that he particularly values the forms of journalism that are chronically underfunded: investigative, local, and international reporting.
“We view ourselves as having a responsibility to support the institution of journalism. We want to make sure that across all of the areas, including ones that have been traditionally less funded, like local news and some areas internationally, that we do our part to support funding of more of that,” he said.
He did say he didn’t plan to pay publishers a fee for accessing their content, modeled after the carriage fees cable providers pay to TV channels — an approach that News Corp’s Rupert Murdoch and other media executives have proposed. “I’m not sure that makes sense,” Zuckerberg said.
Zuckerberg also said that Facebook has already started implementing a new system that boosts or suppresses the appearance of links to specific publishers’ content on its site based on how trustworthy users rate those publishers to be.