This week alone, police killings of two black men—Alton Sterling in Louisiana and Philando Castile in Minnesota—went viral on social media. The latter was first documented on Facebook Live.
Afterward, Facebook CEO Mark Zuckerberg posted a message to the platform, discussing the importance of such content, graphic as it may be, in furthering public discourse about social injustices.
Diamond Reynolds’ livestream of her boyfriend Castile’s death has reached over 5 million people. “I didn’t do it for pity, I didn’t do it for fame. I did it so that the world knows that the police are not here to protect and serve us. They are here to assassinate us. They are here to kill us. Because we are black,” Reynolds said, according to a report in the New York Times.
Motherboard neatly explains Facebook’s dilemma:
Facebook has become the self-appointed gatekeeper for what is acceptable content to show the public, which is an incredibly important and powerful position to be in. … [B]ecause the public relies on the website so much, Facebook’s rules and judgments have an outsized impact on public debate.
The Menlo Park, California-based company pegged a disclaimer to Reynolds’ video: “Warning—Graphic Video. Videos that contain graphic content can shock, offend and upset. Are you sure you want to see this?”
Some critics worry that a warning isn’t a strong enough solution for a viral video with highly disturbing content.
“Once that video goes viral, we move into a different space, a space of trauma and grief that becomes inescapable,” Desmond Upton Patton, assistant professor of social work at Columbia University, said in an interview. Patton, whose work focuses on how young people of color navigate violence in their communities and on social media platforms, said companies like Facebook should begin to think critically about the impact of secondary trauma from violent content on the health and overall well-being of their users, particularly the grieving families and communities that are directly affected.
Facebook did not respond to requests for comment. After the death of a Chicago man was broadcast on Facebook Live in June, the company said that it was expanding the team that reviews live content around the clock.
“Facebook has to worry about normalizing this type of content, turning it into spectacle or voyeurism, as the result of people being desensitized after viewing a video several times or viewing several of these types of videos over the course of a month or year, rather than making the viewing experience focused on the moral weight of what the video is showing,” said Jeremy J. Littau, an assistant professor of journalism and communication at Lehigh University.
Littau said Facebook has an obligation to users who want to opt out, and that if live-streaming police shootings becomes a growing trend, Facebook may need to reconsider its autoplay feature or be clear that disabling it is an option.
Littau also questions the rules that the social media site has in place when Facebook Live is used to document a shooting, not from the perspective of the victim, but from that of the perpetrator. In August 2015, the gunman responsible for killing two TV anchors posted first-person videos on Twitter and Facebook. Both sites deleted the shooter’s page, but that action came too late—copies of the video had already made it onto the internet. Facebook has long deleted violent video from terrorist groups such as ISIL. But it doesn’t have a consistent policy for troubling video that surfaces randomly, Littau said.
“There should be some level of equity in what’s taken down and what’s considered threatening and problematic for users,” Patton said. “Perhaps a policy that reflects high risk, medium risk and low risk for social media users would be helpful.” He suggests that collaborating with researchers, social workers, mental health practitioners and community experts could be a step toward appropriately identifying, and deciding how to handle, disturbing content. Facebook could even consider adding time limits on content that is violent yet raises awareness.
“The graphic content highlights the severity of the situation and documents precisely what happens. We need this so that justice can be served…but we don’t need access to it forever,” Patton said.
An earlier version of this post incorrectly indicated that Alton Sterling’s death was streamed on Facebook Live. It was recorded on a smartphone and shared on social networks.