A suicide streamed live exposes Facebook’s thin line between violence and public service

Facebook’s definition of violence doesn’t include a live-streamed suicide—but should it?
Image: AP Photo/Czarek Sokolowski

A 12-year-old streamed her suicide live on social media, and for nearly two weeks, the video was easy to find on Facebook.

On Dec. 30 in Polk County, Georgia, Katelyn Nicole Davis posted a 43-minute livestream to a social media platform. In the video, Davis can be seen setting up a noose on a large tree. She later climbs onto a foothold in the tree, apologizes to the camera, and steps off. In the last 15 minutes of the recording, as the sky darkens and Davis’ body hangs in the foreground, her phone rings loudly, and people can be heard calling her name in the distance.

Davis’ family had the original video removed, but within hours copies had already gone up on other sites, including YouTube and Facebook. YouTube removed the video earlier this week, saying it violated the site’s graphic-content policy. But the full clip remained available on Facebook until the afternoon of Jan. 12. It is still viewable on other blogs.

“We are making a specific request that anyone who has any knowledge, videos, or comments regarding this case, please keep this information off of the internet,” the Polk County police department said on Jan. 9. Authorities said they had received messages and calls from people outraged that more couldn’t be done to take the video offline, and requested that people refrain from sharing it “out of respect for the family of the departed and for the deceased themselves.”

This afternoon, Facebook started removing copies of the video from pages where it had been posted (including the version Quartz was able to access earlier today). The company said it removed the video for violating its community guidelines, which bar the “promotion of self-injury and suicide.” Those guidelines do not prevent people from sharing information about self-injury and suicide, as long as the information doesn’t promote those things. Facebook did not respond to questions about what made Davis’ video fall on one side of that line versus the other.

The decision to pull down a video isn’t one Facebook takes lightly. The social network’s policy is to not remove any user content, as long as the value of public discourse outweighs the discomfort caused by said content. And while Facebook explicitly forbids content that celebrates or glorifies violence, a succession of graphic images, videos, and live videos posted to the site have exposed the ambiguity of those rules. In 2014, for example, Facebook refused to take down a US marine’s self-harm images, despite pleas from his family, citing the distinction in its community guidelines between a person sharing their own self-harm as opposed to someone sharing another person’s self-harm. After public outcry, the site later relented.

Facebook has argued that leaving up disturbing content that doesn’t fall afoul of its guidelines could help raise awareness about important issues. But experts say the social network’s first priority should be the health and well-being of grieving families and directly affected communities, all of whom can experience secondary trauma as long as graphic videos remain live.

“Once that video goes viral, we move into a different space, a space of trauma and grief that becomes inescapable,” says Desmond Upton Patton, assistant professor of social work at Columbia University. Patton studies how young people of color navigate violence in their communities and on social media platforms.

Facebook does offer a set of suicide-prevention tools on its site, but Jeremy J. Littau, an assistant professor of journalism at Lehigh University, says the social network should do more to safeguard users. That could include giving them the option to automatically hide anything the site labels graphic. Littau is also skeptical of Facebook’s public-good argument.

“Even if you could argue that this spreads awareness about suicide or might serve the greater good, [Facebook] can’t expect the public to be able to figure out what that greater good is,” he told Quartz in an email. “What happens if a bunch of people decide to troll the video by hitting the Like or Heart button? Whatever public service is done by leaving the video up, surely that can be easily undone (and perhaps move into the realm of harm to others struggling with the issue) by watching a bunch of people approve of a person’s suicide by social clicking.”

Ironically, Facebook has also come under fire for censoring content. Last year, its algorithm blocked a breast cancer awareness campaign and the iconic Vietnam War “Napalm Girl” image, among other things. For some, that side of the censorship slope is far more slippery than the one that kept Davis’ video up for 13 days.

“In the US, one of our core political values is to go ahead with contentious or upsetting speech, and allow the subsequent criticism to happen,” says Clay Shirky, a New York University professor who studies the internet and social media. “Facebook, as a private entity, is under no obligation to uphold this norm, but we should all hope they do, because a Facebook that over-censors is far more threatening than one that under-censors.”