Facebook has the disturbing power to rewrite our collective history

Nick Ut’s devastating photograph of 9-year-old Kim Phuc is an important part of our collective history.
Image: AP Photo/Nick Ut

In 1972, a harrowing photograph of a young girl screaming out in pain from a napalm burn was on the front page of newspapers around the world. This photo taken by Nick Ut is often credited with helping to make real for audiences the atrocities of the Vietnam War, contributing to a shift in public opinion of the long-running conflict. It is an essential part of our collective history and shared visual consciousness. And last week, Facebook tried to censor it.

Facebook initially defended its decision to take down the photo because its subject, a young girl, is naked—a violation of the company’s community standards. Facebook has since reversed its decision, acknowledging “the history and global importance of this image in documenting a particular moment in time.” But the larger issues raised by Facebook’s censorship of this historical image are far from resolved.

While most people have seen Ut’s photo, few know the young girl’s story. In one of the defining events of my life, I had the opportunity to meet and interview Kim Phuc, the subject of this iconic photograph.

In 1972, Kim was nine years old, living a peaceful life with her extended family in a small village in Vietnam. She rode her bike to school every day, and she had a lot of friends. More important, Kim said, “I felt safe and loved.”

“I was never afraid before the war. And then the fighting started,” she recalled. “I will never forget when the soldiers knocked on our door. I knew fear for the first time.”

That day was June 8, 1972. Kim and the other villagers were instructed to hide in the village temple, which was designated as a safe place. But the temple was not safe. When soldiers heard the planes coming, they told the children to run away.

Burning napalm filled the air and clung to Kim’s young body. Her clothes caught fire, so she started running and tearing her clothing from her burning flesh.

Napalm is a thick, jelly-like substance that burns at 2,200 degrees Fahrenheit. Kim describes the burn as “the most terrible pain you can imagine.”

As Kim ran down the road in terror and in pain, Ut captured the moment on film. The next day the photo of Kim appeared on front pages around the world.

Photographs have the ability to create an emotional impact on a viewer that words alone cannot. Reading that a young girl has been badly burned is quite different from seeing her horror-struck face and her naked and traumatized body. The photo should have been seen in 1972, and it should still be seen today—especially on a platform that has 1.13 billion daily active users. The Vietnam War is over, but children in war-torn countries around the world continue to suffer. The visceral impact of the image is as important as ever.

It is true that Facebook is a private company with a legal right to censor content. But as a global giant that claims in its mission statement “to give people the power to share and make the world more open and connected,” Facebook has an ethical responsibility to facilitate the free flow of information and ideas, especially news. Instead, Facebook is giving users a dangerously manipulated view of that world and contributing to the age of truthiness.

In January 2012, for example, Facebook intentionally tampered with the news feeds of almost 700,000 Facebook users. Facebook skewed news feeds to be either predominantly positive, happy news or predominantly negative, sad news in an attempt to manipulate the emotional reactions of users. Facebook then studied how these manipulated users responded to the experiment. But the manipulation of people’s emotions is incredibly unethical—especially in an age when mental health problems run rampant and are often ignored (or shunned) by society and the medical community.

Facebook has also altered the way in which we consume political news. Earlier this summer, the Wall Street Journal launched “Blue Feed, Red Feed,” a project that shows readers side-by-side accounts of Facebook feeds as seen by conservative and liberal users. The news that readers of different political persuasions receive is vastly different, creating an “echo chamber” in which we are exposed only to news that fits our political alignment. Rather than informing the electorate and contributing to our democracy, Facebook’s manipulation adds to political polarization.

Not only have we learned that Facebook can censor our history, we’ve also found that the company can spread false versions of it. Last week, its new algorithm-run “Trending Topics” section featured a tabloid story that suggested bombs were planted in the Twin Towers on 9/11.

What will Facebook algorithms (and employees) decide next? It is easy to imagine the social-media giant censoring graphic images such as that of the bloody, shocked five-year-old Syrian boy Omran Daqneesh in the back of an ambulance. But that photograph, like the one of Phuc, helped awaken us to the reality of 500,000 dead Syrians.

Similarly, what if Facebook had blocked the Facebook Live video of the police killing of Philando Castile? That video bore witness to a shocking event and contributed to our collective knowledge of, and conversations about, race relations and police brutality.

We must hold Facebook accountable, as Norway’s prime minister and largest newspaper successfully did with the Vietnam photo. If we are to say that corporate oil giants have an ethical obligation to help combat climate change, and that global clothing brands have an ethical obligation to ensure that their products aren’t made in sweatshops, a tech company whose product is the distribution of information has an obligation to recognize its responsibility to contribute to—and not manipulate—public discourse.

The situation also speaks to the growing need for media literacy. Algorithms often determine the content we see (and don’t see) on Facebook, and users need to be aware that they can’t necessarily trust the information presented to them. While we can glean valuable information and discourse from Facebook, skimming a news feed should not take the place of actively seeking out content generated by news organizations.

We must also educate ourselves about what is and what is not news. A screaming pundit pushing an agenda is not news. The piece you are currently reading is an op-ed; while rooted in fact, it is meant to persuade. We must teach our children to be media savvy, integrating media literacy in the K-12 curriculum and at the college level. This includes teaching them not to take photographs, videos and news articles at face value—to do their own research and learn about the stories that continue to unfold.

In Kim’s case, while the war forever altered her life and body, it did not destroy her spirit. She went on to become a global peace advocate. As a UNESCO Goodwill Ambassador, Kim travels the world promoting peace and understanding. She told me that she was “haunted” by the photo for many years, but learned, “if I couldn’t escape it, I could use it for peace.” The photo became a “powerful gift.”

She has also founded the Kim Foundation International, which helps to heal the physical and psychological wounds suffered by child victims of war. Kim travels to meet many of these innocent victims with the intention of restoring hope to their lives.

“Terrible things can happen,” she told me, “but if we are lucky, we can learn from our experiences and it can help us to become stronger.” But we can only learn from one another’s experiences if we have access to them. If our dominant communication channel guides us away from images, news and stories that could upset us or contradict our existing opinions, we risk forgetting our history and making the same grave mistakes again. And if we are exposed only to entertainment-driven and algorithm-dictated news, we risk the crumbling of our democracy.