Frances Haugen knew it was time to blow the whistle at Facebook

What does it really take to expose corporate wrongdoing? The Facebook whistleblower has some ideas.

“I wanted,” says Frances Haugen, “to be able to sleep at night.”

This has become Haugen’s emphatic sine qua non, her non-negotiable clause in every contract, whether business or personal. It’s a point she makes in magazine interviews and newspaper columns and on the radio; it’s one she gestures at in congressional testimonies and tech-forum talks. And it’s one she now details in a new memoir, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook.

A former product manager on Facebook’s civic integrity team, Haugen rocketed to global recognition by releasing a trove of damning documents—some 22,000 in total, published by The Wall Street Journal in a series known as The Facebook Files—that exposed how the social media company knew its algorithms rewarded extremism and chose to obscure that knowledge.

Under the influence of unfettered algorithms, she writes, Facebook and its properties had become a hotbed of misinformation, a “spark plug” for political outcry. It stirred and stoked user outrage; it devastated teen girls’ mental health; it amplified inflammatory content later linked to ethnic violence and religious riots. It allowed human traffickers and drug cartels and armed militias to organize on its platforms. And it batted away employees ringing alarm bells. “Facebook, just like the Big Tobacco companies before it, had known the toxic truth of its poison,” Haugen writes, “and still fed it to us.”

If Haugen’s Facebook files were a bracing exposé of the algorithms powering the globe’s largest social network, her new memoir is a series of softer intimations, revealing the person behind the documents. But the story presents more than a personal narrative: It’s also an outline for what it takes to intervene when you find wrongdoing in your work. And in Haugen’s words, anyone can be capable—more capable, even, than they believe—of choosing conviction over the company line.

Facebook, by way of debate team (and Harvard)

In person, Haugen is an effusive and overlapping talker, quick to cut into conversation with animated agreement. We’re sitting in the back of a video studio green room; Haugen’s just come off an on-camera interview about the need for social media scrutiny and the new nonprofit she’s established to do it. Earlier, I’d watched her field questions from a room of journalists, nodding energetically to probes about the dangers of Big Tech. Even in a windowless room, she’s bright and blazing with vigor.

“I shy away from the idea that I was exceptional,” she tells me when we sit down together. “I think it’s more that I had a series of life experiences where I knew that I could come back.” She folds one leg over another, her black Converse sneakers smudged at the toe.

In Haugen’s story, those experiences prepared her to blow the whistle on what she witnessed at Facebook. She details a bright youth in Iowa, marked by Montessori education and a place in the high school debate circuit. It’s there that Haugen encounters the first of those catalysts: the death of a close friend, Tina, whose spot she would step into on the team. The experience that followed gave her not just an ability to communicate complicated ideas, but a sense of duty to do right with what she’d learned. “I had taken my friend’s place in the world,” she writes, “and needed to live up to Tina’s gift.”

Fast-forward through her early twenties: Haugen graduates from a computer engineering program at Olin College, begins a career in Silicon Valley at Google Books, takes a leave of absence to pursue her MBA at Harvard. Flickers of the duty she felt as a teenager flash in her memoir’s recounting. She writes her undergraduate thesis on civil disobedience. She interrupts her career for Harvard to “understand and identify power and influence dynamics.”

By her late twenties, though, the sequence accelerates. Haugen’s marriage ends; she battles debilitating and mystifying health problems thanks to undiagnosed celiac disease; in the wake of medical leave, she’s pushed out of her role at Google. One day, she wakes up to find that her leg has turned purple. It’s a blood clot, which spirals into hospital stays that span nearly two months. She’s given a drug that devastates her blood platelets, and she nearly bleeds to death.

It’s that accumulation of hardship, Haugen now says, that prepared her to blow the whistle. People who want to expose wrongdoing, she says, often fear what they stand to lose: their careers, their savings, their relationships, their freedom. But in swift succession, Haugen had lost all of those things—and knew they were recoverable. “The cost was way cheaper for me, because I had come back from nothing before,” she says.

Zuckerberg’s all-powerful algorithm

It would be a few years before Haugen landed at Facebook, where she would get a look into the dark magic of its algorithm. When a recruiter calls in 2018, she picks up, telling them she’d consider a role working against misinformation. But when she arrives in 2019, the bottom falls out.

A core problem, as Haugen diagnoses it, is that Facebook is driven primarily by what can be measured in numbers—and doesn’t dedicate the resources needed to tackle the issues that metrics can’t catch. When she raises questions about the company’s lack of investment in dealing with misinformation it can’t fact-check, a mentor tells her, “You need a different problem. This is a bad problem—it isn’t measurable.”

She manages a staggering effort to reduce misinformation in nations where the company has no third-party fact-checkers. Her team’s focus later turns toward the 2020 US election, where a risk grid tracks whether Facebook, Instagram, WhatsApp, and the company’s other social media powerhouses are ready to handle compromises; nearly every square is coded red. She finds herself “agonizing” over whether her work is simply window dressing rather than dealing with the platform’s harms.

Beyond Facebook’s failings, Haugen says, what also motivated her to eventually come forward was the toll they took on her teammates. “When you’re in an environment where there are major deviations between the public line and the private line, people get traumatized,” she says. It’s known as moral injury—the psychological impact of having to betray our values and ethics, or watching them be betrayed without being able to stop them. Cognitive researchers link it with PTSD.

“You know, you’re sitting there working on genocide,” without the tools to intervene, Haugen says. “What does that do to a person?” She cites Facebook’s role in brutality against the Rohingya people in Myanmar, where reports found the platform failed to prevent calls to violence. Here were people committed to stopping harm on the platform, she says, traumatized by the weight of it.

Take a long look into the algorithm: the cascades of code that follow individual behaviors, the ways it sorts and shuffles those decisions, sprawling out into real opinions and emotions and choices. The weight of the work can’t be measured in megabytes. It’s human, tender and trembling.

Meta’s whistleblower

It wouldn’t be until the pandemic hit in March 2020—and Haugen locked down with her parents in Iowa—that she began more seriously considering bringing public attention to the failings of Meta’s platforms, including Facebook, and the company culture that shrouded them. (Only later that year, after she was back in Silicon Valley and Facebook allegedly dissolved her civic integrity team, would Haugen contact a whistleblowing nonprofit and begin collecting the files she would later share with The Wall Street Journal.)

Over family dinners in Iowa, Haugen would share her concerns about the projects she was working on, the ripple effects of misinformation she was seeing translate into violence across continents. In contrast to teammates who would dismiss them as beyond scope or out of jurisdiction, her parents helped her discern how deeply worrying they were.

“I was seeing these things that were so unsettling, [but] I didn’t suffer alone,” she told me. Her father, a doctor, pointed out how in his medical labs, even the lowest-level technician knew how to report malpractice. “He was like, ‘Frances, you are working on national security issues, and you don’t know who to call.’” Those validations, she says, became essential to her whistleblowing.

“When I later started working with my lawyers, they would give me this compliment over and over again,” Haugen tells me. The compliment was striking in its specificity: You’re so stable, they would say.

It wasn’t until later that Haugen would discover how stability was, oddly, atypical for a whistleblower. By the time most people exposed to high degrees of wrongdoing come forward, they’ve been psychologically destabilized by the alienation and stress that accompany secrecy.

“If you think you’re holding a secret that impacts other people’s lives, the most important thing you can do is find one person—it could be a family member, a therapist, a friend—and if you trust them enough…be honest with them,” she adds. “Then you’re not wrestling with that alone.”

Frances Haugen isn’t giving up on Facebook

In what may be her most startling revelation, Haugen says that after the thousands of documents and discouragements, the lawyers and journalists, the investigations, the revelations, and the risks she dealt with, she would return to work for Facebook again. She is, as she’s said before, persona non grata at Mark Zuckerberg’s Meta. So why would she go back to a company whose code of ethics had failed her own, a place she had to leave in order to try to change it?

“We don’t get to leave Facebook behind,” she says. The platform, she reminds me, is the internet for millions of people through its Free Basics program, which provides free access to specific platforms in nations where data is unaffordable. Facebook’s influence, she says, will persist there five years from now, and 10 years from now, and maybe beyond that, too.

She turns her gaze—one that’s looked at congressional questioners, and human rights litigators, and the darkest chambers of the algorithm—to meet mine and replies with an answer she’s had ready all along. “We don’t get to give up,” she says.