Most people have absolutely no idea what it takes to build a safe airplane—and they’ll never need to. For them, it might be easy to look at the crisis that Boeing has weathered in the wake of two 737 Max crashes and assume the lessons to be learned are strictly about software, or aerodynamics, or engineering. In other words: not their problem.
But that would be a mistake. In just a few months, Boeing has gone from being one of America’s most admired companies to one fighting to regain its reputation in the eyes of both its airline customers and passengers at large. And there are numerous lessons that Boeing’s public reckoning can teach us—whether we work alone, at a startup, or inside a large company.
1. It’s not enough to have an “andon cord”—your employees have to feel empowered to use it.
The andon cord is a manufacturing practice popularized by Toyota: any employee at any stage of the production line can “pull the cord” (thereby halting production) if she fears that safety or quality standards are in question or need to be investigated.
Reporting in the wake of the Max crisis has suggested that Boeing’s internal culture created issues not just with the 737 Max but with other aircraft, such as the 787 Dreamliner—both situations where that proverbial cord might have been pulled, but apparently was not.
Having team members who feel empowered to pull that cord doesn’t just happen, though. Amy Edmondson, a Harvard Business School professor of leadership and management and author of the 2018 book The Fearless Organization, says it requires that employees experience “psychological safety” at work. She defines this as a workplace where employees feel able to speak up and flag problems in an organization without fear of retribution (such as getting fired).
“In so many organizations, people are talking to each other in the hallway, behind closed doors, but they don’t send it up, they don’t pull the cord. Even when it’d be a huge benefit for management or leadership to hear what those people are talking about,” Edmondson said. “Leadership is busy with so many things and they depend on and sometimes even assume that others will have something to say.”
Creating this safety, Edmondson recently wrote, comes from the process of “institutionalizing the behavior of speaking up,” and making sure employees know their feedback (however inconvenient to the company’s balance sheet it may be) will be taken into account.
2. To regain trust, take responsibility
In its initial responses after the Lion Air crash, Boeing did not explicitly mention MCAS (the Maneuvering Characteristics Augmentation System)—the Max software system that was the common link in both crashes, and that most pilots were unaware of. Since then, it has walked a fine line optics-wise: Its CEO claims there is “no technical gap or slip,” yet at the same time the company is developing a technical fix to MCAS. Boeing told Quartz the company is taking a “comprehensive, disciplined” approach to the software update and is offering simulator sessions to airline partners ahead of its release.
But one thing Boeing hasn’t done is unequivocally say its design was at fault. “The more contradictory they are, the less confidence I have in anything Boeing has to say,” said Eric J. McNulty, associate director of Harvard’s National Preparedness Leadership Initiative and a co-author of You’re It: Crisis, Change, and How to Lead When It Matters Most. “And I’m sure that’s true of pilots, I’m sure it’s true of passengers, I’m sure it’s true of investors.” Indeed, some shareholders have questioned Boeing’s stance that a “series of events” led to the crash, rather than a single fault connected to the design of MCAS.
After the second crash, in Ethiopia, when aviation regulators around the world were grounding the plane, Boeing did not get out in front of the criticism by grounding the plane itself. Michael Gordon, CEO of Group Gordon, a corporate communications firm based in New York that specializes in crisis PR, says this is a textbook mistake.
“After lives were lost in a potentially faulty aircraft, the right response would have been caution about passenger safety above all. But, instead of grounding the 737 Max proactively, Boeing called the president to keep its planes flying,” Gordon said. “Moreover, the initial silence from Boeing and its CEO put the company on the defensive when media reports dug into what Boeing knew about the plane’s issues.”
Ultimately, Gordon says, crisis PR involves not just getting out in front of an issue, but taking full accountability. “While blame may lie across many levels of the company, accountability needs to come from the top.”
3. Problem-solve with the end user in mind
Two weeks after the first crash, the existence of MCAS was introduced into the public conversation not by Boeing, not by the airlines, not by the FAA, but by pilots. Time and again, commercial pilots have emphasized one thing in the wake of the Lion Air crash: that once everyone else’s job is done and the plane is in service, pilots are the last stop for keeping the passengers on board safe. And they take that job very seriously.
But pilots can’t do that job without full information. MCAS’s design relied heavily on pilots taking a specific sequence of actions if the system was triggered erroneously. Those actions, pilots have said, diverged considerably from the “muscle memory” protocols of the earlier 737 model that they might naturally have turned to. Boeing and the FAA both told Quartz that pilots should have been aware of how to override MCAS—using another existing protocol—even if they weren’t aware of the system itself.
Captain Dennis Tajer, a spokesman for the American Airlines pilots’ union, says he and other pilots were baffled by that stance.
“The go-to if everything fails is that the pilot would take over,” Tajer told Quartz soon after the Ethiopian crash. “Well, our response to them was, ‘How could we take over if we didn’t know the system even existed?'”
Of course, pilots were involved in the design and testing of the Max. But how many of them were the pilots who would be up in the air, heading to a destination with close to 200 people sitting behind them? In the age of automation, when software like MCAS will only become more prominent, figure out your organization’s equivalent of the commercial airline pilot—in essence, the end user. Make sure you’re not making assumptions about how they might act in a given situation. Better yet, ask them whether what you’re planning could work better.
4. To avoid catastrophe, assume the worst
The notions that the anti-stall system could work fine on a single sensor, that a malfunction wouldn’t be catastrophic, and that pilots would act a certain way in an emergency were all assumptions that have now turned out to be wrong.
Boeing might have avoided this by assuming the worst: the sensor will fail, MCAS will activate erroneously, pilots won’t immediately know what to do. Equally, in its response to the crashes, it might benefit from eliminating such blue-sky thinking. After all, the relentless focus on a software fix that can get the planes back in the air assumes that once the fix is approved and planes are flying, passengers will be keen to board them and everything will be fine—versus imagining less rosy scenarios and working to prevent them.
“They would be much better off if they operated with the assumption that Boeing could fail or might become a smaller company, or people might not be willing to fly on the 737 Max,” said Leonard J. Marcus, co-director of the National Preparedness Leadership Initiative and co-author of You’re It. “If that were their assumption they would make a very different set of decisions, and come to a very different set of conclusions. Trying to hold on to a belief that might not turn out to be true only exacerbates what might be an avoidable crisis.”
What’s required is an act of imagination about two possibilities—one, that Boeing will get through this okay, and the other, that Boeing won’t. These competing acts of imagination are valuable to any organization.
5. Crisis doesn’t have to be the only thing that leads to organizational change
It’s worth pondering why it took such a massive catastrophe—346 lives in all—to raise the question of whether Boeing’s internal culture needed reviewing. But therein lies a lesson: Assuming what happened to Boeing can’t happen to you or your organization, however successful it is, puts you at risk.
If you’d like to change a company’s culture for the better and avert a crisis, you have to be proactive. Edmondson says that in addition to creating the psychological safety described in lesson one, employees need to feel that the challenges they encounter at work aren’t theirs to solve alone.
“Problems are a team sport. Most problems require more than one head to solve them—either because different expertise is needed or ingenuity is needed which kind of requires us to brainstorm,” Edmondson said. “When you sort of have the message—implicit or explicit—that you’re supposed to just figure it out on your own it’s a very dangerous situation.”
Ultimately, Edmondson says, that this could happen to one of America’s most admired companies is a lesson for all of us. “What people and organizations of all sizes should suddenly stop and reflect on is, ‘Is this happening here?'”