How normalcy bias led Boeing to crash into disaster

Following two deadly crashes that killed 346 people and the subsequent grounding of its 737 Max airplane, Boeing lost $5 billion in direct revenue by the summer of 2019. The overall losses - ranging from brand damage to lost customers - had already been valued by investors at over $25 billion by March 2019. In late 2019, new revelations about problems with the 737 Max deepened those losses, and in late December Boeing fired its CEO, Dennis Muilenburg, over the 737 Max fiasco.

What led to this disaster for Boeing? On the surface, it came from Boeing’s efforts to compete effectively with Airbus’s newer, more fuel-efficient airplane, the A320neo. To do so, Boeing rushed the 737 Max into production and misled the Federal Aviation Administration (FAA) to win rapid approval for the 737 Max. In the process, Boeing failed to install safety systems that its engineers pushed for and did not address known software flaws in the 737 Max - glitches that resulted in the eventual crashes.

The new normal

However, these surface-level issues had a deeper cause. Ironically, the transformation of the airline industry in recent decades to make airplanes much safer and accidents incredibly rare is key to understanding Boeing’s disaster.

Boeing’s leadership suffered from what cognitive neuroscientists and behavioral economists know as the normalcy bias. This dangerous judgment error causes our brains to assume things will keep going as they have been - normally. As a result, we drastically underestimate both the likelihood of a disaster occurring and the impact if it does.

Boeing’s 737 Max disaster is a classic case of the normalcy bias. Boeing’s leadership felt utter confidence in the safety record of the airplanes the company had produced over the previous couple of decades - deservedly so, according to crash statistics. From that perspective, it was hard to imagine the 737 Max being any less safe than those recent-model airplanes. The leadership saw the FAA certification process as just another bureaucratic hassle that got in the way of doing business and competing with Airbus, rather than as a safeguard for safety.

Think it’s only big companies? Think again.

The normalcy bias is a big reason for bubbles: in stocks, housing prices, loans, and other areas. It’s as though we’re incapable of remembering the previous bubble, even if it occurred only a few years ago.

Similarly, the normalcy bias helps explain why leaders at companies of all sizes were so vastly underprepared for COVID-19 and its impact. While pandemics pose a major threat, they are low-likelihood, high-impact, slow-moving disasters, and the normalcy bias keeps tripping us up on exactly that kind of event.

Normalcy bias in a tech startup

Of course, the normalcy bias hits mid-size and small companies hard as well.

At one of my frequent training sessions for small and mid-size company executives, Brodie, a tech entrepreneur, shared the story of a startup he had founded with a good friend. They complemented each other well: Brodie had strong technical skills, and his friend brought strong marketing and sales capabilities.

Things went great for the first two and a half years, with a growing client list - until his friend got into a bad motorcycle accident that left him unable to talk. Brodie had to deal not only with the emotional trauma but also with covering his co-founder’s work roles.

Unfortunately, his co-founder had not kept good notes, nor had he introduced Brodie to his contacts at the client companies. Brodie - a strong introvert - struggled with selling. Eventually, the startup burned through its cash and had to close its doors.

The normalcy bias is one of many dangerous judgment errors, mental blind spots resulting from how our brains are wired. Researchers in cognitive neuroscience and behavioral economics call them cognitive biases.

Fortunately, recent research in these fields shows pragmatic strategies you can use to address these dangerous judgment errors in your professional life, your relationships, and other areas of life.

You need to evaluate where cognitive biases are hurting you and others in your team and organization. Then, you can use structured decision-making methods to make “good enough” daily decisions quickly; more thorough ones for moderately important choices; and in-depth ones for truly major decisions.

Such techniques will also help you implement your decisions well, and formulate truly effective long-term strategic plans. In addition, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.

Preventing normalcy bias disasters

With the normalcy bias in particular, it helps to consider and address potential alternative futures that are much more negative than you intuitively feel are likely. That’s the strategy Brodie and I explored in my coaching with him after the training session, as he felt ready to get back into the startup world.

While Brodie definitely knew he wouldn’t be up to starting a new business by himself, he also wanted to avoid repeating the previous problems. So we discussed how, from the start, he would push for creating systems and processes that would enable each co-founder to back up the other in an emergency. Moreover, the co-founders would commit to sharing important contacts from their side of the business with each other, so that those relationships could be maintained if one of them was out of commission for a while.

So what are the broader principles here?

  1. Be much more pessimistic about the likelihood and impact of disasters than you intuitively feel or can easily imagine, to get past the blind spot created by the normalcy bias.

  2. Use effective strategic planning techniques to scan for potential disasters and try to address them in advance, as Brodie did with his plans for the new business.

  3. Of course, you can’t predict everything, so retain some extra capacity in your system - of time, money, and other resources - that you can use to deal with unknown unknowns, also called black swans.

  4. Finally, if you see even a hint of a disaster, react much more quickly than you intuitively feel you should, to overcome your gut’s dismissal of the disaster’s likelihood and impact.
