The Squawk Point

Organisational Mechanics

Reducing Complexity

4 May, 2020 by James Lawther

Avoiding Overwhelm

In their book Meltdown, Clearfield and Tilcsik show how complex systems fail. They discuss nuclear accidents, financial disasters and medical mishaps, and explain how reducing complexity can help us avoid such catastrophes.

The more complex our systems become, the harder they are to understand, and so the harder they are to fix when they break. Add to that complexity our desire to link and automate everything, and it is easy to see how unexpected events can cause horrific domino effects. Unforeseen events can rapidly spiral out of control. Think about the global financial crash of 2008 or the incident at Three Mile Island and you will understand exactly what I mean.

Rationalising alarms

One way to stop events from running away from you is to rationalise warnings and controls. Remove all the unnecessary alarms, warning bells and whistles. This sounds counterintuitive, but it keeps people focused on what is important, rather than distracting them with things that aren’t.

A real-world example

Clearfield and Tilcsik discuss the cockpit of an airliner.

Imagine being an airline pilot faced with one of these nightmare scenarios:

  1. A fire in the engine.
  2. Landing gear that isn’t down when coming in to land.
  3. A plane that is about to stall aerodynamically.
  4. An engine that stops running.

How would you like to be alerted?  Which alarms and warnings would you like triggered?

My instinctive reaction is that I want the whole nine yards when any of them occurs.  I’d like to be in no doubt at all that something is wrong.  They all sound dire to me.

Yet that is not what happens. The engineers at Boeing turn on all the warnings only when the aeroplane is about to stall. If that happens, alarms sound, red lights flash and a red text message appears on the pilots’ screens. If that isn’t enough, the control sticks also start to shake violently.

The pilots are left under no illusion that something bad is about to happen. And it would be bad: if an aircraft stalls, it will drop out of the sky. The lengths the engineers have gone to are quite impressive; they capture the pilots’ undivided attention.

But that isn’t the clever bit. What is really interesting is that nothing else triggers all the alarms. An engine fire sets off warning lights and text messages whilst a bell sounds. The control sticks, however, don’t start to shake. An engine fire is (I imagine, and hope never to find out) a serious event on an aircraft, but it isn’t as pressing an issue as a stall. The plane won’t plummet out of the sky that instant. Pilots have a little time to react.

Less serious incidents trigger less serious alerts with fewer noises or lights.  Amber text messages instead of red.

The logic is simple — don’t burden pilots with more information than they need.  The designers resisted the desire to add more and more warnings.  Instead, they set about reducing complexity and pared back the alerts so that the pilots’ attention is always drawn to the most pressing issue.  This reduces overwhelm and increases the probability of a safe landing.
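
The pattern the engineers are using is a severity hierarchy: rank every possible alert, give each rank a fixed and deliberately limited set of channels, and surface only the most pressing problem. Here is a minimal sketch of that idea in Python. The tier names, channels and messages are my own illustrative inventions, not Boeing’s actual alerting logic.

    from dataclasses import dataclass
    from enum import Enum

    class Severity(Enum):
        """Hypothetical severity tiers; a higher value is more pressing."""
        WARNING = 3   # immediate action required (e.g. stall)
        CAUTION = 2   # prompt attention required (e.g. engine fire)
        ADVISORY = 1  # awareness only

    @dataclass(frozen=True)
    class Alert:
        message: str
        severity: Severity

    # Each tier gets a fixed, deliberately limited set of channels.
    # Only the top tier uses every channel at once.
    CHANNELS = {
        Severity.WARNING: ["red text", "red lights", "siren", "stick shaker"],
        Severity.CAUTION: ["red text", "warning lights", "bell"],
        Severity.ADVISORY: ["amber text"],
    }

    def present(alerts: list[Alert]) -> None:
        """Announce only the most pressing alert, through its tier's channels."""
        if not alerts:
            return
        top = max(alerts, key=lambda a: a.severity.value)
        for channel in CHANNELS[top.severity]:
            print(f"[{channel}] {top.message}")

    # A stall outranks an engine fire, so only the stall is announced.
    present([Alert("ENGINE FIRE", Severity.CAUTION),
             Alert("STALL", Severity.WARNING)])

The point of the ranking is the same as in the cockpit: when two things go wrong at once, the design itself decides what deserves attention first, rather than leaving an overwhelmed human to sort through competing klaxons.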

Attention to detail

The rigour that goes into the design process is remarkable.

Boeing engineers work very hard to trigger the right reactions, which sometimes requires an even more nuanced approach.

For example, an engine that quits during the early portion of the takeoff roll requires quick reactions by the pilots to stop the airplane on the runway, so the warnings include red warning lights, a red text message, and a synthetic voice shouting “ENGINE FAIL.”

A few seconds later, as the plane accelerates, there isn’t enough runway to stop, so the airplane automatically inhibits all these warnings except the text message. This is done to avoid triggering the pilots into trying to stop the plane when that cannot be done.

And if an engine fails while the plane is in stable cruising flight, it only sets off amber lights, a beep sound, and an amber text message.

Chris Clearfield and András Tilcsik
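
In other words, the same event is routed through different channels depending on the phase of flight, because the useful response changes. A minimal sketch of that state-dependent inhibition, again using hypothetical phase and channel names rather than anything from real avionics:

    from enum import Enum, auto

    class Phase(Enum):
        EARLY_TAKEOFF_ROLL = auto()  # still enough runway to stop
        LATE_TAKEOFF_ROLL = auto()   # past the point where stopping is possible
        CRUISE = auto()              # stable flight, time to think

    # The same event, an engine failure, is announced differently
    # depending on what the pilots can usefully do about it.
    ENGINE_FAIL_CHANNELS = {
        Phase.EARLY_TAKEOFF_ROLL: ["red lights", "red text", 'voice: "ENGINE FAIL"'],
        Phase.LATE_TAKEOFF_ROLL:  ["red text"],  # everything else inhibited
        Phase.CRUISE:             ["amber lights", "beep", "amber text"],
    }

    def announce_engine_failure(phase: Phase) -> list[str]:
        """Return the alert channels used for an engine failure in this phase."""
        return ENGINE_FAIL_CHANNELS[phase]

    for phase in Phase:
        print(phase.name, "->", announce_engine_failure(phase))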

What happens in your world?

How are your monthly management meetings and risk-and-control sessions? Are they pared right back, so that you are left in absolutely no doubt what the issues are? Or does your organisation prefer the “more is more” approach, throwing as much information at you as it can? Just in case…

Any fool can make something complicated. It takes a genius to make it simple.

Woody Guthrie


Aircraft cockpits are a good example of how to reduce complexity

Try the book Meltdown

Image by Vinnie C


About the Author

James Lawther

James Lawther is a middle-aged, middle manager.

To reach this highly elevated position he has worked in numerous industries, from supermarket retailing to tax collecting.  He has had several operational roles, including running the night shift in a frozen pea packing factory and carrying out operational research for a credit card company.

As you can see from his C.V., he either has a wealth of experience or is incapable of holding down a job. If the latter is true, this post isn’t worth a minute of your attention.

Unfortunately, the only way to find out is to read it and decide for yourself.

www.squawkpoint.com/



Creative Commons

This information from The Squawk Point is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.


