Human Error

18 May, 2017 by James Lawther

World War 2

During the Second World War, over 60 million people were killed.  That was roughly 3% of the world's population.  It was a hazardous time.

Amongst the hardest hit were the bomber crews.  The Eighth Air Force suffered half of all the U.S. Army Air Forces' casualties.  The British fared as badly.  The chances of surviving the war as a member of the RAF's Bomber Command were only marginally better than even.

If flying bombing raids wasn't dangerous enough, landing back home was also fraught with danger.  Pilots of the Boeing B-17 Flying Fortress had a series of runway crashes, accidentally retracting the landing gear when they touched down.

Human error

Accident investigators blamed these incidents on pilot (or human) error.  There was no obvious mechanical failure.

It wasn’t only Flying Fortresses that had the problem.  There were stories of pilots of P-47 Thunderbolts and B-25 Mitchells making the same mistake.

Nobody would deliberately retract the landing gear while still hurtling across the tarmac.  Why the pilots did so was anybody’s guess.  Perhaps the pilots’ attention wandered when they realised they were almost home.

Design error

The authorities asked Alphonse Chapanis, a military psychologist, to explain the behaviour.  He noticed that the accidents only happened to certain planes and not others.  There were thousands of C-47 transport planes buzzing about.  Their pilots never suffered from such fatal inattention.

After Chapanis inspected the cockpits of the different planes, the cause became clear.  On B-17s the controls for the flaps and the undercarriage were next to one another.  They also had the same style of handle.  Pilots who retracted the undercarriage when the wheels were on the ground were actually trying to retract the flaps.  They just pulled the wrong lever.

In the C-47 the two controls looked very different and were positioned well apart from each other.

The solution

Once he had identified the cause, Chapanis developed an equally simple solution.  Circular rubber disks were stuck to the levers for the undercarriage, and triangles were stuck to the levers for the flaps.  When a pilot touched the rubber he felt the difference, and the crashes stopped.
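Chapanis’s shape coding has a direct analogue in software design.  As a purely illustrative aside (the FlapsLever and GearLever names below are hypothetical, not from the post), here is a minimal Python sketch in which two once-identical controls get distinct types, so that pulling the “wrong lever” is caught before it can do any harm:

```python
from dataclasses import dataclass


# Separate types play the role of the rubber disks and triangles:
# the two levers no longer "feel" the same to the tooling.

@dataclass(frozen=True)
class FlapsLever:
    extended: bool


@dataclass(frozen=True)
class GearLever:
    down: bool


def retract_flaps(lever: FlapsLever) -> FlapsLever:
    """Accepts only a FlapsLever; a GearLever will not fit."""
    return FlapsLever(extended=False)


# retract_flaps(GearLever(down=True))  # any type checker flags this line
```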

To err is human

The pilots were well aware of which lever to pull.  It was “human error” that caused the mistake.  But laying the blame on the pilots was never going to stop the crashes.

We all make mistakes.  It is in our nature.  Don’t fight it, fix it.
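“Fix it” can also mean making the error impossible rather than merely unlikely.  Modern aircraft use a weight-on-wheels (squat) switch: an interlock that ignores a gear-up command while the plane is still on the ground.  A hypothetical sketch of that idea (the Aircraft class and its attribute names are illustrative assumptions, not from the post):

```python
class Aircraft:
    """Illustrative interlock: the gear-up-on-the-runway mistake
    can still be made, but it can no longer cause a crash."""

    def __init__(self) -> None:
        self.weight_on_wheels = True  # squat switch: True while on the ground
        self.gear_down = True

    def command_gear_up(self) -> None:
        # Don't blame the pilot; let the system refuse an unsafe command.
        if self.weight_on_wheels:
            return  # wheels on the tarmac: the command has no effect
        self.gear_down = False
```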


The possibility of human error was rife in a B-17 cockpit


Image by Joe A. Kunzler


About the Author

James Lawther

James Lawther is a middle-aged middle manager.

To reach this highly elevated position he has worked in numerous industries, from supermarket retailing to tax collecting.  He has had several operational roles, including running the night shift in a frozen pea packing factory and carrying out operational research for a credit card company.

As you can see from his C.V., he has either a wealth of experience or is incapable of holding down a job.  If the latter is true, this post isn’t worth a minute of your attention.

Unfortunately, the only way to find out is to read it and decide for yourself.

www.squawkpoint.com/

Comments

  1. John Hunter says

    12 June, 2017 at 4:16 pm

These types of process improvements combine data, an understanding of the process, and an understanding of people. I love seeing such system improvements that create more reliable processes using an understanding of data and of the system in question.

    • James Lawther says

      12 June, 2017 at 5:56 pm

Thanks for the comment, John.  It appealed to me as well. I think there is so much done in the name of “process improvement” that fails to take the human into account.

  2. Dennis Gershowitz says

    9 September, 2017 at 3:27 pm

Good reminder… I would also have mentioned that it is important to consider all the key types of personas with different needs. My experience is that a faulty process fails many of us, but the improvement may not have considered the differences among us.

    • James Lawther says

      9 September, 2017 at 5:31 pm

That is a very interesting perspective, Dennis. If you have some examples, I’d love to hear them.




