Tuesday, November 20, 2012

Irrational Positive Bias in Aircraft Maintenance

In Decision Making - Bias, I discussed how our unconscious mind injects bias into our decision making. Decisions we think are conscious, rational decisions are in fact made mostly by our biased unconscious mind.1 But what direction is our unconscious mind leading us toward?

"The brain has evolved to be an optimist" 1.

Optimism and confidence are necessary for success. Whether it's training for a sporting event, going to school, or setting life's goals, if you don't have confidence you won't work toward the goal and you won't succeed. This unconscious positive bias is how we survive as a species. Successful people have plenty of optimism and confidence.
People will seek the most optimistic answer:
  • "They're just stress relief cracks" (Chalk's Ocean Airways accident)
  • One study found that most general aviation pilots believe they are less likely than other pilots to experience an aircraft accident (Wichman & Ball, 1983).
  • 95% of pilots estimate their chance of being in an accident at a rate that is less than reality (O'Hare, 1990).
  • Researchers Dale Wilson and Marte Fallshore found that, for the specific accident scenario of VFR flight into IFR conditions, pilots were overly optimistic about their chances of experiencing such an accident and were also overconfident in their ability to avoid or successfully fly out of IMC (Wilson & Fallshore, 2001).2
Irrational Positive Bias
Tali Sharot, in her book The Optimism Bias: A Tour of the Irrationally Positive Brain, calls optimism bias "irrationally positive". Although we believe we are making a rational decision, it is often irrationally positive.

Examples of irrationally positive bias:

  • The Indian Airlines Flight 440 crash was found to be crew error in letting the aircraft descend below the glide path.
  • "... the control tower had discouraged the pilots from landing due to the density of the fog at Smolensk airport at the time." Tupolev-154 2010 Air Force. Ninety-six people died.
  • "It'll be OK, I'm in a hurry and gotta go"
  • "It flew in so it will fly out"
  • "It's been that way for 2 months"
  • "It doesn't look too bad to me"
  •  "and the more they flew the more they demonstrated that the problem had no consequences."

  •  When operating or maintaining aircraft, an irrationally positive bias needs an external control system to avoid failure.

    How do we limit irrational optimism bias in aircraft maintenance?
    • Standards
    • Culture
    Standards: Pilots are familiar with "minimum descent altitude". Even if we "think" we can safely go below the minimum descent altitude, we are instructed not to.
    Culture: We know stories of those who thought they could and failed. Dr. Tali Sharot offers a similar answer: we develop "plans and rules to protect ourselves from unrealistic optimism."3

     "If you want to fix a problem, you can't just fire the responsible person. You have to fix the organization, or else the next person to take the job will just experience the same pressures. Like Columbia after Challenger, the harmful behavior persists." Interview with Diane Vaughan about the Space Shuttle Challenger accident.

    For the aircraft mechanic, standards are maintenance regulations, rules, manuals, and Acceptable Methods and Practices. If a part is worn beyond its service limit, then it is not airworthy and must be removed from service. Our mind might believe that the part can still function with that amount of wear (the Chalk's mechanics thought the airplane could fly with cracks in the wing); indeed, our boss or the aircraft owner might insist that the part can continue to function safely; but that is not the issue: the standard says we must not use it. The standard protects us from our irrational optimism bias.

    When not working to standards, one must be on guard and consciously aware of how positive bias might lead us to make the wrong decisions. New mechanics, or non-mechanics working on aircraft, may lack knowledge of the standards and culture, and therefore have not developed a respect for the control mechanisms necessary to contain irrational bias.

    Successful People "bolt people think their fate is almost entirely in their own hands."
    The first urban legend I heard as a pilot was about the high accident rate of doctors in aviation. I don't know whether it is true, but it was a widely held belief at the time. Overachievers must have confidence to achieve, and their achievements give them even more confidence; their decisions have proven to be correct. But that confidence, when used to evaluate airworthiness, might lead them into danger. It's not just doctors. Successful people have confidence, and when mixed with ego it becomes dangerous. Call it an unhealthy dose of "Irrational Positive Bias."4

    NASA's managers had more confidence than the engineers before the failed launch of Challenger. They knew the standards for launch (the temperature was too cold). Yet they launched over the objections of their engineers. The decision to launch was not rational but driven by irrational positive bias.

    Tribunals, Penalties and Sanctions
    • "Pilot Error"
    • "Failed to follow established procedures and regulations"
    • "Failed to follow the maintenance manual"
    These are common phrases found in accident reports, attributed as what caused the accident. But aren't these the outcomes, or "effects," of another undiscovered (uninvestigated) cause?

    Was the maintenance manual unavailable? Or could it be that the manual was available and understood, but the mechanic "felt" it would be OK to do the repair in some other manner? Why weren't "established procedures and regulations" followed? Unless we know these answers, we don't know the true cause of the accident; tribunals, penalties, and sanctions do not serve as a deterrent, and the whole exercise of accident investigation fails in its purpose of preventing future accidents.

    Human Factors - FAA 
    It might more correctly be renamed "Bureaucracy Factors," as the investigation is extended from the body of the person into the body of the organization in an inhuman attempt to re-integrate the human into the system, further distorting the human's mental construct of reality. Instead of blaming someone, we blame something. It just so happens that something usually has deeper financial pockets than someone. It is much more lucrative to fine the airline than to fine the mechanic.

    Human Factors - The Optimist
    "When optimists succeed they attribute it to their superior abilities. When optimists fail they attribute it to external reasons."

    Normalization of Deviance
     "And as they recurrently observed the problem with no consequence they got to the point that flying with the flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified as they saw what they had done." Interview with Diane Vaughan about Space Shuttle Challenger Accident.

    1. "Secrets of the Brain" PBS series

    2."What is surprising is the effect of experience on ability biases. One would  think that as experience increases, a person would gain  a more realistic appraisal of their abilities. Instead, it  appears that flight experience may lead to overestimates of one’s ability to both avoid and  successfully fly out of IMC." OPTIMISTIC AND ABILITY BIASES IN PILOTS’ DECISIONS AND PERCEPTIONS OF RISK  

    3. Tali Sharot "The Optimism Bias" 

    4. "Ignorance more frequently begets confidence than does knowledge." Darwin
