Helicopter Decision Making

Introduction

Research into the human factors related to aircraft accidents and incidents has highlighted Decision Making as a crucial element. Pilots usually intend to fly safely, but they sometimes make errors. It has been observed that the majority of fatal crashes are attributable to decision errors rather than to perceptual or execution errors.

Figure: Human error

While we cannot eliminate human error, a thorough understanding of human factors principles can lead to appropriate strategies, means and practical tools to prevent most errors, better detect and manage them, and mitigate their adverse impact on aviation safety.

Human Factors

Decision errors in aviation are typically not slips or lapses but mistakes. In other words, the problem does not lie in failing to execute a correct decision, but in making a wrong or poor decision in the first place.

Human Factors research has produced several models describing how humans make decisions, a process that differs considerably from the way aircraft systems, for instance, 'make decisions'.
The SHELL model provides a framework that illustrates the components involved in aircraft operations and the interfaces or interactions between them: Software, Hardware, Environment and Liveware (the human).

Figure: The SHELL model

The Liveware constitutes the hub of the model and is the most critical as well as the most flexible component in the system. Adverse mental states may contribute to poor decision making.

Pilots' behaviours and motivations affect decision making, and training aims to improve the decision-making process.

Hazardous attitudes

Five hazardous attitudes increase the risk of poor decisions. They are shown in the table below. These attitudes must be carefully addressed in training. Safer attitudes, often referred to as "antidotes", are also identified in the table. Compliance with SOPs is a common, powerful antidote.

HAZARDOUS ATTITUDES AND THEIR ANTIDOTES

Anti-authority: "Don't tell me what to do!" This attitude is found in people who do not like anyone telling them what to do. In a sense, they tend to regard rules, regulations and procedures as unnecessary. Antidote: follow the rules; they are usually right.

Impulsivity: "Must do something now!" This is the attitude of people who frequently feel the need to do something, anything, immediately. They do not take the time to think about what they are about to do, so they often do not select the best alternative. Antidote: not so fast; think first, and think twice.

Invulnerability: "It won't happen to me." Many people feel that accidents happen only to others and cannot happen to them. They never really feel or believe that they will be personally involved. Pilots who think this way are more likely to take chances and increase risk. Antidote: it could happen to me too.

Macho/Egocentric: "I can do it, I'll show them." Pilots with this type of attitude often take risks to prove that they are good and to impress others. Antidote: taking risks is foolish.

Resignation: "What's the use? There is nothing I can do." Such pilots leave the action to others, for better or worse, and will sometimes even go along with unreasonable requests just to be a "nice guy". Antidote: I'm not helpless; I can make a difference.

Behavioural Traps & Biases

There are a number of behavioural traps and biases that can distort decision making. Pilots should be aware of these traps and take steps to avoid getting caught.

BEHAVIOURAL TRAPS & BIASES

Peer pressure: Poor decision making may be based on an emotional response to peers rather than an objective evaluation of the situation. The solution offered by the peers is accepted without further assessment, even when it is wrong.

Confirmation bias (fixation): The tendency to search for or interpret information in a way that confirms one's preconceptions or backs up the decision that has already been made. Counter evidence is not considered or is dismissed.

Overconfidence: The human tendency to be more confident in one's skills, competences and capacities than one should be.

Loss-aversion bias: The strong tendency to prefer avoiding losses. Changing the plan means losing all the effort already expended on it, which explains why decisions are sometimes hard to change.

Anchoring bias (attentional tunnel): The tendency to rely too heavily on ("anchor" to), or focus attention on, only one or a few elements or pieces of information.

Complacency: A state of self-satisfaction with one's own performance coupled with a lack of awareness of potential risks. Feeling at ease with the situation often results in a lack of monitoring.


Decision-making biases lead to poor decisions and put the safety of the flight at risk.
Knowing these biases is important but not enough: they must be actively combated!

Decision Error Factors

When exploring the factors that contribute to decision errors, a common pattern is the pilot's decision to continue with the original plan even though conditions suggest that other courses of action might be more prudent or appropriate.

FACTORS CONTRIBUTING TO DECISION ERRORS

Situational factors (ambiguity): The situation is not recognised as requiring a change in the course of action because the cues are ambiguous, resulting in a poor representation or understanding of the situation (poor situation awareness).

Erroneous risk perception & risk management: Pilots typically under-assess the level of threat or risk associated with the situation, due to risk misperception or tolerance of risk. Pilots may press on into a rushed landing or deteriorating weather (a degraded visual environment, DVE) simply because they do not realise the risks of doing so, or because they accept the risk.

Goal conflicts: Pilots may be willing to take a safety risk (an unlikely loss) to arrive on time (a sure benefit). Social factors, for instance the wish to please passengers, can also play a role, and peer pressure among pilots can encourage risky behaviour. People also tend to disregard risk in order to avoid losses; an en-route diversion can be seen as a loss.

Workload & stress: Workload and stress may overload pilots, degrade mental processes (e.g. tunnelling of attention or vision, memory limitations) and lead to errors. As the situation degrades, risk and time pressure may increase to a point where making correct decisions becomes very difficult.

Decision Making Models

Many models have been developed to describe decision making. Two of these are presented below.

NASA model

NASA research describes a decision process model for aviation that involves two components: Situation Assessment (SA) and choosing a Course of Action (CoA).

Situation assessment

Situation assessment and awareness are crucial. They involve defining the situation or problem, assessing the level of risk associated with it, and determining the amount of time available for solving it. They also include an awareness of what the situation will be in the future.

Course of action

Once the problem is defined, a course of action is selected from the options available (those known about) in the situation. When the pilot understands the situation, an acceptable course of action is often easily identified.

The OODA Loop model

This simple model, based on the Observe, Orient, Decide and Act stages, originates from the military fighter pilot community. Developed for single-pilot operations, it describes the control of behaviour in a rapidly changing environment.

Observation, orientation and action occur continuously and simultaneously in flight (skill-based behaviour).

The Decide stage depends on the resources remaining; during periods of rapid change these can be very limited (hence the importance of flight preparation).

Orientation (safety oriented approach) is the most important part of the OODA loop model because it shapes the way we observe, the way we decide, and the way we act.
Because decision making is not always perfect and may rely on shortcuts, pilots should be trained to better prepare and review their decisions, as time allows.
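
As an illustration only, the OODA loop can be thought of as a simple control loop. The sketch below is a minimal, hypothetical Python rendering of the four stages; the function names, the example cues and the "resources available" flag are assumptions introduced here for illustration and are not taken from the EHSIT leaflet or from any real flight system.

    # Minimal, hypothetical sketch of the OODA loop as a control loop.
    # All names (observe, orient, decide, act) are illustrative assumptions;
    # this does not model any real avionics or operator logic.

    def observe(environment):
        """Gather raw cues from the environment (weather, fuel, traffic, ...)."""
        return dict(environment)

    def orient(cues, experience):
        """Interpret the cues in the light of training and experience.
        Orientation shapes how we observe, decide and act."""
        risk = "high" if cues.get("weather") == "deteriorating" else "low"
        return {"cues": cues, "risk": risk, "experience": experience}

    def decide(assessment, resources_available):
        """Choose a course of action; with few resources left, fall back
        on a pre-planned response (SOPs, decision points)."""
        if not resources_available:
            return "apply pre-planned response (SOP)"
        return "divert" if assessment["risk"] == "high" else "continue"

    def act(course_of_action):
        print("Executing:", course_of_action)

    # One pass through the loop; in flight the loop runs continuously.
    cues = observe({"weather": "deteriorating", "fuel": "adequate"})
    act(decide(orient(cues, experience="low"), resources_available=True))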

How to Improve Decision Making

The following strategies can improve decision making. Training pilots in these strategies will help them make better decisions.

Standard operating procedures (SOPs)

SOPs are widely used throughout the commercial aviation community as a means to manage risk. Establishing safety oriented SOPs (including personal and weather minimums) will provide pilots with pre-planned responses that manage the risks and break the “chain of events” leading to accidents.

Pre-flight Planning

Planning conducted prior to a flight, in a low-stress environment, can enable a pilot to produce a safe strategy for the flight (for example, the pilot can be proactive and plan ahead to select a safe route and establish "decision points" during each flight phase).
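
To make the idea of pre-planned "decision points" more concrete, the sketch below shows a small, hypothetical personal-minimums check that could be run at each planned decision point. The threshold values and field names are invented for this illustration; they are not taken from the leaflet or from any regulation, and real personal minimums must be set by the pilot or operator.

    # Hypothetical personal-minimums check at a planned decision point.
    # All thresholds below are assumed values for illustration only.
    PERSONAL_MINIMUMS = {
        "min_visibility_km": 5.0,
        "min_cloud_base_ft": 1000,
        "min_fuel_reserve_min": 30,
    }

    def decision_point_check(observed):
        """Return the list of personal minimums that are not met."""
        violations = []
        if observed["visibility_km"] < PERSONAL_MINIMUMS["min_visibility_km"]:
            violations.append("visibility below personal minimum")
        if observed["cloud_base_ft"] < PERSONAL_MINIMUMS["min_cloud_base_ft"]:
            violations.append("cloud base below personal minimum")
        if observed["fuel_reserve_min"] < PERSONAL_MINIMUMS["min_fuel_reserve_min"]:
            violations.append("fuel reserve below personal minimum")
        return violations

    # Example: at the en-route decision point the weather has deteriorated.
    issues = decision_point_check({"visibility_km": 3.0,
                                   "cloud_base_ft": 800,
                                   "fuel_reserve_min": 45})
    print("Divert or turn back:" if issues else "Continue:", issues)

The point of such a check is that the continue/divert criteria are chosen on the ground, in a low-stress environment, rather than improvised in flight.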

Illusion of Safety - the Plan B

Research has suggested that having a plan B (a safety net) encourages continuation and possibly riskier behaviour: it is naturally easier to take a risk when you know you can count on a plan B. However, pilots rarely assess their plan B properly, so the protection can be weaker than expected.

Simulator training

Simulators allow decision making to be trained in high-stress, high-workload situations with poor or conflicting information. Training scenarios can be tailored to the trainees' needs. In addition, simulators allow the consequences of poor decisions to be explored without endangering the safety of the aircraft and its occupants.



See also

  • None

Reference

  • Decision Making leaflet by European Helicopter Safety Implementation Team (EHSIT)

Author

  • VID 522050 - Creation

DATE OF SUBMISSION

  • 12:44, 23 February 2021

COPYRIGHT

  • This documentation is copyrighted as part of the intellectual property of the International Virtual Aviation Organisation.

DISCLAIMER

  • The content of this documentation is intended for aviation simulation only and must not be used for real aviation operations.