Human reliability


Human reliability is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors, such as age, state of mind, physical health, attitude, emotions, and a propensity for certain common mistakes, errors and cognitive biases.
Human reliability is very important due to the contributions of humans to the resilience of systems and to possible adverse consequences of human errors or oversights, especially when the human is a crucial part of the large socio-technical systems that are common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Common Traps of Human Nature

People tend to overestimate their ability to maintain control when they are doing work.
The common characteristics of human nature addressed below are especially accentuated when work is performed in a complex work environment.
Stress - The problem with stress is that it can accumulate and overpower a person, thus becoming detrimental to performance.
Avoidance of Mental Strain - Humans are reluctant to engage in lengthy concentrated thinking, as it requires high levels of attention for extended periods. To reduce mental effort and expedite decision-making, people often fall back on mental biases, or shortcuts.
Limited Working Memory - The mind's short-term memory is the “workbench” for problem solving and decision-making.
Limited Attention Resources - The limited ability to concentrate on two or more activities challenges the ability to process the information needed to solve problems.
Mind-Set - People tend to focus more on what they want to accomplish and less on what needs to be avoided, because human beings are primarily goal-oriented by nature. As such, people tend to “see” only what the mind expects, or wants, to see.
Difficulty Seeing One's Own Error - Individuals, especially when working alone, are particularly susceptible to missing errors.
Limited Perspective - Humans cannot see all there is to see. The inability of the human mind to perceive all facts pertinent to a decision challenges problem-solving.
Susceptibility To Emotional/Social Factors - Anger and embarrassment adversely influence team and individual performance.
Fatigue - People get tired. Physical, emotional, and mental fatigue can lead to error and poor judgment.
Presenteeism - Some employees, out of a need to belong to the workplace, will be present despite a diminished capacity to perform their jobs due to illness or injury.

Analysis techniques

A variety of methods exist for human reliability analysis. Two general classes of methods are those based on probabilistic risk assessment and those based on a cognitive theory of control.

PRA-based techniques

One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a power plant, so can a human operator commit errors. In both cases, an analysis would articulate a level of detail for which failure or error probabilities can be assigned. This is the basic idea behind the Technique for Human Error Rate Prediction (THERP). THERP is intended to generate human error probabilities that would be incorporated into a PRA. The Accident Sequence Evaluation Program (ASEP) human reliability procedure is a simplified form of THERP; an associated computational tool is the Simplified Human Error Analysis Code. More recently, the US Nuclear Regulatory Commission has published the Standardized Plant Analysis Risk - Human Reliability Analysis (SPAR-H) method to take account of the potential for human error.
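As a rough illustration of how such probabilities combine, the sketch below decomposes a task into subtasks and combines their human error probabilities (HEPs) as independent events. The subtask names and HEP values are hypothetical, not values from THERP's tables, and a real THERP analysis would also model dependence between subtasks and recovery factors.

```python
# Hypothetical sketch of a THERP-style calculation: a task is broken into
# subtasks, each assigned a human error probability (HEP), and the results
# are combined as independent events. Real THERP analyses use lookup tables,
# dependency models, and recovery factors; this shows only the arithmetic.

# Nominal HEPs for each subtask (illustrative values, not from THERP tables)
subtask_heps = {
    "read procedure step": 0.003,
    "select correct control": 0.001,
    "verify indicator": 0.01,
}

def task_failure_probability(heps):
    """P(at least one subtask error), assuming independent subtasks."""
    p_all_succeed = 1.0
    for hep in heps:
        p_all_succeed *= (1.0 - hep)
    return 1.0 - p_all_succeed

p_fail = task_failure_probability(subtask_heps.values())
print(f"Overall task HEP: {p_fail:.4f}")  # ~0.0140 for the values above
```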

Cognitive control based techniques

Erik Hollnagel has developed this line of thought in his work on the Contextual Control Model (COCOM) and the Cognitive Reliability and Error Analysis Method (CREAM). COCOM models human performance as a set of control modes (strategic, tactical, opportunistic, and scrambled) and proposes a model of how transitions between these control modes occur. The model of control mode transition depends on a number of factors, including the human operator's estimate of the outcome of the action, the time remaining to accomplish the action, and the number of simultaneous goals of the human operator at that time. CREAM is a human reliability analysis method that is based on COCOM.
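To make the transition idea concrete, the following toy sketch maps time pressure, goal load, and expected outcome to a control mode. The numeric thresholds and transition rules are invented for illustration; Hollnagel's model is qualitative and does not prescribe such cut-offs.

```python
# Toy illustration of COCOM-style control mode transitions. The thresholds
# and transition rules below are invented for illustration; Hollnagel's
# model is qualitative and considerably richer than this.
from enum import Enum

class ControlMode(Enum):
    STRATEGIC = "strategic"          # long-term planning, wide view
    TACTICAL = "tactical"            # procedure/rule following
    OPPORTUNISTIC = "opportunistic"  # driven by whatever cue is salient
    SCRAMBLED = "scrambled"          # little or no deliberate control

def estimate_control_mode(time_remaining_s: float,
                          time_needed_s: float,
                          simultaneous_goals: int,
                          outcome_expected_ok: bool) -> ControlMode:
    """Pick a control mode from time pressure, goal load, expected outcome."""
    margin = time_remaining_s / max(time_needed_s, 1e-9)
    if not outcome_expected_ok and margin < 1.0:
        return ControlMode.SCRAMBLED       # failing and out of time
    if margin < 1.0 or simultaneous_goals > 4:
        return ControlMode.OPPORTUNISTIC   # reacting, not planning
    if margin < 2.0 or simultaneous_goals > 2:
        return ControlMode.TACTICAL        # enough time to follow procedure
    return ControlMode.STRATEGIC           # ample time, few competing goals

print(estimate_control_mode(30, 60, 3, True))   # ControlMode.OPPORTUNISTIC
print(estimate_control_mode(300, 60, 1, True))  # ControlMode.STRATEGIC
```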

Related techniques

Related techniques in safety engineering and reliability engineering include failure mode and effects analysis, HAZOP, fault tree analysis, and SAPHIRE.

Human Factors Analysis and Classification System (HFACS)

The Human Factors Analysis and Classification System was developed initially as a framework to understand the role of "human error" in aviation accidents. It is based on James Reason's Swiss cheese model of human error in complex systems. HFACS distinguishes between the "active failures" of unsafe acts, and "latent failures" of preconditions for unsafe acts, unsafe supervision, and organizational influences. These categories were developed empirically on the basis of many aviation accident reports.
"Unsafe acts" are performed by the human operator "on the front line". Unsafe acts can be either errors or violations. The errors here are similar to the above discussion. Violations are the deliberate disregard for rules and procedures. As the name implies, routine violations are those that occur habitually and are usually tolerated by the organization or authority. Exceptional violations are unusual and often extreme. For example, driving 60 mph in a 55-mph zone speed limit is a routine violation, but driving 130 mph in the same zone is exceptional.
There are two types of preconditions for unsafe acts: those that relate to the human operator's internal state and those that relate to the human operator's practices or ways of working. Adverse internal states include those related to physiology and mental state. A third aspect of 'internal state' is really a mismatch between the operator's ability and the task demands; for example, the operator may be unable to make visual judgments or react quickly enough to support the task at hand. Poor operator practices are another type of precondition for unsafe acts. These include poor crew resource management and poor personal readiness practices.
Four types of unsafe supervision are: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations.
Organizational influences include those related to resources management, organizational climate, and organizational processes.
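To illustrate the shape of the taxonomy, the sketch below encodes the four HFACS levels and their categories as data, following the descriptions above, and validates a classification against them. The dictionary layout and the classify helper are hypothetical conveniences, not part of HFACS itself.

```python
# Hypothetical encoding of the HFACS taxonomy as data, to show how an
# accident factor might be classified. The level and category names follow
# the text above; the helper function and example entry are invented.
HFACS_TAXONOMY = {
    "Unsafe Acts": ["Errors", "Violations"],
    "Preconditions for Unsafe Acts": [
        "Adverse Internal States",
        "Ability/Task Mismatch",
        "Poor Operator Practices",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision",
        "Planned Inappropriate Operations",
        "Failure to Correct a Known Problem",
        "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management",
        "Organizational Climate",
        "Organizational Processes",
    ],
}

def classify(level: str, category: str) -> tuple[str, str]:
    """Validate a (level, category) pair against the taxonomy."""
    if category not in HFACS_TAXONOMY.get(level, []):
        raise ValueError(f"{category!r} is not a {level!r} category")
    return (level, category)

# Example: the habitual speed-limit violation from the text above
print(classify("Unsafe Acts", "Violations"))
```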
