A Compact Introduction To Human Error
Stu Moment
University of Illinois Human Factors Division
©2008, Stu Moment
APA reference: Moment, S. L. (2008). A Compact Introduction to Human Error. University of Illinois
Human Factors Division Proceedings. Retrieved [month] [day], [year], from
http://www.humanfactors.illinois.edu/research/HumanElementArticles/CompactIntroToHumanError/
Copyright notice: This publication may be used, reproduced, printed and redistributed for personal,
academic, research or non-commercial purposes as long as 1) it is not modified, 2) credit is
attributed to the Human Factors Division at the University of Illinois at Urbana-Champaign and the
author, and 3) the copyright notice and this notice are reproduced on any copies. If you have any
questions regarding distribution of this paper contact the author.
Introduction
Human Error is a relatively new concentration within Human Factors. Until recently, its
causal constructs, analysis, and intervention methods have been more art than
computational science, yet advances in the science occur as qualified factors are
identified and organization-wide analyses are created.
The application of analytical methods in human error evaluation is in its infancy. Tools
are being utilized and, more importantly, data is being collected. Explanations of the
underlying causes of human error at the individual level, developed over the last 60
years, show stability and growth on a relatively solid foundation of knowledge.
Philosophies of human performance in a job environment have been refined to a point
where most contributions to an error may be mapped, giving today's safety
systems designer and analyst useful structures and methods.
Contents:
Definition of Human Error
Society’s Focus on Human Error
The Errors
Error Analysis
Intervention Strategies
References
Definition of Human Error
How should we look at error? In Sheridan's introduction to a famed Rasmussen article, he states, "Error is simply a
difference between an actual state and a desired state" (Sheridan 2003). But important errors are those which
adversely affect direct personal well-being, the environment, or a system which society considers important. In all
but the most basic systems, a performer will never achieve an optimal state. Yet adverse states will occur when a
threshold of performance is not met or when multiple performance thresholds combine to produce an adverse condition.
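As a minimal illustration of this definitional point (not a construct from the article), the sketch below treats an error as the deviation of an actual state from a desired state and flags an adverse state only when a performance threshold is crossed; the names and numbers are invented:

# Minimal sketch: error as the difference between actual and desired state,
# flagged as adverse only when it exceeds a performance threshold.
# Variable names and the threshold value are illustrative, not from the article.

def is_adverse(actual: float, desired: float, threshold: float) -> bool:
    """Return True when the deviation from the desired state exceeds the threshold."""
    return abs(actual - desired) > threshold

# A small deviation is still an error, but not yet an adverse state.
print(is_adverse(actual=10.4, desired=10.0, threshold=0.5))  # False
print(is_adverse(actual=11.0, desired=10.0, threshold=0.5))  # True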
Society's Focus on Human Error
Public transportation accidents are constant examples of errors in the public spotlight.
The recorded investigation of railroad accidents in the United States began in 1911.
Aircraft accident investigations became formal in the 1920s. In 1939, New Jersey
created the "Traffic Bureau," which was charged with "compiling accident statistics,
conducting studies of congestion, accident causes, and the effectiveness of current
safety measures." The Occupational Safety and Health Act of 1970 mandated reporting
of workplace accidents. After the 1999 report, To Err is Human: Building a Safer
Health System, the federal government specified programs to improve error analysis
in our health systems. States are starting similar programs.
In all of these examples, government is the investigative and analyzing agency. Internal company investigations
have historically served to protect the company in the case of a lawsuit. But driven by competitive
safety, internal economics, and a reduced potential of claimed negligence, both from the standpoint of the accident
itself and from the standpoint of practiced safety emphasis, more companies are analyzing incidents internally.
The "Company Safety Officer" has become a standard position in many larger institutions.
The notion of human error should extend beyond tangible incidents. Economic planning and military command
errors are as valid a target for analysis as a medical diagnosis error.
The ability to communicate safety emphasis can easily be improved in any organization,
but many classic studies in organizational psychology illustrate organizational behavior
phenomena which become barriers to commitment and risk assessment. Ambiguity in
job definitions and in safety personnel's power is aggravated by lack of knowledge,
lack of resources, or a perceived high cost of system fixes.
At the individual practitioner’s level, worry of legal exposure or job repercussions will
limit the flow of information on all aspects of organizational safety except in the case of
a mandatory accident report.
von Thaden, Kessel and Ruengvisesh divide their safety culture questionnaires and their
analysis of safety culture into the general groups of formal safety program, informal
aspects of safety, organizational commitment, operations personnel and overall safety
(von Thaden, Kessel, & Ruengvisesh, 2008). A subcategory within "formal
safety program" is "reporting system". The questionnaire solicits attitudes about fear
of negative repercussions, ease of use, and the respondent's opinion of other
employees' use of incident reports.
The acceptance and use of a good reporting system, by all levels of employees, may not in itself be an indicator of a
healthy safety climate, but it is a necessary prerequisite to establishing data collection. Data collection is the weak
point of human error analysis.
Egon Brunswik set a foundation for the analysis of human decision making in tasks
requiring evaluation. Brunswik's lens model relates human judgements to cue
availability, cue utilization and cue validity in a probabilistic fashion (Brunswik 1955).
Brunswik's works, as adapted by cognitive engineering researchers, became the core of
many models of human performance.
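As an illustration only, the following sketch computes the basic lens-model statistics on synthetic data; the cue weights, noise levels and variable names are assumptions for demonstration, not Brunswik's data or method of presentation:

# A minimal sketch of lens-model statistics on synthetic data: cue validity is the
# correlation of each cue with the environmental criterion, cue utilization is the
# correlation of each cue with the judge's response, and achievement is the
# correlation between judgment and criterion.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_cues = 200, 3

cues = rng.normal(size=(n_trials, n_cues))  # cue values available to the judge
criterion = cues @ np.array([0.6, 0.3, 0.1]) + rng.normal(scale=0.5, size=n_trials)
judgment = cues @ np.array([0.5, 0.1, 0.4]) + rng.normal(scale=0.5, size=n_trials)

cue_validity = [np.corrcoef(cues[:, k], criterion)[0, 1] for k in range(n_cues)]
cue_utilization = [np.corrcoef(cues[:, k], judgment)[0, 1] for k in range(n_cues)]
achievement = np.corrcoef(judgment, criterion)[0, 1]

print("cue validities:   ", np.round(cue_validity, 2))
print("cue utilizations: ", np.round(cue_utilization, 2))
print("achievement (r_a):", round(achievement, 2))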
Donald Norman’s often quoted 1980 article, Errors in Human Performance, expands on
the notion of mistakes vs. slips, a dichotomy which also appears in many modern
analysis (Norman 1980). Norman’s approach to Human Error goes beyond just those
observable behavior types. Norman observes that accidents are difficult to categorize
and that the human operator is working in a system.
Jens Rasmussen’s often quoted 1983 article, Skill, Rules, and Knowledge; Signals,
Signs, and Symbols, and Other Distinctions in Human Performance Models, presents
three categories of human behavior, skill-based, rule-based and knowledge-based,
which make their way into modern error analytical systems (Rasmussen 1983).
David Meister’s 1989 article, The Nature of Human Error, expands on the system’s role
in error (Meister 1989). While not prescribing a specific taxonomy, Meister describes
many pieces of error systems from which a taxonomy for a specific negative
performance can be constructed.
Perhaps the most popular human error writer, James Reason, combines much of the
earlier thinking about both the errors themselves and the systems in which humans work
into the general but complete "Swiss cheese" model, through which errors are
allowed to occur, in his 1990 book, Human Error (Reason 1990).
There are many other names which should be included in this evolution of human error study. There are also
many new names presenting credible expansion of generic human error analysis, as well as people extending the
study into their specific industries, an extremely important lateral development since no single analysis method fits
every system.
Basic Cognition
The implications of cognitive man go beyond the notion that we know and think rather
than exhibit programmed reactions. Our ability to make decisions involves cognitive
principles of perception and reasoning. Bloom's taxonomy of cognitive levels
(knowledge, comprehension, application, analysis, synthesis and evaluation) offers
considerations applicable to specific job demands and the systems designed to help
people meet those demands. Research examining a human's transition, in a specific
task, from evaluative responses to rote reactions has implications wherever forcing
evaluation remains necessary, even when diversions from normal are seldom
encountered.
Working memory
Research has found that we have very limited working memory. Working memory is the
memory directly involved with a micro-analysis in a task. Working memory is also used
to retrieve information from long term memory. System designs may allow working
memory to be overloaded. Performance degradation during overload has been
measured in many studies. Conditions which force overload have also been studied.
Attention
In their book, Applied Attention Theory, Wickens and McCarley present chapters,
among others, on attention control, information sampling, resources and effort, time
sharing, interruptions and task management (Wickens and McCarley 2008). The
findings in the study of attention apply directly to most task systems and
improvements made in such systems can contribute greatly to error reduction.
Schema
Early schema research came from two different disciplines: learning psychology and
personality disorder psychology. Studies in learning have concentrated on how best to
learn. Those studies better define the notion that new learning is attached to some
old knowledge. Individual areas of practiced knowledge are part of the whole set of
schemas, termed schemata.
Schema cannot be ignored; it affects many task types. Schema that is difficult to measure can still be included in
error analysis and system design. After-the-fact references to schema effects will acknowledge its importance to
some systems. Desired schema activators can be programmed into systems.
In contrast with such a bleak look at human fallibility, we humans perform rather well
at work. In the workplace, the human is placed into a system which controls behavior
relevant to a specific role. A work role places the human into a structure which directs
important activities in a micro-world, that is, a relatively small number of relevant
occurrences and, usually, a relatively small number of desired responses. System
designs have evolved to reduce the potential for error in desired responses. Some
systems evolved informally and some by a modern concerted effort.
The system must be designed to reduce errors specific to the task-type/error-type. In addition, a good system
design should account for the performance modifiers of physical or mental personal states and variable
environmental conditions.
James Reason characterized the "trajectory of accident opportunity" with his "Swiss
cheese" model of accident causation (Reason 1990). This model separates the
performance of unsafe acts from psychological precursors and organizational defenses.
In order for an accident to occur, the trajectory of accident opportunity passes through
holes in the cheese slices. The unsafe act is an observable error such as choice of a
wrong procedure, task misprioritization, or a violation of established methods.
Reason's psychological precursors or preconditions include failure to use safety
equipment, personal stress and many other conditions which modify performance.
Reason's defensive systems include, among other things, engineered safety features
and safety equipment requirements, rules and procedures, training, drills, etc. (Reason
1997)
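One informal way to read the model numerically (not part of Reason's own formulation) is to treat each defensive layer as having some probability of a hole and to note that a complete accident trajectory requires every layer to fail at once; the layer names and probabilities below are invented:

# Illustrative reading of the "Swiss cheese" idea: if holes in the layers occur
# independently, the chance of a complete accident trajectory is the product of
# the per-layer hole probabilities. Numbers here are made up for demonstration.
from math import prod

hole_probability = {
    "organizational defenses": 0.05,
    "supervision": 0.10,
    "preconditions / precursors": 0.20,
    "unsafe act": 0.15,
}

p_trajectory = prod(hole_probability.values())
print(f"Probability all layers line up: {p_trajectory:.5f}")  # 0.00015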
The Department of Defense's HFACS analysis system is a descendant of Reason's 1990 Swiss cheese model
(Department of Defense 2005). Four levels where failures have occurred are analyzed: organizational influences,
unsafe supervision, preconditions for unsafe acts and the unsafe acts themselves. This adaptation of Reason's
architecture was developed by Wiegmann and Shappell and published in their book, A Human Error Approach to
Aviation Accident Analysis: The Human Factors Analysis and Classification System (Wiegmann and Shappell 2003).
Figure 1. The "Swiss Cheese" model (adapted from Reason, 1990), from the Department of Defense HFACS (Department of Defense 2005).
Reason’s latest organization of error analysis, simplifies the cause assignment and
investigation into three levels; the organization, error provoking conditions and the
unsafe acts. (Reason 1997) Reason’s newest analytical structure does more than
reorganize the four tier model which evolved into HFACS. Reason uses the three tier
model for both analyzing causes and initiating investigation into errors.
As we organize the elements that allow human error, we will mix many authors'
arrangements of error factors and place these elements into a refined three-level
structure: the system level, error potential modifiers, and the act task-type/error-type.
This arrangement does not define a taxonomy, but instead defines a structure on which
to build taxonomies specific to an organization. Within the structure, mathematical
relationships may be added to help with system design.
For the purpose of system identification, we will create a new acronym for this three-level
system, the Human Environment Diagnostic/Design System (HEDS). Again, this system
is not a new format for error evaluation, just a customizable mix and modification of
previous systems. Multiple factors may be viewed in parallel for any given incident.
Intervention strategies are directly derived from this analysis.
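A sketch of how a single error within an incident might be recorded against the three HEDS levels is shown below; the field names and category values are placeholders an organization would replace with its own taxonomy:

# Sketch of a three-level (HEDS-style) record for one error in one incident.
# Field names and category values are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import List

@dataclass
class HedsErrorRecord:
    case_id: str
    system_level: List[str] = field(default_factory=list)         # e.g. procedures, equipment, staffing
    potential_modifiers: List[str] = field(default_factory=list)  # e.g. time pressure, fatigue
    task_error_type: str = ""                                      # the act: task-type/error-type pair
    seminal: bool = False                                          # analyst-flagged seminal event

record = HedsErrorRecord(
    case_id="2008-0137",
    system_level=["ambiguous procedure", "no cross-check required"],
    potential_modifiers=["time pressure"],
    task_error_type="checklist task / step omitted",
)
print(record)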
A systems approach to task accomplishment does differ from the HFACS/Swiss cheese model in one important
way. Whereas "defenses to latent conditions" appears to be the theme of the HFACS/Swiss cheese model, a
more positive "locating potential for improved performance" may be integrated into a systems approach.
Decision making will also be influenced by many system-level factors as well as what
we will term error potential modifiers. Also, the desired short-term goal of the human's
behavior may not be as well specified as in a task structure of established rules and
procedures. Many judgments may be part of a series of judgments required to obtain a
goal.
The effect of the practitioner's desired outcome adds another complicating factor to the strategy of ecological
improvement. The practitioner may not know when to use a defensive mode of decision making, that is, when to
minimize the probability of adverse effects as opposed to maximizing the probability of positive effects.
Simple tasks may require fewer systems. A carpenter may be working in simple
systems. A carpenter may be required to use shields on a cut-off saw. A carpenter may
not be allowed to work unless another employee is present. A carpenter’s task may
involve many short duration tasks, some of a repetitive nature and some with variety.
The safety systems designer must design systems to improve task performance but must always assume
imperfect performance, whether caused by common task errors or magnified by error potential modifiers.
When creating categories for a particular organization, the safety system designer may
need to place similar-sounding factors at more than one of the three simplified levels.
Technology/equipment specification normally belongs at the system level, yet a valid
reference to how an individual interfaces with the technological environment (or its
failure) can be made in this error potential modifier section.
Other factors which may be placed with specificity in this section include time pressure, importance pressure,
attention predisposition, procedural memory activators, attention overload, and relevant schema base/activation
variables which are not part of the system design.
Because DOD HFACS was designed for the analysis of aviation errors, many
organizations will find other categories useful. Errors of omission may be an oft-used
category for tasks with multiple analytical stages. Attention error breakdowns could be
useful in automated systems. Ambiguity-caused errors are also found where equipment
malfunctions are disbelieved or crew member authority is poorly defined.
Basic error types, as defined by Reason, are slips/lapses, rule based mistakes and knowledge based mistakes.
Reason incorporates these types in his “unsafe acts” slice of the Swiss cheese model with subdivisions of
unintended action and intended action. Unintended actions include slips and lapses. Intended actions include
mistakes and violations. Different systems based on the Swiss cheese/HFACS model of analysis subdivide
violations into one or more categories depending on the cause and organizational acceptance of the violation.
Table 1 illustrates error types in the unsafe acts section of DOD HFACS.
Table 1. Error types in the unsafe acts section of DOD HFACS
Skill-based errors: Inadvertent Operation; Checklist Error; Procedural Error; Overcontrol/Undercontrol; Breakdown in Visual Scan; Inadequate Anti-G Straining Maneuver
Violations: Violation - Based on Risk Assessment; Violation - Routine/Widespread; Violation - Lack of Discipline
Task types, defined in a systems description, should be connected to error types in a good analysis system.
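The sketch below shows one possible (hypothetical) way to couple a task type from the system description with one of Reason's basic error types when recording an unsafe act; it is an illustration, not a prescribed taxonomy:

# Sketch: coupling a task type with one of Reason's basic error types when
# recording an unsafe act. Task names and the case number are hypothetical.
from enum import Enum

class ErrorType(Enum):
    SLIP_LAPSE = "slip/lapse (unintended action)"
    RULE_BASED_MISTAKE = "rule-based mistake (intended action)"
    KNOWLEDGE_BASED_MISTAKE = "knowledge-based mistake (intended action)"
    VIOLATION = "violation (intended action)"

unsafe_act = {
    "case_id": "2008-0137",
    "task_type": "pre-departure checklist",  # from the system's task description
    "error_type": ErrorType.SLIP_LAPSE,       # the observed unsafe act
}
print(unsafe_act["task_type"], "->", unsafe_act["error_type"].value)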
Error Analysis
Due to differences in system-level structures, no existing analysis system gives a
perfect fit for an organization's individual error situations. The individual analyst must
outline the peculiarities of the system and tasks in a particular job and either adapt
currently available systems or invent new analyses.
Human error analytical techniques may be divided into predictive models and post-
incident investigation. Predictive models include probabilistic risk assessment models
and cognitive engineering models. Predictive models will not be covered in this article,
but it is important to note that predictive models will improve as post-incident
investigation models create data relevant to them.
Categorizing
Classic categories often do not contribute directly to error analysis but may be useful
when error types are examined by category. The classical categories in the medical
field (diagnosis, drug, and medical procedures) will probably reveal different error
systems, modifiers and types when data is retrieved by individual category.
Besides classical categories, other descriptive categories can be built into report and
investigation forms. Certain questions about the circumstances surrounding an incident
become natural categories to examine for differences once report quantities become
substantial.
Root Cause
Root cause analysis has its roots in the analysis of machine failures. In the 1970s and
1980s, most documented applications of root cause analysis were made in product
quality assurance and equipment failures. By the early 1990s, root cause analysis was
extended to cover issues beyond equipment. In his 1991 article, In Search of the Root
Cause, John Dew professed, "Systemic issues concern how the management of the
organization plans, organizes, controls, and provides quality assurance and safety in
five key areas: personnel, procedures, equipment, material, and the
environment." (Dew 1991). But the use of root cause analysis in human-affected systems
may be limited to the human's duties in the system as opposed to the causes of
suboptimal human performance. The Joint Commission on the Accreditation of
Healthcare Organizations (JCAHO) avoids the issue and notes, "A root cause analysis
focuses primarily on systems and processes, not on individual performance." (JCAHO
2009)
Some credible applications of root cause(s) in support of human error analysis can be
made, mostly to discover seminal events which, when similar seminal events occur
with high frequency, will pinpoint an area of needed system improvement. A seminal
event is an event which, an analyst decides, begins a chain of events which leads to an
error. While root cause analysis may have a place in human error investigation, it
contributes little to the examination of simultaneous holes in systems and defenses
against probable modifiers. The usability of root cause analysis is further diminished when
multiple parallel conditions combine to enable an error. Despite these limitations, a
good investigation system will allow a submitter or investigator to label a contributing
element as "seminal", thus allowing future inspection of common seminal events as the
data set grows.
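A sketch of how flagged seminal events could be tallied as the report data set grows is shown below; the report records are fabricated for illustration:

# Sketch: if report forms let the submitter flag one contributing element as
# seminal, recurring seminal events can be counted as data accumulates.
from collections import Counter

reports = [
    {"case_id": "0101", "seminal_event": "handoff without written orders"},
    {"case_id": "0102", "seminal_event": "alarm silenced at shift start"},
    {"case_id": "0103", "seminal_event": "handoff without written orders"},
]

seminal_counts = Counter(r["seminal_event"] for r in reports)
for event, count in seminal_counts.most_common():
    print(f"{count:>3}  {event}")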
Multi-tier Models
Multi-tier models like HFACS and the simplified three-tier model have been
described in previous sections. These types of analyses are built on the human error
foundations of knowledge. Multi-tier analysis which tightly connects the levels of error
will provide useful data for analytical methods currently available or to be developed in
the future. Vertical connection refers to grouping the system/environment, modifiers
and error types for each error noted in an incident, and keeping them separate from
the factors associated with other errors found in the same incident. Table 2 contrasts
poor factor assignment with vertically connected factor assignment. Vertically
connected analysis can better pinpoint acceptable system improvements and
intervention strategies.
Table 2. Non-coupled versus vertically coupled factor assignment (columns: Level, Type, Comment).
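The sketch below illustrates the contrast behind Table 2 with invented factor names: pooling factors across an incident loses the coupling, while a vertically connected record keeps each error's system factors, modifiers and error type together:

# Sketch of vertical coupling. Factor names are invented for illustration.

# Non-coupled: levels pooled across the whole incident, losing which factor
# belonged to which error.
non_coupled = {
    "system": ["ambiguous procedure", "display clutter"],
    "modifiers": ["time pressure", "fatigue"],
    "error_types": ["step omitted", "wrong mode selected"],
}

# Vertically coupled: one column per error, so intervention can target the
# specific system/modifier/act combination.
coupled = [
    {"system": "ambiguous procedure", "modifier": "time pressure", "error_type": "step omitted"},
    {"system": "display clutter", "modifier": "fatigue", "error_type": "wrong mode selected"},
]

for i, column in enumerate(coupled, start=1):
    print(f"error {i}: {column}")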
Two mainstream cognitive assessment models, the lens model and the ACT-R model,
enable quantitative assessment of error contribution to tasks. Like the traditional
industrial reliability models, they can be applied to local tasks within a system.
Cross-Mapping
The results of any human error analysis method may be mapped to other methods if there is relational identification
to a case number in all descriptions: categorical, hierarchical, seminal, system reliability or cognitive task models.
When using hierarchical models, inclusion of comments associated with a particular vertical column will aid
transfer of finished analysis to improved systems where new factors are included.
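A sketch of such cross-mapping is shown below, merging hypothetical records from hierarchical, categorical and seminal-event analyses by a shared case number:

# Sketch: records produced by different analysis methods can be joined as long
# as every record carries the same case identifier. Structures are illustrative.
hierarchical = {"0137": {"level": "preconditions", "type": "attention overload"}}
categorical = {"0137": {"category": "medication administration"}}
seminal = {"0137": {"seminal_event": "verbal order not read back"}}

def cross_map(case_id: str) -> dict:
    """Merge every analysis view of one case into a single record."""
    merged = {"case_id": case_id}
    for view in (hierarchical, categorical, seminal):
        merged.update(view.get(case_id, {}))
    return merged

print(cross_map("0137"))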
Intervention Strategies
Safety system design (or engineering, in systems where mathematical relationships are
developed) allows for many solutions.
Training is often mentioned as a cure for problems, but training is suboptimal or
ineffective as a band-aid for systems which are in need of improvement. The following
is a short list of possible improvements:
System enhancements which artificially form a closed-loop feedback system where the
practitioner normally uses memory to check the outcome of previous actions.
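A sketch of that kind of closed-loop enhancement appears below; the interface and action names are hypothetical, but the idea is that the system, not the practitioner's memory, tracks which action outcomes remain unverified:

# Sketch of a closed-loop enhancement: the system tracks pending actions and
# surfaces them until each outcome is confirmed, instead of relying on memory.
class ActionTracker:
    def __init__(self):
        self.pending = {}

    def record_action(self, action_id: str, description: str) -> None:
        """Open the loop: the action was taken, its outcome not yet verified."""
        self.pending[action_id] = description

    def confirm_outcome(self, action_id: str) -> None:
        """Close the loop once the outcome has been checked."""
        self.pending.pop(action_id, None)

    def remind(self) -> list:
        """Anything still pending is surfaced to the practitioner."""
        return list(self.pending.values())

tracker = ActionTracker()
tracker.record_action("A1", "reduced infusion rate on pump 3")
print(tracker.remind())      # ['reduced infusion rate on pump 3']
tracker.confirm_outcome("A1")
print(tracker.remind())      # []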
Safety analysis, system design and error reduction go hand in hand with a well-designed analysis system.
Conversely, without a good analysis system, error reduction is much less efficient.
References:
Brunswik, E. (1955) Representative Design and Probabilistic Theory in a Functional
Psychology, Psychological Review, Vol 62(3), 193-217.
Dew, J. R. (1991) In Search of the Root Cause, Quality Progress 24(3) 97-102
Hammond, K.R., Summers, D.A. (1972) Cognitive Control, Psychological Review, Vol
79(1), 58-67
Norman, D. A. (1980), Errors in Human Performance, University of California, San Diego,
Center for Human Information Processing Report No. 8004
Rasmussen, J. (1983), Skills, Rules, and Knowledge; Signals, Signs, and Symbols, and
Other Distinctions in Human Performance Models, IEEE Transactions on Systems, Man,
and Cybernetics, Vol. 13(3), 257-266
U.S. Department of Labor, Bureau of Labor Statistics (n.d.). Fatal occupational injuries
by industry and event or exposure. Retrieved October 12, 2008,
from http://www.bls.gov/iif/oshwc/cfoi/cftb0223.pdf
von Thaden T. L., Gibbons, A. M. (2008), The Safety Culture Indicator Scale
Measurement System (SCISMS). University of Illinois Human Factors Division Technical
Report HFD-08-03/FAA-08-02
von Thaden, T. L., Kessel, J., Ruengvisesh, D. (2008), Measuring Indicators of Safety
Culture in a Major European Airline's Flight Operations Department. The Proceedings of
the 8th International Symposium of the Australian Aviation Psychology Association.
Novotel Brighton Beach, Sydney.
Wickens, C. D., McCarley, J. S. (2008), Applied Attention Theory, Boca Raton: CRC Press.
Wiegmann, D. A., & Shappell, S. A. (2003), A human error approach to aviation accident
analysis: The human factors analysis and classification system. Burlington, VT: Ashgate.
Wiegmann, D. A., Zhang, H., von Thaden, T. L., Sharma, G., Mitchell, A. A. (2002) A
Synthesis of Safety Culture and Safety Climate Research. University of Illinois Aviation
Research Lab Technical Report
Wiegmann, D. A., Shappell, S. A. (2001) Applying the human factors analysis and
classification system (HFACS) to the analysis of commercial aviation accident
data. Proceedings of the 11th International Symposium on Aviation Psychology.