
A Compact Introduction to Human Error

Stu Moment
University of Illinois Human Factors Division
©2008, Stu Moment
Stu Moment web page

APA reference: Moment, S. L. (2008). A Compact Introduction to Human Error, University of Illinois
Human Factors Division Proceedings. Retrieved [month] [day], [year], from
http://www.humanfactors.illinois.edu/research/HumanElementArticles/CompactIntroToHumanError/

Copyright notice: This publication may be used, reproduced, printed and redistributed for personal,
academic, research or non-commercial purposes as long as 1) it is not modified, 2) credit is
attributed to the Human Factors Division at the University of Illinois at Urbana-Champaign and the
author, and 3) the copyright notice and this notice are reproduced on any copies. If you have any
questions regarding distribution of this paper contact the author.

Introduction
Human Error is a relatively new concentration within Human Factors. Until recently, its
causal constructs, analysis and intervention methods have been more art than
computational science, yet advances in the science occur as qualified factors are
identified and organization-wide analyses are created.

The application of analytical methods in human error evaluation is in its infancy. Tools
are being utilized and, more importantly, data are being collected. Explanations of the
underlying causes of human error at the individual level, developed over the last 60
years, show stability and growth on a relatively solid foundation of knowledge.
Philosophies of human performance in a job environment have been refined to a point
where most contributions to an error may be mapped, giving today's safety
systems designer and analyst useful structures and methods.

Contents:
 Definition of Human Error
 Society's Focus on Human Error
 Real Weighted Occurrences of Human Error
 Analyzing Organizational Commitment (at all levels); Safety Culture
 Foundations of the Study of Human Error
 Contributing Psychology and Human Factors Sub-Disciplines
 The Structure of Human Error in the Work Place
 Latent Conditions in Organizational Systems – A Defensive Approach
 Ecological Improvement – An Offensive Approach
 System Level Potential for Human Error
 Error Potential Modifiers
 The Errors
 Error Analysis
 Intervention Strategies
 References

Definition of Human Error


The definition of human error appears simple: errors caused by humans rather than
machines. But such a definition is too shallow for fruitful analysis and corrective
prescription. Fifty years ago human error was often described only at the level of the
human directly involved with the error. One example of this assignment is Coast Guard
investigations of ferry operators, which assigned the cause of most errors to
inattentiveness, poor judgment or negligence by crew members. Modern definitions
look for failures at any point in a system within an organization: micro-task
decisions and task implementation, personal monitoring systems, team member
assignments, and the coordination of middle organizational decisions about the
people and equipment chosen for task accomplishment. The systems themselves are
scrutinized for their tolerance to human frailty, and these systems are designed by
humans. Human error can occur at any stage of a task or strategy where a human is
involved.

How should we look at error? In Sheridan's introduction to a famed Rasmussen article, he states, "Error is simply a
difference between an actual state and a desired state" (Sheridan 2003). But the important errors are those which
adversely affect direct personal well-being, the environment, or a system which society considers important. In all
but the most basic systems, a performer will never achieve an optimal state. Yet adverse states will occur when a
threshold of performance is not met or when multiple performance thresholds combine to produce an adverse condition.

Society's Focus on Human Error


Many references to human error are connected with high-profile catastrophes. Nuclear
power plant accidents, oil rig fires and chemical plant disasters are very apparent to
the public, and their investigation calls for human error analysis.

Public transportation accidents are a more constant source of errors in the public
spotlight. The recorded investigation of railroad accidents in the United States
began in 1911. Aircraft accident investigations became formal in the 1920s. In 1939,
New Jersey created the "Traffic Bureau," which was charged with "compiling accident
statistics, conducting studies of congestion, accident causes, and the effectiveness of
current safety measures." The Occupational Safety and Health Act of 1970 mandated
reporting of at-work accidents. After the 1999 report To Err is Human: Building a Safer
Health System, the federal government specified programs to improve error analysis
in our health systems. States are starting similar programs.

In all of these examples, government is the investigative and analyzing agency. Internal company investigations
have historically served to protect the company in the case of a lawsuit. But owing to competitive safety
records, internal economics and a reduced potential of claimed negligence, both from the standpoint of the accident
itself and from the standpoint of practiced safety emphasis, more companies are analyzing incidents internally.
The "Company Safety Officer" has become a standard position in many larger institutions.

Real Weighted Occurrences of Human Error


Many references to human error are connected with high-profile catastrophes, but the real distribution of
accidents is reflected in OSHA documents. Agriculture, carpentry and non-commercial transportation, including
driving and boating, lead the list when data are grouped by industry. General at-work accidents, not associated
with a unit's main purpose, account for a surprisingly large number of injuries when data are not grouped by
industry (U.S. Department of Labor 2007).

The notion of human error should extend beyond tangible incidents. Economic planning and military command
errors are as valid a target for analysis as a medical diagnosis error.

Analyzing Organizational Commitment (at all levels); Safety Culture

“Safety culture has previously been defined as the enduring value and prioritization of
worker and public safety by each member of each group and in every level of an
organization. It refers to the extent to which individuals and groups will commit to
personal responsibility for safety; act to preserve, enhance and communicate safety
information; strive to actively learn, adapt and modify (both individual and
organizational) behavior based on lessons learned from mistakes; and be held
accountable or strive to be honored in association with these values.” (von Thaden
and Gibbons 2008)

The ability to communicate safety emphasis can easily be improved in any organization,
but many classic studies in organizational psychology illustrate organizational behavior
phenomena which become barriers to commitment and risk assessment. Ambiguity in job
definitions and in safety personnel's power is aggravated by a lack of knowledge, a lack
of resources, or a perceived high cost of system fixes.

At the individual practitioner's level, worry about legal exposure or job repercussions will
limit the flow of information on all aspects of organizational safety except in the case of
a mandatory accident report.

von Thaden, Kessel and Ruengvisesh divide their safety culture questionnaires and their
analysis of safety culture into the general groups of formal safety program, informal
aspects of safety, organizational commitment, operations personnel and overall safety
(von Thaden, Kessel and Ruengvisesh 2008). A subcategory within "formal safety
program" is "reporting system." The questionnaire solicits attitudes about fear of
negative repercussions, ease of use, and the respondent's opinion of other
employees' use of incident reports.

The acceptance and use of a good reporting system by all levels of employees may not be an indicator of a sound
safety climate in itself, but it is a necessary prerequisite to establishing data collection. Data collection is the weak
point of human error analysis.

Foundations of the Study of Human Error


The psychological constructs of limited human performance benefited from many
studies in the 1950s and '60s. Human error studies prominent in the 1980s cover
basic notions of cognitive failures and add notions of error systems and error
taxonomies.

Egon Brunswik set a foundation for the analysis of human decision making for tasks
requiring evaluation. Brunswik's lens model relates human judgments to cue
availability, cue utilization and cue validity in a probabilistic fashion (Brunswik 1955).
Brunswik's work, as adapted by cognitive engineering researchers, became the core of
many models of human performance.
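
The lens-model idea can be made concrete with a short sketch. The data, weights and variable
names below are invented for illustration and are not drawn from Brunswik's studies; the sketch
simply regresses both a synthetic criterion and a synthetic judge on the same cues and compares
them, one common way the adapted lens model is operationalized.

```python
# Minimal lens-model sketch with invented (hypothetical) data.
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_cues = 200, 3
cues = rng.normal(size=(n_cases, n_cues))            # cue values available to the judge

true_weights = np.array([0.6, 0.3, 0.1])             # assumed ecological validities
criterion = cues @ true_weights + rng.normal(scale=0.5, size=n_cases)

judge_weights = np.array([0.2, 0.5, 0.3])            # how the judge actually utilizes the cues
judgment = cues @ judge_weights + rng.normal(scale=0.5, size=n_cases)

def linear_fit(X, y):
    """Least-squares weights and the fitted predictions."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, X @ w

_, criterion_hat = linear_fit(cues, criterion)        # environment side of the lens
_, judgment_hat = linear_fit(cues, judgment)          # judge side of the lens

achievement = np.corrcoef(judgment, criterion)[0, 1]        # r_a: judgment vs. reality
knowledge = np.corrcoef(judgment_hat, criterion_hat)[0, 1]  # G: match of the two linear models

print(f"achievement r_a = {achievement:.2f}, linear knowledge G = {knowledge:.2f}")
```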

Kenneth Hammond applied Brunswik's notions in human performance studies.
Hammond's 1972 article, Cognitive Control, gives us many considerations to include in
the analysis of judgment/decision-making tasks (Hammond 1972).

Donald Norman's often quoted 1980 article, Errors in Human Performance, expands on
the notion of mistakes vs. slips, a dichotomy which also appears in many modern
analyses (Norman 1980). Norman's approach to human error goes beyond those
observable behavior types; he observes that accidents are difficult to categorize
and that the human operator is working in a system.

Jens Rasmussen's often quoted 1983 article, Skills, Rules, and Knowledge; Signals,
Signs, and Symbols, and Other Distinctions in Human Performance Models, presents
three categories of human behavior: skill-based, rule-based and knowledge-based,
which make their way into modern error analytical systems (Rasmussen 1983).

David Meister’s 1989 article, The Nature of Human Error, expands on the system’s role
in error (Meister 1989). While not prescribing a specific taxonomy, Meister describes
many pieces of error systems from which a taxonomy for a specific negative
performance can be constructed.

Perhaps the most popular human error writer, James Reason, combines many of the
earlier notions of both the errors themselves and the systems in which humans work
into the general but complete "Swiss cheese" model through which errors are
allowed to occur, in his 1990 book, Human Error (Reason 1990).

There are many other names which should be included in this evolution of human error study. There are also
many new names presenting credible expansions of generic human error analysis, as well as people extending the
study into their specific industries, an extremely important lateral development since no single analysis method fits
many systems.

Contributing Psychology and Human Factors Sub-Disciplines

A quick mention of a few contributing sub-disciplines is important, since productive
error analysis depends on this root-level knowledge as applied within identified
human/organization/equipment system types, as well as in the design of improved
systems. Organizations need to staff or consult people with root-level knowledge in
areas appropriate to their systems, often independent of the group chosen to develop
analysis systems. These contributing sub-disciplines are not mutually exclusive; some
disciplines gain from or build on findings in others.

Basic Cognition
The implications of cognitive man go beyond the notion that we know and think as
opposed to exhibiting programmed reactions. Our ability to make decisions involves
cognitive principles of perception and reasoning. Bloom's cognitive-level taxonomy of
knowledge, comprehension, application, analysis, synthesis and evaluation gives
considerations applicable to specific job demands and the systems designed to help
people meet those demands. Research examining a human's transition, in a specific
task, from evaluative responses to rote reactions has implications wherever forcing
evaluation is always necessary, even when diversions from normal are seldom
encountered.

Working memory
Research has found that we have very limited working memory. Working memory is the
memory directly involved with a micro-analysis in a task. Working memory is also used
to retrieve information from long term memory. System designs may allow working
memory to be overloaded. Performance degradation during overload has been
measured in many studies. Conditions which force overload have also been studied.

Attention
In their book, Applied Attention Theory, Wickens and McCarley present chapters,
among others, on attention control, information sampling, resources and effort, time
sharing, interruptions and task management (Wickens and McCarley 2008). The
findings in the study of attention apply directly to most task systems and
improvements made in such systems can contribute greatly to error reduction.

Schema (development and activation)


Schema is a human's collection of information and practiced evocation of knowledge,
behavior or required action on a particular topic. The schema which surfaces at a
particular time depends on the environment and the actions which bring that schema
out.

Early schema research came from two different disciplines: learning psychology and
personality disorder psychology. Studies in learning have concentrated on how best to
learn. Those studies better define the notion that new learning is attached to some
old knowledge. Individual areas of practiced knowledge are part of the whole set of
schemas, termed schemata.

Studies in personality disorders have successfully measured schema in areas some
academicians term personality factors. Personality factors are formed in much the
same way as knowledge schema, that is, by attaching new information to information
currently in mind and associated, at the time, with a certain state or assessment of the
environment. Like knowledge schema, different formations of a topic-related schema
are brought out by different activators. For the sake of incident analysis, we will include
personality factors as schema assessments.

Schema cannot be ignored; it affects many task types. Schema that is difficult to measure can still be included in
error analysis and system design. After-the-fact references to schema effects will acknowledge its importance to
some systems. Desired schema activators can be programmed into systems.

The Structure of Human Error in the Work Place


Some writers introduce the "nature of human error" in terms of unwanted occurrences
just waiting to happen. The fallible human does little faultless activity while getting
ready for work, eating, communicating with a spouse, driving to work or working.
Errors may range from suboptimal activities, such as driving a route which adds miles
or time, to activity with the potential of a negative consequence, such as forgetting to
relocate flammable liquids to a safe place after unloading.

In contrast with such a bleak look at human fallibility, we humans perform rather well
at work. In the work place, the human is placed into a system which controls behavior
relevant to a specific role. A work role places the human into a structure which directs
important activities in a micro-world, that is, a relatively small number of relevant
occurrences and, usually, a relatively small number of desired responses. System
designs have evolved to reduce the potential for error in desired responses. Some
systems evolved informally and some by a modern concerted effort.

The system must be designed to reduce errors specific to the task-type/error-type. In addition, a good system
design should account for the performance modifiers of physical or mental personal states and variable
environmental conditions.

Latent Conditions in Organizational Systems – A Defensive Approach

The latent conditions approach to human error potential may best fit in task
environments with established rules or procedures and few evaluative portions of the
task.

Whereas "fallibility" is a term usually attached to the imperfect human, "latent
conditions" describe potential imperfections in a system which allow errors to occur.
However one chooses to think of fallibility and latency, the importance behind the
concepts is to block possible paths of error.

James Reason characterized the "trajectory of accident opportunity" with his "Swiss
cheese" model of accident causation (Reason 1990). This model separates the
performance of unsafe acts from psychological precursors and organizational defenses.
In order for an accident to occur, the trajectory of accident opportunity passes through
holes in the cheese slices. The unsafe act is an observable error such as the choice of a
wrong procedure, task misprioritization, or a violation of established methods.
Reason's psychological precursors or preconditions include failure to use safety
equipment, personal stress and many other conditions which modify performance.
Reason's defensive systems include, among other things, engineered safety features
and safety equipment requirements, rules and procedures, training, drills, etc. (Reason
1997)

The Department of Defense's HFACS analysis system is a descendant of Reason's 1990 Swiss cheese model
(Department of Defense 2005). Four levels where failures have occurred are analyzed: organizational influences,
unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves. This adaptation of Reason's
architecture was developed by Wiegmann and Shappell and published in their book, A Human Error Approach to
Aviation Accident Analysis: The Human Factors Analysis and Classification System (Wiegmann and Shappell 2003).

Figure 1. The "Swiss cheese" model (adapted from Reason, 1990), from the Department of Defense HFACS
document (Department of Defense 2005).

Reason's latest organization of error analysis simplifies cause assignment and
investigation into three levels: the organization, error-provoking conditions and the
unsafe acts (Reason 1997). Reason's newest analytical structure does more than
reorganize the four-tier model which evolved into HFACS; he uses the three-tier
model both for analyzing causes and for initiating investigation into errors.

As we organize the elements that allow human error, we will mix many authors'
arrangements of error factors and place these elements into a refined three-level
structure: the system level, error potential modifiers, and the act task-type/error-type.
This arrangement does not define a taxonomy, but instead defines a structure on which
to build taxonomies specific to an organization. Within the structure, mathematical
relationships may be added to help with system design.

For the purpose of system identification, we will create a new acronym for this three-level
system, the Human Environment Diagnostic/Design System (HEDS). Again, this system
is not a new format for error evaluation, just a customizable mix and modification of
previous systems. Multiple factors may be viewed in parallel for any given incident.
Intervention strategies are directly derived from this analysis.
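
To make the three-level structure easier to picture, the sketch below shows one possible record
layout for an incident. The class and field names are the editor's assumptions for illustration;
HEDS itself does not prescribe a data format.

```python
# Illustrative record layout for a three-level (system / modifier / act) analysis.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ErrorPath:
    """One unsafe act together with the factors that enabled it."""
    error_type: str                                           # act level: task-type/error-type
    modifiers: List[str] = field(default_factory=list)        # error potential modifiers
    system_factors: List[str] = field(default_factory=list)   # system-level conditions

@dataclass
class IncidentReport:
    case_id: str
    error_paths: List[ErrorPath] = field(default_factory=list)  # factors viewed in parallel, per error

# Hypothetical example record.
report = IncidentReport(
    case_id="C-101",
    error_paths=[
        ErrorPath(
            error_type="omitted part of procedure",
            modifiers=["work overload"],
            system_factors=["monitoring - staffing quantity"],
        ),
    ],
)
print(len(report.error_paths), "error path(s) recorded for case", report.case_id)
```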

A systems approach to task accomplishment does differ from the HFACS/Swiss cheese model in one important
way. Whereas "defenses to latent conditions" appears to be the theme of the HFACS/Swiss cheese model, a
more positive "locating potential for improved performance" may be integrated into a systems approach.

Ecological Improvement – An Offensive Approach


The cognitive engineering approach may best serve task environments with many
evaluative tasks.

Cognitive engineering models vary, but include the common components of the
environment, human perceptions and the achievement desired of the human.
Judgments are influenced by cues obtained from the environment. Cue availability,
as well as the practitioner's assessment of their ecological validity, affects judgment.
Technological aids which give a practitioner more information, and do so in a
non-confusing way, are offensive tools for error reduction.

Decision making will also be influenced by many system-level factors as well as by what
we will term error potential modifiers. Also, the desired short-term goal of the human's
behavior may not be as well specified as in a task structure of established rules and
procedures. Many judgments may be part of a series of judgments required to obtain a
goal.

The effect of the practitioner's desired outcome adds another complicating factor to the strategy of ecological
improvement. The practitioner may not know when to use a defensive mode of decision making, that is, when to
minimize the probability of adverse effects as opposed to maximizing the probability of positive effects.

System Level Potential for Human Error


Systems vary with the complexity of the process, the potential for error and the cost of
error. System-level implementations may take a variety of forms: technology/equipment
specifications, monitoring systems, specified procedures, checklists, team member
requirements, training and re-training requirements, etc. These systems should be
implemented with a knowledge of, and connection to, task types and task duration.

Simple tasks may require fewer systems. A carpenter may be working in simple
systems. A carpenter may be required to use shields on a cut-off saw. A carpenter may
not be allowed to work unless another employee is present. A carpenter’s task may
involve many short duration tasks, some of a repetitive nature and some with variety.

A medical practitioner may be involved in complex systems with sophisticated
equipment requirements, strict crew-member requirements and constant monitoring
requirements. The duration of a medical task may have considerable variance.
Monitoring the results of a procedure may cover a large time frame. Parts of a
procedure may appear rote. Other parts may require considerable evaluation and
knowledge search.

The safety systems designer must design systems to improve task performance but must always assume
imperfect performance, whether caused by common task errors or magnified by error potential modifiers.

Error Potential Modifiers


Reason's psychological precursors were presented in the previous section. Wiegmann
and Shappell referred to this layer as "preconditions for unsafe acts" and added to this
category personal subfactors such as personal readiness and the environmental
factors of both the physical and technological environment. Wiegmann and Shappell divide
"condition of operators" into adverse mental, physiological and physical/mental states
(Wiegmann and Shappell 2003).

When creating categories for a particular organization, the safety system designer may
need to place similar-sounding factors at more than one of the three simplified levels.
Technology/equipment specification normally belongs at the system level, yet a valid
reference to how an individual interfaces with the technological environment (or its
failure) can be made in this error potential modifier section.

Other factors which may be placed with specificity in this section include time pressure, importance pressure,
attention predisposition, procedural memory activators, attention overload and relevant schema base/activation
variables which are not part of the system design.

Unsafe Acts: Error Types/Task Types


Basic error types identified by Rasmussen are skill-based, rule-based and knowledge-based.
DOD HFACS uses a similar approach to the classification of error types; its
error types are skill-based, decision-making or perceptual.

Because DOD HFACS was designed for the analysis of aviation errors, many
organizations will find other categories useful. Errors of omission may be an oft-used
category for tasks with multiple analytical stages. Attention breakdowns could be a
useful category in automated systems. Ambiguity-caused errors are also found where
equipment malfunctions are disbelieved or crew member authority is poorly defined.

Basic error types, as defined by Reason, are slips/lapses, rule-based mistakes and knowledge-based mistakes.
Reason incorporates these types in his "unsafe acts" slice of the Swiss cheese model, with subdivisions of
unintended action and intended action. Unintended actions include slips and lapses. Intended actions include
mistakes and violations. Different systems based on the Swiss cheese/HFACS model of analysis subdivide
violations into one or more categories depending on the cause and the organizational acceptance of the violation.
Table 1 illustrates error types in the unsafe acts section of DOD HFACS.

Skill-based
Inadvertent Operation
Checklist Error
Procedural Error
Overcontrol/Undercontrol
Breakdown in Visual Scan
Inadequate Anti-G Straining Maneuver

Judgment and Decision-Making
Risk Assessment - During Operation
Task Misprioritization
Necessary Action - Rushed
Necessary Action - Delayed
Caution/Warning - Ignored
Decision-Making During Operation

Perception
(DOD HFACS does not include subcategories under perception. There are few
subcategories of perception in rule or procedure task structures. This category will
need to be expanded where Ecological Improvement strategies apply.)

Violations
Violation - Based on Risk Assessment
Violation - Routine/Widespread
Violation - Lack of Discipline

Table 1. DOD HFACS error types under unsafe acts.

Task types, defined in a systems description, should be connected to error types in a good analysis system.
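
As a hypothetical illustration of that connection, an analysis system could carry a small map from
defined task types to the error types expected for them and flag unexpected combinations for review.
The task names below are invented; the error types are drawn from Table 1.

```python
# Hypothetical task-type to error-type mapping; not part of DOD HFACS itself.
PLAUSIBLE_ERRORS = {
    "checklist-driven setup": {"Checklist Error", "Procedural Error"},
    "continuous monitoring":  {"Breakdown in Visual Scan", "Caution/Warning - Ignored"},
    "time-critical response": {"Necessary Action - Rushed", "Task Misprioritization"},
}

def flag_unusual(task_type: str, error_type: str) -> bool:
    """Return True when a reported error type is not expected for the task type."""
    return error_type not in PLAUSIBLE_ERRORS.get(task_type, set())

print(flag_unusual("continuous monitoring", "Checklist Error"))  # True: worth a second look
```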

Error Analysis
Due to differences in system-level structures, no existing analysis system gives a
perfect fit for an organization's individual error situations. The individual analyst must
outline the peculiarities of the system and tasks in a particular job and either adapt
currently available systems or invent new analyses.

Human error analytical techniques may be divided into predictive models and post-incident
investigation. Predictive models include probabilistic risk assessment models
and cognitive engineering models. Predictive models will not be covered in this article,
but it is important to note that predictive models will improve as post-incident
investigation models create data relevant to them.

Categorizing
Classic categories often do not contribute directly to error analysis but may be useful
when error types are examined by category. The classical categories in the medical
field (diagnosis, drug and medical procedure) will probably reveal different error
systems, modifiers and types when data are retrieved by individual category.

Besides classical categories, other descriptive categories can be built into report and
investigation forms. Certain questions about the circumstances surrounding an incident
become natural categories on which to examine for differences once report quantities
become substantial.

Root Cause
Root cause analysis has its roots in the analysis of machine failures. In the 1970s and
1980s, most documented applications of root cause analysis were made in product
quality assurance and equipment failures. By the early 1990s, root cause analysis was
extended to cover issues beyond equipment. In his 1991 article, In Search of the Root
Cause, John Dew wrote, "Systemic issues concern how the management of the
organization plans, organizes, controls, and provides quality assurance and safety in
five key areas: personnel, procedures, equipment, material, and the
environment" (Dew 1991). But the use of root cause analysis in human-affected systems
may be limited to the human's duties in the system as opposed to the causes of
suboptimal human performance. The Joint Commission on the Accreditation of
Healthcare Organizations (JCAHO) avoids the issue and notes, "A root cause analysis
focuses primarily on systems and processes, not on individual performance" (JCAHO
2009).

Some credible applications of root cause analysis, in support of human error analysis,
can be made, mostly to discover seminal events; when similar seminal events occur
with high frequency, they pinpoint an area of needed system improvement. A seminal
event is an event which, an analyst decides, begins a chain of events which leads to an
error. While root cause analysis may have a place in human error investigation, it
contributes little to the examination of simultaneous holes in systems and defenses
against probable modifiers. Its usability is further diminished when multiple parallel
conditions combine to enable an error. Despite these limitations, a good investigation
system will allow a report submitter or investigator to label a contributing element as
"seminal," thus allowing future inspection of common seminal events as the data set
grows.
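
A brief sketch of that inspection, assuming incident reports store contributing elements with an
optional "seminal" label; the record layout and data are invented for illustration.

```python
# Count how often each element labeled "seminal" appears across incident reports;
# frequent seminal events point to an area of needed system improvement.
from collections import Counter

reports = [  # hypothetical incident reports: (element, is_seminal)
    {"case_id": "C-101", "elements": [("no checklist required", True), ("work overload", False)]},
    {"case_id": "C-102", "elements": [("no checklist required", True)]},
    {"case_id": "C-103", "elements": [("ambiguous crew authority", True)]},
]

seminal_counts = Counter(
    name for report in reports for name, is_seminal in report["elements"] if is_seminal
)
print(seminal_counts.most_common(1))  # [('no checklist required', 2)]
```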

Other Industry-Inspired Methods
Traditional methods other than root cause analysis have made their way into the human error
literature. Fault Tree Analysis, Probability Tree Analysis and Markov Reliability Analysis
have provided quantitative reliability and error analysis in machine systems. While, like
root cause analysis, they are not built on the human error foundations of knowledge,
their application to systems involving humans can produce helpful task design
considerations.
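
For readers unfamiliar with these methods, the toy calculation below shows the flavor of a
quantitative fault tree: basic-event probabilities, assumed independent, are combined through AND
and OR gates. The events and numbers are invented for illustration.

```python
# Toy fault tree: the top event occurs if EITHER (operator misses the alarm
# AND the alarm is ambiguous) OR the interlock fails outright.
p_miss_alarm = 0.05       # hypothetical basic-event probabilities
p_ambiguous_alarm = 0.20
p_interlock_fail = 0.01

def p_and(*ps):           # AND gate, independent events
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):            # OR gate, independent events
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_top = p_or(p_and(p_miss_alarm, p_ambiguous_alarm), p_interlock_fail)
print(f"P(top event) = {p_top:.4f}")   # about 0.0199
```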

Multi-tier Models
Multi-tier models like HFACS and the simplified three-tier models have been
described in previous sections. These types of analyses are built on the human error
foundations of knowledge. Multi-tier analysis which tightly connects the levels of error
will provide useful data for analytical methods currently available or to be developed in
the future. Vertical connection refers to grouping the system/environment, modifiers
and error types for each error noted in an incident, and keeping them separate from
the factors associated with other errors found in the same incident. Table 2 contrasts
poor factor assignment with vertically connected factor assignment. Vertically
connected analysis can better pinpoint acceptable system improvements and
intervention strategies.

- Non-Coupled Vertical Paths -

Level     Type                              Comment
error     omitted part of procedure         practitioner omitted a check at stage 5
error     violation (routine)               procedure often omitted in past with acceptable results
modifier  work overload                     three procedures were waiting and marked high priority
modifier  risk taking schema                acceptable results in the past have changed the practitioner's risk taking schema
system    monitoring - staffing quantity    supervision did note low staffing, did not request intervention
system    *@## !                            no procedural checklist required for the process

- Coupled Vertical Paths -

Level     Type                              Comment
error     omitted part of procedure         practitioner omitted a check at stage 5
modifier  work overload                     three procedures were waiting and marked high priority
system    monitoring - staffing quantity    supervision did note low staffing, did not request intervention
error     violation (routine)               that part of the procedure often omitted in past with acceptable results
modifier  risk taking schema                acceptable results in the past have changed the practitioner's risk taking schema
system    *@## !                            no procedural checklist required for the process
Table 2. Illustration of non-coupled three-level analysis vs. closely coupled
assignment of contributions to error. The second set provides connected
information which can be referenced and quantified with similar sets in
other incident reports. The resulting analysis can better pinpoint acceptable
system improvements and intervention strategies.
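
The payoff of coupling shows up once many reports are collected: because each error carries its own
modifiers and system factors, they can be aggregated per error type across reports. A minimal sketch
follows, using invented coupled records shaped like Table 2 and the earlier illustrative layout.

```python
# Aggregate coupled vertical paths across reports: for each error type,
# count which system factors co-occur with it.
from collections import Counter, defaultdict

coupled_reports = [  # hypothetical coupled records: (error_type, modifiers, system_factors)
    [("omitted part of procedure", ["work overload"], ["monitoring - staffing quantity"]),
     ("violation (routine)", ["risk taking schema"], ["no procedural checklist required"])],
    [("omitted part of procedure", ["work overload"], ["no procedural checklist required"])],
]

system_by_error = defaultdict(Counter)
for report in coupled_reports:
    for error_type, _modifiers, system_factors in report:
        system_by_error[error_type].update(system_factors)

for error_type, counts in system_by_error.items():
    print(error_type, counts.most_common())
```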

Cognitive Task Models
Cognitive control architectures relate the human's assessment of the environment to
judgments, decisions or actions. Ecological effects are only crudely involved in current
frameworks of error analysis, yet they may be the foremost contributor to error assessment
when knowledge-based errors occur.

Two mainstream cognitive assessment models, the lens model and the ACT-R model,
enable quantitative assessment of error contribution to tasks. Like the traditional
industrial reliability models, they can be applied to local tasks in a system.

Cross-Mapping
The results of any human error analysis method may be mapped to other methods if all descriptions, whether
categorical, hierarchical, seminal, system reliability or cognitive task models, carry a relational identification to
a case number. When using hierarchical models, including the comments associated with a particular vertical
column will aid the transfer of finished analysis to improved systems where new factors are included.
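
A brief sketch of the cross-mapping idea, assuming each method's output simply carries the shared
case number; the record shapes and data are invented for illustration.

```python
# Join the outputs of two different analysis methods on the shared case number.
hfacs_results = {      # hypothetical multi-tier classifications keyed by case number
    "C-101": {"unsafe_act": "Procedural Error", "precondition": "work overload"},
    "C-102": {"unsafe_act": "Checklist Error", "precondition": "time pressure"},
}
seminal_events = {     # hypothetical seminal-event labels keyed by the same case number
    "C-101": "no procedural checklist required",
}

for case_id, classification in hfacs_results.items():
    seminal = seminal_events.get(case_id, "none recorded")
    print(case_id, classification["unsafe_act"], "| seminal event:", seminal)
```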

Intervention Strategies and Safety System Design/Engineering

Correct intervention strategies will be found directly in a well-designed analysis system.
The costs of each error reduction strategy, or of the contributors to that strategy, will
limit the choices.

Safety system design (or engineering, in systems where mathematical relationships are
developed) allows for many solutions. Training is often mentioned as a cure for
problems, but training is suboptimal or ineffective as a band-aid for systems which are
in need of improvement. The following is a short list of possible improvements.

 system controls which activate correct knowledge
 carefully specified crew staffing composition
 system controls which reduce load at critical times
 technological aids which feed correct and easily interpreted information
 system controls to detect adverse physiological states
 reduction of potential ambiguity where choices in action exist
 improved information dissemination on errors which have occurred in similar circumstances
 personality schema activation at critical times (used in sports)
 system enhancements which artificially form a closed-loop feedback system where the
practitioner normally uses memory to check on the outcome of previous actions

Safety analysis, system design and error reduction go hand in hand with a well-designed analysis system.
Conversely, without a good analysis system, error reduction is much less efficient.

References:
Brunswik, E. (1955). Representative Design and Probabilistic Theory in a Functional
Psychology. Psychological Review, 62(3), 193-217.

Dew, J. R. (1991). In Search of the Root Cause. Quality Progress, 24(3), 97-102.

Hammond, K. R., & Summers, D. A. (1972). Cognitive Control. Psychological Review,
79(1), 58-67.

Meister, D. (1989). The Nature of Human Error. IEEE Global Telecommunications
Conference, Communications Technology for the 1990s and Beyond, Volume 2, 783-786.

Norman, D. A. (1980). Errors in Human Performance. University of California, San Diego,
Center for Human Information Processing Report No. 8004.

Rasmussen, J. (1983). Skills, Rules, and Knowledge; Signals, Signs, and Symbols, and
Other Distinctions in Human Performance Models. IEEE Transactions on Systems, Man,
and Cybernetics, 13(3), 257-266.

Reason, J. (1990). Human Error. New York: Cambridge University Press.

Reason, J. (1997). Managing the Risks of Organizational Accidents. Aldershot: Ashgate
Publishing Limited.

Sheridan, T. B. (2003). Human Error. Quality in Health Care, 12, 383-385.

U.S. Department of Defense (2005). DoD HFACS, Attachment 1. Retrieved November 3,
2008, from http://www.safetycenter.navy.mil/hfacs/downloads/hfacs.pdf

U.S. Department of Labor, Bureau of Labor Statistics (n.d.). Fatal occupational injuries
by industry and event or exposure. Retrieved October 12, 2008, from
http://www.bls.gov/iif/oshwc/cfoi/cftb0223.pdf

von Thaden, T. L., & Gibbons, A. M. (2008). The Safety Culture Indicator Scale
Measurement System (SCISMS). University of Illinois Human Factors Division Technical
Report HFD-08-03/FAA-08-02.

von Thaden, T. L., Kessel, J., & Ruengvisesh, D. (2008). Measuring Indicators of Safety
Culture in a Major European Airline's Flight Operations Department. Proceedings of
the 8th International Symposium of the Australian Aviation Psychology Association.
Novotel Brighton Beach, Sydney.

Wickens, C. D., & McCarley, J. S. (2008). Applied Attention Theory. Boca Raton: CRC Press.

Wiegmann, D. A., & Shappell, S. A. (2001). Applying the Human Factors Analysis and
Classification System (HFACS) to the analysis of commercial aviation accident data.
Proceedings of the 11th International Symposium on Aviation Psychology.

Wiegmann, D. A., & Shappell, S. A. (2003). A Human Error Approach to Aviation Accident
Analysis: The Human Factors Analysis and Classification System. Burlington, VT: Ashgate.

Wiegmann, D. A., Zhang, H., von Thaden, T. L., Sharma, G., & Mitchell, A. A. (2002). A
Synthesis of Safety Culture and Safety Climate Research. University of Illinois Aviation
Research Lab Technical Report.
