
Human error

Human error refers to something having been done that was "not intended by the actor; not desired by a
set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] Human
error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse
as nuclear power (e.g., the Three Mile Island accident), aviation (see pilot error), space exploration (e.g.,
the Space Shuttle Challenger disaster and Space Shuttle Columbia disaster), and medicine (see medical
error). Prevention of human error is generally seen as a major contributor to reliability and safety of
(complex) systems. Human error is one of the many contributing causes of risk events.

Contents
Definition
Performance
Categories
Sources
Controversies
See also
References

Definition
Human error refers to something having been done that was "not intended by the actor; not desired by a set
of rules or an external observer; or that led the task or system outside its acceptable limits".[1] In short, it is a
deviation from intention, expectation or desirability.[1] Logically, human actions can fail to achieve their
goal in two different ways: the actions can go as planned, but the plan can be inadequate (leading to
mistakes); or, the plan can be satisfactory, but the performance can be deficient (leading to slips and
lapses).[2][3] However, a mere failure is not an error if there had been no plan to accomplish something in
particular.[1]
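The plan/performance distinction above can be written out as a small decision procedure. This is an illustrative sketch only; the function name and labels are invented here, not drawn from the cited sources.

```python
def classify_failure(had_plan: bool, plan_adequate: bool,
                     executed_as_planned: bool) -> str:
    """Classify a failed action using the plan vs. performance distinction.

    Labels follow the mistakes vs. slips/lapses terminology; the function
    itself is a hypothetical sketch, not a formal model.
    """
    if not had_plan:
        # A mere failure is not an error if there was no plan
        # to accomplish something in particular.
        return "not an error"
    if not plan_adequate:
        # The actions went as planned, but the plan was inadequate.
        return "mistake (planning failure)"
    if not executed_as_planned:
        # The plan was satisfactory, but performance was deficient.
        return "slip or lapse (execution failure)"
    return "no failure"

print(classify_failure(True, False, True))   # a mistake
print(classify_failure(True, True, False))   # a slip or lapse
```

Note that the "not an error" branch comes first: the classification only applies once there was an intention to accomplish something.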

Performance
Human error and performance are two sides of the same coin: "human error" mechanisms are the same as
"human performance" mechanisms; performance is categorized as 'error' only in hindsight:[3][4]
therefore actions later termed "human error" are actually part of the ordinary spectrum of human behaviour.
The study of absent-mindedness in everyday life provides ample documentation and categorization of such
aspects of behavior. While human error is firmly entrenched in the classical approaches to accident
investigation and risk assessment, it has no role in newer approaches such as resilience engineering.[5]

Categories
There are many ways to categorize human error:[6][7]
- exogenous versus endogenous error (i.e., originating outside versus inside the individual)[8]
- situation assessment versus response planning[9] and related distinctions in:
  - error in problem detection (see also signal detection theory)
  - error in problem diagnosis (see also problem solving)
  - error in action planning and execution[10] (for example: slips or errors of execution versus mistakes or errors of intention[11][3])
- by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational
- physical manipulation error[12]
  - 'slips' occurring when the physical action fails to achieve the immediate objective
  - 'lapses' involving a failure of one's memory or recall
- active error – an observable, physical action that changes equipment, system, or facility state, resulting in immediate undesired consequences
- latent human error – a hidden organization-related weakness or equipment flaw that lies dormant; such errors can go unnoticed at the time they occur, having no immediate apparent outcome
- equipment dependency error – lack of vigilance due to the assumption that hardware controls or physical safety devices will always work
- team error – lack of vigilance created by the social (interpersonal) interaction between two or more people working together
- personal dependencies error – unsafe attitudes and traps of human nature leading to complacency and overconfidence
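Errors in problem detection, mentioned above, are commonly quantified with signal detection theory via the sensitivity index d′. A minimal sketch using only the Python standard library follows; the hit and false-alarm rates are made-up example values.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' = Z(hit rate) - Z(false-alarm rate),
    where Z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Example: an operator detects 90% of real alarm conditions but also
# responds to 10% of non-events (illustrative numbers only).
print(round(d_prime(0.90, 0.10), 3))  # → 2.563
```

Higher d′ means better discrimination between real problems and noise; d′ of 0 means the operator's detections are no better than chance.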

Sources
The cognitive study of human error is a very active research field, including work related to limits of
memory and attention and also to decision making strategies such as the availability heuristic and other
cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but can lead to
systematic patterns of error.

Misunderstandings as a topic in human communication have been studied in conversation analysis, such as
the examination of violations of the cooperative principle and Gricean maxims.

Organizational studies of error or dysfunction have included studies of safety culture. One technique for
analyzing complex systems failure that incorporates organizational analysis is management oversight risk
tree analysis (MORT).[13][14][15]

Controversies
Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful
oversimplification of a complex phenomenon.[16][17] A focus on the variability of human performance and
how human operators (and organizations) can manage that variability may be a more fruitful approach.
Newer approaches, such as the resilience engineering mentioned above, highlight the positive roles that humans
can play in complex systems. In resilience engineering, successes (things that go right) and failures (things
that go wrong) are seen as having the same basis, namely human performance variability. A specific
account of this is the efficiency–thoroughness trade-off (ETTO) principle,[18] which can be found at all
levels of human activity, individual as well as collective.

See also
Behavior-shaping constraint
Error-tolerant design
Human reliability
Poka-yoke
User error
Technique for human error-rate prediction
Fallacy
To err is human

Further reading
Autrey, T.D. (2015). 6-Hour Safety Culture: How to Sustainably Reduce Human Error and Risk (and do what training alone can't possibly do) (https://6hoursafetyculture.com). Human Performance Association.

References
1. Senders, J.W. and Moray, N.P. (1991) Human Error: Cause, Prediction, and Reduction (http
s://books.google.co.uk/books/about/Human_Error_cause_Prediction_and_Reducti.html?id=
JRFxiDg3GoQC&redir_esc=y). Lawrence Erlbaum Associates, p.25. ISBN 0-89859-598-3.
2. Hollnagel, E. (1993) Human Reliability Analysis Context and Control. Academic Press
Limited. ISBN 0-12-352658-2.
3. Reason, James (1990) Human Error. Cambridge University Press. ISBN 0-521-31419-4.
4. Woods, 1990
5. Hollnagel, E., Woods, D. D. & Leveson, N. G. (2006). Resilience engineering: Concepts and
precepts. Aldershot, UK: Ashgate
6. Jones, 1999
7. Wallace and Ross, 2006
8. Senders and Moray, 1991
9. Roth et al., 1994
10. Sage, 1992
11. Norman, 1988
12. DOE HDBK-1028-2009 (https://www.standards.doe.gov/standards-documents/1000/1028-BHdbk-2009-v1/@@images/file)
13. Jens Rasmussen, Annelise M. Pejtersen, L.P.Goodstein (1994). Cognitive Systems
Engineering (https://books.google.com/books?id=i2xRAAAAMAAJ&q=10:+0471011983).
John Wiley & Sons. ISBN 0471011983.
14. "The Management Oversight and Risk Tree (MORT)" (https://web.archive.org/web/20140927
003111/http://www.icma-web.org.uk/06-9_mort.html). International Crisis Management
Association. Archived from the original (http://www.icma-web.org.uk/06-9_mort.html) on 27
September 2014. Retrieved 1 October 2014.
15. Entry for MORT on the FAA Human Factors Workbench (http://www.hf.faa.gov/Workbenchto
ols/default.aspx?rPage=Tooldetails&toolID=151)
16. Hollnagel, E. (1983). "Human error" (Position paper for NATO Conference on Human Error, August 1983, Bellagio, Italy) (https://www.researchgate.net/publication/327212150).
17. Hollnagel, E. and Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at 4th International Workshop on Human Error, Safety and System Development, Linköping, June 11–12, 2001.
18. Hollnagel, Erik (2009). The ETTO principle : efficiency-thoroughness trade-off : why things
that go right sometimes go wrong. Farnham, England Burlington, VT: Ashgate. ISBN 978-0-
7546-7678-2. OCLC 432428967 (https://www.worldcat.org/oclc/432428967).

Retrieved from "https://en.wikipedia.org/w/index.php?title=Human_error&oldid=1059138180"

This page was last edited on 7 December 2021, at 17:55 (UTC).
