
Human error

For other uses, see Human Error (disambiguation).

Human error is an action that has been done but that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] Human error has been cited as a primary cause and contributing factor in disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation, space exploration (e.g., the Space Shuttle Challenger disaster and Space Shuttle Columbia disaster), and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.

Categories

There are many ways to categorize human error:[6][7]

exogenous versus endogenous error (i.e., originating outside versus inside the individual)[8][9]

by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational[12]

active error – observable, physical action that changes equipment, system, or facility state, resulting in immediate undesired consequences

latent human error – resulting in hidden organization-related weaknesses or equipment flaws that lie dormant; such errors can go unnoticed at the time they occur, having no immediate apparent outcome

equipment dependency error – lack of vigilance due to the assumption that hardware controls or physical safety devices will always work

team error – lack of vigilance created by the social (interpersonal) interaction between two or more people working together

personal dependencies error – unsafe attitudes and traps of human nature leading to complacency and overconfidence

Sources

The cognitive study of human error is a very active research field, including work on the limits of memory and attention as well as decision-making strategies such as the availability heuristic and other cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but they can lead to systematic patterns of error.


Misunderstandings as a topic in human communication have been studied in conversation analysis, such as the examination of violations of the cooperative principle and Gricean maxims.


Organizational studies of error or dysfunction have included studies of safety culture. One technique for analyzing complex systems failure that incorporates organizational analysis is management oversight risk tree analysis.[13][14][15]

See also

Behavior-shaping constraint

Error-tolerant design

Human reliability

Poka-yoke

SHELL model

User error

Technique for human error-rate prediction

Fallacy

To err is human

References

Autrey, T.D. (2015). 6-Hour Safety Culture: How to Sustainably Reduce Human Error and Risk (and do what training alone can't possibly do). Human Performance Association. Archived from the original on 2021-04-11. Retrieved 2020-08-21.