Human reliability

Human reliability is related to the field of human factors engineering and refers to the reliability of humans in fields such as manufacturing, transportation, the military, and medicine. Human performance can be affected by many factors, such as age, circadian rhythms, state of mind, physical health, attitude, emotions, and a propensity for certain common mistakes and cognitive biases.

Human reliability is very important because of the contributions humans make to the resilience of systems and because of the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of a large socio-technical system, as is common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Human Reliability Analysis Techniques

A variety of methods exist for Human Reliability Analysis (HRA) (see Kirwan and Ainsworth, 1992; Kirwan, 1994). Two general classes of methods are those based on probabilistic risk assessment (PRA) and those based on a cognitive theory of control.

PRA-Based Techniques

One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a plant, so can a human operator commit errors. In both cases, an analysis (functional decomposition for equipment and task analysis for humans) articulates a level of detail at which failure or error probabilities can be assigned. This is the basic idea behind the Technique for Human Error Rate Prediction (THERP) (Swain & Guttman, 1983). THERP is intended to generate human error probabilities that can be incorporated into a PRA. The Accident Sequence Evaluation Program (ASEP) Human Reliability Procedure is a simplified form of THERP; an associated computational tool is the [http://www.osti.gov/energycitations/product.biblio.jsp?osti_id=10162198 Simplified Human Error Analysis Code (SHEAN)] (Wilson, 1993). More recently, the US Nuclear Regulatory Commission published the Standardized Plant Analysis Risk - Human Reliability Analysis ([http://www.nrc.gov/reading-rm/doc-collections/nuregs/contract/cr6883/ SPAR-H]) method to account for the potential for human error (Gertman et al., 2005).
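
As a rough illustration of how a PRA-style decomposition treats human error numerically, the sketch below combines assumed human error probabilities (HEPs) for the subtasks of a decomposed task. The subtask HEP values and the independence assumption are hypothetical and are not drawn from the THERP tables.

<syntaxhighlight lang="python">
# Illustrative sketch only (not THERP itself): combine assumed human error
# probabilities (HEPs) for the subtasks of a decomposed task, the way a
# PRA-style analysis folds human error into an accident sequence.

def task_failure_probability(subtask_heps):
    """Probability that at least one subtask fails, assuming the
    subtasks fail independently with the given nominal HEPs."""
    p_all_succeed = 1.0
    for hep in subtask_heps:
        p_all_succeed *= (1.0 - hep)
    return 1.0 - p_all_succeed

# Hypothetical nominal HEPs for three subtasks of a procedure step.
heps = [0.003, 0.001, 0.01]
print(f"Overall failure probability: {task_failure_probability(heps):.4f}")
# Overall failure probability: 0.0140
</syntaxhighlight>

A real analysis would additionally apply performance shaping factors and model dependence and recovery between subtasks, which this sketch omits.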

Cognitive Control Based Techniques

Erik Hollnagel has developed this line of thought in his work on the Contextual Control Model (COCOM) (Hollnagel, 1993) and the Cognitive Reliability and Error Analysis Method (CREAM) (Hollnagel, 1998). COCOM models human performance as a set of control modes -- strategic (based on long-term planning), tactical (based on procedures), opportunistic (based on the present context), and scrambled (random) -- and proposes a model of how transitions between these control modes occur. The transitions depend on a number of factors, including the human operator's estimate of the outcome of the action (success or failure), the time remaining to accomplish the action (adequate or inadequate), and the number of simultaneous goals the operator is pursuing at that time. CREAM is a human reliability analysis method built on COCOM.
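
A minimal sketch of what such mode transitions might look like is given below; the simple degrade/recover rule and the two triggering factors (outcome of the last action, adequacy of remaining time) are simplifications chosen for illustration and should not be read as Hollnagel's published model.

<syntaxhighlight lang="python">
# Minimal sketch of COCOM-style control mode transitions.  The rule used
# here (move toward strategic control when the last action succeeded and
# time is adequate, degrade toward scrambled control when it failed and
# time is short) is a simplification for illustration only.

from enum import Enum

class ControlMode(Enum):
    SCRAMBLED = 0       # essentially random choice of next action
    OPPORTUNISTIC = 1   # driven by salient features of the present context
    TACTICAL = 2        # procedure- or rule-following
    STRATEGIC = 3       # guided by long-term planning

def next_mode(current: ControlMode, last_action_succeeded: bool,
              time_adequate: bool) -> ControlMode:
    """Return the assumed next control mode given the outcome of the last
    action and whether the remaining time is adequate."""
    level = current.value
    if last_action_succeeded and time_adequate:
        level = min(level + 1, ControlMode.STRATEGIC.value)
    elif not last_action_succeeded and not time_adequate:
        level = max(level - 1, ControlMode.SCRAMBLED.value)
    return ControlMode(level)

print(next_mode(ControlMode.TACTICAL, last_action_succeeded=False,
                time_adequate=False))   # ControlMode.OPPORTUNISTIC
</syntaxhighlight>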

Related Techniques

Related techniques in safety engineering and reliability engineering include failure mode and effects analysis (FMEA), hazard and operability study (HAZOP), fault tree analysis, and SAPHIRE (Systems Analysis Programs for Hands-on Integrated Reliability Evaluations).
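
To show how a quantitative technique such as fault tree analysis can fold a human error probability into a system-level figure, here is a toy fragment; the gate structure, event names, and probabilities are invented for illustration.

<syntaxhighlight lang="python">
# Toy fault-tree fragment showing how a human error probability can enter
# a system-level estimate.  Event names, probabilities, and the gate
# structure are hypothetical.

def or_gate(*probs):
    """Top event occurs if any input event occurs (independent events)."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(*probs):
    """Top event occurs only if all input events occur (independent events)."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

p_operator_error = 0.01    # hypothetical human error probability
p_alarm_failure = 0.001    # hypothetical hardware failure probability
p_other_cause = 0.0005     # hypothetical unrelated cause

# An upset goes undetected only if the operator misses it AND the alarm fails.
p_undetected_upset = and_gate(p_operator_error, p_alarm_failure)
p_top_event = or_gate(p_undetected_upset, p_other_cause)
print(f"Top event probability: {p_top_event:.6f}")
</syntaxhighlight>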

Human Error

Human error has been cited as a cause of, or contributing factor in, disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (see pilot error), space exploration (e.g., the Space Shuttle Challenger disaster), and medicine (see medical error). It is also important to stress that "human error" mechanisms are the same as "human performance" mechanisms; performance is categorized as "error" only in hindsight (Reason, 1990; Woods, 1990): therefore actions later termed "human error" are actually part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behavior. More recently, the study of human error has been reframed in terms of resilience, to emphasize the positive contributions that humans bring to the operation of technical systems (see Hollnagel, Woods and Leveson, 2006).

Categories of Human Error

There are many ways to categorize human error (see Jones, 1999).
* exogenous versus endogenous (i.e., originating outside versus inside the individual) (Senders and Moray, 1991)
* situation assessment versus response planning (e.g., Roth et al., 1994), with related distinctions among:
** errors in problem detection (also see signal detection theory)
** errors in problem diagnosis (also see problem solving)
** errors in action planning and execution (Sage, 1992) (for example: slips, or errors of execution, versus mistakes, or errors of intention; see Norman, 1988; Reason, 1990)
* by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational

The cognitive study of human error is a very active research field, including work related to limits of memory and attention and also to decision making strategies such as the availability heuristic and other cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but can lead to systematic patterns of error.

Misunderstandings as a topic in human communication have been studied in Conversation Analysis, such as the examination of violations of the Cooperative principle and Gricean maxims.

Organizational studies of error or dysfunction have included studies of safety culture. One technique for organizational analysis is the Management Oversight and Risk Tree (MORT) (Kirwan and Ainsworth, 1992; also search for MORT on the [http://www2.hf.faa.gov/workbenchtools/ FAA Human Factors Workbench]).

Human Factors Analysis and Classification System (HFACS)

The Human Factors Analysis and Classification System (HFACS) was developed initially as a framework to understand "human error" as a cause of aviation accidents (Shappell and Wiegmann, 2000; Wiegmann and Shappell, 2003). It is based on James Reason's Swiss cheese model of human error in complex systems. HFACS distinguishes between the "active failures" of unsafe acts and the "latent failures" of preconditions for unsafe acts, unsafe supervision, and organizational influences. These categories were developed empirically on the basis of many aviation accident reports.

Unsafe acts are performed by the human operator "on the front line" (e.g., the pilot, the air traffic controller, the driver). Unsafe acts can be either errors (in perception, decision making, or skill-based performance) or violations (routine or exceptional). The errors here are similar to those discussed above. Violations are deliberate disregard for rules and procedures. As the name implies, routine violations occur habitually and are usually tolerated by the organization or authority. Exceptional violations are unusual and often extreme. For example, driving 60 mph in a 55-mph speed limit zone is a routine violation, but driving 130 mph in the same zone is exceptional.

There are two types of preconditions for unsafe acts: those that relate to the human operator's internal state and those that relate to the human operator's practices or ways of working. Adverse internal states include those related to physiology (e.g., illness) and mental state (e.g., mentally fatigued, distracted). A third aspect of 'internal state' is really a mismatch between the operator's ability and the task demands; for example, the operator may be unable to make visual judgments or react quickly enough to support the task at hand. Poor operator practices are another type of precondition for unsafe acts. These include poor crew resource management (issues such as leadership and communication) and poor personal readiness practices (e.g., violating the crew rest requirements in aviation).

The four types of unsafe supervision are: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations.

Organizational influences include those related to resources management (e.g., inadequate human or financial resources), organizational climate (structures, policies, and culture), and organizational processes (such as procedures, schedules, oversight).
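
The four HFACS levels and their subcategories described above can be summarized as a simple taxonomy; the sketch below writes them down as data types purely for illustration (assigning an actual accident report to these categories is an analyst's judgement, not something this code performs).

<syntaxhighlight lang="python">
# Data-structure sketch of the HFACS levels and subcategories described above.

from dataclasses import dataclass, field
from enum import Enum

class UnsafeAct(Enum):
    PERCEPTUAL_ERROR = "error: perception"
    DECISION_ERROR = "error: decision making"
    SKILL_BASED_ERROR = "error: skill-based performance"
    ROUTINE_VIOLATION = "violation: routine"
    EXCEPTIONAL_VIOLATION = "violation: exceptional"

class Precondition(Enum):
    ADVERSE_PHYSIOLOGICAL_STATE = "internal state: physiological"
    ADVERSE_MENTAL_STATE = "internal state: mental"
    ABILITY_TASK_MISMATCH = "internal state: ability vs. task demands"
    POOR_CREW_RESOURCE_MANAGEMENT = "practices: crew resource management"
    POOR_PERSONAL_READINESS = "practices: personal readiness"

class UnsafeSupervision(Enum):
    INADEQUATE_SUPERVISION = "inadequate supervision"
    PLANNED_INAPPROPRIATE_OPERATIONS = "planned inappropriate operations"
    FAILURE_TO_CORRECT_KNOWN_PROBLEM = "failure to correct a known problem"
    SUPERVISORY_VIOLATION = "supervisory violation"

class OrganizationalInfluence(Enum):
    RESOURCE_MANAGEMENT = "resource management"
    ORGANIZATIONAL_CLIMATE = "organizational climate"
    ORGANIZATIONAL_PROCESS = "organizational processes"

@dataclass
class HfacsCoding:
    """One analyst's coding of a single accident report across the four levels."""
    unsafe_acts: list = field(default_factory=list)
    preconditions: list = field(default_factory=list)
    supervision: list = field(default_factory=list)
    organization: list = field(default_factory=list)

# Hypothetical coding of a report: a habitual procedural shortcut, taken
# while fatigued, that supervision knew about but never corrected.
example = HfacsCoding(
    unsafe_acts=[UnsafeAct.ROUTINE_VIOLATION],
    preconditions=[Precondition.ADVERSE_MENTAL_STATE],
    supervision=[UnsafeSupervision.FAILURE_TO_CORRECT_KNOWN_PROBLEM],
    organization=[OrganizationalInfluence.ORGANIZATIONAL_CLIMATE],
)
print(example.unsafe_acts[0].value)   # "violation: routine"
</syntaxhighlight>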

Controversies

Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon (see Hollnagel and Amalberti, 2001). A focus on the variability of human performance, and on how human operators (and organizations) can manage that variability, may be a more fruitful approach. Furthermore, as noted above, the concept of "resilience" highlights the positive roles that humans can play in complex systems.

See also

* CCPS, Guidelines for Preventing Human Error. This book describes both qualitative and quantitative methods for predicting human error: a qualitative method, SPEAR (Systems for Predicting Human Error and Recovery), and quantitative methods including THERP.
* Performance shaping factor
* Latent human error

References

* Gertman, D. L. and Blackman, H. S. (2001). Human reliability and safety analysis data handbook. Wiley.
* Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C. (2005). The SPAR-H human reliability analysis method. NUREG/CR-6883. Idaho National Laboratory, prepared for the U.S. Nuclear Regulatory Commission. [http://www.nrc.gov/reading-rm/doc-collections/nuregs/contract/cr6883/]
* Hollnagel, E. (1993). Human reliability analysis: Context and control. Academic Press.
* Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier.
* Hollnagel, E. and Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at the 4th International Workshop on Human Error, Safety and System Development, Linköping, June 11-12, 2001.
* Hollnagel, E., Woods, D. D. and Leveson, N. (Eds.) (2006). Resilience engineering: Concepts and precepts. Ashgate.
* Jones, P. M. (1999). Human error and its amelioration. In Handbook of Systems Engineering and Management (A. P. Sage and W. B. Rouse, eds.), 687-702. Wiley.
* Kirwan, B. (1994). A practical guide to human reliability assessment. Taylor & Francis.
* Kirwan, B. and Ainsworth, L. (Eds.) (1992). A guide to task analysis. Taylor & Francis.
* Norman, D. (1988). The psychology of everyday things. Basic Books.
* Reason, J. (1990). Human error. Cambridge University Press.
* Roth, E. et al. (1994). An empirical investigation of operator performance in cognitively demanding simulated emergencies. NUREG/CR-6208. Westinghouse Science and Technology Center, report prepared for the Nuclear Regulatory Commission.
* Sage, A. P. (1992). Systems engineering. Wiley.
* Senders, J. and Moray, N. (1991). Human error: Cause, prediction, and reduction. Lawrence Erlbaum Associates.
* Shappell, S. and Wiegmann, D. (2000). The human factors analysis and classification system - HFACS. DOT/FAA/AM-00/7. Office of Aviation Medicine, Federal Aviation Administration, Department of Transportation. [http://www.nifc.gov/safety_study/accident_invest/humanfactors_class&anly.pdf]
* Swain, A. D. and Guttman, H. E. (1983). Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278. Washington, D.C.
* Wiegmann, D. and Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system. Ashgate.
* Wilson, J. R. (1993). SHEAN (Simplified Human Error Analysis code) and automated THERP. United States Department of Energy Technical Report WINCO-11908. [http://www.osti.gov/energycitations/product.biblio.jsp?osti_id=10162198]
* Woods, D. D. (1990). Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg and B. Huey (Eds.), Human performance models for computer-aided engineering (248-274). Academic Press.

Further reading

* Autrey, T. D. (2007). [http://www.practicingperfectioninstitute.com/reports/sixsigma.aspx Mistake-Proofing Six Sigma: How to Minimize Project Scope and Reduce Human Error]. Practicing Perfection Institute.
* Davies, J. B., Ross, A., Wallace, B. and Wright, L. (2003). Safety Management: a Qualitative Systems Approach. Taylor and Francis.
* Dismukes, R. K., Berman, B. A. and Loukopoulos, L. D. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Ashgate.
* Forester, J., Kolaczkowski, A., Lois, E. and Kelly, D. (2006). Evaluation of human reliability analysis methods against good practices. NUREG-1842 Final Report. U.S. Nuclear Regulatory Commission. [http://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr1842/]
* Goodstein, L. P., Andersen, H. B. and Olsen, S. E. (Eds.) (1988). Tasks, errors, and mental models. Taylor and Francis.
* Grabowski, M. and Roberts, K. H. (1996). Human and organizational error in large scale systems. IEEE Transactions on Systems, Man, and Cybernetics, 26(1), 2-16. doi:10.1109/3468.477856
* Greenbaum, J. and Kyng, M. (Eds.) (1991). Design at work: Cooperative design of computer systems. Lawrence Erlbaum Associates.
* Harrison, M. (2004). Human error analysis and reliability assessment. Workshop on Human Computer Interaction and Dependability, 46th IFIP Working Group 10.4 Meeting, Siena, Italy, July 3-7, 2004. [http://www.laas.fr/IFIPWG/Workshops&Meetings/46/05-Harrison.pdf]
* Hollnagel, E. (1991). The phenotype of erroneous actions: Implications for HCI design. In G. W. R. Weir and J. L. Alty (Eds.), Human-computer interaction and complex systems. Academic Press.
* Hutchins, E. (1995). Cognition in the wild. MIT Press.
* Kahneman, D., Slovic, P. and Tversky, A. (Eds.) (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
* Leveson, N. (1995). Safeware: System safety and computers. Addison-Wesley.
* Morgan, G. (1986). Images of organization. Sage.
* Mura, S. S. (1983). Licensing violations: Legitimate violations of Grice's conversational principle. In R. Craig and K. Tracy (Eds.), Conversational coherence: Form, structure, and strategy (101-115). Sage.
* Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books.
* Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267.
* Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. Wiley.
* Silverman, B. (1992). Critiquing human error: A knowledge-based human-computer collaboration approach. Academic Press.
* Swets, J. (1996). Signal detection theory and ROC analysis in psychology and diagnostics: Collected papers. Lawrence Erlbaum Associates.
* Tversky, A. and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
* Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press.
* Wallace, B. and Ross, A. (2006). Beyond human error. CRC Press.
* Woods, D. D., Johannesen, L., Cook, R. and Sarter, N. (1994). Behind human error: Cognitive systems, computers, and hindsight. CSERIAC SOAR Report 94-01. Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, Ohio.

External links

Standards and Guidance Documents

* [http://stdsbbs.ieee.org/descr/1082-1997/ IEEE Standard 1082 (1997): IEEE Guide for Incorporating Human Action Reliability Analysis for Nuclear Power Generating Stations]

Tools

* [http://www.eurocontrol.int/hifa/public/standard_page/Hifa_HifaData_Tools_HumErr.html Eurocontrol Human Error Tools]
* [http://www.epri.com/hra/discuss.html EPRI HRA Calculator]

Research Labs

* [http://www.ida.liu.se/~eriho/ Erik Hollnagel] at the [http://www.ida.liu.se/labs/cselab/ Cognitive Systems Engineering Laboratory] at Linkoping University
* [http://reliability.sandia.gov/Human_Factor_Engineering/Human_Reliability_Analysis/human_reliability_analysis.html Human Reliability Analysis] at the US Sandia National Laboratories
* [http://www.orau.gov/chrs/chrs.htm Center for Human Reliability Studies] at the US Oak Ridge National Laboratory
* [http://hsi.arc.nasa.gov/flightcognition/ Flight Cognition Laboratory] at NASA Ames Research Center
* [http://csel.eng.ohio-state.edu/woods/ David Woods] at the [http://csel.eng.ohio-state.edu/ Cognitive Systems Engineering Laboratory] at The Ohio State University

Media coverage

* [http://www.iienet.org/magazine/magazinefiles/IENOV2004_outliers_p66.pdf "Human Reliability. We break down just like machines"] Industrial Engineer - November 2004, 36(11): 66

Networking

* [http://www.linkedin.com/groups?gid=673677 High Reliability Management group at LinkedIn.com]

