- Healthcare error proliferation model
The Healthcare Error Proliferation Model is an adaptation of James Reason's Swiss Cheese Model designed to illustrate the complexity inherent in the contemporary healthcare delivery system and the attribution of human error within these systems. The Healthcare Error Proliferation Model (Palmieri et al., 2008) explains the sequence of events that typically leads to adverse outcomes, emphasizing the role organizational and external cultures play in error identification, prevention, mitigation, and defense construction.
Healthcare systems are "complex" in that they are diverse in both structure (e.g. nursing units, pharmacies, emergency departments, operating rooms) and professional mix (e.g. nurses, physicians, pharmacists, administrators, therapists), and are made up of multiple interconnected elements with "adaptive" tendencies in that they have the capacity to change and learn from experience. The term "complex adaptive system" (CAS) was coined at the interdisciplinary Santa Fe Institute (SFI) by John H. Holland and Murray Gell-Mann. Subsequently, scholars such as Ruth Anderson, Reuben McDaniel, and Paul Cilliers have extended CAS theory and research to the social sciences, including education and healthcare.
The Healthcare Error Proliferation Model (HEPM) adapts the Swiss Cheese Model (Reason, 1990) to the complexity of healthcare delivery systems and integrated organizations. The Swiss Cheese Model likens the complex adaptive system to multiple hole-riddled slices of Swiss cheese positioned side by side (Reason, 1990, 2000). The slices are dubbed defensive layers to describe their role and function as locations within the system outfitted with features capable of intercepting and deflecting hazards. Each layer represents a discrete location or organizational level that may harbor errors and, through its holes, permit error progression. The four layers are: 1) organizational leadership, 2) risky supervision, 3) situations for unsafe practices, and 4) unsafe performance.
The HEPM portrays hospitals as having multiple operational defensive layers outfitted with the essential elements necessary to maintain key defensive barricades (Cook & O'Connor, 2005; Reason, 2000). By examining the defensive layers' attributes, the prospective locales of failure, the etiology of accidents might be revealed (Leape et al., 1995). Experts have discussed the importance of examining these layers within the context of the complex adaptive healthcare system (Kohn et al., 2000; Wiegmann & Shappell, 2003) and of considering the psychological safety of clinicians. Hence, this model expands Reason's seminal work.
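The layered-defense idea can be illustrated with a toy probabilistic sketch. This is an illustration only, not part of the model's formal specification: each defensive layer intercepts a hazard with some probability (the numbers below are hypothetical), and harm reaches the patient only when a hazard slips through a hole in every layer.

```python
import random

# Hypothetical interception probabilities for the four HEPM layers;
# the values are illustrative assumptions, not empirical estimates.
LAYERS = {
    "organizational leadership": 0.9,
    "supervision": 0.8,
    "situations for unsafe practices": 0.7,
    "frontline performance": 0.6,
}

def hazard_reaches_patient(rng: random.Random) -> bool:
    """A hazard causes harm only if it passes the hole in every layer."""
    return all(rng.random() > p_intercept for p_intercept in LAYERS.values())

def simulate_breach_rate(n_hazards: int, seed: int = 0) -> float:
    """Fraction of simulated hazards that evade all four defensive layers."""
    rng = random.Random(seed)
    breaches = sum(hazard_reaches_patient(rng) for _ in range(n_hazards))
    return breaches / n_hazards

# With independent layers the expected breach rate is the product of the
# hole probabilities: 0.1 * 0.2 * 0.3 * 0.4 = 0.0024.
rate = simulate_breach_rate(100_000)
```

The sketch makes the model's central point concrete: no single layer is reliable on its own, yet stacked imperfect layers drive the adverse-event rate down multiplicatively, and widening the holes in any one layer raises it.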
The model incorporates the complex adaptive healthcare system as a key characteristic. Complex adaptive systems characteristically demonstrate self-organization as diverse agents interact spontaneously in nonlinear relationships (Anderson, Issel, & McDaniel, 2003; Cilliers, 1998), where professionals act as information processors (Cilliers, 1998; McDaniel & Driebe, 2001) and co-evolve with the environment (Casti, 1997). Healthcare professionals function in the system as diverse actors within the complex environment, utilizing different methods to process information (Coleman, 1999) and solve systemic problems within and across organizational layers (McDaniel & Driebe, 2001).
A complex adaptive healthcare system (CAHS) is a care delivery enterprise with diverse clinical and administrative agents acting spontaneously, interacting in nonlinear networks in which agents and patients are information processors, and actively co-evolving with their environment with the purpose of producing safe and reliable patient-centered outcomes (Palmieri et al., 2008).
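The self-organizing behavior described above can be sketched with a minimal agent-based toy model. This is a deliberately crude stand-in, assuming agents repeatedly adjust an internal "vigilance" level toward that of randomly encountered peers; the point is only to show how spontaneous local interaction, with no central controller, produces system-level order.

```python
import random
from dataclasses import dataclass

@dataclass
class Agent:
    """A clinician-like information processor with an adaptable vigilance level."""
    vigilance: float  # arbitrary 0..1 scale; a hypothetical state variable

def interact(agents: list[Agent], rng: random.Random) -> None:
    """One round of spontaneous pairwise interaction: each agent nudges its
    vigilance 10% of the way toward a randomly chosen peer's, a crude
    stand-in for learning from experience."""
    for agent in agents:
        peer = rng.choice(agents)
        agent.vigilance += 0.1 * (peer.vigilance - agent.vigilance)

def simulate_convergence(n_agents: int = 20, rounds: int = 50, seed: int = 0):
    """Return the spread (max - min vigilance) before and after interaction."""
    rng = random.Random(seed)
    agents = [Agent(rng.random()) for _ in range(n_agents)]
    spread_before = max(a.vigilance for a in agents) - min(a.vigilance for a in agents)
    for _ in range(rounds):
        interact(agents, rng)
    spread_after = max(a.vigilance for a in agents) - min(a.vigilance for a in agents)
    return spread_before, spread_after
```

Running `simulate_convergence()` shows the spread of vigilance values shrinking over repeated local interactions: order emerges from decentralized adaptation rather than top-down design, which is the CAS property the model leans on.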
* Anderson, R. A., Issel, M. L., & McDaniel, R. R. (2003). Nursing homes as complex adaptive systems: Relationship between management practice and resident outcomes. Nursing Research, 52(1): 12-21.
* Berta, W. B. & Baker, R. (2004). Factors that impact the transfer and retention of best practices for reducing error in hospitals. Health Care Management Review, 29(2): 90-97.
* Chiles, J. R. (2002). Inviting disaster: Lessons from the edge of technology. New York: HarperCollins Publishers.
* Coleman, H. J. (1999). What enables self-organizing behavior in business. Emergence, 1(1): 33-48.
* Cook, R. I., Render, M., & Woods, D. D. (2000). Gaps in the continuity of care and progress on patient safety. British Medical Journal, 320(7237): 791-794.
* Leape, L. L., Bates, D. W., Cullen, D. J., Cooper, J., Demonaco, H. J., Gallivan, T., Hallisey, R., Ives, J., Laird, N., Laffel, G., Nemeskal, R., Petersen, L. A., Porter, K., Servi, D., Shea, B. F., Small, S. D., Sweitzer, B. J., Thompson, B. T., & van der Vliet, M. (1995). Systems analysis of adverse drug events. ADE Prevention Study Group. Journal of the American Medical Association, 274(1): 35-43.
* Leape, L. L. & Berwick, D. M. (2005). Five years after "To err is human": What have we learned? Journal of the American Medical Association, 293(19): 2384-2390.
* Leduc, P. A., Rash, C. E., & Manning, M. S. (2005). Human factors in UAV accidents, Special Operations Technology, Online edition ed., Vol. 3.
* Leonard, M. L., Frankel, A., & Simmonds, T. (2004). Achieving safe and reliable healthcare: Strategies and solutions. Chicago: Health Administration Press.
* Palmieri, P. A., DeLucia, P. R., Ott, T. E., Peterson, L. T., & Green, A. (2008). The anatomy and physiology of error in adverse healthcare events. Advances in Health Care Management, 7: 33-68.
* Rasmussen, J. (1990). The role of error in organizing behavior. Ergonomics, 33: 1185-1199.
* Rasmussen, J. (1999). The concept of human error: Is it useful for the design of safe systems in health care? In C. Vincent & B. deMoll (Eds.), Risk and safety in medicine: 31-47. London: Elsevier.
* Reason, J. T. & Mycielska, K. (1982). Absent-minded? The psychology of mental lapses and everyday errors. Englewood Cliffs, NJ: Prentice-Hall Inc.
* Reason, J. T. (1990). Human error. New York: Cambridge University Press.
* Reason, J. T. (1997). Managing the risks of organizational accidents. Aldershot, England: Ashgate.
* Reason, J. T. (2000). Human error: Models and management. British Medical Journal, 320(7237): 768-770.
* Reason, J. T., Carthey, J., & de Leval, M. R. (2001). Diagnosing vulnerable system syndrome: An essential prerequisite to effective risk management. Quality in Health Care, 10(S2): 21-25.
* Reason, J. T. & Hobbs, A. (2003). Managing maintenance error: A practical guide. Aldershot, England: Ashgate.
* Roberts, K. (1990). Some characteristics of one type of high reliability organization. Organization Science, 1(2): 160-176.
* Roberts, K. H. (2002). High reliability systems. Report to the Institute of Medicine Committee on Data Standards for Patient Safety, September 23, 2003.
Books
* Cilliers, P. (1998). Complexity and postmodernism: Understanding complex systems. New York: Routledge. (ISBN 978-0415152860)
* Holland, J. H. (1992). Adaptation in natural and artificial systems. Cambridge, MA: MIT Press. (ISBN: 978-0262581110)
* Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Helix Books. (ISBN: 978-0201442304)
* Holland, J. H. (1998). Emergence: From chaos to order. Reading, MA: Addison-Wesley. (ISBN: 978-0738201429)
* Waldrop, M. M. (1992). Complexity: The emerging science at the edge of order and chaos. New York: Simon & Schuster. (ISBN 978-0671767891)
* Adverse effect (medicine)
* International healthcare accreditation
* Patient safety organization
* Patient Safety and Nursing
* Serious adverse event
* Swiss cheese model of accident causation in human systems