Organizational models of accidents
Models of accident causation are used for the risk analysis and risk management of human systems. Since the 1990s they have gained widespread acceptance and use in healthcare, in the aviation safety industry, and in emergency service organizations. Many of them focus on so-called cumulative act effects.
James Reason
James Reason hypothesizes that most accidents can be traced to one or more of four levels of failure: organizational influences, unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves. In this model, an organization's defences against failure are modelled as a series of barriers, each with individual weaknesses that continually vary in size and position, as in the Swiss cheese model. The system as a whole fails when the weaknesses in all of the barriers momentarily align, permitting "a trajectory of accident opportunity" in which a hazard passes through the holes in every layer of defence.[1][2]

The model distinguishes, within the causal sequence of human failures that leads to an accident or an error, between active failures and latent failures. Active failures are the unsafe acts that can be directly linked to an accident, such as (in the case of aircraft accidents) pilot errors. Latent failures are a particularly useful concept in aircraft accident investigation, since the concept encourages the study of contributory factors that may have lain dormant in the system for days, weeks, or months until they finally contributed to the accident. Latent failures span the first three levels of failure in Reason's model: preconditions for unsafe acts include fatigued air crew and improper communications practices; unsafe supervision encompasses, for example, pairing two inexperienced pilots and sending them on a flight into known adverse weather at night; and organizational influences encompass such things as reduced expenditure on pilot training in times of financial austerity.[2][3]
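The multiplicative character of the model can be illustrated with a short simulation. The sketch below is not from Reason's work or any of the sources cited here; it is a hypothetical Monte Carlo model in which each of the four levels of failure is treated as an independent barrier that fails with an assumed, purely illustrative probability, and an accident is recorded only when every barrier fails at once.

```python
import random

# Assumed, purely illustrative failure probabilities for the four
# levels of failure in Reason's model; they are not from any study.
LAYERS = {
    "organizational influences": 0.20,
    "unsafe supervision": 0.10,
    "preconditions for unsafe acts": 0.10,
    "unsafe acts": 0.05,
}

def hazard_penetrates(layers):
    """A hazard leads to an accident only when the holes in every
    defensive barrier line up, i.e. every layer fails at once."""
    return all(random.random() < p for p in layers.values())

def simulate(layers, trials=1_000_000):
    """Estimate the accident rate per hazard by Monte Carlo sampling."""
    accidents = sum(hazard_penetrates(layers) for _ in range(trials))
    return accidents / trials

if __name__ == "__main__":
    estimate = simulate(LAYERS)
    # With independent layers the exact rate is the product of the
    # individual probabilities: 0.20 * 0.10 * 0.10 * 0.05 = 1e-4.
    print(f"Estimated accident rate per hazard: {estimate:.1e}")
```

Under this independence assumption the overall accident rate is the product of the layers' failure probabilities, so strengthening any single barrier reduces the risk multiplicatively; latent failures correspond to holes that persist over time, which is exactly what undermines the independence assumption in real systems.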
The same analyses and models apply in the field of healthcare, and many researchers have provided descriptive summaries, anecdotes, and analyses of Reason's work in that field. For example, a latent failure could be the similar packaging of two different prescription drugs that are then stored close to each other in a pharmacy; such a failure would be a contributory factor in the administration of the wrong drug to a patient. Such research has led to the realization that medical error can be the result of "system flaws, not character flaws", and that greed, ignorance, malice, and laziness are not the only causes of error.[4] Accident models in general are discussed further by Taylor, Easter and Hegney.[5]
See also
- Healthcare error proliferation model
- Latent human error
- Root cause analysis
- Swiss cheese model
- System accident
- Systems engineering
References
1. Smith, D. R., Frazier, D., Reithmaier, L. W. and Miller, J. C. (2001). Controlling Pilot Error. McGraw-Hill Professional. p. 10. ISBN 0-07-137318-7.
2. Stranks, J. (2007). Human Factors and Behavioural Safety. Butterworth-Heinemann. pp. 130–131. ISBN 978-0-7506-8155-1.
3. Wiegmann, D. A. and Shappell, S. A. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate Publishing. pp. 48–49. ISBN 0-7546-1873-0.
4. Hinton-Walker, P., Carlton, G., Holden, L. and Stone, P. W. (2006). "The intersection of patient safety and nursing research". In J. J. Fitzpatrick and P. Hinton-Walker (eds.), Annual Review of Nursing Research, Volume 24: Focus on Patient Safety. Springer Publishing. pp. 8–9. ISBN 0-8261-4136-6.
5. Taylor, G. A., Easter, K. M. and Hegney, R. P. (2004). Enhancing Occupational Safety and Health. Elsevier. pp. 241–245; see also pp. 140–141 and 147–153. ISBN 0-7506-6197-6.