Academic Medicine: February 2005, Volume 80, Issue 2

Article

The SHEL Model: A Useful Tool for Analyzing and Teaching the Contribution of Human Factors to Medical Error

Molloy, Gerard J.; O'Boyle, Ciarán A. PhD


Author Information

Mr. Molloy is a research fellow at the School of Psychology, University of Aberdeen, Aberdeen, Scotland.

Dr. O'Boyle is professor of psychology, Department of Psychology, Royal College of Surgeons, Dublin, Ireland.

Correspondence should be addressed to Mr. Molloy, School of Psychology, University of Aberdeen, College of Life Sciences and Medicine, William Guild Building, University of Aberdeen, Aberdeen, AB24 2UB, Scotland; telephone: + 44-1-224-273141; e-mail: 〈g.molloy@abdn.ac.uk〉.


Abstract

Recent reports on the problem of medical error have pointed to a discipline that, until recently, was largely disregarded by the medical profession. The interdisciplinary science of Human Factors, the reports argue, provides a pragmatic framework for analyzing and assessing risk and reducing error in health care. The argument for applying Human Factors analysis to health care is increasingly accepted, and the application of Human Factors systems models to understanding medical error in particular has proved especially illuminating. The authors present a conceptual model of Human Factors, the SHEL model (named after the initial letters of its components' names: Software, Hardware, Environment, and Liveware), that has been used in investigations of error in aviation. The authors use this simple model to examine and elucidate the Human Factors issues in a specific real-life example of medical error. The SHEL model is particularly useful for examining Human Factors issues in health care microsystems such as the emergency room or the operating theatre; it holds that mismatches at the interfaces between the components of these microsystems are often conducive to medical error. The authors propose that the SHEL model may have some unexploited potential for analyzing error and for training medical professionals in the science of Human Factors and its application to medical error. Empirical studies are needed, however, to ascertain the optimal amount of training needed to make clinically significant reductions in the occurrence of medical error.

The development of the multidisciplinary science of Human Factors has taught us a great deal about the ways in which individuals and their working environment interact. Essentially, Human Factors concerns the relationship of individuals to machines and equipment, procedures, and the surrounding environment, as well as these elements' relationships with one another. In its groundbreaking report on medical error, the Institute of Medicine1 stated that "Human Factors is … the study of the interrelationships between humans, the tools they use and the environment in which they live and work." Its twin objectives can be summarized as maximizing the effectiveness of the system (which includes safety and efficiency) and the well-being of the individual.

Human Factors education and training have been greatly assisted by the development of models that attempt to elucidate and define the field and its applications. The literature on medical error and risk management draws heavily on the work of psychologist James T. Reason, particularly his model of accident causation.2–4 Such "system" approaches are increasingly accepted as crucial in efforts to tackle the problem of medical error.5–8

Several researchers have identified parallels between the aircraft cockpit and health care environments. Schaefer and Helmreich9–11 have pioneered the application of aviation Human Factors research to intensive medical settings, and their work is increasingly recognized as making a significant contribution to risk management and error prevention in health care.

In this article we introduce a conceptual model of Human Factors, the SHEL model,12 that is well known in aviation but has not, to the best of our knowledge, yet been applied in health care. The SHEL model outlines the training, environmental, and resource issues that could contribute to errors in "microsystems" such as the operating theatre or emergency room, rather than in the entire organizational structure (the "macrosystem") within which the individual has to work. This model was specifically designed and developed to aid flight crew training and to illustrate the principles of crew resource management in aviation, and is often used in the analysis of accidents and incidents.13 The model could provide a useful supplement to the widely cited Reason model5 in analyzing medical error and in teaching medical professionals about Human Factors. It is particularly relevant for analyzing the functioning of teams in highly technical environments such as operating theatres and hospital emergency rooms.


The SHEL Model

The SHEL concept—named after the initial letters of its components’ names, Software, Hardware, Environment, and Liveware—has been developed and advocated principally in the aviation literature by Hawkins.12 The model is often depicted in diagrammatic form as in Figure 1. Using this model, the relevant psychological and behavioral elements of the individual (i.e., the Human Factors), and the interactions between the individual and other components of the system, can be identified and examined. In the following paragraphs, we examine the SHEL model's components in more detail.

Figure 1. The SHEL model in diagrammatic form.

Liveware

In the center of the model is the human operator, or the Liveware, represented by the L. In health care, this might be a doctor, nurse, pharmacist, laboratory technician, or any individual whose job is relevant to patient care. This is the most valuable and most flexible component in the system, but it is subject to many variations in performance and suffers many limitations, most of which are now predictable in general terms. Individual factors include physical characteristics, personality, communication style, motivation, risk orientation, learning styles, stress tolerance, skills, knowledge, and attitudes. In medicine, the proposed shift from time-based training to competency-based training is relevant here. The process of defining areas of competence and mapping them onto the core curriculum at both undergraduate and postgraduate levels is designed to ensure that the individual is capable of functioning effectively and safely in a range of situations and environments.

Selection and recruitment procedures are also relevant, since even the best training systems cannot obviate all of the intrinsic limitations of individuals. The other components of the SHEL model must be adapted and matched to this component for optimum performance. The uneven edges of the blocks in the SHEL model illustrate that the interdependent components of complex systems will never match perfectly and that a perpetual effort should be made to improve the match of such systems. Problems tend to arise when there is a mismatch between one component and another at their interface.
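
The model's interface logic lends itself to a simple coding scheme for incident analysis. Below is a minimal sketch, in Python, of how the contributing factors in an incident report might be tagged with the SHEL interface on which they sit; the component codes, the Factor structure, and the example factors are all illustrative assumptions, not an established taxonomy.

```python
from dataclasses import dataclass
from enum import Enum


class Component(Enum):
    SOFTWARE = "S"     # procedures, checklists, computer programs
    HARDWARE = "H"     # equipment and physical tools
    ENVIRONMENT = "E"  # the physical (and broader) working conditions
    LIVEWARE = "L"     # the human operator(s)


@dataclass
class Factor:
    """One contributing factor, tagged with the interface it sits on."""
    description: str
    interface: tuple[Component, Component]  # (LIVEWARE, other component)


# Hypothetical factors from a single incident, for illustration only.
incident_factors = [
    Factor("Infusion pump defaults to mL/hr; clinician assumed mg/hr",
           (Component.LIVEWARE, Component.HARDWARE)),
    Factor("Protocol ambiguous about total vs. daily dose",
           (Component.LIVEWARE, Component.SOFTWARE)),
    Factor("Noisy, overcrowded ward during drug preparation",
           (Component.LIVEWARE, Component.ENVIRONMENT)),
    Factor("Dose never confirmed verbally between prescriber and pharmacist",
           (Component.LIVEWARE, Component.LIVEWARE)),
]

# List each factor under its interface so mismatch clusters stand out.
for factor in incident_factors:
    centre, other = factor.interface
    print(f"{centre.value}-{other.value}: {factor.description}")
```

Tagging factors this way makes it easy to see when the mismatches in a given incident cluster at one particular interface.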

Liveware–Hardware.

The first component that must be matched with the characteristics of the human operator is the Hardware, and much of the science of ergonomics is concerned with this interface. The introduction and subsequent development of monitoring systems in anesthesia provide a good example of the importance of this interface in maximizing patient safety.14 Hawkins12 highlights the danger of the natural human tendency to adapt to Liveware–Hardware mismatches by developing what are commonly called "tricks of the trade." This strategy can conceal potential hazards: experienced health care professionals may work around them easily, but they may pose a great risk to junior staff, who lack the experience to overcome the challenges posed by poor equipment design.

Liveware–Software.

The Liveware–Software interface encompasses the nonphysical aspects of the system, such as procedures, manuals, checklists, symbology, and, increasingly, computer programs. The problems here are often less tangible than those associated with the Liveware–Hardware interface and more difficult to resolve. In aviation, a great deal of attention is paid to this interface, since pilots must operate in a work environment dominated by standard operating procedures and ubiquitous checklists. An adequate Liveware–Software interface should produce a situation in which procedural omissions are very difficult to make. A good example of a sophisticated Liveware–Software interface achieved in the wake of a fatal error is the requirement, introduced at Dana-Farber after the fatal chemotherapy overdoses at that facility in 1995, that all drug prescriptions be processed through a computer system designed to check dosage, among other factors, against a standard protocol.15
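
The kind of software interlock described above can be illustrated with a short sketch. The protocol table, drug name, and dose limits below are hypothetical, chosen only to show how a prescription check can make a procedural omission difficult to commit; this is not a description of the actual Dana-Farber system.

```python
# Hypothetical protocol limits (mg per m2 of body surface per day); these
# numbers are placeholders for illustration, not clinical guidance.
PROTOCOL_MAX_DAILY_DOSE_MG_PER_M2 = {
    "cyclophosphamide": 1000,
}


def check_prescription(drug: str, daily_dose_mg: float,
                       body_surface_area_m2: float) -> None:
    """Raise instead of silently accepting an out-of-protocol dose."""
    limit = PROTOCOL_MAX_DAILY_DOSE_MG_PER_M2.get(drug)
    if limit is None:
        raise ValueError(f"No protocol entry for {drug}; manual review required")
    dose_per_m2 = daily_dose_mg / body_surface_area_m2
    if dose_per_m2 > limit:
        raise ValueError(f"{drug}: {dose_per_m2:.0f} mg/m2/day exceeds the "
                         f"protocol limit of {limit} mg/m2/day")


# Within the (hypothetical) limit: passes silently.
check_prescription("cyclophosphamide", daily_dose_mg=900,
                   body_surface_area_m2=1.6)

# A gross overdose is blocked before it can reach the pharmacy.
try:
    check_prescription("cyclophosphamide", daily_dose_mg=4000,
                       body_surface_area_m2=1.6)
except ValueError as err:
    print("Prescription rejected:", err)
```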

Liveware–Environment.

One of the earliest interfaces recognized in aviation was that between the Liveware and the Environment. Pilots were fitted with helmets, goggles, flying suits, and oxygen masks to adapt them to the Environment. As aviation progressed, the process was reversed: through pressurization, air conditioning, and soundproofing systems, the Environment was adapted to match human requirements. The Liveware–Environment interface may be particularly relevant in some medical contexts, such as the emergency room, where staff experience environmental challenges far beyond those found in other branches of clinical medicine.16 An emergency department is host to environmental stressors such as noise, poor acoustics, overcrowding, intoxicated individuals, irate patients and relatives, and frequent disturbances from bleeps and intercoms, all of which can interfere with safe patient care. In the original SHEL model, the Environment was characterized largely as the physical environment; however, the political, cultural, and financial environment may also be important here. Managerial decisions based on a challenging financial environment, for example, may create conditions in which staff are overworked and underresourced, fostering what Reason would call latent failures in the system, which may at some point combine with active failures to create an error.5,8,9

Liveware–Liveware.

The Liveware–Liveware interface is the interface between people; interpersonal communication is its most obvious example.17 Traditionally, in aviation, questions about performance focused on the characteristics of the individual pilot or crew member. However, modern approaches focus on breakdowns in teamwork and in the systems that ensure safety through redundancy. The Liveware–Liveware interface is concerned with leadership, team coordination and cooperation, personality interactions, status hierarchies, conflict resolution, and, particularly important in health care, continuity of information flow in patient care. The Liveware–Liveware interface should allow the efficient flow of important information between individuals or agencies. Shortcomings here result in situations where important information is not adequately disseminated or is difficult to obtain.
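
One concrete safeguard at this interface is a structured handoff that cannot be completed while required information is missing. The sketch below is illustrative only: the field list is an assumption made for the example, not a published handoff standard.

```python
# Required fields for a shift handoff; the list is illustrative only.
REQUIRED_FIELDS = ("patient_id", "diagnosis", "current_medications",
                   "outstanding_tasks", "escalation_contact")


def missing_fields(record: dict) -> list[str]:
    """Return the required fields that are absent or empty."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]


handoff = {
    "patient_id": "A1234",
    "diagnosis": "community-acquired pneumonia",
    "current_medications": ["amoxicillin"],
    # "outstanding_tasks" omitted: the handoff should be blocked.
    "escalation_contact": "on-call registrar, bleep 421",
}

gaps = missing_fields(handoff)
if gaps:
    print("Handoff blocked; missing:", ", ".join(gaps))
else:
    print("Handoff complete")
```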

For an example of the application of the SHEL model to a case of fatal medical error, see the Appendix.


The Model's Value in Reducing Error and in Education

Applying the SHEL model to high-technology team situations in health care delivery paints a clearer picture of the importance of Human Factors in medical settings. The model focuses attention on Human Factors that are relevant to "microsystems" in the medical arena, particularly the work of the more intensive and team-oriented branches of medicine. Analyzing errors or "near-miss" incidents through Human Factors approaches that emphasize only the role of the broader macrosystem or organization often results in vague conclusions akin to Marcellus's proclamation in Hamlet: "Something is rotten in the state of Denmark." This type of infinite upward regress through the system in search of the cause of a particular medical error is sometimes of questionable value in reducing the chance of the error recurring. While broader organizational failures are uncovered in many cases of medical error, an overemphasis on broader system failures may preclude discovery of the more proximal determinants of error, which may be more amenable to change and more likely to result in immediate risk reduction. The SHEL model helps reveal these more proximal determinants. And, like the Reason model of error,2 it highlights the complexity of human–system interactions and emphasizes once again that, in attempting to understand and counteract error, one must do much more than simply blame and punish the "front-end operator."

The SHEL model provides a useful visual aid for teaching medical students and residents about the role of Human Factors in analyzing and assessing risk and reducing error in health care. Students could use the model as a guide to tease out the Human Factors issues in specific examples of medical error, such as those described in this article. Indeed, given the magnitude of the problem of medical error, there may be an argument for making Human Factors a mandatory part of medical curricula, particularly in those branches of medicine where practice is more prone to mismatches between Human Factors components, such as the Liveware–Hardware interface in anesthesia. While empirical studies are lacking, one could hypothesize that Human Factors training for medical professionals may be an effective means of reducing medical error. Such studies are clearly recommended in order to ascertain the optimal amount of training needed to make clinically significant reductions in the occurrence of medical error.


References

1 Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 1999.

2 Reason J. Human error: models and management. BMJ. 2000;320:768–70.

3 Reason JT. Managing the Risks of Organizational Accidents. Aldershot: Ashgate, 1997.

4 Reason JT. Understanding adverse events: the human factor. In: Vincent CA (ed). Clinical Risk Management. 2nd ed. London: BMJ Books, 2001:9–30.

5 Department of Health. An organisation with a memory: report of an expert group on learning from adverse events in the NHS. London: DOH, 2000.

6 Glavin RJ, Maran NJ. Integrating human factors into the medical curriculum. Med Educ. 2003;37(11 suppl):S59–64.

7 Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998;316:1154–7.

8 Larson EB. Measuring, monitoring, and reducing medical harm from a systems perspective: a medical director's personal reflections. Acad Med. 2002;77:993–1000.

9 Helmreich RL, Musson DM. Surgery as a team endeavour. Can J Anaesth. 2000;47:391–2.

10 Schaefer HG, Helmreich RL, Scheidegger D. Human factors and safety in emergency medicine. Resuscitation. 1994;28:221–5.

11 Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320:781–5.

12 Hawkins FH. Human Factors in Flight. Aldershot: Ashgate, 1994.

13 Swiss Federal Department of Environment, Transport, Energy and Communications. Investigation Report of the Aircraft Accident Bureau on the accident to aircraft AVRO 146-RJ100, HB-IXM, operated by Crossair under flight number CRX 3597, on 24 November 2001 near Bassersdorf/ZH. Zurich: Federal Department of Environment, Transport, Energy and Communications, 2004.

14 Ziegler J. A medical specialty blazes a trail. In: Findlay S (ed). Reducing Medical Errors and Improving Patient Safety. The National Coalition on Health Care and the Institute for Healthcare Improvement, 2000:26–8 〈http://www.nchc.org/releases/medical_errors.pdf〉. Accessed 31 May 2004.

15 Paul C. Back from the brink: making chemotherapy safer. In: Findlay S (ed). Reducing Medical Errors and Improving Patient Safety. The National Coalition on Health Care and the Institute for Healthcare Improvement, 2000:4–8 〈http://www.nchc.org/releases/medical_errors.pdf〉. Accessed 31 May 2004.

16 Driscoll P, Thomas M, Touquet R, Fothergill J. Risk management in accident and emergency medicine. In: Vincent CA (ed). Clinical Risk Management. 2nd ed. London: BMJ Books, 2001:151–74.

17 Kalet A, Pugnaire MP, Cole-Kelly K, et al. Teaching communication in clinical clerkships: models from the Macy initiative in health communications. Acad Med. 2004;79:511–20.

Appendix. Application of the SHEL model to a case of fatal medical error.


© 2005 Association of American Medical Colleges
