Anesthesia & Analgesia, February 2010, Volume 110, Issue 2
doi: 10.1213/ANE.0b013e3181c920cc
Editorial

Bring Your Life into FOCUS!

Spiess, Bruce D. MD, FAHA*; Wahr, Joyce A. MD†; Nussmeier, Nancy A. MD‡


Author Information

From the *VCURES Shock Center, Virginia Commonwealth University Medical Center, Richmond, Virginia; †Society of Cardiovascular Anesthesiologists Foundation, Richmond, Virginia; and ‡Department of Anesthesiology, SUNY Upstate Medical University, Syracuse, New York.

Accepted for publication October 8, 2009.

The FOCUS initiative is a collaborative project of the Society of Cardiovascular Anesthesiologists, the SCA Foundation, and the Johns Hopkins University Quality and Safety Research group. FOCUS is funded exclusively by the SCA Foundation.

Address correspondence and reprint requests to Bruce D. Spiess, MD, FAHA, Virginia Commonwealth University Medical Center, Richmond, VA. Address e-mail to bdspiess@hsc.vcu.edu.

All of us make mistakes. The nature of being human is to err. John Nance, speaking at the Society of Cardiovascular Anesthesiologists (SCA) 29th Annual Meeting (April 22, 2007, Montreal, Canada) during the opening address of the Flawless Operative Cardiovascular Unified Systems (FOCUS) initiative, noted, "although individuals may make mistakes, it is possible for teams to be flawless." Although this concept has been embraced by other high-risk industries, notably aviation and nuclear power, it has not yet been embraced by medicine. Cockpit (crew) resource management (CRM) is a key facet of the safety culture in aviation.1–3 CRM represents a culture of safety within the entire team responsible for servicing, preparing, and flying the aircraft. It includes the expectation that anyone who notes anything untoward is to speak up, be heard, and know that they will be taken seriously, for the safety of all.1–3 It recognizes that no individual, no matter how high his or her rank, is infallible. Checklists, guidelines, and standardizations are all part of CRM. Above all, teamwork through communication toward a common goal, safety, is paramount. Each team member creates crosschecks through briefings, debriefings, expectations, and supervision in the overall spirit of creating a flawless outcome. CRM is therefore a culture, a unified human system, by which teams can operate flawlessly!

FOCUS represents a progressive, revolutionary leadership initiative for the SCA. The acronym for this program was chosen carefully, because it is only through teamwork that we can be flawless. In this editorial, we summarize the vision and progress that is the FOCUS initiative.

The goal of the FOCUS initiative is to substantially decrease the incidence and severity of human error in the cardiac operating room through scientific analysis leading to culture change.


HUMAN ERROR

Let us frame the problem of human error in medicine. Since the late 1990s, human errors in medicine have gained the interest of the lay press.4–8 It has been estimated that 44,000 to 98,000 lives are lost every year in US hospitals because of medical mistakes.4,5 That is almost certainly a gross underestimate.6–8 Worldwide, the number may exceed 500,000 lives! In aviation terms, that would mean the crash of a fully loaded Boeing 747 every other day for 1 year (to account for the known statistics in the United States alone). What would the public outcry be if such events actually occurred in commercial aviation?
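As a rough check of that comparison (our arithmetic, not the original sources'): a crash every other day is approximately 180 crashes per year, and a fully loaded 747 carries on the order of 500 passengers, so the analogy accounts for roughly 90,000 deaths, near the upper end of the US estimate.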

By no means does America have a corner on the human error market. Recognizing the damage caused by human errors in surgery, the World Health Organization has made reduction of this specific problem one of its goals since 2005.9,10 However, despite these lofty ideals, rarely does a week go by without print or broadcast media (e.g., Newsweek, USA Today, or CNN) reporting another story on human error in medicine.

The aviation and nuclear power industries are held up as examples in which human error has been dramatically reduced, if not virtually eliminated. It was not always that way, however! The single largest airline casualty event was caused by 2 fully loaded 747s colliding on a runway at Tenerife in 1977 because of a misinterpreted tower communication. The near meltdown at Three Mile Island and the catastrophic events at Chernobyl were caused by human errors. However, after those errors, these high-risk industries undertook human error analysis to determine how human errors occur and to put policies and processes into place to prevent simple errors from becoming catastrophes.

Back to Top | Article Outline

HUMAN FACTORS ANALYSIS OR ENGINEERING

Human factors analysis is a complex and evolving science.1 Investigators examine not just what error occurred, but why the events happened, exploring human-to-human interactions, human-to-machine interfaces, and the circumstances that set up a well-meaning and well-trained individual to make an error. Human factors analysis focuses on the process that permitted the error, rather than on the individual who made it. It also recognizes that an accident is caused by the culmination of a series of events, rather than by a single, isolated error.

Every person reading this article can reflect on the last, or most memorable, of his or her personal errors. An example might be a drug error during the administration of fentanyl, wherein another "blue syringe," this one containing Neo-Synephrine (phenylephrine), is injected. A sudden blood pressure increase might result, and the patient might develop a small parietal intracerebral hemorrhage after surgery. No death occurs. Small strokes are not uncommon after heart surgery. What was the cause of this adverse outcome, the syringe swap error? Actually, perhaps 5 to 10 human errors led to that single event (a drug error).

Figure 1 depicts the "Swiss Cheese" analogy of the error process. Only when the holes, i.e., a series of human events or errors, line up in series does an adverse outcome result. It is important to understand why a highly trained individual committed the error and how other events lined up to permit it. If human factors failed the individual and allowed such an event, then a true culture of safety, using human factors analysis, would examine not just the error but also the circumstances that set up the event, and could then build systems to overcome the deficiencies. In-depth understanding of the human factors that led to an error can help us create systems to prevent it.
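To make the analogy concrete with purely hypothetical numbers: if 5 independent layers of defense each fail once in 100 cases, the holes align, and an adverse outcome results, only once in 10 billion cases (0.01^5 = 10^-10); remove a single layer, and that risk increases a hundredfold. Strengthening any one layer, or adding another, dramatically reduces the chance that an error becomes a catastrophe.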

Figure 1. The "Swiss Cheese" model of the error process.

HUMAN ERROR ANALYSIS IN MEDICINE

Anesthesiology has led medicine in improving patient safety and has been rewarded in kind by a reduction in malpractice premiums.11 Advances in technology, such as the pulse oximeter, have improved patient safety in our specialty, often by catching human errors before they cause harm. By warning of hypoxemia, the pulse oximeter alerts the anesthesiologist to the error of esophageal intubation.

The Anesthesia Patient Safety Foundation has led medicine through research, education, simulation, the establishment of practice guidelines, and other methods. The American Society of Anesthesiologists Closed Claims Project is a prime example of learning through retrospective analysis of adverse outcomes: closed claims analysis helps us understand the patient and practice risk factors that contribute to bad outcomes. Such work is laudable and important, but the complexity of human factors may not be evident from retrospective analysis. In addition to these national efforts, guidelines, scenario training and simulation, better equipment, and more rigorous training have been developed to improve patient safety. Nevertheless, human errors persist.

Why, with such efforts, has medicine (and anesthesiology in particular) not been able to duplicate the successes seen in aviation? Perhaps most importantly, although human error has been studied in operating rooms,12–18 human systems in medicine have rarely been systematically studied to the depth at which commercial aviation (or nuclear power, interstate transportation, construction crews, or even the assembly lines of automobile manufacturers) has been studied.19 Traditionally, patient safety research within hospitals is driven by quality assurance committees reviewing sentinel events. Themes of major, recurring sentinel events may drive change that is highly influenced by local politics. Human error may be dealt with by disciplinary action and/or the shunning and dismissal of highly trained individuals.

Although the rate of adverse outcomes has been shown to be lower in operating rooms where patient safety is highly valued (Fig. 2), the motivation for change may be to reduce malpractice payouts rather than to change the local culture to one of safety.20 Cardiac operating rooms, in particular, are incredibly complex environments in which a number of highly trained professionals interact with each other and with a wide range of highly sophisticated electronic equipment to provide care for a single patient. Few systematic studies of human error/factors in cardiac anesthesiology or surgery have been performed across multiple institutions or internationally; the studies that have been done in cardiac and trauma surgery have each been within a single institution with its own unique culture.15,16

Figure 2. Adverse outcome rates are lower in operating rooms where patient safety is highly valued.

All of us caring for cardiac patients have factors that affect our ability to be at peak performance: emotional stresses, fatigue, ability to concentrate, situational awareness, and so on. These normal stressors may result in a lack of concentration, leading to error. These factors are well described in aviation, transportation, and other high-risk industries but have not been systematically researched in the operating room.

Other high-risk industries have adopted national error and near-miss reporting systems that drive process improvement. The transfusion community makes use of the Eindhoven classification model, a near-miss taxonomy originally developed for the chemical industry. This analysis classifies near misses under 3 major headings: technical (equipment and software), organizational (oversight, regulations, and guidelines), or human (violations or failure to follow established rules). The use of such a system would guide processes to improve patient safety. There are no such nonpunitive, anonymous, national error-reporting systems in surgery or anesthesiology.
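As a purely illustrative sketch (not the full Eindhoven model, which subdivides each category further), the 3 top-level headings might be represented in a reporting system as follows; the report text and field names here are hypothetical:

from collections import Counter
from dataclasses import dataclass
from enum import Enum

class EindhovenCategory(Enum):
    # The three top-level headings described in the text
    TECHNICAL = "technical"            # equipment and software
    ORGANIZATIONAL = "organizational"  # oversight, regulations, guidelines
    HUMAN = "human"                    # violations or failure to follow rules

@dataclass
class NearMissReport:
    description: str                   # free-text narrative (hypothetical examples below)
    category: EindhovenCategory

reports = [
    NearMissReport("Infusion pump froze mid-case", EindhovenCategory.TECHNICAL),
    NearMissReport("No protocol for double-checking blood products", EindhovenCategory.ORGANIZATIONAL),
    NearMissReport("Syringe label not read before injection", EindhovenCategory.HUMAN),
]

# Tallying near misses by category is what lets a reporting system
# aim process improvement at the dominant failure mode.
tally = Counter(report.category for report in reports)
for category, count in tally.most_common():
    print(f"{category.value}: {count}")

A real system would, of course, add anonymity, national aggregation, and protection from punitive use, the very features noted above as missing in surgery and anesthesiology.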

Although it is tempting to conclude that operating rooms would suddenly improve by adopting the CRM process, reducing human error in the cardiac operating room is not as simple as telling anesthesiologists, surgeons, perfusionists, and nurses to be better communicators. Communication may be one of the major areas of focus, but should efforts to improve patient safety be directed only at CRM, or also at other systems and processes? For example, we must also know whether there are latent, hidden inadequacies within any of the machine/human interfaces that could result in error. Acquisition of proper and modern equipment, syringe labels that cannot be misinterpreted, or operating room designs with proper ergonomic function might be extremely important. Other areas for improvement might well include administrative efforts to provide better equipment, changes in personnel numbers, shift and work structure, or call schedules that make work more efficient or safety oriented (Table 1). Clearly, improvements in safety must be driven by research and data.

Table 1. Potential areas for improvement in the cardiac operating room.

THE FOCUS INITIATIVE

FOCUS is a progressive, revolutionary leadership initiative for the SCA. For its 30-year history, the SCA has functioned as a research- and education-facilitating society; in a larger sense, through education and by advancing knowledge, patient safety is continuously improved. This mission cannot be diminished, but, by embracing the FOCUS initiative, the SCA leadership and the SCA Foundation have charted a bold new course: one whereby, through research and implementation, they can improve patient safety. The SCA plans to lead the way in creating a culture of safety in the cardiovascular operating room, wherein Flawless (teams working together), Unified (measures we can all adopt) Systems (human interactive programs) will result.

The FOCUS initiative has been in its formative stages for >5 years, with the intent to study surgeon/anesthesiologist communication and how a whole team and its individual parts work (or don't work). The FOCUS Steering Committee articulated the goals of the initiative in a request for proposals (Spring 2006). A number of responses were forthcoming, leading to 6 finalists who were interviewed. The Johns Hopkins Quality and Safety Research Group, under the direction of Peter Pronovost, MD, was contracted to conduct the research and collaborate on implementation of the findings. The Johns Hopkins group (an extensive multidisciplinary team) has an international reputation for scientific leadership, publication, and success in patient safety. An accompanying commentary by Martinez et al.21 explains the LENS (Locating Errors through Networked Surveillance) processes to be used. Along the way, the SCA and the SCA Foundation have been joined by representatives from cardiothoracic surgery (the Society of Thoracic Surgeons and the American Association for Thoracic Surgery), nursing (the Association of Operating Room Nurses), and perfusion (the American Society of ExtraCorporeal Technology). The process has also had the valuable consulting services of Scott Shappell, PhD (Clemson University), who has a unique background in defense department (naval aviation) air accident investigation, civilian human factors accident investigation (TWA flight 800), and, most recently, medical human factors work. FOCUS is fortunate to have found such team members.

How, then, will FOCUS achieve the goal stated earlier: to substantially decrease the incidence and severity of human error in the cardiac operating room through scientific analysis leading to culture change? First and foremost is to study the current processes, cultures, and human factors at play in cardiac operating rooms; research and data must drive this effort. As described in the accompanying article from the Johns Hopkins University Quality and Safety Research Group,21 a thorough review of the literature and error databases was conducted to guide the observational research plan. Then, 5 initial cardiac surgical sites (plus Johns Hopkins University as a test site for the surveys) were selected, and observations and surveys were performed. Now, with the first wave of observations concluded, analysis of all of the accumulated data is under way. Through this analysis, preliminary recommendations for interventions will be created. The interventions will then be tested at newly selected cardiac surgical sites, using well-defined methods for measuring human error.

At this time, we cannot presume to know what the research will find, but interventions will not be based on supposition or armchair guessing. If an intervention cannot be tested, observed, and supported by data, it will not go forward. Parallel development of self-study programs or peer-to-peer assessment will allow all operative sites represented in the SCA to become familiar with the concepts and improvements identified by FOCUS. Any future successful reduction in human error will depend on the solid groundwork laid by careful and thorough research.

The task is immense. The SCA is committed to this goal over the long haul and has contributed significant financial support to perform the first phase of research. In addition to financial support, SCA members can participate by being on FOCUS committees and by participating in the next phase of research. Every SCA member can and should get involved with FOCUS. We owe this to our patients, who expect (and rightfully so) Flawless Operative Cardiovascular care. Individuals will always make errors, but teams who develop a culture of safety through FOCUS can be flawless. The cause has charisma!22


REFERENCES

1. Wiegmann DA, Shappell SA. A human error approach to aviation accident analysis: the human factors analysis and classification system. Burlington, VT: Ashgate Publishing Company, 2003:1–165

2. Helmreich RL, Merritt AC, Wilhelm JA. The evolution of Crew Resource Management training in commercial aviation. Int J Aviat Psychol 1999;9:19–32

3. Barker JM, Clothier CC, Woody JR, McKinney EH, Brown JL. Crew resource management: a simulator study comparing fixed versus formed aircrews. Aviat Space Environ Med 1996;67:3–7

4. Kohn LT, Corrigan JM, Donaldson MS, eds. To err is human: building a safer health system. Washington, DC: National Academy Press, 1999

5. Brennan TA, Leape LL, Laird NM, Hebert L, Localio AR, Lawthers AG, Newhouse JP, Weiler PC, Hiatt HH. Incidence of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study I. N Engl J Med 1991;324:370–6

6. Leape LL, Berwick DM. Five years after To Err is Human: what have we learned? JAMA 2005;293:2384–90

7. Altman DE, Clancy C, Blendon RJ. Improving patient safety—five years after the IOM report. N Engl J Med 2004;351:2041–43

8. Stelfox HT, Palmisani S, Scurlock C, Orav EJ, Bates DW. The "To Err is Human" report and the patient safety literature. Qual Saf Health Care 2006;15:174–8

9. Sherman H, Loeb J. Project to develop the international patient safety event taxonomy: updated review of the literature 2003–2005. WHO World Alliance for Patient Safety. Geneva, Switzerland: World Health Organization, 2005

10. Merry AF. Ct28 creating the “error environment”. ANZ J Surg 2007;77(suppl 1):A13

11. Stoelting RK, Khuri SF. Past accomplishments and future directions: risk prevention in anesthesia and surgery. Anesthesiol Clin 2006;24:235–53

12. Guerlain S, Adams RB, Turrentine FB, Shin T, Collins SR, Calland JF. Assessing team performance in the operating room: development and use of a "black box" recorder and other tools for the intraoperative environment. J Am Coll Surg 2005;200:29–37

13. Lanier W. A three-decade perspective on anesthesia safety. Am Surg 2006;72:985–9

14. Pate-Cornell ME, Lakats LM, Murphy DM, Gaba DM. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options. Risk Anal 1997;17:511–23

15. MacKenzie CF, Jeffries NJ, Bernhard WN, Xiao Y. Comparison of self-reporting of deficiencies in airway management with video analyses of actual performance. LOTAS group. Level One Trauma Anesthesia Simulation. Hum Factors 1996;38:623–35

16. ElBardissi AW, Wiegmann DA, Dearani JA, Daly RC, Sundt TM III. Application of the human factors analysis and classification system methodology to the cardiovascular surgery operating room. Ann Thorac Surg 2007;83:1412–9

17. Carthey J, deLeval MR, Reason JT. The human factor in cardiac surgery: errors and near misses in a high technology medical domain. Ann Thorac Surg 2001;72:300–5

18. Christian CK, Gustafson ML, Roth EM, Sheridan TB, Gandhi TK, Dwyer K, Zinner MJ, Dierks MM. A prospective study of patient safety in the operating room. Surgery 2006;139:159–73

19. Van der Schaaf TW, Lucas DA, Hale AR, eds. Near miss reporting as a safety tool. Oxford, UK: Butterworth-Heinemann, 1991

20. Makary MA, Sexton JB, Freischlag JA, Millman EA, Pryor D, Holzmueller C, Pronovost PJ. Patient safety in surgery. Ann Surg 2006;243:628–35

21. Martinez EA, Marsteller JA, Thompson DA, Gurses AP, Goeschel CA, Lubomski LH, Kim GR, Bauer L, Pronovost PJ. The Society of Cardiovascular Anesthesiologists' FOCUS initiative: Locating Errors through Networked Surveillance (LENS) project vision paper. Anesth Analg 2010;110:307–11

22. Porras J, Emery S, Thompson M. The cause has charisma. Leader Leader 2007;43:26–31

© 2010 International Anesthesia Research Society
