
ETHICS, ECONOMICS AND OUTCOMES: Edited by Edoardo De Robertis

Errors in medicine: punishment versus learning medical adverse events revisited – expanding the frame

Brattebø, Guttorm; Flaatten, Hans Kristian

Current Opinion in Anaesthesiology 36(2):240–245, April 2023. DOI: 10.1097/ACO.0000000000001235


INTRODUCTION

Modern medicine is extremely complicated, and the margins between effective treatment and complications are narrow. Moreover, as therapeutic options and technological advances constantly push the limits of what treatments can be offered for a given condition, new risks keep emerging. According to the theory of ‘bad apples’, one might think that safety problems in healthcare can be solved by telling personnel to be more careful and compliant with guidelines and protocols, or by reprimanding those involved in adverse events. This view rests on the assumption that the problem is caused by a few ‘bad’ healthcare workers. As harm to patients accumulates, often as a result of certain actions, or the lack of them, the need for remedial measures grows. Hence, when something goes wrong, a natural reaction is to focus on the individual healthcare personnel responsible for the treatment, in particular immediately prior to or during the specific event causing injury. This may lead to disciplinary actions for wrongdoing.

Frequently, when cases are analysed in the aftermath, harm to patients is judged as preventable, and most often the causes are labelled ‘human errors’. The nuance of language in patient safety is important, and using the word ‘error’ as a post hoc construction often implies the need for correction and punitive action [1]. From one perspective, this serves our deep wish for a plausible explanation of what went wrong and why. From another perspective, however, there is a tendency to neglect the more subtle and hard-to-identify factors lying hidden and latent in the system, just waiting for the right conditions to occur. In recent years, numerous studies and publications have addressed these problems, emphasizing the need to learn from adverse events in order to reduce future risk [2▪,3▪▪].


COMPLEXITY OF HEALTHCARE

One alternative and more intriguing way of looking at healthcare is to view it as a complex adaptive system [4]. Such systems consist of large numbers of interdependent parts, influenced by numerous interdependent forces. The relationships between these factors are often nonlinear and discontinuous: minute changes in one part of the system may have small impacts at a given time, but major impacts under other conditions. More generally, the effect of changing one variable can vary from negligible to large, depending on the state of other variables. Because these systems seem to live lives of their own, one cannot foresee every possible result of their actions with certainty. Examples of such systems are the stock market, the human immune system and healthcare. Thus, one must accept the possibility of new, unexpected or unforeseen events, despite numerous measures, such as protocols and checklists, that try to harness the variability within the system.

Too many protocols may even reduce safety in emergencies, when there is no time to look up every detail of a procedure or guideline. There may also be local peculiarities and cultures in various units, departments, organizations and hospitals, and other environmental factors. Nevertheless, some risk areas, ‘where the ice is thin’, have been identified, such as patient transfers and transitions. In these circumstances, sober use of tools such as checklists and routine descriptions seems indicated [5]. Still, the possibility of new risks must be kept in mind.

SAFETY CULTURE AND SAFETY CLIMATE

There is broad agreement that healthcare organizations should develop a culture of safety, often defined as the result of the values, attitudes, perceptions and behaviour within a certain department or hospital [6]. This culture determines the commitment, style and proficiency of an organization's continuous effort to minimize patient harm, based on shared norms. A safety culture includes a wish to learn from both success and failure, recognizing that both are products of the same system. A culture of safety also embraces mutual trust and openness, welcoming reports of both ‘good catches’ and adverse events [2▪,3▪▪].

James Reason described various types of organizational cultures based on his extensive research, including in healthcare, and we would like to mention two [7]:

A learning culture: A culture in which an organization can learn from its mistakes and make changes. Employees are willing and able to draw the right conclusions from available safety information, and have the will to implement mitigating corrections and changes.

A just culture: A culture in which an atmosphere of trust is present and people are encouraged, or even rewarded, for providing essential safety-related information, but where there is also a clear line between acceptable and unacceptable behaviour.

On a theoretical basis, it is possible to divide actions into groups such as lapses, unsafe behaviour and recklessness, but this is harder in the real world, because it is nearly impossible to elucidate every factor that might have influenced the actual event.

The natural reaction after an adverse event may be self-blame, blaming others, or reluctance to acknowledge that an adverse event has occurred at all. Often a systems view is lacking, so that it is the individual health professional at the ‘sharp end’ who is investigated, rather than the environment in which the health services were delivered [8]. There may also be a lack of follow-up, both of the patient and relatives and of the personnel involved. Sometimes some form of compensation should be offered, but national regulations may require that someone be declared ‘guilty’. Other healthcare systems, such as Norway's, have some form of no-fault compensation [9].

CULTURE CHANGE

Changing the culture of a system is challenging and takes time. The actual behaviour of personnel, and the outcomes for patients, can be seen as results of the organizational culture [10▪,11]. Values, especially, seem to predict behaviour. However, there may be substantial disagreement and differences between the actual values and the espoused values. The latter are found in public documents and express the values that management believes should be present, while the actual values are expressed through the actions of each individual in the organization. These represent the actual culture of the organization. The distinction between safety culture and safety climate is hard to delineate, but safety climate is often seen as the expression of the safety culture within an organization [12]. Several questionnaire tools are available for measuring safety climate in an organization or a department [10▪,13,14]. However, more important than the scores on an instrument at a certain time is the change over time, and results from a questionnaire must always be evaluated in context, such as recent changes in policies or organization. Implementing principles from so-called High Reliability Organizations (HROs), such as nuclear power and aviation, has been suggested as one effective way of promoting a safety culture, but the level of evidence is low [15]. This may be caused by inconsistent and even conflicting understandings of HRO concepts, often related to social and professional norms and practices, combined with an individualized rather than a systemic approach [16].

REPORTING OF ADVERSE EVENTS

Given the idea of learning from mistakes, a system for reporting and analysing adverse events is often seen as a key attribute of a learning organization [10▪,15,16]. This rests primarily on the assumption that careful analysis of a reported event may uncover learning points from which future risks can be reduced. As mentioned above, this is challenging, and an alternative approach is to ask those working at the sharp end to identify where they believe the potential risks lie. Unfortunately, fear of sanctions is the main reason for underreporting [8]. In an organization with a ‘just culture’, individuals are genuinely eager to learn from events that may represent a safety threat, because there is no wish to assign blame [2▪,17,18].

Nevertheless, the key issue is the organization's ability to act on such information, thereby proactively trying to reduce risks. Analysis of events that have already occurred carries the inherent risk of hindsight bias [19]: our tendency to draw conclusions without respecting the subtle and unclear relations that often prevailed before the adverse event happened. Our well intended search for a reasonable explanation may block real insight into the weaknesses of a system that from the outside looks ‘idiot-proof’. We also tend to be harsher in our judgement of events with more serious outcomes [19]. We should rather search for all possible contributing factors than merely look for a single root cause. Safety in healthcare may be seen more as an ecosystem with numerous individual units operating on their own than as one big machine with many interconnected cogwheels in a set configuration [4].

The reader should remember that healthcare is constantly immersed in numerous demanding challenges, of which lack of sufficient resources is the most prominent. This may be lack of money for personnel or facilities, or inadequate funding for maintaining or replacing faulty equipment. Time pressure is also constant in most healthcare systems, as financial models are often based on performance goals and payment for (certain) services rather than on the actual patient load.

Theoretical frameworks, such as the efficiency–thoroughness trade-off (ETTO) principle, remind us that it is the same system that produces both the catastrophic adverse events and the medical victories [17]. The rather novel idea of resilience engineering in healthcare likewise builds on the idea that it is more fruitful to analyse the hazardous procedures or challenges that are actually handled well than to look into the past and search for the root causes of adverse events [20]. The former is often referred to as a Safety-II approach, while the latter (and more traditional) has been called a Safety-I approach. Although a Safety-II approach may sound more attractive, it also has its challenges [21▪▪].

HORIZONTAL AND VERTICAL REPORTING

A major problem is how to learn from adverse events. A multilevel model of learning was discussed more than 15 years ago [22]. In that model, learning was described at the individual, group and organizational levels. Although not specifically mentioned, learning at the organizational level requires dissemination of experience on a horizontal level. This may happen within one organization (hospital) or between clusters of hospitals.

According to this model, traditional methods for learning from reports are insufficient. Just reporting adverse events vertically in the organization may not reach other groups who need to know. Achieving learning from adverse events across units and departments is difficult, and horizontal communication of adverse events and their contributing factors is challenging. A real problem in many hospitals is that when a safety concern is identified, important safety information may remain confined to the ‘silo’ in which it was reported. Our health trusts have tried to counteract this by introducing a system for horizontal (lateral) reporting in addition to the conventional vertical course of reports.

In the health trust of Western Norway, with its 1.1 million inhabitants and four large hospitals including one regional hospital, we have started a pilot project called ‘horizontal learning’. Certain adverse events with a particular potential to harm patients, reported in one hospital, can automatically be sent to the other hospitals for information through a common event reporting system. In this way, learning from the event can be deeper and more widely disseminated within our health trust. Of course, this system requires that reports are spread to all who might benefit from the information, and that the total information load does not become too great. A similar system operates in aviation, where problems with a particular airplane are immediately shared with all airlines operating similar airplanes. The sheer number of possible adverse events in healthcare makes the two types of organizations quite dissimilar, but some of aviation's experiences can still be transferred to healthcare [23].

ACCOUNTABILITY, FAIRNESS AND TRUST IN THE SYSTEM

As mentioned, a culture of safety is based on the individual's willingness to report, or speak up about, factors that may endanger patients or personnel, often referred to as a just culture [3▪▪]. One basic assumption is that there should be no fear of punishment from the system. On the one hand, this is obvious, because then there is no risk attached to reporting. On the other hand, there may be concerns that a totally nonpunitive system could open the door to recklessness, because individuals would be immune to disciplinary sanctions regardless of their actual role in the event. Recklessness as such is perhaps not so difficult to identify, because this is behaviour that will not be tolerated in a well functioning, safety-conscious team [11,15,16].

Building trust in the system is important, not only for healthcare workers but also for patients and their families [3▪▪,4]. Depending on their position in the system, stakeholders may stress different interpretations of both trust and justice [3▪▪]. Despite an understanding of the value of the patients’ view, significant difficulties remain in listening to and involving patients and families in organizational responses to safety incidents [24]. Although emotionally challenging, the involvement of family members in the investigation of severe adverse events has been advocated [25▪]. The next of kin's reported rationale for being involved is to prevent the recurrence of similar events and to ensure that the system improves. It has therefore been recommended that regulatory bodies acknowledge the value of input from next of kin [25▪].

The focus on communication between personnel is another important area both for improving care and for reducing risks, and also as a way of learning from adverse events. Interestingly, this points to the hallmark of a true safety culture, in which professionals have no fear of reporting safety issues or adverse events, because they feel respected and do not fear retribution or disciplinary action [10▪]. Nevertheless, neither colleagues nor the organization will tolerate bad behaviour or recklessness.

EMERGING PROBLEMS: HARM FROM GUIDELINES AND RECOMMENDATION

When discussing medical harm, we also wish to focus on emerging medical adverse events related to blind adherence to advocated treatment guidelines and recommendations. As a rather gruesome example, we will use the opioid crisis that has developed over the last decades. In this context, the concept of medical error shifts away from the individual physician to the system that develops, endorses and promotes guidelines.

Pain relief is an important part of physicians’ everyday practice, both outside and inside our healthcare facilities, and is an integral part of anaesthesiology. Traditionally, pain has been divided according to a simple classification: acute pain, acute pain leading to chronic pain, and pain caused by a malignant process such as cancer.

Physicians have used different modalities of pain relief, and hence different drugs, for each of these types of pain. Broadly, we use two different classes of analgesics, distinguished largely by their mechanism of action. One group is the opioid receptor agonists, which may be subdivided into naturally derived and semi-synthetic opioids (morphine, codeine, heroin and so on) and fully synthetic opioids. Development of a broad range of synthetic opioids for clinical use has been ongoing for decades, starting with synthetic opioids used during anaesthesia, such as fentanyl, introduced into anaesthesia in the 1960s [26]. This development has continued, and today we have a wide range of synthetic opioids for injection or for transdermal, sublingual and oral application.

Oxycodone is a semi-synthetic opioid first developed in 1917, but it became popular as an analgesic under the brand name OxyContin, launched by Purdue Pharma in 1995 [27]. OxyContin was hailed as a medical breakthrough: a long-lasting narcotic with reduced sedative properties that could effectively relieve moderate to severe pain. The drug became a blockbuster, rapidly attracting interest and increasingly being used to treat chronic nonmalignant pain. The long-standing practice of restraint in using opioids for nonmalignant chronic pain was abandoned, initially with approval from the Food and Drug Administration (FDA), which believed this oral slow-release formulation had less potential for misuse and addiction [28].

The drug increasingly found use in a variety of nonmalignant pain disorders, and the manufacturer successfully conducted an aggressive campaign to increase the use of OxyContin. However, it soon became apparent that the drug also had a large potential for addiction, with subsequent crossover to illegal drugs such as heroin if the patient did not get further prescriptions [29]. This started the so-called ‘new opioid epidemic’, not only in the USA but in many other countries as well [30▪▪]. The bottom line is that there has been a steady increase in deaths from overdose, to a staggering total of more than 600 000 deaths in the USA alone since 1999 [31].

In our opinion, this also deserves to be included in the concept of ‘medical error’. Here, the ‘error’ lies not primarily with the individual physicians following guidelines and recommendations accepted by the medical industry as well as the medical society, but with those providing and advocating these guidelines to the medical community.

There are also other examples of severe adverse events and harm from guidelines. In the diagnosis of prostate cancer, the prostate-specific antigen (PSA) test is often used for screening. When the PSA level is elevated, the norm has been to perform a biopsy of the prostate, usually as a transrectal procedure. This is not without complications: some patients develop sepsis after the procedure, with an estimated 90-day mortality rate of 1%, depending on age and comorbidity [32]. This practice has been challenged, particularly in men above 70 years of age, for whom the benefit of screening for prostate cancer is uncertain [33]. Hence, refraining from further diagnostics will decrease the risk of iatrogenic complications from a prostate biopsy.

In both examples above, the concept of ‘less is more’ is applicable and will ultimately lead to better care and a lower incidence of adverse events. We claim that the whole concept of the Choosing Wisely campaign is an important part of the struggle to reduce medical harm, with the focus on better guidelines and recommendations rather than on individual physicians [34▪].

CONCLUSION

Moving from a system in which the individuals involved in adverse events are seen as the cause of the problem, to a system in which such events are seen as valuable input in the striving for improvement and safety, is demanding. Understanding local cultures, especially values, is a basic requirement for making progress. However, changing culture is hard work, requires leadership and takes time. Wisely involving front-line staff, middle and top managers, as well as patients and relatives, is the way forward.

Acknowledgements

None.

Financial support and sponsorship

None.

Conflicts of interest

There are no conflicts of interest.

REFERENCES AND RECOMMENDED READING

Papers of particular interest, published within the annual period of review, have been highlighted as:

▪ of special interest

▪▪ of outstanding interest

REFERENCES

1. Brattebø G, Bergström J, Neuhaus C. What's in a name? On the nuance of language in patient safety. Br J Anaesth 2019; 123:534–536.
2▪. Murray JS, Clifford J, Larson S, et al. Implementing just culture to improve patient safety. Mil Med 2022; usac115.
3▪▪. Cribb A, O’Hara JK, Waring J. Improving responses to safety incidents: we need to talk about safety. BMJ Qual Saf 2022; 31:327–330.
4. Braithwaite J, Clay-Williams R, Nugus P, Plumb J. Health care as a complex adaptive system. In: Hollnagel E, Braithwaite J, Wears RL, editors. Resilient healthcare. Boca Raton, FL: CRC Press; 2013. pp. 57–76.
5. Suclupe S, Kitchin J, Sivalingam R, McCulloch P. Evaluating patient identification practices during intrahospital transfers: a human factors approach. J Patient Saf 2022 [Epub ahead of print].
6. Østergaard D, Madsen MD, Ersbøll AK, et al. Patient safety culture and associated factors in secondary healthcare of the Capital Region of Denmark: influence of specialty, healthcare profession and gender. BMJ Open Qual 2022; 11:e001908.
7. Reason J. Managing the risks of organisational accidents. Aldershot: Ashgate Publishing; 1997.
8. Aljabari S, Kadhim Z. Common barriers to reporting medical errors. Sci World J 2021; 2021:6494889.
9. Lee SK, Rowe BH, Flood CM, Mahl SK. Canada's system of liability coverage in the event of medical harm: is it time for no-fault reform? Healthc Policy 2021; 17:30–41.
10▪. Preckel B, Staender S, Arnal D, et al. Ten years of the Helsinki Declaration on patient safety in anaesthesiology: an expert opinion on peri-operative safety aspects. Eur J Anaesthesiol 2020; 37:521–610.
11. Eng DM, Schweikart SJ. Why accountability sharing in healthcare organizational cultures means patients are probably safer. AMA J Ethics 2020; 22:E779–E783.
12. Schein EH. Organisational culture and leadership. Hoboken, NJ: John Wiley & Sons; 2010.
13. Flin R, Burns C, Mearns K, Yule S, et al. Measuring safety climate in healthcare. Qual Saf Healthcare 2006; 15:109–115.
14. Waterson P, Carman EM, Manser T, Hammer A. Hospital survey on patient safety culture (HSPSC): a systematic review of the psychometric properties of 62 international studies. BMJ Open 2019; 9:e026896.
15. Veazie S, Peterson K, Bourne D, et al. Implementing High-Reliability Organization principles into practice: a rapid evidence review. J Patient Saf 2022; 18:e320–e328.
16. Rotteau L, Goldman J, Shojania KG, et al. Striving for high reliability in healthcare: a qualitative study of the implementation of a hospital safety programme. BMJ Qual Saf 2022; 0:1–11.
17. Hollnagel E. The ETTO principle: efficiency-thoroughness trade-off: why things that go right sometimes go wrong. Farnham, UK: Ashgate; 2009.
18. Fencl JL, Willoughby C, Jackson K. Just culture: the foundation of staff safety in the perioperative environment. AORN J 2021; 113:329–336.
19. Banham-Hall E, Stevens S. Hindsight bias critically impacts on clinicians’ assessment of care quality in retrospective case note review. Clin Med (Lond) 2019; 19:16–21.
20. Patil N, Manwani R, Vyas V, et al. Resilience of healthcare professionals involved in anesthesia practice: a cross-sectional questionnaire based pilot study. J Anaesthesiol Clin Pharmacol 2022; 38:191–195.
21▪▪. Verhagen MJ, de Vos MS, Sujan M, Hamming JF. The problem with making Safety-II work in healthcare. BMJ Qual Saf 2022; 31:402–408.
22. Chuang YT, Ginsburg L, Berta WB. Learning from preventable adverse events in healthcare organizations: development of a multilevel model of learning and propositions. Healthcare Manage Rev 2007; 32:330–340.
23. Flaatten H, Wallevik M, Harthug S, Ebbing M. Better patient safety with lateral error reporting? (Bedre pasientsikkerhet med avviksmeldinger på tvers?). Dagens Medisin 2021. https://www.dagensmedisin.no/artikler/2021/03/04/avviksmeldinger-pa-tvers-kan-styrke-pasientsikkerheten/ [Accessed 21 December 2022].
24. Kok J, Leistikow I, Bal R. Patient and family engagement in incident investigations: exploring hospital manager and incident investigators’ experiences and challenges. J Health Serv Res Policy 2018; 23:252–261.
25▪. Wiig S, Haraldseid-Driftland C, Tvete Zachrisen R, et al. Next of kin involvement in regulatory investigations of adverse events that caused patient death: a process evaluation (Part I – The next of kin's perspective). J Patient Saf 2021; 17:e1713–e1718.
26. Stanley TH. The history and development of the fentanyl series. J Pain Symptom Manage 1992; 7(3 Suppl):S3–S7.
27. Chow R. Purdue Pharma and OxyContin: a commercial success but public health disaster. Harvard Public Health Review 2019; 25: https://www.jstor.org/stable/45345199. [Accessed 27 October 2022].
28. Kolodny A. How FDA failures contributed to the opioid crisis. AMA J Ethics 2020; 22:E743–750.
29. Van Zee A. The promotion and marketing of OxyContin: commercial triumph, public health tragedy. Am J Public Health 2009; 99:221–227.
30▪▪. The Lancet. Managing the opioid crisis in North America and beyond. Lancet 2022; 399:495.
31. Hedegaard H, Miniño AM, Spencer MR, et al. Drug overdose deaths in the United States, 1999-2020. NCHS Data Brief, no 428. Hyattsville, MD: National Center for Health Statistics; 2021.
32. Lundstrom K-J, Drevin L, Carlsson S, et al. Nationwide population based study of infections after transrectal ultrasound guided prostate biopsy. J Urol 2014; 192:1116–1122.
33. Hoffman RM. Screening for prostate cancer. UpToDate. https://www.uptodate.com/contents/screening-for-prostate-cancer [Accessed 27 October 2022].
34▪. Baron RJ, Lynch TJ, Rand K. Lessons from the choosing wisely campaign's 10 years of addressing overuse in healthcare. JAMA Health Forum 2022; 3:e221629.
Keywords:

adverse event; error reporting; just culture; medical error; patient safety; safety culture

Copyright © 2023 The Author(s). Published by Wolters Kluwer Health, Inc.