High-Reliability Healthcare

Building Safer Systems Through Just Culture and Technology

Adelman, Jason, MD

Journal of Healthcare Management: May-June 2019 - Volume 64 - Issue 3 - p 137–141
doi: 10.1097/JHM-D-19-00069
MANAGING RISK

Chief Patient Safety Officer and Associate Chief Quality Officer; Executive Director, Patient Safety Research Program; and Codirector, Patient Safety Research Fellowship in Hospital Medicine, NewYork–Presbyterian Hospital/Columbia University Irving Medical Center, New York, New York

For more information about the concepts in this column, contact Dr. Adelman at adelman.jason@columbia.edu.

The author declares no conflicts of interest.

High-reliability organizations (HROs) operate in complex, high-risk environments with low incidences of serious accidents or catastrophic failures. The airline industry is a standard-bearer for HROs, with relatively few incidents occurring among the millions of flights each year. In 2018, there was only one fatal incident for every 3 million flights—a 94% decrease over the previous 45 years (Aviation Safety Network, 2019). In healthcare, if we only performed appendectomies on healthy people, we would likely achieve a similarly low risk of harm. The reality is that healthcare faces countless complex workflows that range from a psychiatrist treating a teenager with suicidal ideation to a pharmacist preparing a chemotherapeutic agent to an orthopedic surgeon performing hip replacement surgery.

Despite a heightened interest in patient safety over the past two decades, the frequency of preventable harms in healthcare remains high, with implications for mortality, morbidity, costs, and quality of life. Approximately 1 in 10 hospitalized patients experiences an adverse event, such as a hospital-acquired infection, pressure ulcer, preventable adverse drug event, or fall (Agency for Healthcare Research and Quality, 2014), and 1 in 2 surgeries is associated with a medication error or adverse drug event (Nanji, Patel, Shaikh, Seger, & Bates, 2016). In my research, I have found that 1 in 37 hospitalized patients had an order placed for them that was intended for another patient (Adelman et al., 2013).

ACHIEVING HIGH-RELIABILITY HEALTHCARE

A major contributor to the success of the airline industry in achieving high reliability is the use of technology. Today, if a pilot falls asleep during a commercial flight, the plane can fly itself—a major improvement over the early days of aviation. Healthcare, on the other hand, remains inherently complex, rapidly changing, and heavily reliant on people rather than technology to protect patients from harm. Applying advances in health information technology (IT) and other emerging technologies is essential to achieving high reliability in healthcare. In particular, building protections into the electronic health record (EHR) system can significantly reduce the occurrence of many adverse events (Adelman et al., 2013).

How do we reach our potential? HROs demonstrate five characteristic traits: preoccupation with failure, reluctance to simplify, sensitivity to operations, deference to expertise, and commitment to resilience (Weick & Sutcliffe, 2007). Although high reliability may seem aspirational rather than practical in the healthcare environment, we can adopt basic principles drawn from the experience of HROs. As a chief patient safety officer and health IT safety researcher, I have found that high reliability in healthcare requires a two-pronged approach:

  • Vigilance. Creating a just culture—an environment where staff feel it is safe to report errors and the focus is on correction of systems issues
  • Resilience. Leveraging technology to build systems that are resilient to human error

How quickly we get to high-reliability healthcare will depend on the extent to which we embrace and pursue these two paths.

MAINTAINING VIGILANCE IN A JUST CULTURE

The first major shift to high-reliability healthcare is the transition to a just culture. As Lucian Leape, a patient safety expert and member of the Institute of Medicine’s Committee on Quality of Health Care in America, writes, “We need to move from looking at errors as individual failures to realizing they are caused by system failures” (Leape, 2009, p. 4).

In preoccupation with failure, a characteristic trait of HROs, staff are vigilant for potentially hazardous conditions and mechanisms are in place to report them without fear of reprisal. In the nonpunitive culture of aviation, safety issues are resolved by means of system corrections rather than punishment. A just culture encourages the reporting of human errors and near misses as opportunities to learn, exposes systems issues, and drives improvement, creating a virtuous cycle. The only way to improve systems is by knowing what to fix, but in healthcare, most errors and near misses go unreported for fear of punishment. According to the Agency for Healthcare Research and Quality’s 2018 Hospital Survey on Patient Safety Culture, nonpunitive response to error is consistently the lowest-performing domain, with 53% of hospital employees indicating their perception that human errors are held against them (Famolaro et al., 2018).

Punishing healthcare providers for making human errors diverts attention from the underlying systems issues and delays systematic improvements that would prevent future errors. Frontline healthcare staff are in a good position to uncover previously unknown safety issues in their daily workflows, and they are more likely to do so in a supportive environment that embraces just culture. In a personal communication (January 25, 2017), Leape shared with me the two myths of punitive culture: the perfection myth (if people try hard enough, they will not make any errors) and the punishment myth (if we punish people when they make errors, they will make fewer of them). Healthcare leaders must work to dispel these harmful myths.

Recognizing that a punitive culture does not make systems safer, progressive healthcare organizations are now transitioning to a just culture based on the following core set of principles:

  • Acknowledgment that all humans make errors
  • Recognition that punitive action fails to correct systems issues and discourages error reporting
  • Encouragement of self-reporting of errors, including near misses
  • Exposure of underlying systems issues that contribute to, or fail to prevent, human errors
  • Creation of systems with safety mechanisms that prevent human errors
  • Support for healthcare personnel in the aftermath of a human error
  • Individual accountability for errors made as a result of reckless behavior

A just culture creates a safe environment where health systems can identify potential hazards and strengthen systems to eliminate risk.

LEVERAGING TECHNOLOGY TO BUILD RESILIENT SYSTEMS

Adoption of a just culture changes the focus of health systems from punishing providers to fixing systems, which provides the opportunity to achieve the second major shift: leveraging technology to build systems that are increasingly resistant to human error. Rapid technological advances—EHRs, artificial intelligence (AI), machine learning, robotics, telemedicine, smart devices, simulation, genomics, proteomics, precision medicine, and 3-D printing, among others—are transforming healthcare. The more these technologies advance and are integrated into standardized processes, the faster healthcare will become a high-reliability system.

Technology can augment human processes in healthcare to reduce variability and help prevent error. It can also standardize cognitive, visual, manual, and coordinating (workflow) processes, increasing their accuracy and efficiency:

  • Cognitive processes. Harnessing the power of vast repositories of healthcare data, AI and machine learning can reduce both diagnostic and therapeutic errors. Natural language processing of EHR and other sources of data is being used to identify adverse drug reactions. My research, for example, focuses on using log data from the millions of orders placed in large health systems each year to quantify and reduce electronic order errors using novel health IT safety interventions built into EHR systems (see the sketch following this list).
  • Visual processes. AI and machine learning are transforming the fields of pathology and radiology. Despite some technical obstacles, the large volume of available data is enabling computer vision and AI to “read” pathology slides and imaging studies.
  • Manual processes. Robotics has revolutionized the precision of surgery and other procedures, particularly in the high-risk settings of heart and brain surgery.
  • Coordinating processes. Beyond diagnosis and treatment, technology is being applied in healthcare settings to optimize the movement of patients and supplies through the system safely and reliably.
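
To make the log-based approach concrete, the following Python sketch illustrates, in simplified form, how potential wrong-patient orders might be flagged from EHR order-log data using retract-and-reorder style logic (an order placed, quickly retracted, and then reordered for a different patient by the same clinician), in the spirit of the measure described in Adelman et al. (2013). The record fields, action vocabulary, and time windows are illustrative assumptions, not an actual EHR schema or the study's exact algorithm.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Tuple

# Hypothetical order-log record; real EHR log schemas will differ.
@dataclass
class OrderEvent:
    clinician_id: str
    patient_id: str
    order_name: str
    action: str           # "placed" or "retracted" (assumed vocabulary)
    timestamp: datetime

# Assumed time windows for retract-and-reorder style detection.
RETRACT_WINDOW = timedelta(minutes=10)   # placed -> retracted
REORDER_WINDOW = timedelta(minutes=10)   # retracted -> reordered on another patient

def find_potential_wrong_patient_orders(
    events: List[OrderEvent],
) -> List[Tuple[OrderEvent, OrderEvent, OrderEvent]]:
    """Flag sequences in which a clinician places an order, retracts it shortly
    afterward, and then places the same order for a different patient."""
    events = sorted(events, key=lambda e: e.timestamp)
    flagged = []
    for i, placed in enumerate(events):
        if placed.action != "placed":
            continue
        for retracted in events[i + 1:]:
            if (retracted.action == "retracted"
                    and retracted.clinician_id == placed.clinician_id
                    and retracted.patient_id == placed.patient_id
                    and retracted.order_name == placed.order_name
                    and retracted.timestamp - placed.timestamp <= RETRACT_WINDOW):
                # Look for the same order re-placed for a different patient
                # shortly after the retraction.
                for reordered in events:
                    gap = reordered.timestamp - retracted.timestamp
                    if (reordered.action == "placed"
                            and reordered.clinician_id == placed.clinician_id
                            and reordered.patient_id != placed.patient_id
                            and reordered.order_name == placed.order_name
                            and timedelta(0) <= gap <= REORDER_WINDOW):
                        flagged.append((placed, retracted, reordered))
                break  # stop after the first matching retraction for this order
    return flagged

# Example with synthetic data: one near miss is flagged.
now = datetime(2019, 5, 1, 9, 0)
log = [
    OrderEvent("dr_a", "patient_1", "vancomycin 1 g IV", "placed", now),
    OrderEvent("dr_a", "patient_1", "vancomycin 1 g IV", "retracted", now + timedelta(minutes=2)),
    OrderEvent("dr_a", "patient_2", "vancomycin 1 g IV", "placed", now + timedelta(minutes=3)),
]
print(len(find_potential_wrong_patient_orders(log)))  # prints 1

In practice, flagged sequences of this kind would be aggregated into rates (for example, events per 100,000 orders) and tracked before and after an EHR safety intervention, such as a patient-verification prompt.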

LEADING A CULTURE OF SAFETY

HROs apply other critical tools of safety culture, including checklists, teamwork, simulation, communication, and—importantly—leadership. Facilitating changes in approach and priorities involves leadership at all levels of the organization. In particular, effective leaders do the following:

  • Commit to creating and maintaining a culture of safety
  • Consistently make safety a top priority in their decision-making
  • Actively create an environment where all staff feel safe reporting their concerns
  • Set the tone for teamwork, collaboration, and respect
  • Recognize that most adverse events involve a failure of systems and processes
  • Respond by providing resources and support to improve systems
  • Model active leadership skills
    • Share information
    • Invite team members to contribute their expertise and concerns
    • Make themselves approachable

CONCLUSION

How do we achieve a high-reliability healthcare system? First, we must create a just culture where the emphasis is on fixing systems and providing a safe environment for workers to report human errors and near misses. Second, we must leverage technology to build systems that are resilient to human error. The extent and pace at which we incorporate just culture principles and technological advances into healthcare will determine whether and how soon we will achieve high reliability in healthcare.

NOTE

Dr. Adelman served on the Culture of Safety Roundtable, convened by the American College of Healthcare Executives (ACHE) and the Institute for Healthcare Improvement Lucian Leape Institute. The roundtable developed Leading a Culture of Safety: A Blueprint for Success, which is available via the ACHE website at http://safety.ache.org/blueprint/.

REFERENCES

Adelman J. S., Kalkut G. E., Schechter C. B., Weiss J. M., Berger M. A., Reissman S. H. (2013). Understanding and preventing wrong-patient electronic orders: A randomized controlled trial. Journal of the American Medical Informatics Association, 20(2), 305–310.
Agency for Healthcare Research and Quality. (2014). Interim update on 2013 annual hospital-acquired condition rate and estimates of cost savings and deaths averted from 2010 to 2013. Retrieved from https://www.ahrq.gov/professionals/quality-patient-safety/pfp/interimhac2013-ap2.html
Aviation Safety Network. (2019, January 1). Aviation Safety Network releases 2018 airliner accident statistics (press release). Retrieved from https://news.aviation-safety.net/2019/01/01/aviation-safety-network-releases-2018-airliner-accident-statistics
Famolaro T., Yount N. D., Hare R., Thornton S., Meadows K., Fan L., Sorra J. (2018). Hospital survey on patient safety culture: 2018 user database report. Rockville, MD: Agency for Healthcare Research and Quality.
Leape L. L. (2009). Errors in medicine. Clinica Chimica Acta, 404(1), 2–5.
Nanji K. C., Patel A., Shaikh S., Seger D. L., Bates D. W. (2016). Evaluation of perioperative medication errors and adverse drug events. Anesthesiology, 124(1), 25–34.
Weick K. E., Sutcliffe K. M. (2007). Managing the unexpected: Sustained performance in a complex world (2nd ed.). San Francisco, CA: Jossey-Bass.
© 2019 Foundation of the American College of Healthcare Executives