This editorial is a reaction to “A Randomized Trial of a Supplemental Alarm for Critically Low Systolic Blood Pressure” by Panjasawatwong et al.1 from the Cleveland Clinic, Department of Outcomes Research. The authors created a clinical decision support (CDS) system in their electronic health record (EHR) to alert providers when hypotension occurred. Providers in half of the cases were randomly assigned to receive the alerts, which were delayed by 3 to 5 minutes. The authors then examined the time taken for blood pressure to return to normal but found no difference between the group who were alerted and those who were not. Although the results of the trial were negative, we believe that the model the authors created to test the efficacy of a CDS system has a great deal of potential, and we applaud the editors for choosing to publish their work. There are important lessons to be learned from this study.
Before 2013, EHR companies were reluctant to provide decision support as part of their software, out of concern that such support would trigger regulatory requirements that would wash over onto the EHR itself, requiring the company to seek Food and Drug Administration approval of any new or updated software as a medical device. This tension is illustrated in the Institute of Medicine report Health IT and Patient Safety: Building Safer Systems for Better Care and in the dissenting opinion in Appendix E written by Dr. Richard Cook, the only anesthesiologist on that workgroup.2 In April 2014, much of this tension was resolved by publication of FDASIA Health IT Report: Proposed Strategy and Recommendations for a Risk-Based Framework by the Food and Drug Administration, which includes the specific statement:
Clinical decision support (CDS) provides health care providers and patients with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care. Because its risks are generally low compared to the potential benefits, FDA does not intend to focus its oversight on most clinical decision support. FDA, instead, intends to focus its oversight on a limited set of software functionalities that provide clinical decision support and pose higher risks to patients, such as computer aided detection/diagnostic software and radiation therapy treatment planning software.a
This statement, coming at a time of rapid development in mobile information technology, is fueling a boom in CDS software, including in the operating room.
According to the Office of the National Coordinator for Health Information Technology, CDS combines knowledge and data to generate and present helpful information to clinicians as care is being delivered.3 With the incorporation of EHRs and, specifically, anesthesia information management systems, there are abundant opportunities to affect care. Whether a given decision support action actually improves care is a different question altogether, and one that this study begins to explore.
COMBINING KNOWLEDGE AND DATA
The patient monitor provides the anesthesiologist with an abundance of data. Indeed, one of the more difficult skills for a resident to master is the ability to process those data and discern which elements need more attention than others. Alarms, pop-ups, flashing numbers, and other “alerts” assist in processing the data, and these are configurable for specific patients and cases. The temporal proximity of the alarm to the event cannot be beaten, and despite the occasional temptation to focus on the EHR as a guide for therapy, the clinician’s primary source of data is the monitor itself. The EHR does, however, hold knowledge. When maintained properly, the patient’s EHR holds a wealth of valuable information that guides the clinician: not only a list of medications, allergies, and diagnoses but also a record of previous experiences. An alert that identifies the patient as having a difficult airway can and should be used by the anesthesiologist. Another example might be a timely alert reminding the clinician to redose antibiotics or check the activated clotting time. Even unrelated to the patient monitor, then, the EHR can provide useful support. The future of CDS lies in combining the knowledge and the data. For example, if the monitor broadcasts a core body temperature of 35°C and the medical record contains a history of sickle cell disease, an alert can suggest “Temp dropping in patient with sickle cell disease.” When installing the EHR, the anesthesia group should review the available CDS triggers and choose to configure all, some, or none of them.
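The sickle cell example above can be made concrete with a minimal sketch of such a rule. Everything here is illustrative: the function name, the threshold, and the use of a free-text problem list are our assumptions, not features of any particular EHR.

```python
# Hypothetical CDS rule combining live monitor data (temperature) with
# EHR knowledge (the problem list). All names and thresholds are
# illustrative, not drawn from any vendor's implementation.

def check_temp_alert(core_temp_c, problem_list):
    """Return an alert string when a low temperature matters more than
    usual for this particular patient; otherwise return None."""
    HYPOTHERMIA_THRESHOLD_C = 35.0  # illustrative trigger point
    if core_temp_c <= HYPOTHERMIA_THRESHOLD_C and "sickle cell disease" in problem_list:
        return "Temp dropping in patient with sickle cell disease"
    return None

# The monitor supplies the data; the EHR supplies the knowledge.
alert = check_temp_alert(34.8, {"sickle cell disease", "asthma"})
# The same temperature in a patient without the diagnosis triggers nothing.
no_alert = check_temp_alert(34.8, {"hypertension"})
```

The point of the sketch is that neither input alone justifies the alert: the monitor sees only a number, and the EHR sees only a diagnosis. The combination is what makes the message worth interrupting the clinician for.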
PRESENT HELPFUL INFORMATION
An alert that is not clinically helpful can be more detrimental than not having an alert at all. It poses a distraction and reduces the likelihood that the anesthesiologist will pay attention to future alerts. For this reason, alerts should be created with the mind-set: “What is the clinician most likely to miss?” The clinician seeing the alert needs to think: “That’s useful. It’s telling me something I don’t already know.”
AS CARE IS BEING DELIVERED
CDS after the fact is too late to be effective. Even when clinically relevant, information that is not delivered in a timely manner can detract from care. An alert warning a driver not to change lanes because of a car in the adjacent lane is useless if it arrives after the lane change (and collision) has occurred. An effective alert delivers information in time either to prevent a problem or to steer the clinician toward a better decision. Unfortunately, there is a theoretical risk of clinicians relying on alerts and feeling falsely secure when no alerts pop up. Will the driver of the car bother to look in the mirror if she/he can rely on the lane-changing alert?
So how well did the researchers at the Cleveland Clinic do with their decision support system? They picked an easy target, hypotension, with a known relationship to adverse outcomes, based on their own published experience.4 Few anesthesiologists would argue with the need to support organ system perfusion (although some might argue with blood pressure as the metric) and all of us are trained from the first day of residency to react to clinical hypotension. In one respect, this was a good target to aim for because it is obviously important; from a different perspective, however, the support provided was not “telling them something they didn’t already know.”
Not surprisingly, therefore, the study produced negative results. Hypotension was treated just as quickly with or without a cue provided by the EHR. The obvious follow-up question: “Did faster treatment lead to better outcomes?” never even came into play. It would be easy for any reader to dismiss this article out of hand based on these results.
We urge a deeper look. Beyond the clinical findings, what this study really documents is a method to use the EHR to conduct low-cost, high-volume research in the most pragmatic possible setting, our own operating room. CDS is here to stay, just like the “oil pressure low” light on our dashboards, but documenting the benefits of any particular system will require further study. Panjasawatwong et al. have shown us an efficient approach to assess the first concern with CDS: “Is it helpful?” Using the EHR not just to provide the support but also to conduct the study is an important advance. The EHR randomized cases to control or study group in a nonacademic, large community hospital, without conscious attention by a researcher, and then gathered all the outcome data necessary for the study. This allowed the researchers to focus on the up-front programming and the post-study analysis, enabling a prospective randomized trial of >3000 patients, conducted in just 1 year, at a minimal cost in time and effort. This model is worth repeating, especially when assessing the value of the EHR itself.
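The design the authors used, with the EHR itself handling allocation and data capture, can be sketched in a few lines. This is our illustrative reconstruction of the general pattern, not the authors' code; the record fields and the fixed seed are assumptions made only so the sketch runs reproducibly.

```python
# Illustrative sketch of EHR-driven trial logistics: each case is
# randomized to an arm at enrollment and its outcome is logged
# automatically, with no researcher attention during the trial.
import random

def enroll_case(case_id, rng):
    """Randomize one case to the alert or control arm and create its record."""
    arm = "alert" if rng.random() < 0.5 else "control"
    return {"case": case_id, "arm": arm, "outcome_minutes": None}

rng = random.Random(2024)  # seeded only so this sketch is reproducible
trial = [enroll_case(i, rng) for i in range(3000)]

# Arms end up approximately balanced without any manual allocation,
# leaving the researchers only the up-front programming and the
# post-study analysis.
alert_arm = sum(1 for rec in trial if rec["arm"] == "alert")
```

In a real deployment the `outcome_minutes` field would be filled in by the EHR from the recorded vital signs, which is precisely what makes this study model inexpensive to repeat.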
We believe the authors missed an opportunity to survey practitioners about the perceived intrusiveness of the alerts, specifically whether the alerts were appreciated or considered a nuisance by the clinician. As noted earlier, the helpfulness of the CDS is an important consideration, one that must be measured in tandem with accuracy and timeliness. Perhaps, when the alert pops up on the EHR screen, there should be 2 options for clearing it: “thank you, that was helpful” versus “I knew that already, go away!” Analyzing these data over time (and over multiple kinds and styles of alerts) would help guide future development. This is a feature that should be included in future studies of this sort.
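The two-button dismissal proposed above amounts to a very small data-collection mechanism. A minimal sketch, with field names that are purely our invention, might look like this:

```python
# Hypothetical logging of the proposed two-option alert dismissal;
# no EHR currently exposes this exact feature, and all names here
# are assumptions for illustration.
from collections import Counter

feedback_log = []

def dismiss_alert(alert_type, helpful):
    """Record whether the clinician found the alert helpful when clearing it."""
    feedback_log.append({"alert": alert_type, "helpful": helpful})

dismiss_alert("hypotension", True)   # "thank you, that was helpful"
dismiss_alert("hypotension", False)  # "I knew that already, go away!"

# Tallied over time and over multiple kinds of alerts, these counts
# would show which alerts earn their interruptions.
tally = Counter(entry["helpful"] for entry in feedback_log)
```

The design choice worth noting is that the feedback costs the clinician nothing extra: the alert had to be dismissed anyway, so the single click doubles as a data point.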
In summary, although the authors studied a CDS intervention that, as their negative results show, was not demonstrated to affect care, they did demonstrate a significant benefit of implementing an EHR. We look forward to more studies of this kind.
Name: Richard P. Dutton, MD, MBA.
Contribution: This author contributed to writing and finalizing this editorial.
Attestation: Richard P. Dutton approves this manuscript.
Name: Ori Gottlieb, MD.
Contribution: This author contributed to writing and finalizing this editorial.
Attestation: Ori Gottlieb approves this manuscript.
This manuscript was handled by: Maxime Cannesson, MD, PhD.