In the 1967 film The Graduate, an older man offers the young character played by Dustin Hoffman one word of advice: plastics. If we were to remake that film today, the word could be algorithms. Like plastics, algorithms are an increasingly ubiquitous aspect of the human experience. On the bright side, plastics and algorithms are rational choices for many applications. On the dark side, both outlive their intended purposes, often to detrimental effect as they persist unchanged over time.
At their best, algorithms support workflows and decision-making processes to optimize healthcare cost, quality, and access. At their worst, they institutionalize undesired activities and decision-making rules. With respect to the latter result, the primary culprit seems to be the electronic health record, which without a doubt has locked in inefficient workflows and given rise to “pajama time,” when providers complete their documentation after hours.
Using algorithms in decision-making can lead to other downsides. When artificial intelligence and machine learning analyses are used to create algorithms, the potential for systemic bias to be introduced and codified is dramatic. When an algorithm is “trained” on data from particular samples of the population, any bias in the sampling frame will ultimately become a feature of the algorithm. In healthcare, the long history of clinical trials has largely focused on adult white males; as a result, much of our clinical guidance is most effective for that population. A classic example is the protocol for diagnosing a heart attack. Shortness of breath, crushing chest pain, and pain radiating down the left arm are common symptoms of a heart attack in men. The problem with this diagnostic algorithm is that it ignores the symptoms women often experience when they are having a heart attack, potentially delaying the onset of care. Such built-in biases occur across many of the tools currently used to prioritize and treat patients, often along racial and ethnic lines.
So, what is to be done? Outlawing algorithms is impractical and unwarranted, as the good they do far outweighs the harm. Nevertheless, we should strike expressions like “I can’t change my mind because the system won’t let me.” For now, anyway, the machines work for us, and we should give our human employees the authority and flexibility to alter the machines’ decisions. In particular, we should give frontline caregivers the tools to request sensible changes to the algorithms and workflows they are responsible for carrying out.
The content in this issue of the Journal of Healthcare Management touches in various ways on workflows as algorithms and on the occasional breakdowns in their design and application.
In our opening interview, Nicole B. Thomas, FACHE, reveals what drives her personal commitment to addressing the needs of underrepresented populations, starting with the development of new leaders who reflect the diversity of the populations they serve.
Joseph R. Betancourt, MD, opens our new yearlong series of columns on The Future Leader with his thoughts on how different patient groups navigate the health system and experience different outcomes—often along racial and socioeconomic lines. Moreover, those differences in outcomes tend to be worse for patients in at-risk populations. Dr. Betancourt highlights four activities that leaders can undertake to remedy or reduce the impact of such biases. The fourth activity directly calls for interventions to target the biases built into organizational workflows and systems.
Our new yearlong series of columns on Financial Challenges begins with Susan DeVore’s discussion of ways hospitals are reorganizing care delivery to optimize performance. She notes that many interventions confine themselves to the existing boundaries of the workflow. In such instances, there may well be gains, but they will be illusory if the new workflow’s benefits are not measured against past performance. Instead, interventions should extend beyond the boundaries of the workflow at hand and integrate the upstream and downstream practices that feed its inputs and receive its outputs. In doing so, the healthcare system will become inherently more integrated and achieve better financial efficiency and care outcomes.
In the first research article, Justin M. Bachmann, MD, FACC, FAHA; David R. Posch; Gerald B. Hickson, MD; C. Wright Pinson, MD; Sunil Kripalani, MD, SFHM; Robert S. Dittus, MD; and William W. Stead, MD, FACMI, look at integrating patient-reported outcomes into the care process. The biggest takeaway from their case study is that the modification of an existing system requires an enterprise-level commitment, given the complexity and challenges involved. As the federal government makes patient-reported outcomes a larger component of its reimbursement scheme, that buy-in from the top should be coming along shortly. With that said, the potential for introducing bias into the system is considerable if only those who choose to report serve as the basis for care decision-making.
Another truism in workflow and algorithm design is that size matters. There is very little need to routinize an activity that rarely occurs. Realizing that repetition is essential to create the economies needed to justify workflow and algorithm development, healthcare organizations are both consolidating and attempting to draw patients from further afield. The research by John W. Huppertz, PhD; David J. Leung, MD; Samuel F. Hohmann, PhD; Mandeep S. Sidhu, MD; and Dennis P. McKenna, MD, explores the use of advertising to attract patients from other communities. To the extent that consumers have choice, the authors suggest that advertising can influence the decision-making process.
Up to this point in this editorial, the reader might infer an underlying assumption that there is one best way to design a workflow or decision-making algorithm. No such assumption was intended. In fact, the article by Amy Mills; Asta Sorensen; Emily Gillen, PhD; Nicole M. Coomer, PhD; Elysha Theis; Stephanie Scope; Christopher Beadles, MD, PhD; and Jihan Quraishi, RN, AE-C, CCRC, dispels that notion nicely. Different leaders will organize work differently and enlist different professionals to carry out the tasks. According to Mills et al., staffing models for anesthesia services vary with environmental and organizational characteristics.
Our fourth article deals with how consumers interact with the workflows that health systems design. Taylor J. Horyna, PharmD, BCPS; Rosalinda Jimenez, EdD, APRN, FNP-BC, PMHNP-BC; Linda McMurry, DNP, RN, NEA-BC; Dolores Buscemi, MD, FACP; Barbara Cherry, DNSc, RN, NEA-BC; and Charles F. Seifert, PharmD, FCCP, BCPS, describe the use of patient navigators to help consumers move from one provider’s workflow to the next. They find that better navigation is likely to lead to better results in both financial and care outcomes.
Healthcare delivery is a complex endeavor, and it is not surprising that professional and lay people alike should struggle to intuit how it all fits together—particularly because many parts do not fit together seamlessly. Layering on experts to help manage workflows (e.g., physician scribes) or understand care path algorithms (e.g., patient navigators) should serve as a red flag that our systems need constant human intervention. Otherwise, as the cliché goes, “Every system is perfectly designed to get the results it is currently getting.”