Contemporary conversations about health care commonly reference a “learning health care system” (LHS). 1 The opportunity of an LHS is to learn from what we do and do what we learn. Given advances in health information technology and the ubiquity of data, it is assumed this paradigm can be used to improve patients’ health and health care experience while simultaneously guiding investments of constrained health care dollars into programs and operations that have a real evidence base. Numerous centers have described a cycle of learning from new program implementation in integrated care systems, or systems to improve quality and value through use of data. 2,3 One such center has advanced the adoption of rapid-cycle randomized testing of care approaches under a quality improvement paradigm. 4 At Vanderbilt University Medical Center (VUMC), we have pioneered a scalable LHS that fully embeds pragmatic effectiveness trials into routine processes of care, championed at the executive vice president level and led by a team of senior research and operational leaders. Here, we describe the program elements we have found necessary and how the LHS research support platform has organized them. Central to these efforts is the process roadmap (Figure 1), which begins with framing the question or problem that the LHS Platform is asked to address and culminates with system change based on newly available evidence. Our example can be adapted by other emerging learning health care systems, and we hope to stimulate other innovations in learning health care as we journey toward optimizing the quality, safety, efficiency, and value of health care.
Ensuring Stakeholder Engagement
A mature LHS should be typified by a culture in which stakeholders embrace and expect systematic evaluation of the content and processes of care. A willingness to change behavior to adopt a better approach must be built in. Achieving this culture requires comprehensive engagement with the broad community, patients and their families, the full spectrum of health care providers, health care administration, and system leadership (Table 1).
Patients and community
When a health system adopts a learning philosophy, both the community at large and patients should be involved to help direct and communicate about the scope and intent of an LHS. Our community has expressed a desire for transparency and wants to be a part of the dialogue. Many keenly recognize the importance of research, but comment “my doctor still knows best.” A community representative, who is both a patient and embedded in the local neighborhoods, serves on the LHS Platform Steering Committee to help navigate such expectations. We also use community engagement studios to obtain comprehensive input from patients and area residents on individual topics—each community engagement studio is a coordinated conversation with community members, led by skilled facilitators with expertise in community outreach. 5 As one example of benefit, our community representative has identified a general lack of awareness around research regulations allowing for waiver of informed consent in certain situations, which could breed distrust if not addressed. Thus, we are embarking on broad education to ensure transparency of LHS activities, emphasizing our community partnerships and ensuring that research findings are publicly shared.
Health care providers
The provider community can drive the success or failure of any learning health care initiative. For any health care system to truly learn, all health care providers must be engaged stakeholders. We encourage partnerships between frontline providers and those developing LHS projects. Whether the intervention is one that is intended to improve nursing workflow, allied health services, physician practice, or all of these, the approach to partnership is similar. Representatives from affected groups are invited to help design the project from the outset, facilitated by intentionally accommodating scheduling and geographic constraints of the providers and by introduction of theoretical constructs and research processes that respect the providers’ training and background.
We have found that provider engagement is not only essential to build alliances but also improves the research product. For example, we supported a project to evaluate the deployment of behavioral intervention teams in the inpatient setting to improve the management of disruptive patients. 6 Absent the resources to deploy the intervention to all clinical units, we proposed a unit-level cluster crossover trial. Nursing leadership expressed concern that staff satisfaction would be harmed if a highly desired intervention was first provided and then taken away. Consequently, we engaged unit nursing staff, who immediately led the culture change necessary to meet the challenge. This strategy of engaging frontline providers to champion both fidelity to a defined intervention and the importance of a good control group for comparison was met with success. Our studies now typically request a provider stakeholder to partner with the study team. Table 2 describes similar experiential learnings.
Clinical and operational leadership
A successful LHS must have the support from clinical and operational leaders both to provide resources and to support the paradigm shift within the organization. Engaging system leadership informed us how to better marry the needs of the health care system with the rigor to provide robust information to promote systems and practice change (Table 1). We are now moving toward intentionally leveraging the system’s quality and educational programs for broader dissemination and implementation efforts. In addition, we are developing new infrastructure to more completely address questions related to efficiency, which include detailed cost and process analyses.
We note that the expected actions in response to an LHS project’s findings should be explored with stakeholders during the design phase, and clinical and operational leadership support is essential. This becomes keenly important when the study does not support the proposed intervention. One of our projects evaluated an intervention widely believed to be effective, the routine bathing of critically ill patients with chlorhexidine to reduce hospital-acquired infections. 7 Our findings did not provide evidence in support of this program in a real-world health system environment, which was met with skepticism by some. Rather than acceptance, this led to interest in exploring methodological weaknesses and confounding variables as the explanation for the unexpected findings. Moreover, current Centers for Disease Control and Prevention guidelines for infection control present further structural barriers to de-implementation. 8 Ultimately, there was no potential for the system to act in light of our findings, a concern that was underappreciated at the outset. Therefore, when such a barrier to acting on the results is foreseeable, we do not recommend investing resources in the project.
Our net for identifying potential projects is broad. Opportunities to engage the LHS Platform have been presented at the School of Medicine Research Enterprise Forum, Executive Leadership Retreat, Hospital Board Meeting, and research town halls. We have surveyed the medical center for potential projects. Others come via word of mouth and referral from the front lines. We also work directly with health system leadership to identify their top priorities. As the partnership between research and operations has matured, when ideas for development and initiation of new clinical programs emerge, operational leaders are asking whether the value of the program could be evaluated by the LHS Platform. This is sometimes included as a contingency for institutional support. We note that we have not yet sought potential projects from patients. This remains an underdeveloped opportunity that could elevate topics of importance to our key community stakeholders, with the benefit of increasing engagement in the LHS.
The pipeline (Figure 1) begins with potential projects first being discussed at our weekly LHS workshop. This moderated forum, staffed by leading pragmatic trialists, biostatisticians, project managers, implementation scientists, and medical education and dissemination experts, explores all aspects of proposed projects. The idea originator prepares a brief project summary before the workshop to help guide conversation, ideally including a succinct description of the proposed intervention, measurements, and study design. Because many of those proposing a project have no research background, the LHS Platform team often helps draft the summary.
At the workshop, the team explores rigorous approaches to answering the project’s underlying questions. The project is evaluated for the willingness of the clinical environment to adopt a study into workflow and the potential practical and ethical challenges to conducting a pragmatic trial. If the project originators are not representative of the stakeholders in whom the intervention will be deployed, the project originators are asked to engage appropriate personnel and invite them to join ongoing study design discussions during the workshop.
Projects are fleshed out in workshops and designed in collaboration with the LHS biostatisticians. As the project matures, we are intentional about structuring the project team to include both research and topic domain expertise, also including trainees whenever possible. When the design is sufficiently detailed to inform a decision on whether it should be fully supported, it is presented to the LHS Platform Steering Committee. Approved projects are then provided with the LHS Platform’s wraparound services to help the project come to fruition (Figure 1). When projects do not fit, we link the project lead to other resources that might be available, such as the system’s quality office or research support office. 9
Essential Design Elements of Selected LHS Projects
We have identified design elements that are essential for any project to be included within the LHS Platform portfolio, around which much of our intake work is structured. Foremost, there must be the potential to use randomization to reduce bias. Then, it must be possible to adequately define the intervention, the outcome variables must be clear, and it must be possible to conduct the trial with a high degree of pragmatism.
Randomization to control bias
Early in the evolution of the LHS Platform, we elected to focus on integrating pragmatic comparative effectiveness trials into the process of health care delivery, with bias control achieved using randomization. However, randomization presents a barrier to pragmatism. Design considerations must include how to integrate structured variation into care pathways and how to do so in a way that preserves a real difference in the intervention between groups. If randomization can occur at the level of provider, unit, or clinic, then study options include cluster designs, stepped wedge designs, and other contemporary approaches. These designs tend to be more pragmatic, and they avoid the most common flaws found in quasi-experimental research.
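To make the idea of unit-level structured variation concrete, the following Python sketch generates a randomized 2-period cluster crossover schedule, in which half of the units receive the intervention first and then cross to control, and vice versa. The function name and unit labels are hypothetical illustrations, not the platform's actual tooling.

```python
import random

def cluster_crossover_schedule(units, seed=42):
    """Assign each unit a randomized order of conditions for a
    2-period cluster crossover design: half the units receive the
    intervention in period 1 and control in period 2; the other
    half receive the reverse order."""
    rng = random.Random(seed)          # fixed seed for a reproducible allocation
    shuffled = units[:]                # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    schedule = {}
    for i, unit in enumerate(shuffled):
        # First half of the shuffled list: intervention-first; rest: control-first.
        order = ["intervention", "control"] if i < half else ["control", "intervention"]
        schedule[unit] = order
    return schedule

# Illustrative unit names only.
units = ["ICU", "MedSurg-A", "MedSurg-B", "Stepdown"]
print(cluster_crossover_schedule(units))
```

Because every unit experiences both conditions, each unit serves as its own control, which is one reason such designs can be both pragmatic and statistically efficient.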
Measuring the right variables
Central to any robust activity designed to systematically improve operations or practice are data. Quantifying and understanding both the inputs (i.e., the intervention) and outputs (e.g., health state) are necessary to draw valid conclusions. Without defining the input and output variables, the study is likely to collect data that are uninformative for the question at hand. While the research is designed with the primary question in mind, secondary questions related to resource use are generally included in the analysis plan since our motivating goal is to improve both the quality and efficiency of care. Nonetheless, for an LHS project, the 2 core variables are defined by answering the following questions:
- Is the intervention or activity (the treatment variable) operationalized and described in such a way that it can be replicated? This can be particularly challenging for complex interventions, where standardized reporting approaches are helpful. 10,11
- Are the outcome measures of interest sufficiently reliable, valid, and mechanistically linked to the intervention to draw inferences about cause and effect?
The traditional approach researchers take to study design is to maximize internal validity through control of every aspect of the experiment. While useful to demonstrate efficacy, this does not lend itself to an LHS where the priority is external validity, i.e., to generate evidence about effectiveness in real-world practice. 12 There are numerous excellent resources describing pragmatic studies. In particular, the PRECIS-2 framework evaluates how pragmatic a given study is. 13 For the LHS Platform, we focus on 4 primary aspects of the study design:
- Is it feasible to randomize at the patient, provider, unit, or system level?
- Will the approach require detailed, individual-level informed consent?
- Are relevant, reliable, and valid data for identifying patients of interest and for evaluating outcomes readily available in existing, preferably electronic data systems?
- Is patient volume sufficient for a study to have the power to draw meaningful conclusions within a reasonable time frame, typically a year or less?
We have underwritten the LHS Platform with a number of key resources. These are highly scalable and found at most academic medical centers and many health systems. They include project management, data science capability, regulatory support, and research expertise. Expertise in health care operations or the clinical question of interest is generally an asset provided by the project originator, who may or may not be versed in the conduct of research.
Program management and project coordination
Learning health care projects require collaboration among multiple stakeholders. The effort to keep projects moving is substantial. Whether it is helping a nonresearch partner develop comfort with research constraints or negotiating for priority access to data, skilled project managers are crucial to overcome navigational challenges. Much like traditional clinical research professionals, the project managers are responsible for supporting individual projects throughout their life span. The project management team is led by the LHS program director, who is responsible for maintaining policies and procedures, providing global oversight to the portfolio of projects, and serving as a direct liaison to LHS researchers and executive leadership. The program director and project managers work closely with the platform’s faculty leads and are the essential implementation arm of the LHS Platform.
Data science

An almost unfathomable amount of data can be gleaned from today’s health care environment. Data science focuses on extracting knowledge from these data, broadly ranging from inferential statistics to computational processes and data management. We have purposefully embedded 2 key domains in the LHS Platform: biostatistics and clinical informatics.
Biostatistics is necessary for study design, analysis, and drawing correct conclusions from the data. A standard 2-group randomized controlled trial with a binary outcome might be analyzed using a chi-square test (or logistic regression if adjusting for covariates); yet cluster designs, crossover designs, and stepped wedge trials have a complex underlying data structure. Ignoring that structure can lead to false conclusions. 14 Accounting for within-cluster correlations and periodicity as well as exploring differential treatment effects require a more sophisticated approach. Methods for computing power for these trial designs often require comprehensive simulation. We partner with the Department of Biostatistics to provide faculty-level guidance for the design and analysis of LHS Platform projects and to drive the speed and efficiency with which questions can be asked and answered.
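Power calculations for these designs often require simulation, as noted above. As a minimal, hypothetical sketch (not the platform's actual methodology), the following Python code estimates power for a two-arm cluster randomized trial with a binary outcome: within-cluster correlation is induced with a beta-binomial model, and the analysis is a simple cluster-level z-test on mean cluster proportions. All parameter values are illustrative.

```python
import math
import random

def simulate_power(p_control=0.15, p_treat=0.08, n_clusters_per_arm=12,
                   cluster_size=100, icc=0.02, n_sims=500, seed=1):
    """Estimate power of a cluster randomized trial by simulation.

    Each cluster draws its own event probability from a beta
    distribution whose variance reflects the intracluster correlation
    (icc must be > 0), so patients within a cluster are correlated.
    Each simulated trial is analyzed at the cluster level with a
    z-test on the mean cluster proportions, a simple and conservative
    approach; power is the fraction of trials with |z| > 1.96.
    """
    rng = random.Random(seed)

    def beta_params(p):
        # Choose beta(a, b) with mean p and ICC = 1 / (a + b + 1).
        kappa = (1 - icc) / icc
        return p * kappa, (1 - p) * kappa

    def simulate_arm(p):
        a, b = beta_params(p)
        props = []
        for _ in range(n_clusters_per_arm):
            p_cluster = rng.betavariate(a, b)
            events = sum(rng.random() < p_cluster for _ in range(cluster_size))
            props.append(events / cluster_size)
        return props

    hits = 0
    for _ in range(n_sims):
        c, t = simulate_arm(p_control), simulate_arm(p_treat)
        mean_c, mean_t = sum(c) / len(c), sum(t) / len(t)
        var_c = sum((x - mean_c) ** 2 for x in c) / (len(c) - 1)
        var_t = sum((x - mean_t) ** 2 for x in t) / (len(t) - 1)
        se = math.sqrt(var_c / len(c) + var_t / len(t))
        if se > 0 and abs(mean_c - mean_t) / se > 1.96:
            hits += 1
    return hits / n_sims

print(f"estimated power: {simulate_power():.2f}")
```

Note how the design effect emerges naturally: raising the ICC or the cluster size inflates between-cluster variance and lowers the estimated power, which is exactly the structure that naive patient-level analyses ignore.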
For data collection and operationalizing clinical decision support as a strategy for implementing or deimplementing interventions, we rely on clinical informatics. The entire data and software infrastructure within health care information technology has potential use, including the financial and the medical record database architectures, as well as any affiliated systems such as imaging, laboratory, pharmacy, and operational (e.g., scheduling) systems. The systems are useful both as a source of data and to help deploy interventions and randomize patients through, for example, clinical decision support. Our experience is that identifying and selecting variables to extract from these systems, either for analysis or for clinical decision support, requires input from the clinician (or other data generator), the informatician (who can access the data and build decision support tools), and the biostatistician (who needs to understand measurement uncertainty to best design the project and analyze the data).
Regulatory support

In our system, LHS activities are expected to involve systematic investigation and interaction with living people or their private health information, with the purpose of developing generalizable knowledge. This meets the definition of human subjects research and so regulations to protect research participants apply. 15 Our projects are submitted for institutional review board (IRB) review and oversight, they are registered on ClinicalTrials.gov, and we comply with requirements for informed consent.
Most projects tackled by the LHS Platform are not evaluating investigational products or experimental interventions but are comparing process or content of usual care. As such, studies typically meet criteria for modification or waiver of consent, such as when the research is considered minimal risk, the research cannot practicably be conducted without a waiver or modification of the consent process, and the participants’ rights and welfare will not be harmed by the research. 15 We ensure the protocol language reflects both the true nature of the interventions and the appropriate protections for patients, including safety monitoring, data security, and consent considerations. Given the significant resources traditionally needed to obtain informed consent, 16 we have generally classified studies requiring full informed consent as less pragmatic and these are not prioritized for support unless we can identify ways to integrate the research consent into the usual process of care.
To facilitate appropriate ethical and regulatory consideration for approaches to consent, we have actively engaged members of our IRB as well as local experts in empirical bioethics. The existing regulations are variably interpreted in the context of pragmatic trials, especially those where an intervention is assigned based on which clinical unit or provider is involved, i.e., randomization is at the level of a cluster and not at the level of a patient. 17 Guidance from the U.S. Department of Health and Human Services and the clinical research community at large is needed to ensure consistent and clear decision making around regulatory requirements for pragmatic trials conducted in this manner. In the meantime, we have developed a template for gathering and presenting information to the IRB about factors affecting the practicability of consent such as introducing bias, influence of the consent process on fidelity of the intervention, and risk introduced by the consent process itself.
Research expertise

In the early stages of LHS development, research principles are unlikely to pervade the clinical environment in which the most relevant and pressing questions arise. The LHS Platform actively links health system staff with an experienced clinical investigator to make sure that the study is designed and implemented in a manner that answers the question of interest in a generalizable manner. If the research collaborator does not recognize the fundamental importance of pragmatism, the very nature of LHS culture will be undermined. As with stakeholder engagement, the researcher needs to ensure questions are answered accurately, but do so without exerting so much control on intervention and measurement that scalability, replication, and implementation, when warranted, become uncertain.
Structure and Governance
Meaningful collaboration between research, operations, and administration is likely the most important facet of a quality LHS. Thus, the LHS Platform boasts a matrixed partnership among these groups, sponsored at the most senior level by the senior vice president and chief innovation officer for the Vanderbilt Health Affiliated Network and the executive vice president for research. A strategic committee, consisting of the executive sponsors, the hospital chief executive officer, chief operating officer and chief nursing officer, the faculty leads, and the program director, meets quarterly and defines new initiatives, opportunities, and directions. A monthly steering committee with representatives from administration, nursing and the allied health professions, data science, clinical quality, patient stakeholders, and research reviews the progress of ongoing efforts, evaluates new projects, and advises the LHS Platform team, both generally and specifically when roadblocks arise. These meetings also provide experiential learning for recipients of implementation science and learning health systems career development awards.
Given our focus on generalizability, we expect that findings from the LHS Platform are broadly applicable to other settings and thus warrant publication, and we have published our trial designs and our findings in the highest-tier journals. 7,18,19 An LHS publication committee reviews manuscripts to ensure that the institutionally supported activities, regulatory positioning, and operational framing of the problem under study are represented appropriately and that International Committee of Medical Journal Editors guidelines for authorship are followed. 20
The LHS Platform infrastructure consists of personnel and activities enabled by institutional support and funding through our institutional Clinical and Translational Science Award. 9 Most of the effort, however, is subsumed within the usual work activities of researchers, clinicians, and administration. As such, the relative cost of conducting an embedded, pragmatic, randomized, controlled LHS trial is considerably less than that of a traditional, explanatory, randomized, controlled trial. Because trials are delivered within the context of routine care, many trial costs do not apply. For example, therapeutic interventions are administered as usual care and are thus not paid for by the study, and there is minimal expense for enrollment. Costs incurred include extracting and curating data from the electronic health record, and biostatistical needs can be high given the complex nature of the data. Investigator time and support from the platform must also be factored in.
While we have leveraged grant support for initiating the LHS Platform, this support is not generalizable to other institutions, and reliance on grant funding is not sustainable. We have developed a model for ongoing financial support in partnership with health system leadership. Consistent with the experience of other learning health systems, central to the funding model is the idea that through the LHS Platform, the health care system can increase not only the quality and effectiveness of care but also the efficiency and value of care delivery. 4 By sustaining the LHS Platform, the health care system is investing in itself.
We initially designed the LHS Platform to conduct pragmatic effectiveness trials. However, as projects come to conclusion, we must implement, refine, or deimplement the interventions under investigation. We initially felt this would be relatively easy since at least one arm of the intervention was already embedded in clinical care. However, this is proving challenging for both positive and negative results, so we are shifting to embrace dissemination and implementation in both practice and research.
As implementation comes to the fore, there is an opportunity for exploring interventions that sustain observed effects. For example, the SMART and SALT-ED trials, which showed that routine use of balanced crystalloids improves mortality and renal function when compared with saline, 18,19 used best practice alerts in the electronic health record to advise providers about fluid choice. These alerts have been more broadly deployed, and our pharmacy has switched its preferred fluid to balanced crystalloids. While a sustained reduction in use of saline was evident (Figure 2), we did not have the opportunity to compare different strategies, in part because local equipoise had been lost and efforts moved rapidly toward implementing system-level change to adapt to the new evidence.
We note there is an important distinction between changing systems to encourage evidence uptake and changing providers’ practice based on knowledge. We intend to move beyond the traditional dissemination of publications and lectures to improve uptake of newly discovered best practices. For example, using a cluster crossover design, we are testing the effectiveness of QuizTime, a mobile electronic application for active learning. The first study is a randomized controlled crossover design to determine whether this education can change provider prescribing behavior to be consistent with both new regulations around opioid prescribing and the findings of the SMART and SALT-ED fluid studies. If successful, we will continue to evolve the platform as a dissemination vehicle.
We also intend to communicate findings beyond academic and provider audiences and use this as an opportunity to further engage our broader community. By crafting intentional, extended dissemination of findings to the local population, we aim to achieve not only the goal of return of results but also the engagement that further empowers the community as partners in the LHS. Providing a robust acknowledgment of patients’ roles and contributions is essential to LHS transparency and success.
As the LHS Platform takes its place as a strategic pillar of the health care system at VUMC and as we develop the model for financial sustainability, we have recognized that our ability to provide detailed answers about the cost-effectiveness of interventions is underdeveloped. We are constructing the necessary framework for integrating an accurate assessment of costs alongside the measurement of the intervention and intervention outcomes. This involves engaging additional stakeholders, including health economists and finance teams. Augmenting the platform with these skill sets will provide us with the capacity to go beyond learning about the quality and effectiveness of care and also to address questions of value as we balance the costs of services with the accruing health outcomes.
We have not yet conducted a broad evaluation of our program from the perspective of project leaders, providers, trainees, and others who we are attempting to engage in broad culture change. As we continue our evolution as an LHS, we expect to conduct qualitative evaluation to continue to meet the needs of the enterprise and the patients that we serve. We will also expand our service to trainees by intentionally engaging them in projects, identifying learning opportunities available by engaging in LHS Platform activities, and developing intentional linkages between the LHS Platform and institutional training programs such as our K12 programs in implementation science and learning health systems, and our Veterans Administration-supported Quality Scholars program.
The journey toward an LHS has itself been educational (Table 2). Our commitment to a research paradigm is rooted in Vanderbilt’s academic culture and emphasis on developing generalizable knowledge. While working at the intersection of academic medicine’s multiple missions presents numerous challenges, we have shown it can be done and that it can be sustained. Integral to our current successes are unceasing commitment to partnership between operations, research, and stakeholder groups such as patients; a focus on the rigor, and not volume, of projects; insistence on pragmatism, including with regulatory affairs; exceptional project management; and incorporation of data science expertise. With this infrastructure in place, clinical programs receiving new investment are now expected to undergo study during implementation to validate claims of benefit and potential for scalability and spread. There is still much work, not only in learning from what we do but also in doing what we learn.
The Learning Health Care System (LHS) Platform has been supported by hundreds of people across the Vanderbilt University Medical Center (VUMC) and the broader community. The authors owe particular gratitude to Dr. Jeff Balser, Dean of Vanderbilt University School of Medicine, President and CEO of VUMC, for championing the development of the LHS at VUMC, and to members of the Steering Committee and the LHS Workshop for their enduring service and support. These include Dikshya Bastakoty, Marc Beller, Bree Burks, Henry Domenico, Shon Dwyer, Jordan Everson, Estefania Gibson, Paul Harris, Jim Hayman, Gerald Hickson, William Hiser, Catherine Ivory, Kevin Johnson, Ruth Kleinpell, Heather Limper, Lee Ann Liska, Patrick Luther, Jessica McAllister, Scott McCarver, Donald Moore, Jay Morrison, Thomas Nantais, Henry Ong, Mariann Piano, Lisa Price, Kris Rehm, Russell Rothman, Matthew Semler, Jennifer Slayton, Philip Walker, Li Wang, Asli Weitkamp, Consuelo Wilkins, John Wolfe, Adam Wright, and Autumn Zuckerman. The authors would also like to acknowledge Nicole Zaleski for graphic design support during manuscript preparation.
1. Institute of Medicine. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, DC: National Academies Press; 2013.
2. Greene SM, Reid RJ, Larson EB. Implementing the learning health system: From concept to action. Ann Intern Med. 2012;157:207–210.
3. Kraft S, Caplan W, Trowbridge E, et al. Building the learning health system: Describing an organizational infrastructure to support continuous learning. Learn Health Syst. 2017;1:e10034.
4. Horwitz LI, Kuznetsova M, Jones SA. Creating a learning health system through rapid-cycle, randomized testing. N Engl J Med. 2019;381:1175–1179.
5. Joosten YA, Israel TL, Williams NA, et al. Community engagement studios: A structured approach to obtaining meaningful input from stakeholders to inform research. Acad Med. 2015;90:1646–1650.
6. Sledge WH, Lee HB. Proactive psychiatric consultation for hospitalized patients, a plan for the future. Health Affairs Blog. https://www.healthaffairs.org/do/10.1377/hblog20150528.048026/full. Published 2015. Accessed February 9, 2021.
7. Noto MJ, Domenico HJ, Byrne DW, et al. Chlorhexidine bathing and health care-associated infections: A randomized clinical trial. JAMA. 2015;313:369–378.
8. Centers for Disease Control and Prevention. Strategies to Prevent Hospital-onset Staphylococcus aureus Bloodstream Infections in Acute Care Facilities. https://www.cdc.gov/hai/prevent/staph-prevention-strategies.html. Published 2019. Accessed February 9, 2021.
9. Pulley JM, Bernard GR. Proven processes: The Vanderbilt Institute for Clinical and Translational Research. Clin Transl Sci. 2009;2:180–182.
10. Mohler R, Kopke S, Meyer G. Criteria for Reporting the Development and Evaluation of Complex Interventions in Healthcare: Revised guideline (CReDECI 2). Trials. 2015;16:204.
11. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: Template for Intervention Description and Replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
12. Gartlehner G, Hansen RA, Nissman D, Lohr KN, Carey TS. Criteria for Distinguishing Effectiveness From Efficacy Trials in Systematic Reviews. Rockville, MD: Agency for Healthcare Research and Quality (US); 2006. http://www.ncbi.nlm.nih.gov/books/NBK44029. Accessed November 30, 2018.
13. Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe KE, Zwarenstein M. The PRECIS-2 tool: Designing trials that are fit for purpose. BMJ. 2015;350:h2147.
14. Davey C, Hargreaves J, Thompson JA, et al. Analysis and reporting of stepped wedge randomised controlled trials: Synthesis and critical appraisal of published studies, 2010 to 2014. Trials. 2015;16:358.
15. Office for Human Research Protections. Electronic Code of Federal Regulations [Government Printing Office]. Title 45, CFR 46. https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=83cd09e1c0f5c6937cd9d7513160fc3f&pitd=20180719&n=pt45.1.46&r=PART&ty=HTML. Published 2018. Accessed February 9, 2021.
16. Shinall MC Jr, Karlekar M, Martin S, et al. COMPASS: A pilot trial of an early palliative care intervention for patients with end-stage liver disease. J Pain Symptom Manage. 2019;58:614–622.
17. Asch DA, Ziolek TA, Mehta SJ. Misdirections in informed consent—Impediments to health care innovation. N Engl J Med. 2017;377:1412–1414.
18. Semler MW, Self WH, Wanderer JP, et al.; SMART Investigators and the Pragmatic Critical Care Research Group. Balanced crystalloids versus saline in critically ill adults. N Engl J Med. 2018;378:829–839.
19. Self WH, Semler MW, Wanderer JP, et al.; SALT-ED Investigators. Balanced crystalloids versus saline in noncritically ill adults. N Engl J Med. 2018;378:819–828.
20. International Committee of Medical Journal Editors. Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals. Defining the Role of Authors and Contributors. http://www.icmje.org/recommendations/browse/roles-and-responsibilities/defining-the-role-of-authors-and-contributors.html. Published 2019. Accessed February 9, 2021.
References cited only in Table 2
21. Van Driest SL, Wang L, McLemore MF, et al. Acute kidney injury risk-based screening in pediatric inpatients: A pragmatic randomized trial. Pediatr Res. 2020;87:118–124.
22. Bloom SL, Stollings JL, Kirkpatrick O, et al. Randomized clinical trial of an ICU recovery pilot program for survivors of critical illness. Crit Care Med. 2019;47:1337–1345.
23. Stone CA Jr, Stollings JL, Lindsell CJ, et al. Risk-stratified management to remove low-risk penicillin allergy labels in the ICU. Am J Respir Crit Care Med. 2020;201:1572–1575.
24. Yiadom MYAB, Domenico H, Byrne D, et al. Randomised controlled pragmatic clinical trial evaluating the effectiveness of a discharge follow-up phone call on 30-day hospital readmissions: Balancing pragmatic and explanatory design considerations. BMJ Open. 2018;8:e019600.
25. Morrison J, Hasselblad M, Kleinpell R, et al.; Vanderbilt Learning Healthcare System Investigators, Vanderbilt University. The Disruptive bEhavior manageMEnt ANd prevention in hospitalized patients using a behaviORal intervention team (DEMEANOR) study protocol: A pragmatic, cluster, crossover trial. Trials. 2020;21:417.