
Evaluating Clinical Decision Support Rules as an Intervention in Clinician Workflows With Technology


CIN: Computers, Informatics, Nursing: January-February 2011 - Volume 29 - Issue 1 - p 36-42
doi: 10.1097/NCN.0b013e3181f9dbb1
Feature Article

The implementation of electronic health records in rural settings has generated new challenges beyond those seen in urban hospitals. Preparing, implementing, and sustaining clinical decision support rules require extensive attention to standards, content design, support resources, expert knowledge, and more. A formative evaluation was used to present the progress and evolution of clinical decision support rule implementation and use within clinician workflows in an electronic health record. The rural hospital was able to use clinical decision support rules from five urban hospitals within its system to promote safety, prevent errors, establish evidence-based practices, and support communication. This article describes tools to validate the initial 54 clinical decision support rules used in a rural referral hospital and the 17 used in clinics. Since 2005, the study hospital has added system-specific clinical decision support rules for catheter-associated urinary tract infection, deep venous thrombosis, heart failure, and more. The findings validate the use of clinical decision support rules across sites and the ability to use existing indicators to measure outcomes. With a systemized approach and support structures, rural hospitals can rapidly overcome the barriers to preparing, implementing, and sustaining clinical decision support rules. A model for designing and validating clinical decision support rules within workflow processes is presented. The replication and reuse of clinical decision support rule templates, with data specifications that follow data models, can support reapplication of the rule intervention in subsequent rural and critical access hospitals through system support resources.

Author Affiliations: University of Iowa, Iowa City (Dr Brokel); Trinity Health, Novi, MI (Drs Brokel and Kramer); Mercy Medical Center-North Iowa, Mason City, IA (Ms Schwichtenberg); Center for Health Care Quality University of Missouri, Columbia (Dr Wakefield); Department of Health Management and Policy, University of Iowa, Iowa City (Dr Ward); and Trinity Information Systems, Farmington Hills, MI (Dr Shaw).

This study was supported by the Agency for Healthcare Research and Quality for Health Information Technology grant #HS015196; Trinity Information Systems and Trinity Health of Novi, Michigan; Mercy Medical Center-North Iowa; the University of Iowa; and the University of Missouri Center for Health Care Quality.

Corresponding author: Jane M. Brokel, PhD, RN, 482 NB, College of Nursing, 50 Newton Rd, University of Iowa, Iowa City, IA 52242.

Clinical information systems in rural hospitals lag 40% to 50% behind those in urban hospitals. The high cost of implementing electronic health records (EHRs) has been a significant obstacle in rural areas.1 The clinical decision support (CDS) applications within the EHR are powerful information technology tools that foster efficiencies and affect outcomes and costs2 but require informatics resources to design, validate, monitor, and upgrade.3 Osheroff et al4 define CDS as clinicians' use of evidence-based knowledge and person-specific or population information, logically filtered and presented at appropriate times, to foster better health processes and better patient care. Automated CDSs are known to increase adherence to guidelines and protocols,3 expand surveillance and monitoring for disease conditions,2,5 and reduce risks and medication errors. Translating empirical guidelines and conceptual safety measures into CDS applications necessitates a process to substantiate the CDS goals. This study describes the process of validating one of the more complex types of CDS interventions, the CDS rule (CDSR). Validation is the checking of correctness and the comparison of the CDSR intervention with its stated goal as a frame of reference.6 A CDSR is an automated evaluation of a pattern of patient data that prompts a clinician to take an action in patient care to meet regulations, care standards, and measurable goals. Multiple components are necessary to implement a CDSR successfully within the EHR. As a result, studies are needed to validate CDSR designs, capabilities, effectiveness, and reuse in rural areas where resources are limited.
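The CDSR concept defined above — an automated evaluation of a pattern of patient data that prompts a clinician action — can be sketched in code. This is a minimal illustrative model only; the names, fields, and structure are hypothetical and do not reflect the Discern Expert rule syntax used in the study.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class PatientData:
    # Illustrative patient record: only the fields a rule inspects.
    ordered_meds: List[str] = field(default_factory=list)

@dataclass
class CDSRule:
    name: str
    trigger: str                              # documented event that starts evaluation
    condition: Callable[[PatientData], bool]  # inclusion/exclusion logic
    action: str                               # alert, reminder, consult order, etc.

    def evaluate(self, patient: PatientData) -> Optional[str]:
        """Return the action response if the rule's conditions are met."""
        return self.action if self.condition(patient) else None

# Example drawn from the article: alert when contrast media is ordered for a
# patient who has an active metformin order.
contrast_metformin = CDSRule(
    name="contrast_media_with_metformin",
    trigger="order: contrast media",
    condition=lambda p: "metformin" in p.ordered_meds,
    action="ALERT: metformin on profile; review before contrast administration",
)

print(contrast_metformin.evaluate(PatientData(ordered_meds=["metformin"])))
print(contrast_metformin.evaluate(PatientData(ordered_meds=["lisinopril"])))
```

The separation of trigger, condition, and action mirrors the components the article lists as necessary for a successful CDSR implementation.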



The study site was a rural referral hospital and clinics within a large, virtually integrated healthcare system. The hospital and clinics applied preexisting CDSRs from five urban hospitals in the same system in 2005 and from urban ambulatory clinics in 2007. In 2008, the same CDSRs were applied within seven rural critical access hospitals using the same EHR with CDSRs. The CDSRs were used to support approximately 12 960 admissions for acute encounters, 1120 newborns, 34 300 emergency room visits, and a portion of the 585 000 outpatient visits per year. The study site was able to select CDSRs from a Web-accessible, system-level catalog of CDSRs used by other hospitals and then modify the rules for local use. The catalog provided the study hospitals and clinics with eight core types of information: (1) the CDSR title and filename, (2) the clinical process supported by the CDSR, (3) the purpose, (4) an explanation of the specific rule's logic, (5) the actions supporting the clinician, (6) the evidence supporting the rule's use, (7) the original start date, and (8) use of the CDSR elsewhere in the health system.
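The catalog's eight core information types can be sketched as a simple record with a completeness check. The field names and values below are illustrative placeholders, not the actual catalog schema.

```python
# One hypothetical catalog entry covering the eight core information types.
catalog_entry = {
    "title_filename": "swallow_screen_consult.rule",
    "clinical_process": "dysphagia screening on admission",
    "purpose": "route swallowing deficits to speech pathology",
    "rule_logic": "trigger on functional screen; include if swallow item fails",
    "clinician_actions": "automated consult order to speech pathologist",
    "evidence": "dysphagia screening guideline citation",
    "start_date": "2005-07-08",
    "other_sites_using": ["urban hospital A", "urban hospital B"],
}

REQUIRED_FIELDS = {
    "title_filename", "clinical_process", "purpose", "rule_logic",
    "clinician_actions", "evidence", "start_date", "other_sites_using",
}

def validate_entry(entry: dict) -> list:
    """Return the catalog fields missing from an entry (empty list = complete)."""
    return sorted(REQUIRED_FIELDS - entry.keys())

print(validate_entry(catalog_entry))
```

A check like this would flag entries lacking supporting evidence, a gap the study later found in 35% of the CDSRs.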

The implementation included 54 unique inpatient and 17 ambulatory CDSRs. What was helpful to the hospitals and clinics in implementing preexisting and new CDSRs? Do the selection and validation of the CDSRs operate the same way when implementing the EHR for additional rural hospitals and, likewise, for clinics? An institutional review board approved the study site's participation.


CDSR Validation

In the CDSR implementation, we were concerned with validating the content and consistency of CDSRs used across settings. The catalog of CDSRs provided a first effort to take and reuse CDSRs. When the study hospital and clinics selected rules previously used by the five other urban hospitals and clinics, preparation took approximately 3 weeks per CDSR. When the study hospital requested a new, unique rule, the average preparation took 7 weeks. Preparation included obtaining evidence to support the CDSR, finding the needed documented or downloaded data in the EHR, building the logic, and testing the action response of new CDSRs in a test domain. The study site often needed to specify the locations, facilities, and roles involved for the CDSR. The site accepted the final details to complete the design and build within a Discern Expert application (Cerner Corporation, Kansas City, MO) integrated with the EHR. All CDSRs were implemented at the EHR "Go-Live" on July 8, 2005, as planned.
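The "test domain" step described above — exercising a rule against cases that should and should not trigger it before go-live — can be sketched as a small check. The rule logic (dietitian consult for a Braden scale score below 12) comes from the article; the function and field names are hypothetical.

```python
def braden_consult_rule(assessment: dict):
    """Trigger a dietitian consult when the documented Braden score is below 12."""
    score = assessment.get("braden_score")
    if score is not None and score < 12:
        return "CONSULT: dietitian (pressure-ulcer nutrition review)"
    return None

def test_rule_in_test_domain():
    # Should trigger: low Braden score documented.
    assert braden_consult_rule({"braden_score": 10}) is not None
    # Should not trigger: adequate score, or documentation not yet completed.
    assert braden_consult_rule({"braden_score": 18}) is None
    assert braden_consult_rule({}) is None

test_rule_in_test_domain()
print("rule passed test-domain checks")
```

The missing-documentation case matters because, as the article notes later, CDSR processing often depended on clinicians completing documentation within their workflow.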

When the CDSRs were operating, validation determined whether clinicians were receiving computerized system patient orders to guide documentation (eg, patient education templates, Braden scale assessments), consult orders to disciplines (eg, dietitian, certified diabetic educator), messages (eg, patient was in the emergency room), reminders (eg, screenings), and alerts (eg, contrast media with a metformin order). Often, these CDSRs used a nurse's or another professional's electronic clinical documentation to initiate and work through the logic of a CDSR. To make this work, the rural hospitals and clinics were able to use 95% of the electronic forms for interdisciplinary documentation prepared from previous urban hospital use, which allowed reuse of 92% (n = 67) of the CDSRs.


Results
The CDSRs were triggering action responses after implementation in 2005, and no CDSRs were discontinued. The health system knew this because each facility reports issues within a Web-accessible issue-tracking system. End-users and managers reported issues regarding the EHR, CDSRs, and related clinical applications. The issue-tracking system was used to collect both problematic findings and new requests for CDSR development. In a review of reported issues routed to the Discern Expert analyst, no problematic CDSR issues were reported by the study hospital and clinics.

In validating 71 CDSRs, the research assistant and investigator used the purpose, the explanation statements, and the department-supported data fields from the catalog. A Discern Expert analyst provided additional information needed to describe the logic and actions for each CDSR and who receives the response (eg, reminder, alert). The CDSRs supported 12 different clinical disciplines and services. Thirty-four of the CDSRs (48%) were designed with a purpose and goal of guiding evidence-based practices by placing system patient orders. Twenty-five (35%) supported communication among disciplines and services. Six CDSRs (8%) promoted safety, four prevented errors, and three did not fit within these goal-oriented categories. Fifteen CDSRs (21%) assisted nurses with all four of the goals. Seven CDSRs supported three goals to help pharmacists. All clinical services were supported by CDSRs to achieve one or more goals previously identified in guiding principles for CDS.7 Clinical decision support rules required cited standards, published research, and evidence-based practices to guide decisions on how data were used in building logic into the Discern Expert application, but we found that 35% of the CDSRs lacked this supporting information.

Once implemented, data from the initial 3 months (July-October 2005) were extracted from the clinical data repository for the EHR to evaluate the number of times the action response was triggered for each CDSR. The number of triggered action responses was consistent within hospitals but varied between the rural and urban hospitals. Differences were found with the pneumococcal vaccine CDSR, which triggered 22 to 33 times per month at the rural hospital and ranged from 5 to 106 responses per month among other hospitals. Another CDSR, for swallow screening, triggered 15 to 17 times per month at the rural study hospital and 10 to 122 times per month at other sites. The CDSRs whose triggered action response counts in the rural setting matched rates found in other system hospitals promoted communication or the use of evidence-based practice assessments, such as fall risk (Morse scale), triggering 755 to 805 times per month; Braden assessment, 2086 to 2254 per month; consults for functional screening, 200 to 206 per month; and domestic violence, 128 to 155 per month. These CDSRs included automated functions to guide timely patient education and clinical documentation and to monitor patient progress with assessment indicators. When nurses received notification of the need to collect home medications, allergies, measured weight, and other admission data, the signed documentation demonstrated completion. Other CDSRs initiated within the obstetrics area guided patient education for mothers and standard newborn assessments and screenings at birth.
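The volume comparison described above — counting triggered action responses per rule, per facility, per month — can be sketched as a simple aggregation. The event records are illustrative, not actual repository data.

```python
from collections import defaultdict

# Hypothetical triggered-response events: (date, facility, rule).
events = [
    ("2005-07-14", "rural", "pneumococcal_vaccine"),
    ("2005-07-21", "rural", "pneumococcal_vaccine"),
    ("2005-07-03", "urban_A", "pneumococcal_vaccine"),
    ("2005-08-09", "rural", "swallow_screen"),
    ("2005-08-30", "urban_A", "swallow_screen"),
    ("2005-08-30", "urban_A", "swallow_screen"),
]

# Tally responses per (rule, facility, month) so rates can be compared
# within a facility and between rural and urban sites.
counts = defaultdict(int)
for date, facility, rule in events:
    month = date[:7]  # YYYY-MM
    counts[(rule, facility, month)] += 1

for (rule, facility, month) in sorted(counts):
    print(f"{month}  {facility:8s}  {rule:22s}  {counts[(rule, facility, month)]}")
```

In the study, a rural count far below the range seen at other system hospitals (eg, pneumococcal vaccine) was the signal that prompted further investigation.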

Variation in CDSR action responses was seen when dietitians received consults for low albumin levels, Braden scale scores less than 12, special diets for medications, and findings from nutrition screening in adult, pediatric, and obstetric areas. The variation depended on nurses' completion of assessments and physicians' ordering of diagnostic tests. Social workers received automated consults for documented domestic violence and psychosocial screening needs. Rehabilitation therapists and speech pathologists received automated consults for mobility and swallowing deficits identified in functional screening. Pharmacists received alerts for dose-range checking and inappropriate medication routes arising from physician ordering practices. These initial reports did not provide information to explain the variation; despite the catalog of CDSRs, the key factor(s) influencing CDSR use remained unknown.

In 2009, the study hospital continued to use the 17 adverse drug alerts established in 2001, the 54 CDSRs established in 2005, and the 17 CDSRs for two clinics established in 2007. Some of these original CDSRs have undergone updates during this time. Another 24 CDSRs (eg, a reminder for catheter removal to prevent urinary tract infection, inappropriate routes for medications) were added after being piloted in the study hospital and clinic or other facilities. The CDSR-related patient outcome measures were discharge instructions for heart failure (94%), smoking cessation counseling for heart failure (100%), Pneumovax (93%) and influenza (87%) vaccination for pneumonia, and, for patients with heart attack, aspirin (100%) and smoking cessation counseling (100%).


Discussion
This study expanded upon previous work by two of the authors, who constructed the original health system CDSR tool kits containing education modules, a catalog of CDSRs, and the procedure steps used by the study hospital.7 The formulation of a new conceptual model, new education modules, and a CDS data use model is discussed here. These evolving tools provided validating steps for using preexisting and new CDSRs. The end-users, their managers, and our investigators needed more information than was in the catalog to describe what data are used, what inclusions and exclusions are necessary, who is affected, how the CDSR will interact with clinicians, what the clinician is expected to do in response (eg, to an alert, notification, or message), and, finally, what patient outcome is expected.

Initially, in 2005, the extracted reports used a report function within the EHR, and only a very limited volume of data could be used to confirm and validate the operation of the CDSRs. Extracting data for all facilities that used a CDSR required multiple reports. Between 2003 and 2005, up to six reports containing 1 to 15 days of data were used to analyze the number of triggered responses. This report process was not a viable long-term option, so we sought alternative reporting processes over the next 3 years to validate CDSRs. By late 2007, the ability to extract reports from a business objects tool (PowerInsight Explorer, Cerner Corporation, Kansas City, MO) became available. Over the next 18 months, into 2009, we established reports to regularly extract and evaluate CDSRs for the system and each facility without affecting EHR operations. The validation processes require steps to establish the correct data to compare findings across settings and with external results. In this article, we discuss why a conceptual model and a data use model are helpful for specifying standard data for the CDSR design, validation, monitoring, and evaluation process.

First, why is validation necessary to maintain CDSRs in the long term? When the Centers for Disease Control and Prevention updates vaccination recommendations, the evidence and other regulations lead to revisions of CDSRs and other EHR documentation over time. When screening logic (eg, pneumococcal and influenza vaccination, nutrition screening, fall risk) is updated, is the CDSR still operating correctly? The requirements for evidence and standards that first compelled building a CDSR also require regular updates and revalidation. The roles of clinicians vary among urban, rural referral, and critical access settings, and what works in one does not always demonstrate the same results in another. Validation was therefore necessary within and between settings and over time, both because vendor upgrades can and do disrupt CDSRs and because EHRs evolve through continuous improvement.

Our early results in 2005 from rural hospitals showed that every CDSR was triggering action responses, although we could neither determine who received the CDSR nor prove that the clinician had followed through with synchronous CDSRs. Furthermore, data extraction was necessary to match the triggered event with the patient encounter, the clinician, the clinician performance data, and the patient outcome data, which is now possible with business objects reporting tools. A conceptual model and a data use model were tools to clarify the standard data necessary for validation and evaluation. The validating reports included specific data on the date and time of the triggered response, the specific CDSR filename, the encounter number, and the position type seeing the response. The reporting methods demonstrated for the health system that CDSRs triggered responses to various roles within rural referral and smaller critical access hospitals as well as within urban hospitals and clinics. These reports were easily uploaded into SPSS (SPSS Inc, Chicago, IL) for statistical analysis.
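The validating report fields named above (trigger date and time, CDSR filename, encounter number, position type) can be sketched as a flat extract and summarized before statistical analysis. The rows are illustrative, not real repository data.

```python
import csv
import io
from collections import Counter

# Hypothetical validation-report extract with the four fields the study used.
extract = """triggered_at,rule_filename,encounter,position_type
2005-07-12 08:14,fall_risk_morse.rule,10021,RN
2005-07-12 09:02,domestic_violence.rule,10021,Social Worker
2005-07-13 11:45,fall_risk_morse.rule,10094,RN
2005-07-13 14:20,dose_range.rule,10094,Pharmacist
"""

rows = list(csv.DictReader(io.StringIO(extract)))

# Which position types are seeing responses, and which rules are firing?
by_position = Counter(r["position_type"] for r in rows)
by_rule = Counter(r["rule_filename"] for r in rows)

print(dict(by_position))
print(dict(by_rule))
```

A flat file in this shape is also what would be uploaded into a statistics package such as SPSS for further analysis.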

Second, validating CDSRs required extensive attention to standards and content design,8 which we identified in the template for a CDS data use model. Clinician users required this same understanding of the content design to know what the CDSR templates would accomplish for their facility or site. Therein lies the value of having a standard model for integrating CDSRs with other design decisions for the EHR. The processing of CDSRs was often dependent on documentation being completed within a clinician's workflow. The use of a CDSR in workflow affects the volume of triggered responses and subsequent actions by clinicians. Healthcare informaticists will find, as was experienced in this study hospital, that clinicians want to integrate design decisions with decisions on clinical workflow processes. Displaying data use to clinicians through CDSR templates reduced confusion about use and improved CDSR implementations.


Conceptual Model and Data Use Model for CDSR

Although little has been done to validate the integration of CDS tools into the clinical workflow processes of all disciplines, CDS tools were first integrated within EHRs merely 20 years ago, and then only for medicine. This conceptual model illustrates the major data, information, and knowledge evidence needed to design content for the clinicians' decisions leading to the patient outcome.8 The content is used to explain the event and data needed to sequentially initiate the CDS expert rule, the conditional limits that include or exclude patients, the action message and response(s) to the clinician and their performed response, and the patient outcome (Figure 1). It was important to consider previous frameworks to completely represent the complexity of decision support within a practice setting. The model includes an informatics framework embracing three core sciences that together function to support decision making.9 The model considers factors previously identified by the decision support framework of Patterson et al,10 a health systems theory, the life cycle of change, and nursing's metaparadigm related to care.11,12 A recent project by Wright et al13 provided elements for a CDS taxonomy to inventory the CDS capabilities used within EHRs, similar to the catalog used for this study. This CDS taxonomy should be considered when stating components for validation and a catalog of CDSRs. A CDS application becomes useful only through the use of data and conditions for information based on evidence. The clinician's wisdom is how data, information, and knowledge are applied using health information technology.8 This model integrates the standards that allow replication of preexisting CDSRs across care sites, as Kawamoto et al14 suggested. If a CDSR generates no decision or action, the CDS has no value and should not be inventoried for further use, whereas a CDSR that supports a clinician's decision and action for the patient should be shared and disseminated to others.



The templates for a CDS data use model (Figure 2) specified the data with greater precision to support actions in the clinician's decision process using the CDS application. The data use model displays the interaction and use of patient data with CDSRs. The data documented by clinicians within the EHR trigger CDSRs to evaluate the conditional limits of the data and produce actionable responses. The schematic model (Figure 1) illustrates data entered, uploaded from devices, or transmitted into the EHR from other applications within the framework of information and data use. The arrows depict data use for the elements required in CDSR operation. The model represents the CDSR application's use of data in a triggered event within a clinician's workflow to respond with a message, alert, reminder, order, or identified potential problem. The translation of research evidence into practice, by integrating knowledge into each of the CDSR steps to produce the best results, has been well documented and is central to the validity of CDSRs. The conceptual model guides the data use model formalism to sort the clinical content into specifications for the CDSRs. This requires mapping the standard content using specifications (eg, SNOMED CT codes) to the CDS expert rule data use model, as seen in Figure 2.
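The mapping step described above — binding local documentation fields to standard concept codes so a rule's data elements remain portable across sites — can be sketched as a lookup. The codes and element names below are illustrative placeholders, not verified SNOMED CT or LOINC identifiers.

```python
# Hypothetical local-element-to-standard-code bindings.
concept_map = {
    "swallow_difficulty": {"system": "SNOMED-CT", "code": "SCT-0001"},
    "braden_score":       {"system": "LOINC",     "code": "LN-0002"},
}

# A rule's data specification names the local elements it reads.
rule_data_spec = {
    "rule": "swallow_screen_consult",
    "trigger_element": "swallow_difficulty",
    "condition_elements": ["braden_score"],
}

def resolve_spec(spec: dict, cmap: dict) -> dict:
    """Replace local element names with their standard code bindings."""
    return {
        "rule": spec["rule"],
        "trigger": cmap[spec["trigger_element"]],
        "conditions": [cmap[e] for e in spec["condition_elements"]],
    }

print(resolve_spec(rule_data_spec, concept_map))
```

Once a rule's specification is expressed in standard codes rather than local field names, another facility can rebind the same rule to its own documentation forms, which is the reuse the data use model is meant to enable.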



Our early findings show variability; therefore, the impact in rural, rural referral, and urban healthcare settings will need to be validated with evaluation methods that are still evolving. In developing reports to analyze the CDSRs, we are noticing roles that perform the expected response and others that do not. The first validation report type will show which clinician role is not responding to CDSR alerts and when a CDSR is repeating within a patient's episode. It is important to note that state laws, professional practice standards, organizational policy, and legal interpretation establish practice responsibilities. A clinician's response, or lack of response, to a CDSR may explain the differences we see in performance between urban and rural locations. Our overall concern is how the CDSRs can be replicated across urban and rural organizations, and within different states, to achieve similar clinician performance and patient outcomes. We are indeed early in our methods development, but further research efforts can move these validation and evaluation methods forward. The data use model, which includes a clear specification for the role, is now considered a core data item within our validation and evaluation procedures.
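The first validation report type described above — identifying roles that do not respond to alerts and rules that repeat within a patient's episode — can be sketched from the same kind of extract. The records are illustrative.

```python
from collections import Counter

# Hypothetical alert log: (encounter, rule, role, responded).
alerts = [
    (1001, "catheter_removal_reminder", "RN", True),
    (1001, "catheter_removal_reminder", "RN", False),
    (1001, "catheter_removal_reminder", "RN", False),
    (1002, "dose_range_alert", "Pharmacist", True),
    (1003, "dose_range_alert", "MD", False),
]

# Roles that saw an alert but recorded no response.
non_responders = {(role, rule) for _, rule, role, ok in alerts if not ok}

# Rules repeating within a single patient encounter.
repeats = Counter((enc, rule) for enc, rule, _, _ in alerts)
repeating = sorted(k for k, n in repeats.items() if n > 1)

print(sorted(non_responders))
print(repeating)
```

A repeating reminder within one encounter can signal either an unresolved clinical issue (the catheter is still in place) or a rule that is firing more often than the workflow intends — the report surfaces both for review.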

Inherent in this design is the unequivocal need to translate research evidence into practice11,13,15 to support valid and reliable use of information within each CDSR; thus, the evidence arrows point to the information and computer sciences, as well as to the professional standards that direct the clinician workflow. The clinician's actions with the data used in the CDSR are displayed in workflow processes, another tool. Each discipline interacts with a CDSR either in an initiator role, triggering the events, or in a receiver role, deciding the action response during the patient care workflow. The nurse often documents assessments that trigger action responses to other disciplines. Because of the dynamic nature of evidence, workflow processes, and policy changes, the model includes change control procedures, which serve to identify the importance of validation in initiating, managing, evaluating, and upgrading the CDSRs safely over time. These sustaining procedures required the retooling of our original tool kit to represent the data specifications and evidence in CDSRs, to foster use across the hospitals and clinics and interoperability across regions. The conceptual model and data use model are used as part of a program of studies to validate the triggered response, the clinician receiving the response, the performed response, and the anticipated patient outcome for CDSRs.


Conclusion
While the study hospital and clinics use 100 CDSRs, the health system maintains more than 300 CDSRs for use. With this many CDSRs, new approaches were necessary. In the end, it proved easier for clinical end-users to view a template of each CDSR based on the data used for the trigger, the inclusion and exclusion criteria, the CDSR actions, the clinician involved, the clinician's performed actions, and the outcome for the patient. This led to the development of a template for each CDSR.
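The per-rule template described above can be sketched as a single readable view covering the fields end-users wanted to see. The field values below are illustrative, drawn loosely from the catheter-removal reminder mentioned earlier; none of the wording is from an actual template.

```python
# One hypothetical template covering the fields listed in the text.
TEMPLATE = """CDSR: {name}
  Trigger data:        {trigger}
  Inclusion criteria:  {inclusion}
  Exclusion criteria:  {exclusion}
  CDSR action:         {action}
  Clinician involved:  {clinician}
  Performed action:    {performed}
  Patient outcome:     {outcome}"""

rule = {
    "name": "catheter_removal_reminder",
    "trigger": "indwelling urinary catheter documented",
    "inclusion": "catheter in place more than 48 hours",
    "exclusion": "order to retain catheter",
    "action": "reminder to evaluate catheter necessity",
    "clinician": "RN",
    "performed": "catheter removed or continued with rationale",
    "outcome": "reduced catheter-associated urinary tract infection",
}

print(TEMPLATE.format(**rule))
```

A view in this shape lets a clinical end-user judge a rule's fit with local workflow without reading the underlying rule logic.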

Little attention has been given to validating CDSRs used within clinical workflow processes involving an integrated EHR, where all disciplines may be affected while working directly with the patient in different settings. Tools were necessary to guide the validation of CDSR designs and to monitor the responses of various clinical professionals. The model demonstrates how physician, nurse, dietitian, radiology technician, and pharmacist workflows are integrated with the requested CDSRs to achieve the four goals: support patient safety, prevent errors, establish evidence in daily practices, and improve communication. We are beginning to learn how effectively some of the CDSRs achieve their goals and what consequences others create. This article shares our learning from evaluating a clinic's and hospital's use of CDSRs with all disciplines and how that evaluation led to the conceptual model, the CDS data use model, and the integration of CDSRs into the clinical workflows of professionals who interact with patients. Further research is planned to test the effectiveness of more CDSRs across facilities. This model will be used to monitor and evaluate CDSRs and to guide the health system with a tool kit for CDSR management.


References
1. Ward MM, Jaana M, Bahensky JA, Vartak S, Wakefield DS. Clinical information system availability and use in urban and rural hospitals. J Med Syst. 2006;30:429-438.
2. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144(10):742-752.
3. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765-773.
4. Osheroff JA, Teich JM, Middleton B, Steen EB, Wright A, Detmer DE. A roadmap for national action on clinical decision support. American Medical Informatics Association; 2006:1-97. Office of the National Coordinator Contract HHSP233200500877.
5. Lin J, Chu P, Liou J, Hwang J. Applying a Multiple Screening Program Aided by a Guideline-Driven Computerized Decision Support System-A Pilot Experience in Yun-lin, Taiwan. Taipei, Taiwan: The Association; 2007.
6. Brender J. Handbook of Evaluation Methods for Health Informatics. New York: Academic Press; 2006:9-10.
7. Brokel JM, Shaw MG, Nicholson C. Expert clinical rules automate steps in delivering evidence-based care in the electronic health record. Comput Inform Nurs. 2006;24(4):196-207.
8. Englebardt SP, Nelson R. Health Care Informatics: An Interdisciplinary Approach. St Louis: Mosby; 2002:13.
9. Staggers N, Thompson CB. The evolution of definitions for nursing informatics: a critical analysis and revised definition. J Am Med Inform Assoc. 2002;9(3):255-261.
10. Patterson ES, Nguyen AD, Halloran JP, Asch SM. Human factors barriers to the effective use of ten HIV clinical reminders. J Am Med Inform Assoc. 2004;11(1):50-59.
11. Effken JA. An organizing framework for nursing informatics research. Comput Inform Nurs. 2003;21(6):316-323.
12. Fawcett J. Analysis and Evaluation of Contemporary Nursing Knowledge: Nursing Models and Theories. Philadelphia: FA Davis Co; 2003:3-71.
13. Wright A, Sittig DF, Ash JS, Sharma S, Pang JE, Middleton B. Clinical decision support capabilities of commercially-available clinical information systems. J Am Med Inform Assoc. 2009;16(5):637-644.
14. Kawamoto K, Lobach DF, Huntington RW, Ginsburg GS. A national clinical decision support infrastructure to enable the widespread and consistent practice of genomic and personalized medicine. BMC Med Inform Decis Mak. 2009;9:17.
15. Titler MG, Everett LQ. Translating research into practice: considerations for critical care investigators. Crit Care Nurs Clin North Am. 2001;13(4):587-604.

Clinical decision support; Clinical processes; Decision support systems; Formative evaluation; Informatics; Rule logic

© 2011 Lippincott Williams & Wilkins, Inc.