According to the U.S. Department of Health and Human Services, health information technology (HIT) consists of tools that allow comprehensive, timely, and secure access to and management of medical information by both providers and consumers. Although HIT ranges from telemedicine systems to personal digital assistants and even wearable sensors, the most extensively studied technologies are those used for inpatient- and outpatient-focused electronic health records (EHRs), order entry systems, bar code medication administration systems, and personal health records. Recent reports from the Institute of Medicine,1,2 the Leapfrog Group,3 and a number of federal agencies, including the Centers for Medicare and Medicaid Services,4 all call for increased adoption of HIT.
According to a recent report published by Chaudhry and colleagues,5 HIT can “improve the efficiency, cost-effectiveness, quality, and safety of medical care delivery by making best practice guidelines and evidence databases immediately available to clinicians … throughout a health care network.” The benefits of HIT include increased availability and dissemination of clinical information, improved guideline adherence and protocol-based care through integrated decision support systems, and a host of other interventions too numerous to list.6–8
Despite data showing the benefits of HIT, almost one quarter of the HIT efficacy literature comes from only four institutions.5 Each convincingly documents the benefits of a multifunctional HIT system on quality, efficiency, and costs, but many other institutions do not yet use HIT at the level employed by these four. The limited adoption of HIT9–11 has the potential to create a disparity, based on technical sophistication, in training environments.
We wanted to investigate what happens to clinician confidence after they transition to less supervised care, while simultaneously losing the safety net of HIT. Do clinicians exposed to HIT perceive themselves as more or less prepared to deliver high-quality care? We designed this study to assess the perceptions and self-reported behaviors of residents who transitioned away from environments with advanced order entry and EHR systems.
The study took place at Vanderbilt University, an HIT-rich training environment and one of the nation’s 100 Most Wired Hospitals and Health Systems.12
The Vanderbilt University Medical Center (VUMC) has implemented a variety of HIT tools during the past decade.13–28 These tools have been continuously modified to improve their usability, the quality of patient care they enable, and the workflows they support. Input from end users has assisted this effort. These users—physicians, nurses, pharmacists, administrative staff, and medical students—have access to a biomedical informatics department that deploys HIT educational resources and provides 24-hour, on-site assistance. VUMC extensively trains new providers in the skills required for HIT-mediated patient care. These users become proficient in incorporating these tools into their daily clinical duties, and they adopt the institution’s policies governing HIT use. Clinicians most commonly use two systems at VUMC. The first, StarPanel, is an EHR system that has been available on both inpatient and outpatient units since 2001.18,20 StarPanel provides an integrated view of discharge summaries, progress notes, visit summaries, and every diagnostic test or procedure result available for patients at VUMC since the mid-1990s. StarPanel also provides access to scanned documents, electronic forms, customizable encounter templates, and a very popular secure messaging system that connects to all clinicians throughout VUMC. These functions allow clinicians to manage patients without a paper chart, and they make the entire medical record available wherever it is needed. A study by Joos et al20 demonstrated high levels of satisfaction with these functions in StarPanel.
WizOrder, a computerized provider order entry (CPOE) system with integrated decision support, has been in place at VUMC since 1994.16,17,22,25,28 Clinicians use WizOrder at the point of care as support for making important decisions regarding a patient’s clinical course. WizOrder helps to ensure the highest quality of care for VUMC patients while reducing medical errors. It incorporates relevant information resources, such as links to Web sites, expert advice about when to order specific tests, and tools to walk users through adjusting medication doses for specific conditions, which enhances and supports decision making. Clinicians now use the technology on virtually all 640 beds at VUMC, including throughout the emergency departments. Clinicians enter more than 10,000 orders into WizOrder daily. WizOrder provides approximately 400 decision support reminders and alerts every day, resulting in altered behavior 10% to 25% of the time. Additional alerts have led to the appropriate substitution of one drug for another, as recommended by the institutional Pharmacy and Therapeutics Committee.
This study evaluated perceptions of medical school and residency graduates from VUMC whose training included extensive use of both StarPanel and WizOrder. The medical school and residency programs at VUMC train more than 400 medical students and 840 residents annually in various specialty and subspecialty programs.
The survey for this study assessed domains of interest based on the Institute of Medicine’s dimensions of quality29—safe, effective, efficient, patient centered, and timely—that were likely impacted by the presence or absence of HIT. We did not include questions relating to equitable care (care that does not vary in quality in association with patients’ social or personal characteristics), because we felt HIT would have a limited impact on this dimension. We expected responses in these dimensions to depend on the years spent at VUMC, current practice environment, medical specialty, extent of time caring for patients versus time doing other medically related activities (i.e., clinical activity), and availability and sophistication of HIT at the respondent’s current institution. The survey consisted of 23 questions that focused on delivering care that was (1) safe, (2) effective (evidence based), (3) efficient, (4) patient centered (emphasizing communication), and (5) timely (focusing on system learning, which impacts the ability to execute tests, consults, or other procedures). We designed questions about perceptions and used a Likert scale for response categories. Additional demographic questions addressed age, gender, date of departure from VUMC, date of arrival at the current institution, level of training at VUMC, number of years at VUMC, current practice environment, area of specialty, number of patients per day on inpatient service and percent of medication and diagnostic test orders entered by the respondent (clinical activity), number of workstations available in the current work environment, and comprehensiveness of the computer systems at the current institution. Expert evaluators and former Vanderbilt students reviewed all survey questions for content validity and suggested changes based on this review. Postresidency fellows in cardiology pilot-tested the survey before it was finalized. Vanderbilt’s IRB reviewed the final version of the survey and approved the study.
We obtained mailing lists from the Vanderbilt Alumni Office and the Office of Graduate Medical Education. These lists contained contact information for each of 679 people who graduated between 2001 and 2003 and then moved to other institutions. The research team conducted three mailings, each spaced two months apart, via regular United States Postal Service mail. We labeled all the surveys returned because of an incorrect or insufficient address as “Not Delivered.” We excluded respondents and undelivered survey entries from each successive mailing. We assigned the graduates individual identification numbers that we used subsequently to track the respondents. We entered each survey’s data into a computer when it arrived, and then we destroyed the survey after data entry. Each respondent received a $5 Starbucks gift certificate as a token of appreciation.
We calculated descriptive statistics for demographic questions. For Likert-scaled responses, the analytic methods varied based on the characteristics of the measured item. For each of the first four survey subscales (safety, evidence-based practice, efficiency, and communication), we evaluated questions for internal consistency using standardized Cronbach alpha coefficients. We removed subscale questions from the summary variable for that subscale if the correlation with the total was less than 0.2 and if the standardized coefficient increased more than 5% after deleting the question. We calculated the final summary variables for each subscale as the average of the complete components to reduce the missing percentage in this variable. We also calculated median and interquartile ranges of subscale scores, as well as frequencies and percentages of subscale scores. We used analysis of covariance (ANCOVA) models to study relationships between summary variables and predefined explanatory variables when the summary variable had at least six levels and, initially, proportional odds models when the summary variable had fewer than six levels. We considered further logistic regression models on the dichotomized outcome if the proportional odds assumption was rejected.
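The item-pruning rule for the subscales can be sketched as follows. This is an illustrative reconstruction, not the authors’ code: the “5% increase” criterion is assumed here to mean a relative increase in the standardized alpha, the item-total correlation is computed against the sum of the remaining items, and NumPy is assumed available.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Standardized Cronbach alpha for an (n_respondents, k_items) array,
    computed from the mean inter-item correlation."""
    k = items.shape[1]
    corr = np.corrcoef(items, rowvar=False)
    mean_r = (corr.sum() - k) / (k * (k - 1))  # average off-diagonal correlation
    return (k * mean_r) / (1 + (k - 1) * mean_r)

def prune_items(items: np.ndarray,
                min_item_total_r: float = 0.2,
                alpha_gain: float = 0.05) -> list:
    """Drop an item when its correlation with the total of the remaining
    items falls below min_item_total_r AND deleting it raises the
    standardized alpha by more than alpha_gain (assumed to be a relative
    increase; the text does not specify relative versus absolute)."""
    keep = list(range(items.shape[1]))
    changed = True
    while changed and len(keep) > 2:
        changed = False
        base = cronbach_alpha(items[:, keep])
        for j in list(keep):
            rest = [i for i in keep if i != j]
            item_total_r = np.corrcoef(items[:, j],
                                       items[:, rest].sum(axis=1))[0, 1]
            if (item_total_r < min_item_total_r
                    and cronbach_alpha(items[:, rest]) > base * (1 + alpha_gain)):
                keep.remove(j)
                changed = True
                break
    return keep
```

With this rule, an item unrelated to the subscale’s latent construct (low item-total correlation, and its removal substantially raises alpha) is dropped before the subscale average is formed.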
We included demographic variables in a multivariate analysis to evaluate their impact on the categories of safety, evidence-based practice, efficiency, communication, and learning of hospital systems. We analyzed all of these categories using the ANCOVA model except the learning of hospital systems; we reported learning hospital systems (which was a dichotomous response) as a log odds ratio. For this analysis, we treated gender as a binary variable. Highest degree obtained in training, years since leaving VUMC, and extent of computerization at the current institution were scaled or continuous variables. Practice environment and specialty were categorical variables, and clinical activity and availability of HIT were indices calculated from survey items. We calculated the index for clinical activity using the following equation:
The index for availability of HIT was the average of scaled results from survey items quantifying the extent to which the respondent used paper or computer systems to complete nine common inpatient tasks. We completed data analysis using SAS, version 9.1. We considered two-sided P values less than .05 to be statistically significant.
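The availability-of-HIT index described above can be sketched as a simple average over the nine task items. The 0 (task done entirely on paper) to 1 (entirely by computer) scaling and the skipping of unanswered items are hypothetical choices for illustration; the survey’s actual scale values are not reproduced here.

```python
def hit_availability_index(task_scores):
    """Average of scaled paper-vs-computer responses across the nine
    common inpatient tasks, skipping unanswered items.

    Scores are assumed scaled to 0 (entirely paper) through 1 (entirely
    computer); both the scaling and the missing-data handling are
    hypothetical, illustrative choices.
    """
    answered = [s for s in task_scores if s is not None]
    return sum(answered) / len(answered) if answered else None
```

For example, a respondent who completed most tasks by computer but one partly on paper and skipped one item, `[1, 0.5, None, 1]`, would receive an index near the computer end of the scale.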
We identified a total of 679 graduates who left VUMC immediately after completing medical school or postgraduate training as potential study participants; we mailed surveys to all of them. Of these, 128 surveys were returned undelivered because of a wrong address, leaving 551 eligible respondents. Among these, 356 people returned surveys, with 328 surveys (60% of eligible respondents) completed sufficiently to analyze. Twenty-eight surveys (8%) were largely incomplete and therefore not analyzed. We evaluated both the geographic location (state and region of current institution) and level of training (undergraduate or graduate) at VUMC for respondents and nonrespondents. There was no significant difference in the geographic location (χ2 = 44.0; P = .56) or level of VUMC program (χ2 = 3.36; P = .07) between the two groups.
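The nonresponse-bias checks reported above (comparing geographic location and level of training between respondents and nonrespondents) rest on Pearson chi-square statistics over contingency tables, which can be sketched as follows. The counts in the test are invented for illustration and are not the study’s data.

```python
def pearson_chi_square(table):
    """Pearson chi-square statistic for an r x c table of observed counts:
    sum over cells of (observed - expected)^2 / expected, where
    expected = row total * column total / grand total."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat
```

A table whose rows are exactly proportional (no association between group membership and category) yields a statistic of zero; the P value then comes from the chi-square distribution with (r − 1)(c − 1) degrees of freedom.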
Table 1 summarizes respondent characteristics. Among respondents, 255 (78%) reported that the HIT at their current institution was less sophisticated than what they had experienced in training, whereas 73 (22%) transitioned to institutions with equal or better HIT than at VUMC.
Table 2 summarizes the survey results of respondents from institutions with less HIT versus institutions with the same or higher HIT penetration (HIT-rich environments). Compared with respondents transitioning to HIT-rich environments, respondents transitioning to environments with less HIT perceived more issues with handwriting interpretation and felt less confident about drug interactions and prescribing safely (all P < .001). In the area of evidence-based practice, respondents from environments with less HIT were more neutral about the impact on ordering appropriate tests and following guidelines for medication ordering. In the area of efficiency, respondents from environments with less HIT perceived that it was taking more time to gather patient history information and to retrieve pertinent documentation (both P < .001). Respondents were less enthusiastic about the impact of HIT on recording information (P = .01) and writing laboratory orders (P = .03). In the area of communication, respondents did not report significant changes in their ability to interact with patients and families; 129 (51%) of respondents in environments with less HIT, and 37 (51%) of respondents from the HIT-rich group, gave neutral responses. There was, however, a significant difference between the perception of communication with colleagues (P = .01), with those in HIT-rich environments perceiving worse collegial communication.
Table 3 summarizes respondents’ attitudes about HIT. Overall, 254 (77%) perceived that electronic systems improved their efficiency, whereas 255 (78%) perceived a positive impact on their ability to deliver safer care. Although 211 respondents (64%) indicated that technology was not a significant factor in their choice of where to practice, those who transitioned to environments with less HIT missed the availability of EHRs and CPOE (P < .001).
ANCOVA analysis (Table 4) revealed that, after controlling for the demographic characteristics of gender, level of medical training at VUMC, and amount of clinical activity, lack of computerization at the current institution was associated with lower perceptions of safety, communication, effectiveness, and learning systems of care. There were no differences by gender, level of training at VUMC, or amount of clinical activity.
Our study participants were VUMC medical students and residents who left VUMC to continue their medical training or to begin medical practice at other institutions. Most respondents reported that their new institution had less sophisticated information technology than VUMC, where they had trained. Compared with their peers who moved on to places with more or similar HIT penetration, this group felt hindered in various technology-dependent aspects of patient care at their new institutions. Absence of HIT was associated with lower perceived quality of care in many domains surveyed, including safety, efficiency, and system learning. Of considerable note, this group reported having less confidence in their knowledge about drug interactions and drug management than they did during their training, even months after changing institutions. Additionally, many respondents felt less able to prescribe medications safely.
These findings are alarming, yet not surprising given the complexity of today’s health care environment. In any health care setting, but particularly in training environments, computer reminders and alerts complement safety practices and arguably decrease the need for manual processes to acquire information from disparate sources. They also provide just-in-time data to decision makers. They act as safety nets that help trainees deliver safe care with confidence. This safety net provides a level of comfort for trainees acquiring the skills of their profession while frequently battling anxiety and fatigue.30 The sudden removal of this safety net, concomitant with increased responsibility, a wider variation of medications, new systems of practice that must be mastered, and a change in one’s living environment, might have a negative impact on perceptions of safety. Published literature suggests that the lack of alerts and reminders leads to predictable increases in redundant orders,23,31–33 inappropriate ordering,32,34–36 and higher adverse drug event rates,37–39 as well as the higher costs associated with these actions.
Our study also revealed that although information retrieval is easier with HIT, there was no reported change in efficiency of recording patient information. This may reflect the variety of ways patient encounters can be recorded at various institutions. For example, some providers prefer using shorthand and symbols in handwritten documentation to typing notes or using a transcription service. Furthermore, because of the potential bias introduced by reviewing the notes of others, it is possible that any gains in efficiency through electronic systems (such as those systems that provide access to progress notes and previous discharge summaries) are offset by the additional work done to ensure that the information is accurate and complete. Additional studies could assess the information-gathering process and the impact of HIT on the accuracy and completeness of that process.
In efforts to promote standardized, quality patient care, providers must incorporate the current best evidence and medical guidelines into everyday practice. This task can be daunting, given the large volume of new clinical data in the medical literature. One way to make it more manageable is to use HIT with integrated decision support systems. These tools can increase adherence to guideline- or protocol-based care through computerized reminders.
Our survey focused on tools such as EHRs and CPOE systems. Because these systems often provide patient-specific alerts and reminders, they may have powerful educational value in a teaching environment. They promote a standardized approach to patient care and endorse institutional policies. Physicians who moved to hospitals that lacked these systems reported that they became less familiar with the most up-to-date evidence and practice guidelines. Without standardized order sets, medication protocols, computerized dosing schedules, and utilization cost guidelines, consistent patient care becomes difficult to provide. In an environment with a high number of providers, treatment decisions become fragmented when evidence-based guidelines are not widely implemented.
Our study was survey based and focused on self-perceptions of a variety of clinical skills and competencies, and it did not evaluate objective measures of proficiency or efficiency. Obtaining objective measures of these variables, given the spectrum of HIT tools used in institutions around the country, would be difficult; however, our results provide a compelling reason to do so. We asked participants about their first year away from VUMC to negate any possible effects of long-term adjustment without HIT. Because some respondents had left VUMC two or three years previously, the results are limited by recall bias. Subsequent studies should be conducted prospectively so that participants are surveyed after a fixed time away from VUMC. Whether and how well physicians eventually adapt to less sophisticated HIT is yet to be determined. In addition, this study does not assess how well these respondents performed in their new environment or how much their perception or practice improved with time. Possibly, any lowered perception would be short-lived and have little or no impact on performance. Finally, this study included only graduates from one institution. These results may not generalize to graduates from other HIT-rich training environments; however, informal discussions with other major teaching centers suggest that our approach to health care and HIT education is similar to those of other wired institutions. To our knowledge, this is the first report about transitioning from an HIT-rich environment to an environment with less HIT, and additional prospective, multicenter studies should follow. Despite these limitations, our results suggest that the sudden removal of HIT-assisted care delivery can be challenging for providers.
As health information accumulates, experts predict that paper-based systems will be unsustainable, forcing health care providers to turn to HIT.40 In the meantime, an educational dilemma persists.
One implication of this study is that if HIT reduces error rates but is not yet ubiquitous, administrators at technologically sophisticated environments might need to expose their junior physicians to unsupported and less safe care environments as learning experiences. If HIT-enabled medical centers prepare graduates for environments that do not provide a technological safety net, they may help their graduates make smoother transitions. Thus, trainees transitioning to environments without CPOE, clinical decision support tools, and other HIT should experience simulated situations where they must think through the process of delivering safe care without HIT support. Ideally, these situations would also provide opportunities for evaluation and constructive comments about proficiency in skills needed to practice safe care, including, as appropriate for the setting,
* using paper-based medical records,
* understanding the role of medical records departments and personnel,
* efficiently communicating medical information orally,
* learning hospital procedures with respect to ordering tests and reporting results,
* mastering clinician-initiated tools (such as PubMed) that help retrieve evidence that can be applied at the point of care, and
* respecting nuances of the local system of care (personnel and timing of activities such as scheduling procedures, requesting consults, and communicating with outside physicians).
In some settings, where technology is especially limited, tools such as personal digital assistants may provide ready access to important reference material.41–43
We found that many providers who transition to environments with less sophisticated HIT perceive their care as less safe and less efficient than it had been in the HIT-rich environment. These results provide support for the continued adoption of information technology and underscore the need to provide formal education to new trainees, faculty, and staff who are transitioning to less HIT-dependent systems of care.
The project was supported in part by a grant from the United States National Library of Medicine (Rosenbloom, 5K22 LM008576-02), and through a stipend provided by the Vanderbilt University Emphasis Program (Johnson).
The authors thank Dr. Emil Petrusa for his thoughtful review of this manuscript.
1 Kohn LT, Corrigan J, Donaldson MS, eds; Committee on Quality of Health Care in America, Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
2 Committee on Identifying and Preventing Medication Errors, Board on Health Care Services; Aspden P, Wolcott JA, Bootman JL, Cronenwett LR, eds. Preventing Medication Errors. Washington, DC: National Academies Press; 2007.
3 Meadows G, Chaiken BP. Computerized physician order entry: A prescription for patient safety. Nurs Econ. 2002;20:76–77,87.
5 Chaudhry B, Wang J, Wu S, et al. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144:742–752.
6 Dexter PR, Perkins S, Overhage JM, Maharry K, Kohler RB, McDonald CJ. A computerized reminder system to increase the use of preventive care for hospitalized patients. N Engl J Med. 2001;345:965–970.
7 Dexter PR, Perkins SM, Maharry KS, Jones K, McDonald CJ. Inpatient computer-based standing orders vs physician reminders to increase influenza and pneumococcal vaccination rates: A randomized trial. JAMA. 2004;292:2366–2371.
8 Tierney WM, Hui SL, McDonald CJ. Delayed feedback of physician performance versus immediate reminders to perform preventive care. Effects on physician compliance. Med Care. 1986;24:659–666.
9 Valdes I, Kibbe DC, Tolleson G, Kunik ME, Petersen LA. Barriers to proliferation of electronic medical records. Inform Prim Care. 2004;12:3–9.
10 Ash JS, Gorman PN, Seshadri V, Hersh WR. Computerized physician order entry in U.S. hospitals: Results of a 2002 survey. J Am Med Inform Assoc. 2004;11:95–99.
11 Ash JS, Stavri PZ, Kuperman GJ. A consensus statement on considerations for a successful CPOE implementation. J Am Med Inform Assoc. 2003;10:229–234.
12 Solovy A, Hoppszallern S, Brown SB. The 2007 Most Wired results. Ten lessons from the top 100. Hosp Health Netw. 2007;81:40–55.
13 Boord JB, Sharifi M, Greevy RA, et al. Computer-based insulin infusion protocol improves glycemia control over manual protocol. J Am Med Inform Assoc. 2007;14:278–287.
14 FitzHenry F, Kiepek WT, Shultz EK, Byrd J, Doran J, Miller RA. Implementing outpatient order entry to support medical necessity using the patient’s electronic past medical history. Proc AMIA Symp. 2002:250–254.
15 Geissbuhler A, Miller RA. Distributing knowledge maintenance for clinical decision-support systems: The “knowledge library” model. Proc AMIA Symp. 1999:770–774.
16 Geissbuhler A, Miller RA. Clinical application of the UMLS in a computerized order entry and decision-support system. Proc AMIA Symp. 1998:320–324.
17 Geissbuhler A, Miller RA. A new approach to the implementation of direct care-provider order entry. Proc AMIA Annu Fall Symp. 1996:689–693.
18 Giuse DA. Supporting communication in an integrated patient record system. AMIA Annu Symp Proc. 2003:1065.
19 Johnson KB, Serwint JR, Fagan LM, Thompson RE, Wilson MH. Computer-based documentation: Effect on parent and physician satisfaction during a pediatric health maintenance encounter. Arch Pediatr Adolesc Med. 2005;159:250–254.
20 Joos D, Chen Q, Jirjis J, Johnson KB. An electronic medical record in primary care: Impact on satisfaction, work efficiency and clinic processes. AMIA Annu Symp Proc. 2006:394–398.
21 Miller RA, Waitman LR, Chen S, Rosenbloom ST. The anatomy of decision support during inpatient care provider order entry (CPOE): Empirical observations from a decade of CPOE experience at Vanderbilt. J Biomed Inform. 2005;38:469–485.
22 Miller RA, Gardner RM, Johnson KB, Hripcsak G. Clinical decision support and electronic prescribing systems: A time for responsible thought and action. J Am Med Inform Assoc. 2005;12:403–409.
23 Neilson EG, Johnson KB, Rosenbloom ST, et al. The impact of peer management on test-ordering behavior. Ann Intern Med. 2004;141:196–204.
24 Ozdas A, Speroff T, Waitman LR, Ozbolt J, Butler J, Miller RA. Integrating “best of care” protocols into clinicians’ workflow via care provider order entry: Impact on quality-of-care indicators for acute myocardial infarction. J Am Med Inform Assoc. 2006;13:188–196.
25 Rosenbloom ST, Grande J, Geissbuhler A, Miller RA. Experience in implementing inpatient clinical note capture via a provider order entry system. J Am Med Inform Assoc. 2004;11:310–315.
26 Shultz E, Rosenbloom T, Kiepek W, et al. Quill: A novel approach to structured reporting. AMIA Annu Symp Proc. 2003:1074.
27 Waitman LR, Pearson D, Hargrove FR, et al. Enhancing Computerized Provider Order Entry (CPOE) for neonatal intensive care. AMIA Annu Symp Proc. 2003:1078.
28 Rosenbloom ST, Geissbuhler AJ, Dupont WD, et al. Effect of CPOE user interface design on user-initiated access to educational and patient information during clinical care. J Am Med Inform Assoc. 2005;12:458–473.
29 Institute of Medicine (U.S.). Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
30 Jagsi R, Kitch BT, Weinstein DF, Campbell EG, Hutter M, Weissman JS. Residents report on adverse events and their causes. Arch Intern Med. 2005;165:2607–2613.
31 McDonald CJ. Protocol-based computer reminders, the quality of care and the non-perfectability of man. N Engl J Med. 1976;295:1351–1355.
32 Tierney WM, McDonald CJ, Martin DK, Rogers MP. Computerized display of past test results. Effect on outpatient testing. Ann Intern Med. 1987;107:569–574.
33 Tierney WM, Miller ME, Overhage JM, McDonald CJ. Physician inpatient order writing on microcomputer workstations. Effects on resource utilization. JAMA. 1993;269:379–383.
34 Chen P, Tanasijevic MJ, Schoenenberger RA, Fiskio J, Kuperman GJ, Bates DW. A computer-based intervention for improving the appropriateness of antiepileptic drug level monitoring. Am J Clin Pathol. 2003;119:432–438.
35 Shojania KG, Yokoe D, Platt R, Fiskio J, Ma’luf N, Bates DW. Reducing vancomycin use utilizing a computer guideline: Results of a randomized controlled trial. J Am Med Inform Assoc. 1998;5:554–562.
36 Wilson GA, McDonald CJ, McCabe GP Jr. The effect of immediate access to a computerized medical record on physician test ordering: A controlled clinical trial in the emergency room. Am J Public Health. 1982;72:698–702.
37 Chertow GM, Lee J, Kuperman GJ, et al. Guided medication dosing for inpatients with renal insufficiency. JAMA. 2001;286:2839–2844.
38 Bates DW, Teich JM, Lee J, et al. The impact of computerized physician order entry on medication error prevention. J Am Med Inform Assoc. 1999;6:313–321.
39 Teich JM, Merchia PR, Schmiz JL, Kuperman GJ, Spurr CD, Bates DW. Effects of computerized physician order entry on prescribing practices. Arch Intern Med. 2000;160:2741–2747.
40 Chassin MR, Galvin RW. The urgent need to improve health care quality. Institute of Medicine National Roundtable on Health Care Quality. JAMA. 1998;280:1000–1005.
41 Fischer MA, Solomon DH, Teich JM, Avorn J. Conversion from intravenous to oral medications: Assessment of a computerized intervention for hospitalized patients. Arch Intern Med. 2003;163:2585–2589.
42 McCaffrey TV. Using hand-held computing devices in the practice of otolaryngology-head and neck surgery. Curr Opin Otolaryngol Head Neck Surg. 2003;11:156–159.
43 McCreadie SR, Stevenson JG, Sweet BV, Kramer M. Using personal digital assistants to access drug information. Am J Health Syst Pharm. 2002;59:1340–1343.
© 2008 Association of American Medical Colleges