WAGER, KAREN A. DBA; SCHAFFNER, MARILYN J. PhD, RN, CGRN; FOULOIS, BONNIE RN; SWANSON KAZLEY, ABBY PhD; PARKER, CHERYL PhD, RN; WALO, HELENA RN
Hospitals and other healthcare organizations are in the midst of implementing health information technology, including electronic health record (EHR) systems, as a means to improve patient safety and quality, reduce costs, and increase efficiency. Electronic health record systems are not new. They have been in various stages of development and use for more than 30 years.1-3 Yet, it was not until the Institute of Medicine published its two landmark reports4,5 on medical errors and quality that there was a groundswell of activity at the national level to further the widespread adoption and use of the EHR, primarily as a strategy for improving quality. Despite the growing interest, only an estimated 11% of hospitals have fully implemented EHR systems; another two-thirds have "partially" implemented EHR systems or are automating various components such as clinical documentation, medication administration, or provider order entry.6 A major barrier to EHR system adoption is the enormous investment required of a healthcare organization in resources, workflow redesign, and support. Too often, clinical information systems (such as EHRs) are implemented without fully assessing patient care workflow and without involving clinicians in the early stages of the design process. If the technology impedes the care process or is too cumbersome to use, clinicians will develop workarounds, which can in turn lead to errors and time delays in getting information documented in the patient's medical record, and can ultimately limit the system's impact on improving quality, safety, and efficiency.7
Having the right devices available to clinicians and patient care technicians for entering data into the patient's record at the point of care remains a challenge even among providers who have adopted sophisticated EHR systems. Healthcare leaders involved in implementing clinical information systems often face the dilemma of which data-entry device, or combination of devices, is ideal for facilitating the accurate and timely capture of critical data in the patient's record. If computer workstations are not readily available or the application requires a lot of typing and transcription, time delays, omissions, and errors in documentation can occur.8 An earlier study at a large southeastern hospital found that most nurses preferred bedside documentation, but reported that environmental and system barriers often prevented EHR documentation at the bedside.9
One area of particular challenge is capturing vital signs data. Vital signs (eg, blood pressure, temperature, pulse, respirations, and oxygen saturation [Spo2]) provide clinicians with an important marker of the patient's physiological status during hospitalization. Physicians and nurses rely on ready access to vital signs data when monitoring the patient's condition. Yet, vital signs are generally taken by patient care technicians, who may not have ready access to a computer workstation to enter the values.
Fischer and colleagues10 published a review paper summarizing the use of handheld devices in medicine. Their review article was limited to the use of PDAs, one specific type of mobile computing device, and their findings revealed that PDAs were being used primarily by physicians for retrieving patient data and accessing clinical knowledge aids. Wu and Straus11 conducted a systematic review of randomized controlled trials that evaluated the effects on practitioner performance or patient outcomes of handheld EHRs compared with either paper or desktop electronic medical records. Only two studies met their inclusion criteria.12,13 Both compared point-of-care documentation on a PDA with paper and found that the electronic format can increase the detail and quality of the data entered.
With regard to vital signs documentation, little is known about which data-entry method is best in facilitating the accurate capture and timely reporting of data in an EHR in the hospital setting. We found only one study that examined the impact of EHR systems on accuracy of vital signs documentation,14 and no studies examined the timeliness of vital signs data using different devices for entering the data into the patient's record. Gearing et al14 found 25.6% of vital sign sets had one or more errors when documented in paper medical records, and 14.9% had one or more errors when documented in an electronic medical record system. They did not report using handheld or mobile devices for entering data at the point of care. With patient safety a top priority in hospitals, it is critical that vital signs data be accurate, timely, and available to all clinicians and technicians involved in the patient's care. Delays, omissions, and transcription errors in vital signs data can result in inappropriate, delayed, or missed treatment15; thus, there is a need to study the impact of using different data-entry devices on the quality and timeliness of documentation.
The purpose of this study was to measure the accuracy and timeliness of vital signs data before and after the implementation of a clinical documentation system using four different data-entry devices: (1) a paper medical record system whereby vital signs were handwritten on a piece of paper and then hand transcribed to the patient's record (paper to paper), (2) a clinical documentation system with a "computer on wheels" workstation outside the patient's room whereby vital signs were handwritten on a piece of paper and then transcribed into a computer on wheels (paper to computer), (3) a clinical documentation system with a Tablet PC affixed to the vital signs monitor, a GE Dinamap machine (GE Healthcare, Waukesha, WI), whereby vital signs were immediately transcribed from the vital signs monitor to the Tablet PC (machine to computer), and (4) a clinical documentation system with a direct feed from the vital signs monitor to the Tablet PC. The fourth documentation system has yet to be tested.
Background and Study Setting
The study was conducted at the Medical University of South Carolina (MUSC), a freestanding academic health sciences center located in Charleston, SC. A level I trauma center, MUSC comprises 709 beds across four specialty hospitals: an adult hospital, a children's hospital, a psychiatric hospital, and a new 156-bed cardiovascular and digestive disease hospital (which opened in February 2008). The MUSC provides services to more than 33 000 inpatients per year and is home to a 500-physician practice plan. The organization employs nearly 10 000 staff and educates 3000 health professions students each year from six colleges: medicine, nursing, pharmacy, dental medicine, health professions, and graduate studies.
The clinical care component of MUSC's mission is to provide high-quality, cost-effective patient care in a safe environment. To aid in accomplishing this goal, MUSC is currently implementing a host of clinical applications as the organization moves toward a fully integrated EHR system across all inpatient settings. The MUSC has a fully operational EHR system in its ambulatory care clinics. The new clinical applications, all products of McKesson Corporation (San Francisco, CA), include a clinical documentation system, bar-coded medication administration, and computerized provider order entry (CPOE). Both the electronic clinical documentation system and the medication administration system were installed on most of the medical/surgical units in the MUSC adult hospital in 2007 and are being used in the new hospital. CPOE was implemented in the new cardiovascular and digestive disease hospital in 2009 and will be further deployed in the existing adult hospital in 2010.
When the MUSC leadership team made the strategic decision to invest long term in an integrated EHR system, key patient safety, quality, and efficiency goals for the project were established, along with metrics for assessing the value of the system. It was felt that the system should improve the accuracy and timeliness of documentation in the patient's record and optimize workflow. During the early stages of implementation, the team visited two other large level I and level II medical centers in the state that had implemented clinical applications similar to those to be implemented at the MUSC. These visits provided an opportunity to observe the importance of technology in facilitating the care process, including care provided by noncredentialed patient care technicians. For example, in one of the medical centers visited, patient care technicians were handwriting vital signs on a piece of paper and then reentering them into the clinical documentation system on a wall-mounted device in the patient's room. At the other site, staff were provided with a computer on wheels, but it seemed very cumbersome for the patient care technician to wheel the computer into the patient's room along with the vital signs machine. Thus, MUSC's leadership team decided to equip the nurses and patient care technicians with mobile devices (eg, Tablet PCs) to facilitate the documentation process.
The MUSC purchased two types of Tablet PCs, the Motion Computing LE1600 and the C5 (Motion Computing, Austin, TX).a The Motion Computing LE1600 is a general-purpose Tablet PC used by the patient care technicians in this study. The LE1600 has a 12.1-in screen, and data can be entered via a keyboard or through handwriting recognition. Information system field engineers attached the Tablet PC and vital signs monitor to a single GCX (GCX, Petaluma, CA) rolling pole with a 38-in post, 21-in base, six 4-in casters, and a 10-lb counterweight. The bottom portion of Figure 1 shows the positioning of the devices on the single pole, allowing the caregiver to easily move from room to room.
The goal of this project was to measure the impact of the new clinical documentation system (using three different data-entry modalities) on the accuracy and timeliness of vital signs data compared with a paper-based record system. In the fourth and final phase of the study, there will be a direct integration between the vital signs monitor and the clinical documentation application that will allow data to flow directly from the vital signs acquisition device to the EHR. The accuracy and timeliness of vital signs documentation in this final phase will be reported once the data integration functionality is operational.
An observational study design was chosen based on the aims of the study and review of the literature. The study was conducted in four adult inpatient medical/surgical units at this level I trauma hospital. At least 30 observations were conducted in each of the first three phases: (1) the paper medical record system, (2) the clinical documentation system with a computer-on-wheels workstation outside the patient's room (Figure 1), and (3) the clinical documentation system with a Tablet PC affixed to the vital signs monitor in the patient's room (Figure 2). The lead author trained six nurses and two nurse externs to conduct observations of patient care technicians as they made their routine vital sign rounds. (The patient care technicians had varied backgrounds, ranging from certified nursing assistants to those with high-school diplomas and 1 year of healthcare experience. All were trained and skilled in taking patient vital signs.) Five of the six nurses were employed as nursing informatics staff members. The other nurse observer participated in the study as part of a graduate course. The training was conducted in two stages to help ensure interrater reliability among observers. First, the observers met as a group with the lead author to discuss the purpose of the observations, the data collection worksheet, and how the observations were to be made. Observers were instructed to record the date and time the vital signs were taken and the time the vital signs were entered into the patient's record. The observers were also instructed to record the patient vital signs exactly as they appeared on the vital signs machine and at the same time the patient care technicians recorded the patient's vital signs.
They were told to check and double-check the accuracy of their observations and recordings by (1) reviewing the vital signs on the monitor, (2) writing them down immediately, and (3) verifying that each value was recorded correctly (check vital signs machine and recordings on worksheet). Following a trial run of observations, the observers met again with the trainer to discuss their experiences in conducting the observations and how they were using the observation worksheet and to clarify any questions they might have. The observation worksheet was modified slightly based on the initial observations and the debriefing from the pilot. For example, initially the recording worksheet did not include the time the patient care technician entered and left the patient's room. These two times were added to the observation worksheet based on observer recommendations. No differences were noted in the observers' (nurses vs nurse externs) comfort level with the new system.
Institutional review board approval and nurse manager support from the respective units were obtained before beginning the study. The observers told the patient care technicians that they were observing workflow and were interested in knowing how long it takes to document vital signs using different data-entry modalities. All patient care technicians invited to participate agreed, except for one.
The research team developed, pilot tested, and refined the data collection instrument for recording observations. Five vital signs were defined: blood pressure, temperature, heart rate, Spo2, and respiration rate.b The observers were instructed to record on the data collection form the patient's blood pressure, temperature, heart rate, and Spo2 exactly as they appeared on the vital signs monitor. The observers then compared the vital signs that they wrote on the data collection form with the vital signs documented in the patient's medical record. A documentation error was defined as any discrepancy or difference between what the observer documented on the data collection form and the vital sign recorded by the patient care technician in the medical record. The two categories of errors were transcription errors and omission errors. A transcription error was defined as a vital sign value recorded in the patient's record that differed from the value recorded on the observer's worksheet. For example, if the blood pressure recorded in the patient's record read "118/65" and the observer's worksheet read "128/65," this would be considered a transcription error. An error of omission was defined as a vital sign value left blank in the patient's record, although it had been recorded by the observer.
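The two error categories above amount to a simple classification rule. The sketch below is illustrative only; the study classified errors manually, and the function name and value formats are our own.

```python
def classify_entry(observed, charted):
    """Classify one charted vital sign against the observer's worksheet value.

    observed: value the observer copied from the vital signs monitor
    charted:  value found in the patient's medical record (None or "" if blank)
    Returns "correct", "omission", or "transcription".
    """
    if charted is None or str(charted).strip() == "":
        return "omission"       # value left blank in the record
    if str(charted).strip() != str(observed).strip():
        return "transcription"  # value differs from the observed value
    return "correct"

# Examples drawn from the definitions above:
print(classify_entry("128/65", "118/65"))  # transcription error
print(classify_entry("98.6", None))        # omission error
print(classify_entry("72", "72"))          # correct
```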
In addition to noting any documentation errors, the observers recorded the time (in minutes and seconds) that the patient care technician entered the room, the time the patient's vital signs were taken, the time the vital signs were entered into the patient's medical record, and the time the patient care technician left the room. The observers also noted if the patient was in isolation and whether the patient care technician recorded the vital signs directly into the patient's record or an interim sheet. Two authors (K.A.W. and B.F.) met with the observers as a group after each round of observations to ensure consistency in their understanding and use of the data collection forms. A debriefing and discussion ensued at the completion of each stage of the project. During the debriefing, the observers shared their impressions of the process and any interesting observations they noted during their rounds with the patient care technicians.
Caution was taken regarding the timing of the observations in stages 2 and 3. The clinical documentation system was newly installed, and observations were not begun until approximately 3 to 4 weeks after it was implemented. It was important for staff to feel comfortable and proficient using the new clinical documentation system and data-entry devices before being observed. Therefore, observations were not made until patient care technicians indicated that they were comfortable using the new electronic documentation system to record vital signs. Similarly, observations did not begin in stage 3 until the patient care technicians indicated they were comfortable using the Tablet PC devices affixed to the vital signs monitor.
The primary outcomes of interest were the differences in documentation error rates and the delay in documenting the vital signs in the patient's record in each of the three stages of the study. A total of 270 instances of the vital signs process were observed. (One instance included the four vital sign recordings made on a given patient at a specific date and time.) A dichotomous variable was created for each record to indicate whether the record had a documentation error, defined as any discrepancy between the vital signs recorded on the observers' worksheets and the vital signs documented in the patient's medical record. If a record had two discrepancies between what the observer noted and what was documented in the patient's record (eg, an error in the recording of both blood pressure and heart rate), it was counted as one documentation error. The data were entered into SPSS (SPSS Inc, Chicago, IL). A one-way analysis of variance (ANOVA) and a χ2 test were used to measure differences in documentation error rates between the stages. The latency, or mean time difference between when the vital signs were taken and when the data were recorded in the patient's record, was also measured. A one-way ANOVA was conducted to measure the mean time differences between the three stages, and the post hoc Bonferroni test was used to identify which pairs of stages differed. P < .05 was considered statistically significant.
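The latency analysis described above can be sketched with SciPy. The study used SPSS; the latency values below are synthetic illustrations (the raw observations are not reproduced here), so only the procedure, not the numbers, mirrors the study.

```python
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind

# Synthetic latency samples in seconds (illustrative only; not the study data).
latency = {
    "stage1_paper":  [60, 90, 84, 100, 75, 95],
    "stage2_cow":    [500, 600, 555, 520, 580, 575],
    "stage3_tablet": [30, 40, 35, 38, 32, 36],
}

# Omnibus one-way ANOVA across the three stages.
f_stat, p_value = f_oneway(*latency.values())
print(f"ANOVA: F={f_stat:.1f}, P={p_value:.2g}")

# Post hoc pairwise comparisons with a Bonferroni-corrected alpha.
pairs = list(combinations(latency, 2))
alpha = 0.05 / len(pairs)
for a, b in pairs:
    t, p = ttest_ind(latency[a], latency[b])
    print(f"{a} vs {b}: P={p:.2g} ({'significant' if p < alpha else 'ns'})")
```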
A total of 270 instances of vital signs were observed during the course of the three stages of this study. During phase 1, the observers found that the patient care technicians typically handwrote the vital signs into the patient's paper medical record at the point of care. In phase 2, when the patient care technicians were fully trained on the electronic clinical documentation system, most handwrote the patient vital signs on a sheet of paper at the bedside and then transcribed the vital signs into the electronic clinical documentation system using the computer on wheels outside the patient's room. Since the computers on wheels were often occupied by other nurses and staff members, it was not uncommon for the patient care technicians to continue their rounds and later enter the vital signs data when they had time or when a computer workstation was available. In phase 3, when the patient care technicians were equipped with Tablet PCs affixed to the vital signs machine, the observers noted that the technicians generally took the patient vital signs and recorded them directly into the Tablet PC, eliminating the paper documentation and transcription steps.
Table 1 depicts the number of observations, the rate of documentation error, and the mean latency rates in all three stages. In stage 1 (paper-based medical record), 113 observations were made; 33 in stage 2 (clinical documentation system with computer on wheels located outside the patient's room); and 124 in stage 3 (clinical documentation system with Tablet PC affixed to vital signs monitor in patient's room). For an effect size of 0.25 and α of .05, the power of stages 1 and 3 is greater than 0.85; for stage 2, the power is 0.58. At the onset of the study, 110 to 125 observations were planned for each stage. However, this proved infeasible in stage 2, when physicians were frustrated that vital signs were not in the patient's record at the time of their rounds. Physicians expressed concerns about patient care to the senior leadership team, and consequently, the research team was instructed to rapidly equip the patient care technicians with Tablet PCs to enable documentation of vital signs at the bedside. All observations within each stage were collected over a 1- to 2-week period.
A 16.8% error rate was noted in stage 1. Stage 2 had an error rate of 15.2%, and stage 3, 5.6%. There was a statistically significant difference between stage 1, the paper-based documentation system, and stage 3, when the patient care technicians used the Tablet PC affixed to the vital signs monitor in the patient's room to record vital signs in the clinical documentation system (P < .05). The documentation error rate with the paper-based system was 16.8%, compared with an error rate of 5.6% with the Tablet PC. No statistically significant differences were found in documentation error rates between stages 1 and 2, or stages 2 and 3. Of the total 31 documentation errors, 28 (90.3%) were transcription errors.
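The stage-wise error comparison can be reproduced from the reported figures. In the sketch below, the error counts are back-calculated from the percentages and sample sizes in the text (16.8% of 113 ≈ 19; 15.2% of 33 ≈ 5; 5.6% of 124 ≈ 7), and the use of Yates' continuity correction is our assumption, not something the study specifies.

```python
from scipy.stats import chi2_contingency

# Error counts back-calculated from the reported rates and sample sizes.
observations = {"stage1": 113, "stage2": 33, "stage3": 124}
errors       = {"stage1": 19,  "stage2": 5,  "stage3": 7}

def compare(a, b):
    """2x2 chi-square test of error rate between two stages; returns the P value."""
    table = [[errors[a], observations[a] - errors[a]],
             [errors[b], observations[b] - errors[b]]]
    chi2, p, dof, expected = chi2_contingency(table)  # Yates-corrected for 2x2
    return p

print(f"stage1 vs stage3: P={compare('stage1', 'stage3'):.3f}")  # significant
print(f"stage1 vs stage2: P={compare('stage1', 'stage2'):.3f}")  # not significant
print(f"stage2 vs stage3: P={compare('stage2', 'stage3'):.3f}")  # not significant
```

The pattern of significance matches the text: only the stage 1 versus stage 3 difference reaches P < .05.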
The mean time difference between the time vital signs were taken and when the data were recorded in the patient's record varied significantly between stage 1 (the paper-based system) and stage 2 (clinical documentation system where computer on wheels was in the hallway), and between stages 2 and 3 (P < .001). With paper records, it took an average of 1 minute 24 seconds (SD, 2:17) for the patient care technician to record the vital signs into the patient's record after taking them. It took slightly more than half a minute to do so with the Tablet PC (35 seconds; SD, 1:42). In stage 2, when the workstation was outside the patient's room, the mean latency time was 9 minutes 15 seconds, with a maximum time of 27 minutes (SD, 7:25). No statistically significant differences were found between stage 1 (paper medical record system) and stage 3 (when Tablet PC was affixed to the pole of the vital signs monitor).
Findings from this study suggest that equipping patient care technicians with an electronic clinical documentation system, without appropriate data-entry devices to enter vital signs directly at the point of care, can result in significant time delays in getting the vital signs data into the patient's medical record. Having computers on wheels (or carts) available for clinicians and technicians to use outside the patient's room is not sufficient. At the MUSC, whose facility is now approximately 50 years old, there is insufficient space to easily wheel the computer workstations (with medication drawers) into the patient's room, and it is nearly impossible for patient care technicians to maneuver both the computer on wheels and the vital signs monitor. With either a paper medical record or the Tablet PC affixed to the vital signs monitor, the patient care technicians could easily document vital signs at the point of care. Because of the burden of maneuvering the larger computer on wheels and the vital signs monitor into the patient's room, many patient care technicians chose to write the vital signs on an interim sheet and then transcribe them into the clinical documentation system when they had time or a workstation was available, introducing another opportunity for documentation errors and a significant time delay in getting vital signs into the patient's medical record.
The quality of vital signs documentation improved by providing patient care technicians with a Tablet PC affixed to the vital signs monitor. The percentage of patients' records with one or more documentation errors dropped from 16.8% and 15.2% (paper record and clinical documentation system with the computer on wheels outside the room, respectively) to 5.6% when patient care technicians were equipped with Tablet PCs. In this study, the decrease in documentation error rates from the paper record to the clinical documentation system with the Tablet PC was statistically significant.
The patient care technicians took an average of 9 minutes 15 seconds to get the vital signs into the patient's record when the computer on wheels was outside the room, compared with an average of 1 minute 24 seconds with paper records and 35 seconds when equipped with a Tablet PC. Computers outside the patient's room were often occupied by nurses and other clinicians. Thus, the patient care technicians believed it was best to continue making their vital sign rounds and enter the data into the patient's record when a workstation was free or they had time to get to it. Given that vital signs are typically taken every 4 to 6 hours in medical/surgical units in hospitals, delays in getting the vital signs into the patient's record and making them readily available to all clinicians involved in the patient's care have important patient safety implications.
These findings also have important implications for healthcare leaders whose institutions are in the midst of implementing various clinical applications, including clinical documentation, bar-coded medication administration, and CPOE or other EHR functions. Providing healthcare staff with Tablet PCs can optimize patient care workflow processes and promote data entry at the bedside. PDAs and other handheld devices are options; however, their small screens can be difficult to read, particularly for the aging nursing workforce, and a single-purpose device restricts the amount of patient information available to the clinician at the bedside.
In conducting this study, the observers made some important observations that are referred to as "unintended consequences," which warrant discussion. First, the observers found that a number of the vital signs monitor machines had not been sufficiently charged (eg, battery was running low), and in some cases, faulty machines had not been reported to biomedical engineering for repair. In the fourth phase of this study, there will be an automatic direct feed of the data from the vital signs monitor to the clinical documentation system, and it is critical that the vital signs monitors are working properly. On several occasions, vital signs monitors that appeared to be working had to be plugged in or recharged more than once during the patient care technicians' rounds, which generally included six to eight patients. Ensuring that all equipment is working properly is important, and staff needs to know how and to whom to report problems so that issues may be resolved quickly. The adoption of EHR systems and electronic linking of biomedical machines with clinical documentation systems makes the technical functioning of the devices and machines increasingly important.
The observers also discovered that the patient care technicians were neither assessing patient pain during their rounds nor notifying the nurse if the patient's vital signs were outside the reference range on a consistent basis. With the new clinical documentation system, the patient care technician is alerted if the vital signs are outside the reference range. Routine reports provide nursing leaders with valuable information regarding the completeness of the documentation, including the frequency of pain assessments. As a result of the observations, retraining was completed to ensure the patient care technician assesses the patient's pain with every vital sign round and notifies the nurse of vital signs outside the reference range. These unintended consequences proved to be extremely useful to nurses, the nurse managers, and the nursing informatics project team supporting the system.
Observational studies such as this can be extremely useful to healthcare organizations. However, they can introduce a Hawthorne effect in the person being observed. To minimize this effect, observers who were "familiar faces" to the patient care technicians, and not generally perceived as "outsiders," were employed. Another possible limitation lies in differences among the patient care technicians and their individual error rates. Although all technicians received consistent training, individual differences in performance remained, which may mimic "real life," where some individuals are higher performers than others.
It is recognized that the nurse observers could also have made errors in recording the vital signs from the vital signs monitor to the data collection form. The observers were instructed to check and double-check the accuracy of their work; however, observer error remains possible and is a limitation of this study. The number of observations made in stage 2 was limited when delays in getting vital signs into the patient's record caused enormous concern among the medical staff; thus, the small sample size is a limitation. Despite the lower number of observations in stage 2, the finding that having a computer on wheels outside the patient's room was a problem is an important one. Other organizations in the throes of implementing electronic clinical documentation systems should be aware of workflow issues associated with having patient care technicians rely on data-entry devices located outside the patient's room.
The transition from paper to electronic documentation systems can be challenging. Findings from this study suggest it is critical that healthcare leaders provide their staff with the devices they need to document patient care at the bedside and facilitate workflow. Observing patient care technicians while making vital sign rounds can be enormously helpful in detecting bottlenecks in the process and workarounds. Providing patient care technicians with Tablet PCs affixed to the vital signs monitor improved the quality and timeliness of documentation. Future research is needed to further explore the ideal device or combination of devices for enabling clinicians and patient care technicians to effectively document patient care in a timely manner.
The authors thank the MUSC nurse informatics staff and nurse externs for their assistance in conducting the observations.
1. Institute of Medicine. The Computer-Based Patient Record: An Essential Technology for Health Care. Washington, DC: National Academy Press; 1991.
2. IOM Committee on Improving the Patient Record. The Computer-Based Patient Record: An Essential Technology for Health Care. Revised Edition. Washington, DC: National Academy Press; 1997.
3. Brailer D, Terasawa E. Use and Adoption of Computer-Based Patient Records. Oakland, CA: California HealthCare Foundation; 2003:1-42.
4. Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
5. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
6. Continued Progress: Hospital Use of Information Technology. Chicago, IL: American Hospital Association; 2007.
7. Poon E, Blumenthal D, Jaggi T, Honour M, Balas D, Kaushal R. Overcoming barriers to adopting and implementing computerized physician order entry systems in US hospitals. Health Aff. 2004;23(4):184-190.
8. Lee TT. Nurses' experiences using a nursing information system: early stage of technology implementation. Comput Inform Nurs. 2007;25(5):294-300.
9. Moody LE, Slocumb E, Berg B, Jackson D. Electronic health records documentation in nursing: nurses' perceptions, attitudes, and preferences. Comput Inform Nurs. 2004;22(6):337-344.
10. Fischer S, Stewart TE, Mehta S, Wax R, Lapinsky SE. Handheld computers in medicine. J Am Med Inform Assoc. 2003;10:139-149.
11. Wu RC, Straus SE. Evidence for handheld electronic medical records in improving care: a systematic review. BMC Med Inform Decis Mak. 2006;6:26.
12. VanDenKerkhof E, Goldstein D, Rimmer M, Tod D, Lee H. Evaluation of hand-held computers compared to pen and paper for documentation on an acute pain service. Acute Pain. 2004;6:115-121.
13. Stengel D, Bauwens K, Walter M, Kopfer T, Ekkernkamp A. Comparison of handheld computer-assisted and conventional paper chart documentation of medical records. A randomized, controlled trial. J Bone Joint Surg Am. 2004;86-A:553-560.
14. Gearing P, Olney CM, Davis K, Loranzo D, Smith LB, Friedman B. Enhancing patient safety through electronic medical record documentation of vital signs. J Healthc Inf Manag. 2006;20(4):40-45.
15. Smith LB, Banner L, Olney CM, Loranzo D, Friedman B. Connected-care: enhancing patient safety through automated vital signs data upload. Comput Inform Nurs. 2007;25(5):312-313.
aThe Motion Computing C5 device has an integrated bar-code scanner and is easily disinfected with standard germicidal solutions. It was chosen as the primary Tablet PC device to accommodate bar-coded medication administration in the next phase of the EHR rollout.
bThe observers initially counted and recorded patient respirations on the data collection form. However, we opted not to include them in our analyses since they are rather subjective (as compared with recordings on the vital signs monitor), and there is no easy way to objectively verify their accuracy in an observational study such as this one.
© 2010 Lippincott Williams & Wilkins, Inc.