Human factors issues associated with standard patient monitors used in anesthesia are well documented.1,2 The monitors are often located in an awkward position, away from the patient, so that the anesthesiologist cannot monitor the patient's clinical signs while keeping the vital signs within view. Furthermore, the anesthesiologist may be unable to see the monitor when performing physically constraining tasks such as laryngoscopy, drawing up drugs, or repositioning the patient. Auditory alarms are intended to prevent the anesthesiologist from missing important changes on the monitor, but alarms are often false, uninformative, or difficult to distinguish, and are sometimes turned off.3,4
One approach to solving these problems is to use an advanced monitoring display that provides the anesthesiologist with continuous information, removing the need to review information repeatedly on the visual monitor.4 The audible pulse oximetry tone is an example of such a display because it conveys the patient's oxygen saturation, heart rate, and rhythm, without the anesthesiologist having to review the monitor. Given the success of audible pulse oximetry, researchers have investigated presenting other vital signs using auditory, vibrotactile, or head-mounted display (HMD) technology.5
HMDs superimpose a visual information display over their wearers’ field of view.6 They are similar to the head-up displays often used in aviation that allow pilots to detect unexpected changes to flight instrumentation faster than with traditional “head-down” cockpit displays.7,8 Several authors have proposed that the benefits found with head-up displays may also apply to HMDs in anesthesia.3,4,9–12 HMDs could be a solution to the ergonomic issues associated with the location of patient monitors by letting the anesthesiologist perform physically constraining tasks while continuing to monitor the patient. In contrast to traditional “slave” monitoring displays, the HMD does not require the anesthesiologist to look away from the primary task to see the information presented. Furthermore, the HMD is always available even in locations where both the primary and slave displays are out of view.
Monitoring with HMDs has been evaluated in both clinical and simulated operating room (OR) environments.4,13 After briefly experiencing the HMD in the OR, anesthesiologists from 2 independent clinical studies reported that the HMD had potential despite its technological limitations.10* Simulator-based studies have indicated that an HMD allows anesthesiologists to spend more time focusing on the patient and less time looking at the monitor and anesthesia workstation.14,15 Furthermore, anesthesiologists using the HMD when they were busy and physically constrained detected changes in their simulated patient's vital signs faster than with a standard monitor.15† Similar benefits have also been found with surgeons in simulated environments.16,17 Finally, anesthesiologists report that they detect vital sign changes faster with an HMD than with a standard monitoring display.15,18
The 2 previously reported clinical HMD evaluations10* had serious limitations. First, 1 study was reported in an abstract* and does not seem to have been published in a peer-reviewed journal. Second, in both studies, there were no objective measures of anesthesiologists’ performance or behavioral changes when using the HMD, nor were there control conditions for comparison. Third, no waveforms were presented on the HMD in 1 study,10 and no CO2 information in the other.* Fourth, the HMD was used for a short time; participants experienced only 1 case10 or 1 induction* with the HMD. Finally, the participants were tethered to the anesthesia workstation by a connecting cable for the HMD unit, and so were not free to move about the OR.10*
Although simulator-based studies have demonstrated benefits for monitoring with HMDs, there have been no controlled studies to determine whether the results from the simulator will generalize to the clinical environment.
We report the results of a prospective, controlled clinical evaluation of monitoring with HMDs. Our aim was to examine whether the HMD would free the anesthesiologist's visual attention from the anesthesia workstation and allow him or her to spend more time looking at the patient, as was found in simulator-based studies.14,15
METHODS
Participants
This study received Human Research and Ethics Committee approvals from the Royal Adelaide Hospital (RAH) and The University of Queensland, and was registered with the Australian New Zealand Clinical Trials Registry (ACTRN12608000245392). Six RAH anesthesiologists volunteered for the study and provided written consent. The selection criteria included board certification (attending anesthesiologists), regular engagement in urology OR lists, and prior participation in at least 1 simulator-based HMD study conducted at the RAH over the previous 2 years.15,18
Design
A 2 (display) × 3 (trial) repeated-measures design was used. Display referred to the monitoring technologies available to participants: the control condition was the standard patient monitor, and the HMD condition was the standard patient monitor plus the HMD (Fig. 1). Trial referred to the first, second, or third case performed by participants for each condition. Each participant provided anesthesia to 6 patients, corresponding to the 6 combinations of the experimental design.
Figure 1: An anesthesiologist manually ventilates a patient with a bag and mask during induction, while simultaneously monitoring the patient's vital signs using the head-mounted display (HMD). The HMD uses a single transparent monocle over the anesthesiologist's right eye to superimpose information over their field of view (Fig. 2). The battery pack and interfacing computer are contained within the backpack worn by the participant, and a head-mounted camera records the anesthesiologist's direction of gaze (Fig. 3).
The dependent variables were the percentage, frequency, and duration of participants’ head turns toward the anesthesia workstation and toward the patient/surgical field (6 variables in total). Percentage was the amount of time that participants spent looking at the anesthesia workstation or at the patient/surgical field as a percentage of the total case duration. Frequency measured the number of head turns per minute. Duration measured the mean time in seconds that participants spent looking toward the anesthesia workstation or patient/surgical field for each head turn. The anesthesia workstation included the standard patient monitor, ventilator, gas flows, volatile anesthetic dials, and electronic record-keeping system. The patient/surgical field included the patient, operating table, IV stands, surgical field, and the cystoscope monitor.
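To make the 3 metrics concrete, the following is a minimal sketch (Python) of how percentage, frequency, and duration could be computed from coded head-turn data. It assumes the coded data take the form of (start, end, gaze location) intervals in seconds; the names used are illustrative and are not those of the custom coding software used in the study.

```python
from collections import defaultdict

def monitoring_metrics(intervals, case_duration_s):
    """Compute percentage, frequency, and duration per gaze location.

    intervals: list of (start_s, end_s, location) tuples, one per head turn,
        where location is, e.g., "workstation" or "patient_surgical_field".
    case_duration_s: total duration of the analyzed case in seconds.
    """
    total_time = defaultdict(float)  # seconds spent looking at each location
    turn_count = defaultdict(int)    # head turns toward each location

    for start_s, end_s, location in intervals:
        total_time[location] += end_s - start_s
        turn_count[location] += 1

    metrics = {}
    for location, count in turn_count.items():
        metrics[location] = {
            # percentage of total case time spent looking at the location
            "percentage": 100.0 * total_time[location] / case_duration_s,
            # head turns per minute of case time
            "frequency_per_min": count / (case_duration_s / 60.0),
            # mean seconds spent looking per head turn
            "mean_duration_s": total_time[location] / count,
        }
    return metrics
```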
Case Selection
Cases in the urology theater were selected on the basis of specific inclusion and exclusion criteria that were intended to minimize the variability between cases and therefore increase statistical power. The 36 cases performed during the study were either surveillance cystoscopies with or without minor resections of bladder tumors (26 cases), or cystoscopies with urethral dilation or minor resections for improving urine flow (10 cases).
The inclusion criteria were (a) presentation for elective rigid cystoscopy with or without minor transurethral surgery, and (b) preassessment by an independent anesthesiologist as suitable for general anesthesia with laryngeal mask-type airway (LMA).
The exclusion criteria were (a) procedures likely to take >1 hour; (b) procedures involving radiological support (e.g., ureteroscopy and stenting procedures); (c) patients judged by an independent anesthesiologist as likely to benefit from either endotracheal intubation or regional anesthesia; and (d) patients unable or unwilling to provide consent.
Apparatus
Participants in the HMD conditions used a Nomad ND2000™ (Microvision, Redmond, WA) optical see-through HMD consisting of a head-mount and battery pack. The head-mount housed a scanning laser that projected a monochrome red display over the participant's field of view using a single transparent monocle in front of their right eye. The battery pack was connected to a Sony Vaio U50™ handheld computer, both of which were contained in a backpack. The equipment operated wirelessly, so that participants were free to move about the OR.
Patients were monitored with a Philips IntelliVue™ MP70 monitor attached to a Philips Anesthetic Gases Module (Philips Healthcare, Andover, MA). The HMD displayed the same vital signs as presented on the MP70 (Fig. 2) with the exception of the ST segment depression index, noninvasive arterial blood pressure (NIBP) cycle state, and time until next NIBP sample (Fig. 3). The MP70 did not export the CO2 waveform; therefore, an independent Philips IntelliVue™ MP30 monitor was used to measure and present a duplicate CO2 waveform on the HMD. Tidal and minute volumes were measured by the Datex-Ohmeda Aestiva/5 ventilator (GE Healthcare, Madison, WI) but were not available on the HMD.
Figure 2: Simulated image showing the participant's view while watching the patient, monitoring the surgical field, and monitoring the patient's vital signs with the head-mounted display (HMD). The HMD displayed the patient's electrocardiogram, plethysmogram, and capnogram waveforms; heart rate; pulse rate, SpO2, and perfusion; end-tidal and inspired CO2, anesthetic agent, O2, and N2O; respiratory rate; noninvasive blood pressure; minimum alveolar concentration (MAC); current time; and technical and patient alarms.
Figure 3: Screen capture of the video data recorded for the study showing the same scene as in Figure 1. Clockwise from top left: view from a head-mounted camera worn by the participant, a field camera showing the drug cart and anesthetic area, the Philips IntelliVue™ MP70 patient monitor display, and a field camera showing the patient and anesthetic area.
A laptop computer interfaced with the MP70 and MP30 patient monitors using the Philips Data Export Protocol over Ethernet.19 The computer streamed the patients’ vital signs to the U50 inside the backpack using an 802.11a wireless network. The experimenters developed custom software that interfaced with the patient monitors and generated the monitoring display for the HMD.
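As a rough sketch of the streaming architecture only, the fragment below forwards already-parsed vital-sign samples from the interfacing laptop to the handheld computer over UDP. The host address, port, and read_sample function are hypothetical, and parsing of the Philips Data Export Protocol (handled by the study's custom software) is not shown.

```python
import json
import socket
import time

# Hypothetical address of the handheld computer inside the backpack.
HMD_HOST, HMD_PORT = "192.168.0.20", 9000

def relay_vitals(read_sample, interval_s=1.0):
    """Forward parsed vital-sign samples to the HMD computer over UDP.

    read_sample: caller-supplied function returning a dict of current values,
        e.g., {"HR": 72, "SpO2": 98, "etCO2": 35}; parsing of the monitor's
        export protocol is assumed to happen inside it.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        while True:
            sample = read_sample()
            sock.sendto(json.dumps(sample).encode("utf-8"),
                        (HMD_HOST, HMD_PORT))
            time.sleep(interval_s)
    finally:
        sock.close()
```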
Experimental data were collected in audio-video and questionnaire formats. Video data were recorded as digital files in a quad display format (Fig. 3) consisting of 2 field cameras from corners of the room, a head-mounted camera attached to the head-mount worn by participants, and the MP70 screen. Audio was recorded using a lapel microphone attached to the backpack and a boom microphone attached to one of the field cameras.
Procedure
The anesthesiologists participated in 3 phases: orientation, data collection, and debriefing. During the orientation phase, participants completed a background questionnaire; were informed about the data collection process; and were given refresher training on how to wear, focus, and use the HMD. Participants spent 30 minutes in a simulated OR practicing monitoring with the HMD while they viewed a range of critical events. This training was provided in addition to their exposure to the HMD during prior studies.
During the data collection phase, participants provided anesthesia with and without the HMD over a 4- to 12-week period. The initial display condition (control versus HMD) was selected randomly and then alternated for every consecutive case. Participants performed no more than 2 cases for the study in 1 day.
Before the start of each case, 1 experimenter (anesthesiologist) sought written informed consent from the patient, and another experimenter sought verbal consent from the nursing and surgical staff. Participants were instructed to provide anesthesia as they would during normal practice (e.g., type of anesthetic and airway).
An experimenter set up the study equipment in the OR at the start of the case. Video was recorded from the moment the patient entered the OR until they were moved to the postanesthetic care unit after the procedure. The experimenter provided the participant with the backpack and HMD after the emergency drugs were prepared and remained near the equipment trolley for the duration of the case. In the control condition cases, the HMD monocle was removed but participants still wore the head-mount and backpack. After each case, participants completed a postsession questionnaire.
In the debriefing phase, participants completed a postexperiment questionnaire, were debriefed by an experimenter, and were given a small gift for their time.
Data Preparation
The period of video data used in the statistical analyses began when the first drug was administered and ended when the last monitor was removed (e.g., capnography). The participants’ monitoring behavior was coded from the video data in 3 steps by 1 experienced coder (DL) using a custom software tool (as in 2 prior HMD studies15). First, changes in where participants were looking as each case unfolded were recorded under 1 of 3 gaze location categories: anesthesia workstation (including gas and ventilator controls, patient monitors, and the electronic record-keeping system), patient/surgical field (including patient head and airway, IV line, surgical draped area, surgeon, and cystoscope display), and other (including reading materials, OR staff away from the surgical field, and drug cart). Second, each case was divided into 3 main phases: induction (from when the first drug was administered), maintenance (starting at onset of surgery), and emergence (from the cessation of volatile anesthetic until all monitoring was removed). Induction was further subdivided into drugs (when the first drug was administered), LMA placement (when the anesthesiologist took the mask from the anesthetic nurse), and draping (when the anesthesiologist helped reposition the patient). Finally, the percentage, frequency, and duration metrics were calculated from the coded head-turning data for each anesthetic phase.
Analysis
Differences in the percentage, frequency, and duration metrics from the head-turning data were independently tested for significance with Statistica™ 8 (StatSoft, Tulsa, OK) using a repeated-measures analysis of variance for each measure with α = 0.05, 2-tailed. The factors were display (control, HMD) × trial (first, second, third) × phase (induction [drugs], induction [LMA placement], induction [draping], maintenance, emergence) × gaze location (anesthesia workstation, patient/surgical field). Planned comparisons were used to test the differences between display conditions independently for each gaze location (averaging over the 5 phases).
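For illustration, an equivalent repeated-measures analysis of the percentage metric could be specified in Python with statsmodels rather than Statistica, as in the sketch below; the file name and column names are assumptions, and the data frame must contain one row per combination of participant, display, trial, phase, and gaze location.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical file: one row per participant x display x trial x phase x
# gaze location, with the percentage metric in the "percent" column.
df = pd.read_csv("head_turn_metrics.csv")

# Fully within-subjects (repeated-measures) ANOVA; requires balanced data.
anova = AnovaRM(
    df,
    depvar="percent",
    subject="participant",
    within=["display", "trial", "phase", "gaze"],
).fit()
print(anova)
```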
For questionnaire responses, if a question was asked in both display conditions, then differences in mean Likert scale values between the 2 conditions were tested for significance with Statistica using t tests of dependent samples with α = 0.05, 2-tailed. Responses to questions that did not have a control condition baseline were tested for significance with Statistica using single-sample t tests against the neutral response value of 4, with α = 0.05, 2-tailed. For analyses of postsession questionnaires where participants performed 3 cases in each display condition, responses for each display were averaged.
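The 2 questionnaire tests could likewise be run with SciPy, as in this minimal sketch; the function names and arguments are illustrative, and each argument is a sequence of per-participant mean ratings.

```python
from scipy import stats

def paired_display_test(control_means, hmd_means):
    """Dependent-samples t test between display conditions for a question
    asked in both; each sequence holds per-participant means (averaged over
    the 3 cases in that condition), in the same participant order."""
    return stats.ttest_rel(control_means, hmd_means)

def neutral_baseline_test(hmd_means, neutral=4.0):
    """Single-sample t test of ratings against the neutral response (4)."""
    return stats.ttest_1samp(hmd_means, popmean=neutral)
```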
RESULTS
Monitoring Behavior
The duration of cases ranged from 17 to 75 minutes with a median of 31 minutes. The frequency of anesthesiologists’ looks toward the anesthesia workstation and patient/surgical field, as well as the percentage of time they looked at each, and the average duration of each look were calculated from 16,342 head turns coded in 22 hours of video (Table 1).
Table 1: Monitoring Behavior Variables (Percentage, Frequency, Duration) Averaged Across All Participants for Each Direction of Gaze, Display Condition, and Phase
For the percentage of time the anesthesiologist was looking toward an object, there were main effects of phase (P < 0.001) and gaze location (P < 0.001) but no effect of display or trial (Fig. 4). Moreover, there was a significant interaction between phase and gaze location (P < 0.001) and between display and gaze location (P < 0.001). Planned comparisons showed that participants with the HMD spent less time looking at the anesthesia workstation (21.0% vs 25.3%, P = 0.003) and more time looking at the patient/surgical field (55.9% vs 51.5%, P = 0.014) than in the control condition.
Figure 4: The percentage of time participants spent looking toward the anesthesia workstation and patient/surgical field for both display conditions. Participants with the head-mounted display (HMD) spent a smaller percentage of time looking toward the anesthesia workstation (P = 0.003) and a larger percentage of time looking toward the patient/surgical field (P = 0.014) than in the control condition. Error bars indicate 95% confidence intervals.
For the frequency of head turns toward an object, there were main effects of phase (P = 0.018) and gaze location (P = 0.011), a significant interaction between phase and gaze location (P = 0.002), but no effects of display or trial. Participants looked more frequently toward the patient/surgical field (5.0 head turns/min) than toward the anesthesia workstation (3.9 head turns/min).
For the duration of looks toward an object, there was a main effect of gaze location (P = 0.003) and a significant interaction between phase and gaze location (P = 0.019), but no effect of display, trial, or phase. Participants looked at the anesthesia workstation for an average of 3.7 seconds per head turn and at the patient/surgical field for an average of 7.2 seconds per head turn.
Two critical events occurred during the study: there was an episode of regurgitation during 1 control case and another episode of regurgitation during an HMD case. Both cases were included in the overall analysis. There were not enough cases for meaningful statistical comparison of the effect of HMD use on behavior during critical events (but a descriptive analysis of the 2 events was performed20).
Questionnaires
Table 2 shows the participants’ responses on Likert scales in postsession and postexperiment questionnaires. There were no significant differences in participants’ responses between the 2 display conditions for perceived ease of monitoring, speed of abnormality detection, or usefulness of the anesthesia machine. However, participants rated the standard patient monitor as being less useful when they used the HMD (5.2 vs 6.2, P = 0.030).
Table 2: Means of Participants’ Responses to Questionnaires on a Likert Scale from 1 to 7, Where 1 = Very Difficult/Slowly/Not Useful/etc., 4 = Neutral, and 7 = Very Easy/Quickly/Useful
On a Likert scale from 1 (useless) to 7 (very useful), participants rated the HMD as being moderately useful (5.4 vs neutrality at 4.0, P = 0.013). There was a tendency toward rating the HMD as comfortable to read (4.9 vs neutrality at 4.0, P = 0.085) and easy to monitor (5.1, P = 0.065), which did not reach significance with this sample size. The responses to postexperiment questionnaires were not significantly different from neutral (4.0), indicating that participants did not have significantly positive or negative views about the HMD.
In questionnaire free-form responses, participants indicated that they liked being able to monitor vital signs with the HMD anywhere in the OR without having to turn around, but disliked wearing the experimental equipment because of its weight and bulk. Issues with HMD use included difficulty reading information presented on the display (focusing and visual interference) and maneuvering in tight spaces with the experimental equipment. Participants also noted that displaying the NIBP cycle state, time until next blood pressure sample, and tidal volumes on the HMD would have been useful.
CONCLUSIONS
The head-turning results suggest that HMDs could help anesthesiologists free their attention from the patient monitor and focus on monitoring the patient's clinical signs and the surgical field. When participants wore the HMD, they spent less time looking toward the anesthesia workstation and more time looking toward the patient and surgical field. The increase in time spent looking at the patient and surgical field, at the monitor's expense, is consistent with findings from a full-scale simulation-based study using the same device.15
Participants considered the HMD to be moderately useful. When the HMD was available, the standard patient monitor was considered relatively less useful than when only standard monitoring was available. In contrast, the anesthesia machine presented information not available on the HMD, and therefore, it was considered equally useful in both display conditions.
The HMD did not change how often participants looked in a certain direction or for how long. This result suggests that although the HMD reduced the total amount of time that anesthesiologists looked at the anesthesia workstation, it did not dramatically alter their pattern of head turning during routine monitoring, such as by completely eliminating the need to look at the workstation. The reduction in time spent looking toward the workstation with the HMD can be compared with the results of other studies examining the effects of variables that represent major changes in practice, such as expertise,21 fatigue,22 and intraoperative reading.23 Although participants’ gaze locations were coded differently in each study, some comparisons are possible. In this experiment, the HMD provided a benefit similar to that of expertise in redirecting the anesthesiologists’ looks away from the anesthesia workstation and toward the patient and surgical field.21 In contrast, a time-consuming activity such as intraoperative reading does not change the amount of time anesthesiologists spend observing monitors or the surgical field,23 whereas the HMD does change scanning patterns to some degree.
During situations that have a demonstrated effect on anesthesiologists’ monitoring behaviors and vigilance, however, HMDs may provide other benefits. Novice anesthesiologists spend more time observing monitors compared with experienced anesthesiologists,21 and when anesthesiologists are fatigued, they spend more time observing the monitors, patient, and surgical field than when they are rested.22 When anesthesiologists are busy during periods of high workload,21 teaching residents,24 or using transesophageal echocardiography,25 they respond to changes on the patient monitor more slowly. However, these situations were not investigated in this study; therefore, further research would be needed to determine whether HMD use would reduce event detection times.26
Finally, few studies have been able to demonstrate that a novel anesthesia monitoring display is clinically effective.4,27,28 However, this study found that the HMD induced a behavioral change in anesthesiologists’ monitoring patterns comparable with that observed in a controlled simulation study.15
Limitations
This study was limited by a number of factors. First, when participants were using the HMD, the increase in time spent looking toward the patient and surgical field with the HMD was relatively small (4.4% difference), and their time looking at the anesthesia workstation was still substantial (21.0% of the case). However, these statistics may underrepresent the difference in monitoring patterns as a result of HMD use. Even with an HMD, the anesthesiologist would still need to look at the anesthesia workstation to perform tasks (e.g., adjusting gas flows) or monitor information not available on the HMD (e.g., ventilator settings and documentation). Furthermore, the HMD allows anesthesiologists to monitor the vital signs during tasks in which it would be otherwise impossible or impractical to see the patient monitor (e.g., laryngoscopy), but this would not be reflected in the behavioral statistics.
Second, the weight and bulk of the head-mount and backpack equipment was a major concern for participants. Technological improvements to superimposed information displays should result in smaller and less obtrusive devices.29
Third, despite participants’ prior involvement in HMD studies, they experienced the HMD for only 2 hours in total during the study. They may not have had sufficient time to break their existing monitoring habits and develop new strategies that make specific use of the HMD.
Fourth, the number of participants in the study was relatively small, but the study gained power by examining HMD and non-HMD management in the same anesthesiologists.
Finally, the cases selected for the study involved stable patients requiring relatively simple anesthetics and a basic level of monitoring (e.g., no invasive blood pressure monitoring). Although the behavioral analyses investigated the effect of HMDs on anesthesiologists’ monitoring behaviors in the OR, they did not investigate how HMDs might affect the management of clinical events and crises.20 The HMD may be more effective in cases in which the patient's condition is unstable or could deteriorate rapidly.
Future Research
More research is needed to determine which types of data should be presented on the HMD, whether further exposure to the HMD leads to more marked behavioral changes, and whether the behavioral changes resulting from HMD use can lead to improved anesthesiologist performance in the OR or improved patient outcomes.30
DISCLOSURE
David Liu reports that he is currently a Visiting Graduate Student at the University of Utah under the supervision of Dwayne Westenskow, section Editor for Technology, Computing, and Simulation for the journal.
ACKNOWLEDGMENTS
The authors thank the 6 anesthesiologists who participated for their time and feedback, and the medical staff and patients at the Royal Adelaide Hospital (RAH) who consented to being recorded on video for the study. They also thank Nicole Zweck and her nursing, anesthetic, and surgical colleagues from RAH Theater 7 for accommodating the experiment. They acknowledge the help of Matthew Thompson (The University of Queensland) in setting up the audiovisual equipment, and of Craig Todd and his Biomedical Engineering colleagues (RAH) for technical advice and assistance in preparing equipment. Finally, they thank Terrence Leane and Edward Murphy (RAH) for assistance with portable patient monitors during the study.
* Via DK, Kyle RR, Geiger PG, Mongan PD. A head mounted display of anesthesia monitoring data is of value and would be used by a majority of anesthesia providers. Anesth Analg 2002;95:S132.
† Via DK, Kyle RR, Kaye RD, Shields CH, Dymond MJ, Damiano LA, Mongan PD. A head mounted display of anesthesia monitoring data improves time to recognition of crisis events in simulated crisis scenarios. Proceedings of Society for Technology in Anesthesia 2003 Annual Meeting. Available at: http://www.anestech.org/media/Publications/Annual_2003/sta112.html.
REFERENCES
1. Decker K, Bauer M. Ergonomics in the operating room—from the anesthesiologist's point of view. Minim Invasive Ther Allied Technol 2003;12:268–77
2. Nyabadza M. Location of anaesthetic monitors. Br J Anaesth 2001;86:736–7
3. Walsh T, Beatty PCW. Human factors error and patient monitoring. Physiol Meas 2002;23:R111–32
4. Sanderson PM, Watson MO, Russell WJ. Advanced patient monitoring displays: tools for continuous informing. Anesth Analg 2005;101:161–8
5. Sanderson P. The multimodal world of medical monitoring displays. Appl Ergon 2006;37:501–12
6. Rolland J, Hua H. Head-mounted display systems. In: Johnson RB, Driggers RG eds. Encyclopedia of Optical Engineering. New York: Marcel Dekker, 2005
7. Naish JM. Combination of information in superimposed visual fields. Nature 1964;202:641–6
8. Martin-Emerson R, Wickens CD. Superimposition, symbology, visual attention, and the head-up display. Hum Factors 1997;39:581–601
9. Hudspith S. Controlling and monitoring induced unconsciousness: ergonomics and design in anaesthesia. In: Woods D, Roth E eds. Proceedings of the Human Factors Society 34th Annual Meeting. Santa Monica, CA: Human Factors and Ergonomics Society, 1990:482–5
10. Block FE, Yablok DO, McDonald JS. Clinical evaluation of the ‘head-up’ display of anesthesia data. Int J Clin Monit Comput 1995;12:21–4
11. Platt MJ. Heads up display. Br J Anaesth 2004;92:602–3
12. Weller P. Telemonitoring. J Clin Monit Comput 2006;20:129–30
13. Bitterman N. Technologies and solutions for data display in the operating room. J Clin Monit Comput 2006;20:165–73
14. Ormerod DF, Ross B, Naluai-Cecchini A. Use of an augmented reality display of patient monitoring data to enhance anesthesiologists’ response to abnormal clinical events. Stud Health Technol Inform 2003;94:248–50
15. Liu D, Jenkins S, Sanderson PM, Watson MO, Russell WJ, Leane T, Kruys A. Monitoring with head-mounted displays: performance and safety in a full-scale simulator and part-task trainer. Anesth Analg 2009;109:1135–46
16. von Segesser LK. Video-on-command for thoracic and cardio-vascular surgery. Eur J Cardiothorac Surg 2003;24:473–4
17. Beuchat A, Taub S, Saby JD, Dierick V, Codeluppi G, Corno AF, von Segesser LK. Cybertools improve reaction time in open heart surgery. Eur J Cardiothorac Surg 2005;27:266–9
18. Sanderson PM, Watson MO, Russell WJ, Jenkins S, Liu D, Green N, Llewelyn K, Cole P, Shek V, Krupenia SS. Advanced auditory displays and head-mounted displays: advantages and disadvantages for monitoring by the distracted anesthesiologist. Anesth Analg 2008;106:1787–97
19. Philips Medical Systems: IntelliVue Patient Monitor Data Export Interface Programming Guide M8000–9305C. Germany: Philips Medizin Systeme Boeblingen GmbH, 2004
20. Liu D, Jenkins SA, Sanderson PM. Clinical implementation of a head-mounted display of patient vital signs. In: Proceedings of the 2009 International Symposium on Wearable Computers. Linz, Austria: IEEE, 2009:47–54
21. Weinger MB, Herndon OW, Zornow MH, Paulus MP, Gaba DM, Dallen LT. An objective methodology for task analysis and workload assessment in anesthesia providers. Anesthesiology 1994;80:77–92
22. Cao CGL, Weinger MB, Slagle J, Zhou C, Ou J, Gillin S, Sheh B, Mazzei W. Differences in day and night shift clinical performance in anesthesiology. Hum Factors 2008;50:276–90
23. Slagle JM, Weinger MB. Effects of intraoperative reading on vigilance and workload during anesthesia care in an academic medical center. Anesthesiology 2009;110:275–83
24. Weinger MB, Reddy SB, Slagle JM. Multiple measures of anesthesia workload during teaching and nonteaching cases. Anesth Analg 2004;98:1419–25
25. Weinger MB, Herndon OW, Gaba D. The effect of electronic record keeping and transesophageal echocardiography on task distribution, workload, and vigilance during cardiac anesthesia. Anesthesiology 1997;87:144–55
26. Loeb RG. Monitor surveillance and vigilance of anesthesia residents. Anesthesiology 1994;80:527–33
27. Drews FA, Westenskow DR. The right picture is worth a thousand numbers: data displays in anesthesia. Hum Factors 2006;48:59–71
28. Moller JT, Pedersen T, Rasmussen LS, Jensen PF, Pedersen BD, Ravlo O, Rasmussen NH, Espersen K, Johannessen NW, Cooper JB. Randomized evaluation of pulse oximetry in 20,802 patients: I. Design, demography, pulse oximetry failure rate, and overall complication rate. Anesthesiology 1993;78:436–44
29. Ho H, Saeedi E, Kim SS, Shen TT, Parviz BA. Contact lens with integrated inorganic semiconductor devices. In: Proceedings of the IEEE 21st International Conference on Micro Electro Mechanical Systems (MEMS). Tucson, AZ: IEEE, 2008:403–6
30. Liu D, Jenkins S, Sanderson PM. Patient monitoring with head-mounted displays. Curr Opin Anaesthesiol 2009;22:796–803