Pilot Test of a Collaborative “Helping Hands” Tele-Assistance System for the Development of Clinical Skills

Barnett, Tony PhD, RN; Huang, Weidong PhD; Mather, Carey MPH, RN

Section Editor(s): Alexander, Susan DNP, ANP-BC, ADM-BC

CIN: Computers, Informatics, Nursing: October 2017 - Volume 35 - Issue 10 - p 491–495
doi: 10.1097/CIN.0000000000000393
DEPARTMENTS: CIN Plus

Author Affiliations: Centre for Rural Health (Dr Barnett) and Health Sciences (Ms Mather), University of Tasmania, Launceston, Tasmania, Australia; and School of Software and Electrical Engineering, Swinburne University of Technology (Dr Huang), Melbourne, Victoria, Australia.

The authors have disclosed that they have no significant relationship with, or financial interest in, any commercial companies pertaining to this article.

Corresponding author: Tony Barnett, PhD, RN, Centre for Rural Health, University of Tasmania, Locked Bag 1372, Launceston, Tasmania, Australia 7250 (Tony.barnett@utas.edu.au).

Key Points

• The development and acquisition of clinical skills take time and practice. Access to expertise and guidance to perform clinical (patient care) procedures can be difficult, especially in real-world clinical settings.

• We developed an innovative, collaborative tele-assistance system that provided remote guidance to nursing students in the development of procedural skills.

• Proof of concept was demonstrated. Instructors were able to guide learners in undertaking procedures remotely using the “helping hands” technology. Such technology has the potential to improve health professional education in the workplace.

The safe and correct performance of clinical procedures is a critical component of the skill set required of healthcare professionals. Developing this capability requires repetition and practice in different situations to improve confidence and performance. Nursing students learn procedural skills in simulation laboratories before being immersed in work-integrated learning (WIL) under the direct supervision of an instructor. This supervision is resource intensive, and the time available for it is often limited when an instructor is not on site.

A lack of practice can cause students to feel underprepared when they are asked to undertake clinical procedures while in training and on entering the workforce as new graduates.

Within health curricula, the learning of clinical skills typically begins in the classroom with an explanation of the procedure and the context for its application. The student then practices the skill in a safe, controlled environment, often using a manikin. Students are then provided opportunities to apply that skill with a patient when undertaking WIL. Skills take repetition and practice to develop and are often highly contextualized. Progressing from novice to expert level in the application of a skill requires feedback to be provided in different settings and under different circumstances.1–4 Performing a procedural skill can also be stressful for students who may feel nervous or overwhelmed by the experience and are often fearful of making a mistake that may cause harm to the patient.5 Continued guidance and support are necessary, an area in which the application of tele-assistance or augmented reality (AR) technologies has great potential.6,7

In health, AR and tele-assistance systems have been successfully used to train surgeons, doctors, nurses, and students across a variety of skill areas.8–10 While research is ongoing in AR, our review found no reports on the gesture-based learning and teaching innovation we have named “helping hands” or the impact this may have on learning outcomes.

The aim of this pilot project, based in a simulation laboratory, was to obtain user feedback from instructors and students on a collaborative system that allowed students to undertake a clinical procedure with real-time audio and visual guidance provided by an instructor at a different location. This technology enabled the instructor's helping hands to be projected through surface computing onto the student's workspace and visualized through a near-eye display attached to the headwear worn by the student. It allowed the instructor to demonstrate a movement, or "step in," and assist the student with both visual and audio prompts. The project was designed to demonstrate proof of concept, inform clinical skills education, and serve as a basis for further development of the technology and future research in this area.

METHODS

A novel remote guidance system was adapted for use in WIL environments.11 Surface computing technology (wherein the user interacts with an interface projected onto a surface rather than with traditional computer equipment) and head-mounted display devices were used to enable geographically remote collaboration. Two laptops, a headwear device worn by the student, and a touch-screen pad used by the instructor were networked through wireless Internet. We used C++ as the programming language (Standard C++ Foundation, Redmond, WA). The hardware used for the helping hands system is listed in Table 1; the total cost was less than AUD $2000.
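
To make the architecture concrete, the following minimal sketch (not the authors' implementation, which is not published) illustrates how a student-side unit of this kind might capture frames from the head-mounted camera and stream them to the instructor's laptop over a wireless LAN in C++. OpenCV and POSIX sockets are used here purely for illustration, and the IP address, port, and JPEG quality are placeholder values.

#include <opencv2/opencv.hpp>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture headCam(0);                      // head-mounted camera (device index assumed)
    if (!headCam.isOpened()) { std::cerr << "Camera not found\n"; return 1; }

    // Connect to the instructor's laptop over the wireless LAN (address and port are placeholders).
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in instructor{};
    instructor.sin_family = AF_INET;
    instructor.sin_port = htons(5000);
    inet_pton(AF_INET, "192.168.1.20", &instructor.sin_addr);
    if (connect(sock, reinterpret_cast<sockaddr*>(&instructor), sizeof(instructor)) < 0) {
        std::cerr << "Could not reach instructor unit\n"; return 1;
    }

    cv::Mat frame;
    std::vector<uchar> jpeg;
    while (headCam.read(frame)) {
        // Compress each frame so the stream fits comfortably on a wireless link.
        cv::imencode(".jpg", frame, jpeg, {cv::IMWRITE_JPEG_QUALITY, 70});
        uint32_t len = htonl(static_cast<uint32_t>(jpeg.size()));
        send(sock, &len, sizeof(len), 0);             // length prefix
        send(sock, jpeg.data(), jpeg.size(), 0);      // JPEG payload
    }
    close(sock);
    return 0;
}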

The technology supported unmediated remote gestures by augmenting the object of interest with the instructor's helping hands. As shown in Figure 1, the student, wearing the headwear device, performs a simple dressing on the wound of a manikin (simulated patient). The focus of attention, the wound dressing procedure, is captured by the camera mounted on the headwear, and these images are sent to the screen used by the instructor (Figures 2 and 3). The instructor's unit also has a camera that captures the instructor's hands; these images are overlaid on the display of the student's visual field and shown to the student on the near-eye display. The microphone and headphones enable two-way verbal communication.
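
As an illustration of the overlay step, the short sketch below composites an image of the instructor's hands onto the student's head-camera frame using OpenCV. The segmentation approach shown (a simple brightness threshold, assuming the hands are captured against a dark work surface) is an assumption made for demonstration only; the article does not describe how the actual system isolates the hands.

#include <opencv2/opencv.hpp>

// Composite the instructor's hand pixels onto the student's head-camera frame
// so the gestures appear superimposed on the wound-dressing field.
cv::Mat overlayHelpingHands(const cv::Mat& studentView, const cv::Mat& instructorCam) {
    cv::Mat instructorResized, gray, mask, composite;

    // Match the instructor image to the student's frame size.
    cv::resize(instructorCam, instructorResized, studentView.size());

    // Crude hand mask: pixels brighter than a dark work surface (assumed capture setup).
    cv::cvtColor(instructorResized, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, mask, 60, 255, cv::THRESH_BINARY);

    // Copy only the masked (hand) pixels over the student's view.
    studentView.copyTo(composite);
    instructorResized.copyTo(composite, mask);
    return composite;
}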

Procedures

Following institutional review board approval from the Tasmanian Social Science Human Research Ethics Committee (Ref. H0014291), five first-year nursing students were recruited by poster advertisement and paired with five experienced clinical instructors. Information about the project was provided, and written consent was obtained from all participants prior to their inclusion in the study. Students had little or no experience with the clinical procedure for which they were to receive instruction. Each instructor-student dyad was connected through the helping hands system but physically separated at opposite ends of a large simulation laboratory. The student was asked to perform a simple wound dressing on a manikin. The procedure was detailed through a checklist.12 Forty-five minutes was allocated to each dyad for setup, equipment familiarization, skill performance, completion of the questionnaire, and verbal feedback.

Data were obtained via a questionnaire administered to all participants immediately following completion of the task and were analyzed using simple descriptive techniques. The questionnaire included a 9-item user survey that was developed by the research team and judged by an expert panel to have high face validity. It assessed usability from different aspects, including ease of use, ease of learning, task satisfaction, copresence, and perception of engagement, and included two items that asked participants to rate how they felt about giving or receiving unmediated hand gestures. The 5-item National Aeronautics and Space Administration Task Load Index (NASA-TLX) Mental Workload Rating Scale was also administered. Because of its strong psychometric properties across different samples, this scale has been used in various formats since its original development to assess the impact and "workload" of task performance.13 Participants indicated their response to the 14 items on a 7-point scale (from strongly agree to strongly disagree for the user survey items and from low to high workload for the NASA-TLX items). Five open-ended questions encouraged instructors and students to comment on their experiences with the system. The wording of some items was modified to ensure it was appropriate to either the instructor or the student participant (Table 2). A member of the research team took observational notes during each performance and later validated these with two other members of the team.
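
As an illustration of the "simple descriptive techniques" applied to the ratings, the sketch below computes a mean raw score for a set of 7-point items; the values shown are placeholders for demonstration, not data from the study.

#include <iostream>
#include <numeric>
#include <vector>

// Mean of a set of 7-point ratings.
double meanRating(const std::vector<int>& ratings) {
    if (ratings.empty()) return 0.0;
    return std::accumulate(ratings.begin(), ratings.end(), 0.0) / ratings.size();
}

int main() {
    // One participant's responses to the nine usability items (placeholder values only).
    std::vector<int> usabilityItems = {6, 5, 6, 7, 5, 6, 6, 4, 5};
    std::cout << "Mean usability rating: " << meanRating(usabilityItems) << " of 7\n";
    return 0;
}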

RESULTS

Each participant dyad was observed to work collaboratively and complete the assigned task of performing a simple wound dressing within the time allocated. The student participants were able to communicate with their instructors verbally and looked up at the near-eye display for visual prompts (hand gestures) provided by the instructor. Instructors used their hands for simple gestures such as pointing and to demonstrate more complex movements such as opening a dressing pack (Figure 3). Some students initially relied on verbal instructions and needed to be reminded to use the near-eye display because their focus was directly on the dressing field. This was ameliorated through practice as each instructor-student dyad became accustomed to each other's preferences for audio and visual cues and became familiar with using the system.

User ratings from both instructor and student participants were positive, with an overall usability rating of 5.6 out of 7 (mean raw score). Both groups tended to rate items favorably, most either agreeing or strongly agreeing with each statement (Table 3). There were differences between instructors and students in the ratings of some items. These differences were not statistically significant at the .05 level, although the larger differences related to individual task performance and perceptions of copresence (Table 3, Items 4 and 7). Both groups rated items related to visual prompting (ie, giving or receiving unmediated hand gestures) lower than other aspects of usability (Table 3, Items 8 and 9). Neither the instructor nor the student group found the workload high or particularly demanding (Table 4). The mean score across the five workload items was 1.7 (where a rating of 7 indicates the highest workload).

The feedback received from participants to the open-ended questions was constructive and generally positive. For example, when asked how they felt about using the technology, responses included the following:

• “Having used it only once I would say I already feel fairly confident and thought this was a useful learning tool even though not quite market ready” (instructor).

• “I thought it was fantastic” (instructor).

• “I feel very confident” (student).

When asked what benefits they could see in using the technology, responses from instructors included the following:

• “Allow for UTAS [university] support in rural/remote locations.”

• “Can provide help/advice to many students over many different sites.”

Participants also proposed improvements, such as building in a record/replay function so that instruction sessions could be reviewed later, using the latest wearable technologies to improve the user experience, improving on-screen depth perception (three-dimensional visualization), and building the capability for remote operation of the camera.

DISCUSSION

The usability trial demonstrated that there were benefits, barriers, and challenges in the use of this innovative system. Positive feedback on the technology was received from both students and instructors, particularly around the opportunity for one-to-one interaction while separated by distance.

The application of this type of technology has only recently had some exposure in the health arena.9 Further work is needed to develop and realize its potential in learning and teaching clinical and procedural skills. It has the potential to benefit and augment the training students currently receive and has a wide range of remote applications, including expert guidance of novices and laypersons in emergency situations without the instructor having to be in the same physical location. This can represent a significant saving in time and resources and more effective utilization of expertise in the education of healthcare professionals and in real-world emergency situations.

Limitations

The number of instructor-student dyads was small (only five); thus, further testing is required with larger numbers of pairs and across environments where instructor and student are separated by greater distances, to test the robustness of the system and the connectivity between units. The study used procedural novices who undertook only one clinical procedure (a simple dressing) using a manikin, which limited the scope of the usability test. Future work could therefore also examine the effectiveness of the technology between more experienced participants, using different (and more technical) procedures, and in realistic settings with patients.

CONCLUSIONS

This study established proof of concept for the usability of the helping hands technology, which instructors can use remotely to guide students in procedural learning. The application and exploration of helping hands and similar technologies will contribute to the international discourse on the use of technology to augment and improve health professional education outcomes.

Acknowledgments

The authors thank the students, instructors, advisors, and technical personnel who made this project possible. They also thank the members of the Tasmanian Clinical Education Network for their encouragement and financial support.

References

1. Benner P. From Novice to Expert: Excellence and Power in Clinical Nursing Practice. Reading, MA: Addison-Wesley; 1984.
2. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc. 2004;24:177–181.
3. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
4. Mather CA. Human interface technology: enhancing tertiary nursing education to ensure workplace readiness. In: International Technology, Education and Development Conference Proceedings; March 8-10, 2010; Valencia, Spain: IATED.
5. Pulido-Martos M, Augusto-Landa JM, Lopez-Zafra E. Sources of stress in nursing students: a systematic review of quantitative studies. Int Nurs Rev. 2012;59(1):15–25.
6. Lee K. Augmented reality in education and training. TechTrends. 2012;56(2):13–21.
7. Yuen SC, Yaoyuneyong G, Johnson E. Augmented reality: an overview and five directions for AR in education. J Educ Technol Dev Exchange. 2011;4(1):119–140.
8. Botden SM, de Hingh IH, Jakimowicz JJ. Suturing training in augmented reality: gaining proficiency in suturing skills faster. Surg Endosc. 2009;23:2131–2137.
9. Chaballout B, Molloy M, Vaughn J, Brisson R, Shaw R. Feasibility of augmented reality in clinical simulations: using Google Glass with manikins. JMIR Med Educ. 2016;2:6.
10. Feifer A, Al-Ammari A, Kovac E, Delisle J, Carrier S, Anidjar M. Randomized controlled trial of virtual reality and hybrid simulation for robotic surgical training. BJU Int. 2011;108(10):1652–1656.
11. Huang W, Alem L. HandsinAir: a wearable system for remote collaboration on physical tasks. In: Proceedings of the 2013 Conference on Computer Supported Cooperative Work Companion (CSCW '13); February 23–27, 2013; San Antonio, Texas. New York, NY: ACM; 2013:153–156.
12. Tollefson J. Clinical Psychomotor Skills: Assessment Skills for Nurses. South Melbourne, Australia: Cengage Learning; 2012.
13. Hart SG, Staveland LE. Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. Adv Psychol. 1988;52:139–183.