
Board 516 - Technology Innovations Abstract
A Novel Method for Synthesizing Naturalistic Pain on Virtual Patients (Submission #1072)

Gonzales, Michael BS; Moosaei, Maryam BS; Riek, Laurel PhD
Simulation in Healthcare: December 2013
doi: 10.1097/01.SIH.0000441715.40653.4e


Since nearly all areas of clinical care involve face-to-face interaction, it is important for learners to be able to recognize pain in patients. While patient self-report (i.e., pain scales) can be a helpful tool in the clinical toolbox, in most cases clinicians must rely on keen sensory observation skills.1 In fact, in some cases the pain-rating scale may not be a reliable tool due to variability across patients, as well as across clinicians' own interpretations of the scale.2 Thus, there is a desire within the clinical education community to help hone the pain observation skills of learners. However, few training tools exist. To help bridge this gap, we have developed technology that may one day become part of an educational tool. We have designed a method for expressing pain on virtual patients using a naturalistic database of people expressing pain.


We extracted source videos from the UNBC-McMaster Pain Archive, a fully labeled, naturalistic data set of 200 video sequences from 25 participants suffering from shoulder pain. The participants performed range-of-motion tests on both their affected and unaffected limbs under the instruction of a physiotherapist. Videos were labeled by facial expression recognition experts with a pain score from zero to twelve.3 We included all source videos with a pain score greater than three. For each video, we used a CLM-based facial-feature tracker4 to track 68 points on the face. Each facial expression is composed of discrete facial movements called action units (AUs),5 and we mapped these facial movements to a virtual patient in real time using a technique called performance-driven animation. Thus, the virtual patient's expressions directly matched those of a real person in the source video. We used several avatars in the Steam Source SDK6 as our virtual patient representations. To validate people's ability to identify pain on virtual avatars using our method, we conducted an online study in which 50 participants (34 female, 16 male, mean age = 38.6) recognized expressions of pain on virtual avatars. To test the flexibility of our system, we also created three avatars (male, female, and androgynous) to see if there was any significant effect of avatar gender on participants' perception of pain.
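The core of performance-driven animation is retargeting: per frame, tracked facial landmarks are converted into normalized controller weights on the avatar. The sketch below illustrates this idea under stated assumptions; the landmark indices, measurement names, and neutral-face baselines are hypothetical, whereas the actual system used a CLM tracker (68 points) driving Source SDK flex controllers.

```python
import math

def landmark_distance(points, i, j):
    """Euclidean distance between two tracked landmarks."""
    (x1, y1), (x2, y2) = points[i], points[j]
    return math.hypot(x2 - x1, y2 - y1)

def retarget_frame(points, neutral, face_scale):
    """Map one frame of tracked (x, y) landmarks to avatar weights in [0, 1].

    `neutral` holds the same distances measured on a neutral face, so each
    weight reflects deviation from rest, normalized by a fixed face scale
    (e.g. inter-ocular distance) to stay invariant to head size and zoom.
    """
    # Hypothetical pain-relevant measurements (cf. AUs 4, 6/7, and 9/10):
    deltas = {
        "brow_lower": neutral["brow_lower"] - landmark_distance(points, 19, 37),
        "eye_close":  neutral["eye_close"]  - landmark_distance(points, 38, 40),
        "lip_raise":  neutral["lip_raise"]  - landmark_distance(points, 33, 51),
    }
    # Clamp each normalized deviation into a [0, 1] controller weight.
    return {name: max(0.0, min(1.0, d / face_scale))
            for name, d in deltas.items()}

# Example frame: brows 6 px lower and eyes 3 px narrower than neutral.
neutral = {"brow_lower": 30.0, "eye_close": 12.0, "lip_raise": 25.0}
points = {19: (100, 80), 37: (100, 104), 38: (120, 90), 40: (120, 99),
          33: (110, 140), 51: (110, 165)}
weights = retarget_frame(points, neutral, face_scale=60.0)
print(weights)  # → {'brow_lower': 0.1, 'eye_close': 0.05, 'lip_raise': 0.0}
```

Running this per video frame and feeding the weights to the avatar's facial controllers is what makes the synthesized expression track the real patient's expression in real time.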

Results and Conclusion

We found that people are able to accurately identify naturalistic facial expressions of pain when expressed by a virtual patient using performance-driven pain synthesis (overall pain accuracy = 67.33%). We also found no significant difference in participants' accuracy in identifying pain across our three virtual avatar types (all p > .05, with accuracies of 66.67%, 65.33%, and 70% for female, male, and androgynous avatars respectively). Our results are encouraging, suggesting that virtual patients expressing pain using a performance-driven animation method may be useful as part of a general training tool for learners. However, one limitation of this study was that it was conducted on the general population, who may interpret pain differently than clinicians.7 Additional work is needed to explore this difference further, and we intend to conduct a study with clinical students in the future.
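A back-of-envelope check of the avatar comparison can be done with a Pearson chi-square test of homogeneity. The per-avatar trial counts below are an assumption for illustration only: 150 pain judgments per avatar (100, 98, and 105 correct) happen to reproduce the reported percentages exactly, so they serve as a plausible worked example.

```python
# Assumed counts (not from the abstract): 150 trials per avatar.
correct = [100, 98, 105]   # female, male, androgynous
trials = [150, 150, 150]

total_correct = sum(correct)
total_trials = sum(trials)
p_hat = total_correct / total_trials  # overall accuracy ≈ 0.6733

# Pearson chi-square statistic for a 3x2 correct/incorrect table.
chi2 = 0.0
for c, n in zip(correct, trials):
    exp_correct = n * p_hat
    exp_wrong = n * (1 - p_hat)
    chi2 += (c - exp_correct) ** 2 / exp_correct
    chi2 += ((n - c) - exp_wrong) ** 2 / exp_wrong

# Critical value at alpha = .05 with df = 2 is 5.991; the statistic
# falls far below it, consistent with "no significant difference".
print(round(chi2, 3), chi2 < 5.991)  # → 0.788 True
```

Under these assumed counts the three accuracies are statistically indistinguishable, matching the reported p > .05 finding.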


1. Grady D: Physician Revives a Dying Art: The Physical. New York Times. Oct 11, 2010.

2. De C Williams AC, Davies HT, Chadury Y. Simple pain rating scales hide complex idiosyncratic meanings. Pain. 2000;85(3):457–63.

3. Lucey P, Cohn JF, Prkachin KM, Solomon PE, Matthews I: Painful data: The UNBC-McMaster shoulder pain expression archive database. Face and Gesture 2011. 2011:57–64.

4. Baltrusaitis T, Robinson P, and Morency L: 3D constrained local model for rigid and non-rigid facial tracking. Computer Vision and Pattern Recognition. 2012:2610–2617.

5. Lucey P, Cohn J, Lucey S, Matthews I, Sridharan S, Prkachin KM: Automatically Detecting Pain Using Facial Actions. International Conference on Affective computing and intelligent interaction. 2009;2009:1–8.

6. Valve Software: Source SDK. Source SDK Website. Accessed July 30, 2013.

7. Prkachin KM, Craig KD: Expressing pain: The communication and interpretation of facial pain signals. Journal of Nonverbal Behavior. 1995;19(4):191–205.



© 2013 by Lippincott Williams & Wilkins, Inc.