Defining Responsibilities of Simulated Patients in Medical Education

Nestel, Debra PhD; Clark, Susan MA; Tabak, Diana MEd, SurgEd; Ashwell, Victoria MBBS; Muir, Elizabeth MRCP; Paraskevas, Paraskeva FRCS; Higham, Jenny FRCOG

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: June 2010 - Volume 5 - Issue 3 - p 161-168
doi: 10.1097/SIH.0b013e3181de1cb6
Empirical Investigations

Background: Simulated patients (SPs) play a critical role in medical education. The development of SP methodology has resulted in wide-ranging responsibilities. For SPs to work effectively, we believed it was important to clearly articulate their responsibilities and that this would be best achieved by consultation with all stakeholders—SPs, students, tutors, and administrators.

Methods: As part of a quality assurance initiative, we designed a questionnaire and focus group study to explore stakeholders’ perceptions of the responsibilities of SPs in teaching. Convenience and purposive sampling was used to recruit participants to questionnaires and focus groups, respectively. Data were analyzed thematically.

Results: Eighty-six questionnaires were collected, and six focus groups were conducted. Five sets of guidelines on responsibilities were produced. In addition, guidelines were established for feedback that SPs and tutors could use to maximize impact.

Discussion: The results highlight the complexity of SP-based teaching. Clarification of all stakeholders’ responsibilities demonstrates the importance of a team approach to SP-based teaching. Focusing attention on just one stakeholder group’s responsibilities is unlikely to improve perceptions of quality. The process for developing the guidelines may be valuable for those who work with SPs. Guidelines developed through stakeholder engagement are likely to secure greater commitment than those developed by faculty alone.

SUPPLEMENTAL DIGITAL CONTENT IS AVAILABLE IN THE TEXT.

From the Imperial College, London, UK.

Reprints: Debra Nestel, Gippsland Medical School, Monash University, Northways Road, Churchill, Victoria, Australia 3842 (e-mail: debra.nestel@med.monash.edu.au).

Simulated patients (SPs) are now widely used in medical education,1–4 and their contribution to nursing and other health professional education is growing. Initially, the SP contribution was ancillary; however, there are several drivers of their increased use and centrality in curricula. These include ethical imperatives for learning in simulation, patient safety initiatives, patient empowerment, and increased numbers of medical students with reduced access to patients in clinical settings. Additional drivers include growing acceptance of simulation as an educational method and the maturation of SP programs.

In undergraduate medical education, SPs usually play the role of a patient to support the development of a range of interpersonal and professional skills.4–6 Guidelines for roles are provided by faculty or designed with participants at the time of the session. SPs also play “standardized” roles in high stakes assessments in which they may be asked to make judgments on learners’ performance. In some assessments, SPs have total responsibility for such judgments. There are several excellent articles outlining the evolution and breadth of work undertaken by SPs.1,4–10

This article sets out the responsibilities of those involved in SP work. We use the term “simulated” rather than “standardized” because this reflects the broad range of activities in which SPs work. We define SPs as individuals who are trained to portray patients (or their relatives and healthcare professionals) and to provide learners with feedback. This study was conducted in response to a quality assurance initiative in which we systematically reviewed all facets of our SP program. After a brief overview of SP work and its context at Imperial College, where this study was undertaken, we describe the quality assurance process and outcomes.

SPs at Imperial College

Our SPs are extensively used in medical education and research (Table 1). The program has evolved over the past decade to include SPs in diverse and complex educational activities. One example is hybrid simulations, in which actors are linked with simulators (benchtop models such as suture pads) in quasi-clinical settings to support the development of procedural skills.11–15 SPs work directly with real patients to write and perform authentic roles.16 Actors (SPs) play the roles of healthcare professionals in team simulations in the operating theater,17,18 in the interventional suite, and at handover.19 We have also introduced technology-based methods for SPs to provide feedback to students.20–22 SPs lead some aspects of teaching sessions, such as briefing and debriefing students, teaching skills for managing emotions and performance anxiety, and orientating students to role play. SPs are often called on to work in scenarios that are sensitive, highly charged, and in which high stakes judgments are made.

Table 1

There are ∼230 SPs in our database, of whom 45 regularly contribute to teaching and research. The remaining SPs are involved in assessments when large numbers of SPs are required simultaneously. Imperial College has medical student cohorts exceeding 380. At the undergraduate level, medical students have the opportunity to work closely with SPs in teaching sessions on at least six occasions throughout the 6-year curriculum. There are additional interactions with SPs in high stakes assessments. Imperial College also has an extensive research program that explores the use of diverse simulation modalities (including SPs) as part of educational programs in medical and other health professional groups across all levels of training.

There are potential difficulties in bringing together diverse groups of individuals (SPs, students, tutors, clinicians, and researchers) with the goal of using simulation to provide a powerful lens through which “interpersonal and professional skills” can be explored. For example, difficulties may include different participant perspectives and experiences of students’ attitudes and behaviors. SPs may be asked to defend their judgments about students’ performances when these differ from judgments made by tutors and/or students. Power dynamics in teaching settings may also mean that some judgments are valued over others. These complex demands place SPs under pressure and led us to develop a code of practice. This initiative was also intended to demonstrate our commitment to SPs as a critical learning resource, to implement rigorous standards of SP methodology, and to attract and retain SPs of the highest caliber.

It is in this context that we developed this quality assurance project with the goal of establishing a set of “professional” responsibilities for all those involved in SP work. Although the literature includes many helpful references outlining SP characteristics and qualities,5,6 these are largely descriptive and without a systematic empirical basis. Rather than imposing a list of faculty-generated expectations of behavior, we wanted to work with stakeholders to develop a set of guidelines. We anticipated that a set of reciprocal guidelines would be necessary because optimal SP performance is to some extent dependent on those with whom SPs work.

METHODS

We used a qualitative approach to address the question: What do stakeholders in SP programs believe are their own and others’ responsibilities in SP education? Stakeholders included SPs, students, tutors, and administrators. Questionnaires and focus groups were used in the study design to ensure robustness. This also expanded the number of stakeholders who could express their views.

Questionnaires

Convenience sampling was used to recruit participants for the semistructured questionnaires, which elicited their experiences of working as SPs (or with them), what works well, what does not, and the perceived responsibilities of SPs, students, and tutors (Table 2). All SPs attending a training session and all SP administrators across the College were invited to complete the questionnaire. Students and tutors attending a series of teaching sessions at St. Mary’s Hospital were also invited. Free text responses were analyzed thematically23 to develop preliminary draft guidelines of responsibilities and a topic guide for focus groups. Three project team members identified themes independently. These were crafted into draft guidelines. Major discrepancies were resolved by negotiation within the team. Minor discrepancies were left in the draft guidelines for discussion in focus groups.

Table 2

Focus Groups

Purposive sampling was used to recruit participants for stakeholder-specific focus groups. We invited participants whom we believed reflected a broad range of variables within respondent groups (eg, level of experience, sex, age, and formal education). Topic guides devised from the questionnaires ensured that key topics were covered in each focus group. Table 3 lists the topic guide for SPs. This was adapted for each focus group according to membership (eg, students and tutors). Focus group principles were followed, with participants exchanging views with one another to obtain a depth of understanding unlikely to emerge from individual interviews or questionnaire-based studies.24,25 Draft guidelines of responsibilities were provided and discussed. In some ways, this served as respondent validation of the questionnaire findings; that is, participants could express their level of agreement and the rationale for their judgments. Focus groups lasted up to 2 hours and were conducted by a project team member (V.A.). The interviewer made notes on the ambience and perceived quality of responses. Audio recordings were made, and themes were identified independently by two team members (V.A. and D.N.). Variations were addressed by negotiation, with the two reviewers jointly reviewing the raw data until consensus was achieved. The final version of the guidelines was based on the results of the questionnaires and the focus groups.

Table 3

Quality Assurance

Since this project formed part of a quality assurance initiative rather than research, we were not required to seek approval from our institution’s human research ethics committee. However, we applied the same standards required of research involving humans, including explanatory statements, voluntary participation, the opportunity to withdraw at any time, deidentified information, and notification of potential for dissemination of findings. At the request of the editor, we sought retrospective human research ethics approval, which was granted by the Imperial College Research Ethics Committee (#ICREC_9_5_3).

RESULTS

Eighty-six questionnaires were collected from SPs (n = 59), students (n = 11), tutors (n = 10), and administrators (n = 8), with response rates of 74%, 52%, 76%, and 100%, respectively. Six focus groups were conducted with population-specific groups of five or six participants: three groups of SPs (n = 18), two of students (n = 10), and one tutor group (n = 5). Eight administrators were identified as potential focus group participants; however, scheduling them proved too difficult. Focus groups were reported to be supportive, with high levels of participation. The interviewer experienced minimal reticence from participants, with most eager to express their views and positive about the opportunity to do so. Even when unhelpful SP-based experiences were reported, this was done calmly and after considerable reflection, even though strong emotions had been reported at the time of the teaching event. Some participants used the focus groups to learn from each other. The key difference between questionnaire and focus group data was in the level of detail, with focus groups providing much richer and more layered meaning.

Based on the questionnaires and the focus groups, guidelines for all stakeholders in teaching sessions were developed. These included:

  1. Expectations of SPs in teaching sessions
  2. Expectations of students in SP teaching sessions
  3. Expectations of tutors in SP teaching sessions
  4. Expectations of program directors
  5. Expectations of administrators working with SPs for teaching sessions

Guidelines for SPs are listed in Table 4. The remainder are available in Appendices, Supplemental Digital Contents 1 to 4, http://links.lww.com/SIH/A11; http://links.lww.com/SIH/A12; http://links.lww.com/SIH/A13; and http://links.lww.com/SIH/A14. Expectations of SPs and tutors were the most similar, reflecting the importance of their partnership as educators. They also shared similar personal qualities. Issues of confidentiality were noted. Students had the fewest expectations, most of which related to respectful behaviors relevant in any teaching encounter and to those specific to working with “patients,” including no unauthorized recording or showing of SP encounters. The program director and administrators shared the need for a systematic approach to work and strong organizational skills. Evaluation and feedback were core to the program director’s role, whereas maintenance of SP records and teaching materials was essential for administrators. In addition, we developed feedback guidelines for tutors and SPs in teaching sessions because this was deemed by respondents to be a critical element of working in simulation (Table 5). We had not set out to establish these guidelines, but the thematic analysis raised the importance of this activity.

Table 4

Table 5

Themes extracted from questionnaire and focus group data highlighted several issues that were fed into the guidelines.

Simulated Patients

The SPs reported feeling valued in the process of supporting student learning. However, they identified several areas for development: improved partnerships with tutors, increased opportunities for feedback, and improved preparation of students for sessions. Prescriptive roles were thought to be well crafted, and faculty were considered responsive to feedback from SPs. The SPs acknowledged their extended roles and the requisite skills for acting, observation, and teaching. The provision of precise and inspiring feedback was identified as critical to their role and noted as challenging. SPs also expressed concerns about the use of students’ audiovisual recordings and reported that some students used personal recording devices (usually mobile phones) during teaching sessions, and that this was unlikely to be for educational development.

Medical Students

Students suggested that encounters with SPs were powerful learning experiences. They enjoyed SP sessions and thought that almost all their peers took the sessions seriously. Students spoke of some peers who did not, and these students were thought to be compensating for too much or too little confidence. Contrary to program goals, students believed that there were covert aims intended to test or trick them. Some reported states of hypervigilance, whereas others suggested a power imbalance that was unlike interactions with real patients. Despite this, they reported that interactions with SPs were realistic. They enjoyed the variety of formats in which SPs worked with them and expressed a desire for more opportunities and increased feedback.

Tutors and Program Director

Tutors reported enjoying working with SPs. Their comments suggested that SPs’ contributions were not maximized and that tutors could do more to support increased contribution. This included more coteaching, seeking SP feedback, and acknowledging that observing an interpersonal interaction is different from participating in it. Improved briefing before sessions was also considered beneficial. Tutors wanted more support in learning to teach through simulation. It was apparent that specific guidelines should be developed for the program director because there were additional responsibilities that fell outside the other stakeholder categories.

Administrators

Administrators enjoyed working with SPs and acknowledged the rapid increase in bookings and breadth of activities. Those who had opportunities to observe SP sessions thought this improved their ability to contribute to recruitment and bookings and enabled them to appreciate the importance of planning and the nature of the information SPs require. Keeping administrators informed of most aspects of SP sessions (content, timing, SP requirements, etc) in a timely fashion was highly valued. Administrators also expected SPs to keep them informed of changes in personal data (eg, contact details).

DISCUSSION

The resulting guidelines highlight the complexity of teaching and learning with SPs in the context of undergraduate medical education. It was apparent that stakeholders did not share the same understanding of roles and responsibilities and that many participants made assumptions about purpose and process in SP teaching sessions. By articulating responsibilities, restating goals, and acknowledging areas of overlap in roles between stakeholders, we have improved our understanding of the strengths and weaknesses of SP teaching. However, we do not know whether this will translate into an improved educational experience for students.

We had not anticipated the development of feedback guidelines, but this emerged from the framework for analysis in which we had to set aside preconceptions and deal with key themes from respondents. SPs, students, and tutors held strong views on feedback and wanted ways to make the process more learner centered. The resulting feedback guidelines reflect sound educational principles in addressing the learners’ “need to know” before seeking balance.26,27 This is in contrast with the frequently used Pendleton28 approach of commencing feedback with what the learner thought worked well before moving to what could be improved. This approach does not permit the learner to drive the focus, sequence, and amount of feedback. The Calgary-Cambridge29 approach is also commonly used and provides the learner with more control over feedback than the Pendleton model but is still sequenced by the tutor. In both feedback approaches outlined above, the SP is not specifically mentioned because they were designed for application with real patients. However, they have been adapted for working with SPs and tutors. The feedback guidelines generated in this project offer new and inclusive ways of working collaboratively.

Our willingness to be open and transparent in the development of all project guidelines reflects the philosophy underpinning our SP program. We believe that user involvement in developing guidelines will increase the engagement of all stakeholders and will reflect their different perspectives more accurately than faculty-generated guidelines would.

The guidelines on responsibilities have already raised awareness of the interdependency of stakeholders, their different foci, diverse challenges, and the need for creating an environment that better facilitates partnership models for teaching and learning. We intend to use the guidelines in the recruitment of new SPs; in training sessions for SPs, tutors, and administrators; and in sessions that introduce students to learning through simulation. Although the focus of the guidelines was on teaching sessions, they may have relevance for assessment sessions and when SPs work in research projects. We may need to return to stakeholders to establish guidelines for these specific areas of practice.

The role of SPs is becoming increasingly “professionalised” as SPs are involved in a wide range of teaching and learning activities. The values that underpin their contribution and the skills required to enact them are highly specialized. SPs may be privy to personal information about patients and students and need to understand the same principles of confidentiality that medical professionals adopt. SPs also work directly with individuals in what are sometimes stressful circumstances. Students may experience intense emotions as they work in simulations, and SPs need to be sensitive to supporting students as the SP comes in and out of role. The guidelines do not directly set out how to manage these situations but acknowledge their significance. Establishing a set of guidelines for professional practice provides some safety for all those involved.

Limitations of the Work

The study was developed as a quality assurance initiative rather than a research project. Although we applied the same standards required for research involving humans, the purpose of this study was different and may have influenced the outcome. We believe our SP respondents were representative of their population at Imperial College. Although convenience sampling in the questionnaire component limits representativeness, its combination with purposive sampling for the focus groups improved the rigor of the study design. Students were the least representative group. The low questionnaire response rate among students reflected the voluntary nature of the project and the timing of distribution at the end of a teaching session. However, students were quite willing to participate in scheduled focus groups and were sampled from different year levels and teaching groups. Tutors were well represented in the study. Administrators all completed questionnaires, but their geographical distribution and other commitments made it too difficult to bring them together for a focus group.

We did not collect demographic data, in part because of the quality assurance nature of the project. The work was conducted in a large university hospital setting with a high level of SP-based teaching and examinations. Some of the guidelines are likely to be institution-specific; however, we believe that their application is far wider. The SPs in our program participate in examinations and research, but these activities were not explored in this study. Although general principles are likely to be shared, there may be additional responsibilities in these specific contexts.

Future plans include evaluating the impact of the guidelines on learners and SPs. The guidelines will be used in recruitment and training sessions. We will also conduct a multicenter study to explore application in different settings. Further work on exploring effective learner-centered feedback with SPs is an exciting area of research.

CONCLUSIONS

The process of establishing guidelines for SPs in teaching sessions was a valuable experience that highlighted strengths and weaknesses of our SP program. The most notable finding was the lack of shared understanding of the purpose and process of SP-based teaching among different stakeholders. The overlap of tutor and SP characteristics emphasizes the importance of articulating and reinforcing the key values underpinning the program. We believe that the guidelines will enable SPs and others to work safely and constructively in teaching sessions.

REFERENCES

1. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992–2003. Med Teach 2003;25:262–270.
2. Van Dalen J, Bartholomeus P, Kerkhofs E, et al. Teaching and assessing communication skills in Maastricht: the first twenty years. Med Teach 2001;23:245–251.
3. Hargie O, Dickson D, Boohan M, Hughes K. A survey of communication skills training in UK Schools of Medicine: present practices and prospective proposals. Med Educ 1998;32:25–34.
4. Cleland J, Abe K, Rethans J. The use of simulated patients in medical education: AMEE Guide No 42. Med Teach 2009;31:477–486.
5. Ker JS, Dowie A, Dowell J, et al. Twelve tips for developing and maintaining a simulated patient bank. Med Teach 2005;27:4–9.
6. Wallace P. Coaching Standardized Patients for Use in Assessment of Clinical Competence. New York: Springer; 2006.
7. Barrows HS. Simulated patients in medical teaching. Can Med Assoc J 1968;98:674–676.
8. Vu N, Barrows H, Marcy M, Verhulst S, Colliver J, Travis T. Six years of comprehensive, clinical performance-based assessment using standardized patients at the Southern Illinois University School of Medicine. Acad Med 1992;67:43–50.
9. Petrusa E. Taking standardized patient-based examinations to the next level. Teach Learn Med 2004;16:98–110.
10. Petrusa E. Clinical performance assessments. In: Norman G, van der Vleuten C, Newble D, eds. International Handbook of Research in Medical Education. The Netherlands: Kluwer Academic; 2002:673–710.
11. Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. An innovative model for teaching and learning clinical procedures. Med Educ 2002;36:628–634.
12. Kneebone R, Nestel D, Yadollahi F, et al. Assessing procedural skills in context: exploring the feasibility of an Integrated Procedural Performance Instrument (IPPI). Med Educ 2006;40:1105–1114.
13. Higham J, Nestel D, Lupton M, Kneebone R. Teaching and learning gynaecology examination with hybrid simulation. Clin Teach 2007;4:238–243.
14. Kneebone R, Nestel D, Bello F, Darzi A. An Integrated Procedural Performance Instrument (IPPI) for learning and assessing procedural skills. Clin Teach 2008;5:45–48.
15. Kneebone R, Nestel D, Wetzel C, et al. The human face of simulation: patient-focused simulation training. Acad Med 2006;81:919–924.
16. Nestel D, Cecchini M, Calandrini M, et al. Real patient involvement in role development evaluating patient focused resources for clinical procedural skills. Med Teach 2008;30:534–536.
17. Black S, Nestel D, Horrocks E, et al. Evaluation of a framework for case development and simulated patient training for complex procedures. Simul Healthc 2006;1:66–71.
18. Nestel D, Black S, Kneebone R, et al. Simulated anaesthetists in high fidelity simulations for surgical training: feasibility of a training programme for actors. Med Teach 2008;30:407–413.
19. Nestel D, Kneebone R, Barnet A. Teaching communication skills for handover: perioperative specialist practitioners. Med Educ 2005;39:1157.
20. Kneebone R, Nestel D, Ratnasothy J, Kidd J, Darzi A. The use of handheld computers in scenario-based procedural assessments. Med Teach 2003;25:632–642.
21. Nestel D, Bello F, Kneebone R, Akhtar K, Darzi A. Remote assessment and learner-centred feedback using the Imperial College Feedback and Assessment System (ICFAS). Clin Teach 2008;5:88–92.
22. Kneebone R, Bello F, Nestel D, et al. Learner-centred feedback using remote assessment of clinical procedures. Med Teach 2008;30:795–801.
23. Pope C, Ziebland S, Mays N. Analysing qualitative data. In: Pope C, Mays N, eds. Qualitative Research in Health Care. Oxford: BMJ Publishing; 2006:63–81.
24. Kitzinger J. Focus groups. In: Pope C, Mays N, eds. Qualitative Research in Health Care. Oxford: BMJ Publishing; 2006:21–31.
25. Morgan D. Focus groups. Annu Rev Sociol 1996;22:129–152.
26. Knowles M. Self-Directed Learning. A Guide for Learners and Teachers. Englewood Cliffs: Prentice Hall/Cambridge; 1975.
27. Kneebone R, Nestel D. Learning clinical skills—the place of simulation and feedback. Clin Teach 2005;2:86–90.
28. Pendleton D, Schofield T, Tate P, Havelock P. The Consultation: An Approach to Learning and Teaching. New York: Oxford University Press Inc; 1998.
29. Kurtz S, Silverman J, Draper J. Teaching and Learning Communication Skills in Medicine. Abingdon: Radcliffe Medical Press Ltd; 1998.

Keywords:

Simulated patient; Undergraduate medical education; Patient perspectives; Standardized patients

Supplemental Digital Content

© 2010 Society for Simulation in Healthcare