
Addressing the Interprofessional Collaboration Competencies of the Association of American Medical Colleges: A Systematic Review of Assessment Instruments in Undergraduate Medical Education

Havyer, Rachel D. MD; Nelson, Darlene R. MD; Wingo, Majken T. MD; Comfere, Nneka I. MD; Halvorsen, Andrew J. MS; McDonald, Furman S. MD, MPH; Reed, Darcy A. MD, MPH

doi: 10.1097/ACM.0000000000001053


An emerging taxonomy for competency domains proposed by the Association of American Medical Colleges (AAMC)1 expands the core competencies required of physicians2,3 to include “interprofessional collaboration,” reflecting broad acknowledgment of the importance of teamwork across the medical education continuum. Further, teamwork is among the 13 core entrustable professional activities that medical students are expected to perform competently prior to entering residency.4 It is also included among the required graduate medical education milestones5 and among the American Board of Medical Specialties maintenance of certification standards.3

Research indicates that effective teamwork among health professionals may enhance safety, efficiency, and quality in health care.6–8 To achieve these outcomes in clinical practice, medical schools must provide rigorous evidence that their graduates can be trusted to function collaboratively within health care teams.4,9 Although many medical schools involve students in interprofessional and team-based learning activities, curricula frequently lack reliable and valid assessments of students’ teamwork competency.10,11 The absence of such evidence leaves residency programs, hospitals, and the public uncertain as to the preparedness of medical school graduates for working with teams during residency training12,13 and medical practice.

We therefore conducted a systematic review of teamwork assessment in undergraduate medical education (UME) to identify tools that medical school faculty and curriculum planners can use to assess the AAMC’s proposed interprofessional collaboration competencies. For each assessment tool we uncovered, we provided a synthesis of its characteristics, content, the settings where it is typically used, and evidence for its validity. We applied predefined criteria to the tools in our synthesis to identify specific tools that are best suited to assess the AAMC’s teamwork competencies,1 and we have included recommendations for applying these tools within UME.

Method

We previously conducted a systematic review of tools used to assess teamwork in internal medicine.14 The primary goal of this prior review was to examine outcomes associated with teamwork assessments, particularly patient outcomes, within the field of internal medicine. Following publication of the AAMC’s new taxonomy for competency domains that emphasizes interprofessional collaboration,1 we updated and expanded our prior search strategy to identify teamwork assessment tools used in UME. We aimed to identify tools that could be adopted by medical schools to assess the new teamwork competencies. This current review differs from our prior review14 in terms of the primary aim, the inclusion criteria, the learner group, and the target audience. This review focuses on medical students and includes all teamwork assessment tools used in UME. We intend for the results and recommendations to help medical school faculty and curriculum developers select tools to assess teamwork among medical students. In contrast, the prior review14 focused on tool outcomes, was limited to tools used in the field of internal medicine, and did not include a description of tools meeting the new AAMC competency standards for use by medical schools.

We have reported our results according to relevant sections of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.15 The Mayo Clinic institutional review board exempted this review.

Data sources and search strategy

We searched MEDLINE, MEDLINE In-Process, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO for English-language studies from January 1, 1979, through April 1, 2014. We have previously published details of the search strategy14 and thus only briefly summarize it here. We have provided the full MEDLINE search strategy in Supplemental Digital Appendix 1, http://links.lww.com/ACADMED/A323. With the help of a research librarian, we combined an extensive list of search terms, medical subject headings (MeSH), and keywords related to teamwork and collaboration with terms pertaining to measurement (e.g., instrument, assessment) and terms pertaining to UME (e.g., student, interprofessional, multidisciplinary). To identify unpublished studies, we searched abstracts dating from 2010 through 2014 from national meetings of the AAMC, the Association for Medical Education in Europe, the American College of Surgeons, the Society of General Internal Medicine, and the International Meeting on Simulation in Healthcare. We also searched the reference lists of included articles for additional citations.

Article selection

We included original research studies describing a quantitative tool measuring teamwork in UME. We included studies of all medical and surgical specialties and studies in all educational settings (i.e., classroom, simulation, clinical). We included studies of collaboration among interprofessional teams, including assessments of interprofessional education—as long as medical students were participants in the teams. We excluded studies of interprofessional education that did not involve medical students because the aim of this review was to provide medical school faculty and curriculum planners with a summary of instruments that can be used to address teamwork competencies among medical students.

Data extraction and synthesis

We entered data from the reports into a structured abstraction form. We abstracted the following:

  • setting (i.e., classroom, simulation, or clinical) and country,
  • level of medical students assessed (i.e., preclinical, clinical),
  • professions of other team members (e.g., nursing, physical or occupational therapy, social work),
  • instrument structure and content (i.e., assessment of individual or whole team, dimensions of teamwork measured, number of items), and
  • elements of study quality (design, number of institutions involved, types of outcomes, and validity evidence).

Five of us (R.D.H., M.T.W., N.I.C., D.R.N., and D.A.R.) reviewed all of the articles. Each article was independently reviewed by one of us; we resolved any uncertainties regarding data extraction through consensus. Next, someone other than the initial reviewer abstracted the data again from 25% of articles. For these 25%, we calculated interrater agreement using an intra-class correlation coefficient (ICC).
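As an illustrative aside (this is not the authors' analysis code), the interrater agreement statistic used here can be sketched in a few lines: a two-way, absolute-agreement, single-rater intraclass correlation, ICC(2,1), computed from a subjects × raters matrix. The rating values below are hypothetical.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects ICC for absolute agreement, ICC(2,1),
    from an (n subjects x k raters) matrix of scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)  # per-subject means
    col_means = x.mean(axis=0)  # per-rater means

    # Two-way ANOVA sums of squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((x - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: two raters abstracting ten articles,
# disagreeing on only one item, yields a high ICC.
rater1 = [4, 5, 3, 4, 2, 5, 3, 4, 5, 2]
rater2 = [4, 5, 3, 4, 2, 5, 3, 4, 5, 3]
print(round(icc_2_1(np.column_stack([rater1, rater2])), 2))  # → 0.96
```

In practice the computation would be done with a validated statistical package rather than by hand; the sketch is only meant to show what the reported coefficient measures.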

We categorized the published validity evidence for each instrument using an established framework16–18 that has been used in similar systematic reviews in medical education.19 We categorized the outcomes of the included studies according to the Kirkpatrick20 hierarchy: satisfaction and/or opinion (Level 1), knowledge and skills (Level 2), behaviors (Level 3), and patient outcomes (Level 4).

We evaluated the methodological quality of the included studies, using criteria from the Medical Education Research Study Quality Instrument (MERSQI),21 which has established validity evidence for content, interrater agreement, intrarater agreement, internal consistency reliability, and relation to other variables.21,22 We categorized assessment tools within an evidence table according to the educational setting (i.e., classroom, simulation, or clinical) in which the tools were applied.

Next, we reviewed all the assessment tools to identify those best suited to address the AAMC’s proposed interprofessional teamwork competencies. To select these tools, we defined a priori the following selection criteria based on established principles for evaluating the construct validity of psychometric instruments18:

  1. Content validity or concordance between content of the assessment tool items and the AAMC competency language. Content validity reflects the degree to which assessment items represent the construct being measured.18
  2. Strength of the published validity evidence for the assessment tool. We derived the evidence for validity from five sources: content, response process, internal structure, relation to other variables, and outcomes.18 We determined whether or not each teamwork assessment tool has demonstrated validity evidence from each of these sources. We considered tools with a greater number of sources of published validity evidence to have stronger evidence of validity.
  3. Generalizability of scores from the assessment tool, based on published evidence. We examined the number of institutions and settings in which each teamwork tool has been applied within UME to ascertain the generalizability of assessment tool scores.
  4. Level of outcomes assessed using the tool, according to Kirkpatrick’s hierarchy.20 Outcomes associated with assessment scores are one measure of “consequences validity.”16,18 We categorized educational outcomes according to a modified version of the Kirkpatrick hierarchy (see above) that has been used in prior systematic reviews.19,21

Two of us (R.D.H. and D.A.R.) applied the above criteria to the included tools, resolved any disagreements through iterative discussion, and determined final tool selection by consensus.

Results

Of 13,549 citations, 70 articles describing 64 teamwork assessment tools met all inclusion criteria and were included in the review (see Figure 1; Appendix 1).23–92 Interrater agreement for data extraction was very good (ICC = 0.80; 95% confidence interval: 0.54–0.92).

Figure 1: Flow diagram illustrating exclusion and inclusion of articles (published from January 1, 1979, through April 1, 2014) for a review of studies that described a quantitative tool used to assess teamwork in undergraduate medical education.

Setting

Of the 70 included studies, 39 (56%) were conducted in the United States, 15 (21%) in Europe, 11 (16%) in Canada, 5 (7%) in Australia,27,38,50,84,88 and 1 (1%) in the United Arab Emirates.48 Preclinical medical students were evaluated in 22 studies (31%) and clinical medical students in 30 (43%); 18 studies (26%) did not specify the level of medical student. Of the 64 assessment tools, 47 (73%) assessed medical student teamwork in interprofessional teams (as opposed to individuals’ behaviors in or attitudes about the team).

Methodology and patient outcomes

A minority of the 70 studies (n = 18 [26%]) included multi-institutional samples. The most frequently employed study design was single-group pre- and posttest (n = 29 [41%]), followed by single-group cross-sectional design (n = 22 [31%]). Three (4%) studies64,78,81 used a randomized two-group experimental design.

No studies examined patient outcomes in association with teamwork assessment.

Teamwork assessments in classroom, simulation, and clinical settings

Appendix 1 shows each of the 70 studies describing the 64 teamwork assessment tools, categorized by the setting in which the assessment tools were applied (classroom, simulation, clinical). Of the 64 tools, 27 (42%) were used in the classroom setting. Of these tools, 21 (78%) measured students’ attitudes and 2 (7%) measured knowledge.29,30,53 Only 1 study (4%) assessed teamwork behaviors in the classroom.63

Simulation was the setting for 31 (48%) of the 64 teamwork assessment tools. Types of simulation included role-play, standardized patients, and technology-assisted simulation (e.g., simulation using a mannequin). The majority (21; 68%) of these simulation-based assessment tools required direct observation of medical students’ teamwork skills. These skills included crew resource management (adapted from the airline industry, referring to skills necessary for effective teamwork in crisis situations) and nontechnical skills (e.g., leadership, communication, task management, situational awareness). Eleven (35%) of the tools used in simulation settings measured attitudes.

Only 7 (11%) of the 64 tools measured teamwork among medical students within clinical settings. Five of these tools were applied in the inpatient setting,86,88,89,91,92 1 in the outpatient setting,87 and 1 in the emergency department.90 Assessment of attitudes in the clinical setting was measured by 6 (86%) tools,86,87,89–92 while only 1 tool, the Teamwork Mini-Clinical Evaluation Exercise (T-MEX), measured behavior.88 Six tools (86%) measured interdisciplinary teams in the clinical setting.87–92 Team members included nurses, physical therapists, chaplains, social workers, and other allied health staff.

Addressing the AAMC’s proposed interprofessional collaboration competencies

Within the common taxonomy of competency domains proposed by the AAMC is the domain of interprofessional collaboration: “Demonstrate the ability to engage in an interprofessional team in a manner that optimizes safe effective patient and population-centered care.”1 The AAMC delineated four competencies to further define this domain. We applied four specific criteria (described in Method) to all 64 teamwork assessment tools to identify a single tool that best addressed each proposed competency (Table 1).

Table 1: Recommended Tools for Assessing Each of the Interprofessional Collaboration Competencies Proposed by the Association of American Medical Colleges

Team climate and mutual respect.

The first interprofessional competency defined by the AAMC emphasizes team climate and mutual respect1 (Table 1). The Collaborative Healthcare Interdisciplinary Relationship Planning (CHIRP) Scale is an attitudinal scale of interdisciplinary teamwork that assesses interdependence, recognition, empathy, sharing, dominance, organizational climate, and respect.29–31 CHIRP is an appropriate tool for assessing this particular competency because it specifically measures interdisciplinary team climate and mutual respect. Validity evidence for CHIRP includes content, internal structure, and relationships to other variables.31 Its use, however, has been limited to attitudinal assessments in classroom learning.

Roles of team members.

The second AAMC competency pertains to understanding the roles of oneself and others within interdisciplinary teams (Table 1). The Readiness for Interprofessional Learning Scale (RIPLS) is a widely published tool23,27,31,37,38,45–52 that measures students’ attitudes toward interprofessional learning and teamwork, including specifically attitudes toward the roles and responsibilities of various team members in the health care team. It consists of 19 statements of beliefs regarding the benefit of interprofessional learning. RIPLS comprises three subscales—(1) teamwork and collaboration, (2) professional identity, and (3) roles and responsibilities—all scored on a five-point Likert scale.52 This instrument has been largely used to measure attitudes (Kirkpatrick Level 1) among preclinical medical and other health care students23,27,31,38,45–47,49,52; however, it has also been applied to pain medicine fellows93 as well as practicing physicians and allied health staff.94 Generalizability and feasibility of RIPLS scores are well established on the basis of analyses conducted in multiple institutions across four continents.27,38,46–48,50–52 Validity evidence includes content, factor analysis, internal structure, and relationships to other variables.31,37,48,49,52 Studies using the RIPLS have shown differences in attitudes among students in different professional groups (i.e., medicine, nursing, pharmacy, dentistry, physical/occupational therapy).27,37,45–48,52,95 RIPLS scores have also been shown to correlate with CHIRP scores31 and Professional Identity Scale scores.47 However, to our knowledge, no published studies have reported relationships between RIPLS scores and the ability of health care teams to “address the health care needs of the patients and populations served,” as called for by Englander and colleagues1 in this second interprofessional competency. Ideally, future studies should examine such patient outcomes resulting from effective teamwork.

Communication.

We recommend the Communication and Teamwork Skills (CATS) tool66,96 to assess the third AAMC interprofessional collaboration competency, which focuses on responsive communication among interprofessional teams1 (Table 1). An important strength of CATS is that it requires direct observation of students’ teamwork skills (Kirkpatrick Level 3), as opposed to students’ self-assessments of teamwork, which characterize the majority of assessments in this review. CATS is completed by trained observers who assess and weight 18 teamwork behaviors in four areas: (1) communication, (2) coordination, (3) cooperation, and (4) situational awareness.66,96 The initial validity study of CATS was conducted among actual clinical teams in the operating room and during interdisciplinary hospital rounds.96 A subsequent study has used CATS in simulated environments among medical, nursing, and physical therapy students, so it has reasonable generalizability.66 Although the use of trained observers enhances the validity of CATS scores, the costs involved in training observers may limit the feasibility of CATS in certain settings.

Active participation and patient-centered care.

Lastly, we suggest the T-MEX88,97 for assessing the fourth AAMC competency, which requires that students actively participate in interprofessional teams to provide person-centered and population-centered care1 (Table 1). We recommend T-MEX to assess this particular competency because T-MEX is the only tool identified in this review that measures actual teamwork behaviors (Kirkpatrick Level 3) among medical students in real-world clinical settings.88 T-MEX involves direct observation of six collaborative behaviors within three workplace domains: (1) supportive team relationships, (2) self-awareness and responsibility, and (3) safe communication. In one study, Olupeliyawa and colleagues88 showed that T-MEX scores have an acceptable reproducibility index after 8 observations, and these authors suggest that interdisciplinary observers can use T-MEX without significant rater training. The mean time for T-MEX observation was 11 minutes, and the mean time for sharing feedback was 8 minutes. Further, among 88 observations, 81% of the encounters and 74% of the feedback exchanges were completed in 5 to 15 minutes.88 All of these findings suggest reasonable feasibility in a clinical setting88; however, an important limitation of T-MEX is the paucity of studies evaluating its use in multiple institutions. Further research is needed to evaluate the generalizability of T-MEX and establish additional validity evidence.

Discussion

This review provides a synthesis of teamwork assessment tools reported in the medical literature. It includes specific recommendations for addressing the interprofessional competencies within the newly proposed common taxonomy framework1 and can therefore serve as a resource for medical schools whose leaders and faculty hope to fulfill these new AAMC competencies. Many strategies for assessing teamwork exist and are available to educators looking to address these important competencies. Although prior reviews have summarized the effectiveness of interprofessional education curricula10,34—and our own previous review examined assessment tools within the field of internal medicine14—this is, to our knowledge, the first comprehensive review of teamwork assessment tools within UME.

The interprofessional collaboration competencies defined by the AAMC call for students to demonstrate collaboration in interprofessional teams so as to provide patient and population-centered care.1 Fully assessing these two components of the competencies requires observation of live interactions among students and interprofessional teams, and it suggests that consideration be given to the outcomes of care for individual patients and populations. Yet, this review indicates that, to date, attitudinal assessments of teamwork predominate, very few teamwork assessments have involved direct observation of students’ teamwork behaviors in actual clinical settings, and none have assessed patient outcomes associated with measures of teamwork. We believe, therefore, on the basis of the current body of published studies, that a gap exists between the level of evidence required to fulfill the new interprofessional competencies and the toolkit for competency assessment that currently exists.

The results of this review suggest that medical schools can address this gap by focusing efforts in three areas. First, schools need to assess students’ teamwork in real-world clinical teams. In this review, we found that only seven tools (11%) measured teamwork among medical students within real-world clinical settings. Yet, in the modern UME structure, early clinical exposure is commonplace. The majority of students work with clinical teams beginning in the first year, a practice that provides rich opportunities for teamwork assessment. Second, medical schools should initiate teamwork assessments at the start of training so that students can receive longitudinal feedback on their teamwork. Among the 70 articles in this review, just 22 (31%) examined teamwork at the beginning of training (i.e., among preclinical students). To be most helpful, teamwork assessments, including those that occur early in training, should involve direct observation of students’ teamwork behaviors. The CATS66,96 and T-MEX88,97 are two direct observation tools for which validity evidence has been established within UME. Additionally, tools such as the Observational Teamwork Assessment for Surgery98–104 and Non-technical Skills for Surgeons,105–111 both of which have been used among residents and practicing physicians, could be adapted to UME but would require validity studies within the UME setting. Third, medical schools should maximize students’ involvement in interprofessional teams. In this review, 47 (73%) tools assessed medical student teamwork in interprofessional teams. Interprofessional education is increasing within medical education,10,11,112,113 yet it is not enough to simply learn side by side; students must actively engage with other health professionals in the workplace to obtain meaningful assessments of interprofessional teamwork behaviors and outcomes.

We note several limitations to this review. First, although we used a broad search strategy of multiple databases and attempted to capture unpublished work by reviewing abstracts from scientific meetings, we possibly failed to identify some relevant articles. We also recognize that despite our best efforts, publication bias is a limitation inherent in systematic reviews,114 so poor performance of the assessment tools may be underreported. Second, we used standard frameworks to summarize validity evidence16–18 and study quality21,22; however, these frameworks do not include every aspect of validity and methodological quality. Finally, we applied specific criteria to select tools that are best suited to address the AAMC competencies, yet we acknowledge some subjectivity in this selection. Furthermore, each of the AAMC competencies includes multiple components, making the identification of a single assessment tool that aligns with all elements of each competency difficult. Medical school faculty and curriculum developers may choose to use more than one assessment tool to completely address each of these competencies.

In conclusion, this review provides a resource for medical schools to identify teamwork assessment tools that they can use to assess the new interprofessional collaboration competencies proposed by the AAMC. As shown in this review, numerous tools (n = 64) have been used to assess teamwork in UME, and substantial validity evidence has been demonstrated for many of them. To strengthen this body of evidence, future research should be directed toward validity studies of assessment instruments, and these studies should include direct observation of medical students working in interprofessional teams in real-world clinical environments, as well as evaluations of teamwork effectiveness on patient outcomes.

References

1. Englander R, Cameron T, Ballard AJ, Dodge J, Bull J, Aschenbrener CA. Toward a common taxonomy of competency domains for the health professions and competencies for physicians. Acad Med. 2013;88:10881094.
2. Accreditation Council for Graduate Medical Education. ACGME common program requirements. Revised June 9, 2013; effective July 1, 2013. https://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/CPRs2013.pdf. Accessed October 9, 2015.
3. American Board of Medical Specialties. Based on core competencies. 2015. http://www.abms.org/board-certification/a-trusted-credential/based-on-core-competencies/. Accessed October 9, 2015.
4. Association of American Medical Colleges. Core Entrustable Professional Activities for Entering Residency: Curriculum Developers’ Guide. 2014. Washington, DC: Association of American Medical Colleges; https://members.aamc.org/eweb/upload/Core%20EPA%20Curriculum%20Dev%20Guide.pdf. Accessed October 9, 2015.
5. Accreditation Council for Graduate Medical Education. Milestones. 2008–2015. http://acgme.org/acgmeweb/tabid/430/ProgramandInstitutionalAccreditation/NextAccreditationSystem/Milestones.aspx. Accessed October 9, 2015.
6. Manser T. Teamwork and patient safety in dynamic domains of healthcare: A review of the literature. Acta Anaesthesiol Scand. 2009;53:143151.
7. Mazzocco K, Petitti DB, Fong KT, et al. Surgical team behaviors and patient outcomes. Am J Surg. 2009;197:678685.
8. Mardon RE, Khanna K, Sorra J, Dyer N, Famolaro T. Exploring relationships between hospital patient safety culture and adverse events. J Patient Saf. 2010;6:226232.
9. Liaision Committee of Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the MD Degree. 2014. http://www.lcme.org/publications.htm. Accessed October 9, 2015.
10. Reeves S, Perrier L, Goldman J, Freeth D, Zwarenstein M. Interprofessional education: Effects on professional practice and healthcare outcomes (update). Cochrane Database Syst Rev. 2013;3:CD002213.
11. Gilbert JH, Yan J, Hoffman SJ. A WHO report: Framework for action on interprofessional education and collaborative practice. J Allied Health. 2010;39(suppl 1):196197.
12. Chen PW. Are med school grads prepared to practice medicine? Doctor and Patient [blog]. 2014. http://well.blogs.nytimes.com/2014/04/24/are-med-school-grads-prepared-to-practice-medicine/?_php=true&_type=blogs&_php=true&_type=blogs&_r=1. Accessed October 9, 2015.
13. Angus S, Vu TR, Halvorsen AJ, et al. What skills should new internal medicine interns have in july? A national survey of internal medicine residency program directors. Acad Med. 2014;89:432435.
14. Havyer RD, Wingo MT, Comfere NI, et al. Teamwork assessment in internal medicine: A systematic review of validity evidence and outcomes. J Gen Intern Med. 2014;29:894910.
15. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann Intern Med. 2009;151:264269, W64.
16. Downing SM. Validity: On meaningful interpretation of assessment data. Med Educ. 2003;37:830837.
17. Messick S. Standards of validity and the validity of standards in performance assessment. Educ Meas Issues Pract. 1995;14:58.
18. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: Theory and application. Am J Med. 2006;119:166.e7166.16.
19. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA. 2009;302:13161326.
20. Kirkpatrick D. Craig R, Bittel I. Evaluation of training.Training and Development Handbook. 1976.New York, NY: McGraw-Hill.
21. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:10021009.
22. Reed DA, Beckman TJ, Wright SM, Levine RB, Kern DE, Cook DA. Predictive validity evidence for medical education research study quality instrument scores: Quality of submissions to JGIM’s medical education special issue. J Gen Intern Med. 2008;23:903907.
23. Curran VR, Sharpe D, Flynn K, Button P. A longitudinal study of the effect of an interprofessional education curriculum on student satisfaction and attitudes towards interprofessional teamwork and education. J Interprof Care. 2010;24:4152.
24. Curran V, Heath O, Adey T, et al. An approach to integrating interprofessional education in collaborative mental health care. Acad Psychiatry. 2012;36:9195.
25. Heinemann GD, Schmitt MH, Farrell MP, Brallier SA. Development of an attitudes toward health care teams scale. Eval Health Prof. 1999;22:123142.
26. Lennon-Dearing R, Lowry LW, Ross CW, Dyer AR. An interprofessional course in bioethics: Training for real-world dilemmas. J Interprof Care. 2009;23:574585.
27. Saini B, Shah S, Kearey P, Bosnic-Anticevich S, Grootjans J, Armour C. An interprofessional learning module on asthma health promotion. Am J Pharm Educ. 2011;75:30.
28. Wamsley M, Staves J, Kroon L, et al. The impact of an interprofessional standardized patient exercise on attitudes toward working in interprofessional teams. J Interprof Care. 2012;26:2835.
29. Robertson B, Kaplan B, Atallah H, Higgins M, Lewitt MJ, Ander DS. The use of simulation and a modified TeamSTEPPS curriculum for medical and nursing student team training. Simul Healthc. 2010;5:332337.
30. Hobgood C, Sherwood G, Frush K, et al.; Interprofessional Patient Safety Education Collaborative. Teamwork training with nursing and medical students: Does the method matter? Results of an interinstitutional, interdisciplinary collaboration. Qual Saf Health Care. 2010;19:e25.
31. Hollar D, Hobgood C, Foster B, Aleman M, Sawning S. Concurrent validation of CHIRP, a new instrument for measuring healthcare student attitudes towards interdisciplinary teamwork. J Appl Meas. 2012;13:360–375.
32. Slack MK, Coyle RA, Draugalis JR. An evaluation of instruments used to assess the impact of interdisciplinary training on health professions students. Issues Interdiscip Care. 2001;3:59–67.
33. Basran JF, Dal Bello-Haas V, Walker D, et al. The longitudinal elderly person shadowing program: Outcomes from an interprofessional senior partner mentoring program. Gerontol Geriatr Educ. 2012;33:302–323.
34. Cameron A, Rennie S, DiProspero L, et al. An introduction to teamwork: Findings from an evaluation of an interprofessional education experience for 1000 first-year health science students. J Allied Health. 2009;38:220–226.
35. Giordano C, Umland E, Lyons KJ. Attitudes of faculty and students in medicine and the health professions toward interprofessional education. J Allied Health. 2012;41:21–25.
36. Hawk C, Buckwalter K, Byrd L, Cigelman S, Dorfman L, Ferguson K. Health professions students’ perceptions of interprofessional relationships. Acad Med. 2002;77:354–357.
37. Margalit R, Thompson S, Visovsky C, et al. From professional silos to interprofessional education: Campuswide focus on quality of care. Qual Manag Health Care. 2009;18:165–173.
38. Neville CC, Petro R, Mitchell GK, Brady S. Team decision making: Design, implementation and evaluation of an interprofessional education activity for undergraduate health science students. J Interprof Care. 2013;27:523–525.
39. Hoffman J, Redman-Bentley D. Comparison of faculty and student attitudes toward teamwork and collaboration in interprofessional education. J Interprof Care. 2012;26:66–68.
40. Curran V, Hollett A, Casimiro LM, et al. Development and validation of the interprofessional collaborator assessment rubric (ICAR). J Interprof Care. 2011;25:339–344.
41. King G, Shaw L, Orchard CA, Miller S. The interprofessional socialization and valuing scale: A tool for evaluating the shift toward collaborative care approaches in health care settings. Work. 2010;35:77–85.
42. Ardahan M, Akçasu B, Engin E. Professional collaboration in students of medicine faculty and school of nursing. Nurse Educ Today. 2010;30:350–354.
43. Hojat M, Spandorfer J, Isenberg GA, Vergare MJ, Fassihi R, Gonnella JS. Psychometrics of the scale of attitudes toward physician–pharmacist collaboration: A study with medical students. Med Teach. 2012;34:e833–e837.
44. Mazur H, Beeston JJ, Yerxa EJ. Clinical interdisciplinary health team care: An educational experiment. J Med Educ. 1979;54:703–713.
45. Atack L, Parker K, Rocchi M, Maher J, Dryden T. The impact of an online interprofessional course in disaster management competency and attitude towards interprofessional learning. J Interprof Care. 2009;23:586–598.
46. Bradley P, Cooper S, Duncan F. A mixed-methods study of interprofessional learning of resuscitation skills. Med Educ. 2009;43:912–922.
47. Coster S, Norman I, Murrells T, et al. Interprofessional attitudes amongst undergraduate students in the health professions: A longitudinal questionnaire survey. Int J Nurs Stud. 2008;45:1667–1681.
48. El-Zubeir M, Rizk DE, Al-Khalil RK. Are senior UAE medical and nursing students ready for interprofessional learning? Validating the RIPL scale in a Middle Eastern context. J Interprof Care. 2006;20:619–632.
49. Hamilton SS, Yuan BJ, Lachman N, et al. Interprofessional education in gross anatomy: Experience with first-year medical and physical therapy students at Mayo Clinic. Anat Sci Educ. 2008;1:258–263.
50. Hood K, Cant R, Baulch J, et al. Prior experience of interprofessional learning enhances undergraduate nursing and healthcare students’ professional identity and attitudes to teamwork. Nurse Educ Pract. 2014;14:117–122.
51. Joseph S, Diack L, Garton F, Haxton J. Interprofessional education in practice. Clin Teach. 2012;9:27–31.
52. Parsell G, Bligh J. The development of a questionnaire to assess the readiness of health care students for interprofessional learning (RIPLS). Med Educ. 1999;33:95–100.
53. Meier AH, Boehler ML, McDowell CM, et al. A surgical simulation curriculum for senior medical students based on TeamSTEPPS. Arch Surg. 2012;147:761–766.
54. Van Winkle LJ, Bjork BC, Chandar N, et al. Interprofessional workshop to improve mutual understanding between pharmacy and medical students. Am J Pharm Educ. 2012;76:150.
55. Thompson BM, Levine RE, Kennedy F, et al. Evaluating the quality of learning-team processes in medical education: Development and validation of a new measure. Acad Med. 2009;84(10 suppl):S124–S127.
56. Curran VR, Mugford JG, Law RM, MacDonald S. Influence of an interprofessional HIV/AIDS education program on role perception, attitudes and teamwork skills of undergraduate health sciences students. Educ Health (Abingdon). 2005;18:32–44.
57. Warrier KS, Schiller JH, Frei NR, Haftel HM, Christner JG. Long-term gain after team-based learning experience in a pediatric clerkship. Teach Learn Med. 2013;25:300–305.
58. Cox KR, Scott SD, Hall LW, Aud MA, Headrick LA, Madsen R. Uncovering differences among health professions trainees exposed to an interprofessional patient safety curriculum. Qual Manag Health Care. 2009;18:182–193.
59. Hope JM, Lugassy D, Meyer R, et al. Bringing interdisciplinary and multicultural team building to health care education: The downstate team-building initiative. Acad Med. 2005;80:74–83.
60. MacDonnell CP, Rege SV, Misto K, Dollase R, George P. An introductory interprofessional exercise for healthcare students. Am J Pharm Educ. 2012;76:154.
61. Morison S, Jenkins J. Sustained effects of interprofessional shared learning on student attitudes to communication and team working depend on shared learning opportunities on clinical placement as well as in the classroom. Med Teach. 2007;29:464–470.
62. Vasan NS, DeFouw DO, Compton S. A survey of student perceptions of team-based learning in anatomy curriculum: Favorable views unrelated to grades. Anat Sci Educ. 2009;2:150–155.
63. Wilson AR, Fabri PJ, Wolfson J. Human error and patient safety: Interdisciplinary course. Teach Learn Med. 2012;24:18–25.
64. Jankouskas TS, Haidet KK, Hupcey JE, Kolanowski A, Murray WB. Targeted crisis resource management training improves performance among randomized nursing and medical students. Simul Healthc. 2011;6:316–326.
65. Wright MC, Phillips-Bute BG, Petrusa ER, Griffin KL, Hobbs GW, Taekman JM. Assessing teamwork in medical education and practice: Relating behavioural teamwork ratings and clinical performance. Med Teach. 2009;31:30–38.
66. Garbee DD, Paige J, Barrier K, et al. Interprofessional teamwork among students in simulated codes: A quasi-experimental study. Nurs Educ Perspect. 2013;34:339–344.
67. Wallin CJ, Meurling L, Hedman L, Hedegård J, Felländer-Tsai L. Target-focused medical emergency team training using a human patient simulator: Effects on behaviour and attitude. Med Educ. 2007;41:173–180.
68. Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc. 2008;3:146–153.
69. Dillon PM, Noble KA, Kaplan L. Simulation as a means to foster collaborative interdisciplinary education. Nurs Educ Perspect. 2009;30:87–90.
70. Sigalet E, Donnon T, Grant V. Undergraduate students’ perceptions of and attitudes toward a simulation-based interprofessional curriculum: The KidSIM ATTITUDES questionnaire. Simul Healthc. 2012;7:353–358.
71. Sigalet E, Donnon T, Cheng A, et al. Development of a team performance scale to assess undergraduate health professionals. Acad Med. 2013;88:989–996.
72. Carlson J, Min E, Bridges D. The impact of leadership and team behavior on standard of care delivered during human patient simulation: A pilot study for undergraduate medical students. Teach Learn Med. 2009;21:24–32.
73. Hall P, Marshall D, Weaver L, Boyle A, Taniguchi A. A method to enhance student teams in palliative care: Piloting the McMaster–Ottawa team observed structured clinical encounter. J Palliat Med. 2011;14:744–750.
74. Paige JT, Garbee DD, Kozmenko V, et al. Getting a head start: High-fidelity, simulation-based operating room team training of interprofessional students. J Am Coll Surg. 2014;218:140–149.
75. Hänsel M, Winkelmann AM, Hardt F, et al. Impact of simulator training and crew resource management training on final-year medical students’ performance in sepsis resuscitation: A randomized trial. Minerva Anestesiol. 2012;78:901–909.
76. Posmontier B, Montgomery K, Smith Glasgow ME, Montgomery OC, Morse K. Transdisciplinary teamwork simulation in obstetrics–gynecology health care education. J Nurs Educ. 2012;51:176–179.
77. Fernandez Castelao E, Russo SG, Cremer S, et al. Positive impact of crisis resource management training on no-flow time and team member verbalisations during simulated cardiopulmonary resuscitation: A randomised controlled trial. Resuscitation. 2011;82:1338–1343.
78. Fernandez R, Pearce M, Grand JA, et al. Evaluation of a computer-based educational intervention to improve medical teamwork and performance during simulated patient resuscitations. Crit Care Med. 2013;41:2551–2562.
79. Meurling L, Hedman L, Felländer-Tsai L, Wallin CJ. Leaders’ and followers’ individual experiences during the early phase of simulation-based team training: An exploratory study. BMJ Qual Saf. 2013;22:459–467.
80. Mueller G, Hunt B, Wall V, et al. Intensive skills week for military medical students increases technical proficiency, confidence, and skills to minimize negative stress. J Spec Oper Med. 2012;12:45–53.
81. Reising DL, Carr DE, Shea RA, King JM. Comparison of communication outcomes in traditional versus simulation strategies in nursing and medical students. Nurs Educ Perspect. 2011;32:323–327.
82. Stewart M, Kennedy N, Cuene-Grandidier H. Undergraduate interprofessional education using high-fidelity paediatric simulation. Clin Teach. 2010;7:90–96.
83. Tofil NM, Morris JL, Peterson DT, et al. Interprofessional simulation training improves knowledge and teamwork in nursing and medical students during internal medicine clerkship. J Hosp Med. 2014;9:189–192.
84. Whelan JJ, Spencer JF, Rooney K. A “RIPPER” project: Advancing rural inter-professional health education at the University of Tasmania. Rural Remote Health. 2008;8:1017.
85. Zheng B, Denk PM, Martinec DV, Gatta P, Whiteford MH, Swanström LL. Building an efficient surgical team using a bench model simulation: Construct validity of the legacy inanimate system for endoscopic team training (LISETT). Surg Endosc. 2008;22:930–937.
86. Faulk CE, Mali J, Mendoza PM, Musick D, Sembrano R. Impact of a required fourth-year medical student rotation in physical medicine and rehabilitation. Am J Phys Med Rehabil. 2012;91:442–448.
87. Wittenberg-Lyles E, Parker Oliver D, Demiris G, Regehr K. Interdisciplinary collaboration in hospice team meetings. J Interprof Care. 2010;24:264–273.
88. Olupeliyawa AM, O’Sullivan AJ, Hughes C, Balasooriya CD. The teamwork mini-clinical evaluation exercise (T-MEX): A workplace-based assessment focusing on collaborative competencies in health care. Acad Med. 2014;89:359–365.
89. Dando N, d’Avray L, Colman J, Hoy A, Todd J. Evaluation of an interprofessional practice placement in a UK in-patient palliative care unit. Palliat Med. 2012;26:178–184.
90. Ericson A, Masiello I, Bolinder G. Interprofessional clinical training for undergraduate students in an emergency department setting. J Interprof Care. 2012;26:319–325.
91. Nadolski GJ, Bell MA, Brewer BB, Frankel RM, Cushing HE, Brokaw JJ. Evaluating the quality of interaction between medical students and nurses in a large teaching hospital. BMC Med Educ. 2006;6:23.
92. Nørgaard B, Draborg E, Vestergaard E, Odgaard E, Jensen DC, Sørensen J. Interprofessional clinical training improves self-efficacy of health care students. Med Teach. 2013;35:e1235–e1242.
93. Novy D, Hamid B, Driver L, et al. Preliminary evaluation of an educational model for promoting positive team attitudes and functioning among pain medicine fellows. Pain Med. 2010;11:841–846.
94. Braithwaite J, Westbrook M, Nugus P, et al. A four-year, systems-wide intervention promoting interprofessional collaboration. BMC Health Serv Res. 2012;12:99.
95. Lê Q, Spencer J, Whelan J. Development of a tool to evaluate health science students’ experiences of an interprofessional education (IPE) programme. Ann Acad Med Singapore. 2008;37:1027–1033.
96. Frankel A, Gardner R, Maynard L, Kelly A. Using the communication and teamwork skills (CATS) assessment to measure health care team performance. Jt Comm J Qual Patient Saf. 2007;33:549–558.
97. Olupeliyawa A, Balasooriya C, Hughes C, O’Sullivan A. Educational impact of an assessment of medical students’ collaboration in health care teams. Med Educ. 2014;48:146–156.
98. Undre S, Sevdalis N, Healey AN, Darzi A, Vincent CA. Observational teamwork assessment for surgery (OTAS): Refinement and application in urological surgery. World J Surg. 2007;31:1373–1381.
99. Undre S, Healey AN, Darzi A, Vincent CA. Observational assessment of surgical teamwork: A feasibility study. World J Surg. 2006;30:1774–1783.
100. Sevdalis N, Lyons M, Healey AN, Undre S, Darzi A, Vincent CA. Observational teamwork assessment for surgery: Construct validation with expert versus novice raters. Ann Surg. 2009;249:1047–1051.
101. Hull L, Arora S, Kassab E, Kneebone R, Sevdalis N. Observational teamwork assessment for surgery: Content validation and tool refinement. J Am Coll Surg. 2011;212:234–243.e1.
102. Healey AN, Undre S, Vincent CA. Developing observational measures of performance in surgical teams. Qual Saf Health Care. 2004;13(suppl 1):i33–i40.
103. Hull L, Arora S, Kassab E, Kneebone R, Sevdalis N. Assessment of stress and teamwork in the operating room: An exploratory study. Am J Surg. 2011;201:24–30.
104. Russ S, Hull L, Rout S, Vincent C, Darzi A, Sevdalis N. Observational teamwork assessment for surgery: Feasibility of clinical and nonclinical assessor calibration with short-term training. Ann Surg. 2012;255:804–809.
105. Yule S, Rowley D, Flin R, et al. Experience matters: Comparing novice and expert ratings of non-technical skills using the NOTSS system. ANZ J Surg. 2009;79:154–160.
106. Beard JD, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: A prospective observational study of the methodology. Health Technol Assess. 2011;15:i–xxi, 1.
107. Arora S, Miskovic D, Hull L, et al. Self vs expert assessment of technical and non-technical skills in high fidelity simulation. Am J Surg. 2011;202:500–506.
108. Crossley J, Marriott J, Purdie H, Beard JD. Prospective observational study to evaluate NOTSS (non-technical skills for surgeons) for assessing trainees’ non-technical performance in the operating theatre. Br J Surg. 2011;98:1010–1020.
109. Yule S, Flin R, Maran N, Youngson G, Mitchell A, Rowley D. Debriefing surgeons on non-technical skills (NOTSS). Cogn Tech Work. 2008;10:265–274.
110. Lee JY, Mucksavage P, Canales C, McDougall EM, Lin S. High fidelity simulation based team training in urology: A preliminary interdisciplinary study of technical and nontechnical skills in laparoscopic complications management. J Urol. 2012;187:1385–1391.
111. Willaert W, Aggarwal R, Harvey K, et al.; European Virtual Reality Endovascular Research Team (EVEResT). Efficient implementation of patient-specific simulated rehearsal for the carotid artery stenting procedure: Part-task rehearsal. Eur J Vasc Endovasc Surg. 2011;42:158–166.
112. Reeves S, Zwarenstein M, Goldman J, et al. The effectiveness of interprofessional education: Key findings from a new systematic review. J Interprof Care. 2010;24:230–241.
113. Reeves S, Zwarenstein M, Goldman J, et al. Interprofessional education: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2008:CD002213.
114. Higgins JPT, Altman DG, Sterne JAC. Chapter 8: Assessing risk of bias in included studies. In: Higgins JPT, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0. Updated March 2011. www.cochrane-handbook.org. Accessed October 9, 2015.

Appendix 1

Tools Assessing Teamwork Among Medical Students, Published From January 1979 Through April 2014

Supplemental Digital Content

Copyright © 2016 by the Association of American Medical Colleges