In recent years, the social mission, which is focused on advancing social justice and health equity, has gained recognition as an important aspect of health professions education. However, social mission–dedicated education varies from institution to institution, and there is currently no established method to measure the intensity or quality of an institution’s commitment to these pedagogical and experiential activities. In this Perspective, we describe our experience creating a survey instrument to measure the social mission in dental, medical, and nursing schools in the United States, and we reflect on the implications of using this tool to deepen discussions around the social mission and strengthen teaching and role modeling of health equity.
The Social Mission in Health Professions Education
The social mission of a health professions school is the contribution of the school in its mission, programs, and the performance of its graduates, faculty, and leadership to enhancing health equity and to addressing the health disparities of the society in which it exists.1 The social mission includes the school’s programs that teach or role model diversity and inclusion, community engagement, health disparities reduction, and addressing the social determinants of health. Many schools use the social mission concept to define their educational and institutional commitments to health equity. To date, no widely adopted tool exists to measure the social mission in health professions education.
In today’s complex health care environment, measurement—including assessing the quality, impact, and outcome of any intervention—is an essential part of doing business. Just as health professions students take clinical skills exams with standardized patients and medical residents are assessed on the milestones of clinical and professional development, approaches to measure social mission–related values, programs, and activities in schools are needed. THEnet: Training for Health Equity Network has pioneered a framework for socially accountable health workforce education that offers a practical tool to help schools align their training of health workers with community needs.2 In 2012, the Association for Medical Education in Europe introduced the ASPIRE-to-Excellence Award, which recognizes excellence in several educational areas, one of which is social accountability.3 Since its inception, 4 U.S. medical schools have earned ASPIRE awards for social accountability.4
There is promising consistency in the measurement tools for these 2 programs, but there remains a need for a simple assessment tool that schools can use to evaluate their social mission programs and establish their relative strengths and weaknesses. In 2016, we undertook the development of such a tool for use by health professions schools to identify their level of engagement in social mission activities, track that level over time, and compare their progress with that of other institutions.
Developing a Social Mission Measurement Tool
The development of our social mission metrics survey proceeded in 3 phases: (1) instrument design, (2) field testing and survey revision, and (3) data analysis, each of which we describe below.
Instrument design
The initial goal of our work was to design a survey instrument that would provide a standardized approach to measuring the social mission at dental, medical, and nursing schools, disciplines chosen as representative of the broader range of health professions. Our study team included individuals with educational, clinical, and methodological experience as well as a survey scientist to provide technical guidance in survey design and analysis. In February 2016, we invited selected dental, medical, and nursing educators as well as representatives from national organizations, students, residents, and a public member to join a working advisory committee. The committee provided content perspective, strategic guidance, and feedback on the survey as it developed. See Supplemental Digital Appendix 1 at https://links.lww.com/ACADMED/A851 for the complete list of advisory committee members.
In early 2016, we also performed a literature review to identify preexisting indicators, criteria, and frameworks related to the social mission, and we interviewed key informants with expertise in health professions education, measurement, and the social mission. Based on this research and these discussions, we created an extensive list of programs, policies, functions, and activities at health professions schools that we judged to be social mission enhancing. We then sorted these activities into domains or major areas of function common to all health professions schools. The 6 domains we identified were: (1) educational program, (2) community engagement, (3) governance, (4) diversity and inclusion, (5) institutional culture and climate, and (6) research.
We sent the candidate domains and activities to the advisory committee members in advance of an in-person meeting held in May 2016. Over 2 rounds of a modified Delphi process, the advisory committee members discussed the candidate items in small groups and modified the lists as appropriate. We then selected measurable activities from the master list and winnowed, sorted, and grouped them within the most appropriate domain. Through this process, the domains were combined, eliminated, or renamed according to group consensus.
We subsequently undertook an iterative approach to developing the survey instrument questions based on the advisory committee’s work, taking into account perceived relevance to the social mission, presumed data availability, and discriminatory potential. We considered reporting burden and removed questions we thought would present barriers to participation (due to the time it would take to access the data). We also paid attention to phrasing questions in a way that would be appropriate to all 3 health professions (dental, medical, and nursing schools; see Table 1 for the final list of domains and examples of activities in each).
Table 1: Social Mission–Enhancing Domains and Examples of Corresponding Activities and Indicators Identified During the Development of the Social Mission Metrics Survey, 2016–2019
Field testing and survey revision
In September 2016, we pilot tested the initial social mission metrics survey instrument using a convenience sample of 6 schools in the 3 included health professions to assess the survey’s validity and reliability and to gauge the reporting burden for respondents. Schools were invited based on their professional connections with the survey team or advisory committee members. The dean or a delegate completed the survey at each school, and there was no monetary incentive to participate. Of the 6 schools that agreed to participate, 5 completed the survey—A.T. Still University Arizona School of Dentistry & Oral Health, East Carolina University School of Dental Medicine, Michigan State University College of Osteopathic Medicine, Nebraska Methodist College, and University of Cincinnati College of Nursing.
We conducted semistructured debriefing interviews in late 2016 with respondents to evaluate the ease of use, consistency of data obtained, and the utility and burden of the exercise. Respondents indicated a significant reporting burden, spending an average of 4.6 hours on the survey (range = 1–12) and consulting an average of 10.4 people (range = 6–18). Nonetheless, reviews were highly positive, with all respondents agreeing that the survey questions were reflective of the social mission and captured the activities that promoted the social mission within their schools. Several respondents also stated that the process of completing the survey led to valuable discussions among their school leaders about actions that could improve or better track the social mission at their institution. See Supplemental Digital Appendix 2 at https://links.lww.com/ACADMED/A851 for the debriefing interview script.
In May and December 2017, we launched 2 larger field tests of our survey, which we had modified based on feedback from the previous phases. We first invited 61 “friendly” schools, where we had contacts known to be receptive to the idea of the social mission, to complete the survey. Of those, 37 (60.7%) agreed to participate, and respondents from 33 (89.2%) of those schools completed the survey with an average response time of 4 weeks. We then chose 99 schools at random from those schools with which we had no contacts and invited them to participate. Thirty (30.3%) agreed to do so, and 25 (83.3%) completed the survey with an average response time of 5.5 weeks. The survey responses came from leadership at each school; the residents of the communities that the schools served were not directly surveyed, although several survey questions addressed community collaborations and the use of feedback from the communities the schools served. Table 2 provides a breakdown of respondents by health profession and field-testing phase.
Table 2: Field Test Participation by Health Profession and Phase in the Development of the Social Mission Metrics Survey, 2017
Using feedback from these 2 field tests, we again refined the survey based on 3 criteria: (1) discriminatory value, (2) data availability, and (3) reporting burden. We removed questions that elicited the same responses from all schools since they did not add discriminatory value to the survey. We also encountered some areas where the generally used vocabulary was unclear or concepts overlapped. In these cases, we discussed the ambiguities, consulted content area experts, and established definitions for the purposes of the survey. We included these definitions throughout and added a glossary at the end of the survey instrument. In the sections that follow, we discuss important examples of these areas of ambiguity.
Governance.
The governing structure of a health professions school contributes to the extent and rigor of its social mission activities. Since schools have many management configurations often relating to affiliated universities or teaching hospitals, it was challenging to construct uniformly understood questions about a school’s governance. We decided to rely on written governing documents—the mission statement and strategic plan—as targets for measurement because all schools have them and research indicates that social mission–related governance documents have a positive impact on institutional structure, ideology, and workforce outcomes related to achieving the social mission.5
Community of commitment.
Many health professions schools are located in or near poor communities. While dental, medical, and nursing schools frequently define their mission as training the national and global health workforce, we were interested in institutional commitments made to specific communities, particularly low-income and disadvantaged communities. We called these communities of commitment, which we defined as “medically or socially underserved communities—health disparity communities—that could be geographic areas, demographic groups, or categories of patients that [schools] explicitly targeted as a focus [of their] work.”
Community health needs assessment.
Tax-exempt hospitals are required to perform a community health needs assessment every 3 years. A health professions school with an affiliated hospital can use this assessment to inform its educational and research programs and align them with the needs of the community. As such, we included several measures to assess the school’s use of formal or informal needs assessments.
Racial and ethnic diversity and inclusion.
The scoring of racial and ethnic identity is a sensitive and evolving area, yet any measurement of the social mission requires diversity-related questions. Recognizing that there is variation in how schools report racial and ethnic data for their students, faculty, and leadership, we adopted the categorization system of the U.S. Department of Education. These categories are imperfect but useful in collecting uniform data across schools and health professions. We emphasized that all race and ethnicity data should be based on self-reported information. We also included indicators of socioeconomic status as diversity measures.
LGBTQ+ diversity and inclusion.
We considered whether to include questions regarding the gender identity, sexual identity, and sexual orientation of students, faculty, and leadership. After pilot testing the survey, we decided to approach this issue from several angles: (1) including self-reported data to measure the diversity and representativeness of the student body, faculty, and leadership; (2) including questions assessing the status of openly LGBTQ+ (lesbian, gay, bisexual, transgender, and queer) individuals on campus as a measure of institutional culture; and (3) asking about curricular content around the care of LGBTQ+ patients as a measure of educational policy.
Health and workforce disparities.
We used the term underrepresented when referring to health professionals and students who identify as members of a racial group that constitutes a smaller proportion of the health workforce than of the general population. We used underserved when referring to populations that systematically have worse health outcomes than other groups, including rural populations.
Nursing schools.
Nursing education is a much larger and more complex enterprise than medical or dental education. There are more than 2,000 nursing schools6 in the United States, graduating nurses at the associate’s, bachelor’s, master’s, and doctoral levels. In comparison, there are 67 dental schools7 and 189 MD- and DO-granting medical schools in the United States,8,9 all producing graduates at the doctoral level. To manage the numbers and keep some balance among the 3 professions, we decided that the design and field testing of our survey would be limited to nursing programs graduating nurses at the bachelor’s (BSN) and master’s (NP) levels.
Data analysis
Our social mission metrics survey would not be complete without the development of a scoring system that quantified schools’ responses. As our instrument development progressed, the final set of usable questions remained large, covering 6 broad domains and 18 activity areas, each of which comprised several survey questions, which we called indicators. See Supplemental Digital Appendix 3 at https://links.lww.com/ACADMED/A851 for the complete list of 18 activity areas. We developed simple scoring rubrics for closely related questions. For example, a school might receive 1 point if it reported having curricular programs focused on the social determinants of health, but it would garner additional points if, on follow-up questions, it indicated that large proportions of its students took these courses, and still more points if these courses were longitudinal and required of all students in the program. Since the range of possible points varied and there was variance in the scores across the schools that participated in our field tests, we standardized the indicator scores.
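To illustrate the kind of tiered rubric and standardization described above, consider the following minimal Python sketch. The indicator, field names, point thresholds, and use of z-score standardization are all hypothetical; our text does not specify the actual scoring parameters or standardization method.

```python
import statistics

def score_sdh_curriculum(response: dict) -> int:
    """Tiered rubric for one hypothetical indicator: curricular programs
    on the social determinants of health (SDH)."""
    points = 0
    if response.get("has_sdh_curriculum"):
        points += 1  # base point for offering SDH curricular content
        if response.get("proportion_of_students", 0.0) >= 0.75:
            points += 1  # more points if a large proportion of students participate
        if response.get("longitudinal_and_required"):
            points += 1  # more points if longitudinal and required of all students
    return points

def standardize(raw_scores: list) -> list:
    """Convert raw indicator scores across schools to z-scores."""
    mean = statistics.mean(raw_scores)
    sd = statistics.stdev(raw_scores)
    return [(score - mean) / sd for score in raw_scores]

# Hypothetical responses from 3 schools:
schools = [
    {"has_sdh_curriculum": True, "proportion_of_students": 0.9, "longitudinal_and_required": True},
    {"has_sdh_curriculum": True, "proportion_of_students": 0.4, "longitudinal_and_required": False},
    {"has_sdh_curriculum": False},
]
raw = [score_sdh_curriculum(s) for s in schools]  # yields [3, 1, 0]
print([round(z, 2) for z in standardize(raw)])
```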
None of the social mission areas measured on our survey were unimportant, but it was unrealistic to treat them all as equally important. Therefore, 2 large issues remained for scoring: how specific indicators should be weighted when computing a score for a given area, and which areas should be given greater or lesser weight when evaluating the overall social mission performance of a school. After much discussion, we decided that the best answers to these questions should come from actual stakeholders in health professions education.
Accordingly, we conducted a social mission metrics priorities survey in April 2018. We sent invitations to complete this online survey to 3 audiences: the advisory committee members, the individuals at the field test institutions who completed the metrics survey, and participants in the fourth Beyond Flexner conference. Beyond Flexner is a national movement devoted to aligning health professions education with health equity. The Beyond Flexner conference provided an opportunity for us to survey social mission–focused health professionals, students, staff, and community members.
The priorities survey had 2 main sections. In the first section, respondents considered specific questions, or indicators, within an area. Using the best/worst method, also referred to informally as the “maxdiff” question format,10 we asked respondents to choose the most important and the least important indicators from a rotating list of 4 indicators for a specific area. When averaged across respondents, data on these choices produced a numeric importance score, or weight, for every indicator in a given activity area.
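Our description above does not dictate a particular aggregation procedure; a common count-based approach to best/worst scoring, sketched below in Python with hypothetical indicators and choice data, scores each indicator as the number of times it was chosen best, minus the number of times it was chosen worst, divided by the number of times it was shown.

```python
from collections import Counter

appearances = Counter()  # how many tasks each indicator appeared in
best_picks = Counter()   # how many times each indicator was chosen best
worst_picks = Counter()  # how many times each indicator was chosen worst

# Each task shows 4 indicators; the respondent marks one best and one worst.
tasks = [
    {"shown": ["A", "B", "C", "D"], "best": "A", "worst": "D"},
    {"shown": ["A", "C", "E", "F"], "best": "C", "worst": "F"},
    {"shown": ["B", "D", "E", "F"], "best": "E", "worst": "D"},
]

for task in tasks:
    appearances.update(task["shown"])
    best_picks[task["best"]] += 1
    worst_picks[task["worst"]] += 1

scores = {
    item: (best_picks[item] - worst_picks[item]) / appearances[item]
    for item in appearances
}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```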
The second part of the priorities survey aimed to determine the relative perceived importance of each of the 18 activity areas. We showed respondents successive pairs of activity areas, randomly generated. For each pair, we asked respondents to say which was more important and to choose whether it was a lot more important or a little more important or if the 2 areas were equally important. When aggregated across all respondents, the ratings generated numeric importance scores, or weights, for each of the 18 activity areas.
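Again for illustration only, one simple way to aggregate such graded paired comparisons (not necessarily the exact method used in our analysis) is to award points to the preferred area in each pair and average over the comparisons in which an area appeared. The Python sketch below uses hypothetical area names, judgments, and point values.

```python
from collections import defaultdict

# Each record: (area shown first, area shown second, respondent's judgment).
comparisons = [
    ("curriculum", "global_health", "first_lot"),          # first a lot more important
    ("curriculum", "community_engagement", "first_little"), # first a little more important
    ("community_engagement", "global_health", "equal"),     # equally important
]

points = defaultdict(float)
counts = defaultdict(int)

for first, second, judgment in comparisons:
    counts[first] += 1
    counts[second] += 1
    if judgment == "first_lot":
        points[first] += 2.0
    elif judgment == "first_little":
        points[first] += 1.0
    elif judgment == "equal":
        points[first] += 0.5
        points[second] += 0.5
    elif judgment == "second_little":
        points[second] += 1.0
    elif judgment == "second_lot":
        points[second] += 2.0

# Average points per comparison in which each area appeared:
weights = {area: points[area] / counts[area] for area in counts}
print(weights)
```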
The priorities survey generated a fairly high rate of response. Of 639 invited respondents, 293 (45.9%) provided usable responses. Of those, 251 (85.7%) reported their profession. Those affiliated with medicine were overrepresented (157, 62.5%) compared with those affiliated with dentistry (11, 4.4%), nursing (42, 16.7%), and other professions (41, 16.3%), including public health, mental health, and physician assistants, among others. A large number of respondents held high-level academic positions, including 109 faculty appointments (43.4%) and 74 academic leadership positions (29.5%); half of respondents had been involved in health professions training for 16 years or more. Respondents were racially diverse (59% non-Hispanic white, 21% African American or black, 9% Hispanic, and 8% Asian). Women comprised 61% of respondents (179), and students were 14% (41).
We standardized the importance scores for the activity areas on a scale of 1 to 5. The scores ranged from a high of 3.43 for “curriculum content focuses on medically underserved communities, health disparities, and interprofessional education” to a low of 2.17 for “allied institution in a low- to middle-income country to train international students and offer global health rotations for U.S. students.” Within each activity area, the indicator scores were likewise distributed from high to low, although in many areas the indicators received similar weights. See Supplemental Digital Appendix 4 at https://links.lww.com/ACADMED/A851 for a sample response to the social mission metrics priorities survey.
In our final data analysis, we scored the responses to each survey question and combined the scores for all the indicators in each area to create an area score. Although they are meant to address different dimensions of the social mission, the area scores showed substantial intercorrelation (Cronbach’s alpha = 0.68). We weighted the results for each individual indicator and for each area using the weights established through the priorities survey described above, and we combined the 18 weighted area scores to create an overall social mission score. The social mission scores (i.e., the weighted sums of the 18 standardized area scores) varied widely across the schools that participated in the field tests and had an approximately normal distribution.
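As a minimal sketch of these final computations, the Python fragment below combines standardized area scores into a weighted overall score and computes Cronbach’s alpha directly from per-area scores. The data, weights, and area names are hypothetical and do not reproduce our field test results.

```python
import statistics

def overall_score(area_scores: dict, weights: dict) -> float:
    """Weighted sum of a school's standardized area scores."""
    return sum(weights[area] * score for area, score in area_scores.items())

def cronbach_alpha(scores_by_area: list) -> float:
    """Cronbach's alpha; scores_by_area[i][j] is school j's score on area i."""
    k = len(scores_by_area)
    item_variances = sum(statistics.variance(area) for area in scores_by_area)
    school_totals = [sum(scores) for scores in zip(*scores_by_area)]
    total_variance = statistics.variance(school_totals)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical standardized scores for 3 areas across 5 schools:
areas = [
    [0.2, 1.1, -0.5, 0.9, -1.7],
    [0.4, 0.8, -0.9, 1.2, -1.5],
    [-0.1, 1.3, -0.2, 0.6, -1.6],
]
print(round(cronbach_alpha(areas), 2))

# Hypothetical area weights (e.g., from a priorities survey) applied to one school:
weights = {"education": 3.4, "community": 3.0, "research": 2.2}
school = {"education": 0.2, "community": 0.4, "research": -0.1}
print(round(overall_score(school, weights), 2))
```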
We confidentially reported to each participating school its relative position, compared with the other participating schools, for each activity area score and for the overall social mission score. No school was given information on the scores of any other individual school, no school rankings will be published or released, and all data are stored on secure servers with only qualified members of the study team allowed access.
Through our field test process, we tested and improved the usability of our social mission metrics survey instrument. See Supplemental Digital Appendix 5 at https://links.lww.com/ACADMED/A851 for the dental school/medical school field test version of the survey and Supplemental Digital Appendix 6 at https://links.lww.com/ACADMED/A851 for the nursing school version. Validation of the survey instrument will be an ongoing process that will necessarily evolve as the results are used in different ways.11 At this stage of our work, we have fairly strong evidence of the content validity of the instrument, given that we gathered input on the survey content from diverse experts and that a large number of stakeholders rated the importance of items in the priorities survey. We have some statistical evidence of coherence and interitem reliability both within and across activities, but the field test dataset is too small for any extensive psychometric analysis. As we gather data from a larger number of health professions schools, it will become possible to conduct confirmatory factor analyses as well as various methods of external or criterion validation. For example, we could compare the social mission scores we calculated with results from schools that have deployed other related metrics and with workforce outcomes that promote health equity. In addition, we will be able to examine the sensitivity of the overall scores to different weight values for the indicator and activity area scores.
Next Steps
Our initial findings suggest the utility of the social mission metrics survey instrument for characterizing health professions schools’ social mission engagement. At this stage, the survey is a tested instrument for gauging a health professions institution’s level of involvement in activities that teach and role model the social mission at both the institutional and the individual level. In addition, a number of schools that participated in the development of the instrument reported that the exercise of completing the survey raised awareness of the social mission. The efforts required to collect information and review programs sparked conversations about and assessments of social mission activities that would not have taken place without the survey. The participating schools saw this as a positive outcome of the exercise, apart from any specific feedback they received from the survey scoring. In this sense, the survey itself is a potential catalyst for social mission advancement.
As institutional self-analysis is a potent tool for change and improvement, our social mission metrics survey was formatted as a self-assessment for dental, medical, and nursing schools. In February 2019, we sent the survey to more than 700 schools across the United States as part of the Social Mission Metrics Initiative, a national self-assessment campaign (https://socialmissionmetrics.gwhwi.org). We disseminated information about this campaign through professional association listservs, conference presentations, health professions student organizations, and social media. Participating schools will receive confidential, structured feedback based on their analyzed and benchmarked results indicating how they compare with the national cohort of participating schools. Despite the time commitment involved and the voluntary status of the self-assessment, preliminary results suggest that approximately one-third of invited schools have completed the survey. As additional schools complete the survey, we will be able to establish progressively better benchmarks for social mission performance, which have many possible uses and benefits.
Our goal over time is to produce a system of metrics that is useful for schools: (1) to track their social mission engagement, (2) to know their level of performance in any given area of the social mission, and (3) to understand how their performance compares with national norms, based on the performance of participating schools, both within and across professions. We aim to encourage schools to complete the survey every 3 to 5 years to track their social mission over time. We also plan to expand the tool to other health professions in partnership with national education organizations. We hope that the social mission metrics survey will prove to be a useful tool for improving the level and quality of social mission engagement at health professions schools, all toward the end goal of improving the awareness, skills, and commitment of health professionals to health equity in our society.
Acknowledgments:
The authors wish to thank the advisory committee and the many pilot and field test participants who helped shape the development of the social mission metrics survey instrument. The authors are especially grateful to the Center for Survey Research at the University of Virginia for help with the data analysis of the survey results. The authors appreciate the Beyond Flexner Alliance for supporting the distribution of the priorities survey at the Beyond Flexner 2018: Community, Diversity, and Equity conference. They would like to thank the Beyond Flexner conference attendees for their participation in the priorities survey. Finally, the authors greatly appreciate those who made contributions to the project, including: Vianca Bedoya, Candice Chen, MD, MPH, Heba Elnaiem, Rory Merritt, MD, Aaron J. Spiegelman, Crystal Xue, and Hexuan Zhang.