Robust studies in health professions education are needed to build the strongest evidence for educational effectiveness, with many groups calling for multi-institutional research. Currently, the Best Evidence Medical and Health Professional Education (BEME) Collaboration, which is supported by the Association for Medical Education in Europe, a worldwide organization with members in 90 countries on five continents, is one group that has conducted several systematic reviews on important topics, including teaching pedagogies, the validity and reliability of different types of assessments, continuing professional development, and interprofessional education (IPE).1 However, many articles identified during the study selection process for these BEME Collaboration reviews had limitations that resulted in their ultimate exclusion from the reviews. Such limitations included poor study or evaluation design, lack of generalizability, limited representativeness (e.g., inclusion of only a single institution), and the testing of only a single educational approach.
Given the rapid advancement of information technology systems, infrastructures now exist to advance educational research approaches in ways that could overcome many of these limitations. One such approach is to create a network of the networks conducting outcomes research to inform educational and credentialing practices in the health professions across the United States. Such an approach would allow for the pooling of data across many health professions education programs and regions of the country. Early examples of such efforts exist, but much more work is needed. In this Invited Commentary, we provide examples of existing networks, identify their progress and the challenges they have encountered, and propose a vigorous way forward toward understanding the effectiveness of health professions education.
Examples of Successful Networks
The American Medical Association’s Accelerating Change in Medical Education Initiative
This learning collaborative was formed in September 2013 and initially included 11 medical schools. In January 2016, it was expanded to 32 schools. The 11 original schools each received $1,000,000 to support their efforts to innovate over a five-year period, and each of the 21 schools in the second cohort received $75,000 over a three-year period, ending in September 2018. The consortium is designed to support and drive collaboration across medical schools on the development of common projects that would advance innovations in medical education. These projects have included developing flexible, competency-based educational pathways; teaching and evaluating new content in health systems science; working with health care delivery systems in novel ways to educate students; fostering the development of master adaptive learners; creating individualized coaching models to guide learners through their educational processes; and leveraging technology to support learning and assessment.
Importantly, this effort also provides a platform for a national evaluation plan leading to evidence-based best practices for all medical schools.2 Current data sources included in this evaluation plan are demographic data from the American Medical College Application Service and the American Association of Colleges of Osteopathic Medicine Application Service, scores on objective structured clinical examinations that are shared across institutions, and performance scores on a health systems science formative subject examination being developed in collaboration with the National Board of Medical Examiners and the American Medical Association.
Challenges to obtaining and collecting evaluation data largely have been logistical. Coordinating with multiple institutions requires dedicated staff and a validated process to ensure accuracy. Additionally, consensus is required regarding the specific data that can be transferred across schools. For example, the level of demographic granularity that is acceptable at one school with a large population may be too revealing at another school with a smaller or more diverse student body. Collecting performance and assessment data poses different challenges, particularly in the nascent stages of collaborative learning. For instance, schools have different assessment processes and curricular timelines in place. The differences in the strengths, weaknesses, needs, and specific focus areas of the participating schools add another level of complexity. As such, there is the added logistical challenge of each school finding the right time in its curriculum to administer a particular assessment tool, and even an assessment tool chosen through careful review and consensus may not be a good fit for another school at that point in time.
The National Center for Interprofessional Practice and Education
Another network is the National Center for Interprofessional Practice and Education, a public–private initiative that includes the University of Minnesota, the Health Resources and Services Administration (HRSA), and four foundations. In 2012, HRSA charged the National Center with providing leadership, scholarship, evidence and research coordination, and national visibility to advance IPE and collaborative practice. Despite IPE's 50-year history, relatively few rigorous studies have linked IPE to changes in collaborative behavior or demonstrated its effectiveness in improving health and systems outcomes. One reason cited for this lack of evidence is the misalignment between health professions education and the health care delivery system.3 To address these issues, the National Center plans to create a deeply integrated learning system designed to transform education and care together to ultimately create healthier communities.
An important component of the National Center is the Nexus Innovations Network, which has active partners in 33 states working on 107 implementation science and other IPE research projects. In addition to the local IPE implementation and research projects being conducted, the National Center administers annual surveys and gathers deidentified patient data from programs in practice and community settings that are aligned with individual organizations and networks of universities and colleges, including academic health centers across the nation. The relevance of this network cannot be overstated. It includes data from schools of dentistry, medicine, nursing, pharmacy, and social work, as well as their practice partners.
In 2017, the National Center team analyzed initial data to create a national standardized core dataset. Using the Institute of Medicine framework for measuring the impact of IPE,3 the National Center mapped the Interprofessional Education Collaborative competencies to levels of learning using a standardized IPE measurement tool. Additionally, early data that the National Center collected are contributing to an education investment index, with the goal of informing leadership decisions regarding investment and organizational structure to support effective IPE implementation. The National Center is also accelerating IPE design and implementation through the creation of a Nexus Learning System, with the goal of expanding the number of network members and the big data collection needed to further inform the field. At the same time, the National Center’s community-generated, open access resource center (nexusipe.org) is used globally and is growing at the rate of 20,000 users quarterly.
As with the Accelerating Change in Medical Education initiative, the National Center's financial model has varied; currently, the National Center assumes responsibility for funding its national research platform. As interest in IPE increases, including through new IPE accreditation requirements, most network members likely will continue to be driven by advancing the field, essentially volunteering their efforts and paying for National Center services when needed, or requesting letters of support to seek funding elsewhere to undertake their work. To be financially sustainable and continue its work, the National Center seeks partnerships and grant funding while offering a portfolio of fee-based consulting and training services in IPE implementation and research.
The Accreditation Council for Graduate Medical Education’s Accreditation System
In 2009, the Accreditation Council for Graduate Medical Education (ACGME), which accredits sponsoring institutions and residency and fellowship programs in the United States, began restructuring its accreditation system to further advance educational outcomes in the six domains of clinical competency (i.e., patient care and procedural skills; medical knowledge; practice-based learning and improvement; systems-based practice; professionalism; and interpersonal and communication skills).4 This evolution in accreditation also placed a greater emphasis on continuous quality improvement as a component of the accreditation system. To facilitate this change, the ACGME revised and modified its Accreditation Data System, which includes the case log system as well as resident/fellow and faculty surveys. Milestones, developed by each specialty specifically for its discipline, are another addition to the accreditation system. In 2013, programs began reporting every six months on the Milestones their residents had reached, as part of the continuous quality improvement aspect of the Accreditation Data System. While all programs are required to report on Milestones, there is no requirement regarding the Milestones level that residents and fellows must achieve; this judgment is left to the programs to help all learners grow and improve. The developmental Milestones, which are competency-based educational outcomes, provide rich information on learner performance that can be tracked longitudinally in all ACGME-accredited programs.
The ACGME provides aggregated national-level Milestones data to the specialty review committees to assist them in evaluating and developing policies and procedures for their specialty moving forward. The first Milestones annual report was published in 2016.5 This report was vital in that it revealed that there was variation within each specialty in Milestones attainment across residents and programs. This work represents a turning point in educational outcomes data collection, beyond the tracking of medical knowledge data. The availability of national-level data in other competencies is beginning to provide a mechanism for identifying, investigating, and understanding variation that can and should be improved, while also driving cultural changes that promote sharing comparative data across specialties.
What Is Next?
These three examples represent networks that have succeeded in engaging multiple institutions and programs across disciplines, specialties, faculties, and learners. These efforts involved tasks that were not easy to accomplish and that required a level of commitment on the part of the respective partners. When such activities involve regulation, a commitment to the work can understandably feel burdensome and difficult. However, if systems continue to improve the data they collect (both the content and the collection process), then the whole system will continue to benefit.
What if similar approaches began to multiply across the country, creating a network of networks? Such a model would allow schools to pool data for research purposes, overcoming current limitations. Additionally, this model would provide national organizations across the continuum of education with an opportunity to collaborate and explore real-world, previously unanswerable questions facing health and education systems. Furthermore, it would offer an opportunity to engage faculty in large-scale projects or substudies that would provide a robust pathway toward promotion, further development of educational research expertise, and a heightened commitment to implement and study educational innovations. Creating datasets that can be shared and expanded would foster the development of different paths forward. If each network created a fellowship program for educational researchers in the health professions, it would amass an even larger body of passionate educators and educational researchers than currently exists. As was the case with the ACGME Accreditation Data System, we will likely make unexpected discoveries while pursuing this work. Still, it is time to undertake such activities at a much larger scale to advance studies on educational and credentialing effectiveness quickly. Pooling and sharing data in meaningful ways, opening dialogues of discovery, and creating linkable big datasets will allow us to advance many fields simultaneously.
How could such a model be supported financially? Many possibilities exist, even in the three examples we described. Local, institutional support is one way, as membership in a network of networks would facilitate advancement both locally and nationally. Another possible funder is HRSA, a federal organization that would benefit from the collection and sharing of health professions training data. Stakeholder organizations, such as the American Medical Association, the Association of American Medical Colleges, and the Josiah Macy Jr. Foundation, could provide startup support and then use membership fees to sustain that support in the poststartup period. Continued collaborative research and scholarship efforts would be significant incentives for network members.
These three examples, as well as other smaller efforts that are percolating, indicate to us that the time is ripe to undertake a vigorous, data-driven effort to encourage health professions schools to join a network and then to link those networks together in meaningful ways to create robust educational research repositories or learning “collaboratories” to share data more effectively and efficiently. Health systems are using this same approach with electronic health records to advance and improve health care and population health. The studies that result from such partnerships could help us understand how best to develop collaborative, systems-ready health practitioners and master adaptive learners, improve the return on investment of interprofessional practice and education, and accelerate Milestones development and achievement in trainees.
1. The Best Evidence Medical and Health Professional Education Collaboration. https://www.bemecollaboration.org/Home/. Accessed February 12, 2018.
2. Skochelak SE, Stack SJ. Creating the medical schools of the future. Acad Med. 2017;92:1619.
3. Institute of Medicine. Measuring the Impact of Interprofessional Education on Collaborative Practice and Patient Outcomes. Washington, DC: National Academies Press; 2015. http://www.nationalacademies.org/hmd/Reports/2015/Impact-of-IPE.aspx. Accessed February 12, 2018.
4. Accreditation Council for Continuing Medical Education. Accreditation Council for Graduate Medical Education Next Accreditation System. http://www.accme.org/tags/acgme-next-accreditation-system. Accessed February 12, 2018.
5. Hamstra SJ, Edgar L, Yamazaki K, Holmboe ES. Milestones Annual Report 2016. Chicago, IL: Accreditation Council for Graduate Medical Education; 2016. http://www.acgme.org/Portals/0/PDFs/Milestones/MilestonesAnnualReport2016.pdf?ver=2016-10-21-092055-947. Accessed February 12, 2018.