Scholarly Conversations in Medical Education

O’Brien, Bridget C., PhD; May, Win, MBBS, PhD; Horsley, Tanya, PhD

doi: 10.1097/ACM.0000000000001378
Commentary

This supplement includes the eight research papers accepted by the 2016 Research in Medical Education Program Planning Committee. In this Commentary, the authors use “conversations in medical education” as a guiding metaphor to explore what these papers contribute to the current scholarly discourse in medical education. They organize their discussion around two domains: the topic of study and the methodological approach. The authors map the eight research papers to six “hot topics” in medical education: (1) curriculum reform, (2) duty hours restriction, (3) learner well-being, (4) innovations in teaching and assessment, (5) self-regulated learning, and (6) learning environment, and to three purposes commonly served by medical education research: (1) description, (2) justification, and (3) clarification. They discuss the range of methods employed in the papers. The authors end by encouraging educators to engage in these ongoing scholarly conversations.

B.C. O’Brien is associate professor, Department of Medicine and Educational Researcher, Center for Faculty Educators, University of California, San Francisco, San Francisco, California.

W. May is professor, Department of Medical Education, Keck School of Medicine of the University of Southern California, Los Angeles, California.

T. Horsley is associate director, Research Unit, Royal College of Physicians and Surgeons of Canada and School of Epidemiology, Public Health and Preventive Medicine, University of Ottawa, Ottawa, Ontario, Canada.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Bridget C. O’Brien, UCSF Office of Medical Education, Box 0710, 533 Parnassus Ave., Suite U-80, San Francisco, CA 94143; e-mail: bridget.obrien@ucsf.edu.

“The most fruitful and natural exercise for our minds is, in my opinion, conversation.”

—Michel de Montaigne, The Essays: A Selection

Research in medical education focuses on “researchable problems” related to teaching and learning.1 The “problems” investigators pursue depend heavily on the context in which medical education is situated (e.g., social and political issues affecting health care and education). In 2012, Dauphinee and Anderson2 reviewed the 50-year history of the Association of American Medical Colleges Research in Medical Education (RIME) program, noting that research presented at the conference “reflects the state of our community and our society at any point in time,” particularly in areas such as admissions, assessment, relationships, clinical reasoning, professional development, and learning environments.3,4 In these and other areas, there continue to be vibrant scholarly conversations captured through published research and presentations.5 In this Commentary, we highlight the scholarly conversations taken up among the 2016 RIME papers and encourage readers to see each paper as a catalyst for thought-provoking conversations with colleagues, peers, or even the authors themselves at the RIME conference.

In this RIME Commentary we aim to go beyond a general summary of the eight accepted papers by exploring what these papers contribute to the current scholarly discourse in medical education. Using “conversations in medical education” as our guiding frame, we reviewed each paper as a voice embedded in broader conversations. We organized our discussion around two domains: the topic of study and the methodological approach.

Hot Topics of Conversation

In February 2016, Uijtdehaage6 presented 10 “hot topics” in medical education based on a review of the literature and broad stakeholder input across medical education. The topics included (1) curriculum reform, (2) impact of duty hours restrictions, (3) well-being of trainees and physicians, (4) innovations in teaching and assessment, (5) self-regulated learning, (6) transitions to accountable care, (7) interprofessional education, (8) student admissions, (9) faculty development, and (10) simulation. We added learning environment, which has become an important area of educational research,7–10 particularly given its integration into the current Liaison Committee on Medical Education and Accreditation Council for Graduate Medical Education requirements. Using the edited list of hot topics as a guide, we mapped the 2016 RIME papers to six topics (see Table 1) and describe here how the research contributes to ongoing scholarly conversations.

Topic 1: Curriculum reform

Because time-based approaches to medical education are no longer considered sufficient for ensuring a competent workforce, competency-based medical education (CBME) has become one of the guiding principles globally for curriculum reform and assessment.11 However, transformational curricular change is not easily realized.12 Challenges experienced by educational programs trying to implement CBME call attention to important scholarly questions and have catalyzed evaluation and research to inform future efforts.

Three of the 2016 RIME papers contribute to ongoing conversations related to CBME. Bierer and Dannefer13 explore how the shift toward CBME affects learners’ motivation and study strategies. They suggest that a CBME approach requires learners to take greater personal ownership and responsibility for learning than do more traditional time-based, test-driven curricula. Drawing on three components of self-determination theory (competence, relatedness, autonomy), this paper identifies important features of the curriculum and the learning environment that facilitated students’ adoption of appropriate study and performance-monitoring strategies. The paper invites educators to consider the relevance of these strategies and supporting resources to learners and learning environments in their own institutions. It also contributes to conversations started by several others who have used self-determination theory to guide curricular reform, and further demonstrates the value of this theory.14–16

Apramian and colleagues17 speak to issues related to performance standards in CBME. Several authors, including Apramian and colleagues, have shown wide variation in assessors’ judgments of a learner’s competence.18–20 The 2016 RIME paper by Apramian and colleagues adds to ongoing discussions about how to change the culture of assessment such that variation among assessors and their conceptualization of competence can be acknowledged and accounted for in assessment practices and processes.21

On the topic of designing curricula that can accommodate flexible learning trajectories, Burk-Rafel and colleagues22 consider new ways of developing curricular experiences that fall outside the core curriculum. These experiences also need to be competency based, but they can allow learners to customize the curriculum according to personal learning interests.23,24 In this paper, the authors address a gap in the literature by describing a process for developing scholarly concentrations that combines a national review of similar curricula with a locally based, learner-centered approach. The authors used an algorithm, applied to students’ stated preferences, to determine the optimal number of scholarly concentrations to offer. This paper invites conversations about innovative ways to perform the needs assessment portion of the Kern framework for curriculum development.25

Topic 2: Impact of duty hours restrictions

Following the Institute of Medicine reports in 1999 (“To Err Is Human”)26 and 2008 (“Resident Duty Hours”)27 that highlighted the need for safer health care workplaces, the United States,28 Canada,29 and many other countries30 have mandated reduced work hours for residents. These restrictions have sparked crucial conversations in medical education, particularly in procedurally oriented specialties such as surgery. Surgeons have voiced great concern about negative consequences of duty hours restrictions on surgical residents’ training, development of professionalism, and preparedness for independent practice,31 but the evidence base is inconclusive. The Royal College of Physicians and Surgeons of Canada published a consensus report on resident duty hours32 citing evidence that rigid restriction of resident duty hours can lead to suboptimal outcomes in both care delivery and education. Additionally, studies suggest that duty hours restrictions are associated with improvements in surgical residents’ quality of life and reductions in fatigue.31 A recent national cluster-randomized trial of duty hours flexibility in surgical training, reported in the New England Journal of Medicine, found that flexible, less-restrictive duty hours policies for surgical residents, as compared with standard duty hours policies, were associated with noninferior patient outcomes and no significant difference in residents’ satisfaction with overall well-being and education quality.33

Residents must regularly decide how to navigate competing priorities: protecting the safety of patients, maximizing their learning by seeing the whole trajectory of a patient’s course, and abiding by the rules of their training program.34,35 Coverdill and colleagues36 give voice to surgical residents’ perspectives and decisions about when to “break the rules” (i.e., stay beyond the end of their shift) and when to comply with duty hours restrictions. The research sheds light on the complexity of negotiating these decisions and connects readers to conversations related to professionalism from social, cultural, and political perspectives—as a characteristic of the systems, organizations, and environments in which individuals are situated.

Topic 3: Learner well-being

Much of the literature on learner well-being focuses on particular psychological states (namely, stress, anxiety, and depression) or on characteristics such as resilience.37–39 A few authors have called attention to contextual factors (social, cultural, organizational, etc.) that influence well-being,40 but this topic warrants further attention. Two of the 2016 RIME papers join these conversations. Whitgob and colleagues41 explore mistreatment of residents by patients and families. Building on prior work that described the prevalence of mistreatment among residents, this research furthers the conversation by examining how residents and faculty respond to such mistreatment. This work turns the conversation about mistreatment toward strategies for supporting learner well-being. By so doing, it bridges two important, but often separate, discourses.

The National Steering Committee on Resident Duty Hours reviewed the impact of resident duty hours and reported that traditional call models presented risks to the physical, mental, and occupational health of residents.29 Coverdill and colleagues36 also contribute to conversations about learner well-being, particularly in the context of duty hours restrictions. Interestingly, resident well-being is a key part of the argument for duty hours restrictions and the “new professionalism” explored in this paper, yet there is another side of well-being, related to learning and cultural expectations regarding “traditional professionalism,” that works against the intent of duty hours restrictions. The findings by Coverdill and colleagues invite us to join a conversation about the meaning of well-being for residents in the context of these tensions. This is an important conversation in light of Bilimoria and colleagues’33 article comparing residency programs with standard duty hours policies to programs with flexible, less-restrictive policies, which found no significant differences in residents’ satisfaction with overall well-being.

Topic 4: Innovations in teaching and assessment

Educators are often innovators, continuously striving to improve their craft and enhance learning through new pedagogical techniques, technologies, assessment processes, and approaches to feedback. Ideally, research in medical education both informs and evaluates such innovations.42 The paper by McConnell and colleagues43 provides an example of how research can inform future innovations. Using an experimental design with random assignment, the authors suggest that positive and negative mood states hindered novice students’ knowledge acquisition and application of knowledge to novel problems. These mood states may disrupt working memory and add to students’ cognitive load, which may explain their reduced performance compared with students not in an activated mood state. These findings may help learners and educators factor mood state into their pedagogical strategies, particularly by adjusting information-processing demands.

Park and colleagues44 take up important questions for conversations about scoring performance-based assessments. Their paper examines methods used to calculate composite scores from subcomponent scores to balance the psychometric properties of the score with clinical relevance. In this study, the authors examined different approaches to weighting subcomponent scores in a locally derived clinical skills exam designed to mimic the United States Medical Licensing Examination (USMLE) Clinical Skills (CS) exam. While their findings provide new, important knowledge about how individual schools could approach scoring their own locally derived clinical skills exams, they also have broader implications. The concept of balancing psychometric properties with clinical and curricular relevance could be applied to other settings where it is necessary to combine and interpret scores from multiple sources of assessment, such as Clinical Competency Committee decisions about entrustment or resident progress on Milestones.

Topic 5: Self-regulated learning

Self-regulated learning, described as “a multidimensional process incorporating a set of inter-related and contextualized thoughts, actions, and feelings that a person strategically uses to reach personal goals,”45 is widely recognized as an important contributor to the development and maintenance of competence. The “once in, good for life” model has been successfully challenged by research demonstrating that physicians benefit from well-structured educational programs and approaches to their learning.45–50 Bierer and Dannefer13 posit that the transition to CBME will require restructuring of learning environments in order to motivate trainees to take personal ownership of their learning. Their longitudinal, qualitative study centers on how first-year students adapt to a curriculum purposely designed to focus on reflection on performance and competence (rather than on high-stakes examinations), using self-determination theory as a framework to guide interpretation of findings. Although students varied greatly in how they adapted to the learning environment, first-year students were able to recognize and adopt an assessment-for-learning mind-set. These findings contribute to the emerging discourse of CBME, which seeks to transition from a system based primarily on high-stakes examination to one that personalizes progression through trackable milestones, entrustable professional activities, and multiple formative biopsies of performance. Such systems, aligned with self-regulated learning, seek to ensure that physicians have the skills they need at every stage of their careers to provide the best possible patient care.

Topic 6: Learning/educational environment

The learning or educational environment goes beyond the physical surroundings within which learning takes place and encompasses broader and less tangible notions of educational “climate,” “culture,” or “ethos.” The American Medical Association defines the learning environment as

a social system that includes the learner (including the external relationships and other factors affecting the learner), the individuals with whom the learner interacts, the settings and purposes of the interaction, and the formal and informal rules/policies/norms governing the interaction.51

The learning environment can be measured in many ways, including perceptions and observations.52,53 Several of the 2016 RIME papers speak to one or more aspects of the learning environment—some explicitly, others implicitly.

The paper by Whitgob and colleagues41 highlights the importance of recognizing patients and families as part of the learning environment, particularly when they contribute to a negative learning environment through discrimination and mistreatment of trainees. By offering some possible strategies for addressing this mistreatment, the authors open the door for much-needed conversations about how educators and trainees can navigate these challenging situations and ultimately improve the learning environment.

Bierer and Dannefer13 examined the learning environment from the perspective of assessment. They studied how the shift from a culture of assessment based on high-stakes testing to a culture that is competency based and promotes reflection on performance influences students’ strategies and motivation for learning. With guidance and support, all students eventually adopted study strategies that aligned with their learning needs and with a competency-based approach to assessment. By identifying specific components of the learning environment that motivate students to “take personal ownership for learning” and guide them toward learning strategies that support self-regulated learning, the authors contribute important contextual information to current conversations about implementation and impact of CBME.12,54

Building on a series of previous studies,55 Gruppen and Stansfield56 examine the educational environment of medical schools from a systems perspective, looking specifically at the contribution of individual and institutional components to variance in ratings of the educational environment. Individual-level variables (i.e., students’ demographic and personal characteristics such as age, ethnicity, empathy, patient centeredness, and tolerance for ambiguity) outweighed institutional-level variables (i.e., acceptance rate, student enrollment, and median Medical College Admission Test [MCAT] score of matriculated students) fourfold in their contribution to perceived quality of the educational environment. While these findings offer several focused points for conversation around the complex dynamics of educational environments, the authors also note areas that warrant further discussion, such as group-level variables within schools and objective measures of the educational environment.

Apramian and colleagues17 studied the learning environment at the level of learner–supervisor interactions. Building on prior work that examined the role of procedural variation in surgical education,57 this study explores the implications of procedural variation and “thresholds of principle and preference” for supervisors’ assessments of residents’ competence. The study raises intriguing questions about cultures of assessment and the tension between reasonable variation in supervisors’ judgments and variation that creates an unfavorable learning environment by causing undue stress and uncertainty. This conversation goes beyond surgical education and can draw a larger audience of faculty in graduate medical education who must make entrustment decisions and may benefit from considering how their own decisions and those of their colleagues affect the learning environment in their setting and the overall residency program.58

Coverdill and colleagues36 looked at the learning environment from the perspective of culture. Studying residents from 13 different general surgery programs allowed the authors to identify cultural and organizational influences that affect surgical residents’ decisions to stay or leave at the end of a day shift. Cultural norms against handing off work to an on-call resident or night team were pervasive, suggesting that learning environments in which “care can be passed off confidently and consistently across shifts” are far from the norm.36 As mentioned earlier, this study adds to the growing discussion about the impact of duty hours restrictions on patient care, patient safety, resident wellness, and resident education59 by delving into the prevalence of duty hours violations and the reasoning and cultural norms underpinning the violations. This study can bring multiple readers to the table—those interested in learning environments as well as those interested in learner well-being, professionalism, and ethics.

Summary: Hot topics

The research papers chosen for publication in this year’s RIME supplement contribute to a variety of conversations related to six hot topics (Table 1). Several of the papers build on prior work presented at the RIME conference—some by the same authors (e.g., Gruppen55; Apramian57; Park60) and others by different authors35,61—which nicely illustrates the opportunity to keep the scholarly dialogue alive and advance our understanding of and potential solutions to challenging but researchable problems in medical education. The 2016 RIME papers approach these topics from a variety of standpoints, most commonly from the perspective of individual learners,13,22,36,43,44 though also from the standpoint of faculty,17,41 and through analyses that examine institutional effects.57 There may be opportunities for future work to incorporate patient and family perspectives.

Organizing our review of the RIME papers around Hot Topics calls attention to the five topics that were not addressed—namely, Transition to Accountable Care (workforce issues, career choices, patient and family engagement in care); Interprofessional (IP) Education (IP learning, collaboration, practice, teaching and assessing IP communication, teamwork); Student Admissions (new MCAT, screening for noncognitive attributes); Faculty Development (supporting educational scholarship; identity and community; mentoring models; leadership); and Simulation (using simulation for assessment of clinical competence, teamwork; relationship between level of fidelity and learning outcomes; transfer of learning in simulation to practice). There are many possible reasons why these topics do not appear among the selected RIME papers, including the fact that these papers are a limited sample of a much larger population of research. Nonetheless, reflecting on the absence of these topics in 2016 may prompt another turn in the conversation, particularly if these are topics that members of the RIME community would like to see taken up. For example, what are the important problems or questions related to IP education, simulation, or student admissions? Are there opportunities for members of the RIME community to collaborate or support one another’s scholarship in these areas?

Purpose and Methodological Approaches

As we considered the scholarly conversations about the methodological approaches used in the 2016 RIME papers, we found it helpful to take a step back and reflect on the purposes served by these approaches. In 2008, Cook and colleagues42 proposed three categories for classifying the purposes of research in medical education: description (What was done?), justification (Did it work?), and clarification (Why or how did it work?). They found that studies often fell into more than one category and, when this occurred, classified the study at the “highest level.” Various study designs and methodological approaches can be used for these purposes, and the 2016 RIME papers provide excellent examples in each of these categories. The papers also prompt discussion about the types of approaches needed going forward to advance our knowledge and educational practices.

Description

Many of the 2016 RIME papers could be classified as both description and clarification, particularly if they present a new conceptual model that helps clarify a complex phenomenon. The paper by Burk-Rafel and colleagues22 exemplifies the valuable contributions made by studies that are primarily descriptive. The authors used a novel process for developing a learner-centered set of scholarly concentrations in their medical school. They began by collecting information about scholarly concentrations offered at 32 medical schools, then thematically analyzed the titles to identify 10 content domains and surveyed students at their institution about their preferences for each of these domains. Using the survey results, the authors created and applied a capacity optimization algorithm to decide the best configuration of domains to meet student preferences. By providing a clear description of their approach to program development, the authors offer a model that others can learn from and replicate.
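
For readers curious about what such a capacity optimization might look like, the sketch below is purely illustrative: neither the algorithm nor the data used by Burk-Rafel and colleagues are reproduced in this Commentary, so the preference rankings, capacity limits, and selection strategy shown here are hypothetical assumptions rather than the authors’ method.

```python
# Illustrative sketch only: one simple way to choose a capacity-constrained set of
# scholarly concentrations that satisfies the most students' stated preferences.
# All data and parameters below are hypothetical.
from itertools import combinations

def best_offering(preferences, domains, k, capacity):
    """Choose k domains (each with a fixed capacity) that satisfy the most students.

    preferences: dict mapping student -> list of domains, most to least preferred.
    domains: list of all candidate domains.
    k: number of concentrations to offer.
    capacity: maximum number of students per concentration.
    """
    best_subset, best_matched = None, -1
    for subset in combinations(domains, k):
        slots = {d: capacity for d in subset}
        matched = 0
        for ranked in preferences.values():
            # Assign each student to their highest-ranked offered domain with space left.
            for d in ranked:
                if d in slots and slots[d] > 0:
                    slots[d] -= 1
                    matched += 1
                    break
        if matched > best_matched:
            best_subset, best_matched = subset, matched
    return best_subset, best_matched

# Hypothetical survey data: 5 students ranking 4 candidate domains.
prefs = {
    "s1": ["global health", "medical education", "bioethics"],
    "s2": ["health policy", "global health"],
    "s3": ["medical education", "health policy"],
    "s4": ["global health", "bioethics"],
    "s5": ["bioethics", "global health"],
}
domains = ["global health", "health policy", "medical education", "bioethics"]
print(best_offering(prefs, domains, k=2, capacity=3))
```

A real application would likely rely on a more formal optimization (e.g., integer programming), but the underlying idea of matching students’ stated preferences to a capacity-constrained set of offerings is the same.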

Justification through experimental design

There have been calls for more high-quality studies using experimental designs with randomized controls in medical education,62,63 but these calls have been met with concerns about the difficulties of randomizing in educational settings, the frequent lack of an underlying conceptual or theoretical framework, and the privileging of certain types of evidence over others.64,65 The absence of such frameworks limits investigators’ ability to identify the active components of educational interventions. Further, some scientists have criticized randomized trials for not contributing to the development of theory and for providing evidence that may not generalize across different contexts and types of learners.66

This year, McConnell and colleagues43 conducted an experimental study using random assignment to examine the impact of mood on learners’ application of basic science principles to novel problems. In contrast to the concerns mentioned above, this study is firmly grounded in cognitive theories regarding mood and cognitive load. We considered this a justification study in the sense that it tests a hypothesis about the relationship between mood and learning basic science principles. Although mood is not an educational intervention per se, it can be altered and, as demonstrated in this study, can impair or enhance learning. Reading this paper may inspire researchers to consider opportunities for theoretically grounded randomized controlled experiments that justify (or refute) particular educational approaches or interventions.

Clarification through mixed methods

Mixed-method approaches have a great deal to offer researchers in medical education, particularly when “studying new questions and initiatives or complex initiatives in natural, as opposed to experimental, settings.”67 As such, they are well poised to bring clarity to complex phenomena. There are several models and typologies used to describe mixed-methods research,67–69 one of which is the explanatory model, commonly used to delve more deeply (using qualitative methods) into questions identified from quantitative data. Coverdill and colleagues36 nicely illustrate this model in their mixed-methods study. The authors began with a survey and then conducted follow-up interviews with survey participants who volunteered to be interviewed. This approach allowed them to provide descriptive information about delayed departures and general reasons for the delays across 13 residency programs (thus enhancing generalizability by “transcending local cultures”), then to delve more deeply into the reasoning and motivation underpinning the delays. Through this explanatory mixed-methods design, the authors identified several motivations for delaying end-of-shift departure that had largely been overlooked in other studies and would not have been captured through survey methods alone.

Clarification through qualitative approaches

One of the most important considerations in qualitative approaches is how the methods align with the research purpose and questions. The three papers in the RIME supplement that use qualitative methods make compelling cases for the approach and methods used. In the spirit of a grounded theory approach, Apramian and colleagues17 pursue questions that arose from earlier phases of their analysis and thus move their work from description to clarification. Their study involves reanalysis of qualitative interview data and additional interviews to refine their developing explanatory model about the role of procedural variation (thresholds of principle and preference) in judgments of resident competence. This work demonstrates how a large qualitative dataset can be used to support a program of research. Whitgob and colleagues41 also used a constructivist grounded theory approach, making use of scenarios to facilitate discussion around the phenomenon of interest (learner mistreatment by patients and families) during interviews. Ginsburg and colleagues70,71 used a similar approach to study students’ and faculty members’ reasoning and responses to professionally challenging situations. The scenario-based, guided interview approach helps reduce the complexity of the phenomenon of interest by addressing difficult topics in a way that removes some of the emotion and personal investment associated with a personal example. Also, using a standard set of scenarios makes it easier to compare responses among participants and to capture thought processes in real time. This method can enhance the clarity of complex issues such as trainee mistreatment by patients and families and thus offers a methodological approach that may be helpful for other studies addressing similarly psychosocially, emotionally, and/or culturally complex phenomena.

Clarification through multi-institutional studies/large datasets

An ongoing critique of medical education research is the limited generalizability of findings due to reliance on single-institution studies with small sample sizes.72 Notwithstanding the impediments and challenges related to funding for medical education research,73 multi-institutional collaborative projects have the potential to advance the field in important ways. Two RIME papers demonstrate the value of multi-institutional studies. Gruppen and Stansfield56 used data from the Learning Environment Study, which included 28 medical schools in the United States and Canada. This allowed multilevel modeling of factors that may have a bearing on the learning environment at the undergraduate medical education level. Coverdill and colleagues36 conducted their study at the graduate medical education level, using 13 general surgery programs in 10 regionally diverse states. These papers could generate conversations about the feasibility of conducting multi-institutional collaborative projects and about how RIME might serve as a venue for establishing such collaborations.
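
For readers less familiar with multilevel modeling, the sketch below shows, in generic form, how a random-intercept model can partition variance in educational environment ratings between the student and school levels. The variable names, simulated data, and model specification are hypothetical illustrations, not the Learning Environment Study data or Gruppen and Stansfield’s actual analysis.

```python
# Generic sketch of a multilevel (random-intercept) model that partitions variance in
# educational environment ratings between individuals and institutions.
# All variables and data below are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_per_school = 28, 50
school = np.repeat(np.arange(n_schools), n_per_school)
empathy = rng.normal(size=school.size)                         # individual-level predictor
school_effect = rng.normal(scale=0.3, size=n_schools)[school]  # institution-level variation
rating = 3.5 + 0.4 * empathy + school_effect + rng.normal(scale=1.0, size=school.size)

df = pd.DataFrame({"rating": rating, "empathy": empathy, "school": school})

# Random intercept for school; fixed effect for the individual-level predictor.
model = smf.mixedlm("rating ~ empathy", data=df, groups=df["school"])
result = model.fit()

# Intraclass correlation: share of remaining variance attributable to schools.
school_var = result.cov_re.iloc[0, 0]
icc = school_var / (school_var + result.scale)
print(result.summary())
print(f"Proportion of variance at the school level: {icc:.2f}")
```

The intraclass correlation computed at the end expresses how much of the variation in ratings sits at the institutional rather than the individual level, which is the kind of comparison the authors report.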

Clarification through psychometric studies

Psychometric studies advance the understanding and excellence of assessment, typically by studying measurement and measurement instruments.1 These studies can add clarity to processes that many might consider a mystery. The paper by Park and colleagues44 nicely illustrates the valuable contributions that psychometric studies make to our discussions and decisions about scoring performance-based exams. The authors examined the validity of different weighting strategies when combining the scores from standardized patient (SP) encounters and the postencounter notes to create the Integrated Clinical Encounter (ICE) score in a local examination similar to the USMLE CS examination. The scoring of the different components of the ICE is something of a black box, as exact details are not provided to medical schools. In this study, SP–history (SP-Hx) scores, SP–physical exam scores, and postencounter note scores were combined using different weighting approaches; depending on the approach, the composite score reliability of the ICE scores increased by up to 0.20. This study offers a thought-provoking analysis that invites conversations about whether to include SP-Hx scores to increase composite score reliability and whether this could provide more useful feedback to students.
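
To make the link between weighting and reliability concrete, the sketch below applies Mosier’s classic formula for the reliability of a weighted composite to hypothetical subcomponent reliabilities, standard deviations, and intercorrelations. This is not the scoring model or data used by Park and colleagues; it simply illustrates how shifting weight among subcomponents can raise or lower composite reliability.

```python
# Illustrative sketch: estimating the reliability of a weighted composite from component
# reliabilities, standard deviations, and intercorrelations (Mosier's formula).
# All numbers are hypothetical and do not reflect the ICE subcomponents' actual properties.
import numpy as np

def composite_reliability(weights, sds, reliabilities, corr):
    """Mosier's reliability for a weighted composite of correlated components."""
    w, s, r = np.asarray(weights), np.asarray(sds), np.asarray(reliabilities)
    cov = np.outer(s, s) * np.asarray(corr)    # observed covariance matrix
    composite_var = w @ cov @ w                # variance of the weighted composite
    error_var = np.sum(w**2 * s**2 * (1 - r))  # error variance, assuming uncorrelated errors
    return 1 - error_var / composite_var

# Hypothetical subcomponents: SP-history, SP-physical exam, postencounter note.
sds = [1.0, 1.0, 1.0]
rels = [0.75, 0.35, 0.60]
corr = [[1.0, 0.5, 0.4],
        [0.5, 1.0, 0.3],
        [0.4, 0.3, 1.0]]

for weights in ([1/3, 1/3, 1/3], [0.1, 0.8, 0.1], [0.5, 0.1, 0.4]):
    print(weights, round(composite_reliability(weights, sds, rels, corr), 2))
```

Running the final loop shows that the same subcomponent scores can yield quite different composite reliabilities depending on how the weights are set, which is the kind of psychometric trade-off the authors weigh against clinical and curricular relevance.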

Summary of purpose and research methods

The 2016 RIME papers offer perspectives and opportunities for further dialogue about the purposes served by this research and the methods selected to support these purposes. We categorized most of the papers as clarification studies and noted the wide range of methodological approaches used to advance our understanding of various phenomena (Table 1). While some authors have raised concerns about the limited scope and quality of research methodologies in medical education,66,74,75 this set of RIME papers includes a broad range of approaches and perspectives. This diversity can foster generative and sophisticated conversations and suggests the health of medical education research as a field.

In general, the papers provide clear descriptions of their methods and appropriate justification for their methodological choices. This transparency is strongly encouraged going forward as it enriches our conversations regarding methodology and facilitates future implementation or replication of similar methodologies.

As research in medical education continues to accumulate, we anticipate increased outputs in the form of evidence syntheses. Although no evidence syntheses are included in RIME this year, important conversations are emerging in the literature, particularly regarding the “place” of evidence syntheses within medical education and how to select the most appropriate approaches for particular research questions.76–78

Discussion

Dauphinee and Anderson2 encouraged us to see the RIME conference as “an arena for learning, for creativity, for scholarly interaction.” Along similar lines, Lingard and Driessen5 describe research writing as a “social and rhetorical act,” which begins with authors joining a conversation. To this end, the RIME supplement can be seen as a means of promoting scholarly conversation, with each paper serving as “the next turn in a conversation.”79 We hope readers will join these conversations in person at the RIME conference as well as through subsequent research.

Writing the RIME Commentary is a unique privilege that allows us to reflect on the rich and diverse conversations of researchers from varying perspectives, paradigms, and jurisdictions. Such conversations can be especially useful for newcomers to the medical education research community and to educators from countries where there is a need to make faculty more aware of educational research.80 The supplement contains some of the field’s best scholarship, and we hope that all educators and researchers will find themselves inspired to see a problem in a new way, ask a new question, consider an alternative theoretical or methodological approach, design a novel intervention, or collaborate with a colleague from a different institution.

Acknowledgments: Many thanks to the members of the Research in Medical Education Planning Committee who reviewed this Commentary and provided helpful feedback and editing.

References

1. Ringsted C, Hodges B, Scherpbier A. “The research compass”: An introduction to research in medical education: AMEE guide no. 56. Med Teach. 2011;33:695–709.
2. Dauphinee WD, Anderson MB. Maturation (and déjà vu) comes to the Research in Medical Education program at age 51. Acad Med. 2012;87:1307–1309.
3. West DC, Robins L, Gruppen LD. Workforce, learners, competencies, and the learning environment: Research in Medical Education 2014 and the way forward. Acad Med. 2014;89:1432–1435.
4. Miller KH, Miller BM, Karani R. Considering research outcomes as essential tools for medical education decision making. Acad Med. 2015;90(11 suppl):S1–S4.
5. Lingard L, Driessen E. How to tell compelling scientific stories: Tips for artful use of the research manuscript and presentation genres. In: Cleland J, Durning SJ, eds. Researching Medical Education. West Sussex, UK: Association for the Study of Medical Education/John Wiley & Sons, Ltd; 2015:259–268.
6. Uijtdehaage S. What are the hot topics in medical education research? Presented at: Energizing Your Educational Scholarship workshop; February 9, 2016; San Antonio, TX.
7. Genn JM. AMEE medical education guide no. 23 (part 1): Curriculum, environment, climate, quality and change in medical education—a unifying perspective. Med Teach. 2001;23:337–344.
8. Colbert-Getz JM, Kim S, Goode VH, Shochet RB, Wright SM. Assessing medical students’ and residents’ perceptions of the learning environment: Exploring validity evidence for the interpretation of scores from existing tools. Acad Med. 2014;89:1687–1693.
9. Weiss KB, Wagner R, Bagian JP, Newton RC, Patow CA, Nasca TJ. Advances in the ACGME Clinical Learning Environment Review (CLER) program. J Grad Med Educ. 2013;5:718–721.
10. Schönrock-Adema J, Bouwkamp-Timmer T, van Hell EA, Cohen-Schotanus J. Key elements in assessing the educational environment: Where is the theory? Adv Health Sci Educ Theory Pract. 2012;17:727–742.
11. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
12. Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency-based medical education: Are we addressing the concerns and challenges? Med Educ. 2015;49:1086–1102.
13. Bierer SB, Dannefer EF. The learning environment counts: Longitudinal qualitative analysis of study strategies adopted by first-year medical students in a competency-based educational program. Acad Med. 2016;91(11 suppl):S44–S52.
14. Kusurkar RA, Croiset G, Mann KV, Custers E, Ten Cate O. Have motivation theories guided the development and reform of medical education curricula? A review of the literature. Acad Med. 2012;87:735–743.
15. Ten Cate TJ, Kusurkar RA, Williams GC. How self-determination theory can assist our understanding of the teaching and learning processes in medical education. AMEE guide no. 59. Med Teach. 2011;33:961–973.
16. Orsini C, Binnie VI, Wilson SL. Determinants and outcomes of motivation in health professions education: A systematic review based on self-determination theory. J Educ Eval Health Prof. 2016;13:19.
17. Apramian T, Cristancho S, Watling C, Ott M, Lingard L. “Staying in the game”: How procedural variation shapes competence judgments in surgical education. Acad Med. 2016;91(11 suppl):S37–S43.
18. Gingerich A, van der Vleuten CP, Eva KW, Regehr G. More consensus than idiosyncrasy: Categorizing social judgments to examine variability in Mini-CEX ratings. Acad Med. 2014;89:1510–1519.
19. Naumann FL, Marshall S, Shulruf B, Jones PD. Exploring examiner judgement of professional competence in rater based assessment [published online January 21, 2016]. Adv Health Sci Educ Theory Pract. doi: 10.1007/s10459-016-9665-x.
20. Kogan JR, Conforti LN, Iobst WF, Holmboe ES. Reconceptualizing variable rater assessments as both an educational and clinical care problem. Acad Med. 2014;89:721–727.
21. Govaerts MJ. Competence in assessment: Beyond cognition. Med Educ. 2016;50:502–504.
22. Burk-Rafel J, Mullan PB, Wagenschutz H, Pulst-Korenberg A, Skye E, Davis MM. Scholarly concentration program development: A generalizable, data-driven approach. Acad Med. 2016;91(11 suppl):S16–S23.
23. Cooke M, Irby DM, O’Brien BC. Educating Physicians. San Francisco, CA: Jossey-Bass; 2010.
24. Boninger M. Foreword: Scholarly concentrations in the medical student curriculum. Acad Med. 2010;85:403–404.
25. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: Johns Hopkins University Press; 2009.
26. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, Institute of Medicine; 1999.
27. Institute of Medicine. Resident Duty Hours: Enhancing Sleep, Supervision, and Safety. Washington, DC: National Academies Press; 2008.
28. Accreditation Council for Graduate Medical Education. Duty hours. http://www.acgme.org/What-We-Do/Accreditation/Duty-Hours. Accessed July 11, 2016.
29. National Steering Committee on Resident Duty Hours. Fatigue, Risk and Excellence: Towards a Pan-Canadian Consensus on Resident Duty Hours. Ottawa, Ontario, Canada: Royal College of Physicians and Surgeons of Canada; 2013.
30. Temple J. Resident duty hours around the globe: Where are we now? BMC Med Educ. 2014;14(suppl 1):S8.
31. Harris JD, Staheli G, LeClere L, Andersone D, McCormick F. What effects have resident work-hour changes had on education, quality of life, and safety? A systematic review. Clin Orthop Relat Res. 2015;473:1600–1608.
32. Imrie K, Frank JR, Ahmed N, Gorman L, Harris KA. A new era for resident duty hours in surgery calls for greater emphasis on resident wellness. Can J Surg. 2013;56:295–296.
33. Bilimoria KY, Chung JW, Hedges LV, et al. National cluster-randomized trial of duty-hour flexibility in surgical training. N Engl J Med. 2016;374:713–727.
34. Byrne JM, Loo LK, Giang DW. Duty hour reporting: Conflicting values in professionalism. J Grad Med Educ. 2015;7:395–400.
35. Taylor TS, Nisker J, Lingard L. To stay or not to stay? A grounded theory study of residents’ postcall behaviors and their rationalizations for those behaviors. Acad Med. 2013;88:1529–1533.
36. Coverdill JE, Alseidi A, Borgstrom DC, et al. Professionalism in the twilight zone: A multicenter, mixed-methods study of shift transition dynamics in surgical residencies. Acad Med. 2016;91(11 suppl):S31–S36.
37. Dyrbye LN, Thomas MR, Huntington JL, et al. Personal life events and medical student burnout: A multicenter study. Acad Med. 2006;81:374–384.
38. Howe A, Smajdor A, Stöckl A. Towards an understanding of resilience and its relevance to medical training. Med Educ. 2012;46:349–356.
39. Mavor KI, McNeill KG, Anderson K, Kerr A, O’Reilly E, Platow MJ. Beyond prevalence to process: The role of self and identity in medical student well-being. Med Educ. 2014;48:351–360.
40. Dobkin PL, Balass S. Multiple influences contribute to medical students’ well-being and identity formation. Med Educ. 2014;48:340–342.
41. Whitgob EE, Blankenburg RL, Bogetz AL. The discriminatory patient and family: Strategies to address discrimination towards trainees. Acad Med. 2016;91(11 suppl):S64–S69.
42. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: A framework for classifying the purposes of research in medical education. Med Educ. 2008;42:128–133.
43. McConnell MM, Monteiro S, Pottruff MM, et al. The impact of emotion on learners’ application of basic science principles to novel problems. Acad Med. 2016;91(11 suppl):S58–S63.
44. Park YS, Lineberry M, Hyderi A, Bordage G, Xing K, Yudkowsky R. Differential weighting for subcomponent measures of integrated clinical encounter scores based on the USMLE Step 2 CS examination: Effects on composite score reliability and pass–fail decisions. Acad Med. 2016;91(11 suppl):S24–S30.
45. Cleary TJ, Durning SJ, Gruppen LD, Hemmer PA, Artino AR. Self-regulated learning in medical education. In: Walsh K, ed. Oxford Textbook of Medical Education. Oxford, UK: Oxford University Press; 2013:465–477.
46. Brydges R, Butler D. A reflective analysis of medical education research on self-regulation in learning and practice. Med Educ. 2012;46:71–79.
47. Klass D. A performance-based conception of competence is changing the regulation of physicians’ professional behavior. Acad Med. 2007;82:529–535.
48. Goulet F, Hudon E, Gagnon R, Gauvin E, Lemire F, Arsenault I. Effects of continuing professional development on clinical performance: Results of a study involving family practitioners in Quebec. Can Fam Physician. 2013;59:518–525.
49. Wenghofer EF, Marlow B, Campbell C, et al. The relationship between physician participation in continuing professional development programs and physician in-practice peer assessments. Acad Med. 2014;89:920–927.
50. Wenghofer EF, Campbell C, Marlow B, Kam SM, Carter L, McCauley W. The effect of continuing professional development on public complaints: A case–control study. Med Educ. 2015;49:264–275.
51. American Medical Association, Initiative to Transform Medical Education. Strategies for Transforming the Medical Education Learning Environment. Phase 3: Program Implementation. Final Report of the December 2008 Working Conference. Chicago, IL: American Medical Association; 2007.
52. Skochelak SE, Stansfield RB, Dunham L, et al. Medical student perceptions of the learning environment at the end of the first year: A 28-medical school collaborative. Acad Med. 2016;91:1257–1262.
53. Roff S, McAleer S, Harden RM, et al. Development and validation of the Dundee Ready Education Environment Measure (DREEM). Med Teach. 1997;19:295–299.
54. Whitcomb ME. Transforming medical education: Is competency-based medical education the right approach? Acad Med. 2016;91:618–620.
55. Gruppen LD, Stansfield RB, Zhao Z, Sen S. Institution and specialty contribute to resident satisfaction with their learning environment and workload. Acad Med. 2015;90(11 suppl):S77–S82.
56. Gruppen LD, Stansfield RB. Individual and institutional components of the medical school educational environment. Acad Med. 2016;91(11 suppl):S53–S57.
57. Apramian T, Cristancho S, Watling C, Ott M, Lingard L. Thresholds of principle and preference: Exploring procedural variation in postgraduate surgical education. Acad Med. 2015;90(11 suppl):S70–S76.
58. Saxon K, Juneja N. Establishing entrustment of residents and autonomy. Acad Emerg Med. 2013;20:947–949.
59. Bolster L, Rourke L. The effect of restricting residents’ duty hours on patient safety, resident well-being, and resident education: An updated systematic review. J Grad Med Educ. 2015;7:349–363.
60. Yudkowsky R, Park YS, Hyderi A, Bordage G. Characteristics and implications of diagnostic justification scores based on the new patient note format of the USMLE Step 2 CS exam. Acad Med. 2015;90(11 suppl):S56–S62.
61. Sebok SS, Syer MD. Seeing things differently or seeing different things? Exploring raters’ associations of noncognitive attributes. Acad Med. 2015;90(11 suppl):S50–S55.
62. Todres M, Stephenson A, Jones R. Medical education research remains the poor relation. BMJ. 2007;335:333–335.
63. Torgerson CJ. Educational research and randomised trials. Med Educ. 2002;36:1002–1003.
64. Dornan T, Peile E, Spencer J. On “evidence.” Med Educ. 2008;42:232–234.
65. Norman G. RCT = results confounded and trivial: The perils of grand educational experiments. Med Educ. 2003;37:582–584.
66. Eva KW. Broadening the debate about quality in medical education research. Med Educ. 2009;43:294–296.
67. Schifferdecker KE, Reed VA. Using mixed methods research in medical education: Basic guidelines for researchers. Med Educ. 2009;43:637–644.
68. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. 2nd ed. Thousand Oaks, CA: Sage Publications, Inc.; 2011.
69. Leech NL, Onwuegbuzie AJ. A typology of mixed methods research designs. Qual Quant. 2009;43:265–275.
70. Ginsburg S, Regehr G, Lingard L. The disavowed curriculum: Understanding students’ reasoning in professionally challenging situations. J Gen Intern Med. 2003;18:1015–1022.
71. Ginsburg S, Lingard L, Regehr G, Underwood K. Know when to rock the boat: How faculty rationalize students’ behaviors. J Gen Intern Med. 2008;23:942–947.
72. Shea JA, Arnold L, Mann KV. A RIME perspective on the quality and relevance of current and future medical education research. Acad Med. 2004;79:931–938.
73. Gruppen LD, Durning SJ. Needles and haystacks: Finding funding for medical education research. Acad Med. 2016;91:480–484.
74. Rotgans JI. The themes, institutions, and people of medical education research 1988–2010: Content analysis of abstracts from six journals. Adv Health Sci Educ Theory Pract. 2012;17:515–527.
75. Albanese M. Life is tough for curriculum researchers. Med Educ. 2009;43:199–201.
76. Ellaway RH. Challenges of synthesizing medical education research. BMC Med. 2014;12:193.
77. Tricco AC, Soobiah C, Antony J, et al. A scoping review identifies multiple emerging knowledge synthesis methods, but few studies operationalize the method. J Clin Epidemiol. 2016;73:19–28.
78. Kastner M, Tricco AC, Soobiah C, et al. What is the most appropriate knowledge synthesis method to conduct a review? Protocol for a scoping review. BMC Med Res Methodol. 2012;12:114.
79. Lingard L. Joining a conversation: The problem/gap/hook heuristic. Perspect Med Educ. 2015;4:252–253.
80. Damodar KS, Lingaraj J, Kumar LR, Chacko TV. A qualitative analysis of an interactive online discussion by health professions educators on education research. Educ Health (Abingdon). 2012;25:141–147.