Academic Medicine: April 2011 - Volume 86 - Issue 4
doi: 10.1097/ACM.0b013e31820cb2a7
Faculty Development

Faculty Development in Assessment: The Missing Link in Competency-Based Medical Education

Holmboe, Eric S. MD; Ward, Denham S. MD, PhD; Reznick, Richard K. MD; Katsufrakis, Peter J. MD, MBA; Leslie, Karen M. MD; Patel, Vimla L. PhD; Ray, Donna D. MD; Nelson, Elizabeth A. MD

Author Information

Dr. Holmboe is chief medical officer and senior vice president, American Board of Internal Medicine, Philadelphia, Pennsylvania.

Dr. Ward is chair, Department of Anesthesiology, and associate dean for faculty development–medical education, University of Rochester Medical Center, Rochester, New York.

Dr. Reznick is R.S. McLaughlin Professor and Chair, Department of Surgery, University of Toronto, Toronto, Ontario, Canada.

Dr. Katsufrakis is vice president, National Board of Medical Examiners, Philadelphia, Pennsylvania.

Dr. Leslie is director, Centre for Faculty Development, University of Toronto, Toronto, Ontario, Canada.

Dr. Patel is director, Center for Cognitive Informatics and Decision Making, University of Texas Health Science Center, Houston, Texas.

Dr. Ray is director, Faculty Development, Office of Continuing Medical Education and Faculty Development, University of South Carolina School of Medicine, Columbia, South Carolina.

Dr. Nelson is senior associate dean for medical education, Office of Undergraduate Medical Education, Baylor College of Medicine, Houston, Texas.

Please see the end of this article for information about the authors.

Correspondence should be addressed to Dr. Holmboe, American Board of Internal Medicine, 510 Walnut Street, Suite 1700, Philadelphia, PA 19106; telephone: (215) 446-3606; e-mail: eholmboe@abim.org.

First published online February 21, 2011

Abstract

As the medical education community celebrates the 100th anniversary of the seminal Flexner Report, medical education is once again experiencing significant pressure to transform. Multiple reports from many of medicine's specialties and external stakeholders highlight the inadequacies of current training models to prepare a physician workforce to meet the needs of an increasingly diverse and aging population. This transformation, driven by competency-based medical education (CBME) principles that emphasize the outcomes of training, will require more effective evaluation and feedback by faculty.

Substantial evidence suggests, however, that current faculty are insufficiently prepared for this task across both the traditional competencies of medical knowledge, clinical skills, and professionalism and the newer competencies of evidence-based practice, quality improvement, interdisciplinary teamwork, and systems-based practice. The implication of these observations is that the medical education enterprise urgently needs an international initiative of faculty development around CBME and assessment. In this article, the authors outline the current challenges and provide suggestions on where faculty development efforts should be focused and how such an initiative might be accomplished. The public, patients, and trainees need the medical education enterprise to improve training and outcomes now.

Just over 100 years ago, Abraham Flexner's1 seminal report, Medical Education in the United States and Canada, sparked widespread reform, and now medical education is once again experiencing significant pressure to transform. Multiple reports from many of medicine's specialty groups and external stakeholders highlight the inadequacies of current training models to prepare a physician workforce to meet the needs of an increasingly diverse and aging population across the globe.2–9 Educators and regulatory bodies are responding to these calls for transformation by focusing on competency-based medical education (CBME), an amalgam of educational theories and approaches that emphasize the outcomes of training.10–12 CBME was recently defined by a group of international collaborators as

an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program using an organizing framework of competencies. In CBME, the unit of progression is mastery of specific knowledge, skills, and attitudes and is learner-centered.13

One of the first competency-based frameworks to be introduced was CanMEDS in the mid-1990s.14 The Accreditation Council for Graduate Medical Education followed with the development and introduction of the general competencies framework for residency and fellowship in 2001.15 More recently, the Association of American Medical Colleges has strengthened its emphasis on competencies and outcomes for medical students,16 and the United States Medical Licensing Examination will increasingly emphasize physician competencies.17 Other countries, looking to improve the quality of training and potentially reduce costs, are also working to implement CBME.18

Although there is widespread agreement about the need for competencies that go beyond more traditional competencies, such as clinical skills and knowledge, some have expressed skepticism about the ability of training programs to perform the comprehensive assessments required by a CBME approach reliably and validly.19,20 For example, limited assessment methods and tools currently exist for teamwork and care coordination, key subcompetencies of systems-based practice. CBME, because it is driven by complex situational and context-dependent outcomes, requires robust assessment and evaluation processes to determine whether a trainee is truly prepared to enter the next stage of his or her career. As a result, since the inception of CBME, medical educators have been seeking the holy grail of evaluation tools. Methods such as secure examinations, standardized patients, and procedural simulations have contributed substantially to reliable and valid trainee assessment. For example, higher performance on secure examinations is modestly associated with better clinical performance in practice after completing a graduate medical education training program.21 Standardized patients have become an integral part of medical student education and assessment and are increasingly used in residency programs to judge capability in a controlled setting across a multitude of clinical skills.21–23

However, these methods and tools cannot replace the importance of faculty who are enabled to critically observe, question, and judge trainee performance in actual patient care situations.24 Ensuring that a trainee's capability or competence, as measured by exams and standardized patients, translates, or “transfers,” into actual work-based performance with patients and families is an essential faculty responsibility.25 Because of its emphasis on developmental trajectories, CBME requires more frequent, timely, formative, and authentic assessment and less dependence on “proxy,” summative assessments.10

This perspective is supported by evidence from work on the development of expertise and on the perils of isolated self-assessment. For example, exclusively using standardized patients to judge whether a trainee was acquiring competence in clinical skills would not only be expensive but, more important, would not provide the learner with regular and ongoing feedback; direct observation of trainees with timely feedback by faculty is essential. The journey to expertise also requires continuous practice under the critical eyes and ears of faculty, who must accurately assess how trainees are progressing and provide frequent, timely feedback.26,27 Furthermore, a substantial body of literature clearly demonstrates that most physicians cannot determine their own strengths and weaknesses without external data and feedback.28 Effective assessment by faculty is a critical aspect of the equation in the transformation to CBME.

Faculty as Evaluators: Challenges and Opportunities

The fractured learning environment

At present, medical faculty work with trainees primarily in clinical units, referred to by some as microsystems, such as an ambulatory clinic or office-based setting, a hospital ward, a surgical suite, an intensive care unit, or other such sites.29 These clinical units are the context for work-based training and assessment. We are now beginning to understand how professional development and assessment are influenced by the functionality of the clinical units where students, residents, and fellows learn and care for patients.29 Research has identified that effective, successful microsystems are characterized in part by a strong focus on patients, interdependence of staff, staff development, and the generation of performance results. Embedded in these success characteristics is the need for a high level of professionalism, especially among physician leaders. Several recent reports demonstrate that even internal medicine residency clinics that score highly on a systems assessment tool and have electronic medical records still do not use basic quality improvement interventions or provide optimal care.30,31 It is hard to conceive that trainees can effectively acquire competency in clinical care, quality improvement, or systems-based practice if they practice in poorly functioning clinical microsystems.

In the inpatient setting, too many faculty are transients in the very clinical units where they teach and assess. For example, faculty in internal medicine and pediatrics often rotate on inpatient clinical services for just two to four weeks. This rotational structure is deeply ingrained within these specialty training cultures, yet we know little about how rotating through these microsystems affects the faculty's ability to accurately assess the competence of their learners.32 In other fields, such as surgery or anesthesiology, residents often encounter pressure to maximize operational efficiency in the unit. Residents may face multiple operating room schedule changes and ultimately may anesthetize or operate on patients they did not originally evaluate for the procedure. These circumstances may or may not be known to the faculty responsible for assessing the learners. A recent study found that supervising faculty anesthesiologists had significantly different and variable conceptions, compared with residents, about when the residents should be allowed to perform six critical entrustable professional activities independently; acquaintance with the trainee was a key factor that affected this decision.33

This lack of continuity in both patient experience and time with faculty for trainees in the current medical education system makes longitudinal assessment and feedback very difficult. Hirsh and colleagues34 argued for the importance of continuity as an “organizing principle” for medical education, and a recent review on key attributes of effective supervision highlighted the importance of meaningful relationships.35 Compounding the fractured learning environment and lack of continuity is the substantial reluctance on the part of faculty to “feed forward” information to their colleagues about trainees for fear of “biasing” the receiving faculty.36,37 However, the end result is a perpetual cycle of “starting over” with assessment instead of using shared information to support the trainee's development and to create meaningful action plans. These cultural issues around supervision and feedback must be addressed by the educational community.37,38

System factors influence trainee performance, and faculty may need to account, and sometimes “adjust,” for these factors. Such adjustment might lead to rating errors, such as the halo effect and leniency error, because the faculty may feel the trainee was disadvantaged by a dysfunctional microsystem, especially if the microsystem “parasitizes” trainees, assigning them menial or undesirable tasks, often at the expense of educational experiences. Conversely, faculty may blame a trainee for an error when in fact the primary cause was a system problem. Teasing out the factors that lead to adverse events, for example, can be difficult unless systematic methods, such as root cause analysis, are used.39 Few faculty are trained in these methods.

Learning to work in interdisciplinary teams and understanding how the systems of the clinical unit function are also vital to the quality of patient care, teaching, and assessment. Unless interdisciplinary team care is the norm of a practice setting, it is hard to imagine how spending only two to four weeks supervising trainees is sufficient time for faculty to assess how well the trainees are interacting with the other essential health care providers on the unit. Working in interdisciplinary teams also calls for a more complex, contextually rich conception of professionalism. Hafferty and Levinson40,41 have explicitly called for the incorporation of complex adaptive system thinking when teaching and evaluating professionalism. To do this, faculty must understand the science of systems and how to work effectively in interdisciplinary teams, and they must move away from traditional views to a more relational view of autonomy. Relational autonomy recognizes that human agents are interconnected and interdependent, meaning that autonomy is socially constructed and must be granted by others.42,43

Furthermore, adhering to a systems approach assumes the faculty themselves have a good understanding of how the clinical unit functions and have the skills necessary to effectively assess the system and the essential roles of other health care providers on a team. Combining faculty who have insufficient system understanding with dysfunctional clinical units can only exacerbate the problem of flawed assessment and contribute to the potentially deleterious effects of the hidden curriculum.42 Future faculty development will need to incorporate training about how system factors affect the quality of both teaching and patient care, and it will need to prepare faculty to assess their trainees' competence in systems-based practice.

For several reasons, the outpatient setting holds potentially more promise than inpatient settings for longitudinal assessment and feedback for most specialties.44 First, many trainees in specialties such as internal medicine, family medicine, and pediatrics work with a stable group of faculty preceptors who can observe these trainees over time.24 Second, because trainees often have their own panel of patients, assessment methods such as a medical record audit can be combined with reflection guided by faculty.45 Finally, as so much of medicine has moved into the outpatient setting, it follows logically that more training and assessment should occur there as well.

Traditional assessment roles

For the foreseeable future, two traditional faculty roles in assessment will continue to be essential: (1) questioning to probe knowledge and clinical reasoning and (2) direct observation to judge the clinical skills of medical interviewing, physical examinations, counseling, and other communication skills as well as procedural skills. Questions are crucial for helping trainees to learn the core skill of clinical reasoning. Unfortunately, faculty often fail to explore the logic and rationale behind trainee decisions.46 Faculty need to develop the skills to ask questions that emphasize the reasoning process and incorporate key findings and lessons from a growing body of evidence from research on cognition.46,47 Practical approaches exist to help faculty acquire these skills.46,48 These questioning skills apply equally well to the evaluation of procedural skills.

Although faculty need to be critical and accurate observers of trainee performance, the limited published research available demonstrates that faculty frequently fail to identify deficiencies in trainees' clinical skills.24,49–51 Ironically, despite the central role of faculty in teaching and assessment, only one study to date has demonstrated any efficacy of faculty development in improving the quality of faculty ratings of trainees based on direct observation.52 Part of the reason for this state of affairs is medical education's overemphasis on finding the “perfect” evaluation tool instead of focusing on the more important issue—the faculty who use the tool.53,54 To be sure, faculty should only use tools that have been evaluated for basic psychometric and quality properties, and a recent systematic review identified a small group of observation tools that meet minimal quality criteria for use.55 However, given that the redesign of evaluation forms explains only up to 10% of the variance in ratings,56 medical educators must now shift their attention to developing more effective methods to train faculty in observation and assessment.

In addition, we must help faculty and programs move away from rating scales based on just numbers, as CBME will require a greater reliance on descriptive or “qualitative” assessment.57 Early work suggests that using qualitative research methods to judge medical student portfolios can be as reliable as using quantitative methods.58 Faculty need to recognize that numeric ratings are nothing more than a process to synthesize and then represent a composite judgment about a trainee. Ultimately, evaluation tools are only as good as the individuals using them; perhaps it is time for the medical education profession to require all faculty involved in training students and residents to learn a core set of competencies in assessment, and for all training programs to provide ongoing professional development in assessment.59

Along those lines, recent work by Albanese and colleagues60 provides a useful framework for how the educational community and institutions might structure faculty development activities using an integrated systems model (ISM). They lay out 14 implications of the ISM for continuing medical education. With minor adjustments, some of these can be applied equally well to faculty development, for example:

* Changes in assessment and supervision that are also mission critical for the institution and help to build system “reserve” are more likely to be implemented.

* The further a faculty member moves along the stages of change, the higher the likelihood of adoption; such movement can also produce individuals who are more likely to become champions for the change.

* Enlisting the assistance of respected educational faculty to help implement the change promotes broader and more rapid uptake by other faculty.

* Helping faculty mentally picture how the change in the educational program will affect and improve their own educational practices will also assist in the adoption of new knowledge and skills.

These and other factors provided in the article can serve as a useful guide to educators planning faculty development activities.61

Assessment by faculty must be grounded in the principles of CBME

CBME requires assessment be criterion based and developmental. Defining the criteria in developmental terms, commonly called milestones or benchmarks, allows faculty and program directors to determine whether the trainee is on an appropriate “trajectory.”62 Evolving toward such a developmental, criterion-based standard will require training to help faculty acquire shared mental models and understanding of what competence should look like at various developmental stages. Milestones, in effect, can become the blueprint for curriculum and assessment.62

Multiple studies highlight that one of our biggest and most refractory problems in assessment is the lack of agreement among faculty about what constitutes satisfactory performance across competencies regardless of the competency framework.20,54 This lack of agreement among faculty is a major threat to the reliability and validity of decisions about trainee competence.54,56 In addition, it places an unfair burden on trainees to make sense of the disparate ratings and feedback they receive from faculty. Too often, the assessment process can feel to the trainee like playing the lottery—“Who will I get today and what will they say?” Because effective assessment is not an innate skill but, rather, requires training and practice, programs must provide ongoing feedback to faculty regarding their evaluation skills. Ideally, this feedback would provide comparisons with the skills of their peers within the program, and ultimately it would also provide comparisons with national benchmarks.63 Programs also must develop longitudinal assessment systems to counter the pernicious effects of the current fractured learning environment highlighted previously. Ultimately, faculty must become less fearful of providing meaningful performance data—including strengths and developmental needs—about the trainee during educational handoffs.36,37 This is especially important in our current rotational model of training—without “forward feeding” of information, trainees may end up in a perpetual cycle of superficial, nonspecific assessment and feedback.

The good news is that a number of organizations are aggressively supporting a national effort to define milestones across all the disciplines in medicine, and likewise a consortium of organizations has defined core competencies in geriatrics for medical students and residents.7 The next crucial step will be to implement and apply the milestones in training programs, a process that will require a substantial effort in faculty development using techniques such as performance dimension training and frame-of-reference training.54,64 These approaches have been shown in other fields to improve the quality of performance appraisals.65 More important, frame-of-reference training has been successfully used as part of an internal medicine student clerkship system for many years at the Uniformed Services University of the Health Sciences and now nationally.66,67

Assessment requires competent faculty

Clinical competence of faculty is a crucial component of effective assessment, yet this issue has received little attention to date. Programs operate on the assumption that faculty possess sufficient, if not high, levels of knowledge, skills, and attitudes in the competencies they are responsible for teaching and assessing. We have known for some time that substantial numbers of students and residents graduate with significant deficiencies in clinical skills,24 so it is perhaps not surprising that those who later become faculty may possess important deficiencies of their own. A growing body of literature supports this concern. For example, a study of cardiac auscultation skills found that faculty were no more skilled than third-year medical students.68 Another study highlighted substantial deficiencies in informed decision-making skills among family medicine physicians, internists, and surgeons,69 and a recent study found that practicing physicians provided only marginally better care to older patients in a number of areas than did residency clinics.30

The implication of these findings is that CBME-focused faculty development will need to incorporate clinical skills training with training in assessment. In addition to improving the clinical skills of faculty, faculty development will also need to incorporate training in the “new” competencies crucial to 21st-century practice: evidence-based practice using point-of-care clinical decision support and information; health information technology; teamwork; care coordination; systems functionalities; advocacy; and context-aware professionalism, to name a few. The majority of faculty working today never received formal training in any of these competencies.29 In effect, there are a number of new competencies that faculty will need to learn as their trainees learn them, necessitating more collaborative models of faculty training. The Residency Review Committee for internal medicine recently added a requirement for core faculty to be the “expert competency evaluators … to assist in developing and implementing the evaluation system.”70

This is not to say that a single faculty member need be an expert in all competencies; rather, trainees should be taught and evaluated by those individuals who truly possess the highest level of knowledge and skill in the domain of interest, and those individuals may not be physicians. Furthermore, some individuals may be excellent judges of competence, yet they may not necessarily be experts in the field. One excellent example in medicine is standardized patients, who can be trained to judge performance effectively in key clinical skills.22

Faculty as coach and mentor in assessment

Ultimately, the majority of trainees will graduate from their programs and enter unsupervised practice. From that point forward, trainees can no longer rely on structured approaches to assessment from others; they will need to develop their own systems of self-directed assessment to continue their professional development and, at a minimum, remain competent. Faculty must prepare trainees for this important inevitability. Portfolios are a potentially powerful tool for engaging trainees in their own assessment.71 Building a portfolio is an active process that requires contributions from the trainee, and self-assessments like medical record audits can be performed directly by the trainee.45 Lack of engagement by individual trainees in their own assessment will substantially undermine a widespread transformation to CBME and, more important, will leave trainees inadequately prepared for a practice environment that increasingly seeks to measure physician performance continuously. One clear implication is the need for trainees to fully understand the value and impact of the assessment methods and tools being used by their training programs.

Next Steps: Preparing Faculty for the CBME Era

There is a growing consensus that the rate-limiting step in the evolution to CBME is faculty development.72 As we have highlighted in this article, faculty will need substantial help in improving their core competencies and in acquiring new ones, in both teaching and assessment. Most learning still occurs through the care of actual patients in a variety of clinical settings, and although we will need to increasingly embrace simulation and other assessment technologies in the future, faculty will remain central to the education process. If we are to transform medical education for the good of the public, faculty must also fully embrace their role as evaluators. The role of faculty as expert “coaches” must encompass teaching, assessment, and feedback.

Significant challenges and barriers to this evolution do exist. First, the available time for faculty to learn and practice new skills has been shrinking as pressures for productivity in clinical care and/or research have grown substantially. This is frustrating not only for faculty but also increasingly for policy makers who believe that taxpayers are not getting a meaningful return on their more than $15 billion investment in graduate medical education.73,74 Furthermore, ethical standards of our profession would direct us to ensure that our students possess sufficient knowledge, skills, and attitudes for successful matriculation into residency. The bottom line is that institutions must provide the resources necessary to ensure at least a competent educational workforce. It is no longer acceptable to perform education as a “one-off” activity that is inadequately supported.72,73

We have yet to develop the most effective faculty development models. The good news from a recent systematic review is that the faculty who participate in educational training activities report (1) high levels of satisfaction, (2) positive changes in their attitudes, (3) increased understanding of educational principles and teaching skills, (4) changes in behavior as noted by their students, and (5) greater involvement in teaching.75 This study also noted that success factors for faculty development include incorporation of feedback in the training, active learning, effective relationships with peers and colleagues, and use of diverse teaching approaches. However, few studies have investigated whether faculty training translates into actual behavior changes among trainees. In addition, most faculty development is designed as a one-time “bolus” activity and less often as a longitudinally designed program.

The current shortcomings of faculty development in both undergraduate and graduate medical education will not be addressed by single-institution programs and one-time workshops. A national faculty development effort in assessment and CBME, using new models of longitudinal, experiential training, is needed. In Table 1 we provide a summary of what we believe are the critical next steps.

Table 1

There is a need now to create regional centers to develop a national cadre of trainers, a sort of “SWAT team,” who can provide longitudinal training and on-site coaching. These centers could function using existing resources, such as simulation labs at medical schools, to address the key items listed in Table 1, and they could create networks of expertise that extend well beyond an individual school's or program's boundaries. Financial resources could come from the redirection of a portion of current federal graduate medical education dollars, from the Health Resources and Services Administration, and from the pooling of local institutional resources.73 Regional centers would allow economies of scale to be realized, with the added benefit that faculty from multiple programs would interact to create a shared understanding of the competencies and milestones, reducing the unwarranted variation in assessment currently seen across the country.

We should not wait for research to find the perfect faculty development models before embarking on this initiative. Instead, we must build in ongoing research and learning as part of the process, using new methodological strategies to evaluate the effectiveness of faculty development as part of a continuous quality improvement process.76 We know enough about general principles and educational theory to build and implement faculty development in assessment, move CBME forward, and improve training for the benefit of the public. The public, patients, and our trainees need the medical education enterprise to make this transition now.

Funding/Support:

This work was supported by a writing conference funded by the Medallion Fund and the Josiah Macy, Jr. Foundation. The conference was entitled “A 2020 Vision of Faculty Development Across the Medical Education Continuum” and was held at Baylor College of Medicine on February 26–28, 2010.

Other disclosures:

Dr. Holmboe co-leads a faculty development course in assessment conducted at the American Board of Internal Medicine; he receives no additional compensation for the course. He receives royalties from Mosby-Elsevier for a textbook on assessment. Finally, he received an honorarium from Baylor College of Medicine for a presentation at a symposium related to this manuscript.

Ethical approval:

Not applicable.

Previous presentations:

This information was presented in part at the conference mentioned above.

References

1 Flexner A. Medical Education in the United States and Canada. A Report to the Carnegie Foundation for the Advancement of Teaching. Bulletin No. 4. Boston, Mass: Updyke; 1910.

2 Hoover EL. A century after Flexner: The need for reform in medical education from college and medical school through residency training. J Natl Med Assoc. 2005;97:1232–1239.

3 Whitcomb ME. Commentary: Flexner redux 2010: Graduate medical education in the United States. Acad Med. 2009;84:1476–1478. http://journals.lww.com/academicmedicine/Fulltext/2009/11000/Commentary_Flexner_Redux_2010_Graduate_Medical.10.aspx. Accessed December 13, 2010.

4 Holmboe ES, Bowen JL, Green ML, et al. Reforming internal medicine residency training. J Gen Intern Med. 2005;20:1165–1172.

5 Schroeder SA, Sox HC. Internal medicine training: Putt or get off the green. Ann Intern Med. 2006;144:938–939.

6 Institute of Medicine. Resident Duty Hours: Enhancing Sleep, Supervision, and Safety. Washington, DC: National Academy Press; 2009.

7 Leipzig RM, Granville L, Simpson D, Anderson MB, Sauvigné K, Soriano RP. Keeping granny safe on July 1: A consensus on minimum geriatrics competencies for graduating medical students. Acad Med. 2009;84:604–610. http://journals.lww.com/academicmedicine/Fulltext/2009/05000/Keeping_Granny_Safe_on_July_1_A_Consensus_on.17.aspx. Accessed December 13, 2010.

8 Royal College of Physicians and Surgeons of Canada. Directions for Residency Education, 2009: The Final Report of the Core Competency Project. http://rcpsc.medical.org/residency/competency/index.php. Accessed December 17, 2010.

9 Simpson JG, Furnace J, Crosby J, et al. The Scottish doctor—Learning outcomes for the medical undergraduate in Scotland: A foundation for competent and reflective practitioners. Med Teach. 2002;24:136–143.

10 Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: From Flexner to competencies. Acad Med. 2002;77:361–367. http://journals.lww.com/academicmedicine/Fulltext/2002/05000/Shifting_Paradigms_From_Flexner_to_Competencies.3.aspx. Accessed December 13, 2010.

11 Smith SR, Dollase R. AMEE Guide No. 14: Outcomes-based education: Part 2—Planning, implementing and evaluating a competency-based curriculum. Med Teach. 1999;21:15–22.

12 Hodge S. The origins of competency-based training. Aust J Adult Learn. 2007;47:179–209.

13 Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: A systematic review of published definitions. Med Teach. 2010;32:631–637.

14 Frank JR, Danoff D. The CanMEDS initiative: Implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29:642–647.

15 Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21:103–111.

16 Association of American Medical Colleges. Scientific Foundations for Future Physicians: Report of the AAMC–HHMI Committee. https://www.aamc.org/download/64442/data/08209execsummary.pdf. Accessed December 13, 2010.

17 McMahon GT, Tallia AF. Perspective: Anticipating the challenges of reforming the United States Medical Licensing Examination. Acad Med. 2010;85:453–456. http://journals.lww.com/academicmedicine/Abstract/2010/03000/Perspective_Anticipating_the_Challenges_of.18.aspx. Accessed December 13, 2010.

18 Harden RM. International medical education and future directions: A global perspective. Acad Med. 2006;81(12 suppl):S22–S29. http://journals.lww.com/academicmedicine/Fulltext/2006/12001/International_Medical_Education_and_Future.5.aspx. Accessed December 13, 2010.

19 Huddle TS, Heudebert GR. Viewpoint: Taking apart the art: The risk of anatomizing clinical competence. Acad Med. 2007;82:536–541. http://journals.lww.com/academicmedicine/Fulltext/2007/06000/Viewpoint_Taking_Apart_the_Art_The_Risk_of.3.aspx. Accessed December 13, 2010.

20 Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: A systematic review. Acad Med. 2009;84:301–309. http://journals.lww.com/academicmedicine/Fulltext/2009/03000/Measurement_of_the_General_Competencies_of_the.11.aspx. Accessed December 13, 2010.

21 Sharp LK, Bashook PG, Lipsky MS, Horowitz SD, Miller SH. Specialty board certification and clinical outcomes: The missing link. Acad Med. 2002;77:534–542. http://journals.lww.com/academicmedicine/Fulltext/2002/06000/Specialty_Board_Certification_and_Clinical.11.aspx. Accessed December 13, 2010.

22 Hawkins RE, Boulet JR. Direct observation: Standardized patients. In: Holmboe ES, Hawkins RE, eds. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, Pa: Mosby-Elsevier; 2008:102–118.

23 Cleland JA, Abe K, Rethans JJ. The use of simulated patients in medical education: AMEE Guide No. 42. Med Teach. 2009;31:477–486.

24 Holmboe ES. Faculty and the observation of trainees' clinical skills: Problems and opportunities. Acad Med. 2004;79:16–22. http://journals.lww.com/academicmedicine/Fulltext/2004/01000/Faculty_and_the_Observation_of_Trainees_Clinical.6.aspx. Accessed December 13, 2010.

25 Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: Factors influencing analogic transfer and problem solving. Acad Med. 1998;73(10 suppl):S1–S5. http://journals.lww.com/academicmedicine/Citation/1998/10000/Exploring_the_Etiology_of_Content_Specificity_.28.aspx. Accessed December 13, 2010.

26 Ericsson KA. An expert-performance perspective of research on medical expertise: The study of clinical performance. Med Educ. 2007;41:1124–1130.

27 Ericsson KA. The influence of expertise and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charness N, Feltovich P, Hoffman RR, eds. Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press; 2006:685–706.

28 Eva KW, Regehr G. “I'll never play professional football” and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28:14–19.

29 Nelson EC, Batalden PB, Godfrey MM. Quality by Design: A Clinical Microsystems Approach. San Francisco, Calif: Jossey-Bass; 2007.

30 Lynn LA, Hess BJ, Conforti LN, Lipner RS, Holmboe ES. The effect of residency clinic systems on the quality of care for older adults in internal and family medicine residency programs. Acad Med. 2009;84:1732–1740. http://journals.lww.com/academicmedicine/Fulltext/2009/12000/Clinic_Systems_and_the_Quality_of_Care_for_Older.24.aspx. Accessed December 13, 2010.

31 Reddy SG, Babbott SF, Beasley BW, Nadkarni M, Gertner E, Holmboe ES. Prevalence and functionality of electronic health records in internal medicine continuity clinics. Acad Med. 2010;85:1369–1377. http://journals.lww.com/academicmedicine/Abstract/2010/08000/Prevalence_and_Functionality_of_Electronic_Health.25.aspx. Accessed February 8, 2011.

32 Holmboe E, Ginsburg S, Bernabeo E. The rotational approach to medical education: Time to confront our assumptions? Med Educ. 2011;45:69–80.

33 Sterkenburg A, Barach P, Kalkman C, Gielen M, ten Cate O. When do supervising physicians decide to entrust residents with unsupervised tasks? Acad Med. 2010;85:1408–1417. http://journals.lww.com/academicmedicine/Abstract/2010/09000/When_Do_Supervising_Physicians_Decide_to_Entrust.11.aspx. Accessed December 13, 2010.

34 Hirsh DA, Ogur B, Thibault GE, Cox M. “Continuity” as an organizing principle for clinical educational reform. N Engl J Med. 2007;356:858–866.

35 Kilminster S, Cottrell D, Grant J, Jolly B. AMEE Guide No. 27: Effective educational and clinical supervision. Med Teach. 2007;29:2–19.

36 Cox SM. “Forward feeding” about students' progress: Information on struggling medical students should not be shared among clerkship directors or with students' current teachers. Acad Med. 2008;83:801. http://journals.lww.com/academicmedicine/Fulltext/2008/09000/_Forward_Feeding_About_Students_Progress_.3.aspx. Accessed December 13, 2010.

37 Pangaro L. “Forward feeding” about students' progress: More information will enable better policy. Acad Med. 2008;83:802–803. http://journals.lww.com/academicmedicine/Fulltext/2008/09000/_Forward_Feeding_About_Students_Progress_The.2.aspx. Accessed December 13, 2010.

38 Babbott S. Commentary: Watching closely at a distance: Key tensions in supervising resident physicians. Acad Med. 2010;85:1399–1400. http://journals.lww.com/academicmedicine/Fulltext/2010/09000/Commentary_Watching_Closely_at_a_Distance_Key.7.aspx. Accessed December 13, 2010.

39 Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165:1493–1499.

40 Hafferty FW, Levinson D. Moving beyond nostalgia and motives: Towards a complexity science view of medical professionalism. Perspect Biol Med. 2008;51:599–615.

41 Hafferty FW. Beyond curriculum reform: Confronting medicine's hidden curriculum. Acad Med. 1998;73:403–407. http://journals.lww.com/academicmedicine/Abstract/1998/04000/Beyond_curriculum_reform_confronting_medicine_s.13.aspx. Accessed December 13, 2010.

42 MacDonald C. Nurse autonomy as relational. Nurs Ethics. 2002;9:194–201.

43 Sherwin S. A relational approach to autonomy in health care. In: Sherwin S, ed. The Politics of Women's Health: Exploring Agency and Autonomy. Philadelphia, Pa: Temple University Press; 1998:19–47.

44 Bowen JL, Salerno SM, Chamberlain JK, Eckstrom E, Chen HL, Brandenburg S. Changing habits of practice. Transforming internal medicine residency education in ambulatory settings. J Gen Intern Med. 2005;20:1181–1187.

45 Holmboe ES, Prince L, Green ML. Teaching and improving quality of care in a residency clinic. Acad Med. 2005;80:571–577. http://journals.lww.com/academicmedicine/Fulltext/2005/06000/Teaching_and_Improving_Quality_of_Care_in_a.12.aspx. Accessed December 13, 2010.

46 Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355:2217–2224.

47 Gruppen LD, Frohna AZ. Clinical reasoning. In: Norman GR, van der Vleuten CPM, Newble DI, eds. International Handbook of Research in Medical Education. Dordrecht, Netherlands: Kluwer Academic; 2002:205–230.

48 Aagaard E, Teherani A, Irby DM. Effectiveness of the one-minute preceptor model for diagnosing the patient and the learner: Proof of concept. Acad Med. 2004;79:42–49. http://journals.lww.com/academicmedicine/Fulltext/2004/01000/Effectiveness_of_the_One_Minute_Preceptor_Model.10.aspx. Accessed December 13, 2010.

49 Herbers JE, Noel GL, Cooper GS. How accurate are faculty evaluations of clinical competence? J Gen Intern Med. 1989;4:202–208.

50 Kalet A, Earp JA, Kowlowitz V. How well do faculty evaluate the interviewing skills of medical students? J Gen Intern Med. 1992;7:179–184.

51 Noel GL, Herbers JE Jr, Caplow MP, Cooper GS, Pangaro LN, Harvey J. How well do internal medicine faculty members evaluate the clinical skills of residents? Ann Intern Med. 1992;117:757–765.

52 Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents' clinical competence: A randomized trial. Ann Intern Med. 2004;140:874–881.

53 Landy FJ, Farr JL. Performance rating. Psychol Bull. 1980;87:72–107.

54 Holmboe ES. Direct observation by faculty. In: Holmboe ES, Hawkins RE, eds. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, Pa: Mosby-Elsevier; 2008:110–129.

55 Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA. 2009;302:1316–1326.

56 Williams RG, Klamen DA, McGaghie WC. Cognitive, social and environmental sources of bias in clinical performance settings. Teach Learn Med. 2003;15:270–292.

57 Govaerts MJ, van der Vleuten CP, Schuwirth LW, Muijtjens AM. Broadening perspectives on clinical performance assessment: Rethinking the nature of in-training assessment. Adv Health Sci Educ Theory Pract. 2007;12:239–260.

58 Driessen E, van der Vleuten C, Schuwirth L, van Tartwijk J, Vermunt J. The use of qualitative research criteria for portfolio assessment as an alternative to reliability evaluation: A case study. Med Educ. 2005;39:214–220.

59 National Board for Professional Teaching Standards. What is national board certification? http://www.nbpts.org/become_a_candidate/what_is_national_board_c. Accessed December 13, 2010.

60 Albanese M, Mejicano G, Xakellis G, Kokotailo P. Physician practice change I: A critical review and description of an integrated systems model. Acad Med. 2009;84:1043–1055. http://journals.lww.com/academicmedicine/Fulltext/2009/08000/Physician_Practice_Change_I_A_Critical_Review_and.18.aspx. Accessed December 13, 2010.

61 Albanese M, Mejicano G, Xakellis G, Kokotailo P. Physician practice change II: Implications of the integrated systems model (ISM) for the future of continuing medical education. Acad Med. 2009;84:1056–1065. http://journals.lww.com/academicmedicine/Fulltext/2009/08000/Physician_Practice_Change_II_Implications_of_the.19.aspx. Accessed December 13, 2010.

62 Green ML, Aagaard EM, Caverzagie KJ, et al. Charting the road to competence: Developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1:5–20.

63 Swing SR, Clyman SG, Holmboe E, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1:278–286.

64 Goodstone MS, Lopez FE. The frame of reference approach as a solution to an assessment center dilemma. Consult Psychol J Pract Res. 2001;53:96–107.

65 Hauenstein NMA. Training raters to increase accuracy of appraisals and the usefulness of feedback. In: Smither JW, ed. Performance Appraisal. San Francisco, Calif: Jossey-Bass; 1998:404–442.

66 Hemmer PA, Pangaro L. Using formal evaluation sessions for case-based faculty development during clinical clerkships. Acad Med. 2000;75:1216–1221. http://journals.lww.com/academicmedicine/Fulltext/2000/12000/Using_Formal_Evaluation_Sessions_for_Case_based.21.aspx. Accessed December 13, 2010.

67 Hemmer PA, Papp KK, Mechaber AJ, Durning SJ. Evaluation, grading, and use of the RIME vocabulary on internal medicine clerkships: Results of a national survey and comparison to other clinical clerkships. Teach Learn Med. 2008;20:118–126.

68 Vukanovic-Criley JM, Criley S, Warde CM, et al. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: A multicenter study. Arch Intern Med. 2006;166:610–616.

69 Braddock CH 3rd, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice: Time to get back to basics. JAMA. 1999;282:2313–2320.

70 Internal Medicine Residency Review Committee. ACGME Program Requirements for Graduate Medical Education in Internal Medicine. http://www.acgme.org/acWebsite/downloads/RRC_progReq/140_internal_medicine_07012009.pdf. Accessed December 13, 2010.

71 Holmboe ES, Davis MH, Carraccio C. Portfolios. In: Holmboe ES, Hawkins RE, eds. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, Pa: Mosby-Elsevier; 2008:86–101.

72 Nasca TJ. Where will the “milestones” take us? The next accreditation system. ACGME Bull. September 2008:3–5. http://www.acgme.org/acWebsite/bulletin/ACG11_BulletinSep08_F.PDF. Accessed December 13, 2010.

73 Medicare Payment Advisory Commission. Medical education in the United States: Supporting long-term delivery system reforms. In: Report to the Congress: Improving Incentives in the Medicare Program. Washington, DC: Medicare Payment Advisory Commission; June 2009:3–39. http://www.medpac.gov/documents/Jun09_EntireReport.pdf. Accessed December 13, 2010.

74 Iglehart JK. Medicare, graduate medical education and new policy directions. N Engl J Med. 2008;359:643–650.

75 Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach. 2006;28:497–526.

76 O'Sullivan PS, Irby DM. Reframing research on faculty development. Acad Med. 2011;86:421–428.


© 2011 Association of American Medical Colleges
