Many of the substantial changes seen over the last ten years have sought to bring an educational outcomes focus to the design and delivery of medical education.1 To date, the greatest changes have come in the preclerkship years2,3 and focus on integrating the basic sciences and clinical reasoning in the curriculum. Initially described by Neufeld and Barrows,4 problem-based learning (PBL) is implemented in many forms in the education of medical students.5,6 Eighty percent of U.S. medical schools report that they use PBL; however, 39 of these schools (45%) report that PBL accounts for fewer than 10% of preclinical contact hours.7 Despite the adoption of PBL in medical education, rigorous research-based evidence of its effectiveness is limited.
In 2000, Blake, Hosokawa, and Riley8 reported marked improvement in medical students’ United States Medical Licensing Examination (USMLE) Step 1 and Step 2 scores after the implementation of a PBL curriculum at the University of Missouri—Columbia School of Medicine (UMCSOM). The purpose of this article is to add to this prior work by reporting ten classes of students’ performances in the PBL curriculum at the UMCSOM.
Students are accepted into the UMCSOM through a traditional application process, as well as through two preadmissions programs. The Conley preadmissions program began with the graduating class of 1995 (entering in 1991), and the Bryant Rural Scholars program began with the graduating class of 2002 (entering in 1998). Students meeting the preadmissions requirements for these programs are not required to take the Medical College Admission Test (MCAT).
All students in the UMCSOM graduating class of 1997, who matriculated in 1993, participated in a new four-year PBL curriculum. The architects of the new curriculum (including authors MH and RB) drew from a rich educational literature to create a curriculum that is contextually sensitive and integrative across disciplinary boundaries and that requires communication and collaboration. The PBL curriculum situates learning and problem solving within a specific context to activate students’ prior knowledge and establish the relevance of the information to be learned; to help students learn information in the same way it will be used in practice, promoting transfer of learning; and to enhance elaboration of knowledge and improve retention. The PBL cases are sufficiently challenging that groups must work collaboratively to synthesize their ideas rather than simply fit the pieces together.9–13 Figure 1 provides an overview of the UMCSOM preclerkship curriculum.
The two-year preclerkship element of the new curriculum consists of three components: Basic Science/PBL, Introduction to Patient Care, and Patient Care Experiences. All medical students participate in the same curriculum, organized into eight blocks of ten weeks each across the first and second years. For eight weeks of each block, first-year students spend up to ten hours per week in PBL groups, with about ten hours of supplemental lectures and laboratories. Second-year students spend up to eight hours per week in PBL and up to ten hours in traditional learning activities. Lectures are designed to be conceptual, providing overviews rather than detailed presentations of the basic sciences. The basic sciences are integrated, and there are no departmental or discipline-based courses. The ninth week of each block is for student assessment, and the tenth week is free of all academic activities.
With the implementation of the PBL curriculum in 1993, class size at UMCSOM was reduced from 112 to 96 students. Each class meets as a group for lectures each block; the students are placed at random in 12 groups of eight with a “process-expert” faculty tutor. Students work through one PBL case each week and cover 64 cases by the completion of the second year. Student assessment in Basic Science/PBL is based on multiple-choice knowledge-based exams, “open-book” problem-solving exams, and the tutor’s assessment of each student’s performance in the small groups. Students are evaluated through a criterion-referenced grading system.
Preclerkship students gain clinical exposure through three structured experiences: Introduction to Patient Care, Ambulatory Care Experience, and Advanced Physical Diagnosis. In Introduction to Patient Care, students gain knowledge and skill in history taking, physical examination, psychosocial aspects of medicine, epidemiology, appropriate use of diagnostic tests, and psychopathology. In Ambulatory Care Experience, students shadow a faculty or community physician. In Advanced Physical Diagnosis, groups of four second-year students work with two clinicians to further develop their physical diagnosis skills. Students’ patient-care skills are evaluated with knowledge-based exams and skills tests and by the observations of tutors and preceptors.
The first class to graduate from the PBL curriculum also participated in a restructured third- and fourth-year curriculum. In the third year, the internal medicine core clerkship and the surgery core clerkship were reduced from 12 weeks to eight weeks and an eight-week family medicine clerkship was created. An eight-week Advanced Biomedical Science selective is required in the fourth year that students fulfill by original research or a scholarly review of the basic sciences in the context of a clinical problem.
Students take the USMLE Step 1 exam before or early in their third year and must pass the examination before beginning their fourth year. They must pass the USMLE Step 2 in order to graduate. There are no formal USMLE preparatory activities included in the curriculum.
Students matriculating in the UMCSOM curriculum for the graduating classes of 1993 through 2006 were the subjects of this study. The graduating classes of 1993 through 1996 participated in the traditional curriculum, whereas the PBL curriculum was in place for the graduating classes from 1997 through 2006. On average, 83% of students admitted to the UMCSOM were white and 17% were minorities, 44% were female, and students’ mean age was 24 years at matriculation. The demographic characteristics of the student body have remained stable over time.
Sources of data
Data include measures of students’ academic aptitude (MCAT and undergraduate grade point average [GPA] at the time of medical school application), students’ performance on the USMLE Step 1 and Step 2, and survey responses from residency program directors. The UMCSOM program routinely solicits residency program directors’ evaluations of graduates at the end of their first year of residency. Using a questionnaire developed at UMCSOM, program directors rate graduates’ performance on 17 unique elements, considering each of the elements independently. UMCSOM graduates are compared with all other interns in the residency program on each element. UMCSOM graduates are rated as (a) among the top 20% of first-year residents (Likert response 5); (b) average (Likert responses 2–4); or (c) among the lowest 20% of first-year residents (Likert response 1).
To explore the effects of a PBL curriculum on students’ knowledge of basic sciences (USMLE Step 1) and of clinical medicine (USMLE Step 2), we compared students’ Step 1 and Step 2 mean scores before and after introduction of the PBL curriculum. The current version of the USMLE was implemented in 1992; thus, scores for three traditional curriculum graduating classes are available for Step 1 comparisons, and scores for four classes are available for Step 2 comparisons. For each year, we compared the mean scores of UMCSOM students attempting the USMLE for the first time with the mean scores of all U.S. and Canadian students taking the exam for the first time and used the t-test to assess differences. We calculated effect sizes as the difference between the means divided by the standard deviation of the national data.
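Written out, the effect-size computation described above is a standardized mean difference (the symbols here are our own notation, not the authors’):

```latex
\[
  d \;=\; \frac{\bar{x}_{\mathrm{UMCSOM}} \;-\; \bar{x}_{\mathrm{national}}}{s_{\mathrm{national}}}
\]
```

where \(\bar{x}_{\mathrm{UMCSOM}}\) and \(\bar{x}_{\mathrm{national}}\) are the school and national mean scores for a given year, and \(s_{\mathrm{national}}\) is the standard deviation of the national first-time-taker scores.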
To explore differences in students’ USMLE performances before and after implementation of the PBL curriculum, we compared the numbers of UMCSOM students who scored in the 90th, 95th, and 99th percentiles nationally on the USMLE Step 1 and Step 2. The t-test was used to assess differences between the performance of the classes of 1993, 1994, 1995, and 1996 (pre-PBL) and each of the PBL graduating classes of 1997–2000. Scores for each administration of the USMLE are equated so that the three-digit score indicates the same level of performance across time; this equivalence holds even if the pass–fail standard changes. Percentiles, in contrast, are interpreted in light of the examinee group on which they are based: the same three-digit score may be associated with different percentile ranks in different examination periods, and the percentile rank compares an individual’s performance with that of all other first-time takers during a given examination period. Citing problems with the interpretation and comparison of percentile ranks, the USMLE announced that as of May 1999 percentile information would no longer be provided with reports of USMLE scores.15 Percentile rank data are available through the graduating class of 2000 for Step 1 and the class of 1999 for Step 2. Significance is annotated when the comparisons between the PBL curriculum and each of the years of the traditional curriculum are significant, with p < .01.
To explore possible effects of students’ baseline characteristics on USMLE scores, we compared the mean MCAT scores and undergraduate GPAs of UMCSOM matriculants with those of all U.S. and Canadian medical school matriculants over the same period.14 The class of 1996 was the first class to complete the revised MCAT; thus, we provide comparative data from that class onward. We compared the total combined MCAT scores as well as the verbal, biological sciences, and physical sciences components of the MCAT, using t-tests to assess differences. To explore possible increases in USMLE scores attributable to students for whom MCAT data were unavailable, we compared the mean USMLE scores of students with MCAT scores to those of students who did not attempt the MCAT, again using t-tests to assess differences. We calculated effect sizes as the difference between the means divided by the standard deviation of the national data.
To assess differences in faculty contact hours, we reviewed syllabi from the traditional curriculum and the scheduled activities of the PBL curriculum.
To explore UMCSOM graduates’ performance during the first year of residency, we compared program directors’ ratings on each of 17 characteristics for the seven years prior to implementation of the PBL curriculum (graduating classes of 1990–1996) with ratings for the graduating classes of 1997–2003. Because the “lowest 20%” category contained only 1% of the total responses, we combined the “average” and “lowest 20%” categories. We then used the chi-square test with one degree of freedom to assess pre/post differences for each of the 17 characteristics, comparing responses that placed UMCSOM graduates in the top 20% with the combined category.
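As an illustration of the pre/post comparison described above, the following sketch computes a Pearson chi-square statistic with one degree of freedom for a 2 × 2 table (“top 20%” vs. the combined “average/lowest 20%” category, before vs. after PBL). The counts shown are hypothetical, not the study’s data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d          # row totals
    col1, col2 = a + c, b + d          # column totals
    chi2 = 0.0
    for observed, row, col in ((a, row1, col1), (b, row1, col2),
                               (c, row2, col1), (d, row2, col2)):
        expected = row * col / n       # expected count under independence
        chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical counts: rows = pre-PBL / post-PBL graduates,
# columns = rated "top 20%" / rated "average or lowest 20%".
print(chi_square_2x2(20, 80, 40, 60))  # larger values indicate a stronger pre/post difference
```

The resulting statistic would be compared against the chi-square distribution with one degree of freedom to obtain a p value (the study used p < .01).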
The institutional review board, Health Sciences Section, of UMCSOM approved this project through exempt review. We used SPSS version 11.0 statistical software (SPSS Inc., Chicago, IL) for the statistical analysis.
Ninety-six students are admitted to each class and evaluated using criterion-referenced grading. Years in which fewer than 96 students attempted the USMLE occurred when there were students who were unable to meet the evaluation criteria or who progressed through the curriculum at a different rate due to a planned leave of absence, participation in an MD/PhD program, or participation in other research programs.
Performance on the USMLE
Figure 2 illustrates the mean USMLE Step 1 scores for UMCSOM compared to the mean score of all first-time examinees in the United States and Canada.
The UMCSOM class of 1997 was the first graduating class to complete the PBL curriculum. Mean Step 1 scores for two of the classes graduating prior to the curricular change (1994 and 1995) were similar to national mean scores, whereas the mean score for the UMCSOM class of 1996 was significantly lower than the national mean. Mean scores of six of the ten classes that completed the PBL curriculum were significantly higher than national means.
Figure 3 illustrates the mean USMLE Step 2 scores for UMCSOM students compared to national means. Mean Step 2 scores for the last four classes to complete the traditional curriculum were near or lower than the national means. Mean scores for six of the PBL classes were significantly higher than the national means.
Table 1 shows the percentile ranks of UMCSOM students’ performance on the Step 1 and Step 2 exams before and after implementation of the PBL curriculum. These data show an increase in the numbers of UMCSOM students who scored in the 90th, 95th, and 99th percentiles after the introduction of PBL as the major learning strategy. Seven of the 12 comparisons were significant for the Step 1 scores, and these gains persisted in the Step 2 scores, with four of nine comparisons demonstrating significance.
Undergraduate GPA and MCAT
Table 2 illustrates UMCSOM students’ mean MCAT component scores compared with the national means. UMCSOM students obtained slightly higher MCAT verbal scores, with one of 11 yearly comparisons between UMCSOM and the national cohort demonstrating significance. UMCSOM students’ biological sciences MCAT scores were generally lower than the national mean, with two of 11 comparisons demonstrating significance. UMCSOM students’ physical sciences MCAT scores were also generally lower than the national mean, with five of these comparisons demonstrating significance.
Figure 4 illustrates the effect sizes for the MCAT component scores and the effect sizes of the USMLE Step 1 and Step 2 examinations. Six of the ten post-PBL comparisons of the Step 1 scores were statistically significant. The effect sizes of the post-PBL Step 1 comparisons were medium, ranging from –0.11 to 0.62.16 Six of nine post-PBL comparisons of Step 2 scores achieved significance. The effect sizes ranged from 0.01 to 0.52, also demonstrating medium effect sizes.16
Figure 5 and Figure 6 illustrate the mean USMLE Step 1 and USMLE Step 2 scores for three groups: all UMCSOM students, UMCSOM students who completed the MCAT, and UMCSOM students who participated in a preadmission program and did not attempt the MCAT. Comparisons between the UMCSOM students with and without MCATs did not demonstrate statistically significant differences between the mean USMLE Step 1 and Step 2 scores of the two groups.
University of Missouri—Columbia School of Medicine students obtained cumulative undergraduate GPAs higher than the national mean of all medical school matriculants, with nine of 11 comparisons demonstrating significance. Increases in UMCSOM students’ GPAs parallel the upward trend in GPAs of all medical school matriculants since the 1980s.14 The mean cumulative undergraduate GPA of UMCSOM traditional curriculum matriculants for the classes of 1993–1996 (pre-PBL) was 3.41, whereas the mean undergraduate GPA of UMCSOM students matriculating in the PBL curriculum for the classes of 1997–2006 was 3.65. This change parallels the increase in undergraduate GPAs for all medical school matriculants.
Our review of syllabi showed that students participating in the traditional curricula had 1,883 faculty contact hours per year. Although PBL groups varied in the amount of time required to work through a case, UMCSOM PBL students spend between 1,750 and 1,800 hours per year in structured learning activities. PBL students thus had fewer faculty contact hours than students who matriculated in our traditional curriculum.
Performance in the first year of residency
To explore changes in UMCSOM graduates’ performances during their first year of residency, we compared program directors’ responses to 17 discrete elements for graduates’ performance before and after implementation of the PBL curriculum. Table 3 provides the elements on which residency program directors rated UMCSOM graduates. Twelve of the 17 comparisons were significant (p < .01). In every comparison, students who completed the PBL curriculum received higher scores from the program directors than did students from the traditional curriculum.
Gains on standardized exams
Previous studies have found that students participating in a PBL curriculum score the same or lower on standardized knowledge tests than do traditional curriculum students.17–20 Albanese and Mitchell21 concluded that students in PBL curricula did not perform significantly better on the USMLE Step 1, but rather performed at a slightly lower level than students in traditional curricula. A meta-analysis by Vernon and Blake22 concluded that PBL students had a slightly lower mean score than students in a traditional curriculum. Using a narrative approach, Berkson23 drew similar conclusions.
The chronology of the meta-analyses by Vernon and Blake and by Albanese and Mitchell is of interest to this discussion, as both were published in 1993. The USMLE was implemented in 1992–1994 as the successor to the National Board of Medical Examiners certifying examinations (Parts 1, 2, and 3) and the Federation Licensing Examination. The USMLE “assesses a physician’s ability to apply knowledge, concepts, and principles, and to demonstrate fundamental patient-centered skills.”24 Gains in the accretion and retention of knowledge that are evident in this study but not present in prior research may, in part, be attributed to the changes in the examination process. If the USMLE assesses students’ abilities to use knowledge within a specific context, PBL students may be anticipated to perform well on the examination as they are being tested in the same way in which they acquire new knowledge. Thus, studies included in the meta-analyses prior to 1992 may only provide an accurate picture of students’ performances prior to the changes to the USMLE examination.
Gains beyond standardized exams
Extant research demonstrates student satisfaction with PBL, as well as graduates’ perceptions of their superior critical thinking, patient communication, team work, and clinical decision-making skills.19,25–29 Recently, Distlehorst and colleagues30 reported significantly better combined clerkship performance for students completing a PBL curriculum than for traditional curriculum students. PBL students’ enhanced clinical performance has been described by other schools with dual curricula.17,18,31,32
Overall effectiveness of PBL
Colliver’s33 review of the effectiveness of PBL in medical education concluded that randomized studies show no effect for PBL and that the majority of nonrandomized studies demonstrate small effects that may be accounted for by selection differences among PBL students. Colliver34 is particularly concerned that the observed effect sizes are minimal. Gains in student performance from a PBL curriculum are discounted on two grounds: changes may be attributed to selection of academically stronger students (either because this active form of learning is more attractive or because of decreased class sizes), or to the increased time on task associated with the PBL curriculum.
Selection of academically advantaged students
Our analysis of MCAT scores, coupled with the analysis of the performance of the preadmitted and traditional admissions students, provides strong evidence that the success of the UMCSOM students on the USMLE Step 1 and Step 2 exams is not explained by selection of students with academically superior science knowledge or aptitude for taking multiple-choice exams. The differences in verbal scores are not sufficient to account for the differences in USMLE scores.
Students matriculating at UMCSOM consistently have slightly higher mean cumulative GPAs than the mean for all matriculating medical students nationwide. However, this difference in GPA is unlikely to account for the superior performances on the Step 1 and Step 2 exams. The GPA profile of matriculants to any medical school is highly dependent on the undergraduate institutions from which the school draws applicants. The literature reports many attempts to compensate for the psychometric limitations of college GPAs. Mitchell35 created a metric for the “quality” of the degree-granting institution. Basco, Way, Gilbert, and Hudson36 reported better correlation between MCAT component scores and performance on USMLE Step 1 than between science GPA and Step 1 performance.
More recently, Julian37 reported a longitudinal study demonstrating the effectiveness of MCAT scores as predictors of academic success. In this comprehensive analysis, the author suggests that MCAT scores may replace the need for undergraduate GPAs in predicting academic success and performance on the USMLE examinations.
Our analysis of the increased number of students scoring in the 90th, 95th, and 99th percentile on the USMLE examinations after the introduction of PBL as the major learning strategy adds further evidence that UMCSOM students’ performance on the USMLE examinations is not explained by selection of academically superior students. Although the class size was smaller after the implementation of PBL, a greater number of UMCSOM students scored in the highest percentiles nationally on Step 1 and Step 2 exams.
In addition, the Step 1 and 2 differences between PBL and traditional curriculum students demonstrated a moderate effect size, while differences in their MCAT scores demonstrated small or negative effect sizes.16 The effect sizes add further evidence that improved USMLE examination performance by PBL students is not explained by selection of academically superior students.
The second argument used to account for any observed gains from PBL is that because students spend an increased amount of time on task, learning is enhanced. However, the amount of time students spend in formal learning activities and with faculty did not increase with the introduction of PBL at the UMCSOM. In fact, it decreased. One could argue that the complexity of the authentic problems presented to the PBL students produced cognitive dissonance and uncertainty, which motivated the students to actively engage in self-directed learning outside of formal learning events. This quickly becomes an argument in favor of PBL as an educational strategy.
The metric by which PBL success is most frequently gauged is mastery of knowledge. Our data demonstrate increased gains in students’ knowledge as evidenced by their performance on the USMLE. Mastery of knowledge provides important but incomplete information on the ability to become a competent physician. Both the Accreditation Council for Graduate Medical Education and the Medical School Objectives Project have increasingly focused on a physician’s ability to practice within a complex system and as part of a health care team that includes the patient and family. These competencies will require many of the skills fostered through the PBL curricula. Extant research highlights that the PBL curriculum enhances graduates’ cooperative and problem-solving skills, improves their facility for working independently,28 develops their ability to manage patients’ psychosocial problems, increases their readiness to practice humanistic medicine,29 and facilitates their ability to communicate with patients.26
Data obtained from program directors for UMCSOM graduates’ performance at the end of their first year of residency suggest that PBL graduates gain skills that enhance their abilities to become competent physicians. These data demonstrate movement out of the “average” range and into the “top 20% among all residents in the program” category after the implementation of PBL. The analysis of residency performance data shows gains beyond knowledge, including PBL graduates’ enhanced communication and collaboration skills, maturity, initiative, and ability to project the qualities of a good physician. This study adds to the literature by confirming residency program directors’ perceptions of PBL graduates’ enhanced performance on the individual facets of clinical practice.
This study extends the findings of Blake, Hosokawa, and Riley.8 Our analysis demonstrates that the improvements in knowledge fostered by participation in a PBL curriculum during the preclerkship years are sustained through the clinical years, as evidenced by students’ performance on the USMLE Step 2. Further, these gains persist into the first year of residency, as evidenced by the change in program directors’ evaluations after the implementation of the PBL curriculum.
Our research is limited in that it examines performance in one school before and after an educational intervention. Yet we believe there is value in reporting an individual school’s experience with curricular change, particularly when such reports provide new or differing information from that previously reported in the literature. As with much reported educational research, our study would have been strengthened by randomization. The data collected on the program directors’ evaluations of UMCSOM graduates were not originally designed for research purposes but to gather data for programmatic evaluation of the curriculum. Data are available for graduates’ performance up to and including their first year of residency training. Undergraduate GPA comparisons were based on students’ GPAs at the time of medical school application and may not accurately reflect final undergraduate GPAs. Finally, Cohen’s16 standards for effect size should be interpreted with caution, as there is a certain risk in offering operational definitions in fields of inquiry as diverse as the behavioral sciences.
At the UMCSOM, curricular changes implemented with the graduating class of 1997 have resulted in improved student performances on the national licensing examinations that have persisted over a decade. The changes from a traditional to a PBL curriculum have better provided our graduates with the knowledge and skills needed to practice within a complex health care system. These outcomes demonstrate that PBL is an effective learning strategy in our medical school curriculum.
1 Whitcomb M. More on competency-based education. Acad Med. 2004;79:493–94.
2 Whitcomb M. Responsive curriculum reform: continuing challenges. The education of medical students: ten stories of curriculum change. Milbank Mem Fund Q. 2000;1–10.
3 The AAMC Project on the Clinical Education of Medical Students [monograph]. Washington, DC: Association of American Medical Colleges, 2001.
4 Neufeld VR, Barrows HS. The “McMaster philosophy”: an approach to medical education. J Med Educ. 1974;49:1040–50.
5 Maudsley G. Do we all mean the same thing by “problem-based learning”? A review of the concepts and a formulation of the ground rules. Acad Med. 1999;74:178–85.
6 Barrows HS. Problem-Based Learning Applied to Medical Education. Springfield, IL: Southern Illinois University School of Medicine, 2000.
7 Kinkade S. A snapshot of the status of problem-based learning in U.S. Medical Schools, 2003–04. Acad Med. 2005;80:300–301.
8 Blake RL, Hosokawa MC, Riley SL. Student performances on Step 1 and Step 2 of the United States Medical Licensing Examination following implementation of a problem-based learning curriculum. Acad Med. 2000;75:66–70.
9 Duch BJ. Writing problems for deeper understanding. In: Duch BJ, Groh SE, Allen DE (eds). The Power of Problem-Based Learning: A Practical “How to” of Teaching Undergraduate Courses in Any Discipline. Sterling, VA: Stylus, 2001:47–48.
10 Jonassen D. Toward a design theory of problem solving. Educ Technol Res Dev. 2000;48:63–85.
11 Gijselaers WH. Connecting problem-based practices with educational theory. In: Wilkerson L, Gijselaers WH (eds). Bringing Problem-Based Learning to Higher Education: Theory and Practice. New Directions for Teaching and Learning, no. 68. San Francisco: Jossey-Bass, 1996:13–21.
12 Drummond-Young M, Mohide EA. Developing problems for use in problem-based learning. In: Rideout E (ed). Transforming Nursing Education Through Problem-based Learning. Boston: Jones and Bartlett, 2001:165–191.
13 Delisle R. How to Use Problem-Based Learning in the Classroom. Alexandria, VA: Association for Supervision and Curriculum Development, 1997.
14 MCAT Scores and GPAs for Applicants and Matriculants, 1992–2003 (http://www.aamc.org/data/facts/2003/2003mcatgpa.htm). Accessed 23 March 2006. Association of American Medical Colleges, Washington, DC.
16 Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1988.
17 Goodman LJ, Brueschke EE, Bone RC, Rose WH, Williams EJ, Paul HA. An experiment in medical education: a critical analysis using traditional criteria. JAMA. 1991;265:2373–76.
18 Moore GT, Block SD, Style CB, Mitchell R. The influence of the new pathway curriculum on Harvard medical students. Acad Med. 1994;69:983–89.
19 Mennin SP, Friedman M, Skipper B, Kalishman S, Snyder J. Performances on the NBME I, II and III by medical students in the problem-based learning and conventional tracks at the University of New Mexico. Acad Med. 1993;68:616–24.
20 Distlehorst LH, Barrows HS. A new tool for problem-based, self-directed learning. J Med Educ. 1982;62:486–88.
21 Albanese MA, Mitchell S. Problem-based learning: a review of literature on its outcomes and implementation issues. Acad Med. 1993;68:52–81.
22 Vernon DTA, Blake RL. Does problem-based learning work? A meta-analysis of evaluative research. Acad Med. 1993;68:303–15.
23 Berkson L. Problem-based learning. Have the expectations been met? Acad Med. 1993;68:S79–S88.
24 National Board of Medical Examiners Statement of Guiding Principles (http://www.nbme.org/about/about.asp). Accessed 23 March 2006. National Board of Medical Examiners, Philadelphia, PA.
25 Woodward CA, Ferrier BM. The content of the medical curriculum at McMaster University: graduates’ evaluation of their preparation for postgraduate training. Med Educ. 1983;17:54–60.
26 Santos-Gomez L, Kalishman S, Rezler A, Skipper B, Mennin SP. Resident performance of graduates from a problem-based and a conventional curriculum. Med Educ. 1990;24:366–75.
27 Schmidt HG, van der Molen H. Self-reported competency ratings of graduates of a problem-based medical curriculum. Acad Med. 2001;76:466–68.
28 Peters AS, Greenberger-Rosovsky R, Crowder C, Block SD, Moore GT. Long-term outcomes of the new pathway program at Harvard Medical School: a randomized controlled trial. Acad Med. 2000;75:470–79.
29 Antepohl W, Domeij E, Forsberg P, Ludvigsson J. A follow-up of medical graduates of a problem-based learning curriculum. Med Educ. 2003;155–62.
30 Distlehorst LH, Dawson E, Robbs RS, Barrows HS. Problem-based learning outcomes: the glass half-full. Acad Med. 2005;80:294–99.
31 Richards BF, Ober KP, Cariaga-Lo L, et al. Ratings of students’ performances in a third-year internal medicine clerkship: a comparison between problem-based and lecture-based curricula. Acad Med. 1996;71:187–89.
32 Kaufman A, Mennin S, Waterman R, et al. The New Mexico experiment: educational innovation and institutional change. Acad Med. 1989;285–94.
33 Colliver JA. Effectiveness of problem-based learning curricula: research and theory. Acad Med. 2000;75:259–65.
34 Colliver JA. Are qualitative studies of the PBL tutorial process indicated? [Letter.] Acad Med. 2001;76:215.
35 Mitchell KJ. Traditional predictors of performance in medical school. Acad Med. 1990;65:149–58.
36 Basco WT, Way DP, Gilbert GE, Hudson A. Undergraduate institutional MCAT scores as a predictor of USMLE Step 1 performance. Acad Med. 2002;77:S13–S16.
37 Julian ER. Validity of the Medical College Admission Test for predicting medical school performance. Acad Med. 2005;80:910–917.
© 2006 Association of American Medical Colleges