OBJECTIVE: To evaluate the effect of a resident‐created study guide on Council on Resident Education in Obstetrics and Gynecology (CREOG) In‐Training and American Board of Obstetrics and Gynecology (ABOG) written examination scores.
METHODS: In 1995, a group of residents at the University of Texas Southwestern Medical Center began creating an annual study guide based on the CREOG Test Item Summary Booklet. Individual, program, and national scores for the 3 years before the intervention were compared with scores for the 3 years after the intervention. A four‐way analysis of variance was used to evaluate the effect of the intervention, accounting for sex, Alpha Omega Alpha Medical Honor Society (AOA) status, and calendar year. A random effects model was also used to adjust for confounders. Categoric variables were compared using Mantel‐Haenszel χ2. Program failure rates for the ABOG written examination before and after the intervention were compared using relative risks.
RESULTS: After introduction of the study guide, the annual difference between our program and the national percent correct increased significantly (from 2.1% to 4.8%, P < .001) after adjustment for AOA status and calendar year. The improvement was distributed across resident levels 2–4 (all P < .02) and among non‐AOA residents (P ≤ .001). The relative risk of failure of the written ABOG examination before the study guide was 3.5 (95% confidence interval 0.77, 15.9).
CONCLUSION: These findings demonstrate an important cooperative use of the Test Item Summary Booklet as an educational resource.
A resident‐created study guide based on the CREOG Test Item Summary Booklet is useful in examination preparation.
Department of Obstetrics and Gynecology, University of Texas Southwestern Medical Center, Dallas, Texas.
Address reprint requests to: Lisa M. Hollier, MD, MPH, Department of Obstetrics, Gynecology and Reproductive Science, University of Texas Houston Health Science Center, Lyndon Baines Johnson General Hospital, 5656 Kelley Street, Houston, TX 77026; E‐mail: firstname.lastname@example.org.
Received March 22, 2001. Received in revised form August 1, 2001. Accepted August 9, 2001.
The American College of Obstetricians and Gynecologists organized the Council on Resident Education in Obstetrics and Gynecology (CREOG) in 1967. One of the important missions of the council was to develop an in‐training examination for residents. In 1970, the first of the yearly CREOG examinations was administered. The purpose of the testing is to provide both residents and program directors with a tool for assessing the cognitive knowledge of the residents. The knowledge base expected of the residents has been defined by CREOG in Educational Objectives for Residents in Obstetrics and Gynecology (1999).1
Until 1991, the CREOG examination booklets were returned to the residents after completion of the testing. This procedure negated the value of the examination as a determinant of a marginal or failing score for individual residents and prevented standardization of the resident scores to show progress from year to year. As a result, the CREOG In‐Training Examination Test Item Summary Booklet was developed and first distributed in 1993. The Booklet consists of several components: 1) a categorization of each question, 2) a key word phrase providing a focal point, and 3) several item‐specific text references. In 1993 and 1994, the Booklet contained the answers to the actual questions posed on the examination; however, these are no longer provided.
A 1994 survey of residents taking the CREOG examination found that the Test Item Summary Booklet was underused: only 43% of obstetrics and gynecology residents surveyed rated it “helpful” or “very helpful” as a preparatory aid.2
There are no published reports evaluating the impact of the CREOG Test Item Summary Booklet as a study guide on CREOG examination performance. We therefore sought to measure the effect of a resident‐created study guide based on the Test Item Summary Booklet on resident CREOG scores, program CREOG scores, and the failure rate for the written examination of the American Board of Obstetrics and Gynecology (ABOG) in our residency program.
MATERIALS AND METHODS
We evaluated examination performance for our residency before and after the introduction of a study guide in April of 1995. After the 1995 examination in January, the residents at the University of Texas Southwestern Medical Center created a study guide using the Test Item Summary Booklet as a basis. The items in the Booklet were distributed among the 56 residents in the program, and each resident received only 7 or 8 items. Topics were researched using the references provided in the Summary Booklet and were written as a discussion of the key word phrase. Each synopsis was limited to several paragraphs and was then reviewed by resident editors and faculty. The finalized summaries were compiled into book form and initially distributed in April of 1995 to prepare for the 1995 ABOG written examination and the 1996 CREOG examination. Figure 1 illustrates the distribution of the before and after groups for analysis of examination performance.
An entirely new study guide is created each year, and the items in the latest Test Item Summary Booklet are again divided among the residents. The residents are entirely responsible for the creation of the study guide, although the department pays the cost of printing, which is approximately $12 per guide. It is distributed by November of each year.
All resident demographic information, individual resident scores, program composite scores, and national CREOG scores were entered into a database. The CREOG examination questions and scores are divided two ways: into six subjects or into five categories. Subjects are general considerations, ambulatory health, obstetrics, gynecology, reproductive endocrinology, and gynecologic oncology. Categories are mechanisms of disease, diagnosis, risks and benefits, medical therapy, and practice management. The overall mean percent correct for the individual resident, our program, and the nation were obtained from the distributed examination result sheets. The individual, program, and national mean percent correct by subject were obtained from the distributed examination result sheets. For each category, the mean percent correct for individuals and our program was taken from the examination result reports; however, the national mean percent correct was obtained from the CREOG office in Washington, DC (personal communication, A. Carpentieri, 2000). CREOG standardized scores were not used. The American Board of Obstetrics and Gynecology written examination scores were obtained from annual reports to our residency program. The identity of the individual resident was unknown in all ABOG results.
Initially, the annual differences in our program's mean percent correct and the national mean percent correct were calculated and compared for the 3 years before and after the introduction of the study guide using Student t tests. Categoric variables were analyzed by χ2 in before‐and‐after analyses and by Mantel‐Haenszel χ2 for trends over the 6‐year period.3 To adjust for the potential effects of differing academic abilities of residents over time, we identified the residents who had been elected to membership in the Alpha Omega Alpha Medical Honor Society (AOA) as medical students.
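To make the trend analysis concrete, the chi-square for a linear trend in proportions across ordered years (the Cochran-Armitage form, closely related to the Mantel-Haenszel trend statistic cited above) can be sketched as follows. This is an illustration only; the counts below are hypothetical and are not the study's data.

```python
import math

def trend_test(scores, failures, totals):
    """Cochran-Armitage-style chi-square test for a linear trend in
    proportions across ordered groups (e.g., failure rates by year).
    Returns the 1-df chi-square statistic."""
    N = sum(totals)
    p_bar = sum(failures) / N  # pooled proportion
    # T: covariance between the group score and the excess failures
    T = sum(x * (r - n * p_bar)
            for x, r, n in zip(scores, failures, totals))
    s1 = sum(n * x for x, n in zip(scores, totals))
    s2 = sum(n * x * x for x, n in zip(scores, totals))
    var_T = p_bar * (1 - p_bar) * (s2 - s1 * s1 / N)
    return T * T / var_T  # compare with chi-square on 1 df

# Hypothetical failure counts over 6 ordered years (not the paper's data)
chi2 = trend_test(scores=[1, 2, 3, 4, 5, 6],
                  failures=[4, 3, 3, 2, 1, 1],
                  totals=[15, 14, 15, 14, 15, 14])
```

With these made-up counts the statistic is about 3.38, just below the 3.84 cutoff for significance at the .05 level on 1 df.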
Because the residents took multiple examinations, the comparisons of mean percent correct were analyzed by a random effects model, which investigated the effects of the intervention on the 132 residents across the years of their examinations. This analysis accommodated correlations among repeated observations and therefore allowed the inclusion of residents as the primary sampling unit (n = 132) with multiple observations per resident (up to 4 years). The fixed effects included in the model were national mean score for the year the examination was taken, program year, AOA status, sex of resident, and the major effect of interest as to whether the examination was before or after the introduction of the study guide.
To evaluate the effect of the intervention (study guide) on the CREOG scores we also used a four‐way analysis of variance with interaction between the following factors: AOA status, sex, calendar year, and before‐and‐after study guide introduction.
The percentage of residents in our program who failed the ABOG written examination as first‐time examinees was compared in the 3 years before and after the introduction of the study guide and expressed as relative risk with 95% confidence intervals. This project was approved by the Institutional Review Board at the University of Texas Southwestern Medical Center at Dallas.
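The relative risk and its confidence interval can be computed with the standard log-scale (Katz) approximation, sketched below. The failure counts here are hypothetical, chosen only for illustration; the paper reports just the summary figures (RR 3.5, 95% CI 0.77, 15.9).

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk of failure (group 1 vs. group 2) with a 95% CI
    computed on the log scale (Katz method)."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 7 of 40 failures pre- vs. 2 of 40 post-study guide
rr, lo, hi = relative_risk_ci(7, 40, 2, 40)
```

With these illustrative counts the point estimate is exactly 3.5 and the interval spans roughly 0.77 to 15.8, close to the values reported above; a CI that crosses 1.0 is why the difference is not statistically significant.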
RESULTS
In the years from 1993 to 1998, 132 residents took a total of 335 CREOG examinations. The percentage of female residents increased significantly within our program from 48% to 69% (P = .01). There also were significantly more AOA members in our residency in 1998 (32%) compared with 1993 (10%) (P = .02).
For overall CREOG examination scores, the annual difference between our program's mean percent correct and the national mean percent correct was significantly higher in the 3 years after the introduction of the study guide (2.1% versus 4.8%, P < .001 after adjusting for AOA and resident level) (Figure 2).
A similar pattern occurred for all components. However, the differences in mean percent correct in reproductive endocrinology and gynecologic oncology were not significant (data not shown).
Using a random effects model to accommodate for correlations among repeated observations, performance at each year level within our program was investigated. The mean percent correct for postgraduate years 2, 3, and 4 were significantly higher after introduction of the study guide (Table 1).
Because there were significantly more residents who had been elected to AOA taking the examination in the post–study guide years, we evaluated examination performance stratified by AOA status. Residents who were AOA members had higher mean scores not only overall (Figure 3) but also for all examination subjects and categories in the 3 years before the introduction of the study guide (data not shown). There were no significant improvements in the examination scores of the AOA residents after the introduction of the study guide (P = .98). For non‐AOA residents, on the other hand, there were statistically significant improvements in the post–study guide years (P ≤ .001 for trend). Table 2 shows the improvement for the non‐AOA residents, adjusting for national mean score and gender, with stratification by resident level.
We stratified the examination performance by gender. Similar improvements in the mean scores after the introduction of the study guide were seen for both men and women (P < .001 for each).
A factorial analysis of variance including before‐and‐after study guide introduction, AOA status, gender, calendar year, and their interactions was consistent with the findings of the random effects model, showing that gender was not associated with examination performance, either as a main effect or an interaction. Additionally, the AOA residents had better performance globally (P < .01); however, the gap between AOA and non‐AOA scores narrowed in the years after the introduction of the study guide (P = .007). Finally, the scores in the post–study guide years increased significantly after adjustment for AOA status and calendar year (P = .041).
The effect of the study guide on the failure rate for the written examination of the ABOG was also evaluated (Table 3). There were 87 first‐time examinees from 1992 to 1997 (3 years pre‐ and post–study guide). The relative risk of failure in the years before compared with after was 3.5 (95% confidence interval 0.77, 15.9).
DISCUSSION
Since its introduction more than 20 years ago, the CREOG In‐Training examination has become a major tool for the objective assessment of the cognitive knowledge of individual residents. In the 3 years after the introduction of a study guide based on the Test Item Summary Booklet, we found an improvement in our residents' performance on the CREOG examination.
During the 3 years before the introduction of the study guide, the program scores were already increasing. After the introduction of the study guide, however, the increase was more pronounced. Improvement in the scores was distributed across residency levels 2–4 and broadly across all topic areas.
There was one important exception to the general improvement. Using AOA status as a surrogate for prior academic performance, we found that the majority of the improvement in scores was attributable to non‐AOA residents. This is plausible for several reasons. The AOA residents may have been less likely to incorporate the study guide into their established study pattern. Alternatively, review of the study guide may not have improved their fund of knowledge. Finally, the lack of improvement may have been related to a “ceiling effect”—it is more difficult to raise a high score than a low one.
There was no significant effect of the study guide on the scores for the first‐year residents. It may be that the interns do not yet have adequate clinical experience or knowledge to find the study guide helpful. Alternatively, they may not have adequate time to use the study guide as it is distributed in November of each year.
Because the Test Item Summary Booklet closely mirrors the CREOG examination, we chose to examine an additional end point, the failure rate for the ABOG written examination. The difference in the failure rate after the introduction of the study guide was not statistically significant; however, this difference represents one additional resident achieving a passing score on an annual basis.
Because our comparison was retrospective, we are limited to an evaluation of changes in performance. We cannot distinguish between the effects of the process and the product. We want to emphasize that the study guide is recreated by the residents each year. By using the text references supplied in the Test Item Summary Booklet, residents were encouraged to read. They not only reviewed several specific topics, but needed to understand the material and summarize it for their colleagues. This exercise may have served to reinforce the importance of learning from textbooks and improved their ability to assimilate and synthesize medical information. The process has also served to emphasize the importance of cognitive knowledge as assessed by both the CREOG and ABOG examinations. Residents may have been more motivated to study hard and perform well. Finally, the completed study guide is an excellent resource for review. The format of the study guide allows the individual resident to pursue a general or a focused approach to further learning, depending upon their own strengths and weaknesses.
We found an improvement in our resident cognitive performance after introducing a study guide based on the Test Item Summary Booklet. Our findings support the important educational value of the Summary Booklet.
REFERENCES
1. Council on Resident Education in Obstetrics and Gynecology. Educational objectives. Core curriculum in obstetrics and gynecology. 6th ed. Washington, DC: Council on Resident Education in Obstetrics and Gynecology, 2000.
2. Cox SM, Herbert WN, Grosswald SJ, Carpentieri AM, Visseher HC, Laube DW. Assessment of the resident in-training examination in obstetrics and gynecology. Obstet Gynecol 1995;84:1051–4.
3. Breslow NE, Day NE. Statistical methods in cancer research. Volume 1—The analysis of case-control studies. IARC Sci Publ 1980;32:5–338.
© 2002 by The American College of Obstetricians and Gynecologists.