Over the last two decades the scientific evidence supporting the effectiveness of several cancer screening tests has grown dramatically. As a result, physicians are increasingly expected to be proficient in promoting and performing selected tests and in counseling their patients about screening. Medical schools have the responsibility to provide students with a foundation in cancer prevention, including a basic knowledge of cancer pathophysiology and epidemiology. Students should also develop skills for interviewing patients about their cancer risks and screening histories, for counseling about relevant lifestyle changes and screening tests, and for performing aspects of the physical examination related to screening.1
In addition to developing and implementing a comprehensive cancer-control curriculum, medical educators face the continuing challenge of validly measuring students' knowledge about and attitudes toward preventing and detecting cancer and the relevant counseling and physical diagnostic skills.2,3,4,5 Various evaluative tools have been used to assess students' cancer-control knowledge and clinical skills. Traditional written exams are the most feasible means of assessing knowledge acquisition and self-reported beliefs, attitudes, and behaviors,6,7 but some investigators, such as Harrell et al.,8 have studied self-reports of process-of-care tasks in primary care settings. In the Harrell study, students used rating cards on specific patient encounters during their clerkship experiences. In another example, Peters and colleagues3 evaluated students' performances in an academic course by videotaping and rating students during interviews with a standardized patient (SP).
Directly observing and rating the performances of selected clinical skills requires substantial resources, but it is believed to provide a more valid measurement of skills than self-report. The objective structured clinical examination (OSCE) is an alternative to recording and/or rating performances in the actual clinic, where the numbers and types of cases may vary considerably. The OSCE has become popular in recent years and is considered a valid means to assess clinical skills that are fundamental to the practice of medicine.9 OSCE-type examinations are particularly appropriate for assessing physical examination skills because the standardized-yet-realistic test makes it possible to determine whether the student can actually perform the skill of interest. SPs are increasingly used both to teach and to evaluate technique for the clinical breast examination (CBE). While SPs are central to an OSCE, supplemental tasks are often used as well. For example, in the CBE, silicone models of breasts may be used to assess whether students are able to locate and identify masses.
In general, physicians' greater knowledge of prevention is related to a more positive attitude toward clinical practice, and both knowledge and positive attitude are associated with the actual use of skills in clinical practice. Among medical students, however, the links across measures of knowledge, self-reported confidence, and physical examination skills are inconsistent.4,5 For example, Krackov et al.10 found that knowledge about cancer biology and attitudes about cancer care and cancer patients improved after students completed an elective oncology course. The knowledge and attitude scores, however, were not related to each other. In contrast, Peters and colleagues3 found that knowledge and attitude measures of disease-prevention principles among students who completed a disease-prevention course were positively correlated. Lee et al.11 found inconsistent links between knowledge, confidence, and performance of the breast examination after standard training in each year of medical school. Knowledge was not related to proficiency but was related to stage of training. This variability in findings suggests the need for further investigation. If cancer-control knowledge, attitudes, confidence, and objective clinical performances are not highly related, understanding the reasons for the divergence of these dimensions could be useful in designing more effective curricula.
The purpose of this study was threefold: to assess students' competency in clinical breast cancer screening practices, to compare different approaches for measuring these competencies, and to assess the effect of an additional CBE training session with an SP on students' CBE performances.
The Cancer Prevention and Control Education Program (CPACE) at the University of Massachusetts Medical School, supported by the National Cancer Institute, is part of a nationwide effort to improve skills and practices of health professionals in cancer prevention and control. A major component of CPACE enhances the medical school curriculum to improve students' knowledge, attitudes, and skills regarding cancer prevention. CBE and mammography counseling skills were a major focus of the project.
During the spring of 1998, all 98 third-year medical students (48 men and 50 women) were asked to complete a short optical-scan questionnaire assessing their confidence and knowledge, as described below. They were also required to complete a six-station OSCE in which students interacted with an SP in a variety of primary care scenarios. One student, a woman, missed the OSCE exam, and another student, a man, did not complete the questionnaire, resulting in complete data for 96 students (47 men and 49 women).
At one OSCE station students were instructed to conduct a cancer-prevention history, including a history of breast cancer risk factors and screening practice, counsel the SP about getting mammography, perform a CBE on the SP, and perform CBEs on two silicone breast models. The students were allowed 20 minutes for history taking, counseling, and the CBE. An additional 15 minutes was allotted for a physical examination of the models and for recording the findings and other paperwork. Following the exam, the SP used a checklist to rate each student on several dimensions of his or her prevention history, counseling and CBE performance, and general interview skills. The SP assigned a yes/no rating for each item on the checklist, indicating whether the student had exhibited the skill or performed the task.
Data on the following measures were obtained from the responses to the self-administered questionnaires and the OSCE rating forms.
Knowledge of breast cancer epidemiology. On the questionnaires, students indicated which cancers (from six choices) ranked first, second, or third in incidence and mortality among women over age 50. Additionally, they chose the best estimate of the percentage of breast cancers thought to be attributable to an inherited genetic mutation, and the relative risk of developing breast cancer for a woman with one first-degree relative (such as a mother or sister) with the diagnosis of breast cancer compared with a woman with no family history of breast cancer. Respondents were then asked to indicate (yes or no) whether there was strong evidence from randomized clinical trials that a screening test is effective in reducing mortality from breast cancer by at least 20–30%, for which cancers (including breast) cigarette smoking is an important risk factor, and for which cancers (including breast) a diagnosis in a first-degree relative increases the risk of developing cancer. The composite measure was the sum of points received for correct answers, with seven being the highest number of points possible.
Confidence. Respondents to the questionnaire also rated their confidence levels to counsel a 60-year-old woman reluctant to have mammography, to perform a female breast examination, and to interpret findings of this examination. Responses were on a four-point Likert-type scale for the three measures (1 = not at all confident, 4 = very confident).
Performance. Three cancer prevention performance domains and general interview skills were measured through the OSCE:
- Risk factor and screening history skills: 17 items (rated yes = 1 and no = 0); total score range 0–17
- Mammography counseling skills: 16 items (rated yes = 1 and no = 0); total score range 0–16
- Clinical breast examination skills: 30 items (rated yes = 1 and no = 0) on details of visual inspection, axillary examination, and breast examination; total score range 0–30
- General interview skills: six items on organization, type of question, pacing, facilitative behavior, encouragement of questions, and closure of the interview, each rated on a scale of 1–5; the total score was the mean of the scores for the six items (range 1–5)
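As a concrete illustration, the composite scores above can be computed by summing the binary checklist ratings within each domain and averaging the Likert-type interview ratings. The sketch below uses hypothetical ratings for one student (the function names and sample data are illustrative, not the study's actual scoring code):

```python
# Illustrative scoring of the OSCE checklist domains described above.
# Item counts match the text (17, 16, and 30 yes/no items; six 1-5
# ratings); the sample ratings themselves are fabricated.

def domain_score(yes_no_items):
    """Sum of yes (1) / no (0) checklist ratings for one domain."""
    return sum(yes_no_items)

def interview_score(likert_items):
    """General interview skills: mean of six 1-5 ratings."""
    return sum(likert_items) / len(likert_items)

# One hypothetical student's ratings:
history_items = [1] * 14 + [0] * 3      # 17 risk-history items
counseling_items = [1] * 12 + [0] * 4   # 16 counseling items
cbe_items = [1] * 25 + [0] * 5          # 30 CBE items
interview_items = [5, 4, 4, 5, 4, 5]    # six 1-5 interview ratings

print(domain_score(history_items))      # on the 0-17 scale
print(domain_score(counseling_items))   # on the 0-16 scale
print(domain_score(cbe_items))          # on the 0-30 scale
print(interview_score(interview_items)) # on the 1-5 scale
```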
All third-year students had received training for the CBE during the physical diagnosis course prior to administration of all assessment measures. This training was small-group instruction and assessment by the SP. Students were able to practice the examination on the SP and on two silicone breast models. Half of the students had a second training session during the obstetrics-gynecology clerkship rotation.
We computed descriptive statistics and frequency distributions both for the entire sample and for the relevant subgroups. Differences in means were assessed using the two-tailed t-test for all measures, and we evaluated the relationships between the knowledge score, the confidence ratings, and performances on the OSCE using correlations where appropriate.
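The two statistics named above, the Pearson correlation and the pooled two-sample t statistic, can be sketched in plain Python as follows. The function names and sample data are hypothetical; the study presumably used a standard statistics package.

```python
# Stdlib-only sketches of the Pearson correlation and the pooled
# two-sample t statistic. All data below are fabricated examples.

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance (two-tailed tests use |t|)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Fabricated example: confidence ratings vs. counseling scores,
# and CBE scores for two training groups.
confidence = [2, 3, 1, 4, 2, 3]
counseling = [8, 12, 6, 15, 9, 11]
standard = [22, 23, 21, 24, 23]
supplemental = [25, 26, 24, 25, 26]

print(round(pearson_r(confidence, counseling), 2))
print(round(pooled_t(supplemental, standard), 2))
```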
Finally, to assess the relationships of knowledge and confidence to performance, we conducted hierarchical regression analyses. Predictors were gender, the relevant knowledge and confidence measures, training status, and the measure of general interview skills. The dependent variables were the OSCE skill measures (history, counseling, and breast examination). In each analysis, gender was entered into the model first, then the contribution of the additional predictors was assessed.
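The hierarchical step can be sketched as fitting a baseline model with gender alone and then refitting with the additional predictors, taking the gain in explained variance (incremental R-squared) as their contribution. The following is a minimal, stdlib-only illustration with fabricated data, not the study's actual analysis code:

```python
# Minimal hierarchical-regression sketch: compare R^2 of a gender-only
# model with R^2 after adding a second predictor (e.g., confidence).
# All data below are fabricated for illustration.

def fit_r2(X, y):
    """Ordinary least squares via the normal equations; returns R^2."""
    n, p = len(X), len(X[0])
    # Normal equations A b = c, where A = X'X and c = X'y.
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
         for j in range(p)]
    c = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    # Gaussian elimination (no pivoting; fine for these small examples).
    for j in range(p):
        for k in range(j + 1, p):
            f = A[k][j] / A[j][j]
            for m in range(j, p):
                A[k][m] -= f * A[j][m]
            c[k] -= f * c[j]
    b = [0.0] * p
    for j in range(p - 1, -1, -1):
        b[j] = (c[j] - sum(A[j][m] * b[m] for m in range(j + 1, p))) / A[j][j]
    yhat = [sum(X[i][j] * b[j] for j in range(p)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Fabricated data: gender (0/1), confidence (1-4), counseling score (0-16).
gender =     [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
confidence = [2, 3, 1, 4, 2, 3, 3, 4, 1, 2]
score =      [8, 12, 6, 15, 9, 11, 12, 14, 5, 9]

# Step 1: gender only (with an intercept column of 1s).
r2_step1 = fit_r2([[1, g] for g in gender], score)

# Step 2: add confidence; the gain in R^2 is its incremental contribution.
r2_step2 = fit_r2([[1, g, c] for g, c in zip(gender, confidence)], score)

print(round(r2_step2 - r2_step1, 3))  # incremental R^2 from confidence
```

Because the step-2 model nests the step-1 model, its R-squared can never be lower; a formal analysis would also test whether the increment is statistically significant.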
Table 1 reports the mean of each measure by gender and receipt of supplemental CBE training. Two measures differed significantly by gender: confidence in the ability to perform a CBE was significantly higher for women than for men (3.4 vs. 2.9; p = .001), and women also received marginally higher scores in general interview skills (4.5 vs. 4.3; p = .02). Those students who had received supplemental CBE training scored higher on CBE performance than did those with only standard training (25.0 vs. 23.0; p = .04). No other statistically significant difference was found for the two training groups.
Table 2 summarizes the correlations of knowledge, confidence, and performance. Breast cancer knowledge scores were modestly correlated with CBE performance scores (r = .26, p = .01). Confidence in mammography counseling skills was significantly correlated with the mammography counseling performance score (r = .43, p < .001). The general interview skills score was significantly correlated with four of the six other measures: it was modestly associated with confidence in mammography counseling (r = .32, p < .01) and with CBE confidence ratings (r = .29, p < .01), as well as with the performance scores for mammography counseling (r = .36, p < .001) and the risk history (r = .40, p < .001).
Hierarchical regression analyses, reported in Table 3, revealed that both confidence in mammography counseling and the general interview skills score were significant predictors of a high mammography counseling score after controlling for gender. Knowledge and training did not improve prediction over and above these predictors. With respect to the CBE performance score, both breast cancer knowledge and training status were significant predictors of performance over and above gender. Only the general interview skills score was a significant predictor of performance of risk history after taking gender into account.
The descriptive findings from this study must be placed in the context of reasonable expectations for third-year medical students who are one year out from their basic science education. Related work reports descriptive findings for first-, second-, and third-year students.12 In our study, students' knowledge of breast cancer epidemiology was less than optimal, especially given that the knowledge questions were about issues relevant to clinical practice (e.g., the relative incidences of common cancers and the level of risk associated with family history). The mean levels of confidence for mammography counseling and CBE are reasonable (between somewhat and very confident), but they demonstrate that a small proportion of students still have little confidence in their skills at the end of their third year. The relatively low level of confidence in the interpretation of findings on the CBE is appropriate for students completing their first clinical year of training, since at this stage exposure to common clinical findings on the breast examination is limited. The mean scores on all the performance measures indicate quite acceptable performances by most students, although a minority scored lower than expected for third-year students.
The main objective of this study was to assess the relationship between a measure of cancer-control knowledge and three measures of clinical performance, and between three measures of confidence and the performance measures. We found that higher levels of confidence in mammography counseling skills and higher general interview skills scores were associated with higher mammography counseling scores. CBE confidence levels, however, were not related to the actual performance of the CBE. This result supports findings from Lee's study,11 which also showed no link between greater confidence and better performance of the CBE. Perhaps students are better able to assess their own performances of verbal cognitive (counseling) skills than of physical examination skills.
Our study confirms the importance of practicing psychomotor skills such as the CBE. The students who had received supplemental training in the CBE scored higher on the CBE performance assessment than did those without the extra session, even though the additional training did not appear to affect confidence in performing a CBE. These findings are similar to those of Sachdeva et al.,13 who found that a single intervention with an SP significantly enhanced students' breast examination skills. Other studies suggest the need to reinforce physical examination skills to maintain accuracy.11 Because the extra training in our study took place six months before the OSCE, we can be confident that the higher scores reflect longer-term learning. This may have important implications for medical education if a single extra training session is adequate for reinforcing skills. It is possible, however, that this effect is specific to the CBE, and more investigation is needed to see whether it holds true for other examination skills. It is apparent that the simple introduction of breast models is time- and cost-efficient: the students use the same silicone models, the models take up little space, and a trained SP, rather than faculty, can educate the students.13
We also found that knowledge about breast cancer was modestly related to the CBE score, unlike Lee,11 who found that knowledge of breast cancer was not related to proficiency on the CBE. This discrepancy may be due to the different knowledge scales used in the two studies. The content of the knowledge we measured (epidemiologic facts) could not contribute directly to performance skills, so perhaps that knowledge is associated with other characteristics more relevant to developing CBE skills, such as motivation to learn, capacity to retain information, or other behavioral factors. While there were some modest gender differences, such as in confidence to counsel, the only significant difference in the genders' performances related to general interview skills, which women performed better than men.
One limitation of this study is the lack of a standardized measure of knowledge in our questionnaire or in the OSCE. In addition, some SPs participated in both the training and the end-of-year OSCE; although bias from this overlap is unlikely, we cannot rule out rater bias entirely.
Educators should be encouraged to adopt a multi-method approach to the evaluation of students' knowledge, skills, and attitudes about breast cancer, incorporating data from both self-report and observation. They cannot assume, however, that positive results in one domain (e.g., confidence) are associated with positive results in another (e.g., examination skill). It is possible that the students are poor self-assessors (confidence) and/or that the evaluation tools are inadequate. We need to learn more about which learning experiences contribute (and contribute differentially) to counseling skills and to physical examination skills.14 Additionally, while knowledge may be an important step toward acquiring skill, knowledge may not be a good indicator of students' performances of the risk assessment, counseling, or physical examination skills.
1. Campbell H, McBean M, Mandin H, Bryant H. Teaching medical students how to perform a clinical breast examination. Acad Med. 1994;69:993–5.
2. Brownson R, Davis J, Simms S, Kern T, Harmon R. Cancer control knowledge and priorities among primary care physicians. J Cancer Educ. 1993;8:35–41.
3. Peters A, Schimpfhauser F, Cheng J, Daly S, Kostyniak P. Effect of a course in cancer prevention on students' attitudes and clinical behavior. J Med Educ. 1987;62:592–600.
4. Scott C, Greig L, Neighbor W. Curricular influences on preventive-care attitudes. Prev Med. 1986;15:422–31.
5. Scott C, Neighbor W. Preventive care attitudes of medical students. Soc Sci Med. 1985;21:299–305.
6. Chamberlain R, Lane M, Weinberg A, Carbonari J. Application of cancer prevention knowledge: a longitudinal follow-up study of medical students. J Cancer Educ. 1987;2:93–106.
7. Delnevo C, Abatemarco D, Gotsch A. Health behaviors and health promotion/disease prevention perceptions of medical students. Am J Prev Med. 1996;12:38–43.
8. Harrell P, Kearl G, Reed E, Grigsby D, Caudill T. Medical students' confidence and the characteristics of their clinical experiences in a primary care clerkship. Acad Med. 1993;68:577–9.
9. Tervo R, Dimitrievich E, Trujillo A, Whittle K, Redinius P, Wellman L. The objective structured clinical examination (OSCE) in the clinical clerkship: an overview. South Dakota J Med. 1997;50(5):153–6.
10. Krackov S, Preston W, Rubin P. Effects of an oncology elective on first-year medical students' knowledge and attitudes about cancer. J Cancer Educ. 1990;5:43–9.
11. Lee K, Dunlop D, Dolan N. Do clinical breast examination skills improve during medical school? Acad Med. 1998;73:1013–9.
12. Zapka JG, Luckmann R, Sulsky SI, et al. Cancer control knowledge, attitudes, and perceived skills among medical students. J Cancer Educ. 2000;15:73–8.
13. Sachdeva A, Wolfson P, Blair P, Gillum D, Gracely E, Friedman M. Impact of a standardized patient intervention to teach breast and abdominal examination skills to third-year medical students at two institutions. Am J Surg. 1997;173:320–5.
14. Little M, Rodnick J. Evaluating student experiences in a family medicine clerkship. Fam Med. 1988;20:347–51.