
Knowledge Growth in Medical Education

Hodgson, Carol S. PhD

Letters to the Editor

Dr. Hodgson is associate professor of medicine and director, Office of Educational Research and Development, University of California, San Francisco, School of Medicine.

In reply: I thank Drs. Dugdale and Alexander for their thought-provoking response to my article.1 Evidence to confirm their conclusion that professional education decreases ignorance and reinforces knowledge, while having little effect on incorrect knowledge, is also contained in my study. I reanalyzed the pre-test and first post-test data for the 30 nutrition items that covered material taught in the first-year curriculum at the UCLA School of Medicine, the period with the greatest concentration of nutrition education and the greatest change in total scores. I calculated the counts of correct, incorrect, and “don't know” responses at pre- and post-test for the 30 items, as well as the pattern of change for students who answered items correctly, incorrectly, or with “don't know” at pre-test and either correctly or incorrectly at post-test. Last, I used the method described by Dr. Dugdale and his colleagues2 to calculate perception of knowledge [(number correct + number incorrect)/total number of items] and accuracy of knowledge [number correct/(number correct + number incorrect)] when using a “don't know” category. I examined the changes from pre-test to post-test using a repeated-measures analysis of variance (ANOVA) model.
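The two indices above can be made concrete with a brief sketch. The function names and the example counts are illustrative, not from the original study; only the two formulas follow the definitions given in the text.

```python
# Perception and accuracy of knowledge, per Dugdale et al.'s definitions
# as described above. All names and numbers here are illustrative.

def perception_of_knowledge(n_correct, n_incorrect, n_dont_know):
    """Fraction of items the student attempted (i.e., did not mark "don't know")."""
    total = n_correct + n_incorrect + n_dont_know
    return (n_correct + n_incorrect) / total

def accuracy_of_knowledge(n_correct, n_incorrect):
    """Of the items attempted, the fraction answered correctly."""
    return n_correct / (n_correct + n_incorrect)

# Example: a student answers 18 of 30 items correctly, 6 incorrectly,
# and marks 6 as "don't know".
p = perception_of_knowledge(18, 6, 6)   # (18 + 6) / 30 = 0.8
a = accuracy_of_knowledge(18, 6)        # 18 / 24 = 0.75
print(p, a)
```

Note that perception can exceed accuracy whenever a student attempts many items but answers a substantial share of them incorrectly, which is the pattern reported at post-test below.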

The results indicate that the percentages of correct responses (pre-test M = 40.8, SD = 15.0; post-test M = 61.0, SD = 11.7) and incorrect responses (pre-test M = 21.1, SD = 8.7; post-test M = 26.0, SD = 10.0) significantly increased, while the percentage of “don't know” responses significantly decreased (pre-test M = 38.1, SD = 18.5; post-test M = 13.0, SD = 11.0). On average, items answered as “don't know” at pre-test were answered correctly at post-test 66% of the time. However, items answered incorrectly at pre-test changed to correct answers at post-test only 39% of the time. Items correct at pre-test remained correct at post-test 88% of the time. The students' perception of their knowledge significantly increased, as did their accuracy. Interestingly, at pre-test the students perceived their knowledge (M = 61.9, SD = 18.5) to be less than their actual knowledge (M = 65.5, SD = 11.4). Although the accuracy of their knowledge significantly improved at post-test (M = 70.0, SD = 12.2), the improvement was much less than what they perceived their improvement to be (M = 87.0, SD = 11.0). These results are consistent with Dr. Dugdale's earlier work; however, the pre- to post-test comparison demonstrates the effect even more dramatically here, since perception matched reality at pre-test but not at post-test. This speaks to the students' inability to self-assess and their unwillingness to discard incorrect knowledge beliefs.
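The item-level transition percentages reported above (e.g., 66% of “don't know” items becoming correct) come from cross-tabulating each student's paired pre- and post-test responses. A minimal sketch of that tabulation, using made-up response data rather than the study's actual responses:

```python
# Pre -> post transition proportions for paired item responses.
# "C" = correct, "I" = incorrect, "DK" = don't know.
# The response lists below are illustrative, not the study's data.
from collections import Counter

pre  = ["DK", "DK", "DK", "I", "I", "C", "C", "C", "C", "DK"]
post = ["C",  "C",  "I",  "C", "I", "C", "C", "I", "C", "C"]

pairs = Counter(zip(pre, post))  # counts of each (pre, post) combination

def prop_correct_at_post(category):
    """Of items in a given pre-test category, the fraction correct at post-test."""
    in_category = sum(n for (p, q), n in pairs.items() if p == category)
    became_correct = pairs.get((category, "C"), 0)
    return became_correct / in_category

print(prop_correct_at_post("DK"))  # 3 of the 4 "don't know" items became correct
```

In the study itself, these proportions would be computed over all 30 items for each student and then averaged, but the per-category logic is the same.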



1. Hodgson CS. Tracking knowledge growth across an integrated nutrition curriculum. Acad Med. 2000;75(10 suppl):S12–S14.
2. Dugdale AE, Chandler D, Baghurst K. Knowledge and belief in nutrition. Am J Clin Nutr. 1979;32:441–445.
© 2001 Association of American Medical Colleges