Knowledge Growth in Medical Education

Dugdale, Alan MD; Alexander, Heather PhD

Letters to the Editor

Dr. Dugdale was formerly associate professor, Department of Paediatrics and Child Health; and Dr. Alexander is senior lecturer in Medical Education, Graduate School of Medicine, both at the School of Medicine, University of Queensland, Herston, Australia.

Carol Hodgson has shown1 that nutrition teaching increased the percentage of correct responses students gave to multiple-choice questions (MCQs) of the multiple true/false type (MTF)—with “don't know” as an option—from about 45% to about 75%, with a corresponding decrease in the “don't know” responses they gave. The level of incorrect responses, however, remained high and almost unchanged.

If incorrect responses were due to random guessing, a student would get different items wrong on each retest. If a student thought mistakenly that his or her answer was correct, then that student should repeat the same mistake on each retest. This is easily testable. A student would mark “don't know” if he or she admitted ignorance or decided that his or her knowledge was too uncertain to risk a negative mark. In Hodgson's questionnaires, a wrong response caused minimal harm, so even uncertain personal knowledge could be used safely. A student would be more cautious in pass-or-fail MTF—MCQ examinations if the negative marking of error had serious repercussions. In clinical medicine, where using false information could cause serious harm or death, even stricter criteria would apply.
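The retest prediction above can be illustrated with a small simulation (a hypothetical sketch; the item counts, error rate, and two-sitting design are assumed for illustration and are not taken from the studies discussed):

```python
import random

random.seed(1)

N_ITEMS = 50   # assumed number of MTF items on the test (hypothetical)
N_WRONG = 15   # assumed number of items a student answers incorrectly

def wrong_items_guessing():
    # Random guessing: a fresh, random set of items is wrong on each sitting.
    return set(random.sample(range(N_ITEMS), N_WRONG))

def wrong_items_belief(misbeliefs):
    # Mistaken belief: the same items are wrong on every sitting.
    return set(misbeliefs)

# Fix one student's (hypothetical) stable set of mistaken beliefs.
misbeliefs = random.sample(range(N_ITEMS), N_WRONG)

# Overlap of wrong items between two sittings under each model.
guess_overlap = len(wrong_items_guessing() & wrong_items_guessing())
belief_overlap = len(wrong_items_belief(misbeliefs) & wrong_items_belief(misbeliefs))

print(guess_overlap)   # typically small: expected overlap is 15*15/50 = 4.5 items
print(belief_overlap)  # always 15: the same errors recur on retest
```

Comparing the overlap of wrong items across sittings is the "easily testable" distinction the letter describes: guessing predicts little overlap, mistaken belief predicts near-total overlap.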

We have shown2 that professionals and lay people answer many MTF—MCQ questions on nutrition incorrectly. Later discussions with our study's participants confirmed that their incorrect responses reflected their actual beliefs. We also have data3 on pass-or-fail examinations taken by 658 students at the end of pediatrics postings [clerkships]. These examinations included MTF—MCQ questions with “don't know” options and negative marking for error. Participants commonly gave incorrect responses to questions covering topics taught during the postings, sometimes at rates significantly higher than 50%, which eliminated random guessing as a cause. Through a pre-/post-test on a subset of students, we found results similar to those Hodgson reports. Analysis of these results ruled out guessing as a major source of error.
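The claim that error rates well above 50% rule out guessing follows from binomial arithmetic: a randomly guessed true/false item is wrong with probability 0.5, so a cohort-level wrong rate far above 50% is vanishingly unlikely under guessing. A sketch of the calculation, with an assumed item count chosen purely for illustration:

```python
from math import comb

def p_at_least(n, k, p=0.5):
    """Probability of k or more wrong answers out of n items under random guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Assumed example: 40 true/false items, 28 answered incorrectly (70% wrong).
n, k = 40, 28
print(f"P(>= {k} wrong by guessing) = {p_at_least(n, k):.4f}")
```

Under guessing the expected wrong rate is exactly 50%, so observing 70% wrong (or more) on a 40-item test has probability under 1%, consistent with the letter's argument that such responses reflect held beliefs rather than chance.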

If analysis of her results discounts the effects of guessing, then Hodgson's pretest showed three things: the students had some prior knowledge that agreed with current science; they gave some incorrect responses, suggesting beliefs that disagreed with current scientific knowledge; and on some topics they admitted ignorance by responding "don't know." Hodgson's post-test showed that the material taught replaced some ignorance and probably reinforced pre-existing "correct" knowledge, but it also revealed that the course hardly altered prior beliefs contrary to current scientific knowledge. We have drawn the same conclusions from our results.

Many medical schools and departments must have similar sets of data. If they show the same patterns as those found by Hodgson and ourselves, then we suggest that the main effect of professional education is to replace ignorance with correct (by current scientific standards) facts and reinforce valid pre-existing knowledge. However, prior incorrect beliefs seem much more resistant to change.


1. Hodgson CS. Tracking knowledge growth across an integrated nutrition curriculum. Acad Med. 2000;75(10 suppl):S12–S14.
2. Dugdale AE, Chandler D, Baghurst K. Knowledge and belief in nutrition. Am J Clin Nutr. 1979;32:441–445.
3. Alexander H. The assessment of knowledge in medical education [dissertation]. Herston, Australia: University of Queensland, 1998.
© 2001 Association of American Medical Colleges