Purpose: To evaluate the reliability, efficiency, and cost of administering open-ended test questions by computer.
Methods: A total of 1,194 students, in groups of approximately 30, were tested at the end of a required surgical clerkship from 1993 through 1998. During the academic years 1993–94 and 1994–95, computer administration of open-ended test questions was compared experimentally with paper-and-pencil administration. The paper-and-pencil mode was discontinued in 1995, and computer administration was evaluated for all students through 1998. Computerized item analysis of responses was added to the students' post-examination review session in 1996.
Results: There was no significant difference in the performances of the 440 students tested in 1993–94 and 1994–95 between the two modes of test administration, and alpha reliability estimates were comparable. Most students preferred the computer administration, which the faculty judged to be efficient and cost-effective. The immediate availability of item-analysis data strengthened the post-examination review sessions.
Conclusion: Routine administration of open-ended test questions by computer is practical and enables faculty to give students feedback immediately after the examination.