VANA, KIMBERLY D. MS; SILVA, GRACIELA E. PhD, MPH; MUZYKA, DIANN PhD; HIRANI, LORRAINE M. MSN
Educators have promoted inquiry, teaching through questioning, to engage students dynamically in the attainment and retention of knowledge and skills. Traditionally, students have assumed passive roles, observing the instructors' performances, rather than actively participating with instructors and peers.1-4 Often, lectures present new knowledge, but lack dedicated time for teaching higher-order thinking skills such as analysis, synthesis, and problem-solving.1-4 Inquiry may be used to teach these higher-order thinking skills and increase student participation.4 However, encouraging student participation in large-lecture courses can be daunting due to time constraints and the reticence of pupils to ask or answer questions in front of large groups.
Audience response system (ARS) technology can address this need for increased interactivity and critical thinking in a large-classroom environment.1,2,5 Students' use of an ARS may promote student comprehension and retention by offering each learner several chances to participate in class, anonymously answering questions posed by the instructor. Researchers have suggested that students are satisfied with the use of ARSs, but the research is mixed on whether the use of these systems actually increases test scores. This study sought to investigate whether students' use of an ARS to answer multiple-choice PowerPoint (MCPP) slides (Microsoft, Redmond, WA) is more effective in increasing test scores than using MCPP slides alone.
AUDIENCE RESPONSE SYSTEMS
An ARS is a communication technology that allows participants to interact with the lecturer; many of these systems seamlessly merge with PowerPoint presentations.2 The lecturer may post multiple-choice or true-false questions on a PowerPoint slide, and then each student answers by pressing an individual keypad button corresponding to his/her answer. These keypads, or clickers, use infrared, radio frequency, Bluetooth, or Wireless Fidelity (Wi-Fi) technology to communicate with a receiver, which is plugged into a universal serial bus (USB) port on the podium computer.1,6,7 After polling the audience, a bar graph of the classroom responses is displayed on a projected slide. The graph allows the presenter to gauge the level of participant understanding immediately and discuss the correct and incorrect answers.3 The lecturer may choose to review previous content, tailoring the lecture to comprehension level.1,2,8-15 The responses of each individual may be saved and then later printed in a variety of report formats.16,17 Thus, the responses to the ARS questions may be compared with future test scores if desired.
As new communication technologies become available, ARSs continue to advance. Currently, radio frequency is preferred to infrared technology because it accommodates larger audiences with less interference and does not require that the keypads be aimed at the USB receiver while inputting answers.1,16,17 Enterprising researchers are experimenting with using personal digital assistants (PDAs) to communicate with the audience response receivers.1,7 In addition, student terminals or full laptop keyboards may allow participants to enter their answers to open-ended questions. The instructor may randomly select a student's typed response to project on the slide.18 Some students, however, may find laptops too cumbersome to tote to class and too slow to activate.19 Thus, laptop use in large lecture-class environments may be seen as burdensome by students.
The use of ARSs may increase active learning by students in large lecture classes5,13,20 and in small groups.1,3,6,21,22 When inserting MCPP slides within a lecture, other content slides must be omitted to stay within the lecture's allotted period. In addition, the lecturer must allow more time for each MCPP slide to present the question, discuss the correct and incorrect answers, and possibly reteach the content area. Thus, the use of an ARS may decrease the amount of lecture content that may be addressed.1,6,9 Increased student comprehension may, however, outweigh this time limitation. In addition, increased lecture preparation time is required to develop questions that encourage application, synthesis, and problem solving.3
Students themselves have said that using the ARS helps them to judge their understanding of the content2,8,11,15 and provides them the safety of answering anonymously.1,2,6,9,11,23-25 Many students believe that the ARS questions help them perform better on quizzes and examinations.2,13 Stein et al13 stated that 89% of nursing students perceived a beneficial effect of discussing distracters, citing increased comprehension of the lecture content. In addition, the ARS questions may assist students in becoming familiar with the lecturer's test-question formats.13 Students may have "emotional" investments in their responses, which increase their attentiveness6 and participation.5,10,14 Lastly, the use of an ARS positively influences class attendance.6,22,26 Overall, this technology has been widely embraced by students.2,6,8,11,17,20,27,28 This may be due to the partitioning of lecture content into small, manageable units, which allows students intermittent breaks from the lecture while increasing student engagement and attention.3,6,11-13,27,29 DeBourgh2 suggested that ARS questions be used every 20 minutes to prevent "cognitive overload."2(p81) Miller et al12 demonstrated that students assign higher lecturer evaluation scores to instructors who use an ARS.
Some researchers, however, have suggested that satisfaction levels are greater in freshman-sophomore courses than in junior-senior courses.20 They suggest that students may appreciate an ARS more when studying an unfamiliar topic. Some students have registered dissatisfaction with the use of an ARS for taking attendance,6 and others resent the increased expectation to participate in class.2 Of course, students have been known to deceive,6 answering with multiple keypads to falsify quiz responses and attendance.
Although students are generally satisfied with audience response technology, the literature is mixed on whether the students' use of ARSs increases student performance on test scores. Some researchers have demonstrated clear increases in test scores in ARS-enhanced lecture courses compared with typical lecture courses.19,22,28-30 Crossgrove and Curran20 found that students performed significantly better on test questions that had been discussed previously in ARS-enhanced lectures. However, retention of course material may be greater in introductory than in junior-senior level courses. Preszler et al26 discovered that test scores improved as the lecturer's use of ARS questions increased. Other research showed no difference in test scores between courses utilizing didactic lectures versus lectures that incorporated ARS.11-13
The conflicting results on ARSs' ability to positively affect test scores could be attributed to how closely ARS questions parallel test content, the frequency of MCPP slides within the lecture, differences between freshman-sophomore and junior-senior course content and student expectations, the instructors' experience with inquiry, or other confounding variables, such as student age. In addition, the lecturer may have already incorporated inquiry or small group work into his/her course; thus, one more method of encouraging student engagement, ARS, may not result in further statistically significant score increases.20 Although ARSs may increase scores in traditional, didactic courses with one-way communication, this technology may not add any advantage in courses that already engage the learner in two-way communication. Ultimately, the interaction between pedagogy and ARS use defines learning outcomes,9 and the proposed benefits of ARSs must justify institutional and student keypad costs.3
Use of Audience Response Systems in Nursing Education
Nursing researchers are beginning to investigate the use of ARSs to augment the acquisition and retention of nursing knowledge by nursing students. Three nursing studies evaluated the impact of ARSs on course test scores8,11,13; only the study by Abdallah8 showed increases in test scores. In 1995, Halloran11 investigated the use of computer-managed instruction (CMI) and keypads in a medical/surgical course taught to baccalaureate junior nursing students. The experimental group (14 students) used CMI and keypads in the lectures, whereas the control group (14 students) was taught the same content by the traditional lecture format, including discussion, overheads, and oral questioning. Both groups took the same three multiple-choice tests. No significant differences in test scores were found between the nursing students who used CMI and keypads and those who participated in traditional lectures.
Stein et al13 investigated the use of an ARS for pretest reviews in two successive freshman nursing anatomy and physiology courses (155 and 128 students, respectively). The students received three ARS test reviews and one traditional, lecture-format review prior to the examinations in each course. The ARS test review used a Jeopardy! game format (a television game show in which contestants are presented with answers and must respond with the correct question) with 25 questions. The question formats included multiple-choice questions, true-false questions, and labeling of diagrams. Ninety-two percent of students voiced satisfaction with the ARS technology. Most of the students (94%) perceived a beneficial effect on their test scores, although the test averages were not significantly higher on tests that followed an ARS review. When the class answered an ARS question poorly in lecture, significantly more students answered a similar question correctly when it appeared on the examination. Eighty-nine percent of the students felt that discussing the distracters, or wrong answers, to the ARS questions assisted their comprehension of the lecture content.
In fall of 2006, Abdallah8 investigated the effects of using a personal response system (PRS) with 71 junior nursing students in a nursing foundation course. Students were awarded extra participation points if they used their clickers in all classes. Initial and midterm assessments of the PRS were completed. Students' exposure to PRS questions resulted in 70% or more students correctly answering the same or similar content questions on the course quizzes; 80% to 92.5% answered these questions correctly on the midterm. The higher percentage of students answering correctly on the midterm may be due to the students' realizations that the quiz questions came from the PRS slides. At midterm, 67% of the students felt that the PRS assisted them in understanding the lecture content, as compared with 56% on the first day of class. Thus, clearer understanding of what constituted core content may have led to higher test scores, rather than PRS use. Studies are needed to separate the effects of active participation from that of ARS use.
Lastly, DeBourgh2 conducted an online survey of 92 advanced nursing therapeutics course students on their satisfaction levels with ARS use. Sixty-five students completed the anonymous questionnaire 1 week before the end of the semester. Most of the responding students (75.8%) recommended continued use of clickers, and 66% believed that the ARS increased their quiz and examination scores. Fifty-three percent came to class "better prepared."2(p83) DeBourgh2 did not evaluate the impact of ARS use on test scores.
The primary objective was to evaluate whether a lecture format using MCPP slides and an ARS was more effective than a lecture format solely using MCPP slides in the comprehension and retention of pharmacological knowledge among baccalaureate nursing students. In addition, this study sought to evaluate the students' satisfaction with the course, their perceived comprehension and retention of knowledge, and their satisfaction with the use of ARS technology.
This quasi-experimental study used convenience samples. The control group consisted of 55 baccalaureate nursing students enrolled in a nursing pharmacology course at the downtown campus of a large state university in fall 2007. This group attended lecture classes that used MCPP slides interspersed throughout the lectures. The intervention group comprised 78 baccalaureate nursing students enrolled in the same nursing pharmacology course at the west campus of the same university in fall 2007. This group attended classes with MCPP slides interspersed throughout the lectures and used an ARS to input their answers to the multiple-choice questions listed on the slides; radio-frequency keypads were used. The students were assigned to the only pharmacology course taught on their campus; thus, the students were unable to self-select the course based on pedagogy. Both groups had the same instructor, lecture slides, and content material for all semester class periods.
Students in both courses were invited to participate in this study during the first class periods. They were assured that their decision to enroll or decline would not affect their course grades and that all data collected would be coded to ensure anonymity and confidentiality. They were also advised that they could drop out of the study at any time. Students' answers to the MCPP slides and the test questions were coded so that their responses were not linked to their names. In the control group, 56 of 60 students chose to participate, and one later withdrew from the course. Another pupil chose not to complete the satisfaction questionnaire at the end of the semester. In the intervention group, 79 of 81 students agreed to participate; however, one additional student withdrew from the course. An exemption for this study was obtained from the university institutional review board.
All subjects in the control and intervention groups were asked to complete the demographic questionnaires during the first class period. Completion of the demographic questionnaire was considered consent to participate in the study. The demographic questionnaire included questions on age, sex, ethnicity, and prior educational degrees. In addition, the demographic questionnaire gathered information about previous ARS experience with courses or educational offerings, previous experience with healthcare or information technology, and familiarity with instant messaging and/or text messaging (categorized into 0, 1-25, 26-50, 51-75, 76-100, and >100 texts per week). The demographic characteristics between the control and intervention ARS group were compared using the Pearson's χ2 test. Student's t test was used to compare mean age between groups. P ≤ .05 was chosen for significance for all tests.
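The group comparisons described above can be sketched in a few lines with SciPy. The counts and ages below are hypothetical illustrations, not the study's data; the study compared categorical demographics with Pearson's χ2 test and mean age with Student's t test.

```python
from scipy import stats

# Hypothetical 2x2 counts (NOT the study's data): rows = groups,
# columns = categories of one demographic variable (e.g., prior degree: yes/no)
table = [[30, 25],   # control group
         [55, 23]]   # ARS group
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")

# Student's t test comparing mean age between two independent groups
control_ages = [21, 22, 20, 23, 21, 22, 19, 24]
ars_ages = [27, 31, 24, 35, 26, 29, 22, 30]
t, p_age = stats.ttest_ind(control_ages, ars_ages)
print(f"t = {t:.2f}, P = {p_age:.4f}")
```

With the significance level set at P ≤ .05, each P value from these tests would simply be compared against .05 to decide whether the groups differ.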
ARS and MCPP Use
Each student in the intervention group at the west campus purchased an ARS keypad or clicker from the university bookstore, whereas the control group at the downtown campus did not purchase clickers for the pharmacology course. Each student's keypad was assigned an identifying number, which was linked to the student's name by an ARS participant list. All responses entered into each keypad were saved to a computer file after each lecture. The coded answers for each multiple-choice question (ARS and test) were then entered into the study data for those students who chose to participate in the study. Each correct ARS answer was coded as 1; all other answers were coded as zeros. Data from students who scored zeros for all ARS questions in a lecture were presumed to be absent or nonparticipatory and were dropped from the analysis of ARS responses for that lecture. For a given lecture, 48 to 78 students utilized their clickers (mean, 66.2; mode, 72; median, 70). Because attendance was not mandatory, not all students attended each class period. In addition, students occasionally forgot their clickers.
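The response coding described above (1 for a correct answer, 0 otherwise, with all-zero records dropped as presumed absences) might be sketched as follows. The answer key and student responses are hypothetical, and the student identifiers stand in for the coded keypad numbers.

```python
# Hypothetical ARS data for one lecture (NOT the study's data):
# each student's keyed answers, compared against the answer key.
answer_key = ["B", "D", "A", "C"]
responses = {
    "s01": ["B", "D", "A", "A"],  # three correct
    "s02": ["A", "B", "C", "D"],  # none correct -> presumed absent
    "s03": ["B", "D", "A", "C"],  # all correct
}

# Code each answer: 1 if it matches the key, 0 otherwise
coded = {
    sid: [1 if ans == key else 0 for ans, key in zip(answers, answer_key)]
    for sid, answers in responses.items()
}

# Drop students whose codes are all zero (presumed absent or nonparticipatory)
participants = {sid: codes for sid, codes in coded.items() if any(codes)}
print(participants)
```

Under this rule, a student who attended but answered every question incorrectly would also be dropped, which is one reason the per-lecture participation counts varied.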
Because some of the students in the intervention group had not yet procured their keypads before the first test, only the ARS data for the lectures covered in tests 2 to 4 were used. The students in the control group were given MCPP slides interspersed among the lecture's content slides and were asked to call out their answers. The instructor read the MCPP questions to both classes and then allowed 10 seconds for the students to respond. For each MCPP slide, the teacher discussed the correct answer and explained why each distracter was wrong to the control and intervention groups.
Students at both campuses were administered the same four course tests throughout the semester. Each test covered six lecture topics and consisted of six lecture subtests of 10 questions each, for a total of 60 multiple-choice questions. Each test question was worded differently from the ARS practice questions in lecture to encourage understanding of the concept rather than memorization of the answers. Because the test score distributions were negatively skewed, the median differences in test scores for tests 2, 3, and 4 were compared between the two groups using the Wilcoxon rank sum (Mann-Whitney) test for independent nonparametric samples.
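Because skewed distributions make the t test inappropriate, the scores were compared with a rank-based test, which in SciPy could look like the sketch below. The score lists are illustrative, not the study's data.

```python
from scipy import stats

# Illustrative, negatively skewed test scores (out of 60) for two groups
# (NOT the study's data)
control_scores = [52, 55, 48, 57, 50, 58, 44, 53, 56, 51]
ars_scores = [54, 49, 57, 52, 58, 46, 55, 50, 53, 56]

# Wilcoxon rank sum (Mann-Whitney U) test for independent samples
u, p = stats.mannwhitneyu(control_scores, ars_scores, alternative="two-sided")
print(f"U = {u}, P = {p:.3f}")
```

The test compares the ranks of the pooled scores rather than their means, so it does not assume normally distributed data.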
On the last day of the course, each group was given a satisfaction questionnaire asking about the student's satisfaction with the nursing pharmacology course and his/her perceived comprehension and retention of pharmacological knowledge. The survey elicited opinions on the MCPP slides from both groups and on the use of an ARS from the intervention group. The students in the control group were also asked whether they wished they had had an opportunity to use the ARS and whether they would choose a course that utilized an ARS over one that did not. The control group answered 13 questions on the survey; the ARS group responded to 21 questions. The questionnaire had five answer choices: (1) strongly agree, (2) agree, (3) neutral, (4) disagree, and (5) strongly disagree. Responses marked as (1) strongly agree or (2) agree were coded as yes; all other responses were designated as no. The authors chose to separate the positive responses (strongly agree and agree) from the neutral or negative responses because neutral responses were not endorsements of ARS use. The Pearson's χ2 test was used to compare the differences in percentages between the control and ARS groups on their satisfaction with the MCPP slides.
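Dichotomizing the five-point Likert responses into yes/no and comparing the resulting proportions between groups, as described above, might look like this. The response tallies are hypothetical, not the study's survey data.

```python
from scipy import stats

# Likert codes: 1 strongly agree, 2 agree, 3 neutral, 4 disagree, 5 strongly disagree
def dichotomize(likert_codes):
    """Code 'strongly agree'/'agree' as yes (1) and everything else as no (0)."""
    return [1 if code in (1, 2) else 0 for code in likert_codes]

# Hypothetical responses to one satisfaction statement (NOT the study's data)
control = dichotomize([1, 2, 2, 1, 3, 2, 1, 4, 2, 1])
ars = dichotomize([2, 3, 4, 2, 1, 3, 5, 2, 3, 4])

# Build the 2x2 table of yes/no counts per group, then apply Pearson's chi-square
table = [[sum(control), len(control) - sum(control)],   # control: yes, no
         [sum(ars), len(ars) - sum(ars)]]               # ARS: yes, no
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")
```

Grouping neutral with negative responses is a conservative choice: only explicit agreement counts as an endorsement.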
Subjects in the ARS group were significantly older than the participants in the control group (mean, 27.8 [SD, 9.4] years and 21.4 [SD, 3.0] years, respectively, P < .0001) (Table 1). The subjects in the ARS group were more likely to have prior degrees than those in the MCPP group, reaching a significance level of .007. In addition, the ARS group had more prior ARS course experience than the subjects in the MCPP group (P = .036). No significant differences were found between groups for sex, ethnicity, number of text messages per week, and prior experiences with ARS educational offerings, healthcare employment, and informational technology employment.
Participant Test Scores
No significant differences between groups were found for the overall test scores for examinations 2, 3, and 4 (P = .475, .326, and .207, respectively) (Table 2). Likewise, no subtest differences were found between the groups.
Satisfaction Questionnaire Responses
Differences in percentages between the control and ARS groups on the satisfaction statements about MCPP slides are presented in Table 3. The answers differed significantly between groups for two statements. The control group was more likely to agree that correct MCPP answers corresponded to correct examination answers (statement 5): 88.9% for the control and 66.7% for the ARS groups (P = .003). However, since the control group's answers to the MCPP slides were not collected, the accuracy of their perception cannot be verified. The ARS group was more likely to disagree that incorrect MCPP answers corresponded to incorrect examination answers (statement 6), 74.4% versus 57.4% (P = .041). The ARS group had performed better on the test questions than on the MCPP questions. The ARS and control groups both agreed that (1) MCPP slides increased understanding of the lecture content (statement 2): 98.7% and 100%, respectively; (2) a discussion of the correct answers to the MCPP slides increased understanding (statement 3): 98.7% and 98.1%, respectively; and (3) a discussion of the incorrect answers to the MCPP slides increased understanding (statement 4): 97.4% and 98.1%, respectively.
In addition, both the ARS and control groups agreed that the MCPP slides influenced content areas studied for an examination (statement 7): 56.4% and 66.7%, respectively. Both the ARS and control groups denied that correct MCPP answers led them to focus less on the content when studying for an examination (statement 8): 75.6% and 75.9%, respectively, but concurred that incorrect MCPP answers led them to focus more on content when studying for an examination (statement 9): 53.9% and 64.8%, respectively. Both groups readily agreed that MCPP slides assisted the students in attaining better examination scores (statement 10), 78.2% (ARS) and 87.0% (control group), and disagreed that the MCPP slides caused the pupils to perform more poorly on the examinations (statement 11): 97.4% (ARS) and 96.3% (control group).
Interestingly, the control group overwhelmingly disagreed with two statements pertaining to ARS use (Table 3). The control subjects rejected statement 12, which declared that they wished they had the opportunity to use the ARS; 72.2% said no. In addition, 77.8% rejected the statement that they would choose an ARS course over a non-ARS course.
Only the intervention group predominantly agreed that ARS use was beneficial (Table 3). The students felt that ARS feedback increased comprehension of the material (93.6%) and increased retention of lecture content (89.7%). Furthermore, the students in the ARS group agreed that the use of an ARS increased their examination scores (64.1%) and did not decrease their scores (97.4%). Most were glad to have the opportunity to use the ARS (94.9%) and would recommend that the ARS be used in nursing pharmacology courses (92.3%) and in other nursing courses (85.9%). In fact, 61.5% of the interventional students would choose an ARS course over a non-ARS course. Because junior-senior students were noted to be less embracing of ARS technology,20 the authors expected that students with prior degrees might not concur; this was not the case. Interestingly, 64.1% of the ARS group would not choose a course that used laptops to answer MCPP questions. Similarly, 83.3% of the respondents would prefer to use clickers versus laptops in answering MCPP questions.
Baccalaureate nursing students who attended nursing pharmacology lectures using MCPP slides and an ARS did not demonstrate increased comprehension and retention of pharmacological knowledge compared with those students relying solely on MCPP slides. There were no significant differences in course test scores or subtest scores between the MCPP and the MCPP plus ARS groups. The ARS group, whose students were older and more likely to hold prior degrees, performed the same as the younger control group. These results suggest that the use of an ARS did not provide an additional boost to students' test scores, similar to the findings of Stein et al13 and Halloran.11 The test scores may have been influenced more by the MCPP slides, which increased student engagement in both the ARS and control groups. Thus, MCPP use in both groups may have encouraged active inquiry and positively affected both groups' test scores, resulting in no discernible differences in test scores between groups, as Crossgrove and Curran suggested.20
As Stein et al13 observed, the students believed that ARS use enabled them to comprehend course material and attain higher test scores, even though the test scores were not significantly increased. Similarly, Abdallah8 noted that students felt that the ARS assisted their comprehension of lecture content, and DeBourgh2 remarked that students attributed higher examination scores to ARS use. In the current economic climate, the lack of improvement in test scores may outweigh the increased student satisfaction levels when weighing the financial costs of ARSs to the institution and students. However, the merit of the immediate student feedback an ARS provides, which allows the instructor to tailor a lecture to students' needs as it unfolds, must not be underestimated.
This study supports the benefits of inquiry in lecture courses, regardless of ARS use. Students in the ARS and control groups extolled the virtues of using MCPP slides within the lectures. Students in both groups overwhelmingly agreed that MCPP slides increased understanding of the lecture content and that discussion of correct and incorrect answers to the MCPP increased their understanding.
Generally, both groups agreed that correct answers to the MCPP slides corresponded to correct test answers; the students may have already shown mastery during the lecture and retained this learning at testing. Correctly answering MCPP questions in lecture did not lead the students in either group to study that content less when preparing for a test; pupils may have felt the need to revisit mastered material, believing the MCPP questions would be on the examination. The two groups disagreed, however, that incorrect answers to the MCPP slides resulted in incorrect answers on the examinations. The ARS group was more likely to disagree that incorrect MCPP answers corresponded to incorrect examination answers. Possibly, these older students had more experience studying for tests and recognized the need to study content areas in which they had difficulty. Students in both groups concurred that incorrect answers to MCPP slides led them to focus on that content when preparing for the examination. This focus may result in postlecture mastery of content. Thus, pupils prepared for examinations by reviewing both mastered and nonmastered content.
Interestingly, the students who attended lectures with MCPP slides did not wish for an opportunity to use an ARS. Perhaps they preferred to avoid additional costs: the costs of ARS USB receivers and site licenses are negotiated between the university and vendor, and some of these costs may be passed to students through keypad purchases at the university bookstore. Or the pupils may have been pleased with the MCPP-slide format and unaware of the advantages of anonymity and histograms provided by ARSs, particularly because the control group had significantly less previous ARS experience. Future crossover design studies are needed to investigate the reasons for the students' preferences between MCPP-enhanced lectures with and without ARSs, as well as to compare these MCPP-enhanced lectures with non-MCPP didactic lectures. In addition, studies are needed to determine whether cumulative final test scores are increased by the use of ARSs.
Although the test scores were not significantly different, the histograms of student responses allowed the instructor to tailor the lectures to overall student needs, instead of the vocal few. If instructors use existing ARS technology already established at their institution, they may decrease student costs by requiring one clicker for all courses at that institution. In addition, favorable student responses suggest that this technology improves the educational climate, which may be reflected in students' evaluations of faculty.12 Furthermore, if the control group had been allowed to try the ARS, the students might have embraced this technology.
A limitation of this study is that ARS participants may have chosen to switch clickers with other students without the researchers' knowledge. However, because no points were awarded for attending lecture or for responding to the MCPP slides, there was no incentive for students to exchange their registered clickers. If the students had answered for each other, comparison between the ARS and test responses for each student in the ARS group would have been invalid. The likelihood of the Hawthorne effect is low because the students used MCPP slides and the ARS throughout the semester, which diminishes the effect of novelty on the study's results. In addition, the satisfaction questionnaire has only been tested for content validity. Only questions deemed valid by all authors were kept. No other tests for reliability or validity were done. No validated satisfaction questionnaires on ARS use were found. Lastly, the generalizability of the results is limited because of the use of convenience samples rather than random sampling. These outcomes are specific to baccalaureate nursing students enrolled in a nursing pharmacology course. Further research is required on the impact of ARSs on student learning within other baccalaureate nursing courses.
1. Collins LJ. Livening up the classroom: using audience response systems to promote active learning. Med Ref Serv Q. 2007;26(1):81-88.
2. DeBourgh GA. Use of classroom "clickers" to promote acquisition of advanced reasoning skills. Nurse Educ Pract. 2008;8(2):76-87.
3. Dufresne RJ, Gerace WJ, Leonard WJ, Mestre JP, Wenk L. A classroom communication system for active learning. J Comput High Educ. 1996;7(2):3-47.
4. Kraft RG. Group-inquiry turns passive students active. Coll Teach. 1985;33(4):149-154.
5. Lowry PB, Romano NC, Guthrie R. Explaining and predicting outcomes of large classrooms using audience response systems. Proceedings of the 39th Hawaii International Conference on System Sciences, January 4-7, 2006. Los Alamitos, CA: Computer Society Press; 2006.
6. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ. 2007;6(1):9-20.
7. Menon AS, Moffett S, Enriquez M, Martinez MM, Dev P, Grappone T. Audience response made easy: using personal digital assistants as a classroom polling tool. J Am Med Inform Assoc. 2004;11(3):217-220.
8. Abdallah L. Reflective teaching with technology: use of a personal response system and publisher's web site to enhance students' performance in a nursing assessment and skills course. Online J Nurs Inform. 2008;12(1). http://ojni.org/12_1/abdallah.html. Accessed March 16, 2009.
9. Cain J, Robinson E. A primer on audience response systems: current applications and future considerations. Am J Pharm Educ. 2008;72(4):77.
10. Gamito EJ, Burhansstipanov L, Krebs LU, Bemis L, Bradley A. The use of an electronic audience response system for data collection. J Cancer Educ. 2005;20(1 suppl):80-86.
11. Halloran L. A comparison of two methods of teaching. Computer managed instruction and keypad questions versus traditional classroom lecture. Comput Nurs. 1995;13(6):285-288.
12. Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for the continuing education of health professionals. J Contin Educ Health Prof. 2003;23(2):109-115.
13. Stein PS, Challman SD, Brueckner JK. Using audience response technology for pretest reviews in an undergraduate nursing course. J Nurs Educ. 2006;45(11):469-473.
15. Johnson JT. Creating learner-centered classrooms: use of an audience response system in pediatric dentistry education. J Dent Educ. 2005;69(3):378-381.
16. Barber M, Njus D. Clicker evolution: seeking intelligent design. CBE Life Sci Educ. 2007;6(1):1-8.
17. Kelley KA, Beatty SJ, Legg JE, McAuley JW. A progress assessment to evaluate pharmacy students' knowledge prior to beginning advanced pharmacy practice experiences. Am J Pharm Educ. 2008;72(4):88.
18. Foegen A, Hargrave CP. Group response technology in lecture-based instruction: exploring student engagement and instructor perceptions. J Spec Educ Tech. 1999;14(1):3-17.
19. Nosek TM, Wang W, Medvedev I, Wile MZ, O'Brien TE. Use of a computerized audience response system in medical student teaching: its effect on active learning and exam performance. In: Reeves T, Yamashita S, eds. Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education. Chesapeake, VA: AACE; 2006:2245-2250.
20. Crossgrove K, Curran KL. Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ. 2008;7(1):146-154.
21. Hoopes LL. Educator highlight: Morris Maduro. CBE Life Sci Educ. 2008;7:3-4.
22. Freeman S, O'Connor E, Parks JW, et al. Prescribed active learning increases performance in introductory biology. CBE Life Sci Educ. 2007;6(2):132-139.
23. Elliot C. Using a personal response system in economics teaching. Int Rev Econ Educ. 2003;1(1):80-86.
24. Holmes RG, Blalock JS, Parker MH, Haywood VB. Student accuracy and evaluation of a computer-based audience response system. J Dent Educ. 2006;70(12):1355-1361.
25. Torbeck L. Enhancing programme evaluation using the audience response system. Med Educ. 2007;41(11):1088-1089.
26. Preszler RW, Dawe A, Shuster CB, Shuster M. Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses. CBE Life Sci Educ. 2007;6(1):29-41.
27. Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med. 2005;37(1):12-14.
28. Pradhan A, Sparano D, Ananth C. The influence of an audience response system on knowledge retention: an application to resident education. Am J Obstet Gynecol. 2005;193:1827-1830.
29. Conoley J, Moore G, Croom B, Flowers J. A toy or a teaching tool? The use of audience-response systems in the classroom. Techniques. 2006;81(7):46-48.
30. Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004;36(7):496-504.
© 2011 Lippincott Williams & Wilkins, Inc.