GRIMES, CORINNE PhD, RN, CNE; JOINER ROGERS, GLENDA PhD, RN, CNS; VOLKER, DEBORAH PhD, RN, AOCN; RAMBERG, ELIZABETH MSN, RN
In response to the nursing shortage, colleges and universities around the country have been creating innovative nursing programs to attract and retain nontraditional students. The Alternate Entry Master of Science in Nursing (AEMSN) program at The University of Texas at Austin School of Nursing is an accelerated course of study for second-degree students in nursing. The 15 months of foundational course work prepares students with degrees in other fields to take the NCLEX and begin practice as RNs. They then go on to advanced course work and, upon completion, attain an MSN.
Newly admitted students have developed sound study habits prior to entry into the program and have a record of academic success. Over the past 4 years, the mean grade point average from undergraduate work has been 3.3 on a 4.0 scale. From early course work in the AEMSN program, it is evident that these students have strong writing and speaking skills, are computer literate, form teams easily, and, as a group, have a strong commitment and desire to learn. In short, they are motivated to succeed. From entrance applications, it is evident that the students are usually older and have more life experience than the average undergraduate nursing student.
Students in the AEMSN program have devoted time and energy and have made personal sacrifices to return to school. The faculty have observed that they tend to have strong expectations that in-class time will be productive. With that in mind, the faculty felt a need to enhance the classroom experience for this nontraditional student group. In addition, the faculty recognized that the accelerated pace posed an additional danger: a student might falter before a learning problem was noticed. Early recognition and remediation for marginal students were therefore vital.
While the students who enter the AEMSN course of study are high achievers, many had not been exposed to the real-life, real-time analysis, prediction, and decision-making skills needed by today's RN. A method was needed to foster in-class practice with complex concepts and to give teachers immediate feedback about student mastery of core material. For these reasons, the classroom performance system (CPS), an audience participation system designed for classroom use, was implemented, and a program evaluation project was initiated to assess the usefulness of this tool in our setting. The project focused on student and faculty satisfaction with the tool and on its impact on student learning outcomes.
The Student Learner
Some studies indicate that, while students in general approach learning in individual and unique ways, there are commonalities that create student dissatisfaction with the traditional classroom.1-4 Chief among the practices likely to produce dissatisfaction is teachers' reliance on passive learning strategies such as lecture. Studies indicate that, despite readily available means to vary teaching techniques, passive, teacher-led, fixed-content approaches still predominate in the delivery of classroom information.4 Students may tend to "zone out" during presentations when slides are used primarily to restate facts from the text.3 This disparity between students' need for interaction and teachers' need to "push through" a given set of content in a possibly constricted time frame can create problems.
Additional evidence indicates that complex learning, such as that needed to think critically, involves a variety of more interactive strategies to ensure retention of information.5,6 When one considers the trend toward interactive online education for a variety of professional preparation paths, the need to change old ways becomes evident. Effective pedagogic strategies are flexible, facilitate communication among participants, and enhance problem-solving skills and critical thinking.7-9 These strategies also allow students input and choice, which favorably influences the learning process and may result in greater student satisfaction.
In the age of interactive electronics, ensuring student attention may call for interactive strategies in the classroom. From an early age, today's students have interacted with computers daily for gaming and socialization as well as for course work. Future complex learning may be enhanced by new teaching strategies that take advantage of the electronic interaction style inherent in most of today's college students. By building on this natural evolution in student comfort with technology, classroom presentation and interaction are likely to improve. In contrast, passive classroom learning will likely continue to produce dissatisfied students and frustrated teachers.
The Technology Interface
One classroom strategy being used to encourage active learning, critical thinking, and application of knowledge is the use of audience response technologies. These interactive tools are variously described as audience response systems, interactive response systems, CPSs, classroom response systems, student response systems, electronic response systems, and wireless response technology. They are all characterized by an electronic platform with individual keypads for learner response to a teacher's prompt or question within a group setting. These response tools have been used since the 1970s in large, higher-education classroom settings to evaluate students' performance on examination-type questions and mastery of critical thinking.10-12 For clarity, the term classroom performance system (CPS) will be used for the remainder of this article.
A CPS is a computer-based, typically wireless, piece of classroom instructional technology designed to improve interaction and collaboration by allowing teachers to solicit student feedback on multiple-choice-type questions. Student responses are collected individually but are generally displayed collectively in the classroom on a computer and projector as histograms of answers. The histogram will also indicate the correct response. Students are then able to see how their responses compare with those of other students. Woods and Chiu13 suggest that, if most of the students answer a question correctly, then those who answer incorrectly may be motivated to study more on that subject. Rice and Bunz8 assert that it is reasonable for students who answer incorrectly to be more attentive to follow-up explanations. Through this increase in attention and involvement, there will be potential for increased learning to take place.
Interactive response technology allows students to make mistakes and enables the instructor to provide on-the-spot clarification of content and/or tailoring of lecture material according to student responses. The use of a CPS provides an atmosphere that is more engaging than traditional methods of lecture and inquiry.8,14 The suggestion is that an improvement in classroom morale occurs when such interactive techniques are used for complex semester-long courses. Authors representing a variety of disciplines suggest that the use of an interactive system in the classroom enhances the outcomes of the educational experience by stimulating critical thinking and fostering student motivation to study.8 Within healthcare educational settings, CPSs have been used with undergraduate nursing students,15,16 medical students,17,18 interdisciplinary continuing education program participants,19,20 dental students,12 and pharmacy students.9 No studies specific to accelerated graduate nursing programs were found.
Description of Classroom Technology
In the spring of 2005, the AEMSN program instructors instituted a CPS available from the E-Instruction Company located in Denton, TX. From a number of commercially produced systems available around the country, the investigators selected this model because of the easy access to technical support and because it was in use in other colleges within the university. The system allows students to respond to either preprogrammed or random questions inserted into a standard lecture format. True-false or multiple-choice questions can be used to determine content mastery or just to assess such simple things as group background demographics on the first day of class. Teachers can use standard black-and-white overhead projections, verbal items, and/or slides and slide software such as PowerPoint (Microsoft, Redmond, WA). The selected CPS works well with any of these options.
Each student is issued a keypad (also termed a clicker), similar in size and complexity to a standard remote control device. When the teacher indicates that it is time to do so, each student presses a key on the keypad. Each keypad carries a unique code number (from "1" to "49", matching the number of students enrolled), which keeps each student's response confidential within the larger group. When the teacher directs the system to compile and display all student responses to a particular question, students can examine the histogram of responses projected at the front of the classroom. The correct response is clearly identified by a different color within the histogram's bar. Each student can thus identify his/her own answer and compare it with the correct response, as well as with how the entire class performed on the item.
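The tally-and-display step described above can be illustrated with a short sketch. This is not the vendor's software, only a minimal text-mode approximation; the function name and the sample responses are hypothetical.

```python
from collections import Counter

def response_histogram(responses, choices="ABCD", correct=None):
    """Tally one-letter keypad responses and render a text histogram.

    `responses` is a list of answers, one per keypad; `correct` marks
    the right choice, as the CPS does with a distinct color.
    """
    counts = Counter(responses)
    lines = []
    for choice in choices:
        n = counts.get(choice, 0)
        marker = " <- correct" if choice == correct else ""
        lines.append(f"{choice}: {'#' * n} ({n}){marker}")
    return "\n".join(lines)

# Hypothetical responses from a small class to one multiple-choice item.
print(response_histogram(list("AABCBBBBDB"), correct="B"))
```

Because each keypad code maps to one student, the same tally can be stored per student to flag those who repeatedly miss application-level items.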
Not only does this alert each student to a potential problem with mastery of content, but, through delivery of class performance records to the teacher's office computer, it also serves as an early alert for the teacher regarding marginal students who seem to be struggling with application- and analysis-level questions. The system alerts the teacher in another way as well: if a large number of students fail to identify correct responses in class, the material may be too difficult, the in-class questions may be poorly written, student attention may be drifting, or students may simply be failing to study. That is, while the class is still in session, the CPS can signal that something is amiss in the delivery of the material or in the items used to assess mastery, prompting the teacher to modify instructional techniques accordingly.
The CPS works well with PowerPoint and with verbal items inserted "on the spur of the moment" by teachers. For example, if a teacher wanted to retest a given point later in a lecture, a spontaneous question could be inserted verbally and student responses could be reviewed for a second time.
In addition to the periodic assessment or testing of learning throughout a class presentation, the CPS also allows teachers to encourage students to pre-read and prepare for class by having a short quiz at the beginning of each class period. This use of the CPS can prompt better student preparation for each class session and allow teachers to quickly grade quizzes that cover main points that will be expanded upon within the class presentation.
The purpose of this project was to evaluate the usefulness of integrating CPS technology into an accelerated graduate nursing program. Three topics of interest were investigated: (1) student satisfaction with the CPS technology, (2) faculty satisfaction with the CPS technology, and (3) the impact of the CPS technology on student performance on a standardized final examination. Use of the selected CPS software and hardware was implemented in the spring 2005 semester with AEMSN students who were enrolled in a required adult health nursing prelicensure course. Students and faculty involved in the course participated in evaluation of topics 1 and 2. To evaluate topic 3, a previous cohort of AEMSN students who were enrolled in the same course in spring 2004 was used as a comparison group for evaluating examination scores.
Setting and Sample
The adult health course was team-taught by University of Texas at Austin School of Nursing faculty members in a tiered classroom setting. The course occurs in the third semester of a four-semester, prelicensure sequence of course work. The classroom component was held once a week for a 2-hour period over a 15-week session. All 49 of the AEMSN students enrolled in the course participated in the evaluation project. The CPS technology was used throughout the lecture component of the course. The students possessed bachelor's degrees in a variety of fields and had completed all of the prerequisite courses for entry into the program. Most of the students were women (86%), and the mean age range was 26 to 30 years. Only one student reported not having a personal computer at home. Students reported a variety of preferred learning styles, including visual (45%), hands-on (36%), individual work (26%), and auditory (18%).
Faculty participants were master's-prepared (two) and doctorally prepared (three) experienced adult health clinicians with a mean of 10 years of teaching experience. All were women, and the mean age was 52 years. Study authors C.G. and G.J.R. trained all faculty participants in the use of the CPS technology and served as resources for faculty as they developed and implemented CPS-mediated questions within their classroom presentations.
For comparison purposes, a cohort of "CPS-naive" students was used for evaluation topic 3. This cohort consisted of AEMSN prelicensure students who completed the same adult health course in the spring 2004 semester. Course content and timing were identical to those covered in the 2005 semester; however, CPS technology was not used. This group consisted of 39 predominantly female students (85%) with a mean age range of 26 to 30 years. The group completed the same final examination as did the 2005 cohort. No other data were obtained from this group.
Data Collection and Analysis
At the end of their semester, the spring 2005 students were invited to participate in the CPS evaluation project. Of the 49 students, 48 agreed to complete the CPS Survey Instrument (CPSSI). Students were given a brief explanation of the tool and were assured that responses to the tool would be reported as aggregate data only.
The CPSSI is a 20-item tool developed by the project investigators. The purpose of the tool is to capture information relevant to student satisfaction with the CPS technology within the assigned course. The tool is divided into four sections: student demographic information, technical aspects of using the CPS, opinions about potential learning enhancement with the CPS, and comments about future directions for using the CPS technology within the AEMSN program. Items include a mix of forced-choice and short, open-ended questions, and the tool takes about 5 to 10 minutes to complete. To enhance the validity of the CPSSI, a draft of the tool was distributed to doctorally prepared faculty who had previously taught the course. This panel of judges was asked to review the tool for content and construct validity and internal consistency. After the review, minor changes were made to improve item clarity. Descriptive statistics for responses to the forced-choice items on the CPSSI were analyzed using SPSS. Responses to the open-ended CPSSI questions were analyzed via a content analysis approach.
Faculty satisfaction with the CPS technology was evaluated by a short interview with each faculty member who taught the course in the spring 2005 semester. Interview questions were developed by the project investigators and consisted of three open-ended questions. Responses to the questions were analyzed using a content analysis approach.
The data collection tool for evaluation topic 3 consisted of a 65-item multiple-choice final examination that students in both the 2004 and 2005 cohorts completed as a required course component. Items focused on the leadership and management of groups of adult health clients and were developed and refined by faculty who taught the course.
To compare examination scores between the two groups, t tests were run using SPSS software.
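The independent-samples t test used here can be reproduced in a few lines. The sketch below is illustrative only: the score lists are hypothetical, not the study's data, and the study itself ran the test in SPSS.

```python
from math import sqrt

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    # Sample variance (n - 1 denominator).
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def pooled_t(a, b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * var(a) + (nb - 1) * var(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical examination percentages for two small cohorts,
# standing in for the 2005 (CPS) and 2004 (pre-CPS) groups.
cps_2005 = [88, 91, 84, 90, 87, 93]
pre_2004 = [80, 83, 78, 85, 81, 79]
t = pooled_t(cps_2005, pre_2004)
```

A positive t here means the first group's mean exceeds the second's; the corresponding P value would then be read from the t distribution with n1 + n2 - 2 degrees of freedom.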
Results

Student responses to the CPSSI satisfaction items are summarized in Table 1. Most respondents indicated that the CPS enhanced the classroom experience by being an effective learning tool. This included positive responses for effectiveness, alertness, participation, and enjoyment. Students offered written feedback regarding the efficiency of the system, such as that the CPS "helps engage the classroom attendee," "requires students to pay more attention," serves as "a good test prep," allows students to make "immediate application" of class material, and lets students "know how [well] they are learning the material."
Each class lecture contained no more than eight CPS questions placed at intervals throughout a PowerPoint presentation. The survey indicated that more than 80% of students surveyed preferred to have even more random (as opposed to pretest) CPS questions interwoven with each lecture. Students preferred having in-class questions "fashioned from the [final exam] style [for that class]." The consensus among this group of self-identified motivated learners was that attendance was not influenced by use of the CPS. Attendance figures for the semester confirmed that attendance was high, with 100% of students present at most class sessions.
The focus of the faculty satisfaction interviews was to determine whether the CPS technology was simple to apply when using teaching tools such as PowerPoint. Faculty reacted positively to the CPS. Teachers stated that attendance was simpler to track with the CPS than with paper sign-in sheets. They mentioned that they rapidly learned how to use the system and easily embedded random questions within their previously prepared PowerPoint presentations. Teachers felt that it was "fun" and that the students seemed to enjoy it and to participate freely. None of the teachers felt rushed when pausing the lecture to engage the students with a question. Probably the biggest challenge for these teachers was getting the students to leave the response pads in a storage container inside the classroom; not all students followed this request. According to CPSSI feedback on the technical aspects of the system, 45% of the students always brought their response pads to class, while 18.4% forgot them more than six times during the semester.
A second measure was used to assess the impact of the CPS on student learning. For practical reasons, test scores for two successive cohorts were compared at the end of the first semester of CPS use: the investigators were concerned that the CPS might actually hinder learning in this specialized group of students in some previously unrecognized way. Data for this preliminary impact analysis came from the course's sole content examination, the 65-item multiple-choice examination of leader-manager content described earlier. Student learning outcomes were measured by test scores for the two successive cohorts (spring 2004, pre-CPS; and spring 2005, post-CPS); score comparisons are provided in Table 2. As a preliminary measure of impact, a t test was used to compare the examination scores of the 49 AEMSN students who had used the CPS in 2005 with those of the 39 students who had not experienced the CPS and who took the same examination in 2004 (Table 2). The intent was to ensure that group scores, presumably a measure of classroom teaching effectiveness, did not decrease with the introduction of the CPS.
Results of the comparison suggest that there is a relationship between use of the CPS and improvement in scores on the course's content examination (t = 6.87, P < .001). However, these results should be interpreted with caution because no comparative analysis of group demographics was done, nor was a pretest-posttest design used.
Discussion and Recommendations
The purpose of this project was to evaluate the potential usefulness of a CPS within an accelerated graduate nursing education program. Findings involved three elements: the student satisfaction survey, the faculty satisfaction survey, and comparison of final examination scores between two groups of students. Responses were largely positive on the student satisfaction survey. This was not a surprise to the teachers, who had observed student reactions within the classroom at time of use. A majority of students stated that the CPS enhanced the classroom experience by being an effective learning tool. This included positive responses for effectiveness, alertness, participation, and enjoyment as well as comments mentioning immediate reinforcement of content within the classroom and students' feelings of being engaged in the classroom learning process. Our findings are similar to other positive reports about use of CPS technology within health education settings.15,16,19 Based on our evaluation data, we have since moved forward with expanding use of the CPS technology in another AEMSN course and have initiated another evaluation project accordingly.
Clearly, the faculty who participated in this evaluation pilot project were satisfied with their experience and expressed interest in expanding the use of the technology to other courses within the graduate and undergraduate curricula at our school. However, one must carefully consider which types of courses and teaching strategies are best suited to CPS technology. For example, in our new evaluation project, we are using the system in a very fast-paced, content-rich course focused on pathophysiology and associated nursing management issues. Although embedding NCLEX-style multiple-choice questions throughout each lecture seems to be a useful way to evaluate student learning, allocating sufficient time within the lecture to pose each question, analyze and display the answers, and provide follow-up clarification challenges the traditional way in which we present this class content. Faculty will need to plan carefully for sufficient class time to use the CPS effectively and may need to modify student readings and other out-of-class preparatory work so that students depend less on class time for coverage of every concept they must master. The results of this project will be reported in a separate article.
The fact that the students in the evaluation project fared well on their final examination was heartening. Some of our faculty were initially reluctant to adopt the use of the CPS technology because of worry that student performance on examinations might be negatively affected. Concerns included potential student confusion with the system, faculty discomfort with the technology, and negative impact on classroom time otherwise allocated to other teaching/learning activities. Although our project findings have allayed those fears, further study of the impact of CPS technology on learning outcomes is warranted. Controlled research studies that include matched comparison groups, a variety of classes and learners (eg, undergraduate and graduate), pretest-posttest designs, and a variety of learning outcomes, including performance on the NCLEX, should be conducted. Such studies will assist with identifying best practices associated with the CPS.
Expanded use of CPS technology warrants careful consideration of the cost and availability of technological support on campus. The adoption of a CPS requires funding for both necessary hardware and software. For the first semester of use (spring 2005), school funds were used for activation fees and to purchase the handheld remote devices. For the 49 students, this amounted to a total of $1875. Subsequently, students were required to purchase the handheld remote devices. These currently cost $25, with an activation fee of $12.50 per semester or $35 for lifetime use. However, once purchased, the device can be used in all subsequent classes that use the CPS. Hence, cost-efficiency improves as use becomes more widespread across courses. Another cost of adopting a CPS involves both student and faculty time to learn the system. Again, that cost diminishes over time once CPS use becomes more commonplace.
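The fee figures above support a quick per-student and break-even calculation. This is only an arithmetic check on the numbers quoted in the text, not vendor pricing guidance.

```python
from math import ceil

DEVICE = 25.00        # handheld remote, per the fees quoted above
PER_SEMESTER = 12.50  # activation fee per semester
LIFETIME = 35.00      # one-time lifetime activation fee

# Initial school outlay in spring 2005: $1875 for 49 students.
per_student_2005 = 1875 / 49  # roughly $38 per student

# Lifetime activation pays for itself once a student will use the
# system for ceil(35 / 12.50) semesters or more.
breakeven_semesters = ceil(LIFETIME / PER_SEMESTER)
```

Under these figures, lifetime activation is the cheaper choice for any student expecting to use the CPS in three or more semesters, which reinforces the point that cost-efficiency improves as use spreads across courses.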
We were fortunate to have expert technological support from our university because the CPS we selected was already in use in a number of other departments. Certainly, it is best to standardize use of a single system across campus. As universities consider adopting audience response systems on campus, stakeholders from all interested departments should be included in the planning and decision-making process. Ideally, entering freshmen could purchase their handheld devices, become familiar with their use early in their course work, and use the same device throughout their program of study. Integration of the technology across the curriculum will enhance the cost-effectiveness of adopting this tool.
In sum, we anticipate that student learning outcomes can be enhanced by improving student engagement and interaction via audience response technologies. This evaluation project provided valuable insights that have led to expanded use of the CPS within our accelerated MSN program. Future projects and studies will assist us with defining and refining optimal use of this important tool.
The authors thank Morrie Schulman, from the Division of Instructional Innovation and Assessment, University of Texas at Austin, and Wendy Sera, also from the University of Texas at Austin, for their assistance in the preparation of this manuscript.
2. Sellheim DO. Physical Therapy Students' Approaches to Learning: Faculty Beliefs and Other Educational Factors That Influence Them [dissertation]. Minneapolis, MN: University of Minnesota; 2001.
3. Skiba DJ. Got large lecture hall classes? Use clickers. Nurs Educ Perspect. 2006;27:278-280.
4. Ficca MS. The Congruency Between Actual and Intended Teaching Strategies Used by Nursing Faculty in BSN Programs [dissertation]. Chester, PA: Widener University; 1999.
5. Hafner K. In class, the audience weighs in. New York Times. April 29, 2004;G:1.
6. Koeckeritz JL, Hopkins KV, Merrill AS. ILEUM: interactive learning can be effective using mnemonics. Nurse Educ. 2004;29:75-79.
7. DeBourgh GA. Predictors of student satisfaction in distance-delivered graduate nursing courses: what matters most? J Prof Nurs. 2003;19:149-163.
9. Slain D, Abate M, Hodges B, Stamatakis MK, Wolak S. An interactive response system to promote active learning in the doctor of pharmacy curriculum. Am J Pharm Educ [serial online]. 2004;68(5):117. http://www.ajpe.org/view.asp?art=aj6805117&pdf=yes. Accessed June 16, 2008.
10. Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med. 2005;37:12-14.
11. Greer L, Heaney PJ. Real-time analysis of student comprehension: an assessment of electronic student response technology in an introductory earth science course. J Geosci Educ. 2004;52:345-351.
12. Holmes RG, Blalock JS, Parker MH, Haywood VB. Student accuracy and evaluation of a computer-based audience response system. J Dent Educ. 2006;70:1355-1361.
14. Roy KH. Pilot investigation of the utility of a student response system in medical student lectures. J Audiov Media Med. 1996;19:27-32.
15. DeBourgh GA. Use of classroom "clickers" to promote acquisition of advanced reasoning skills. Nurse Educ Pract. 2008;8:76-87.
16. Stein PS, Challman SD, Brueckner JK. Using audience response technology for pretest reviews in an undergraduate nursing course. J Nurs Educ. 2006;45:469-473.
17. Menon AS, Moffett S, Enriquez M, Martinez MM, Dev P, Grappone T. Audience response made easy: using personal digital assistants as a classroom polling tool. J Am Med Inform Assoc. 2004;11:217-220.
18. Nosek T, Wang W, Medvedev I, Wile M, O'Brien T. Use of a computerized audience response system in medical student teaching: its effect on active learning and exam performance. In: Reeves T, Yamashita S, eds. Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2006. Chesapeake, VA: AACE; 2006:2245-2250.
19. Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for the continuing education of health professionals. J Contin Educ Health Prof. 2003;23:109-115.
20. Gamito EJ, Burhansstipanov L, Krebs LU, Bemis L, Bradley A. The use of an electronic audience response system for data collection. J Cancer Educ. 2005;20(suppl):80-86.
© 2010 Lippincott Williams & Wilkins, Inc.