
Preparedness to Write Items for Nursing Education Examinations

A National Survey of Nurse Educators

Moran, Vicki PhD, MPH, RN, CNE, CDE, PHNA-BC, TNS; Wade, Heather DNP, FNP-BC, RN; Moore, Leigh MSN, RN, CNOR, CNE; Israel, Heidi PhD, FNP, LCSW, CCRC; Bultas, Margaret PhD, RN, CNE, CNL, CPNP-PC

Nurse Educator 47(2):63-68, March/April 2022. DOI: 10.1097/NNE.0000000000001102


In schools of nursing (SONs), faculty often use multiple-choice questions (MCQs) to assess student knowledge and application of subject matter. When used on faculty-prepared nursing examinations, MCQs evaluate the hierarchy of priority thinking essential to nursing education.1 Research supports the use of well-written MCQs to discriminate accurately between high- and low-ability students and suggests that the knowledge retrieval required to answer an MCQ enhances long-term recall.1,2

The use of MCQs also assesses students' competency to answer questions on the NCLEX-RN, administered by the National Council of State Boards of Nursing (NCSBN). The examination uses an internal adaptive mechanism and includes MCQs and alternate item formats. Currently, NCSBN is testing optional next-generation NCLEX (NGN)-type questions presented after the candidate has completed the examination. A candidate passes by answering questions correctly at the application level or a higher cognitive level of Bloom's taxonomy while meeting the required content from the Client Needs categories.3 To ensure public protection, each state requires candidates for licensure to meet set requirements that include passing an examination measuring the competencies needed to perform safely and effectively as a newly licensed, entry-level RN.4

The purpose of the ubiquitous use of MCQs in nursing education is apparent; however, these questions can be time intensive and challenging for nurse educators to write appropriately. Researchers have identified that 50% of items written will fail to discriminate knowledge as expected.5 The difficulty of item writing is underscored by the practice, on high-stakes examinations such as certification examinations, of piloting questions to determine their discriminating values and psychometric properties before they are used to score candidates.6 Admittedly, nursing faculty have access to test bank questions within textbooks and NCLEX preparation guides. Although these may be out of date, educators can modify such questions instead of learning to write their own. Unfortunately, these sources are also fraught with item writing flaws.7–9

Furthermore, the time commitment required to develop MCQs should not be understated. Other researchers have acknowledged that although instructors spend considerable time planning lectures and course materials, insufficient time is allocated to test preparation and review before administration.10 Some estimates suggest that writing just 1 high-quality MCQ can take up to 1 hour.11 The task is not insurmountable, but it requires dedication and confidence from the faculty member.

There are other barriers to MCQ item writing. Some authors argue that learning to compose MCQs is a "critical component to faculty development," and as such, the hiring institution, along with individual faculty, is responsible for this training.7,9 However, this development often comes as fragmented continuing education, and few nurse educators have formal preparation.7 Because nursing is a practice-based profession, many nurse educators are clinically trained and then promoted "without receiving training on how to write good test questions."12,13

There are significant consequences to the "learn as you go" approach to examination writing because instructors are accountable to stakeholders such as the public and licensing bodies. Researchers go so far as to call ensuring that assessments are high quality and valid an ethical and legal responsibility.8 Knowing that accurate scores are imperative in situations where students can pass or fail, and doing nothing about it, is unconscionable.5 Guidance on navigating the pitfalls of constructing discriminating items is lacking in the literature.6 Without such support, it is difficult to gain confidence with item writing. In addition, literature analyzing US nurse educators' education and confidence in item writing is scarce. Research has shown that education improves the quality of MCQs developed by teaching faculty and provides reassurance of the reliability of examination items and the accuracy of test scores.5,14,15

Despite the problematic nature of item writing, some tools are available to nursing faculty. Given the high-stakes nature of these examinations, NCSBN provides a test plan to guide examinees; this plan also aids nursing faculty in developing questions to prepare students for the examination.4 Aptitude in item writing is fundamental for nurse educators preparing students for NCLEX-RN success. A recurring theme is increased success when instructors are offered faculty development seminars and best practices for item writing.12–14 Statistical item analysis also provides real-time feedback to the instructor.13,16

Preparing students for licensure by writing useful examinations is not the only complexity in nursing education. The 2018 NCSBN Strategic Practice Analysis highlighted the increasingly complex decisions newly licensed nurses make during patient care and updated the knowledge, skills, and abilities (KSAs) required of new RNs. One recommendation from the analysis was to develop new item types to assess these new KSAs.17 NCSBN is currently conducting research to determine whether the proposed innovative item types can reliably assess clinical judgment and decision making in nursing practice.18 The NGN questions will use a case-based approach in which students answer NCLEX-style questions using clinical judgment based on the knowledge and skills learned in the nursing program.3 These anticipated changes in the NCLEX-RN examination format will add a further burden for nurse educators who already struggle to write NCLEX-style MCQs at the current level. For nursing faculty to develop the new prototype NGN questions, familiarity with the critical thinking component of the NCSBN's Clinical Judgment Measurement Model, the NCSBN's item writing guidelines, and Bloom's taxonomy will be essential.

A literature review reveals that nurse educators' education in item writing has not been well established in the United States.7 Nursing research addresses item writing flaws rather than item writing protocols, test statistics, and questions mapped to Bloom's taxonomy.1,7,11,13 The transition to NGN questions is imminent, and stakeholder expectations remain steady. As nurse educators prepare for the rollout of the NGN in 2023, it is essential that they are both prepared and confident in writing multiple-choice NCLEX-style questions. We were interested in examining nurse educators' preparedness to change from writing information-based questions to writing more complex, situational questions. Therefore, this study investigated nurse educators' preparedness and confidence to write NCLEX-style items and perform appropriate examination evaluation at accredited SONs in the United States, in preparation for the upcoming changes to the NCLEX-RN.

Methods

A descriptive survey design was used to collect data for the study. The survey was designed by the researchers and included questions centered on the confidence, experience, and preparation of nurse educators in writing examination items and analyzing examination results. The directions stated that the survey should be completed by nurse educators. Demographic variables were also obtained, including the number of years as a nurse and as a nurse educator, highest educational degree, certifications, nursing courses taught, multiple-choice examination development, and type of formal education or training on examination writing; various Likert scales assessed comfort with item writing and examination development. Because no tool exists to measure these items, the researchers wrote a set of survey questions and piloted them with nurse educators locally. The edits and revisions suggested by this sample were incorporated into the final 21-question survey, which was then distributed to the potential sample population.

Test-retest reliability was not assessed for the revised questions because a snowball sampling technique was used to obtain general impressions on the topic. Snowball sampling, a non–probability-based technique, is typically used with "hard-to-reach" populations and allowed the study to reach a wide selection of nurse educators who would not normally be easy to recruit.19 Because snowball sampling depends on the social networks of the initial participants, it is impossible to calculate a response rate.

Email addresses of deans/directors/coordinators were collected from publicly available internet sites through the listings provided by the Accreditation Commission for Education in Nursing (ACEN) and the Commission on Collegiate Nursing Education (CCNE). A total of 1550 emails were sent: 717 to ACEN-accredited programs and 833 to CCNE-accredited programs. In this initial contact, all recipients were encouraged to forward the email to additional potential participants, including applicable faculty members at their institution who write NCLEX-style questions, creating the second step of a chain-referral process.19 In addition, a flyer was distributed at a national nursing conference that a member of the research team was attending. A follow-up email was sent 3 months later, again with encouragement to forward the email to additional participants.

The study received approval from the authors' institutional review board before recruitment. The recruitment and follow-up emails contained a URL link to the electronic survey tool website, which included information on the purpose of the study, reasons for the individual to participate, a time estimate for completing the survey, the reasons for the research, an explanation of the level of confidentiality for participants, and contact information for the investigator. The survey data were exported from the electronic survey tool to IBM SPSS Statistics for Windows, version 26 (IBM Corp, Armonk, New York) to facilitate analysis and dissemination, including descriptive statistics, t tests, and analysis of variance (ANOVA).
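For readers who do not use SPSS, the sketch below shows roughly how the same analyses (descriptive statistics, an independent-samples t test, and a one-way ANOVA) could be reproduced with open-source tools. It is a minimal, hypothetical example: the file name and column names (survey_export.csv, confidence, has_cne, degree) are placeholders, not the study's actual variables.

```python
# Illustrative sketch only; the file and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("survey_export.csv")  # data exported from the electronic survey tool

# Descriptive statistics for the 5-point confidence ratings
print(df["confidence"].describe())

# Independent-samples t test: confidence of CNE-certified vs noncertified educators
cne = df.loc[df["has_cne"] == 1, "confidence"].dropna()
no_cne = df.loc[df["has_cne"] == 0, "confidence"].dropna()
t_stat, p_val = stats.ttest_ind(cne, no_cne, equal_var=False)
print(f"t = {t_stat:.2f}, P = {p_val:.3f}")

# One-way ANOVA: confidence across highest-degree groups (eg, MSN, DNP, PhD)
groups = [g["confidence"].dropna() for _, g in df.groupby("degree")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, P = {p_val:.3f}")
```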

Results

A total of 333 participants accessed the survey, and all data were returned within 3 months. Of these, 300 completed the entire survey. Demographics of the participants are listed in the Supplemental Digital Content, Table, https://links.lww.com/NE/B3. Most participants were female (n = 287, 96%). A total of 161 participants (54%) were 51 to 65 years of age, 31% (n = 93) were 36 to 51 years, 9% (n = 26) were older than 65 years, and 7% (n = 20) were 18 to 35 years. Most participants had an MSN (n = 170, 57%), followed by 18% (n = 55) with a PhD, 14% (n = 43) with a doctor of nursing practice (DNP), 8% (n = 25) with other degrees, and 2% (n = 7) with a BSN. Seventy-one percent (n = 213) had been a nurse for more than 20 years. When asked how long they had been in the nurse educator role, 21% (n = 62) reported more than 20 years, 34% (n = 102) reported 10 to 20 years, and 26% (n = 77) reported 5 to 10 years. Participants were from 44 states, making this a nationwide sample. Although the survey directions stated that it should be completed by nurse educators, in small programs the nurse educator could hold dual roles of dean/director and educator. Participants who reported a BSN as their highest degree were excluded from the analysis after the demographics were reported.

Of the participants who responded to the survey, 235 indicated they had been writing traditional NCLEX-style questions for more than 5 years. A total of 139 (46%) wrote their own NCLEX-style questions, and 96 (32%) used the test bank associated with the textbook and modified the questions to fit the content taught.

Sixty percent (n = 181) of respondents stated they had received formal training or education in writing traditional NCLEX-style questions, and 37% (n = 112) learned to write questions on their own. A variety of sources of item writing education were reported, mainly courses in advanced degree programs (16%), workshops (15%), conferences (13%), and the NCSBN (12%). Seventy-six participants (25%) reported holding certified nurse educator (CNE) certification.

When participants were asked about their confidence in writing traditional NCLEX-style items on a 5-point Likert scale, 22% (n = 63) were extremely confident, 45% (n = 136) were somewhat confident, 14% (n = 41) were neither confident nor unconfident, 11% (n = 34) were slightly confident, and 5% (n = 16) were not confident. Participants who held a CNE (n = 72) reported more confidence in writing questions (mean, 3.11; SD, 0.93) than those who did not (n = 217; mean, 2.53; SD, 1.1) (P = .001).

The survey asked about the process used to perform an item analysis of examination responses; participants could choose multiple responses. The most common responses included the point biserial, P value, percentage correct, Q index, and Kuder-Richardson (KR) coefficient (Table). Sixteen percent (n = 49) reported using 3 elements (KR, point biserial, and percentage correct), and 14% (n = 42) used 4 elements (KR, point biserial, P value, and percentage correct). Only 8% (n = 24) of the participants did not use any process for item analysis.

Table. Types of Item Analysis Used for Nursing Examinations

No. of Types Used    n (%) of Respondents
None                 24 (8)
1                    85 (29)
2                    67 (22)
3                    78 (26)
4                    44 (15)
5                    0 (0)

Types include the point biserial, P value, percentage correct, Q index, and KR (any combination).
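As an illustration of the item-analysis elements listed in the Table, the sketch below computes the difficulty index (P value, or percentage correct), a corrected point-biserial correlation for each item, and the Kuder-Richardson 20 (KR-20) coefficient from a scored response matrix. The simulated data are hypothetical; commercial test software reports these same statistics automatically.

```python
# Minimal sketch of common item-analysis statistics; data are simulated.
import numpy as np

rng = np.random.default_rng(0)
scores = (rng.random((50, 20)) < 0.7).astype(int)  # 50 students x 20 items, 1 = correct

n_students, n_items = scores.shape
totals = scores.sum(axis=1)

# Difficulty index ("P value" / percentage correct): proportion answering each item correctly
difficulty = scores.mean(axis=0)

# Corrected point biserial: correlation of each item with the rest-of-test score
point_biserial = np.array([
    np.corrcoef(scores[:, i], totals - scores[:, i])[0, 1] for i in range(n_items)
])

# Kuder-Richardson 20 (KR-20) internal-consistency estimate for the whole examination
kr20 = (n_items / (n_items - 1)) * (
    1 - (difficulty * (1 - difficulty)).sum() / totals.var(ddof=1)
)

print("Difficulty:", np.round(difficulty, 2))
print("Point biserial:", np.round(point_biserial, 2))
print(f"KR-20 = {kr20:.2f}")
```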

When asked about leveling questions to Bloom's taxonomy on nursing examinations, 68% (n = 24) of the participants reported using the taxonomy. Training in leveling questions to Bloom's taxonomy varied: 45% stated they had not received any formal training, and 40% stated they had.

Seventy-four percent (n = 222) of respondents provided rationales for examination questions when reviewing examinations with students. Sixty-nine percent (n = 206) were aware of the NCSBN item writing guidelines. Sixty-eight percent of the participants (n = 203) did not have an item writing guideline or policy.

When participants were asked how prepared they were to incorporate NGN questions into their examinations, 9% (n = 29) were somewhat or extremely prepared, whereas 64% (n = 190) reported being slightly or not at all prepared. When asked whether their institution was prepared to incorporate NGN questions into its examinations, 8% (n = 22) reported it was somewhat or extremely prepared compared with 71% (n = 211) who reported slightly or not at all prepared.

Confidence in item writing was also compared across educational degrees. Nurse educators with PhDs had higher confidence in writing questions (mean, 3.1; SD, 0.77) than faculty with DNPs (mean, 2.9; SD, 0.94) or MSNs (mean, 2.6; SD, 1.1). Feelings of individual (mean, 1.2; SD, 1.08) and institutional (mean, 1.08; SD, 0.98) preparedness for NGN questions were low for all groups. An ANOVA comparing the 3 groups on confidence in writing questions was statistically significant (P = .04). ANOVAs comparing the 3 groups on individual and institutional preparedness for NGN questions were not statistically significant (P = .577 and P = .69, respectively).

Discussion

This study assessed nurse educators' preparedness and confidence in writing NCLEX-style items at accredited SONs in the United States. Sixty percent (n = 181) of the participants received formal education and training in item writing, and nearly 90% reported an educational degree at the master's level or higher. A variety of educational sources were reported, yet only 46% reported writing their own questions. There is an apparent disconnect between educational preparation in graduate programs and confidence in item writing: education and training do not necessarily translate to confidence, as 30% still reported a lack of confidence in item writing. Faculty who write nursing examinations should have preparation in item writing and continue to refine their skills in item analysis.

Complicating matters is the concern that if items are not leveled to the learner and not evaluated, they may not accurately discriminate between high and low performers.2 Very easy or very challenging items have little discriminating power and can decrease the reliability of test scores.14 Item revision should be guided by statistical analysis: desirable items have P values around 0.5, whereas items whose correct answer has a low positive or negative point biserial, or whose distractors have a highly positive point biserial, warrant revision.16 The NCSBN has increased the degree of difficulty (logit score) on the NCLEX-RN since 2015, using more application questions based on critical thinking patterns. Therefore, nurse educators should be well versed in the various methods to evaluate their items and level them appropriately to the learner, supporting a pathway to NCLEX success.
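To make these revision criteria concrete, the sketch below flags items for review when the difficulty index falls far from 0.5, the keyed answer shows a low or negative point biserial, or a distractor shows a clearly positive point biserial. The simulated responses and the cutoff values (0.30 to 0.70, 0.20, 0.10) are assumptions chosen for illustration, not published standards.

```python
# Hedged sketch of an item-revision screen; data and cutoffs are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
options = ["A", "B", "C", "D"]
n_students, n_items = 60, 10
responses = rng.choice(options, size=(n_students, n_items))  # option chosen per item
key = rng.choice(options, size=n_items)                       # keyed (correct) answers

scored = (responses == key).astype(int)
totals = scored.sum(axis=1)

for i in range(n_items):
    rest = totals - scored[:, i]                               # rest-of-test score
    p = scored[:, i].mean()                                    # difficulty ("P value")
    key_pbis = np.corrcoef(scored[:, i], rest)[0, 1]           # keyed-answer point biserial
    distractor_pbis = {
        opt: np.corrcoef((responses[:, i] == opt).astype(int), rest)[0, 1]
        for opt in options if opt != key[i]
    }
    flags = []
    if not 0.30 <= p <= 0.70:
        flags.append("difficulty far from 0.5")
    if key_pbis < 0.20:
        flags.append("low/negative point biserial on keyed answer")
    if any(v > 0.10 for v in distractor_pbis.values()):
        flags.append("distractor with positive point biserial")
    if flags:
        print(f"Item {i + 1}: review ({'; '.join(flags)})")
```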

After an examination is administered, nurse educators need to analyze the individual items and the performance of the test as a whole.14 Interestingly, 8% of the participants did not use any item analysis on their examinations, and one-third used only one type. Bristol et al15 reported that a low percentage (10%) of nurse educators did not have access to any type of item analysis to evaluate their examinations; that study also found that 57% of respondents used 1 type of item analysis, with the point biserial the most frequently reported.15 Developers of national nursing examinations such as the NCLEX-RN openly report using examination standards for item and examination analysis. The present results indicate that many participants lack the ability to effectively review and validate items.

Many nurse educators reported using a test bank for MCQs, modifying test bank questions to align with nursing content. However, nursing test banks can easily be found online and purchased relatively inexpensively using a search engine. If nursing faculty are not aware that students can access these test banks, students may be using them to prepare for nursing examinations. An additional problem is that nursing test bank items are not validated and revised to match the pertinent nursing content taught in the course. Billings and Halstead16(p439) note that developing a valid and reliable test is an ongoing process and that examination items should be revised immediately after examinations.

Seventy-six participants (25%) reported having CNE certification. This certification, administered by the National League for Nursing, is considered a mark of excellence for nurse educators.20 The detailed 2020 candidate handbook lists 6 major content areas, including one labeled "use assessment and evaluation strategies," which covers item analysis. The certification measures competence in the academic nurse educator practice role.20

A significant concern identified in the study is the lack of preparedness of individuals and institutions for NGN items. As reported, nurse educators and their institutions are ill prepared for NGN item writing, with 64% (n = 190) of individuals reporting they were slightly or not at all prepared and 71% (n = 211) reporting their institutions were slightly or not at all prepared. This study shows that most participants are not currently prepared and will not be prepared for new item formats unless substantial education, and application of that learning, is integrated into current practice. Many reasons for this underpreparedness exist. One contributor is the numerous routes an RN takes to become an educator: nursing faculty may hold a PhD, EdD, DNP, or MSN along with various certificates unrelated to nursing education. Although the CNE has presented opportunities for assessing and evaluating the tools used in nursing curricula, more education is needed. In addition, both new and experienced educators need further education in how to objectively review items and examinations and how to write to the levels of Bloom's taxonomy.21 Faculty lack the resources needed to develop high-quality, rigorous test items similar to those on the licensing examination.22

It should be noted that this study was conducted during the 2019-2020 academic year, during which time the NCSBN launched a dedicated website for the NGN project. In fall 2019, the NCSBN NGN resource website published a newsletter describing the newly approved NGN items; another newsletter describing the use of case studies and NGN items was published in spring 2020. Although survey participants may not have had the opportunity to review these publications, the information related to NGN is publicly available, and all nurse educators should be aware of the website and apply that knowledge in their educator role. Currently, there are few resources for NGN item development beyond what NCSBN offers. The responsibility therefore lies with the nurse educator to seek resources and education relevant to the role.

Limitations

There are limitations to the study. The study tool could not provide the data needed for factor analysis, correlation tests, or item response theory methods. The survey was administered to a convenience sample of nurse educators during an academic year, which could have contributed to the low number of responses. Although snowballing helps reach limited populations or subgroups that are difficult to access, the researchers could not control the number of responses because of differences in network size, and the obtained sample may not represent the entire population. The study did not ask about the field of the highest degree (eg, nursing, education, or statistics) or the percentage of time spent teaching in the classroom or in clinical settings, which could have provided a better understanding of nurse educators' preparation and current responsibilities.

Conclusion

The survey identifies the challenges nurse educators face in writing MCQs to prepare students for the items currently used on the NCLEX. Many nurse educators rely on test banks that accompany a textbook or modify existing items instead of writing their own questions, on the assumption that NCLEX-style test bank questions have been validated. Nurse educators reported a lack of consistency and confidence in item writing. Even more concerning, participants identified a lack of confidence in their ability to develop NGN items, which will be released in the next couple of years. It is recommended that SONs develop ways to support faculty with item writing development, leveling of questions to Bloom's taxonomy, item analysis, and development of questions in the upcoming NGN item style. To better prepare SONs now and for the future of the NCLEX, it may be beneficial to identify a team of nursing faculty with advanced item writing expertise to collaborate with faculty in leveling items according to Bloom's taxonomy. The next step is developing NGN items that support the development of critical thinking in nursing students. This progression will build faculty confidence and prepare students with the new KSAs necessary to practice safely as an RN.

References

1. Bailey PH, Mossey S, Moroso S, Cloutier JD, Love A. Implications of multiple-choice testing in nursing education. Nurse Educ Today. 2012;32:e40–e44. doi:10.1016/j.nedt.2011.09.011
2. Tarrant M, Ware J. A comparison of the psychometric properties of three- and four-option multiple-choice questions in nursing assessments. Nurse Educ Today. 2010;30:539–543. doi:10.1016/j.nedt.2009.11.002
3. NCSBN. NCSBN NCLEX Conference. 2020. Available at https://www.ncsbn.org/14397.htm. Accessed August 14, 2020.
4. NCSBN. 2019 NCLEX-RN Test Plan. 2019. Available at https://www.ncsbn.org/2019_RN_TestPlan-English.pdf. Accessed December 1, 2020.
5. Haladyna TM, Rodriguez MC. Developing and Validating Test Items. Routledge; 2013.
6. Sutherland K, Schwartz J, Dickison P. Best practices for writing test items. J Nurs Regul. 2012;3:35–39. doi:10.1016/S2155-8256(15)30217-9
7. Tarrant M, Knierim A, Hayes SK, Ware J. The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Educ Pract. 2006;6:354–363. doi:10.1016/j.nedt.2006.07.006
8. Nedeau-Cayo R, Laughlin D, Rus L, Hall J. Assessment of item-writing flaws in multiple-choice questions. J Nurs Staff Dev. 2013;29(2):52–57; quiz E1-2. doi:10.1097/NND.0b013e318286c2f1
9. Nemec E, Welch B. The impact of a faculty development seminar on the quality of multiple-choice questions. Curr Pharm Teach Learn. 2016;8(2):160–163. doi:10.1016/j.cptl.2015.12.008
10. Naeem N, van der Vleuten C, Alfaris EA. Faculty development on item writing substantially improves item quality. Adv Health Sci Educ Theory Pract. 2012;17:369–376. doi:10.1007/s10459-011-9315-2
11. Hijji BM. Flaws of multiple choice questions in teacher-constructed nursing examinations: a pilot descriptive study. J Nurs Educ. 2017;56(8):490–500. doi:10.3928/01484834-20170712-08
12. Kranz C, Love A, Roche C. How to write a good test question: nine tips for novice nurse educators. J Contin Educ Nurs. 2019;50(1):12–14. doi:10.3928/00220124-20190102-04
13. Tarrant M, Ware J. A framework for improving the quality of multiple-choice assessments. Nurse Educ. 2012;37(3):72–76.
14. Oermann MH, Gaberson KB.Evaluation and Testing in Nursing Education. 6th ed. Springer; 2021.
15. Bristol TJ, Nelson JW, Sherrill KJ, Wangerin VS. Current state of test development, administration, and analysis: a study of faculty practices. Nurse Educ. 2018;43(2):68–72. doi:10.1097/NNE.0000000000000425
16. Billings DM, Halstead JA. Teaching in Nursing: A Guide for Faculty. 5th ed. Elsevier; 2016.
17. NCSBN. Strategic practice analysis. 2018. Available at https://www.ncsbn.org/18-Strategic-Practice-Analysis.pdf. Accessed September 2020.
18. NCSBN. NCLEX & other exams. 2020. Available at https://ncsbn.org/nclex.htm. Accessed August 20, 2020.
19. Biernacki P, Waldorf D. Snowball sampling: problems and techniques of chain referral sampling. Sociol Methods Res. 1981;10(2): 141–163. doi:10.1177/004912418101000205
20. NLN. NLN certified nurse educator. 2020. Available at http://www.nln.org/Certification-for-Nurse-Educators/cne/handbook. Accessed October 1, 2020.
21. Dreher HM, Smith Glasgow ME, Schreiber J. The use of “high-stakes testing” in nursing education: rhetoric or rigor? Nurs Forum. 2019;54(4):477–482. doi:10.1111/nuf.12363
22. Smith Glasgow ME, Dreher HM, Schreiber J. Standardized testing in nursing education: preparing students for NCLEX-RN and practice. J Prof Nurs. 2019;35(6):440–446. doi:10.1016/j.profnurs.2019.04.012
Keywords:

examinations; NCLEX-style test items; next-generation NCLEX (NGN); nursing faculty; testing


© 2021 Wolters Kluwer Health, Inc. All rights reserved.