Students’ Midprogram Content Area Performance as a Predictor of End-of-Program NCLEX Readiness

Brussow, Jennifer A., MA; Dunham, Michelle, PhD

doi: 10.1097/NNE.0000000000000499
Feature Articles

Many programs have implemented end-of-program predictive testing to identify students at risk of NCLEX-RN failure. Unfortunately, for many students, end-of-program testing comes too late. Regression and relative importance analyses were used to explore relationships between 9 content area assessments and an end-of-program assessment shown to be predictive of NCLEX-RN success. Results indicate that scores on assessments for content areas such as medical-surgical nursing and care of children are predictive of end-of-program test scores, suggesting that instructors should provide remediation at the first sign of lagging performance.

Author Affiliations: Research Scientists, Ascend Learning, LLC, Leawood, Kansas.

The authors are employed by Ascend Learning, whose subsidiary (Assessment Technologies Institute) produces the assessments discussed in this article. The authors’ compensation is not dependent on research findings. Research design, analysis, and results reporting are conducted independently from the business unit.

The authors declare no conflicts of interest.

Correspondence: Ms Brussow, Ascend Learning, LLC, 11161 Overbrook Rd, Leawood, KS 66211 (jennifer.brussow@ascendlearning.com).

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal’s Web site (www.nurseeducatoronline.com).

Accepted for publication: November 14, 2017

Published ahead of print: December 22, 2017

This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

A nursing program’s NCLEX-RN pass rate is one of the key indicators by which it is judged.1-4 Because of the impact of NCLEX-RN failure on both the institution and its students, many institutions have chosen to implement predictive testing at the close of a program to identify those students at risk of NCLEX-RN failure. This study examines an end-of-program comprehensive predictive test (CP) designed to assess student readiness and provide a probability of passing the NCLEX-RN. A body of independent research has shown that the CP is positively and significantly associated with first-time test-takers’ success on the NCLEX-RN,5-8 making it a powerful tool for assessing students’ readiness to take the NCLEX-RN. Unfortunately, for many students, end-of-program testing comes too late to allow remediation that could prevent program attrition and NCLEX-RN failure. Research indicates that the second most common point of attrition is near the end of the program, when the unsuccessful student has occupied a seat and consumed a maximum amount of valuable program resources.9 In addition, NCLEX-RN failure carries immense costs for nursing education programs, health care organizations, and graduate nurses.8,9

Given the shortage of qualified, competent RNs8,9 and declining pass rates on the NCLEX-RN,10 nursing programs are focusing on identifying at-risk students and providing remediation to encourage NCLEX-RN success.4,6,8,10-12 To support these remediation efforts, an earlier indicator of at-risk status is needed. Researchers and programs recommend adopting earlier, more frequent standardized testing to identify at-risk students sooner.4,11-13 One such testing schedule consists of a series of 9 content mastery (CMS) assessments aligned to the NCLEX-RN blueprint and designed to measure mastery of the major nursing content areas as students progress through the program. Content mastery assessments are standardized, secured, aligned to the NCLEX-RN test plan, and used by multiple programs in multiple settings, making them a strong candidate for providing schools with an early indicator of student success. To investigate the utility of these assessments in the early identification of student needs, the following 4 research questions are explored in this quasi-experimental study: (1) How are scores on the CMS assessments related to scores on the CP? (2) How does the number of CMS assessments on which a student is successful relate to scores on the CP? (3) How does the number of CMS assessments on which a student is successful relate to the probability of passing the NCLEX-RN? (4) Which CMS assessments are most predictive of scores on the CP?

Methods

The nine 2016 RN CMS assessments estimate the proficiency a student has attained in a given content area. The assessed content areas are Fundamentals, Adult Medical-Surgical Nursing (Adult Med-Surg), Leadership, Maternal-Newborn, Mental Health, Nursing Care of Children, Nutrition, Pharmacology, and Community Health. The CMS assessments contain standard 4-option multiple-choice items and 6 alternate item types: multiple response, fill-in-the-blank calculation, hot spot, chart/exhibit, drag and drop/ordered response, and graphic options. Each of the 9 CMS assessments is a fixed-length test composed of 50 to 90 scored items and 10 unscored pretest items. Across all 9 assessments, the content specifications are designed to align as closely as possible with the NCLEX-RN content specifications as well as with common course content in nursing education programs. The assessments report mastery of specific nursing content areas as well as proficiency levels that quantify a student’s level of knowledge acquisition. After completing a CMS assessment, a student can create a focused review module tailored to his/her strengths and weaknesses to guide remediation.

The 2016 RN CP assessment provides students and educators with a numeric indication of the likelihood of passing the NCLEX-RN at the student’s current level of readiness and can be used to guide remediation efforts based on the examination content missed. The CP assessment is a fixed-length test of 150 scored items and 30 unscored pretest items, and its test specifications are designed to mirror the NCLEX-RN to the greatest extent possible.

For these analyses, data were available for 19,535 students enrolled in associate degree in nursing and BSN programs in pursuit of an RN credential. This convenience sample consisted of students who had taken all 9 CMS assessments and completed the CP assessment matched to the 2016 NCLEX-RN test plan. Most programs do not administer all 9 CMS assessments; however, to allow comparisons across all assessments, only students with complete data were included.

Percent-correct scores were available for each CMS assessment as well as the CP assessment. In addition, scores on the CMS assessments were transformed into proficiency scores based on a previously conducted cut score study.14 For more information on the procedures used to derive the proficiency levels, as well as the specific cut scores for each assessment, the reader is directed to the executive summary for the standard setting study.14 For the CP assessment, percent-correct scores were converted to a probability of passing the NCLEX-RN based on the expectancy table developed for the assessment.15 All statistical analyses were conducted using R 3.4.0 (R Core Team, Vienna, Austria).
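
The sketch below illustrates the two score transformations just described. The cut scores and expectancy-table values are placeholders invented for illustration only; they are not the values reported in the cited standard setting study or CP technical manual.

```r
# Illustrative sketch of the score transformations described above.
# Cut scores and expectancy-table values are placeholders, not actual values.

# Hypothetical percent-correct cut scores for one CMS assessment
example_cuts <- c(level1 = 50, level2 = 65, level3 = 80)

# Map a percent-correct score to a proficiency level (0-3)
to_proficiency <- function(pct_correct, cuts) {
  findInterval(pct_correct, cuts)
}

# Hypothetical expectancy table mapping CP percent-correct bands to a
# predicted probability of passing the NCLEX-RN (POP)
example_expectancy <- data.frame(
  lower_pct = c(0, 60, 70, 80),
  pop       = c(0.50, 0.80, 0.96, 0.99)
)

# Map a CP percent-correct score to a POP using the expectancy table
to_pop <- function(pct_correct, expectancy) {
  expectancy$pop[findInterval(pct_correct, expectancy$lower_pct)]
}

to_proficiency(72, example_cuts)   # returns 2 with these placeholder cuts
to_pop(83, example_expectancy)     # returns 0.99 with this placeholder table
```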

Results

Relationships Between CMS Scores and CP Overall Score

To evaluate the relationship between scores on each of the CMS assessments and the CP assessment, the correlation between each CMS assessment and the overall score on the CP assessment was calculated. Results are shown in Table 1. The correlations show a moderate to strong relationship between scores on each of the CMS assessments and the CP assessment; the strongest correlation with the CP is for Adult Med-Surg (r = 0.608), and the weakest is for Fundamentals (r = 0.476).
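
As a minimal sketch of this analysis, the correlations could be computed in R as follows, assuming a data frame named scores with one percent-correct column per CMS assessment and a cp column for the CP overall score (all column names are illustrative, not an actual export format):

```r
# Illustrative column names for the 9 CMS assessments (assumed, not actual)
cms_cols <- c("fundamentals", "adult_med_surg", "leadership", "maternal_newborn",
              "mental_health", "care_of_children", "nutrition", "pharmacology",
              "community_health")

# Pearson correlation of each CMS assessment score with the CP overall score
sapply(scores[cms_cols], function(x) cor(x, scores$cp, use = "complete.obs"))
```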

Table 1

CP Assessment Outcome Means by CMS Test Successes Based on Proficiency Level

Many programs use proficiency levels, or objective measures of student accomplishment, to facilitate score interpretation. Because proficiency-level rather than percent-correct scores often inform faculty decisions, proficiency-level scores were also examined in relation to CP scores. For this analysis, success on a CMS assessment was defined as achievement at or above proficiency level 2, which indicates that a student’s performance exceeds minimum expectations and that the student is fairly certain to meet NCLEX-RN standards in the content area. This criterion reflects observed institutional usage, although institutions may adopt different definitions of success. Individuals were then grouped according to the number of CMS assessments on which they were successful (0-9), and the mean overall scores and mean Probability of Passing (POP) scores on the CP for each of these groups were compared (Figure).
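
A minimal sketch of this grouping, continuing the illustrative scores data frame above and additionally assuming one proficiency-level column per CMS assessment (the _level suffix is hypothetical) and a pop column holding the CP probability of passing:

```r
# Proficiency-level columns for the 9 CMS assessments (illustrative names)
level_cols <- paste0(cms_cols, "_level")

# Number of CMS assessments on which each student reached level 2 or above
scores$n_success <- rowSums(scores[level_cols] >= 2)

# Mean CP overall score and mean POP for each success group (0-9)
aggregate(cbind(cp, pop) ~ n_success, data = scores, FUN = mean)
```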

Figure

The pattern of CP assessment overall score group means shows that the group with the highest mean score was successful (proficiency level 2 or above) on all 9 CMS assessments. With each unsuccessful test score, the group mean overall score dropped an average of 2.12 points (range, 1.63-4.13 points), and the group mean POP dropped an average of 3.62 points (range, 1.08-8.31 points). These declines in mean CP scores are both statistically significant and meaningful in terms of students’ likelihood of passing the NCLEX-RN.
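
The average drops reported here can be recovered from the group means computed in the sketch above (still using the illustrative scores example):

```r
# Group means of CP overall score and POP by number of successful assessments
group_means <- aggregate(cbind(cp, pop) ~ n_success, data = scores, FUN = mean)

# Average change in mean CP score and mean POP per additional successful
# assessment; the drops described in the text are the corresponding decreases
# per additional unsuccessful assessment
mean(diff(group_means$cp))
mean(diff(group_means$pop))
```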

Percent of Examinees Considered Highly Likely to Pass

While mean differences in scores are important to consider, they may be difficult to interpret in terms of actual student outcomes. To facilitate a practical interpretation, students’ CP scores were dichotomized on the basis of their associated predicted probability of passing the NCLEX-RN. For this analysis, students with a 96% or greater probability of passing the NCLEX-RN were classified as “highly likely to pass.” This cutoff was intentionally set high to support the “highly likely to pass” classification and was informed by similar cutoffs used in published studies.8 Table 2 displays the percentage of students in each CMS success group who were classified as “highly likely to pass.”
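
Continuing the same illustrative example, the dichotomization and group percentages could be computed as follows; the 96% threshold is the one stated above, while the data frame and column names remain assumptions:

```r
# Flag students whose predicted probability of passing is 96% or greater
scores$highly_likely <- scores$pop >= 0.96

# Percentage of students classified "highly likely to pass" in each
# CMS success group
aggregate(highly_likely ~ n_success, data = scores,
          FUN = function(x) 100 * mean(x))
```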

Table 2

Of students who were successful on all 9 CMS assessments, 91.6% were “highly likely to pass NCLEX-RN.” This percentage decreased to 83.4% for students successful on 8 CMS assessments and to 71.6% for students successful on 7 assessments. The percentage of students in the “highly likely to pass” category continued to decrease with each additional unsuccessful CMS test (see Table, Supplemental Digital Content, http://links.lww.com/NE/A444).

CMS Scores’ Predictive Relationship to CP Assessment Scores

To investigate the contribution of individual content area assessments to the prediction of achievement on the CP assessment, a multiple regression analysis was carried out. The regression results showed that, taken together, the set of CMS examinations explains 54.6% of the variability in CP scores (R² = 0.546, P < .001). To examine the relative contribution of each examination in a set of highly correlated predictors, a relative importance analysis16 was performed using the relaimpo package.17 The Figure in Supplemental Digital Content 2 (http://links.lww.com/NE/A445) shows the percentage of accounted-for variance attributed to each of the CMS assessments.
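
A sketch of this analysis, continuing the illustrative scores data frame: lm() and the relaimpo package’s calc.relimp() with the Lindeman-Merenda-Gold (lmg) metric correspond to the approach described above, but the column names remain assumptions.

```r
library(relaimpo)

# Multiple regression of the CP overall score on the 9 CMS assessment scores
fit <- lm(cp ~ fundamentals + adult_med_surg + leadership + maternal_newborn +
            mental_health + care_of_children + nutrition + pharmacology +
            community_health, data = scores)
summary(fit)$r.squared   # proportion of CP score variance explained

# Partition the explained variance among the correlated predictors using the
# Lindeman-Merenda-Gold (lmg) metric; rela = TRUE expresses each share as a
# proportion of the total R^2
rel_imp <- calc.relimp(fit, type = "lmg", rela = TRUE)
rel_imp@lmg   # each assessment's share of the accounted-for variance
```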

The relative importance analysis shows that the 3 CMS assessments that contribute the most to the prediction of CP performance are Adult Med-Surg, Leadership, and Nursing Care of Children (17.4%, 12.4%, and 12.3% of R² accounted for, respectively), whereas Fundamentals contributes the least (8.2%). All of the CMS assessments make substantial contributions toward the prediction of CP performance.

Discussion

These analyses show that the CMS assessments are a valuable source of information about students’ content area proficiency as they progress through the program. Group CP assessment means show that when students fail to achieve proficiency level 2 or above on even 1 CMS test, the percentage of students classified as “highly likely to pass NCLEX-RN” (≥96%) drops dramatically. For educators, this should translate into a call to action. While it can be tempting to excuse poor performance on a single test as “a bad day,” these data show that doing poorly on even 1 test significantly affects students’ performance on the CP assessment, which is a proven predictor of NCLEX-RN performance. As a result, educators should provide assistance and remediation opportunities to students at the first sign of lagging performance on a CMS assessment. One recent study suggests that remediation approaches such as adaptive quizzing and faculty-to-student mentoring may be effective in this context.18

In addition, the relative importance analysis provides evidence that all of the CMS assessments contribute substantially to the prediction of CP assessment performance. While Adult Med-Surg, Leadership, and Nursing Care of Children contribute the most, all assessments significantly contribute to the accounted-for variance in CP scores. Thus, the data here suggest that all of the assessments have valuable information to contribute to our understanding of student mastery. Programs not currently using all of the CMS examinations may find that they are losing valuable information by not administering the full battery of available assessments. When programs must administer a subset of assessments because of constraints on time or resources, this relative importance analysis can help them choose assessments that are most useful in identifying student needs.

Results from this study were computed based on data from students who had taken all 9 CMS assessments, and caution should be used when generalizing to students enrolled in programs that do not use all 9 assessments. Future research could explore the predictive power of the CMS assessments for students who complete only a partial sample of the available assessments. Additional research could also include NCLEX-RN pass/fail data in the model to more directly assess the relationships between the CMS assessments, CP assessment, and NCLEX-RN.

References

1. Commission on Collegiate Nursing Education. Standards for accreditation of baccalaureate and graduate degree nursing programs: supplemental resource. http://www.aacn.nche.edu/ccne-accreditation/Supplemental-Resource.pdf. Published 2016. Accessed August 8, 2017.
2. Commission on Collegiate Nursing Education. Standards for accreditation of baccalaureate and graduate degree nursing programs. http://www.aacn.nche.edu/ccne-accreditation/Standards-Amended-2013.pdf. Published 2013. Accessed August 8, 2017.
3. National League for Nursing Commission for Nursing Education Accreditation. Accreditation standards for nursing education programs. http://www.nln.org/docs/default-source/accreditation-services/cnea-standards-final-february-201613f2bf5c78366c709642ff00005f0421.pdf. Published 2016. Accessed August 8, 2017.
4. Roa M, Shipman D, Hooten J, Carter M. The costs of NCLEX-RN failure. Nurse Educ Today. 2011;31(4):373–377.
5. Alameida MD, Prive A, Davis HC, Landry L, Renwanz-Boyle A, Dunham M. Predicting NCLEX-RN success in a diverse student population. J Nurs Educ. 2011;50(5):261–267.
6. Davenport NC. A comprehensive approach to NCLEX-RN success. Nurs Educ Perspect. 2007;28(1):30–33.
7. Grant AR. NCLEX-RN Predictor Test Scores and NCLEX-RN Success for First Attempt Test Takers [dissertation]. Minneapolis, MN: Walden University; 2015.
8. Wray K, Whitehead T, Setter R, Treas L. Use of NCLEX preparation strategies in a hospital orientation program for graduate nurses. Nurs Adm Q. 2006;30(2):162–177.
9. Serembus JF. Improving NCLEX-RN first-time pass rates: a comprehensive program approach. J Nurs Reg. 2016;6(4):38–44.
10. Culleiton AL. Remediation: a closer look in an educational context. Teach Learn Nurs. 2009;4(1):22–27.
11. Heroff K. Guidelines for a progression and remediation policy using standardized tests to prepare associate degree nursing students for the NCLEX-RN at a rural community college. Teach Learn Nurs. 2009;4(3):79–86.
12. Jacobs P, Koehn ML. Implementing a standardized testing program: preparing students for the NCLEX-RN. J Prof Nurs. 2006;22(6):373–379.
13. Simon EB, McGinniss SP, Krauss BJ. Predictor variables for NCLEX-RN readiness exam performance. Nurs Educ Perspect. 2013;34(1):18–24.
14. Assessment Technologies Institute. RN Content Mastery Series 2013 National Standard Setting Study. Leawood, KS: Assessment Technologies Institute; 2013.
15. Assessment Technologies Institute. Technical Manual for the RN Comprehensive Predictor 2016. Leawood, KS: Assessment Technologies Institute; 2016.
16. Lindeman RH, Merenda PF, Gold RZ. Introduction to Bivariate and Multivariate Analysis. Glenview, IL: Scott Foresman; 1980.
17. Grömping U. Relative importance for linear regression in R: the package relaimpo. J Stat Soft. 2006;17(1):1–27.
18. Corrigan-Magaldi M, Colalillo G, Molloy J. Faculty-facilitated remediation: a model to transform at-risk students. Nurse Educ. 2014;39(4):155–157.
Keywords:

mastery testing; nursing licensure; NCLEX-RN readiness; predictive testing; remediation; standardized testing

Copyright © 2018 Wolters Kluwer Health, Inc. All rights reserved