SYSTEMATIC REVIEW PROTOCOLS

Instruments for measuring undergraduate nursing students' knowledge, attitudes and skills in evidence-based practice: a systematic review protocol

Cardoso, Daniela1,2,3,4; Santos, Eduardo1,2,3,5; Cardoso, Maria L.1,2; Oliveira, Catarina R.4,6; Rodrigues, Manuel A.1,2,3; Apóstolo, João1,2,3

Author Information
JBI Database of Systematic Reviews and Implementation Reports: August 2017 - Volume 15 - Issue 8 - p 1979-1986
doi: 10.11124/JBISRIR-2016-003218

Abstract

Background

Evidence-based practice (EBP), also referred to as evidence-informed practice,1 is defined as “clinical decision-making that considers the best available evidence; the context in which the care is delivered; client preference; and the professional judgment of the health professional”.2(p.209)

Several studies have indicated the multiple benefits of using EBP in clinical practice, such as high-value health care, improved patient outcomes, decreased health care costs and, consequently, increased quality of care.3-5 Therefore, the adoption, implementation and sustainment of EBP in healthcare organizations is becoming increasingly important due to this impact on health care quality.6-8

Several organizations, such as the World Health Organization,9 the International Council of Nurses10 and the Agency for Healthcare Research and Quality11 have recommended the implementation of EBP. These organizations claim that decision-making is simplified; uncertainty, risk and variability are reduced; and quality of care is improved. In addition, the Sicily statement on EBP has pointed out that “all health care professionals need to understand the principles of EBP, recognize EBP in action, implement evidence-based policies, and have a critical attitude to their own practice and to evidence”.12(p.4) However, due to the gap between research and practice, EBP is not yet the standard of care worldwide,3 which is often described as a problem.13

Indeed, the literature reveals several barriers to EBP implementation, such as time limitations, an organizational culture and philosophy of “that is the way we have always done it here”, leader resistance and inadequate knowledge or training to access or critically appraise evidence.3(p.6) Moreover, Aarons et al.14 pointed out that personal characteristics of frontline staff, including age, level of education, training, the level of professional experience, knowledge and attitudes toward EBP are essential to successfully implement EBP.

Studies have identified education as a strategy to promote EBP implementation, that is, to close the gap between research and practice.15-17 Furthermore, the 2003 report of the Institute of Medicine Committee on the Health Professions Education Summit stated that all professional education programs in the health area should promote the development of EBP skills.18 Undergraduate nursing curricula should also be based on EBP principles with a view to educating future nurses on EBP use in clinical practice and, consequently, improving their acquisition and further development of knowledge, attitudes and skills regarding EBP.19 Therefore, good quality instruments are required to assess the impact of educational programs on undergraduate nursing students' attitudes, knowledge and skills regarding EBP.

According to the Classification Rubric for EBP Assessment Tools in Education (CREATE), attitudes refer to “the values ascribed by the learner to the importance and usefulness of EBP to inform clinical decision-making”,20(p.4) knowledge refers to “learners’ retention of facts and concepts about EBP”20(p.5) and skills refer to “the application of knowledge, ideally in a practical setting” (Freeth et al. cited by Tilson et al.).20(p.5)

Instruments such as the EBP Evaluation Competence Questionnaire21 and the Student EBP Questionnaire22 are already used to assess undergraduate nursing students' knowledge, attitudes and skills regarding EBP. Nonetheless, information about other instruments available to measure undergraduate nursing students' knowledge, attitudes and skills regarding EBP has not yet been gathered, nor has information about their measurement properties, including internal consistency, reliability, measurement error, content validity, structural validity, hypothesis testing, cross-cultural validity, criterion validity, responsiveness and interpretability, according to the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) definitions.23

A systematic review has been conducted to identify instruments available for measuring nurses’ EBP knowledge, skills and attitudes (59 studies met the inclusion criteria in a total of 24 self-report instruments).24 However, no attempt has yet been made to synthesize the instruments available for undergraduate nursing students. An initial search of the JBI Database of Systematic Reviews and Implementation Reports, the Cochrane Database of Systematic Reviews, PROSPERO, MEDLINE and CINAHL found no systematic review (published or in progress) on the measurement properties of the instruments available for measuring undergraduate nursing students’ EBP knowledge, attitudes and skills. Therefore, there is a clear need to identify and assess the properties of these instruments and, consequently, determine the most valid and reliable one. The findings of this systematic review will help in planning the validation of promising instruments or deciding on the need to develop a new instrument.

Inclusion criteria

Types of participants

The current systematic review will consider studies that focus on undergraduate nursing students, aged 18 years or over. In this systematic review, we will consider undergraduate nursing students as students who are not yet licensed as registered nurses.

Constructs of interest

The current systematic review will consider studies that explore the following constructs: attitudes, knowledge and skills in EBP. This systematic review will consider the definition of these constructs according to CREATE, as presented below:

  • Attitudes – “the values ascribed by the learner to the importance and usefulness of EBP to inform clinical decision-making”20(p.4)
  • Knowledge – “learners’ retention of facts and concepts about EBP”20(p.5)
  • Skills – “the application of knowledge, ideally in a practical setting” (Freeth et al. cited by Tilson et al.)20(p.5)

Type of measurement instrument of interest

The current systematic review will include any type of measurement instrument, including, but not limited to, self-report questionnaires.

Outcomes

The current systematic review will include studies that consider at least one of the measurement properties (or aspects of measurement properties) of the instruments according to the operationalization and conceptualization of COSMIN.23

The COSMIN taxonomy includes three quality domains: reliability, validity and responsiveness. The reliability and validity domains contain three measurement properties each: reliability encompasses internal consistency, reliability and measurement error; and validity encompasses content validity, construct validity and criterion validity. The domain responsiveness encompasses the measurement property responsiveness.23 For more details, please see the table extracted from Noben et al.25 in Appendix I.

Types of studies

The current systematic review will consider validation studies or studies with other designs on the development of a measurement instrument or the assessment of one or more of its measurement properties.

Search strategy

The search strategy aims to find both published and unpublished studies. A three-step search strategy will be utilized in this review. An initial limited search of MEDLINE and CINAHL will be undertaken, followed by analysis of the text words contained in the title and abstract, and of the index terms used to describe the article. A second search using all identified keywords and index terms will then be undertaken across all included databases. Third, the reference lists of all identified reports and articles will be searched for additional studies. Studies published in English, Spanish and Portuguese after 1996 (the year when EBP first emerged) will be considered for inclusion in this review.26,27

The databases to be searched include: PubMed, CINAHL, Scopus, Academic Search Complete, SciELO (Scientific Electronic Library Online) and ERIC.

The search for unpublished studies will include: Banco de teses da CAPES (Brazil), RCAAP (Repositório Científico de Acesso Aberto de Portugal), OpenGrey (System for Information on Grey Literature in Europe) and Virginia Henderson Global Nursing e-Repository.

Initial keywords to be used will be:

  • “undergraduate nursing students”
  • “attitudes, knowledge and skills regarding EBP”
  • “self-report questionnaires”
  • “measurement instrument”
  • “validity”
  • “reliability”
  • “measurement properties”
  • “psychometric properties”

Assessment of methodological quality

Due to the lack of JBI tools for assessing the methodological quality of the measurement properties of instruments, the papers selected for retrieval will be assessed for methodological validity prior to inclusion in the review by two independent reviewers, using the COSMIN checklist with a four-point rating scale.28,29 Using a four-step process, the reviewers will: (i) identify the measurement properties assessed in the paper; (ii) verify if the statistical methods used in the paper are based on the Classical Test Theory or on the Item Response Theory; (iii) assess the methodological quality of the studies on the properties identified in step 1; and (iv) analyze the generalizability of the results of the studies on the properties identified in step 1. Four response options were defined for each COSMIN item (excellent, good, fair and poor). The reviewers will rate the methodological quality of each measurement property based on the principle of “worst score counts” (the lowest rating of any item in the corresponding box), as suggested by Terwee et al.29
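The “worst score counts” principle can be illustrated with a short sketch (illustrative only; the property names and item ratings below are hypothetical examples, not actual COSMIN data):

```python
# Illustrative sketch of the COSMIN "worst score counts" principle:
# the methodological quality rating for a measurement property is the
# lowest rating of any item in the corresponding COSMIN box.
# The item ratings below are hypothetical examples.

RATING_ORDER = ["poor", "fair", "good", "excellent"]  # worst to best

def worst_score_counts(item_ratings):
    """Return the lowest rating among the items of one COSMIN box."""
    return min(item_ratings, key=RATING_ORDER.index)

# Hypothetical item ratings for two COSMIN boxes
internal_consistency_items = ["excellent", "good", "good", "fair"]
reliability_items = ["good", "poor", "excellent"]

print(worst_score_counts(internal_consistency_items))  # -> fair
print(worst_score_counts(reliability_items))           # -> poor
```

This mirrors the conservative logic of the checklist: a single poorly executed methodological element limits the overall quality rating of that measurement property.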

Any disagreements that arise between the reviewers will be resolved through discussion or with a third reviewer.

Data extraction

According to the COSMIN protocol for systematic reviews of measurement properties,30 the data extracted will include the following specific details:

  • General characteristics of the instruments (construct, subscales, number of items, version, etc.).
  • Characteristics of the study populations in which the measurement properties were assessed (age, gender, setting, country, language, graduation year – information mentioned in items 1 to 6 from the COSMIN box generalizability).
  • Results of the measurement properties.
  • Evidence on the interpretability of the included questionnaires (distribution of scores, floor and ceiling effects and minimal important change – information described in items 4 to 8 of the COSMIN box interpretability).

Data will be directly extracted into tables by two independent reviewers. Authors of primary studies will be contacted to provide missing or additional data. Any disagreements that arise between the reviewers will be resolved through discussion, or with a third reviewer.

Data synthesis

Data will be synthesized by two independent reviewers through the creation of overview tables with descriptive summaries of: details of included studies, details of included instruments, methodological quality assessment of each included study and measurement properties assessed per instrument.

Whenever the studies are similar in terms of study population, setting, instrument version (e.g. language) and form of administration (assessed through the generalizability box of the COSMIN checklist), their results on a measurement property of an instrument will be synthesized through a best-evidence synthesis.30 Two independent reviewers will rate the results of the measurement properties for each study as positive, indeterminate or negative (Appendix II)31 and assign a level of evidence (strong, moderate, limited, conflicting, unknown) as proposed by the Cochrane Collaboration Back Review Group32 (Appendix III). Furthermore, if the studies are of at least fair quality, statistical pooling will be performed for reliability and correlation coefficients.
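Statistical pooling of correlation coefficients is commonly performed on Fisher's r-to-z scale. The sketch below assumes a simple fixed-effect, sample-size-weighted pooling (weight n − 3 per study); the protocol does not specify the exact pooling method, and the study correlations and sample sizes shown are hypothetical:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inverse_fisher_z(z):
    """Back-transform a z value to a correlation coefficient."""
    return math.tanh(z)

def pool_correlations(correlations, sample_sizes):
    """Pool correlations across studies with a fixed-effect,
    inverse-variance weighted average on the z scale
    (the sampling variance of z is 1 / (n - 3), so weight = n - 3)."""
    weights = [n - 3 for n in sample_sizes]
    zs = [fisher_z(r) for r in correlations]
    pooled_z = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return inverse_fisher_z(pooled_z)

# Hypothetical correlations between two EBP instruments in three studies
rs = [0.62, 0.55, 0.70]
ns = [120, 85, 150]
print(round(pool_correlations(rs, ns), 2))
```

Pooling on the z scale avoids the bias that arises from averaging bounded correlation coefficients directly; the pooled z is back-transformed to the r scale for reporting.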

Appendix I: Description of the measurement domains, properties, aspects and statistics and methods25

Appendix II: Quality criteria for the measurement property29

Appendix III: Levels of evidence for the quality of the measurement property30

Acknowledgements

The authors gratefully acknowledge the support of the Health Sciences Research Unit: Nursing (UICISA: E), hosted by the Nursing School of Coimbra (ESEnfC) and the Foundation for Science and Technology (FCT). The authors also gratefully acknowledge the English translation services of the Nursing School of Coimbra in translating the protocol.

References

1. Melnyk BM, Newhouse R. Evidence-based practice versus evidence-informed practice: a debate that could stall forward momentum in improving healthcare quality, safety, patient outcomes, and costs. Worldviews Evid Based Nurs 2014; 11 6:347–349.
2. Pearson A, Wiechula R, Court A, Lockwood C. The JBI model of evidence-based healthcare. Int J Evid Based Healthc 2005; 3 8:207–215.
3. Melnyk BM, Gallagher-Ford L, Long LE, Fineout-Overholt E. The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews Evid Based Nurs 2014; 11 1:5–15.
4. Melnyk BM. The evidence-based practice mentor: a promising strategy for implementing and sustaining EBP in healthcare systems. Worldviews Evid Based Nurs 2007; 4 3:123–125.
5. André B, Aune AG, Brænd JA. Embedding evidence-based practice among nursing undergraduates: results from a pilot study. Nurse Educ Pract 2016; 18:30–35.
6. Apóstolo J, Cardoso D, Rodrigues MA. It takes three to tango: embracing EBP. JBI Database System Rev Implement Rep 2016; 14 4:1–2.
7. Aarons GA, Ehrhart MG, Farahnak LR. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci 2014; 9 1:1.
8. Pearson A, Jordan Z, Munn Z. Translational science and evidence-based healthcare: a clarification and reconceptualization of how knowledge is generated and used in healthcare. Nurs Res Pract 2012; 2012:1–6.
9. World Health Organization. World report on knowledge for better health: strengthening health systems. Geneva, Switzerland: World Health Organization; 2004.
10. International Council of Nurses. Closing the gap: from evidence to action. 2012; Geneva, Switzerland: International Council of Nurses, Available from: http://www.icn.ch/images/stories/documents/publications/ind/indkit2012.pdf. [Accessed 26 July 2016].
11. Agency for Healthcare Research and Quality. Accelerating change and translating research into practice (TRIP)-II: fact sheet. 2001; Rockville, MD: Agency for Healthcare Research and Quality, Available from: http://archive.ahrq.gov/research/findings/factsheets/translating/tripfac/trip2fac.html. [Accessed 26 July 2016].
12. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Sicily statement on evidence-based practice. BMC Med Educ 2005; 5 1:1.
13. Oliver K, Innvar S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res 2014; 14 1:1.
14. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health 2011; 38 1:4–23.
15. Asokan G. Evidence-based practice curriculum in allied health professions for teaching-research-practice nexus. J Evid Based Med 2012; 5 4:226–231.
16. Black AT, Balneaves LG, Garossino C, Puyat JH, Qian H. Promoting evidence-based practice through a research training program for point-of-care clinicians. J Nurs Adm 2015; 45 1:14.
17. Mohsen MM, Safaan NA, Okby OM. Nurses’ perceptions and barriers for adoption of evidence based practice in primary care: bridging the gap. Am J Nurs Res 2016; 4 2:25–33.
18. Committee on the Health Professions Education Summit. Health professions education: a bridge to quality. 2003; Washington, DC: Committee on the Health Professions Education Summit, 192.
19. Melnyk BM, Fineout-Overholt E, Gallagher-Ford L, Kaplan L. The state of evidence-based practice in US nurses: critical implications for nurse leaders and educators. J Nurs Adm 2012; 42 9:410–417.
20. Tilson JK, Kaplan SL, Harris JL, Hutchinson A, Ilic D, Niederman R, et al. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ 2011; 11 78:1–10.
21. Ruzafa-Martinez M, Lopez-Iborra L, Moreno-Casbas T, Madrigal-Torres M. Development and validation of the competence in evidence based practice questionnaire (EBP-COQ) among nursing students. BMC Med Educ 2013; 13 19:1–10.
22. Upton P, Scurlock-Evans L, Upton D. Development of the Student Evidence-based Practice Questionnaire (S-EBPQ). Nurse Educ Today 2016; 37:38–44.
23. Mokkink LB, Terwee CB, Patrick DL, Alonso J, Stratford PW, Knol DL, et al. The COSMIN study reached international consensus on taxonomy, terminology, and definitions of measurement properties for health-related patient-reported outcomes. J Clin Epidemiol 2010; 63 7:737–745.
24. Leung K, Trevena L, Waters D. Systematic review of instruments for measuring nurses’ knowledge, skills and attitudes for evidence-based practice. J Adv Nurs 2014; 70 10:2181–2195.
25. Noben CYG, Evers SMAA, Nijhuis FJ, de Rijk AE. Quality appraisal of generic self-reported instruments measuring health-related productivity changes: a systematic review. BMC Public Health 2014; 14 115:1–21.
26. Closs S, Cheater F. Evidence for nursing practice: a clarification of the issues. J Adv Nurs 1999; 30 1:10–17.
27. Sackett DL, Rosenberg WM, Gray JM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. BMJ 1996; 312 7023:71–72.
28. Mokkink LB, Terwee CB, Patrick DL, Alonso J, Stratford PW, Knol DL, et al. The COSMIN checklist for assessing the methodological quality of studies on measurement properties of health status measurement instruments: an international Delphi study. Qual Life Res 2010; 19 4:539–549.
29. Terwee CB, Mokkink LB, Knol DL, Ostelo RWJG, Bouter LM, de Vet HCW. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Qual Life Res 2012; 21 4:651–657.
30. Terwee C, de Vet HCW, Prinsen CAC, Mokkink LB. Protocol for systematic reviews of measurement properties. 2011; Amsterdam: COSMIN, Available from: http://www.cosmin.nl/images/upload/files/Protocol%20klinimetrische%20review%20version%20nov%202011%281%29.pdf. [Accessed 15 July 2016].
31. Terwee CB, Bot SDM, de Boer MR, van der Windt DAWM, Knol DL, Dekker J, et al. Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol 2007; 60 1:34–42.
32. Van Tulder M, Furlan A, Bombardier C, Bouter L. Editorial Board of the Cochrane Collaboration Back Review Group. Updated method guidelines for systematic reviews in the Cochrane Collaboration Back Review Group. Spine 2003; 28 12:1290–1299.
Keywords:

Attitudes; evidence-based practice; knowledge; skills; undergraduate nursing students

© 2017 THE JOANNA BRIGGS INSTITUTE