Nurses need multiple ways of thinking to manage the increasingly complex health care environment and their expanding professional roles. These ways of thinking include clinical reasoning, critical thinking, creative thinking, scientific thinking, and formal criterial reasoning (Benner, Sutphen, Leonard, & Day, 2010; Simpson & Courtney, 2002). Benner and colleagues (2010) recommend that academia focus on clinical reasoning to prepare future nurses for their complex roles. Nurse experts agree that clinical reasoning is an essential competency for newly licensed nurses, who will need to make autonomous decisions in clinical practice (Banning, 2008a; Levett-Jones et al., 2010; Simmons, 2010).
Research indicates that nurses with good clinical reasoning skills can have a positive impact on patient outcomes. Nurses lacking clinical reasoning skills, on the other hand, often fail to recognize clinical deterioration, which can result in compromised patient safety (Lapkin, Levett-Jones, Bellchambers, & Fernandez, 2010). In fact, Lapkin and colleagues (2010) linked the top three reasons for adverse patient outcomes (failure to recognize, failure to intervene, and inappropriate management of complications) to poor clinical reasoning skills.
The synonymous use of multiple terms to describe how nurses think is one of the primary barriers to identifying effective teaching strategies and associated outcome measures. The terms clinical reasoning, clinical judgment, problem solving, critical thinking, and decision-making are often used interchangeably (Kriewaldt & Turnidge, 2013; Levett-Jones et al., 2010; Simmons, 2010). Although there is no consensus in the extant literature on an exact definition of clinical reasoning, there is agreement that it is a complex clinical decision-making process that involves discipline-specific knowledge, multiple types of thinking, and reasoning skills. Common attributes of definitions of clinical reasoning are gathering information, interpreting information, deciding on actions, and using reflection (Hoffman et al., 2011). For this integrative review, Simmons’ (2010) conceptual definition of clinical reasoning, “a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyze patient information, evaluate its significance, and weigh alternative actions” (p. 1151), is used to evaluate educational strategies and outcomes.
Another significant problem for academia is a noted gap in the literature associated with identifying what educational strategies are effective in promoting higher-level thinking in nursing. Thus, there is limited evidence to support the current methods of teaching and learning used to foster clinical reasoning (Banning, 2008b; Benner et al., 2010; Deschenes, Charlin, Gagnon, & Goudreau, 2011; Kriewaldt & Turnidge, 2013; Levett-Jones et al., 2010; Simmons, 2010). Research suggests contemporary teaching and learning approaches do not always facilitate the development of basic clinical reasoning skills (Levett-Jones et al., 2010). The development of clinical reasoning skills is often attributed to such variables as complexity of task, uncertainty of outcome, time spent in the practice setting, and the risk involved (Simmons, 2010). Finally, there is inconsistency in the literature regarding the measurement and evaluation of educational strategies that promote clinical reasoning. The purpose of this inquiry is to utilize the integrative review method of Whittemore and Knafl (2005) to explore the teaching strategies commonly used to promote clinical reasoning in nursing students and identify the outcomes or methods used to evaluate their effectiveness.
An integrative review is an appropriate method to examine and summarize past empirical literature to provide a more comprehensive understanding of the strategies used to promote clinical reasoning in nursing education and the tools used to measure effectiveness of educational approaches. The integrative review provides the opportunity to examine both qualitative and quantitative studies and include diverse methodologies. Integrative reviews are used to review theories and current evidence and to examine methodological issues (Whittemore & Knafl, 2005). The framework of Whittemore and Knafl (2005), as modified from Cooper’s (1998) process, was used in this integrative review and incorporates the following stages: problem identification, literature search, data evaluation, data analysis, and presentation. Simmons’ conceptual definition of clinical reasoning was used to explore the current instrumentation, educational strategies, and identified gaps associated with clinical reasoning in prelicensure nurses. The guiding questions for this review were as follows: 1) What types of teaching strategies or interventions are commonly used to promote clinical reasoning in nursing students? 2) What outcomes or methods are used for measuring the effectiveness of educational approaches to teach clinical reasoning in nursing education?
PubMed, CINAHL, ERIC, and PsycINFO were searched using a combination of search terms including clinical reasoning and nursing education, clinical reas* in nursing education, clinical reas* and outcome measures, clinical reas* AND nursing education, clinical reas* in nursing, clinical reasoning instruments, clinical reasoning evaluation methods, and measuring clinical reasoning. The search was limited to English language articles published in peer-reviewed journals; the initial search totaled 381 articles that fit the search criteria and purpose of the review. (See Supplemental Digital Content 1, available at http://links.lww.com/NEP/A86, for PRISMA diagram illustrating the search strategy and output.)
Publications were evaluated for relevance to the identified guiding questions. Inclusion criteria were studies from the nursing discipline, conducted with prelicensure nurses, that examined educational strategies specific to clinical reasoning and included outcome measures for those strategies. Studies were excluded if they were a) not empirical, b) written in a language other than English, c) dissertations, d) conference abstracts, e) conducted with students from other disciplines, f) conducted without prelicensure nurses, or g) not deemed relevant to contributing to an understanding of the guiding questions.
Of the 381 articles reviewed using the stated inclusion and exclusion criteria, 336 were excluded after a review of titles and abstracts. The author assessed 43 full-text studies for eligibility; 37 research studies subsequently met inclusion criteria.
The initial review of each study examined the following items: author, publication year, country of origin, sample and setting, educational strategy, research methodology, instruments, outcome measures, and the results of the study. For ease in comparison, a table with the above criteria as column headings was organized chronologically by publication date and alphabetically by last name of the authors. A final column was added with the quality score of the article for rapid interpretation of methodological rigor. (See Supplemental Digital Content 2, available at http://links.lww.com/NEP/A87, for Table A.)
The quality of data was evaluated using a research review instrument from Robbin and Asselin (2015), a revised version of the Hawker, Payne, Kerr, Hardey, and Powell (2002) instrument. Hawker and colleagues’ instrument provides evaluation criteria for nine research components: 1) abstract and title, 2) introduction and aims, 3) method and data, 4) sampling, 5) data analysis, 6) ethics and bias, 7) findings/results, 8) transferability/generalizability, and 9) implications and usefulness. The revised appraisal instrument scored the nine methodological criteria as “good,” “fair,” “poor,” or “very poor,” with an attributed numerical score of 1 (very poor) to 4 (good). Robbin and Asselin (2015) converted the scores to “high” (3.0-4.0) and “low” (1.0-2.9). The range of quality scores for the research studies included in this review was 2.0 to 4.0; no articles were excluded because of rank. Key themes that emerged from the literature were traditional teaching strategies, active learning, innovative technology, and combinations of these themes. Key themes were clustered as appropriate and summarized in narrative format.
The 37 articles selected for this integrative review represented quantitative (n = 25), qualitative (n = 5), and mixed-methods (n = 7) studies; publication dates ranged from 1998 to 2016. Most studies were from university baccalaureate nursing programs (n = 34), two were from community college nursing programs, and one represented a combination of community college and BSN programs. The reviewed studies incorporated populations of varied ages, genders, and countries of origin; however, specific sample demographic data pertaining to ethnicity (n = 8), gender (n = 14), or age (n = 19) were not reported in every study (see Table 1). Of the studies that reported ethnicity, the majority had samples that were 35 percent to 76 percent Caucasian; only two studies (Kautz, Kuiper, Pesut, Knight-Brown, & Daneker, 2005; Kautz, Kuiper, Pesut, & Williams, 2006) reported greater diversity (60 percent to 61 percent African American). The majority of studies were primarily representative of 70 percent to 100 percent women, ages 20 to 36.
The results of the 37 studies varied, with 23 reporting statistically significant increases in clinical reasoning, five reporting nonsignificant results, three with mixed results, and six with nonspecific results. Studies that used a clinical reasoning structured framework reported greater success than studies that did not use a framework (Dreifuerst, 2012; Forneris et al., 2015; Georg & Zary, 2014; Gonzol & Newby, 2013; Harmon & Thompson, 2015; Hicks Russell, Geist, & House Maffett, 2013; Hoffman et al., 2011; Jensen, 2013; Kautz et al., 2005; Tesoro, 2012). Studies without clinical reasoning frameworks reported nonsignificant, mixed, or nonspecific findings (Chan et al., 2016; Haffer & Raingruber, 1998; Hunter & Arthur, 2016; Jessee & Tanner, 2016; Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016; Khanyile & Mfidi, 2005; Le Roux & Khanyile, 2012; Murphy, 2004; Russell, McWilliams, Chasen, & Farley, 2011).
Clinical reasoning models used in the reviewed studies included the Outcome-Present State Test (OPT) model, Interactive Computer Decision Support, Think Aloud, Debriefing for Meaningful Learning, Developing Nurses’ Thinking, SAFETY, Lasater Clinical Judgment, the Virtual Patient Nursing Design Model, Newman’s theory of health as expanding consciousness, and IRUEPIC (Identify, Relate, Understand, Explain, Predict, Influence, and Control). The OPT model (n = 7) was used most frequently as a framework or educational strategy in the reviewed studies; however, no study compared the clinical reasoning models against each other to determine whether one model was superior.
Educational strategies used to promote clinical reasoning in the reviewed nursing education studies and the associated outcome measurements are listed in Table B as Supplemental Digital Content 3, available at http://links.lww.com/NEP/A88. Traditional strategies, such as clinical experiences and case studies, were implemented in combination with simulation and high quality debriefing (Dreifuerst, 2012; Forneris et al., 2015) or with a structured clinical reasoning model, such as the OPT model (Bland et al., 2009; Harmon & Thompson, 2015) and the Developing Nurses’ Thinking model (Tesoro, 2012).
Studies that reported statistically significant increases used active learning strategies such as the OPT model worksheet and Clinical Reasoning Web (Bartlett et al., 2008; Brandao de Carvalho Lira & Venicios de Oliveria Lopes, 2011; Deschenes et al., 2011; Dreifuerst, 2012; Forneris et al., 2015; Jensen, 2013; Kautz et al., 2005; Kuiper, Pesut, & Kautz, 2009; Murphy, 2004; Tesoro, 2012) or a combination of the OPT and case study (Bland et al., 2009; Harmon & Thompson, 2015). Studies that utilized simulation, problem-based learning, or clinical coaching without a structured clinical reasoning framework reported results that were not statistically significant or results that were nonspecific to clinical reasoning (Jessee & Tanner, 2016; Kubin, Fogg, Wilson, & Wilson, 2013; Le Roux & Khanyile, 2012; Lee, Lee, Lee, & Bae, 2016).
Surprisingly, studies that employed innovative educational strategies, such as serious games (virtual hospitals), virtual patients, web-based case learning, and clicker technology, were also not successful in promoting clinical reasoning. This finding was partly due to implementation issues and design flaws, such as difficulty in tracking answers (Chan et al., 2016; Russell et al., 2011), difficulty in navigating software (Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016), technical errors (Forsberg, Georg, Ziegert, & Fors, 2011; Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016), complexity of tasks (Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016), and the use of medical-based concepts (Forsberg et al., 2011).
Most studies used exams, worksheets, observations by clinical instructors, or a combination of clinical reasoning instruments and paper-based exams as methods to measure effectiveness of teaching strategies (see Supplemental Table B). Multiple exams, both faculty-created and standardized tests, were used to measure clinical reasoning, including the multiple-choice question-based exam (Forsberg et al., 2011), the Health Education Systems Inc. standardized exam (Kubin et al., 2013), TesGen (Lapkin & Levett-Jones, 2011), the Health Sciences Reasoning Test (Dreifuerst, 2012; Forneris et al., 2015), and the Script Concordance Test (SCT; Dawson, Comer, Kossick, & Neubrander, 2014; Deschenes et al., 2011). Instructors also scored students’ completed OPT model worksheets and clinical reasoning webs (Bartlett et al., 2008; Bland et al., 2009; Harmon & Thompson, 2015; Kautz et al., 2005; Kuiper, Heinrich, Matthias, Graham, & Bell-Kotwall, 2008; Kuiper et al., 2009) or completed concept maps (Adema-Hannes & Parzen, 2005; Trevisani et al., 2016) as outcome measures. Two studies compared the SCT scores of faculty to students’ scores, and results suggested a linear relationship between SCT scores and clinical experience (Dawson et al., 2014; Deschenes et al., 2011). Simulation learning outcomes were measured using think aloud and structured debriefing approaches. The think aloud approach (Burbach, Barnason, & Thompson, 2015; Khanyile & Mfidi, 2005; Lapkin & Levett-Jones, 2011) and other verbal protocol analyses (Johnsen, Fossum, Vivekananda-Schmidt, Phil, et al., 2016; Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016) promote higher-order cognitive skills and help students demonstrate evidence of clinical reasoning. Structured debriefing incorporates reflection, a proven strategy for promoting higher-level thinking (Dreifuerst, 2012; Kautz et al., 2005; Kuiper et al., 2008).
Clinical observation is often unstructured and biased and tends to have low interrater reliability (Forsberg et al., 2011). Kubin et al. (2013) suggest the type of clinical practicum is not especially important for developing clinical reasoning abilities. These researchers found that students can develop clinical reasoning skills in a variety of hospital and community settings, although the students in their study preferred the hospital setting. Although the type of clinical setting may not be important, three studies found that faculty preparation is closely tied to facilitating clinical reasoning during practicum (Hunter & Arthur, 2016; Jessee & Tanner, 2016; Stec, 2016). This finding was largely attributed to the disconnect between classroom and clinical practice (Hunter & Arthur, 2016); the use of adjunct faculty who lack adequate training in promoting higher-level thinking (Hunter & Arthur, 2016; Jessee & Tanner, 2016) or who do not understand or recognize clinical reasoning (Hunter & Arthur, 2016); delays in providing feedback to students about clinical performance (Jessee & Tanner, 2016); and inconsistent or poorly defined grading rubrics (Forsberg et al., 2011).
Other studies used surveys to measure student satisfaction, engagement, or perception (Chan et al., 2016; Georg & Zary, 2014; Hoffman et al., 2011; Le Roux & Khanyile, 2012; Levett-Jones et al., 2011; Russell et al., 2011); a few studies measured only the student’s ability to use the instrument (Bartlett et al., 2008; Johnsen, Fossum, Vivekananda-Schmidt, Fruhling, & Slettebø, 2016; Kautz et al., 2006). Studies that measured student satisfaction proposed that students who are satisfied with educational strategies and curriculum will be more engaged in learning (Hoffman et al., 2011; Le Roux & Khanyile, 2012; Levett-Jones et al., 2011). Finally, researchers who measured the student’s ability to utilize the instrument made the point that a student’s ability to reason can be affected by self-confidence, self-efficacy, and experience. Students benefit from using a structured instrument that is familiar to them, especially when they are disorganized and scattered in their attempts to concentrate under stressful conditions (Haffer & Raingruber, 1998).
Many scholars agree that clinical reasoning is an important part of competence, but a review of the literature revealed that not all teaching and learning strategies facilitate the development of even basic levels of clinical reasoning (Levett-Jones et al., 2010). Educators use similar learning strategies, such as case studies, simulation, group discussion, and reflection, to foster clinical reasoning in nursing. Although these teaching strategies may be useful in facilitating the development of clinical reasoning skills, they should be used as part of a whole-curriculum approach to teaching clinical reasoning. The literature suggests strategies are more effective when implemented as part of a clinical reasoning framework, such as the Outcome Present State Test model and clinical reasoning web (Bartlett et al., 2008; Bland et al., 2009; Georg & Zary, 2014; Harmon & Thompson, 2015; Kautz et al., 2005; Kuiper et al., 2009). Implementing a clinical course using a concept-based curriculum would allow students to develop experiential knowledge and provide an opportunity to practice skill development. At minimum, educational strategies should be developed from the key components of clinical reasoning, such as critical thinking, reasoning, reflection, data analysis, heuristics, inference, metacognition, logic, cognition, information processing, and intuition (Simmons, 2010).
The use of innovative, technology-based teaching strategies can help engage students and increase student satisfaction; however, to promote clinical reasoning, the strategies need to be delivered as part of a clinical reasoning framework. The literature suggests virtual patient methodology may be beneficial for distance learning students (Forsberg et al., 2011), an increasingly common choice for adult learners who choose to advance their nursing careers. In addition, innovative technology is a preferred learning medium for the millennial learner and is well suited to prepare students for the complex technologies used in practice. Innovative technology-based strategies, such as virtual patient simulations and health care-related gaming, may provide prelicensure, inexperienced nurses with an opportunity to practice ambiguous clinical cases that require clinical reasoning and critical intervention without harming real patients; however, more research is needed to confirm this assumption.
A variety of methods were used to measure the effectiveness of clinical reasoning strategies, including exams, student-completed worksheets, and observations by clinical instructors; however, some studies used surveys to measure student satisfaction, engagement, or perception, and two studies measured the student’s ability to use the instrument. It was noted that clinical faculty may not be adequately prepared to teach or evaluate clinical reasoning skills and that the current tools used to evaluate clinical practice are not specific to clinical reasoning. Adjunct faculty play an essential role in fostering clinical reasoning during practica and should receive adequate training to promote student success. Further research is needed to construct evaluation tools that reflect the nursing discipline’s language and national competency standards (Hunter & Arthur, 2016; Kautz et al., 2006).
IMPLICATIONS FOR FUTURE RESEARCH
Several limitations were identified in the reviewed studies. Many studies used small sample sizes (Forneris et al., 2015; Harmon & Thompson, 2015; Kautz et al., 2006; Lapkin & Levett-Jones, 2011). Other limitations included issues with the expert panel board composition (Dawson et al., 2014; Deschenes et al., 2011), selection bias (Dreifuerst, 2012), homogeneous samples (Burbach et al., 2015; Kubin et al., 2013; Stec, 2016), lack of a pretest (Kubin et al., 2013), and the use of an instrument not specifically designed for nursing (Forneris et al., 2015).
Many studies used multiple teaching strategies in tandem, for example, simulation combined with verbal protocol analysis or the think aloud approach; thus, it was difficult to determine whether all strategies contributed to clinical reasoning or whether one strategy accounted for most of the learning. Further research isolating individual strategies may allow for better analysis of the effectiveness of each strategy. There was also limited research on how to assess clinical reasoning in distance-based or online courses, where instructors have fewer opportunities to evaluate clinical reasoning skills (Forsberg et al., 2011). The literature suggests the use of technology-based strategies, such as virtual patient case studies or electronic concept mapping, may be beneficial; however, more research is needed to further evaluate these strategies.
Today’s increasingly complex health care environment requires nurses to become more advanced thinkers. Good clinical reasoning skills can have a positive impact on patient outcomes, whereas the lack of clinical reasoning often compromises patient safety and contributes to poor clinical outcomes (Lapkin et al., 2010). A more defined and universally accepted definition of clinical reasoning is needed. The synonymous use of multiple terms to describe clinical reasoning is not helpful and contributes to the difficulty of developing effective teaching strategies and assessment methods.
Clinical reasoning is a learned process that requires the use of analytical reasoning skills and experiential knowledge. Benner et al. (2010) and the Institute of Medicine (2010) recommend that academic nursing faculty use clinical reasoning and multiple ways of thinking to guide the curriculum for nurse thinking; however, more effective educational strategies and efficient assessment methods specific to the development of clinical reasoning are needed. Furthermore, valid and reliable clinical reasoning instruments to measure and evaluate teaching and learning strategies are needed to drive this educational initiative. Self-report and satisfaction surveys, although useful for gauging student engagement, do not provide a comprehensive assessment of clinical reasoning.
REFERENCES

Adema-Hannes R., & Parzen M. (2005). Concept mapping: Does it promote meaningful learning in the clinical setting? College Quarterly, 8(3), 1–7.

Banning M. (2008a). Clinical reasoning and its application to nursing: Concepts and research studies. Nurse Education in Practice, 8(3), 177–183. doi:10.1016/j.nepr.2007.06.004

Banning M. (2008b). The think aloud approach as an educational tool to develop and assess clinical reasoning in undergraduate students. Nurse Education Today, 28(1), 8–14. doi:10.1016/j.nedt.2007.02.001

Bartlett R., Bland A., Rossen E., Kautz D., Benfield S., & Carnevale T. (2008). Evaluation of the Outcome-Present State Test Model as a way to teach clinical reasoning. Journal of Nursing Education, 47(8), 337–344.

Benner P., Sutphen M., Leonard V., & Day L. (2010). Educating nurses: A call for radical transformation. San Francisco, CA: Jossey-Bass.

Bland A., Rossen E., Bartlett R., Kautz D., Carnevale T., & Benfield S. (2009). Implementation and testing of the OPT model as a teaching strategy in an undergraduate psychiatric nursing course. Nursing Education Perspectives, 30(1), 14–21.

Brandao de Carvalho Lira A., & Venicios de Oliveria Lopes M. (2011). Nursing diagnosis: Educational strategy based on problem-based learning. Revista Latino-Americana de Enfermagem, 19(4), 936–943.

Burbach B., Barnason S., & Thompson S. (2015). Using “think aloud” to capture clinical reasoning during patient simulation. International Journal of Nursing Education Scholarship, 12(1), 1–7. doi:10.1515/ijnes-2014-0044

Chan A., Chair S., Sit J., Wong E., Lee D., & Fung O. (2016). Case-based web learning versus face-to-face learning: A mixed-method study on university nursing students. Journal of Nursing Research, 24(1), 31–40. doi:10.1097/jnr.0000000000000104

Cooper H. (1998). Synthesizing research: A guide for literature reviews (3rd ed.). Thousand Oaks, CA: Sage Publications.

Dawson T., Comer L., Kossick M., & Neubrander J. (2014). Can script concordance testing be used in nursing education to accurately assess clinical reasoning skills? Journal of Nursing Education, 53(5), 281–286. doi:10.3928/01484834-20140321-03

Deschenes M. F., Charlin B., Gagnon R., & Goudreau J. (2011). Use of a script concordance test to assess development of clinical reasoning in nursing students. Journal of Nursing Education, 50(7), 381–386.

Dreifuerst K. T. (2012). Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. Journal of Nursing Education, 51(6), 326–333. doi:10.3928/01484834-20120409-02

Forneris S., Neal D., Tiffany J., Kuehn M., Meyer H., Blazovich L., … Smerillo M. (2015). Enhancing clinical reasoning through simulation debriefing: A multisite study. Nursing Education Perspectives, 36(5), 304–309. doi:10.5480/15-1672

Forsberg E., Georg C., Ziegert K., & Fors U. (2011). Virtual patients for assessment of clinical reasoning: A pilot study. Nurse Education Today, 31, 757–762.

Georg C., & Zary N. (2014). Web-based virtual patients in nursing education: Development and validation of theory-anchored design and activity models. Journal of Medical Internet Research, 16(4), e105.

Gonzol K., & Newby C. (2013). Facilitating clinical reasoning in the skills laboratory: Reasoning model versus nursing process-based skills checklist. Nursing Education Perspectives, 34(4), 265–267.

Haffer A. G., & Raingruber B. J. (1998). Discovering confidence in clinical reasoning and critical thinking development in baccalaureate nursing students. Journal of Nursing Education, 37(2), 61–69.

Harmon M., & Thompson C. (2015). Clinical reasoning in pre-licensure nursing students. Teaching and Learning in Nursing, 10, 63–70. doi:10.1016/j.teln.2014.12.001

Hawker S., Payne S., Kerr C., Hardey M., & Powell J. (2002). Appraising the evidence: Reviewing disparate data systematically. Qualitative Health Research, 12(9), 1284–1299. doi:10.1177/1049732302238251

Hicks Russell B., Geist M., & House Maffett J. (2013). SAFETY: An integrated clinical reasoning and reflection framework for undergraduate nursing students. Journal of Nursing Education, 52(1), 59–62. doi:10.3928/01484834-20121217-01

Hoffman K., Dempsey J., Levett-Jones T., Noble D., Hickey N., Jeong S., … Norton C. (2011). The design and implementation of an interactive computerized decision support framework (ICDSF) as a strategy to improve nursing students' clinical reasoning skills. Nurse Education Today, 31(6), 587–594. doi:10.1016/j.nedt.2010.10.012

Hunter S., & Arthur C. (2016). Clinical reasoning of nursing students on clinical placement: Clinical educators’ perceptions. Nurse Education in Practice, 18, 73–79.

Institute of Medicine. (2010). The future of nursing: Leading change, advancing health. Washington, DC: National Academies Press.

Jensen R. (2013). Clinical reasoning during simulation: Comparison of student and faculty ratings. Nurse Education in Practice, 13, 23–28.

Jessee M., & Tanner C. (2016). Pursuing improvement in clinical reasoning: Development of the clinical coaching interactions inventory. Journal of Nursing Education, 55(9), 495–503.

Johnsen H., Fossum M., Vivekananda-Schmidt P., Phil D., Psychol C., Fruhling A., & Slettebø A. (2016). A serious game for teaching nursing students clinical reasoning and decision-making skills. Nursing Informatics, 225, 905–906. doi:10.3233/978-1-61499-658-3-905

Johnsen H., Fossum M., Vivekananda-Schmidt P., Fruhling A., & Slettebø A. (2016). Teaching clinical reasoning and decision-making skills to nursing students: Design, development, and usability evaluation of a serious game. International Journal of Medical Informatics, 94, 39–48.

Kautz D., Kuiper R., Pesut D., Knight-Brown P., & Daneker D. (2005). Promoting clinical reasoning in undergraduate nursing students: Application and evaluation of the Outcome Present State Test (OPT) model of clinical reasoning. International Journal of Nursing Education Scholarship, 2(1). doi:10.2202/1548-923x.1052

Kautz D., Kuiper R., Pesut D., & Williams R. (2006). Using NANDA, NIC, and NOC (NNN) language for clinical reasoning with the Outcome-Present State-Test (OPT) model. International Journal of Nursing Terminologies and Classifications, 17(3), 129–138.

Khanyile T., & Mfidi F. (2005). The effect of curricula approaches to the development of the student’s clinical reasoning. Curationis, 28(2), 70–76.

Kriewaldt J., & Turnidge D. (2013). Conceptualising an approach to clinical reasoning in the education profession. Australian Journal of Teacher Education, 38(6), 103–115. doi:10.14221/ajte.2013v38n6.9

Kubin L., Fogg N., Wilson C. E., & Wilson J. (2013). Comparison of student learning among three teaching methodologies in the pediatric clinical setting. Journal of Nursing Education, 52(9), 501–507. doi:10.3928/01484834-20130819-07

Kuiper R., Heinrich C., Matthias A., Graham M., & Bell-Kotwall L. (2008). Debriefing with the OPT model of clinical reasoning during high fidelity patient simulation. International Journal of Nursing Education Scholarship, 5(1), 1–14.

Kuiper R., Pesut D., & Kautz D. (2009). Promoting the self-regulation of clinical reasoning skills in nursing students. The Open Nursing Journal, 3, 76–85.

Lapkin S., & Levett-Jones T. (2011). A cost-utility analysis of medium vs. high-fidelity human patient simulation manikins in nursing education. Journal of Clinical Nursing, 20, 3543–3552. doi:10.1111/j.1365-2702.2011.03843.x

Lapkin S., Levett-Jones T., Bellchambers H., & Fernandez R. (2010). Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: A systematic review. Clinical Simulation in Nursing, 6(6), e207–e222. doi:10.1016/j.ecns.2010.05.005

Le Roux L., & Khanyile T. (2012). A cross-sectional survey to compare the competence of learners registered for the Baccalaureus Curationis programme using different learning approaches at the University of the Western Cape. Curationis, 34(1), E1–E7. doi:10.4102/curationis.v34i1.53

Lee J., Lee Y., Lee S., & Bae J. (2016). Effects of high-fidelity patient simulation led clinical reasoning course: Focused on nursing core competencies, problem solving, and academic self-efficacy. Japan Journal of Nursing Science, 13, 20–28. doi:10.1111/jjns.12090

Levett-Jones T., Hoffman K., Dempsey J., Jeong S., Noble D., Norton C., Roche J., & Hickey N. (2010). The ‘five rights’ of clinical reasoning: An educational model to enhance nursing students' ability to identify and manage clinically ‘at risk’ patients. Nurse Education Today, 30, 515–520. doi:10.1016/j.nedt.2009.10.020

Levett-Jones T., McCoy M., Lapkin S., Noble D., Hoffman K., Dempsey J., … Roche J. (2011). The development and psychometric testing of the satisfaction with simulation experience scale. Nurse Education Today, 31, 705–710. doi:10.1016/j.nedt.2011.01.004

Murphy J. (2004). Using focused reflection and articulation to promote clinical reasoning: An evidence-based teaching strategy. Nursing Education Perspectives, 25(5), 226–231.

Robbin M., & Asselin M. (2015). Reflection as an educational strategy in nursing professional development: An integrative review. Journal for Nurses in Professional Development, 31(2), 62–72. doi:10.1097/NND.0000000000000151

Russell J., McWilliams M., Chasen L., & Farley J. (2011). Using clickers for clinical reasoning and problem solving. Nurse Educator, 36(1), 13–15. doi:10.1097/NNE.0b013e3182001e18

Simmons B. (2010). Clinical reasoning: Concept analysis. Journal of Advanced Nursing, 66(5), 1151–1158. doi:10.1111/j.1365-2648.2010.05262.x

Simpson E., & Courtney M. (2002). Critical thinking in nursing education: Literature review. International Journal of Nursing Practice, 8(2), 89–98.

Stec M. (2016). Health as expanding consciousness: Clinical reasoning in baccalaureate nursing students. Nursing Science Quarterly, 29(1), 54–61. doi:10.1177/0894318415614901

Tesoro M. G. (2012). Effects of using the developing nurses' thinking model on nursing students' diagnostic accuracy. Journal of Nursing Education, 51(8), 436–442. doi:10.3928/01484834-20120615-01

Trevisani M., Cohrs C., de Lara Soares M. A., Duarte J., Mancini F., Pisa I., & De Domenico E. (2016). Evaluation of learning in oncology of undergraduate nursing with the use of concept mapping. Journal of Cancer Education, 31, 533–540.

Whittemore R., & Knafl K. (2005). The integrative review: Updated methodology. Journal of Advanced Nursing, 52(5), 546–553.