The assessment, diagnosis, care, and evaluation of a patient's skin integrity and existing wounds are important elements of nursing practice and should be based on best available evidence.1,2 Several decades have passed since it was first reported that wounds left to dry out heal significantly more slowly than wounds that are kept moist.3,4 Multiple studies have confirmed that using moisture-retentive dressings is associated with more expedient healing, fewer infections, greater patient comfort, reduced scarring, and reduced costs of care.5–17 Nevertheless, the use of moist gauze as a control treatment in clinical studies continues, and approximately 50% of wounds are covered with gauze.11,18 This translates into millions of patients receiving non–evidence-based wound care. For a typical managed-care plan with 100,000 covered lives in the year 2000, the cost difference between the least- and most-effective treatment modalities was $1.9 million for patients with pressure injuries and $5.8 million for patients with venous leg ulcers.14
Research examining barriers to implementing evidence-based practice (EBP) in wound care is limited. Existing evidence strongly suggests that organizational factors such as lack of time and resources, and product barriers such as confusion surrounding dressing categories and the application of care guidelines, negatively influence the delivery of optimal care.19–28 In addition to these factors, personal/end-user barriers such as lack of knowledge and awareness of evidence-based (EB) wound care practices and wound assessment are problematic.19–21,25,28–33 The roots of some of these problems can be traced to basic nursing education; studies in the United States have shown that prelicensure nursing education programs generally provide insufficient wound care education.19,25 Nursing textbooks must cover a wide array of topics, and within their skin and wound care chapters, descriptions of wound assessment and related optimal dressing selections may not be included.34–37 Rather, illustrations and descriptions of gauze-based dressings and dressing techniques remain common, and nursing students' experience with non–gauze-based dressings is limited. In 1 study, when second-year nursing students were asked to select the best practice option for wound care, fewer than 15% selected the correct response based on principles of moist wound healing.25 Evidence further suggests that limited knowledge about EB wound care may carry over into practice. For example, when given the opportunity to use an EB wound assessment and dressing selection program, nurse respondents indicated that they reserved this option for "more serious" wounds because they believed that moist wound healing strategies should be reserved for deep and/or complicated wounds.38 While a change in knowledge is not sufficient to change practice, it is a prerequisite to changing behavior.39 Studies in several countries outside the United States also support the need for expanding wound care education for nurses.19,20,40–42
Online education may be an efficient method for reaching millions of RNs, but the effect of online education on nurses' knowledge about wound assessment and EBPs has not been examined. A study of 56 certified wound care nurses found that completion of an online, interactive, wound care program significantly increased the percentage of correct and partially correct algorithmic decisions; participants also recommended use of the program by nurses without expertise in wound care.43 Based on a literature review, the authors concluded that learner satisfaction with online programs is generally good and knowledge is increased, but few high-quality studies have been conducted to determine their effectiveness.44,45 Computer-based education is increasingly used in higher education and health care organizations to replace or supplement face-to-face continuing education programs.45,46 While literature about the effects of computer-based learning in nursing may be limited, it is well established that guided learning, using tools that are interactive and engage the user, is particularly useful for adult learners.47
The purpose of this descriptive study was to evaluate a previously validated,43 online, interactive wound assessment and wound care clinical pathway in RNs. Specific aims were to (a) evaluate the percentages of correct, partially correct (safe but not fully correct), and incorrect algorithmic decisions and dressing selections; (b) compare algorithm and dressing selections between nurses who are and who are not wound care certified; and (c) evaluate the program's ease of use, educational value, and applicability in clinical practice.
The online, interactive program used in this study was based on Merrill's48 first principles of instructional design; it contains a set of 8 wound care algorithms (Solutions Wound Care Algorithms, ConvaTec, Bridgewater, New Jersey). The online program has undergone extensive content and construct validity testing.20,38,49,50 The program comprises introduction slides about the algorithms (Figure) and how to use the program, followed by 15 wound scenarios. Participants select a care pathway/algorithm and appropriate dressing(s) for each wound based on a photograph and a description of moisture in the wound bed. Participants are also able to learn more about each decision point by clicking a "Further Reading" button. Feedback is provided after the learner completes all exercises via a cumulative score measuring the proportion of correct or partially correct responses. Scores range from 0 to 100, with higher scores indicating more correct responses. Responses are scored as correct, partially correct (operationally defined as safe but not fully correct), or incorrect. In addition to their cumulative scores, respondents may review partially correct or incorrect choices and print a certificate of completion. Construct validity of the online program was established following testing by 56 expert wound care nurses.43
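As an illustration of the scoring scheme described above, the following sketch computes a cumulative 0 to 100 score from a set of exercise responses. This is hypothetical code, not the program's implementation; crediting partially correct responses equally with fully correct ones follows the stated definition of the score as the proportion of correct or partially correct responses.

```python
def cumulative_score(responses):
    """Score exercise responses on a 0-100 scale.

    Each response is 'correct', 'partial' (safe but not fully correct),
    or 'incorrect'. The score is the percentage of responses that were
    correct or partially correct, mirroring the program's stated rule.
    """
    if not responses:
        return 0.0
    credited = sum(r in ("correct", "partial") for r in responses)
    return 100.0 * credited / len(responses)


# Example: 12 correct, 1 partially correct, and 2 incorrect responses
# across the 15 wound scenarios.
responses = ["correct"] * 12 + ["partial"] + ["incorrect"] * 2
print(round(cumulative_score(responses), 1))  # 86.7
```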
The target population was licensed health care professionals practicing in the United States. Convenience and snowball sampling methods were used to prospectively recruit participants, and all licensed health care professionals practicing in the United States were eligible to participate. However, analysis for this study was limited to data from RN participants. Potential participants had to confirm that they were practicing licensed health care professionals before being able to create a username and password to access the program. All participants provided written informed consent by clicking on the “opt-in/yes” button and were encouraged to print a copy of the consent form for their records.
Retrospectively analyzed data were collected under institutional review board approval from La Salle University School of Nursing. West Chester University institutional review board approval was obtained for prospective data collection. All data were collected anonymously by the program on a separate secure, password-protected server on the World Wide Web. Participants were not asked to provide their name or any personal information (eg, e-mail address) at any time. All demographic, user survey, and exercise result variables were coded for export into Excel software (Microsoft, Redmond, Washington). I did not collect IP addresses; rather, participants received a unique user ID after creating a user account. This ID was not connected to any user-identifiable information but was needed to enable participants to log back in to their own program. I collected and analyzed all study data under an agreement with the sponsoring organization (ConvaTec, Bridgewater, New Jersey) that provided IT support during the prospective data collection phase.
After providing informed consent, participants were asked to complete the demographics and practice setting questionnaire. Demographic and practice environment variables included gender, age, wound care certification, level of highest education, years of clinical experience, approximate number of patients with wounds per year, primary practice site, and state in which they practice. User choices of wound assessment and dressing selection variables/algorithm steps were recorded by the program, which was coded to classify choices as correct, partially correct, or incorrect. Partially correct was operationally defined as an algorithm/assessment or treatment choice deemed safe but not optimal. After completing the exercises, participants were asked to complete a 4-item survey that queried ease of use of the computer program, ease of use of the algorithm, educational value, and applicability in clinical practice. Responses were based on a rating scale of 1 (indicating not at all) to 5 (indicating very). All variables, including the number of participants who opted out of the study, were recorded by the program.
Data from all nurses who consented to participate and completed all 15 wound assessments were extracted from the data set and included in the analysis. Practice site and state data were grouped by geographic region and individual algorithm and dressing choices were coded. Descriptive statistics were used to analyze the demographic, survey, and algorithm and dressing selection outcomes variables. A 2-sample Student t test assuming unequal variances was used to compare algorithm and dressing selection outcomes and survey responses between nurses who were and nurses who were not wound care certified. Individual correct algorithm selections for the first 6 wounds were compared to the last 6 wounds using a paired t test.
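The comparisons described above can be reproduced with standard statistical software. The sketch below, using Python's scipy, illustrates both tests on simulated data; the score distributions are assumptions for illustration only, loosely patterned on the reported group sizes and means, and are not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-nurse percent-correct scores for the two groups
# (illustrative values only).
certified = rng.normal(loc=89, scale=13, size=109)
noncertified = rng.normal(loc=78, scale=19, size=309)

# 2-sample Student t test assuming unequal variances (Welch's t test),
# as used to compare certified vs noncertified nurses.
t_welch, p_welch = stats.ttest_ind(certified, noncertified, equal_var=False)

# Paired t test: each nurse's correct count on the first 6 vs last 6
# algorithms (simulated here as binomial counts out of 6).
first6 = rng.binomial(n=6, p=0.74, size=418)
last6 = rng.binomial(n=6, p=0.81, size=418)
t_paired, p_paired = stats.ttest_rel(first6, last6)
```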
Four hundred twenty health care professionals agreed to participate prior to the prospective data collection phase, and 119 agreed during the 3-month prospective data collection period. Four hundred fifty-one participants (84%) completed all exercises, including 418 RNs who comprised the sample for this study. The average age of nurse participants was 46 years (SE: 0.54, median: 47.5), and the majority (368, 88%) were female (Table 1). One hundred nine (26%) had received formal wound care education and were certified in wound care. The majority (77, 70%) were certified by the Wound Ostomy Continence Nursing Certification Board (eg, certified wound ostomy care nurse, certified wound care nurse, or certified WOC nurse). Most nurses (277, 66%) practiced in an inpatient acute care setting located in the Northeastern United States.
Algorithmic Decisions and Dressing Selection Response Rates
Based on photographs and moisture descriptions, the mean proportion of fully correct selections was 77% (SE: 1.09, 95% confidence level [CL]: 2.14). When partially correct choices were added, the proportion rose to a mean of 81% (SE: 0.88, 95% CL: 1.73, range: 13.3%-100%). The percentages of fully and partially correct dressing choices were 78.1% (SE: 0.70, 95% CL: 1.39) and 0.5% (SE: 0.16, 95% CL: 0.31), respectively, for a total of 78.6% (SE: 0.71, 95% CL: 1.40, range: 20%-100%). The mean proportion of correct/partially correct algorithm choices was significantly higher for nurses certified in wound care versus those who were not (M: 89.2, SE: 1.27 vs M: 77.8, SE: 1.102; t = 6.76, P < .001). Similarly, correct dressing selection scores were significantly higher for nurses who were, compared to nurses who were not, wound care certified (M: 84.2, SE: 1.18 vs M: 76.6, SE: 0.86; t = 5.15, P < .001) (Table 2).
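The reported interval half-widths can be checked against the standard errors: for a large sample, a 95% confidence interval extends roughly 1.96 standard errors on either side of the mean. Interpreting the reported "95% CL" values as such half-widths is an assumption about the notation, but it reproduces the published numbers.

```python
def half_width_95(se):
    """Approximate half-width of a large-sample 95% confidence interval."""
    return round(1.96 * se, 2)

print(half_width_95(1.09))  # 2.14, as reported for the 77% mean
print(half_width_95(0.88))  # 1.72, close to the reported 1.73
```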
Of the 6270 algorithm choices made, an incorrect amount of wound moisture was selected 353 times (6%), and in 250 instances (4%), a partially correct amount of wound moisture was chosen. The remaining incorrect selections were based on an incorrect assessment of the amount of necrotic tissue in the wound (845, 13%); incorrect assessment of necrotic tissue accounted for the largest share of all partially correct and incorrect responses (845 of 1448, 58%).
The number of correct algorithm selections for the first 6 algorithms (n = 2508) was 1860 (74%) compared to 2023 (81%) for the last 6 algorithm choices. The difference between fully correct choices for the first compared to the last 6 algorithms was statistically significant (M: 310, SE: 0.02 vs M: 337, SE: 9.32; paired t = −2.05, P = .04).
Ease of Use, Educational Value, and Clinical Applicability Survey Responses
One hundred forty-four of 418 participants (24%) completed the 4-item survey that queried ease of use and clinical applicability. The mean scores for ease of use of the online program and algorithm were 4.22 (SE: 0.08) and 4.14 (SE: 0.08), respectively, out of a possible 5 (Table 3). The mean scores for educational value and applicability in clinical practice were 4.22 (SE: 0.08) and 4.19 (SE: 0.08), respectively. Analysis revealed no significant differences in rating scores based on certification in wound care (P = .49). Similarly, no significant differences based on participant wound care certification were observed for ease of use scores (M: 4.122, SE: 0.15 vs M: 4.16, SE: 0.1; t = −0.2, P = .4), educational value (P = .4), clinical applicability (P = .39) or overall usefulness (P = .329).
The need for effective wound care education for nurses is well documented, and online education may help fulfill this need.19,20,21,25,28,29,31,32,51,52 I evaluated responses from 418 RNs, most of whom were working in acute care facilities and did not hold wound care certification, to a validated online education program and algorithm for wound assessment and dressing selection. Based on a wound photograph and a description of the amount of wound moisture, nurses selected a safe, EB dressing option for 15 wounds a little over 75% of the time. Participants who were wound care certified scored significantly higher than nurses who were not certified. This finding, and the similarity between these results and previous studies using the algorithms or algorithm program, provides further support for the construct validity of this online program.20,43
Assessments and Algorithm Selections
The most common areas associated with partially correct and incorrect responses were related to the amount of moisture and the presence of necrotic tissue. The type and amount of wound exudate (moisture) and the type of tissue in the wound bed are important drivers of the plan of care and the resultant dressing selection.5,33,53 Essential wound assessment variables that require monitoring and guide treatment included in the algorithm used in this online program were based on the Bates-Jensen Wound Assessment Tool, previously called the Pressure Sore Status Tool.20,53–55 To provide EB wound care and maintain a moist wound environment,5–9 a dressing that absorbs, donates, or maintains moisture must be selected based on the assessed amount of exudate. Because the amount of wound moisture cannot be ascertained from a photograph, this information was provided in a text box. Nevertheless, 10% of the 6270 algorithm selections differed from the information provided. While it is possible that participants decided to override the given answer because they did not agree with the stated level of moisture in the text box, I believe it much more likely that participants simply did not see or read the text. Because framing is an important component of computer-based learning tools,56 each screen contains a visual of the algorithm pathway and the selections made, in addition to the wound photograph itself and a decision selection box (Figure). Whether this combination resulted in too much information on the screen itself is unknown. Alternatively, the text font size or color may have been less than optimal, resulting in learners failing to incorporate moisture information into subsequent decisions.56 This finding is an important reminder about the importance of testing computer-based learning programs using large samples of end users. The potential design error on this screen was not evident until the program was tested by hundreds of nurses.
Assessment of the presence and amount of necrotic tissue (<25% or >25%) is essential because it determines the need for wound debridement. In this study, an incorrect necrotic tissue assessment was made 845 times (13%) accounting for the largest proportion (58%) of partially correct or incorrect wound care path/dressing choices.
Detailed information about which aspects of the wound assessment process are most challenging for nonexpert nurses, as well as data about the validity and reliability of most wound and pressure injury assessment instruments, is limited.57 Information about health care professionals' ability to assess wound necrotic tissue is also sparse and shows considerable variation. One study evaluated the inter- and intrarater reliability of 4 assessors using a photographic wound assessment instrument with 95 wound photographs. After completing training on use of the instrument, the intrarater reliability for amount of necrotic tissue ranged from 0.65 to 0.90, and the interrater reliability for this variable was 0.70.58 In another study, intrarater variability for assessing percentage of devitalized tissue by a dermatologist and dermatology residents in 31 wounds was 15%.59 Terris and colleagues60 asked 2 wound care nurses to assess 31 wounds in 15 patients. Interrater agreement was fair for the presence of yellow or brown tissue with slough and substantial for the presence of eschar. When Buckley and colleagues21 asked 33 home health care nurses to assess 10 wound photographs, they found an 85% average correct rating for the presence of slough and a 92% correct identification of eschar.
The small sample sizes of these studies may have contributed to the observed variability but, as noted by Buckley and colleagues,21 misunderstanding about terminology also may have influenced results. Concerns about the validity and reliability of commonly used wound terms, identified many years ago, have not been resolved.49 Wound assessment descriptions and mnemonics have been developed with limited, if any, testing.57 When originally tested using the Pressure Sore Status Tool instrument, neither necrotic tissue type nor amount had low inter- or intrarater reliability scores,54 and the total item correlation for these items was 0.73 when the tool was translated and tested by 102 nurses in Turkey.61 Moreover, the Content Validity Index of the necrotic tissue amount item in these algorithms was more than 0.8 when rated by expert as well as nonexpert nurses in 2 studies.20,49 However, Beitz and van Rijswijk20 evaluated the algorithm used in this educational program and reported that the proportion of correct choices made by nonexpert nurses for wounds with necrotic tissue was much lower than the proportion correct for wounds without necrotic tissue (average 59% vs 80%). Similarly, in this study, the percentage of correct/partially correct algorithm choices, based on wound moisture level and necrotic tissue assessment, was significantly lower for nonexpert than for wound expert nurses. Therefore, I assert that classification and assessment of wound necrotic tissue should be an important component of wound education programs. I also believe that it is important to educate nurses using classification and assessment terminology that has been shown to be valid and reliable.
Computer-based learning is increasingly used by health care and educational organizations,45,46 but many computer-based learning programs consist of digitized text and photographs. The first of Merrill's48 4 principles of instructional design holds that learners should be engaged in solving real-world problems. Evidence suggests that situational or problem-based interactive e-learning is an effective method to improve novice learners' performance, and interactive education has a more positive effect on improving EBP than didactic education.62–64
Learning outcomes were not assessed in this study but the difference between fully correct choices for the first compared to the last 6 wounds was statistically significant (74% vs 81%). While a difference is to be expected as a result of a natural program learning curve, the statistical significance of the difference suggests that participant learning occurred. In the previous study using these algorithms, the percentage of correct and partially correct algorithm choices increased only slightly, but the difference was statistically significant.20
Interactive technologies are generally well received by nurses and are consistent with principles of adult learning theory.47,65 The online educational program evaluated in this study was rated highly for ease of use, as was the algorithm within the program (average scores >4.1 on a scale of 1-5). Both wound care–certified and nonexpert nurses indicated that the program was valuable for education and applicable to clinical practice (mean and mode scores were 5 for both items in both groups). I acknowledge that the response rate to the survey was relatively low (24%). This outcome may have been influenced by the placement of the survey within the program: it appeared at the end of the exercises, next to a summary of the algorithm and outcomes, and below a box enabling the participant to print a certificate of completion. After downloading the certificate, participants may not have returned to that last window, or they may not have seen the survey option because they focused on the results.
Many participants did not complete all exercises and the reasons for failing to complete the program are not known. Participants were able to complete the exercises during a second session but, in order to ensure confidentiality, usernames were not collected by the program. Therefore, the program could not retrieve the participant's username if forgotten when attempting to log in and complete the exercises. Not knowing why participants failed to complete all exercises is a limitation of this study and may have biased the results.
The geographic distribution of study participants heavily favored the Northeast region of the United States, most likely due to the sampling method used and my geographic location. This disproportionate distribution of participants may affect the external validity of the findings. In addition, the availability of the Web site was not widely promoted, and no external incentive for nurses to participate was provided. The influence of these factors on the external validity of the study is not known.
IMPLICATIONS FOR PRACTICE AND RESEARCH
Increased emphasis on the quality (and testing) of clinical guidelines, simple guideline tools, decision aids, and computerized clinical support systems may increase adoption of EB wound care practices.5,64,66 However, as shown in the studies using these algorithms, any necessary changes must be carefully considered and tested to maintain instrument validity.
Study findings suggest that RNs enjoy interactive e-learning and that it exerts a positive effect on learning, which may improve EBP. The program used in this study incorporated Merrill's48 first principles of instructional design, facilitating the immediate application of new knowledge. The nurse participants possessed some knowledge related to wound assessment and care; the photographs activated that knowledge, while the algorithmic pathways provided visual maps and crucial relationships supporting clinical decision making.43 Interactive e-learning programs also may be used to help nursing students practice their wound assessment and dressing selection skills.
Studies examining the role of potential barriers and facilitators for adopting EB wound care practices across the continuum of care are needed. Existing evidence suggests that time (including time for education), administrative/leadership and/or organizational support, lack of awareness/knowledge about available evidence, lack of wound care knowledge, and communication are barriers to adoption of EB wound care practices.19–21,25–29,31,32,52,67,68
Additional research examining the validity and reliability of commonly used wound assessment terms and nonexpert nurses' ability to assess wounds and make EB dressing selections is also needed. A solid foundation of valid and reliable definitions is a prerequisite for progress in all areas of wound care. I believe that the value of educational programs will be enhanced when common wound assessment knowledge deficits are known, and additional research in this area is needed, including secondary analysis of data from this study.
Finally, more research is needed to optimize the design and usability of this and other e-learning programs and to test their effectiveness. Seemingly minor design issues, such as the placement or font type/box color of one important source of information in the program tested, may have influenced findings. While use of e-learning and computer-assisted methods is rapidly increasing, evidence of their effectiveness for nurse and student nurse learning is limited and many traditional and e-learning programs remain focused on teaching and content instead of learning and thinking.69,70 The imperative to teach EBP is directly linked to the need for building evidence-based teaching capacity.71 Research is needed to provide the tools educators need to develop EB content and use EB delivery methods and principles of instructional design.
Four hundred eighteen nurses assessed 15 wound photographs in an EB, online, interactive wound assessment and care algorithm program. The majority of algorithmic decisions (M: 81%) and dressing selections (M: 78.6%) were correct or partially correct. In addition, the percentage of correct answers increased significantly from the first to the last 6 assessments. These findings strongly suggest that the e-learning program may facilitate learning and help nurses make EB wound care decisions. Most incorrect choices were based on an incorrect necrotic tissue assessment, suggesting the need for more research and education about this wound assessment variable. Participants rated the program and algorithms as very easy to use, valuable for education, and applicable in clinical practice. Additional research is needed to uncover other potential gaps in nurses' wound care knowledge that may hamper the adoption of EB practice and to develop effective, EB education delivery techniques to optimize care.
The author gratefully acknowledges all colleagues who volunteered to participate in the study and complete all exercises and the support of ConvaTec to further their collective wound care and education knowledge by providing Web-based support and donating the data. In addition, this project would not have been possible without the participant recruiting efforts of many colleagues, Health Management Communications, LtD, and faculty members at the W. Cary Edwards School of Nursing at Thomas Edison State University and the West Chester University School of Nursing.
1. American Nurses Association. Scope and Standards of Practice: Nursing. 2nd ed. Silver Spring, MD: American Nurses Association; 2010.
2. Patton RM. Is diagnosis of pressure ulcers within an RN's scope of practice? [Editorial] Am Nurse Today. 2010;5(1):20.
3. Gilje O. On taping (adhesive tape treatment) of leg ulcers. Acta Derm Venereol (Stockholm). 1949;28:454–467.
4. Winter GD. Formation of the scab and the rate of epithelialization of superficial wounds in the skin of the young domestic pig. Nature. 1962;193:293–294.
5. Bolton LL, Girolami S, Corbett L, van Rijswijk L. The association for the Advancement of Wound Care (AAWC) venous and pressure ulcer guidelines. Ostomy Wound Manage. 2014;60(11):24–66.
6. National Pressure Ulcer Advisory Panel, European Pressure Ulcer Advisory Panel, Pan Pacific Pressure Injury Alliance. Prevention and Treatment of Pressure Ulcers: Clinical Practice Guideline. 2nd ed. Washington, DC: National Pressure Ulcer Advisory Panel; 2014.
7. Registered Nurses' Association of Ontario. Assessment and management of Pressure Injuries for the Interprofessional Team. 3rd ed. Toronto, Ontario, Canada: Registered Nurses' Association of Ontario; 2016.
8. Registered Nurses' Association of Ontario. Assessment and Management of Foot Ulcers for People with Diabetes. 2 ed. Toronto, Ontario, Canada: Registered Nurses' Association of Ontario; 2013.
9. Qaseem A, Humphrey LL, Forciea MA, Starkey M, Denberg TD; Clinical Guidelines Committee of the American College of Physicians. Treatment of pressure ulcers: a clinical practice guideline from the American College of Physicians. Ann Intern Med. 2015;162(5):370–379.
10. Bolton LL, Monte K, Pirone LA. Moisture and healing: beyond the jargon. Ostomy Wound Manage. 2000;46(1A suppl):51S–62S.
11. Cowan LJ, Stechmiller J. Prevalence of wet-to-dry dressings in wound care. Adv Skin Wound Care. 2009;12:567–573.
12. Harrison MB, Graham ID, Brandys T. Leg ulcer care in the community, before and after implementation of an evidence-based service. CMAJ. 2015;172(11):1447–1452.
13. Heyneman A, Beele H, Vanderwee K, Defloor T. A systematic review of the use of hydrocolloids in the treatment of pressure ulcers. J Clin Nurs. 2008;17(9):1164–1173.
14. Kerstein MD, Gemmen E, van Rijswijk L, et al. Cost and cost effectiveness of venous and pressure ulcer protocols of care. Dis Manage Health Outcomes. 2001;9(1):651–663.
15. McIsaac C. Closing the gap between evidence and action: how outcome measurement informs the implementation of evidence-based wound care practice in home care. Wounds
16. Ovington L. Hanging wet-to-dry dressings out to dry. Home Healthc Nurse. 2001;19(8):477–483.
17. Wiechula R. The use of moist wound-healing dressings in the management of split-thickness skin graft donor sites: a systematic review. Int J Nurs Pract. 2003;9(2):S9–S17.
18. Armstrong MH, Price P. Wet-to-dry dressings: fact and fiction. Wounds
19. Ayello E, Baranoski S. 2014 survey results: wound care and prevention. Adv Skin Wound Care. 2014;27(8):371–380.
20. Beitz JM, van Rijswijk L. A cross-sectional study to validate wound care algorithms for use by registered nurses. Ostomy Wound Manage. 2010;46(4):46–59.
21. Buckley KM, Tran BQ, Adelson LK, Agazio JG, Halstead L. The use of digital images in evaluating home care nurses' knowledge of wound assessment. J Wound Ostomy Continence Nurs. 2005;32(5):307–316.
22. Dugdall H, Watson R. What is the relationship between nurses' attitude to evidence based practice and the selection of wound care procedures? J Clin Nurs. 2009;18:1442–1450.
23. Eskes AM, Storm-Versloot MN, Vermeulen H, Ubbink DT. Do stakeholders in wound care prefer evidence-based wound care products? A survey in the Netherlands. Int Wound J. 2012;9(6):624–632.
24. Helberg D, Mertens E, Halfens RJ, Dassen T. Treatment of pressure ulcers: results of a study comparing evidence and practice. Ostomy Wound Manage. 2006;52(8):60–72.
25. Huff JM. Adequacy of wound education in undergraduate nursing curriculum. J Wound Ostomy Continence Nurs. 2011;38(2):160–164.
26. Lloyd-Vossen J. Implementing wound care guidelines: observations and recommendations from the bedside. Ostomy Wound Manage. 2009;55(6):50–55.
27. Mitchell SA, Fisher CA, Hastings CE, Silverman LB, Wallen GR. A thematic analysis of theoretical models for translational science in nursing: mapping the field. Nurs Outlook. 2010;58(6):287–300.
28. Stremitzer S, Wild T, Hoelzenbein T. How precise is the evaluation of chronic wounds by health care professionals? Int Wound J. 2007;4(2):156–161.
29. Cook L. Wound assessment: exploring competency and current practice. Br J Community Nurs. 2011;16(suppl 12):S34–S40.
30. Greatrix-White S, Moxey H. Wound assessment tools and nurses' needs: an evaluation study. Int Wound J. 2013;12(3):293–301.
31. Gillespie BM, Chaboyer W, Allen P, Morely N, Nieuwenhoven P. Wound care practices: a survey of acute care nurses. J Clin Nurs. 2014;23(17-18):2618–2627.
32. Tickle J. Wound exudate: a survey of current understanding and clinical competency. Br J Nurs. 2016;25(2):102–109.
33. van Rijswijk L, Eisenberg M. Wound assessment and documentation. In: Krasner D, van Rijswijk L, eds. Chronic Wound Care: The Essentials e-Book. Malvern, PA: HMP; 2018:29–46.
34. Berman A, Snyder SJ, Frandsen G. Skin integrity and wound care. In: Berman A, Snyder SJ, Frandsen G, eds. Kozier & Erb's Fundamentals of Nursing: Concepts, Process, and Practice. 10th ed. Boston, MA: Pearson Education Inc; 2016:828–864.
35. Craven R, Hirnle C, Jensen S. Skin integrity and wound healing. In: Craven R, Hirnle C, Jensen S, eds. Fundamentals of Nursing, Human Health and Function. 7th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2013:925–979.
36. Pieper B, Keves-Foster MK, Zugcic M, Albdour M, Albdour D. A cross-sectional, descriptive, quality improvement project to assess undergraduate nursing students' clinical exposure to patients with wounds in an introductory nursing course. Ostomy Wound Manage. 2016;62(4):20–29.
37. Pittman J, Osborn KS. Caring for the patient with wounds. In: Osborn KS, Wraa CE, Watson AB, Holleran R, eds. Medical Surgical Nursing. Upper Saddle River, NJ: Pearson Education, Inc; 2014:1889–1923.
38. Bolton L, McNees P, van Rijswijk L, et al. Wound-healing outcomes using standardized assessment and care in clinical practice. J Wound Ostomy Continence Nurs. 2004;31(2):65–71.
39. Terhaar MF, Wilson ML. Education and translation. In: White KM, Dudley-Brown S, Terhaar MF, eds. Translation of Evidence Into Nursing and Health Care. 3rd ed. New York, NY: Springer Publishing Company; 2016:211–224.
40. Zarchi K, Latif S, Haugaard VB, Hjalager IRC, Jemec GBE. Significant differences in nurses' knowledge of basic wound management—implications for treatment. Acta Derm Venereol. 2013;94(9):403–407.
41. Thomas A. Assessment of nursing knowledge and wound documentation. Wound Pract Res. 2012;20(3):142–158.
42. Ross GC, Tuovinen JE. Deep versus surface learning with multimedia in nursing education: development and evaluation of wound care. Comput Nurs. 2001;19(5):213–223.
43. Beitz JM, van Rijswijk L. Development and validation of an online interactive, multimedia wound care algorithms program. J Wound Ostomy Continence Nurs. 2011;39(1):23–34.
44. Bloomfield JG, While AE, Roberts JD. Using computer assisted learning for clinical skills education in nursing: integrative review. J Adv Nurs. 2008;63(3):222–235.
45. Petty J. Interactive, technology-enhanced self-regulated tools in healthcare education: a literature review. Nurse Educ Today. 2012;33(1):53–59.
46. Durkin GJ. A comparison of the effectiveness of computer-based learning courses among nursing staff. J Nurses Staff Dev. 2008;24(2):62–66.
47. Knowles M. The Modern Practice of Adult Education. Englewood Cliffs, NJ: Cambridge Adult Education; 1980.
48. Merrill MD. First principles of instruction. Educ Tech Res Dev. 2002;50(3):43–59.
49. Beitz JM, van Rijswijk L. Using wound care algorithms: a content validation study. J Wound Ostomy Continence Nurs. 1999;26(5):238–249.
50. Ohura T, Sanada H, Mino Y. Clinical activity-based cost effectiveness of traditional versus modern wound management in patients with pressure ulcers. Wounds
51. Ashton J, Price P. Survey comparing clinicians' wound healing knowledge and practice. Br J Nurs. 2006;15(19):S18–S26.
52. Deeth M, Grothier L. Wound bed preparation: a survey of general nurses' understanding. Br J Nurs. 2016;25(12):566–570.
53. Bates-Jensen B. Assessment of the patient with a wound. In: Doughty DB, McNichol LL, eds. Wound Ostomy and Continence Nurses Society Core Curriculum Wound Management. Philadelphia, PA: Wolters Kluwer; 2016:36–68.
54. Bates-Jensen BM, Vredevoe DL, Brecht ML. Validity and reliability of the pressure sore status tool. Decubitus. 1992;5(6):20–28.
55. Bates-Jensen BM, McNees P. Toward an intelligent wound assessment system. Ostomy Wound Manage. 1995;41(7A suppl):80S–86S.
56. Lau KHV. Computer-based teaching module design: principles derived from learning theories. Med Educ. 2014;48(3):247–254.
57. Pillen H, Miller M, Thomas J, Puckridge P, Sandison S, Spark JI. Assessment of wound healing: validity, reliability and sensitivity of available instruments. Wound Pract Res. 2009;17(4):208–217.
58. Thompson N, Cordey L, Bowles H, Parslow N, Houghton P. Reliability and validity of the revised photographic wound assessment tool on digital images taken of various types of wounds. Adv Skin Wound Care. 2013;26(8):360–374.
59. Laplaud AL, Blaizot X, Gaillard C, et al. Wound debridement: comparative reliability of three methods for measuring fibrin percentage in chronic wounds. Wound Repair Regen. 2010;18(1):13–20.
60. Terris DD, Woo C, Jarczok MN, Chester HH. Comparison of in-person and digital photograph assessment of stage III and IV pressure ulcers among veterans with spinal cord injuries. J Rehabil Res Dev. 2011;48(3):215–224.
61. Karahan A, Toruner EK, Ceylan A, Abbasoglu A, Tekindal A, Buyukgonenc L. Reliability and validity of a Turkish language version of the Bates-Jensen wound assessment tool. J Wound Ostomy Continence Nurs. 2014;41(4):340–344.
62. Feng JY, Chang YT, Chang HY, Erdley WS, Lin CH, Chang YJ. Systematic review of effectiveness of situated e-learning on medical and nursing education. Worldviews Evid Based Nurs. 2013;10(3):174–183.
63. Kim SC, Brown CE, Fields W, Stichler J. Evidence-based practice-focused interactive teaching strategy: a controlled study. J Adv Nurs. 2009;65(6):1218–1227.
64. Titler MG. Translation science and context. Res Theory Nurs Pract. 2010;24(1):35–55.
65. Vogt MA, Schaffner BH. Evaluating interactive technology for an evolving case study on learning and satisfaction of graduate nursing students. Nurs Educ Pract. 2016;19:79–83.
66. Institute of Medicine. Clinical Practice Guidelines We Can Trust. Washington, DC: The National Academies Press; 2011.
67. Clarke HF, Bradley C, Whytock S, Handfield S, van der Wal R, Gundry S. Pressure ulcers: implementation of evidence-based nursing practice. J Adv Nurs. 2004;49(6):578–590.
68. Padula WV, Makic MBF, Wald HL, et al. Hospital-acquired pressure ulcers at academic medical centers in the United States, 2008-2012: tracking changes since the CMS nonpayment policy. Jt Comm J Qual Patient Saf. 2015;41(6):257–263.
69. Lahti M, Hätönen H, Välimäka M. Impact of e-learning on nurses' and student nurses' knowledge, skills, and satisfaction: a systematic review and meta-analysis. Int J Nurs Stud. 2014;51(1):136–149.
70. Valiga TM. Nursing education trends: future implications and predictions. Nurs Clin North Am. 2012;47(4):423–434.
71. Kalb KA, O'Conner-Von SK, Brockway C, Rierson CL, Sendelbach S. Evidence-based teaching practice in nursing education: faculty perspectives and practices. Nurs Educ Perspect. 2015;36(4):212–219.