CIN: Computers, Informatics, Nursing: September 2011 - Volume 29 - Issue 9
doi: 10.1097/NCN.0b013e31821a1582
Continuing Education

Pain Management Documentation: Analyzing One Hospital's Computerized Clinical Records

SAMUELS, JOANNE G. PhD, RN; KRITTER, DAWN MS, RRT

Author Information

Author Affiliation: Department of Nursing, University of New Hampshire, Durham.

There were no outside sponsors for this work.

The authors have disclosed that they have no significant relationships with, or financial interest in, any commercial companies pertaining to this article.

Corresponding author: Joanne G. Samuels, PhD, RN, Department of Nursing, University of New Hampshire, Hewitt Hall, 4 Library Way, Durham, NH 03824 (joanne.samuels@unh.edu).


Abstract

Pain management documentation, consisting of assessment, interventions, and reassessment, provides an important means of communication among practitioners and helps individualize care. Standard-setting organizations use pain management documentation as a key indicator of quality. Adoption of the electronic medical record alters how pain management documentation data are presented for clinical and quality evaluation use. The purpose of this study was to describe pain management documentation output from the electronic medical record to gain an understanding of its presentation and to evaluate the quantity and quality of the output. After institutional review board approval, data were abstracted from 51 electronic records of postsurgical patients in a 100-bed community hospital. Time-variant pain assessments, interventions, and reassessments were organized into pain management episodes to provide clinically interpretable data for evaluation, and data sources were identified. The data generated 1499 episodes for analysis. Analysis of variance results implied that pain management documentation changes with pain severity. Despite legibility and date and time stamping, inconsistencies as well as omitted and duplicated documentation were identified. Inconsistent data origination made it difficult to interpret clinically relevant associations. Improvements are required to streamline fields and consolidate entries so that output aligns with care.

Pain management documentation (PMD) is a critical element of pain management care. It provides an important means of communication among members of the healthcare team, chronicling the patients' problems, treatments, and responses. It can help practitioners individualize care and communicate information required for continuity. It is often an objective measure of care provided and can reflect clinical judgment.1 Documentation is a major data source for knowledge generation and provides evidence needed for practice accountability. Multiple standard-setting organizations require PMD as a key indicator of quality.2,3

The electronic medical record (EMR), called for by the Institute of Medicine4 to improve quality in hospitals and supported by the American Recovery and Reinvestment Act of 2009,5 is quickly replacing the paper record. Hospitals can earn up to $11 million in incentives for implementing the EMR, incentives that are forfeited if the hospital is not paperless within 5 years.6 It is now estimated that more than 70% of acute-care hospitals will implement some form of an electronic record in the future.7 Electronic medical records promise to improve care by providing better-quality information for decision making, eliminating illegibility, offering better chart access, reducing documentation omissions and redundancy, and improving organizational efficiency.8-10 Such documentation improvements are long overdue for pain management. However, the implementation of the EMR changes the face of PMD and the information provided to clinicians to guide care.

Nurse informatics specialists are at the forefront of EMR implementation in hospitals. Informaticists are accountable for implementing new documentation systems, evaluating their quality, and designing required improvement efforts. Doing so requires an understanding of the information requirements of the service provided. Therefore, the purpose of this study was to comprehensively describe the quality, quantity, and location of PMD output. Because the conversion of PMD to the EMR can require different data retrieval and quality improvement skill sets, understanding PMD output can help provide evidence on which to base future development and improvements.


BACKGROUND

Documentation has been an important element of nursing care since the time of Florence Nightingale. Information quality, including the ability to capture and use data, is a cornerstone of quality care.11,12 Nurses generate and use a substantial proportion of the information required for interdisciplinary communication and care evaluation.11 A recent meta-study13 compiling over 25 years of research identified seven essentials of nursing documentation. The researchers concluded that nursing documentation should be patient centered and reflect the actual work, including patient education and psychosocial support. Documentation should be written to reflect clinical judgment and presented logically, sequentially, and as events occur. Documentation should record variances in care and fulfill legal requirements.13 An information system needs to present documentation in an accessible, usable, and reliable form.12 Evaluation of EMR effectiveness at this early phase of adoption therefore seems critical.

Guidelines for PMD have been available for use since 1992.14 In 1995, the American Pain Society (APS) implemented guidelines for quality improvement,15 which incorporated specifications for PMD. After revisions in 2005,2 the guidelines recommended a comprehensive pain assessment incorporating pain intensity and other parameters such as location, duration, and associated symptoms. Treatments are to be based on assessment. Pain reassessment was highlighted in the 2005 recommendations2 as an important mechanism for determining the changing nature of pain needs and identifying the need for further treatments. These 2005 APS guidelines largely reflect the Joint Commission pain management standards, implemented in 2001,16 and a solid evidence base showing that routine and regular pain assessment with valid and reliable instruments, assessment-based interventions, and reassessment are essential components of pain management care.17 These standards were conceived within the paradigm of the paper record, not the EMR.

Patient paper records, cordoned into functional sections, are easily reviewed and provide a narrative of the patient's experience and the pain management care provided. Multiple resources exist to facilitate PMD paper record review; for example, Ferrell created the City of Hope Web site that provides chart review instruments (http://prc.coh.org/pdf/pain_audit_tools.pdf). Because PMD in paper records has consistently fallen below the quality dictated by guidelines and is notoriously fraught with omissions, duplication, and illegibility, improvement efforts have frequently resulted in the adoption of targeted flow sheets, forms revision, and continual staff education.

The conversion from paper to the EMR embodies a major paradigm shift in healthcare, affecting all who rely on nursing documentation as a data source. A complete EMR consists of clinical data repositories (eg, laboratory, radiology), physician order entry, pharmacy and clinical documentation applications including nursing documentation, and electronic medication administration records (eMARs).18 Fields are either structured, for fixed entries such as medications or laboratory reports, or unstructured, for free text.19 Fixed-entry formats are easily built into reports, yet they may create silos of data because they do not always interface successfully.20,21 Silos of data have the potential to fragment PMD and therefore disrupt the connection among the assessment, intervention, and reassessment required for clinical judgment. Perhaps these disconnections, as perceived by nurses, led Darbyshire22 to conclude that computerization does not enhance clinical practice or patient outcomes.

Documentation compliance results obtained from studies evaluating clinical nursing documentation before and after EMR implementation vary. For example, Mahler and colleagues23 compared the quantity and quality of routine nursing documentation on four different nursing units at three time points: before, during, and 9 months after the implementation process. Statistically significant improvements were identified in the quantity of documentation on two of the units, along with near-perfect compliance with documentation of all phases of the nursing process except evaluation. Only one unit reached 100% compliance in the documentation of evaluation, while two were noncompliant. Smith et al24 found significant improvements in PMD in four of eight nursing intervention categories after the implementation of an EMR; the evaluation category remained deficient. Targeting PMD, Gilbertson-White and Shapiro25 showed that the number of PMD assessments per day increased on average by 1.6 after EMR implementation, but they noted slight reductions in compliance with documentation policies, especially with patient-controlled analgesia (PCA).

Little attention has been paid to the difficult data retrieval considerations of the EMR for PMD. Data consumers, such as physicians, have reported difficulty finding information in the nursing EMR,26,27 with some actually preferring the narrative note.28 In 2006, the Agency for Healthcare Research and Quality began to address data retrieval concerns generated from a desire to realize the many benefits of the widespread use of the EMR.20 Concerns included time-consuming and problematic data acquisition, multiple and disparate systems within healthcare organizations, and the lack of standardized terminology. Data warehousing and retrieval work appears to be directed toward high-volume, medically oriented disease states with ramifications for insurance reimbursement.20,29 Samuels30 described the need for different data retrieval methods depending on the proprietary platform. More needs to be understood about the collection and the output of PMD from the EMR at the point of care.


METHOD

Data for this cross-sectional descriptive study were collected from a 100-bed New England community hospital. The nursing staff at this hospital had been documenting care using the Meditech 5.6 Client Server (Medical Information Technology, Westford, MA) since 2005. Fifty-one EMRs of discharged adult postsurgical patients comprised the convenience sample. Records included in the study were those of discharged patients who had undergone a surgical procedure in 2008 and stayed in the hospital longer than 24 hours after discharge from the postanesthesia care unit (PACU). General surgical procedures included, but were not limited to, open or laparoscopic, thoracic, abdominal, vascular, gynecologic, prostate, plastic, or orthopedic surgeries. Patients having these surgeries usually reside on one or two of the hospital's units, and these procedures are performed regularly at the study hospital. Neurosurgical, spinal, cesarean delivery, and cardiac procedures, as well as procedures performed to implant devices, were excluded from the data set. An intensive care unit stay also excluded the patient record.

After approval by the institutional review board, the researcher attended confidentiality and data retrieval training. Two expert staff consultants serving as research assistants were provided with training specific to the study procedures. One assistant was identified as a superuser, responsible for teaching others to use the EMR. Criteria for inclusion were provided to medical record personnel, who then pulled an appropriate record. Once a record was obtained, the researcher determined its acceptability for the study.

Data collected for the study included patient demographics, nursing pain assessments, interventions, and reassessments occurring after return from the PACU and until discharge. The EMR platform provided a spreadsheet with several nursing assessment categories. The nursing data elements and record sources were extracted from the patient's EMR pain and comfort management spreadsheet section, the PCA and epidural (PCEA) sections, a numeric pain score trending field, the eMARs, and unstructured nursing progress notes. The pain and comfort management portion of the spreadsheet contained 13 fields (Table 1). Typically, implementation of an EMR system entails the presentation of standardized fields offered by the proprietary system of interest. These fields can be individualized to accommodate hospital standards, policies, and practitioner preferences. The pain assessment fields in this hospital contain similar, although not exact, terminology to standardized paper assessment instruments.31 To date, there are no national standards guiding the design of computerized pain management EMR documentation fields.30 The platform also included a field in the management technique portion of the care activity entitled "effective." The PCA and PCEA categories provided a location to document medication dosages along with attempts and injections of the medication.

Table 1

The study site was unable to generate a computer report incorporating all PMD fields. Thus, data were collected by hand onto time-oriented data collection sheets. A research assistant checked 100% of the collected data for accuracy.

Time-variant data from the EMR spreadsheet categories, eMAR, and progress notes were entered onto an Excel 2007 (Microsoft, Redmond, WA) spreadsheet and organized into 30-minute PMD episodes of care. Episodes are composed of various combinations of pain assessment, intervention, reassessment, further intervention, and further reassessment.32 Assessments were defined as the entry of a numeric rating scale (NRS) score or a verbal pain description where no documentation occurred in the previous 90 minutes.

Interventions were defined as actions taken on behalf of the patient.32 It is possible that an intervention could occur without a previous assessment; however, by definition, interventions are responses to an assessment and occur within 1 hour of it. Interventions included administering a medication, documenting a PCA or PCEA dose, contacting a prescriber, offering or discontinuing a pain medication, offering or providing a nonpharmacological intervention, and so on. Medication data collected were classified as opioids and nonsteroidal agents. No anxiolytic or antidepressant medication classes were included as interventions.

Reassessment was considered any assessment occurring within an hour of an intervention or a previous assessment. Oakes et al33 also adopted this recognized 1-hour reassessment time frame, which can accommodate the half-life of many intravenous and oral pain medications. Further intervention and further reassessment, if occurring, were similarly defined, potentially making one episode span a 4-hour time frame. All coded data and time frames were verified for accuracy by the primary investigator. Other than categorizing the patterns, no special handling of missing data occurred.
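For readers who wish to apply a comparable episode-construction approach to their own extracted EMR data, the following Python sketch illustrates the rules described above. It is a minimal sketch under stated assumptions: the entry structure (timestamp plus an "assessment" or "intervention" label) and the handling of gaps between 60 and 90 minutes are assumptions for illustration, not the study's actual procedure, which was performed by hand.

```python
from datetime import datetime, timedelta

# Minimal sketch of the episode-construction rules described above.
# Assumption: entries is a list of (timestamp, kind) tuples with
# kind in {"assessment", "intervention"}.

ASSESSMENT_GAP = timedelta(minutes=90)   # new assessment only if no documentation in prior 90 min
RESPONSE_WINDOW = timedelta(hours=1)     # interventions/reassessments tied to events within 1 hour


def build_episodes(entries):
    """Group chronologically ordered PMD entries into simplified episodes."""
    episodes = []
    current = None
    last_time = None
    for time, kind in sorted(entries):
        gap = None if last_time is None else time - last_time
        if kind == "assessment":
            if current is not None and gap is not None and gap <= RESPONSE_WINDOW:
                current["reassessment"] = True            # assessment within 1 h of a prior entry
            elif gap is None or gap > ASSESSMENT_GAP:
                current = {"assessment": True, "intervention": False, "reassessment": False}
                episodes.append(current)                  # start a new episode
            # assessments falling 60-90 min after the prior entry are left
            # unclassified here; the published definitions do not cover them
        else:  # intervention
            if current is not None and gap is not None and gap <= RESPONSE_WINDOW:
                current["intervention"] = True            # response to the preceding assessment
            else:
                current = {"assessment": False, "intervention": True, "reassessment": False}
                episodes.append(current)                  # intervention without a prior assessment
        last_time = time
    return episodes


# Example: assessment at 08:00, intervention at 08:20, reassessment at 09:00
entries = [
    (datetime(2008, 6, 1, 8, 0), "assessment"),
    (datetime(2008, 6, 1, 8, 20), "intervention"),
    (datetime(2008, 6, 1, 9, 0), "assessment"),
]
print(build_episodes(entries))
# [{'assessment': True, 'intervention': True, 'reassessment': True}]
```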

Data and data sources were analyzed descriptively. Further reassessment and further intervention entries were low in number and therefore not tabulated. An average NRS score per patient was calculated, as were the length of stay from the time discharged from the PACU to the time of the last data entry and the time between PMD episodes. Associating documentation output with NRS scores provides a measure of PMD relevance to outcomes. Pearson correlation determined the relationship between the number of PMD entries per day and the average NRS as a measure of the quantity of documentation relative to patient condition. Analysis of variance examined differences in NRS scores across PMD episode categories as a measure of differences in documentation by pain severity. It is assumed that a higher NRS score would require different clinical judgments than a lower score. For example, the "assessment only" category with an average NRS score of 0 is clinically justified because no further documentation may be required.
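The statistical comparisons described above can be reproduced on a similarly structured data set with standard Python tools. The sketch below is illustrative only; the file names and column names (episodes_per_day, avg_nrs, nrs, episode_category) are assumptions rather than the study's actual variables, and the original analyses were not necessarily performed in Python.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical inputs: one row per patient and one row per PMD episode.
patients = pd.read_csv("patients.csv")     # assumed columns: episodes_per_day, avg_nrs
episodes = pd.read_csv("episodes.csv")     # assumed columns: nrs, episode_category

# Pearson correlation: number of PMD episodes per day vs. average NRS per patient
r, p_r = stats.pearsonr(patients["episodes_per_day"], patients["avg_nrs"])
print(f"Pearson r = {r:.2f}, P = {p_r:.3f}")

# One-way ANOVA: NRS score by PMD episode category
valid = episodes.dropna(subset=["nrs"])
groups = [g["nrs"] for _, g in valid.groupby("episode_category")]
f_stat, p_f = stats.f_oneway(*groups)
print(f"ANOVA F = {f_stat:.2f}, P = {p_f:.3f}")

# Tukey post hoc comparisons among episode categories
print(pairwise_tukeyhsd(valid["nrs"], valid["episode_category"]))
```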


RESULTS

Data from the 51 records of discharged surgical patients generated 1499 PMD episodes for analysis. Because a variety of surgical procedures were identified in the data set, procedures were recoded into orthopedic, gynecologic, and general surgical categories to improve interpretability. Demographic and shift-specific information is presented in Table 2.

Table 2

The Pearson r showed no relationship (r = 0.24, not statistically significant) between the number of PMD episodes per day and the average NRS. Of the 1499 PMD episodes, 1175 (78%) contained assessments, 678 contained an intervention, and 396 (26.4%) contained a reassessment. Forty-nine percent of assessment entries (n = 735) were documented by nurses using the NRS only, 30.4% (n = 456) were documented as a combination of verbal description and NRS, and 13.6% (n = 204) contained no NRS score. Table 3 identifies the frequency of the six possible PMD episode category combinations. A box plot (Figure 1) highlights the NRS ranges identified within each PMD episode category. Analysis of variance detected a significant difference in NRS by PMD episode category (F3,1188 = 110.56, P < .001), although the two categories containing "no assessment" could not be computed. One hundred three of the episodes (6.8%) contained an intervention without a previous assessment. Tukey post hoc comparisons indicated significant differences among all four groups.

Table 3
Figure 1

Pain management documentation data sources are identified in Table 4. Less than half of the assessments (43.6%) were abstracted from the computerized spreadsheet fields alone; many assessments were located in two fields, implying duplicate documentation. The largest share of interventions (32.5%) was likewise identified in two EMR locations. The PCA and PCEA interventions were found in the computerized spreadsheet but not in the eMARs. Reassessments, again, were identified in two locations. Of note, 31.0% of reassessments were found only in the unstructured progress notes.

Table 4

DISCUSSION

Despite date and time stamping and legible entries, inconsistencies as well as omitted and duplicated documentation of pain management assessments, interventions, and reassessments were identified in this study, validating other studies showing that the EMR does not completely remediate long-standing issues with PMD.25,26 Beyond the sheer quantity and frequency of nursing entries, the different fields appeared to be used inconsistently, and the progress notes and pain comments were overused when information had already been documented in other EMR fields. Contrary to standards identifying the need for an initial comprehensive assessment followed by targeted ongoing assessments,2 the number of fields in this hospital's PMD spreadsheet suggests that the nurse complete a comprehensive assessment with each patient contact, potentially generating unnecessary documentation. In addition, interventions such as the PCA and PCEA were also charted in the pain and comfort management section of the spreadsheet.

The reassessment field in the spreadsheet was rarely used. Instead, nurses created another spreadsheet column for the reassessment, yet a larger portion of reassessments was documented in the progress notes or pain comment fields. The use of the unstructured or inconsistent fields for reassessment data prevents future report building and may contribute to erroneous PMD quality evaluations.

Obtaining a narrative construction of the patients' pain trajectory would have been extremely difficult without the use of the time-oriented data collection instrument. Trending fields compiled only pain assessment data. No connection existed among the assessment, intervention, and reassessment fields other than in the narrative progress note. Clinical judgment, identified as a critical element of documentation,13 appeared absent when the fields were accessed separately. Perhaps this disconnection of the clinical judgment process in the EMR explains why some physicians prefer reading narrative notes.28

In addition, scheduled as opposed to "as needed" pain management plans were rarely connected to reassessment. For example, regularly scheduled opioids or nonsteroidal anti-inflammatory drugs were not addressed as contributing to patients' pain control. While the components of the PMD standards were identified, they stood independently and did not comprise a whole. This lack of connectedness has serious implications for pain management quality, especially in light of the many pharmacological interventions used in practice today. While comprehensive assessments may be present to initiate therapy, effective medication titration in a 24-hour environment cannot be performed without available information about previous assessment-based interventions to inform clinical judgments and the individualization of care.

This fragmentation found in PMD may also operate in other areas of nursing concern such as skin care and fall prevention. Beyond the clinical practice guidelines and care planning that may be incorporated into some EMR systems, nurse thinking seems unsupported by the documentation system at the study hospital. The current inability to generate reports that incorporate all required PMD fields in a clinically interpretable way contributes to a lowered quality potential. Reports generated from fixed fields now represent only the presence or absence of PMD, not the appropriateness of the response based on an assessment. Thus, the EMR portrays nursing work as a task to be completed rather than as thinking about the best intervention for the patient. Furthermore, individualization of the fields by hospitals during EMR implementation hinders the ability to benchmark outcomes across organizations.


LIMITATIONS

The findings are based on an analysis of EMR documentation from one hospital and one proprietary EMR system and therefore cannot be generalized. The strict 60-minute time frame to define data points within the PMD episodes may have resulted in fewer reassessments than actually existed.


IMPLICATIONS

Adjusting to EMR implementation as a new practice reality provides an opportunity for developers and informatics specialists to use their creativity and knowledge of systems to help design clinically interpretable data output. An important aspect of this work would be to implement strategies that help nurses enter data in specified fields and eliminate duplication. Doing so, however, requires unlearning long-standing practices of summarizing patient information in a progress note. Nurses who graduated before the implementation of the EMR may need to relearn their documentation practices to take advantage of the efficiencies offered by the EMR. Soliciting staff input may help specialists understand barriers to change. Supporting the documentation of PMD in specified fields allows the generation of reports. Despite their current flaws, and with input from agencies, reports may become valuable resources for assessing clinical judgment.

Other consumers of PMD data, such as educators, quality improvement specialists, physicians, and administrators, not only require data retrieval training but also can provide insight into the types of data output required for quality evaluations and information needs. Information systems need to reflect the level of care actually provided in order to supply data for improvement efforts. In addition, national standards guiding EMR field design would support nursing knowledge generation.

Insights generated from the day-to-day work of designing, implementing, and evaluating EMR systems are valuable resources for the larger healthcare community, providing important information to administrators, software developers, and policy makers. As such, communicating those insights at conferences, in policy-making workgroups, or in professional manuscripts seems essential so that the benefits of the EMR can be realized. An information system that supports clinical work has the potential to support positive patient outcomes.


REFERENCES

1. Samuels JG, Fetzer S. Pain management documentation quality as a reflection of nurses' clinical judgment. J Nurs Care Qual. 2009;24(3):223-231.

2. Gordon DB, Dahl JL, Miaskowski C, et al. American Pain Society recommendations for improving the quality of acute and cancer pain management. Arch Intern Med. 2005;165(14):1574-1580.

3. Joint Commission on Accreditation of Healthcare Organizations. Comprehensive Accreditation Manual for Hospital: The Official Handbook. Chicago, IL: Joint Commission Resources, Inc; 2009.

4. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.

5. The American Recovery and Reinvestment Act of 2009, Pub L No 111-5, 123 Stat 115. http://www.gpo.gov/fdsys/pkg/PLAW-111publ5/content-detail.html. Accessed April 12, 2011.

6. Huslin A. Online health data in remission: nascent industry ready with systems if money and standards are resolved. Washington Post. February 16, 2009;(Financial):D01.

7. Malloch K. The electronic health record: an essential tool for advancing patient safety. Nurs Outlook. 2007;55:159-161.

8. DesRoches CM, Campbell EG, Rao SR, et al. Electronic health records in ambulatory care: a national survey of physicians. N Engl J Med. 2008;359:50-60.

9. Stetson PD, Morrison FP, Bakken S, Johnson SB. Preliminary development of the physician documentation quality instrument. JAMIA. 2008;15(4):534-541.

10. Otieno OG, Toyama H, Asonuma M, Kanai-Pak M, Naitoh K. Nurses' views on the use, quality and user satisfaction with electronic medical records: questionnaire development. J Adv Nurs. 2007;60(2):209-218.

11. Currell R, Urquhart C. Nursing record systems: effects on nursing practice and health care outcomes [review]. Cochrane Database Syst Rev. 2003;(3):CD002099.

12. Oroviogoicoechea C, Elliott B, Watson R. Review: evaluating information systems in nursing. J Clin Nurs. 2007:567-575.

13. Jeffries D, Johnson M, Griffiths R. A meta-study of the essentials of quality nursing documentation. Int J Nurs Pract. 2010;16:112-124.

14. US Department of Health and Human Services, Public Health Service, Agency for Health Care Policy and Research (now the Agency for Healthcare Research and Quality). Acute pain management: operative or medical procedures and trauma. Clinical practice guideline. 1992. AHCPR publication 92-0032.

15. American Pain Society Quality of Care Committee. Quality improvement guidelines for the treatment of acute pain and cancer pain. JAMA. 1995;274(23):1874-1880.

16. Phillips DM, for the Joint Commission on Accreditation of Healthcare Organizations. JCAHO pain management standards are unveiled. JAMA. 2000;284:428-429.

17. American Society of Anesthesiologists Task Force on Acute Pain Management. Practice guidelines for acute pain management in the perioperative setting. Anesthesiology. 2004;100:1573-1581.

18. Garets D, Davis MR. Electronic Medical Records vs. Electronic Health Records: Yes, There Is a Difference: A HIMSS Analytics White Paper. Chicago, IL: HIMSS Analytics LLC; 2006.

19. Mitiku TF, Tu K. Using data from electronic medical records: theory versus practice. Healthc Q. 2008;11(4):23-25.

20. Agency for Healthcare Research and Quality Conference on Health Care Data Collection and Reporting, November 8-9, 2006. http://library.ahima.org/xpedio/groups/public/documents/ahima/bok1_033940.pdf. Accessed February 12, 2011.

21. Hyun S, Johnson SB, Stetson PD, Bakken S. Development and evaluation of nursing user interface screens using multiple methods. J Biomed Inform. 2009;42(6):1004-1012.

22. Darbyshire P. 'Rage against the machine?': nurses' and midwives' experiences of using computerized patient information systems for clinical information. J Clin Nurs. 2004;13(1):17-25.

23. Mahler C, Ammenwerth E, Wagner A, et al. Effects of a computer-based nursing documentation system on the quality of nursing documentation. J Med Syst. 2007;31:274-282.

24. Smith K, Smith V, Krugman M, Oman K. Evaluating the impact of computerized clinical documentation. Comput Nurs. 2005;23(3):132-138.

25. Gilbertson-White S, Shapiro S. Electronic documentation systems: the impact on pain documentation. Commun Nurs Res. 2007;40:264.

26. Saigh O, Triola MM, Link RN. Brief report: failure of an electronic medical record tool to improve pain assessment documentation. J Gen Intern Med. 2006;21:185-188.

27. Tornvall E, Wilhelmsson S. Nursing documentation for communicating and evaluating care. J Clin Nurs. 2008;17:2116-2124.

28. Green SD, Thomas JD. Interdisciplinary collaboration and the electronic medical record. Pediatr Nurs. 2008;34(3):225-227.

29. Hynes DM, Weddle T, Smith N, Whittier E, Atkins D, Francis J. Use of health information technology to advance evidence-based care: lessons from the VA QUERI program. J Gen Intern Med. 2010;25(suppl 1):44-49.

30. Samuels JG. Abstracting pain management documentation from the electronic medical record: comparison of three hospitals. Appl Nurs Res. 2010. doi:10.1016/j.apnr.2010.05.001.

31. Pasero C, McCaffery M. Pain Assessment and Pharmacological Management. St Louis, MO: Mosby, an Affiliate of Elsevier; 2011.

32. Samuels JG, Fetzer S. Development of the Samuels Pain Management Documentation Rating Scale. Pain Manag Nurs. 2008;9(4):166-170.

33. Oakes LL, Anghelescu DL, Windsor KB, Barnhill PD. An institutional quality improvement initiative for pain management for pediatric cancer inpatients. J Pain Symptom Manage. 2008;35(6):656-669.

For more than 9 additional continuing education articles related to electronic information in nursing, go to NursingCenter.com/CE.

Keywords:

Computerized clinical record; Documentation; Electronic health record; Information retrieval; Nursing documentation; Pain management; Pain measurement; Quality improvement

© 2011 Lippincott Williams & Wilkins, Inc.

 
