Despite widespread implementation of electronic medical record systems, retrieval of data from clinical notes remains largely relegated to individual chart review and manual entry into databanks. We hypothesized that standard electronic medical record system notes could be redesigned to provide customary documentation while allowing automated clinical data retrieval by text data extraction software, with minimal disruption to workflow (documentation time).
The clinical paths of 20 fictitious patients presenting with macromastia and undergoing reduction mammaplasty were created in our electronic medical record system's practice environment. Each encounter was documented with both our previous standard note and a redesigned "data-friendly" note. Provider documentation time was measured for each note, and documentation times for the two groups were compared using the t test (p < 0.05).
Standard notes were based on the electronic templates within our electronic medical record system that were in use in our clinical practice at the time. To create the data-friendly note templates, each monitored clinical data point was assigned a standardized text prompt. For instance, within the operative note, the phrase "A *** skin pattern was designed." prompted the documenting provider to enter the incision pattern. Later, the text data extraction software would scan the text for the string "skin pattern was designed" and capture all words between it and the article ("A") at the beginning of the phrase.
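The capture logic described above can be sketched in a few lines. This is a hypothetical illustration only (the actual TextConverter configuration is proprietary and not described in the source); it simply reproduces the stated rule of anchoring on the fixed string and capturing the words between the leading article and that anchor.

```python
import re

# Hypothetical sketch of the prompt-based capture described in the text.
# Template phrase: "A *** skin pattern was designed."
# Anchor on the fixed string "skin pattern was designed" and capture the
# words between the article "A" and the anchor.
PROMPT = re.compile(r"\bA\s+(.+?)\s+skin pattern was designed\.")

def capture_skin_pattern(note_text: str):
    """Return the provider-entered incision pattern, or None if absent."""
    match = PROMPT.search(note_text)
    return match.group(1) if match else None

note = "The breast was marked preoperatively. A Wise skin pattern was designed."
print(capture_skin_pattern(note))  # -> Wise
```

Because the anchor string is templated and never altered by the provider, the same pattern works regardless of how the captured value itself was entered (drop-down menu, free typing, or dictation).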
Finalized notes were exported from the electronic medical record system in PDF format and loaded into the text data extraction software (TextConverter Standard 4.3; SIMX Corp., Princeton, N.J.) for data accrual. Seventy-six clinical data points were assigned for capture, including encounter/patient information, elements of the history and physical examination, operative details, and postoperative outcomes. [See Document, Supplemental Digital Content 1, which shows a list of clinical data points monitored/retrieved from data-friendly notes, http://links.lww.com/PRS/B901. See Document, Supplemental Digital Content 2, which shows an example of an operative note (data-friendly note) documented in our study. Despite its ordinary appearance, this document can be processed by the text data extraction software, which scans for defined text prompts (highlighted in yellow) and retrieves the associated clinical data points (highlighted in green). The text prompt ">Name:", at the beginning of the document, is used to retrieve both the patient's name (to the right of it, "Arcwelder, Abby") and the type of note (line above it, "Operative Note"). All other text prompts are mapped to a unique clinical data point. When creating the note, clinical data points are either entered manually by the user (such as "Wise" for skin pattern designed and "None" for complications) or automatically by the electronic medical record system (e.g., the patient's name, age, and sex), http://links.lww.com/PRS/B902.]
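The ">Name:" prompt illustrates that a single anchor can yield more than one data point. The following sketch is a hypothetical reconstruction of that rule in Python (the real software's internals are not described in the source): the patient's name is taken from the text to the right of the prompt, and the note type from the line immediately above it.

```python
# Hypothetical sketch of the ">Name:" prompt handling described in the text.
# One anchor yields two clinical data points: the patient's name (to the
# right of the prompt) and the note type (the line immediately above it).

def parse_header(note_text: str):
    """Return the note type and patient name, or None if the prompt is absent."""
    lines = note_text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith(">Name:"):
            patient_name = line[len(">Name:"):].strip()
            note_type = lines[i - 1].strip() if i > 0 else ""
            return {"note_type": note_type, "patient_name": patient_name}
    return None

sample = "Operative Note\n>Name: Arcwelder, Abby\nDate of surgery: ..."
print(parse_header(sample))
```

Running this on the sample returns the note type "Operative Note" and the patient name "Arcwelder, Abby", mirroring the supplemental example.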
The 120 data-friendly notes exported from our electronic medical record system constituted a 300-page document. The text data extraction software recognized all data-friendly notes, generating a databank containing 4850 clinical data points. Manual crosschecking revealed 100 percent accuracy. Provider documentation times are listed in Table 1.
Natural language processing has been used for decades as an enhancement to manual chart abstraction.1,2 Like our data-friendly note model, it uses text data extraction software to capture data from medical notes. In natural language processing, however, the text is scanned directly for clinical data points. Because providers vary greatly in how they document information, manual chart review is still required to create and validate all terms that correspond to the clinical data points of interest, an extremely labor-intensive process. The data-friendly note model instead takes advantage of the electronic medical record system's ability to standardize documentation (specifically, the text prompts associated with the clinical data points). Provided that the documenting provider does not alter these templated text prompts, he or she retains full flexibility to customize the body of the note, including any method of data entry (e.g., drop-down menus, free typing/dictating). Electronic templates have been associated with less variation and increased completeness of data entry, and with increased provider satisfaction with the electronic medical record system.3–5 In addition, many clinical data points can be inserted into the note automatically by the electronic medical record system (e.g., name, date).
In conclusion, redesigned electronic medical record system notes (data-friendly notes) provided customary clinical documentation and allowed automated data retrieval by text data extraction software, with 100 percent accuracy. Data-friendly notes were also found to improve our workflow, with a statistically significant decrease in provider documentation time.
The author has no financial interest in any of the products or devices mentioned in this article.
Jose G. Christiano, M.D.
Division of Plastic Surgery
University of Rochester
1. Carrell DS, Halgrim S, Tran DT, et al. Using natural language processing to improve efficiency of manual chart abstraction in research: The case of breast cancer recurrence. Am J Epidemiol. 2014;179:749–758.
2. Hanauer DA, Englesbe MJ, Cowan JA Jr, Campbell DA. Informatics and the American College of Surgeons National Surgical Quality Improvement Program: Automated processes could replace manual record review. J Am Coll Surg. 2009;208:37–41.
3. Chalmers DJ, Deakyne SJ, Payan ML, Torok MR, Kahn MG, Vemulakonda VM. Feasibility of integrating research data collection into routine clinical practice using the electronic health record. J Urol. 2014;192:1215–1220.
4. Grogan EL, Speroff T, Deppen SA, et al. Improving documentation of patient acuity level using a progress note template. J Am Coll Surg. 2004;199:468–475.
5. Harshberger CA, Harper AJ, Carro GW, et al. Outcomes of computerized physician order entry in an electronic health record after implementation in an outpatient oncology setting. J Oncol Pract. 2011;7:233–237.