- Content, order, appearance, and mode of entry in screen design
- Standardization of screens and introduction of evidence
- Benefits of thoughtful design strategies
Screen design is an important issue in usability for the electronic medical record (EMR). The ISO 9241-210 standard defines usability as “the extent to which a system can be used by specific users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.”1 Conversely, poor usability may result in application errors, workarounds by frustrated end users, loss of data integrity, missing data, and overall system avoidance. Incorporating principles of human-computer interaction can assist in the adoption of EMR documentation by nurses and other health professionals. Human factors methodologies have been recognized in healthcare only since the 1990s but have long been prevalent in other industries such as aviation.2
Nurse informaticists often take a leadership role in designing screens for nursing or other clinical departments such as nutrition, case management, or rehabilitation services. These projects are often more difficult than they seem because they involve assisting staff to review and evaluate their current workflows, challenge old assumptions, and consider the needs of interfacing departments and staff. Expectations need to be articulated and continually shaped and managed. Most important, the application must be configured as much as possible to support best practices in clinical workflow. In the past, nurses have too often been passive consumers of technology rather than active designers,3 and this holds true for other clinical staff as well. With increased use of EMR documentation, it is time to change how we approach screen design, which is the focus of this article.
The Chester County Hospital and Health System (TCCH), a 220-bed independent nonprofit community hospital in southeastern Pennsylvania, took on the challenge of redesigning its nursing assessment forms after an initial project resulted in screens that staff perceived as cumbersome and time consuming. TCCH had been a beta site for Soarian Clinicals (Siemens) since 2002 and remains a strategic partner to this day. At the onset of this redesign project in 2007, the hospital had implemented the Nursing Admission Assessment, Shift Assessment, Vital Signs, and Intake and Output screens for the medical/surgical, telemetry, and pediatrics areas. Approximately 550 nurses had been using the forms. Staff dissatisfaction had resulted in a lack of momentum to move forward in expanding use of the EMR. Root causes of dissatisfaction with the forms were identified as the naiveté of the screen “designers,” who had moved to electronic forms for the first time; the ambitious deadlines imposed on the project so that the application could be marketed; the limited opportunity for staff involvement; and the immature stage of the software at the time of the beta project.
A director of nursing informatics position was created and filled. Our chief nursing officer invested in a more substantial involvement of staff by authorizing a staff informatics council that would meet for one 4-hour period each month. The previous approach of 1-hour meetings would not have permitted the kind of engagement and focus that the project required. Approximately 25 nurses, one respiratory therapist, 10 leadership staff, and one unit secretary quickly volunteered. To make maximum use of the larger group’s time, a small task force (three informaticists and one clinical manager) worked intensively on the screens for the larger group to review. Rigorous debate was encouraged among the council members, a skill that took some time to build, but resulted in a group of super users who supported the redesign and have since become expert ambassadors on their home units for all application rollouts. The larger staff were engaged through our council members and by giving direct responses to individuals who provided feedback, whether or not their request was feasible.
To prepare for this venture, the team reviewed screens that were shared by hospitals in our vendor community. These showed us that many hospitals were struggling with the same issues.
Before discussing the specific rules of thumb, there is an overarching principle that needs to be articulated: Regard your staff as professionals and avoid designing your systems for your worst performers.
Screen design reflects an organization’s philosophy. Excessive control can result in unnecessary frustration to users, as well as circumventing the culture of professionalism. As decisions are made about such things as mandatory fields, “carry-forwards,” the uses of triggers, shortcuts, and the documentation of routine activities, consider the message this sends to your nurses.
There is a huge difference between designing a system that prevents a reasonably competent nurse from omitting something or making a mistake and designing a system to control deficient practitioners by enforcing rules at all costs. This latter practice can easily result in resentment, frustration, and workarounds. There might be rare instances where control is necessary, but too often the question of why this is necessary goes unasked. If a majority of staff are avoiding something, ask why. Streamline your documentation processes, adjust timing and responsibilities, and integrate screens with clinical workflow so that it all makes sense. This will help to ensure that applications meet established goals. Our chief medical information officer, Dr Karen Pinsky, said it best: “With the exception of truly mission critical activities and high-risk orders where hard stops and tight control may be warranted, the focus of application design should be on making it easy for the clinician to do the right thing, rather than impossible for them to do the wrong thing.”
The old saying that people will behave as expected is true here. If we expect professional behavior, we are more likely to get it. If there are poor practitioners, monitor them, counsel them, reassign them, or request a resignation. Don’t expect your applications to babysit deficient staff. Conscientious nurses should not have to pay the price for poor performers.
The lessons learned can be roughly grouped into five categories: efficiency, scope and content, appearance and cognitive effort, quality, and communication and exchange of data. The rules of thumb within each category became evident to the author as work progressed from one screen to the next. You can see the categories and rules in Table 1.
These rules of thumb were then articulated to the larger group, thus gradually changing the culture of screen design and allowing everyone to participate in a meaningful way. These guidelines are offered knowing that every situation is different (ie, there are exceptions to every rule).
These suggestions should be helpful whether you are redesigning screens, as we were, or making the initial move from paper to the computer. It should be noted that rigorous consideration of these factors will slow down the design and building process as more attention is paid to clinical workflow and in determining what is really necessary, but the end result should be increased user acceptance and improved accuracy, timeliness, and completeness of data entry.
Category 1: Efficiency—reducing the time that staff spends in unnecessary documentation is an important factor in today’s cost-conscious climate.
Rule of Thumb 1: If It Can Be Asked With One Question, Don’t Make It Two
This is an easy rule to break if one is not vigilant. For example, on our old screens, nurses had to check yes or no for caffeine use, then separately enter the amount (Figure 1A). On the new screen, one click answers the question (Figure 1B).
In other cases where a checkbox is present but detailed comments are also required, it raises the question of whether the checkbox is needed. For example, in a form for case managers, there was a checkbox indicating a consultation with another discipline. Following this was a detailed comments field asking with whom they spoke and what transpired during the consultation. In this case, the checkbox was not needed.
Rule of Thumb 2: Replace Yes/No Choices With One Checkbox
Similar to the first rule of thumb, one box that is checked if a condition is present, but left blank if not (sometimes called charting by exception), eliminates many clicks, all of which are time consuming. As can be seen in Figure 2 below, on our original screen, the nurse has to check either yes or no for each item on the functional status. In the new screen, an item is checked only if the condition is present, with one “strong normal” for independent status with activities of daily living (ADL). (Note: “Strong normals” are occasionally needed for regulatory reasons or are requested for hospital-specific quality improvement purposes.)
Rule of Thumb 3: Use Check Boxes Instead of a Drop-Down Box Where It Makes Sense
This one rule can save the greatest amount of time. A multiselect drop-down box typically requires at least four clicks from the end user (one click to open it, one or more to choose responses, and one click to close it), whereas a checkbox requires only one or two clicks. Drop-down selections often hide the items below them, necessitating an additional click somewhere else on the screen to close the previous item. While more sophisticated users might avoid one of these extra clicks by using the tab key, this “shortcut” requires a change in mode between using the mouse and using the keyboard. Additionally, drop-down boxes “draw the curious” even when the item (field) does not apply to the patient, wasting time as irrelevant items are perused. With checkboxes, all choices are clearly seen right away. In Figures 3A and B, many clicks were saved in this redesigned screen segment. If you estimate two choices selected in each field, you go from 20 clicks to 10.
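The click arithmetic above can be sketched directly. This is a minimal illustration only; the field count and selections per field are assumptions matching the estimate in the text, not measurements taken from Figures 3A and B.

```python
# Rough click-cost model for multiselect drop-downs vs checkboxes.
# Field count and selections per field are illustrative assumptions.

def dropdown_clicks(selections: int) -> int:
    """One click to open the list, one per choice, one elsewhere to close it."""
    return 1 + selections + 1

def checkbox_clicks(selections: int) -> int:
    """One click per chosen checkbox; nothing to open or close."""
    return selections

fields = 5       # fields converted from drop-downs to checkboxes
per_field = 2    # the "two choices selected in each field" estimate

before = fields * dropdown_clicks(per_field)   # 20 clicks
after = fields * checkbox_clicks(per_field)    # 10 clicks
```

The same model also makes the savings easy to re-estimate for screens with more fields or more selections per field.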
Rule of Thumb 4: Incorporate Summary Screens to Limit Unnecessary Travel to Screens That Do Not Apply to the Patient
This is especially helpful with body system assessment of patients who have short problem lists. Figure 4 shows part of the screen for systems review. One click at the top for “All Body Systems WNL” triggers a check mark in the normal fields of each body system going down. The nurse can then change the click for the abnormal body systems that need further documentation, entering detail only for the body systems (shown on the left) where the patient does not meet WNL criteria. Note that “Within Normal Limits” (WNL) is defined for each system. The purple hyperlink for each system takes the more novice user to a separate screen that describes the normal criteria in more detail. If the same nurse is performing multiple head-to-toe assessments in the same shift, they can use the “all body systems unchanged” option.
Rule of Thumb 5: “Carry Forward” Data When Appropriate
“Carry forward” is the process of bringing forth previously entered (often highly repetitive) data to prevent the user from having to enter it again and again. There are occasions where carry-forwards are appropriate and desirable. Some organizations freeze up at the thought of this practice, afraid that a careless nurse won’t look at the completed items before saving, thus allowing obsolete data to continue indefinitely. To prevent this, they make all nurses enter all data “from scratch.” This is a lost opportunity to harness the benefits of electronic documentation. Educate the staff about the need to review and update all information before signing and follow up on those who abuse this.
In Figure 5A, one typical “carry forward” scenario can be seen—the medical/surgical history from a previous admission. The nurse verifies and updates the information each time, but starts from established history. For the patient, knowing that the hospital has not lost its previous information makes him/her feel secure. It is also a satisfier in that the patient is not being asked the same question repeatedly and can expand on the topic differently.
Note: The item “Patient Denies Past Medical/Surgical history” means that the patient is stating that he/she has no medical or surgical history. It actually is another example of a “strong normal”—leaving the entire medical surgical history blank looks like the nurse skipped the entire section; this is a way to indicate a previously healthy patient with no history.
Another opportunity for carry-forwards is when you are performing a series of assessments in a short time frame, such as postprocedure. In Figure 5B, the pulse site carries forward in the active (left) column of the vascular checks form, but the actual quality (assessment) of the pulse site does not. Thus, the nurse only has to select the pulse sites for the assessment once, but must enter the quality of the pulses each time.
When a nurse is back charting at 6PM after a busy day, some carefully considered carry-forwards are very helpful and save time. For hourly intake and output screens, carrying forward the maintenance intravenous (IV) fluids from one hour to the next can save time and frustration.
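The distinction above can be sketched as a simple carry-forward policy, assuming hypothetical field names: structural items (pulse sites, maintenance IV fluids) carry forward, while findings (pulse quality) must be re-entered at each assessment.

```python
# Minimal sketch of a carry-forward policy. Field names are hypothetical.
# Structural items carry forward; assessment findings do not (per Figure 5B).

CARRY_FORWARD = {"pulse_site_left", "pulse_site_right", "iv_maintenance_fluid"}

def new_entry(previous: dict) -> dict:
    """Start a new assessment, pre-filling only the carry-forward fields."""
    return {k: v for k, v in previous.items() if k in CARRY_FORWARD}

prior = {
    "pulse_site_left": "dorsalis pedis",
    "pulse_quality_left": "2+ palpable",          # finding: must be reassessed
    "iv_maintenance_fluid": "D5 1/2NS @ 75 mL/h",
}
draft = new_entry(prior)
```

In practice, the pre-filled draft would still require the nurse to review and sign, consistent with the education-and-follow-up approach described above.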
Category 2: Scope and content—much thought needs to be given to the usefulness of each and every item on the screen.
Rule of Thumb 6: Focus on Usual Situations (Also Known as “Meat and Potatoes” Rule)
Using Pareto’s “80/20” rule, 80% of the needed information can be obtained with 20% of the fields. Move less frequently used, but still needed, findings to the bottom of the screen or (better yet) to separate specialty screens that are opened only when needed. Eliminating fields altogether is another, sometimes better option. From an ethical point of view, one needs to ask “If we are not going to actually use the information in the care of the patient, is it ethical to collect it?”
Applying this rule is much more difficult than it sounds. Nurses are often wedded to their more obscure items, and like to be completely inclusive of all situations, a reflection of the comprehensive, holistic case studies many of us produced during our nursing education!
However, even looking at unneeded fields adds cognitive “work,” causes excessive scrolling, and can affect the performance (speed) of screen transitions.
Questions to ask when evaluating whether information is needed:
- Who uses the information?
- When is it used?
- How often is it used?
- How is it used?
- What happens to it: is it printed, sent to another department?
Getting the answers to the above can be a tedious process, and some of the answers are rather amazing. Rumors abound as to who receives the information and for what purpose. When you dig in, you may find such things as: the person who originally requested the data has retired, a computer system in another department now provides this data, no one ever used the data because its accuracy was never established, or regulations changed and the need no longer exists. For example, we found that years ago there was an initiative started by a clinical nurse specialist (CNS) to identify the pacemaker vendors for every patient admitted with a pacemaker. However, because the patient or the nurse could not always reliably identify the correct vendor upon admission, the CNS later found a better way to accomplish his/her goal. But no one ever told the staff, and they continued to enter this (sometimes erroneous) information. Without extreme diligence, collecting unnecessary data can easily occur. Another useful tool for this phase is to obtain metrics of which fields are actually used (shown in Figure 6A).
Then when stakeholders are vehemently arguing the need for, let’s say, brachial pulse sites, you can point out that of 40,000 shift assessments done in the last 6 months, this was documented only 147 times. If your metrics are more sophisticated, you might be able to determine that nurses were populating some of those 147 entries with such banalities as “unknown,” “unsure,” or “not applicable.”
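Field-usage metrics of this kind could be computed along the following lines. The field names, placeholder values, and record layout here are hypothetical, not the actual report behind Figure 6A.

```python
# Hypothetical field-usage metric: count how often each field is documented
# with a substantive value, ignoring placeholder "non-answers."
from collections import Counter

NON_ANSWERS = {"unknown", "unsure", "not applicable"}

def field_usage(assessments):
    """Count substantive (non-placeholder, non-blank) entries per field."""
    used = Counter()
    for record in assessments:
        for field, value in record.items():
            if value and value.strip().lower() not in NON_ANSWERS:
                used[field] += 1
    return used

sample = [
    {"radial_pulse": "2+", "brachial_pulse": ""},
    {"radial_pulse": "1+", "brachial_pulse": "unknown"},
    {"radial_pulse": "2+"},
]
usage = field_usage(sample)
```

Fields with counts near zero, or dominated by non-answers, are the candidates for removal or relocation to a specialty form.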
Once you decide that an item is obscure or rare, it should be removed from the universal (high volume) assessments. You then have several options: place the item as a choice on a specialty form/screen, remove it altogether, or, if interoperability (ie, data retrieval and research) is not an issue, you can instruct the nurses to use the comments field. If using comments, try to place one large comment field on a screen, not a comment in every section or next to each item.
Figure 6B shows a portion of our original and revised cardiovascular assessments. Removed pulse sites were placed on the vascular check form to be used for postoperative or vascular procedures, and a comment field was used for unusual edema sites.
Rule of Thumb 7: Avoid Redundant Items Within Your Assessments
Redundancy here refers to areas of overlap in our assessments where the definitions are blurred, and thus the same or similar items are repeated. One common example occurs when the context of the assessment changes slightly—for example, between screens addressing functional assessments, assistive devices, valuables (which often include assistive devices), activities of daily living, and discharge planning. These screens often contain fields that are almost identical. Often this is the result of stakeholders adding items to various screens and not taking the time to see if another item could be removed or repurposed. In addition to making more work for the staff, redundant or overlapping items increase the risk of inconsistency in data entry between staff, resulting in confusion, errors, and potential legal issues.
Similarly, Neurological and Musculoskeletal Assessment screens often contain redundancies. Figure 7 shows the new descriptors for Extremity Assessment in the Neurological Assessment screen, followed by the Musculoskeletal Assessment screen. The screens no longer contain overlapping and duplicative items—the descriptors are now much more related to their respective body system etiologies.
There is also apt to be redundancy between Assessments of Wounds, Drains, and Incisions and the Integumentary screens. One final example: In both the neurological and psychosocial assessments, the terms agitated, combative, anxious, and forgetful appeared. All four were eliminated from the neurological assessment. The only exception made to our “redundancy rule” was to allow “confusion” to appear on both assessments because its etiology can be either psychosocial (dementia) or neurological (changes in level of consciousness).
In summary, take the time to clarify what specific information is helpful to patient care, and remove similar items that provide no additional value. There is also redundancy in assessments between various disciplines, hence the next rule.
Rule of Thumb 8: Use Data in Multiple Contexts Across Disciplines
In screen design, we often forget that one of the benefits of electronic data is that it can reappear in many places and be presented in different ways. Most of us have heard the expression, “one entry, many uses.” Determining who collects or uses which data is a critical first step. For example, when designing physical therapy forms, ask questions such as “What data does physical therapy need that nursing may have already collected and entered?” Then use the nursing data (already collected) to populate their screens. The reverse is also true. If another discipline has collected information that is helpful to nurses, populate the nursing screen with this information. In some cases, it will be view only, and in other cases, it might invite the other discipline to update the item with more specialized information.
The best way to accomplish this is to involve in your design process all stakeholders who will either use or enter similar data. While there are some unique differences among professional groups in the definition or focus of their assessments, this does not have to deter data sharing. For example, a therapist may need more specificity in walker types. The nurse may initially enter basic walker information, and the therapist can then update the same field with the more detailed information. For the patient, it is much different to ask “I see you use a walker at home—can you describe to me what it looks like?” rather than starting from scratch with a repetitive, “Do you use a walker?”
When implementing this rule of thumb, one also needs to consider how data will display on the different screens. Consider length-of-field discrepancies, avoid obscure acronyms specific to one profession, and encourage standardization of terminology. Also, certified systems should ensure that if data are later revised, the active record will reflect the updated information, but a history of former data entries to the field will still be available if needed.
Category 3: Appearance and cognitive effort—well-organized, attractive screens can be a calming influence!
Rule of Thumb 9: Maintain a Predictable Rhythm and Keep Mouse Movement to a Minimum
Requiring a user to switch modes of entry back and forth between checkboxes, drop-downs, and comment fields is cognitively exhausting. Think about it—click checkbox item, then open drop-down, click choice, close drop-down, click checkbox, switch to keyboard, and type a comment, two more checkboxes, then back to typing a comment, then to a drop-down, and so on. Getting through a screen starts to feel like running an obstacle course. Similarly, requiring a user to move the mouse all over the screen instead of providing a logical movement from the top of the screen to the bottom also requires extra time and mental effort.
On our old respiratory assessment, the user had to create three separate entries for each chest tube, each with an attendant inconvenient series of mouse movements (in Figure 8, only one chest tube site is shown). In the new screen, the answers for each type of tube are easily entered with minimal mouse movement. On the old screen, even those users sophisticated enough to use the tab key to move from field to field would have to work to follow the insertion point, whereas in the redesigned screen, the eye can easily follow the progression. Notice also that the real estate needed for three chest tubes on the revised form is only slightly larger than that needed for one chest tube on the old form.
Rule of Thumb 10: Omit Items for Which the Information Is Not Available to the User
Omit items that the user cannot complete, even if they are “grayed out” (meaning disabled or view only). Starting off with a “failure” is subconsciously upsetting to the user, especially at the top of the screen. An example is seen in Figure 9A, where the emergency department (ED) staff member completing this screen never knows the attending physician. Even though those data would be available later from another system, they were unnecessary on this form and were removed, as shown in Figure 9B.
Rule of Thumb 11: Put Things in Logical Order
Screens should represent the logical sequence of normal clinical workflow. One way to do this is to put later-occurring items later on the screens, or otherwise separate from earlier items. For example, in the original integumentary assessment screen, the order of the findings was illogical: pressure relief interventions came after pressure ulcers. So, in the new screen, routine skin assessment is followed by preventive pressure relief strategies, then by pressure ulcer assessment and treatment. In an ideal world, if you faithfully implement pressure relief strategies, there should be no need to document pressure ulcers at all; thus, scrolling to these later sections should rarely be needed.
On a more granular level, if an item is calculated from entered information, place it after the feeder data that drive the calculation. Note body mass index placement in both Figures 9A and B.
Category 4: Quality of information—the information entered into the EHR must be complete and accurate.
Rule of Thumb 12: Use Triggers to Guide Content Where Needed
A trigger is a selection that, when chosen, opens a screen to permit the entry of more detail. For example, when evaluating alcohol use, if the nurse checks “3–6 drinks per day,” more fields then appear asking whether the patient has a history of physical withdrawal, morning cravings, prior detox treatment, and so on. If these fields are not hidden, a novice nurse may ask a person who only occasionally has a drink these unnecessary and annoying questions.
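The trigger behavior can be sketched as a simple visibility rule. The choice labels and follow-up field names below are illustrative assumptions, not the actual build.

```python
# Sketch of a trigger rule: detailed alcohol-withdrawal questions stay hidden
# unless the screening answer crosses a threshold. Labels are illustrative.

DETAIL_FIELDS = ["history_of_withdrawal", "morning_cravings", "prior_detox"]

def visible_fields(alcohol_use: str):
    """Return the follow-up fields to display for a given screening answer."""
    triggering = {"3-6 drinks per day", "more than 6 drinks per day"}
    return DETAIL_FIELDS if alcohol_use in triggering else []
```

A patient who reports only occasional use never sees the detail questions; a heavier-use answer opens all of them at once.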
Rule of Thumb 13: Incorrect Information Is Worse Than Omitted Information
A common way to “control” performance is to make an item mandatory. Mandatory items can cause errors because situations arise where the required information is simply not available. Staff become very creative when they must somehow bypass a required field that is in the way of getting their work done. We used very, very few mandatory fields. Allergy entry is a hard stop for computerized provider order entry (CPOE), medication administration, and other major processes, so we made this mandatory. Other than that, vaccine history is required before final completion of the admission assessment, but these assessments can be answered with “unknown” or saved in a draft status. Height and weight were removed from mandatory status because, for example, our ED staff would have been forced to make some wild guesses on ambulance patients.
A perfect example of problems caused by mandatory fields was evident in our original radiology order screens that contained mandatory IV, oxygen, and mode of transportation indicators. Before CPOE, our unit secretaries entered this information, and while they could not accurately answer these questions, they were sensitive enough to the urgent timing and politics of healthcare operations to avoid asking anyone. As a result, if an ambulatory patient came in with a complaint of bone pain and the physician ordered a chest computed tomography, the patient was greeted by transport with an unnecessary stretcher, IV pole, and oxygen support. As we discovered, frustrated users may change workflows in unexpected and unwanted ways.
Use auditing and alerting if important data are not getting entered. Ask why. Offer the item on later screens if the information may present itself at another time. If you must include a mandatory item, make absolutely sure that the person who needs to complete that screen will always, always be able to complete it. Or offer an out such as “No information available at this time,” so a user can truthfully get past it.
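As a sketch, the policy above distinguishes a true hard stop (allergies) from a required field with an honest escape (vaccine history, answerable with “unknown”). The field names and rules here are simplified assumptions.

```python
# Sketch of finalization checks: a hard stop vs a required field that
# accepts "unknown" as a truthful answer. Field names are illustrative.

HARD_STOP = {"allergies"}                 # must have a value before finalizing
REQUIRED_WITH_OUT = {"vaccine_history"}   # "unknown" is an acceptable answer

def can_finalize(record: dict) -> bool:
    """An assessment may be finalized only when required fields are answered;
    a blank leaves it in draft status rather than forcing a guess."""
    for field in HARD_STOP | REQUIRED_WITH_OUT:
        if not record.get(field):
            return False
    return True
```

Note that the escape value keeps the data honest: a nurse who genuinely does not know is never forced to fabricate an entry just to pass validation.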
Rule of Thumb 14: Use Visual Cues for Important Information
If all the rules above have been implemented, most every item on your screens will be important. However, using visual cues can improve performance for necessary elements such as those needed to meet Joint Commission, Centers for Medicare & Medicaid Services (CMS), or other regulations. For example, when a urinary catheter is needed, it is imperative to collect other data so that workflows can remind the physician to order removal as soon as it is clinically appropriate. To draw the nurse’s attention to the need for additional data, we highlight the necessary information in orange as seen in Figure 10. Incidentally, these items are carried forward as well.
Rule of Thumb 15: Create Opportunities for Evidence-Based Practice
When possible, we provide easy links from the screen to and from online references as well as to internal policies and procedures. For example, for pressure ulcers, there are links to information about typing and staging of ulcers, including pictures and descriptions of each stage.
Rule of Thumb 16: Rethink the Documentation of Routine Activities
As one can imagine, this was an area that generated much heated discussion in the Informatics Council. In healthcare, we have adopted a “cover your backside” mentality when it comes to documentation of standard routines, defined here as things that you do for all patients regardless of their individual needs. Nurses and nursing assistants are often required to complete hourly forms indicating that they performed routine rounding, checked the patient’s ID band, confirmed the call bell was in reach, verified the safety equipment in the room, and so on. Another activity done on behalf of all patients is hand washing. Do we require the nurse to document this? As a profession, nurses need to carefully evaluate and speak out about the value of this type of documentation. Could the staff use the time it takes to complete all of this ponderous documentation better on behalf of the patient?
Consider whether this documentation really meets its intended goals (usually legal in nature), especially when it is being back charted at the end of the shift. Unlike on paper, in the electronic world the time of documentation is captured. Hourly documentation charted at the end of a shift can appear self-serving to a jury, or worse. One case the author was familiar with as Risk Manager at another hospital involved a patient who died after a fall in the middle of the shift; a nursing assistant had faithfully documented hourly rounding for 4 hours after the patient had passed away.
Our tendency to rely on the premise that “if it was not documented, it was not done” has been carried to extremes. We need to start questioning these tendencies, open up dialogues with our legal advisors, and set universal standards as a profession. Implement a culture of customer service and safety that empowers staff to work together to promote desired outcomes for patients. (RFID technology may soon bail us out on this issue by capturing actual staff visits to the patient room.)
After much soul searching in the council, and consultations with quality and risk management, we removed the original patient safety screen, which required that every individual safety item be checked, and placed a global statement for completion of routine Patient Safety items on the Body Systems Review page (shown near the top of Figure 4).
Category 5: Communication of information—interoperability for purposes of data retrieval and research is of key importance, but on a smaller scale, consider how the data are displayed in the EHR.
Rule of Thumb 17: Your Forms Are Not Just for the People Who Enter the Data, but Also for the People Who Look at It
Sometimes the clinicians involved in the design of data entry screens do not consider how the data will ultimately be displayed. One common example of this is overlooking the maximum characters for display. Usually with a little thought, problems can be avoided.
In Figure 11A, from the patient record view, the text box icons represent values that are too long to display. This could be indicative of an abnormality or simply the words “Within Normal Limits,” which were also too long. So the icon by itself tells you nothing. When you click on these icons, you may view the entire entry, but this would be a time-consuming approach if you are only interested in abnormal findings.
If the builder changes the descriptor from Within Normal Limits to “WNL” or the also popular WDL (Within Defined Limits), the view would change significantly as shown in Figure 11B. Thus, when the text box icon shows up, the end user would be alerted to the two areas that are abnormal—the cardiovascular and respiratory systems.
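The display problem in Figures 11A and B comes down to a fixed column width. This sketch assumes a hypothetical 12-character limit to show why shortening the descriptor changes what the icon conveys.

```python
# Sketch of a fixed-width flowsheet cell: long values collapse to an icon,
# so "Within Normal Limits" and an abnormal finding look identical.

MAX_DISPLAY = 12  # assumed column width, in characters

def cell(value: str) -> str:
    """Show the value inline if it fits; otherwise show a text-box icon."""
    return value if len(value) <= MAX_DISPLAY else "[...]"
```

With the original descriptor, `cell("Within Normal Limits")` collapses to the icon just as an abnormal finding does; with the abbreviation, `cell("WNL")` displays inline, so the icon is left to signal only the entries worth opening.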
Rule of Thumb 18: Standardize Data Terms
In the EHR, data have many uses beyond the care of one specific patient. Interoperability within one specific patient’s record has already been discussed, but the interoperability imperative extends to the ability of multiple clinicians, sites, healthcare systems, regions, or countries to exchange, study, and apply information for the betterment of patient care. Standardization of terminology across the world has to be the goal.
That being said, standardization is a struggle. Even within our organization, there was difficulty in getting consensus between individual clinicians on the naming conventions for items and descriptors. For example, it was surprising to find how different units ranked pulse qualities (one example is shown in Figure 12A). We standardized pulse qualities noted in Figure 12B after consulting the literature.4
Similarly, with extremity assessments, we found several different forms in use—all with different extremity strength descriptors. We checked references5 and used values noted in Figure 13.
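One practical step in consolidating forms like these is an explicit crosswalk from each unit's legacy descriptors to the agreed standard set. The sketch below shows the idea; the terms and the target scale are invented for illustration and are not the values in Figures 12 and 13.

```python
# Hypothetical crosswalk from unit-specific pulse-quality terms to one
# standardized list. All terms here are illustrative examples.
STANDARD_PULSE_QUALITIES = {"Absent", "Weak", "Normal", "Bounding"}

LEGACY_TO_STANDARD = {
    "No pulse": "Absent",
    "Thready": "Weak",
    "Faint": "Weak",
    "Regular": "Normal",
    "Strong": "Bounding",
}

def standardize(term: str) -> str:
    """Map a legacy descriptor to the standard term; flag unknowns for review."""
    if term in STANDARD_PULSE_QUALITIES:
        return term
    return LEGACY_TO_STANDARD.get(term, f"REVIEW: {term}")
```

A crosswalk of this kind also keeps historical documentation comparable with entries made on the redesigned forms, and the "REVIEW" flag surfaces terms the design team never reached consensus on.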
The easy way out would have been to allow each area to use its own values—“hiding” the values that another area might not use. It is best to resist this temptation and address the cultural, policy, and political issues upfront. It may be that nurses in a particular area are not up to date in certain practices. It is important to check references so that the data reflect best practices as well as standardization.
It is beyond the scope of this article to dig deeply into the topic of standardization; we can only express appreciation for the fine work being done in this area across the world and offer a few small examples of what must be enormous roadblocks.
Additionally, on a local level, it is helpful to consult stakeholders such as quality, risk management, and other departments that are struggling to produce metrics for regulatory purposes. The Joint Commission, CMS, and other regulatory bodies are very specific in how they want metrics collected and reported. As we move toward embedded analytics, standardization is a key component.
BENEFITS FROM CAREFULLY CONSIDERED SCREEN DESIGN
We have obtained many benefits from the redesign, using these rules as precepts. Quantitatively, the project reduced the data items (fields) on the admission and shift assessments from 669 to 354. Part of that reduction came from items moved to specialty forms, which are opened only when needed. The number of clicks saved was far greater still, considering the many items we changed from drop-downs to checkboxes.
The major qualitative benefit has been a complete turn-around in the attitude of our staff about computerization. The staff now approaches us frequently with requests to put more into the system. Areas and departments outside nursing that did not have electronic documentation have been actively working with us to document in the computer. The result is that our small department is inundated with the demands of our now interested users throughout the organization.
During time studies, we found that screens loaded much faster with the new, leaner forms. This resulted in improved system performance, another crowd pleaser. We also were able to improve workflows, alerts, and metrics through consultation with quality, risk, and infection control when we were building fields.
The impact from a productivity standpoint is often unrecognized and difficult to measure. Conceptually, multiply the time needed to read or perform an unnecessary entry or click by the number of times nursing staff address these items in a shift (the number of patients they have times the number of items per patient), and then multiply that by the number of staff shifts worked over a pay period; the result gives some idea of the time saved.
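The arithmetic described above can be made concrete with a back-of-the-envelope calculation. Every number below is hypothetical and chosen only to show the shape of the estimate, not to report our results.

```python
# Rough estimate of time saved per pay period; all inputs are assumed values.
seconds_per_item = 3          # time to read/perform one unnecessary entry or click
items_removed_per_patient = 20  # fields eliminated from the shift assessment
patients_per_shift = 5          # typical nursing assignment
shifts_per_pay_period = 700     # total staff shifts across the units

seconds_saved = (seconds_per_item * items_removed_per_patient
                 * patients_per_shift * shifts_per_pay_period)
hours_saved = seconds_saved / 3600
print(f"Approximately {hours_saved:.0f} nursing hours saved per pay period")
```

Even with modest per-click savings, the multiplication across patients, shifts, and staff yields a substantial total, which is why small reductions in documentation burden compound so quickly.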
The impact on quality of care is also difficult to define. In concept, the less time spent documenting, the more time nurses can observe and interact with patients. But it would be a challenge for researchers to isolate all the variables and create a research design leading to a cause-and-effect analysis. Our hospital is currently considering a research study looking at the impact of point-of-care documentation on patient satisfaction.
These suggestions are not intended as hard-and-fast rules but rather as considerations when designing screens. There are times when one rule conflicts with another, or a discipline feels discounted by a change, and the price becomes too high. One example of the latter involved the nursing assistants. Although in most cases the documentation of routine activities needs to be reduced, when the nursing assistants discovered there was no item for documenting their hourly customer service rounding, they felt as if their contribution was not valued. Hence, we added a rounding indicator to the ADL form as a way to recognize their contribution to patient care.
Redesign of electronic documentation screens is an enormous effort for any organization. With adequate resources and lots of input from all stakeholders, the final product will enhance usability and provide significant benefits for staff, physicians, and patients.
The author thanks her fellow members of the core redesign team composed of Debbie Hetrick, BSN, RN, Pat Friedman, MSN, RN-BC, and Anne Satterthwaite, BSN, RN-BC, who worked with the author to create the new proposed designs for Council review; the Staff Informatics Council members for their conscientious feedback and support; Jean Anderson, Siemens UX analyst, for her assistance with screen images and suggestions; and Linda Q. Thede, PhD, RN-BC, and Dr Richard Donze for their editorial assistance. She also thanks CentraState Healthcare System, Riverside Hospital, Med Central Hospital, and Main Line Health Hospitals for offering their own documentation screens for our review and consideration.
The author did not receive funding in conjunction with writing this article.
The author has disclosed that she has no significant relationships with, or financial interest in, any commercial companies pertaining to this article.