REGULAR FEATURES

Value-based Healthcare: Measuring What Matters—Engaging Surgeons to Make Measures Meaningful and Improve Clinical Practice

Winegar, Angela L. PhD; Moxham, Jamie MSPH; Erlinger, Thomas P. MD, MPH; Bozic, Kevin J. MD, MBA

Clinical Orthopaedics and Related Research: September 2018 - Volume 476 - Issue 9 - p 1704-1706
doi: 10.1097/CORR.0000000000000406

The long-standing management tenet, “what gets measured gets managed,” is likely influenced by the Hawthorne effect, which holds that individuals modify their behavior when they are aware that they are being monitored. With bundled-payment programs like the Comprehensive Care for Joint Replacement model, Bundled Payments for Care Improvement, and the Quality Payment Program already underway, we believe that these initiatives have generated a Hawthorne effect among surgeons, who are now well aware that they are being observed. Linking reimbursement with both cost and outcomes provides surgeons with a motivation to improve, and the associated measurement (by payers and health systems) gives orthopaedic surgeons visibility into opportunities to deliver higher-value care to our patients [6].

While interest in, and dependence on, robust and reliable measurement is steadily increasing to support the transition to fee-for-value reimbursement, the field of healthcare analytics is still nascent; we are in the early stages of learning how to translate “big data” into meaningful and actionable information that improves clinical outcomes [3]. Successful development and dissemination of clinical and operational measures depends on both a structured surgeon-engagement model and a robust metric-definition process.

Previous authors have addressed the need to increase surgeons’ awareness of measures that reflect their financial, operational, and clinical performance, allowing surgeons to compare their results to peers and to national benchmarks [4, 5]. Still, healthcare organizations often underestimate the effort required to generate and disseminate the meaningful data needed to achieve broad-based changes in practice; many assume that one or two meetings and a series of PDFs posted on a bulletin board will transform clinical practice. Advances in healthcare analytics and the vast amount of clinical data available in electronic health records give hospitals the opportunity to generate extensive clinical dashboards, but doing so requires a substantial investment in human and financial resources [7]. As part of a broader program to create a learning health system within a community hospital system, Dell Medical School and Ascension’s Seton Healthcare Family in Austin, TX, USA collaborated to develop a structured-engagement model in which influential clinical champions, engaged throughout the development process, work with their data-analytics partners to create and distribute data to clinicians. Surgeons who work within this structured-engagement model, we believe, will have a vested interest in using meaningful data to achieve clinical transformation.

Clinical Leadership at the Top

A central feature of our proposed engagement model is a strong clinical chair who can serve as a liaison between the surgeon community, the hospital’s administrative leadership, and data analysts. In this role, the surgeon-champion should be willing to engage in frequent dialogue with data analysts to become conversant on issues associated with data availability so that (s)he can communicate the strengths and weaknesses of different measurement approaches to his or her colleagues. Additionally, the surgeon-champion should be a model user of the data and reports as they become available, so that (s)he can demonstrate the desired change. In this way, surgeons lead the process rather than serve as passive recipients of it, and they create a culture of mutual accountability and collaborative practice development.

Following the technology acceptance model, which is used to predict and explain how end-users (such as clinicians) react to and adopt technology [2], the surgeon-champion may engage a small number of like-minded colleagues to form a physician-led, multidisciplinary council to lead the adoption of analytic dashboards. While this council will serve multiple purposes, in the context of measure development it offers a forum to select the outcome and process measures of greatest interest and to approve measure definitions as the requirements are developed. This group can also discuss and prioritize modifications as needs arise. Once reports are created, this forum provides the vehicle through which surgeons can compare their performance with that of their colleagues.

Although we encourage showing surgeons’ names alongside their results (unblinded), even surgeon-specific results presented in a blinded fashion give participants vital visibility into the relationship between process changes and outcomes. As with any clinical committee, all members (including administrators) must remember to strike a balance between enforcing standardized clinical practice and recognizing each surgeon’s autonomy to make the best decisions on behalf of his or her patients.

Defining Meaningful and Actionable Measures

The second factor that we propose for successful measure development and dissemination involves measure selection and definition. It is critical that surgeons and their clinical teams have access to a breadth of clinical outcomes and process measures that reflect the status of their clinical practice and identify areas for improvement.

Most programs start by identifying key outcome measures, which reflect the status of the patient following treatment. Hospitals and surgeons together should focus on complication rates, readmission rates, patient-reported outcomes, and patient mortality. With progressive gainsharing and comanagement agreements on the horizon [1], surgeons are joining hospitals in taking an interest in improving the value of care they provide to their patients.

Process measures are developed to track compliance with evidence-based care pathways. When new protocols or clinical pathways are implemented within a hospital, process measures can show the rate of adoption of these new standards. Process measures reflect changes in clinical practice, such as early mobility within 4 hours after a total joint arthroplasty procedure, compliance with a preoperative risk assessment, or appropriate use of venous thromboembolism prophylaxis. Reviewing clinical outcomes and process measures in tandem provides confidence that changes in clinical care pathways are not adversely affecting patient outcomes.
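
To make this concrete, the minimal sketch below shows how a process measure and an outcome measure might be summarized in tandem from a case-level data extract. The table layout, column names (such as mobilized_within_4h and readmitted_30d), and the quarterly grouping are illustrative assumptions rather than a description of any particular hospital’s data model.

```python
import pandas as pd

# Hypothetical case-level extract; column names and values are illustrative assumptions.
cases = pd.DataFrame({
    "case_id":             [1, 2, 3, 4, 5, 6],
    "quarter":             ["2018Q1", "2018Q1", "2018Q1", "2018Q2", "2018Q2", "2018Q2"],
    "mobilized_within_4h": [True, False, True, True, True, False],    # process measure
    "vte_prophylaxis":     [True, True, True, True, False, True],     # process measure
    "readmitted_30d":      [False, True, False, False, False, False], # outcome measure
})

# Review process and outcome measures in tandem, by quarter: rising pathway
# compliance should not coincide with worsening outcomes.
summary = cases.groupby("quarter").agg(
    cases=("case_id", "count"),
    early_mobility_rate=("mobilized_within_4h", "mean"),
    vte_prophylaxis_rate=("vte_prophylaxis", "mean"),
    readmission_rate=("readmitted_30d", "mean"),
)
print(summary.round(2))
```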

Measure selection starts with a high-level understanding of the desired outcomes and clinical process changes. Following that, a robust definition process should be undertaken to ensure the measures selected are meaningful and actionable. Each measure should reflect a specific population, with designated inclusion and exclusion criteria that create a more homogeneous patient group to evaluate over time and across facilities and/or providers. Populations are most easily defined using administratively coded data (such as diagnosis-related groups, procedure codes, or diagnosis codes), but clinical data from the electronic health record can also be used if there is confidence that they are routinely and consistently documented. Each measure should cover a timeframe long enough to generate sufficient data that results are not skewed by outliers. Additionally, the definition should document how the measures will be filtered or stratified by patient demographic information (such as age, gender, or payer) or procedural information (for example, trauma status or type of procedure). If possible, the measure should account for differences in patient risk within the population. One simple risk-stratification approach is to report the average Charlson Comorbidity Index as a gauge of overall patient complexity.
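
The sketch below illustrates, under stated assumptions, how such a definition might be operationalized: an inclusion list of diagnosis-related groups, an exclusion criterion (trauma), a fixed measurement window, stratification by payer, and the average Charlson Comorbidity Index as a simple descriptor of patient complexity. The DRG codes, field names, and data values are hypothetical placeholders; a real definition would use the codes and fields the clinical team approves.

```python
import pandas as pd

# Hypothetical encounter-level extract; fields, codes, and values are illustrative only.
encounters = pd.DataFrame({
    "drg":            ["470", "470", "469", "470", "470", "999"],
    "trauma":         [False, False, False, True, False, False],  # exclusion criterion
    "discharge_date": pd.to_datetime(["2018-01-10", "2018-02-02", "2018-03-15",
                                      "2018-03-20", "2018-04-05", "2018-04-07"]),
    "payer":          ["Medicare", "Commercial", "Medicare", "Medicare", "Commercial", "Medicare"],
    "charlson_index": [2, 0, 4, 1, 1, 3],                         # Charlson Comorbidity Index
    "readmitted_30d": [False, False, True, False, False, False],
})

# Inclusion criteria (example DRGs for major joint replacement), exclusion of trauma
# cases, and a fixed measurement window define a more homogeneous population.
population = encounters[
    encounters["drg"].isin(["469", "470"])
    & ~encounters["trauma"]
    & encounters["discharge_date"].between("2018-01-01", "2018-06-30")
]

# Stratify by payer; report case counts, the outcome of interest, and the average
# Charlson Comorbidity Index as a simple descriptor of patient complexity.
report = population.groupby("payer").agg(
    cases=("drg", "count"),
    readmission_rate=("readmitted_30d", "mean"),
    mean_charlson=("charlson_index", "mean"),
)
print(report.round(2))
```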

Through the surgeon-engagement structure proposed above, it is important that the surgeons and clinical team approve the final definition of each measure, which gives every member of the team a sense of ownership of that definition. The time required for measure definition can be shortened considerably when the clinical team is engaged and communicative.

Cohesiveness within the Hospital

Successful measure definition is best achieved through collaboration among clinical and administrative team members. As stewards of their patients’ care, surgeons have insight into the outcomes that matter most to patients, but hospital and practice administrators understand the regulatory and financial benefits of improved performance and should also be included. It is also important that the broader clinical team participate in measure selection so that they are aligned with the need for improved documentation. Additionally, clinical staff, medical informaticists, and the quality team help identify the electronic health record fields needed for measure calculation. Finally, data analysts experienced with clinical-effectiveness measures should be included so they can translate clinical pathways into process metrics and help the clinical team understand the nuances of different measure-definition methods.

A strong surgeon-engagement model with a designated surgeon-champion is foundational to any effort to create measures that are meaningful for patients, hospitals, and surgeons. With a strong engagement model in place, it is in surgeons’ interest to devote time up front to collaborate with other clinical team members and administrators on measure definition, generating valuable measures that they can trust and use to incrementally and continuously improve the value of the care they deliver to their patients.

References

1. Bushnell BD. Co-management agreements in orthopedic surgery. Am J Orthop. 2015;44:E167–E172.
2. Holden RJ, Karsh B. The technology acceptance model: Its past and its future in health care. J Biomed Inform. 2010;43:159–172.
3. Krumholz HM. Big data and new knowledge in medicine. The thinking, training, and tools needed for a learning health system. Health Aff. 2014;33:1163–1170.
4. Lee VS, Kawamoto K, Hess R, Park C, Young J, Hunter C, Johnson S, Gulbransen S, Pelt CE, Horton DJ, Graves KK, Greene TH, Anzai Y, Pendleton RC. Implementation of a value-driven outcomes program to identify high variability in clinical costs and outcomes and association with reduced cost and improved quality. JAMA. 2016;316:1061–1072.
5. Leyton-Mange A, Andrawis J, Bozic KJ. Value-based healthcare: A surgeon value scorecard to improve value in total joint replacement. Clin Orthop Relat Res. 2018;476:934–936.
6. Porter ME, Lee TH. The strategy that will fix healthcare. Harvard Business Review. Available at: https://hbr.org/2013/10/the-strategy-that-will-fix-health-care. Accessed June 8, 2018.
7. Schilling PL, Bozic KJ. The big to do about “big data.” Clin Orthop Relat Res. 2014;472:3270–3272.
© 2018 by the Association of Bone and Joint Surgeons