Academic Medicine: March 2011 - Volume 86 - Issue 3
doi: 10.1097/ACM.0b013e318209346e
Quality Improvement

Building a Departmental Quality Program: A Patient-Based and Provider-Led Approach

Szent-Gyorgyi, Lara E. MPA; Coblyn, Jonathan MD; Turchin, Alexander MD, MS; Loscalzo, Joseph MD, PhD; Kachalia, Allen MD, JD


Author Information

Ms. Szent-Gyorgyi is quality program manager, Department of Medicine, Brigham and Women's Hospital, Boston, Massachusetts.

Dr. Coblyn is vice chair, Integrated Clinical Services, Department of Medicine, Brigham and Women's Hospital, and associate professor of medicine, Harvard Medical School, Boston, Massachusetts.

Dr. Turchin is senior medical informatician, Clinical Informatics Research and Development, Partners HealthCare, and assistant professor of medicine, Harvard Medical School, Boston, Massachusetts.

Dr. Loscalzo is chair and physician-in-chief, Department of Medicine, Brigham and Women's Hospital, and professor of medicine, Harvard Medical School, Boston, Massachusetts.

Dr. Kachalia is medical director, Quality and Safety, Brigham and Women's Hospital, and assistant professor of medicine, Harvard Medical School, Boston, Massachusetts.

Correspondence should be addressed to Dr. Kachalia, Brigham and Women's Hospital, 75 Francis Street, Boston, MA 02115; telephone: (617) 525-7277; fax: (617) 738-6732; e-mail: akachalia@partners.org.

First published online January 18, 2011

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A42.


Abstract

Quality improvement in health care today requires a comprehensive approach. Improvement efforts led by patients, payers, regulators, or health care providers face many barriers. Obstacles include selecting measures with clinical value, building physician acceptance, establishing routine and efficient measurement, and resolving competing clinical demands and work flow impediments. To meet these challenges, the Brigham and Women's Hospital Department of Medicine created a grassroots quality program guided by four main principles: improvement is led by frontline clinicians who select measures important to their patients, performance measurement is automated and accurate, appropriate resources are provided, and interventions are system based and without financial incentives for individual providers.

The quality program has engaged the department's physicians from the start. Given the flexibility to define their own metrics according to their patients' needs, clinicians have selected measures related to prevention and wellness, which are often based on national standards. The central quality team facilitates measurement and reporting while providers focus on patient care. The subsequent production of meaningful, actionable data has been instrumental in building physician acceptance and in providing clinicians the opportunity to evaluate and monitor performance. The program's largest challenges have been in capturing meaningful data from electronic systems. The program's system-based focus encourages providers to develop solutions within the existing framework of clinic resources, primarily targeting work flows and processes, while minimizing large expenditures on additional staffing.

Quality efforts in health care today demand increasingly comprehensive and rigorous approaches to measurement and improvement.1 The drive to improve quality is shared by those within the profession, as well as patients and payers. In addition, the U.S. government seeks to spur further improvements in quality as part of its health care reform efforts.2,3 The focus of improvement efforts often resides in three aspects of care delivery: organizational infrastructure, process of care, and clinical outcomes.4

Despite the push for higher-quality care, many hurdles to effective quality efforts remain.5–9 Too many external mandates may dilute the importance of each area selected for improvement, and yet, paradoxically, each mandate may address only a limited patient population or a small part of a patient's care. Physicians may be skeptical of the clinical value of externally selected measures, consistent with a belief that public reporting of poorly designed quality measures can have a negative impact on care.10–13 Furthermore, physicians and other health care professionals often regard the collection and reporting of quality measures as a burden on time and financial resources.

With these challenges in mind, the Department of Medicine (DOM) at Brigham and Women's Hospital (BWH) sought to build a grassroots quality program that would fit within an academic medical center's (AMC's) tripartite mission of excellence in clinical care, research, and education. We were mindful of AMC providers' multiple responsibilities, but we were also aware of their common goal: to deliver the highest-quality care. In developing the program within the existing departmental infrastructure, we were guided by four main principles: (1) improvement should be led by frontline clinicians selecting measures most important to their patients, (2) performance measurement should be automated and accurate, (3) appropriate resources are necessary to support quality efforts, and (4) interventions should be system-, and not provider-, focused. We also informed providers that financial incentives would not be tied to these improvement efforts unless they opted to do so.

In this article, we describe the structure of the BWH DOM quality program, our progress to date, and next steps. We share the challenges we encountered during initial implementation, how we have addressed them, and the insights we have gained. We hope that our program can offer others a departmental model that builds on clinicians' inherent interest in the quality of patient care by providing data collection and intervention assistance through a central team without the use of any external mandates or financial incentives.


Building a Quality Program

Organizational context

BWH is a teaching affiliate of Harvard Medical School and one of the two founding AMCs of Partners HealthCare System in Boston, Massachusetts. The DOM, which has about 1,200 physician faculty members, is the largest clinical department at BWH.

DOM faculty members are typically involved in all facets of the AMC mission of excellence in clinical care, research, and education. Accordingly, we developed the quality program to allow for their engagement in each aspect. Although the program's primary focus is on improving the quality of care, its structure facilitates study and evaluation of reasons for gaps in quality as well as the efficacy of interventions. The program's structure also lends itself to, and encourages, hands-on trainee education and involvement in quality projects.

Although specialty-based pay-for-performance (P4P) contracting is not an intrinsic element of the program, it is another factor in the AMC's environment. The quality program's structure is flexible so that each division's clinical leadership can determine if and when it is appropriate to use the program's resources to support P4P efforts.

Program development and design

We started the DOM quality program, which is funded solely by the department, in 2007. The quality program is structured to provide centralized coordination, monitoring, support, and reporting services for the department's clinical divisions. The central quality team staff, which initially included a part-time medical director (20% effort) and a full-time program manager, leads the design, evaluation, and analysis of the metrics and facilitates process improvements within each division. The DOM's vice chair for integrated clinical services serves as the program's executive sponsor.

Since its inception, the DOM quality program has been anchored by its four principles:

1. The program is driven by clinicians focused on patient care. To maximize clinical value (and to avoid potential reservations related to “mandated” metrics), each division's leadership selects metrics they believe are important to the care of their patients. Ideally, the metrics are evidence based and/or tied to national guidelines. The initially selected metrics are provided in Table 1.


2. Measurement is automated and accurate. To allow for quick and repeated measurement in support of rapid improvement cycles, all metric data elements are captured electronically. When necessary, work flow and/or system changes are made to enable electronic data capture.

3. Appropriate resources are provided. The central quality team was created to provide the divisions with resources and expertise. This enables the central team to build expertise in metric development and data collection and reporting while division leadership concentrates on clinical issues.

4. There is a system focus. Because the quality program's focus is on care provided to patients, the metrics are designed to measure a practice's, rather than an individual physician's, performance. There are no physician-level incentives or penalties. This system focus is consistent with our philosophy that providers are inherently interested in the quality of the care they provide and that better support systems will optimize improvements.

Key features

The central quality team works collaboratively with division representatives throughout the measurement and improvement process with the goal of minimizing the investment of clinical providers' time in administrative activities. The process contains several design elements that take metrics from conception to execution via a four-part implementation plan. Figure 1 illustrates the phases of metric development and deployment.

Electronic data capture

The quality program's requirement that all data abstraction be automated eliminates the need for chart review, with the goals of reducing the time and resources needed for data collection, improving the consistency of collected data, and increasing the ability to report data regularly. We anticipate that the resulting “push button” access to reports will have multiple downstream effects. Easy access to the data without chart reviews should increase divisions' confidence in and use of the data, as well as lead to deeper involvement and greater ownership of the metrics.

Metric specifications

The reliance on electronic data capture necessitates the development of accurate, precise, and reliable specifications for each metric. Detailed documentation of the metrics helps ensure that the data are extracted to the expectations of the division, are consistent across reporting periods, and can be adopted by other divisions or institutions.

Using the metric specification, the central quality team generates preliminary data and then vets those data with clinical leadership for technical (i.e., that the data were abstracted as specified) and clinical accuracy. The metric specifications are then adjusted per the division's suggestions; ultimate approval rests with the division. Once the final specifications are approved, the central team obtains baseline performance data that the divisional leadership uses to set performance goals. (An example of a metric specification is provided in Supplemental Digital Appendix 1, http://links.lww.com/ACADMED/A42.)
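
To make this concrete, the sketch below shows one hypothetical way a metric specification might be represented as structured data. The field names, codes, and default values are our own illustration, not the program's actual format (which appears in Supplemental Digital Appendix 1).

```python
from dataclasses import dataclass, field

@dataclass
class MetricSpecification:
    """A hypothetical, machine-readable metric specification (illustrative only)."""
    name: str                          # human-readable metric name
    division: str                      # clinical division that owns the metric
    numerator_criteria: str            # what counts as meeting the metric
    denominator_criteria: str          # who is eligible for the metric
    eligible_icd9_codes: list = field(default_factory=list)  # billing codes defining eligibility
    reporting_period_months: int = 12  # measurement window
    goal_percent: float = 90.0         # division-set performance goal

# Example instance; every value here is invented for illustration.
flu_vaccination = MetricSpecification(
    name="Influenza vaccination rate",
    division="Rheumatology",
    numerator_criteria="Influenza vaccine documented in the EHR this season",
    denominator_criteria="Patients billed by the division with a qualifying diagnosis",
    eligible_icd9_codes=["714.0"],     # ICD-9 code for rheumatoid arthritis
)
```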

Numerator/denominator definitions

To produce percentage-based performance data, the central quality team needed to generate a numerator and denominator for each metric. The numerator represents the number of patients receiving or achieving a desired treatment or outcome, and the denominator represents the population targeted for that treatment or goal. Most of the numerator data—whether for a clinical process (e.g., vaccine administration) or an outcome (e.g., patient blood pressure)—are readily available within the BWH electronic health record (EHR) data repository. For some metrics for which numerator data were not reliably recorded, the central team has worked with divisional leadership to gain commitment to adjust work flows to record data in a standardized and extractable location in the EHR.

Identifying the population for the denominator (i.e., patients eligible for the metric) proved more challenging because not all qualifying clinical diagnoses were listed in an electronically extractable location within the EHR. To address this problem, we developed a method to combine billing data with available clinical data. With assurance from clinical leadership that physicians bill very accurately for their specialty-specific diseases, the central team identified the standard CPT and ICD-9 billing codes that could be used to identify the correct patient populations and then confirmed with division representatives that the codes were correct. To maintain accuracy, use of the billing data is restricted to codes billed within the relevant division (e.g., to identify the correct population of patients with chronic kidney disease, we use only the renal clinic's billing).
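
As a minimal sketch of this logic, the following Python function computes a practice-level rate from hypothetical billing and clinical extracts, restricting the denominator to the relevant division's own billing as described above. The data shapes and names are assumptions for illustration; the actual program obtains these data through QDM reports.

```python
def practice_metric_rate(billing_rows, clinical_events, division,
                         eligible_codes, target_event):
    """Compute a practice-level performance percentage.

    billing_rows:    iterable of (patient_id, billing_division, icd9_code) tuples
    clinical_events: iterable of (patient_id, event_name) tuples from the EHR repository
    division:        division whose billing defines eligibility (e.g., "Renal")
    eligible_codes:  billing codes that qualify a patient for the metric
    target_event:    EHR event satisfying the numerator (e.g., "flu_vaccine_given")
    All names and data shapes are hypothetical.
    """
    # Denominator: patients the relevant division itself billed with a qualifying code
    denominator = {pid for pid, div, code in billing_rows
                   if div == division and code in eligible_codes}
    # Numerator: eligible patients with the desired care documented in the EHR
    numerator = {pid for pid, event in clinical_events
                 if event == target_event and pid in denominator}
    return 100.0 * len(numerator) / len(denominator) if denominator else None
```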

Collaboration with Information Systems

After identifying the required data elements, the quality team worked with Information Systems (IS) to ensure the feasibility of electronic data capture. A fundamental first step was to facilitate an agreement between the Brigham and Women's Physicians Organization (BWPO), which houses the physician billing data, and the Quality Data Management group (QDM), which oversees the Partners HealthCare data warehouse that houses the EHR clinical data, to combine billing and clinical data for our improvement purposes. The BWPO now supplies necessary billing data to QDM, which combines the billing data with clinical data to generate the requested data reports. The transfer of billing data from the BWPO to QDM has been automated to support routine measurement.

Performance improvement

After divisions finalize their metric specifications, they develop and implement quality improvement strategies. Improvement opportunities exist on two levels, documentation and clinical performance, each of which is critical to the delivery of high-quality care. Improved documentation can affect performance rates by simply demonstrating (and allowing for electronic measurement) that the necessary care was indeed provided. Clinical performance improvement efforts rely on administrative and clinical staff at all levels. The central quality team meets regularly with each division to review data and to assist with the development and monitoring of improvement strategies.


Challenges and Lessons

In the DOM quality program's first two years of operation (2007–2009), we addressed many of the quality improvement challenges that emerged; most were related to electronic data capture. Table 2 highlights the program's milestones and describes the strategies taken to overcome obstacles and the lessons learned. Here, we describe our findings with regard to our four core principles.

Driven by clinical leadership

Engaging physicians is critical for the success of quality efforts. Developing measures that physicians value may encourage their engagement.14–16 We addressed this potential barrier by building a program that is led by local clinical leadership. Divisional leaders identified metrics they deemed clinically important, of value to patients, measurable, amenable to improvement, and (ideally) tied to evidence-based standards. Once possible metrics were identified, metric selection was generally finalized after discussion at a divisional faculty meeting.

Each participating clinical division quickly selected a metric and created an initial definition of the specifications. Given the flexibility to define their own metrics according to their patients' needs, most divisions selected measures related to prevention and wellness that had the potential to control costs and minimize inpatient hospital stays; selected metrics often were measures for which there were national standards (Table 1). Providers also have been very engaged with metric refinement and development of improvement strategies. With easy measurement now at their fingertips, several divisions are studying the effects of their intervention efforts.

Ease of measurement

Historically, a common source of quality data has been insurance claims.17 Much progress has been made in the use of claims data, but they are frequently limited in scope and application. The advent of EHRs has created expectations of greater data availability for quality efforts.18 Although research on extending the use of EHR data to specific patient populations is still in the early stages, adjusting either the data input processes or the clinic work flow to allow for more comprehensive data capture may have a positive impact on the ability to use an EHR for quality improvement.8,19

With assistance from IS, QDM, and the BWPO, our central quality team has successfully identified ways to capture data for almost all of the identified metrics, but this effort has not been without its challenges. Achieving quick and easy measurement has required flexibility in data capture. Furthermore, the data required by the various metrics span the entirety of the hospital's multileveled IS structure. The quality team worked through many obstacles, including QDM's limited access to certain data elements and difficulties in the systematic recording of data. For some metrics, these obstacles affected timely availability of initial data.

Our reliance on electronic data capture is challenged by how data are entered and extracted. Data entry challenges arise when no logical field exists in the EHR for a provider to record and store relevant information (e.g., asthma control test scores, externally collected lab results) or when an electronic field is available for use but has not been routinely used (e.g., vaccinations administered at another facility, blood pressure measurement). Solving the data entry problem requires collaboration with IS to create new fields or methods for data entry. In cases in which an electronic field exists but is not used, clinical work flow redesign is required. Data extraction challenges occur when data are not available for reporting because they are entered into the record in a nonstructured field or are not available to QDM within the current IS infrastructure. We have requested IS enhancements to address the data extraction problem. Through this experience, we have witnessed the critical importance of specifying how data will be captured and accessed for quality reporting during (and not after) the design of any new clinical process or electronic data system, and now advocate accordingly.
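
The data entry and extraction cases above amount to a small decision table, which the sketch below renders in Python. The categories and remediation strings paraphrase the text; they are not part of the program's tooling.

```python
def remediation_path(field_exists: bool, field_used: bool, extractable: bool) -> str:
    """Map a data element's capture status to the fix described in the text."""
    if not field_exists:
        # Data entry problem: no logical place in the EHR to record the element
        return "work with IS to create a new field or entry method"
    if not field_used:
        # Data entry problem: a field exists but is not routinely used
        return "redesign the clinical work flow to use the existing field"
    if not extractable:
        # Data extraction problem: entered in a nonstructured field or unavailable to QDM
        return "request an IS enhancement so the element can be reported"
    return "element is ready for automated measurement"
```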

Appropriate resources

Despite increased recognition of the need for quality improvement initiatives, such efforts can be underresourced, particularly at the staff and data management levels. Some institutions regard appointing physician leadership as sufficient support for a quality program, but that can result in programs without appropriate structures or the dedicated resources necessary to develop, implement, and evaluate interventions.20

Our program's structure allows the central program manager to bring the expertise and resources available throughout the DOM to the division level. One marker of success has been divisions reporting that they have selected metrics they had considered administratively too challenging to pursue before the program's creation. To date, one of the program manager's main functions has been to obtain the necessary data, capitalizing on the department-level focus on quality to effect results. The effort invested in obtaining required data for one division can pay off in economies of scale when that division's metric or improvement solution can also be applied to other divisions.

Time and patience have also been essential to the program's growth. The central team's initial investment over the first two years of the program has been critical to building the foundation. That time has allowed the divisions to establish meaningful metrics with actionable data and has given the program manager the freedom to pursue all avenues to overcome obstacles, especially with regard to data acquisition.

System focus

Our system focus encourages divisions to develop quality improvement efforts within the existing framework of their clinical resources. We encourage divisions to primarily target work flows and processes to minimize large expenditures on additional staffing. We have found that this approach has created initiatives that do not place sole responsibility for improvement on any single provider. For example, in divisions seeking to improve vaccination rates, the clinics have set up a process using standing orders so that nurses and medical assistants can now automatically provide vaccinations to patients who come for a visit.

Even though data can be generated at the individual provider level, the DOM quality program focuses on solutions at the clinic level. In addition, no individual incentives or penalties are tied to the metrics. We recognize that P4P has emerged throughout the health care industry as a popular path to achieving quality improvement and cost control.21–23 Although some evidence has shown improved outcomes, P4P may not be without its pitfalls.24–27 Our quality program garners the desired attention from our clinicians without the need to provide additional financial incentives, consistent with our belief that operational support can go a long way toward improving quality.

Education

Educating the next generation of providers to lead future quality improvement efforts is crucial.28–32 A natural extension of our program has been to invite residents and fellows interested in quality to participate in defined projects. They are able to take a clinical or administrative leadership role in helping to execute many of the program's activities and work closely with the divisions throughout the cycle of a metric. One resident took charge of data analysis and distribution as well as presentations to the division faculty. He also successfully applied for an internal grant to support the quality efforts. These collaborations have been very successful, and we look forward to working with trainees on a regular basis and providing them with practical experience in quality improvement.


Next Steps

We continue to pursue IS enhancements. Although we have created some temporary solutions to support our quality efforts, sustainable long-term solutions to improve work flow and data collection require IS adjustments. Unsurprisingly, IS-related challenges are likely to be perennial as divisions continue to select new metrics and previously unanticipated measurement needs arise.

In the longer term, we anticipate tracking the results of implemented improvement strategies and sharing them across the DOM. Once performance goals for individual metrics have been met, we will work with divisions to identify iterations of current or new metrics for the next cycle of improvement.

We also look forward to taking proven solutions and implementing them across the DOM where applicable (e.g., blood pressure or vaccination goals). In addition, one division's successful approach may affect another division's selection of future metrics, providing further opportunity to demonstrate the effectiveness of an improvement strategy.

Our ultimate goal is to expand the program across the Partners system, to the DOM's counterparts at other Partners hospitals. This collaboration would strengthen the program as a whole, improve patient care, and provide more data for setting benchmarks in selected metrics.


Conclusion

Producing meaningful, actionable data remains an essential goal as AMCs prepare for a future in which greater emphasis is placed on quality measurement and improvement. The BWH DOM quality program has been designed to put quality management principles into action, specifically by selecting metrics that are clinically relevant to providers, facilitating electronic data collection, involving all clinic staff in improvement activities while providers focus on patient care, and setting performance goals without financial incentives. Now that we have established a program that enjoys provider engagement and employs a measurement infrastructure, the divisions will continue to design and implement improvement efforts aimed at improving the system in which our physicians practice.


Acknowledgments:

The authors thank David McCready and Christine Imperato for their valuable support of the DOM Quality Program.


Funding/Support:

None.


Other disclosures:

None.


Ethical approval:

Not applicable.


References

1. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.

2. Mahon M, Greep AJ, Campbell K. New analysis: Congressional health care reform proposals would offer coverage to many without insurance; plan to cover the uninsured through Medicare reduces health care spending by $58 billion in 2010. http://www.commonwealthfund.org/Content/News/News-Releases/2009/Jan/New-Analysis–Congressional-Health-Care-Reform-Proposals-Would-Offer-Coverage-to-Many-Without-Insura.aspx. Accessed November 22, 2010.

3. Evans DB, Tan-Torres ET, Lauer J, Frenk J, Murray CJL. Measuring quality: From the system to the provider. Int J Qual Health Care. 2001;13:439–446.

4. Glickman SW, Baggett KA, Krubert CG, Peterson ED, Schulman KA. Promoting quality: The health-care organization from a management perspective. Int J Qual Health Care. 2007;19:341–348.

5. McNeil BJ. Hidden barriers to improvement in the quality of care. N Engl J Med. 2001;345:1612–1620.

6. Chassin MR, Galvin RW; National Roundtable on Health Care Quality. The urgent need to improve health care quality: Institute of Medicine National Roundtable on Health Care Quality. JAMA. 1998;280:1000–1005.

7. Amalberti R, Auroy Y, Berwick D, Barach P. Five system barriers to achieving ultrasafe health care. Ann Intern Med. 2005;142:756–764.

8. Pronovost PJ, Miller MR, Wachter RM, Meyer GS. Perspective: Physician leadership in quality. Acad Med. 2009;84:1651–1656. http://journals.lww.com/academicmedicine/Fulltext/2009/12000/Perspective__Physician_Leadership_in_Quality.9.aspx. Accessed November 11, 2010.

9. Auerbach AD, Landefeld CS, Shojania KG. The tension between needing to improve care and knowing how to do it. N Engl J Med. 2007;357:608–613.

10. Blumenthal D, Epstein AM. Quality of health care: Part six of six. N Engl J Med. 1996;335:1328–1331.

11. Casalino LP, Alexander GC, Jin L, Konetzka RT. General internists' views on pay-for-performance and public reporting of quality scores: A national survey. Health Aff (Millwood). 2007;26:492–499.

12. Casalino LP, Elster A, Eisenberg A, Lewis E, Montgomery J, Ramos D. Will pay-for-performance and quality reporting affect health care disparities? Health Aff (Millwood). 2007;26:405–414.

13. Snyder L, Neubauer RL. Pay-for-performance principles that promote patient-centered care: An ethics manifesto. Ann Intern Med. 2007;147:792–794.

14. Caverzagie KJ, Bernabeo EC, Reddy SG, Holmboe ES. The role of physician engagement on the impact of the hospital-based practice improvement module (PIM). J Hosp Med. 2009;4:466–470.

15. Gosfield AG, Reinertsen JL. Finding common cause in quality: Confronting the physician engagement challenge. Physician Exec. March–April 2008;34:26–28, 30–31.

16. Audet A-M, Doty MM, Shamasdin J, Schoenbaum SC. Measure, learn, and improve: Physicians' involvement in quality improvement. Health Aff (Millwood). 2005;24:843–853.

17. Mintz M, Narvarte HJ, O'Brien KE, Papp KK, Thomas M, Durning SJ. Use of electronic medical records by physicians and students in academic internal medicine settings. Acad Med. 2009;84:1698–1704. http://journals.lww.com/academicmedicine/Abstract/2009/12000/Use_of_Electronic_Medical_Records_by_Physicians.17.aspx. Accessed November 11, 2010.

18. Blumenthal D, Glaser JP. Information technology comes to medicine. N Engl J Med. 2007;356:2527–2534.

19. Miller RH, Sim I. Physicians' use of electronic medical records: Barriers and solutions. Health Aff (Millwood). 2004;23:116–126.

20. Bohmer RMJ, Bloom JD, Mort EA, Demehin AM, Meyer GS. Restructuring within an academic health center to support quality and safety: The development of the center for quality and safety at the Massachusetts General Hospital. Acad Med. 2009;84:1663–1671. http://journals.lww.com/academicmedicine/Fulltext/2009/12000/Restructuring_Within_an_Academic_Health_Center_to.12.aspx. Accessed November 11, 2010.

21. Rosenthal MB, Frank RG, Li Z, Epstein AM. Early experience with pay-for-performance: From concept to practice. JAMA. 2005;294:1788–1793.

22. Campbell SM, Reeves D, Kontopantelis E, Sibbald B, Roland M. Effects of pay for performance on the quality of primary care in England. N Engl J Med. 2009;361:368–378.

23. Epstein AM, Lee TH, Hamel MB. Paying physicians for high-quality care. N Engl J Med. 2004;350:406–410.

24. Petersen LA, Woodard LD, Urech T, Daw C, Sookanan S. Does pay-for-performance improve the quality of health care? Ann Intern Med. 2006;145:265–272.

25. Glickman SW, Ou F-S, DeLong ER, et al. Pay for performance, quality of care, and outcomes in acute myocardial infarction. JAMA. 2007;297:2373–2380.

26. Lindenauer PK, Remus D, Roman S, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. 2007;356:486–496.

27. Rosenthal MB. What works in market-oriented health policy? N Engl J Med. 2009;360:2157–2160.

28. Accreditation Council for Graduate Medical Education. Outcome Project: Common Program Requirements. http://www.acgme.org/outcome/comp/compCPRL.asp. Accessed November 22, 2010.

29. Daniel DM, Casey DE, Levine JL, et al. Taking a unified approach to teaching and implementing quality improvements across multiple residency programs: The Atlantic Health experience. Acad Med. 2009;84:1788–1795. http://journals.lww.com/academicmedicine/Abstract/2009/12000/Taking_a_Unified_Approach_to_Teaching_and.31.aspx. Accessed November 11, 2010.

30. Splaine ME, Ogrinc G, Gilman SC, et al. The Department of Veterans Affairs National Quality Scholars Fellowship Program: Experience from 10 years of training quality scholars. Acad Med. 2009;84:1741–1748. http://journals.lww.com/academicmedicine/Abstract/2009/12000/The_Department_of_Veterans_Affairs_National.25.aspx. Accessed November 11, 2010.

31. Patow C. Making residents visible in quality improvement. Acad Med. 2009;84:1642. http://journals.lww.com/academicmedicine/Fulltext/2009/12000/Making_Residents_Visible_in_Quality_Improvement.2.aspx. Accessed November 11, 2010.

32. Patow CA, Karpovich K, Riesenberg LA, et al. Residents' engagement in quality improvement: A systematic review of the literature. Acad Med. 2009;84:1757–1764. http://journals.lww.com/academicmedicine/Abstract/2009/12000/Residents__Engagement_in_Quality_Improvement__A.28.aspx. Accessed November 11, 2010.

References Cited in Tables Only
33. U.S. Department of Health and Human Services. National Asthma Education and Prevention Program Expert Panel Report 3: Guidelines for the Diagnosis and Management of Asthma. http://www.nhlbi.nih.gov/guidelines/asthma/asthsumm.pdf. Accessed March 15, 2010.

34. Krumholz HM, Anderson JL, Bachelder BL, et al. ACC/AHA 2008 performance measures for adults with ST-elevation and non-ST-elevation myocardial infarction: A report of the American College of Cardiology/American Heart Association Task Force on Performance Measures. Circulation. 2008;118:2596–2648.

35. American Diabetes Association. Standards of medical care in diabetes—2010. Diabetes Care. 2010;33(suppl 1):S11–S61. http://care.diabetesjournals.org/content/33/Supplement_1/S11.full. Accessed March 15, 2010.

36. Ghany MG, Strader DB, Thomas DL, Seeff LB; American Association for the Study of Liver Diseases. Diagnosis, management, and treatment of hepatitis C: An update. Hepatology. 2009;49:1335–1374.

37. Massachusetts General Laws, Part 1, Title 16, Chapter 111, Section 70g. Genetic information and reports protected as private information; prior written consent for genetic testing. http://www.malegislature.gov/Laws/GeneralLaws/PartI/TitleXVI/Chapter111/Section70g. Accessed November 11, 2010.

38. Geerts WH, Bergqvist D, Pineo GF, et al. Prevention of venous thromboembolism. Chest. 2008;133:381S–453S.

39. Advisory Committee on Immunization Practices. Recommended adult immunization schedule: United States, 2010. Ann Intern Med. 2010;152:36–39.

40. Smith RA, Saslow D, Sawyer KA, et al. American Cancer Society guidelines for breast cancer screening: Update 2003. CA Cancer J Clin. 2003;53:141–169.

41. American College of Obstetricians and Gynecologists (ACOG). Breast cancer screening. ACOG practice bulletin no. 42. http://www.guideline.gov/content.aspx?id=3990. Accessed November 22, 2010.

42. US Preventive Services Task Force. Screening for breast cancer: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2009;151:716–726.

43. National Kidney Foundation. K/DOQI clinical practice guidelines on hypertension and antihypertensive agents in chronic kidney disease. http://www.kidney.org/professionals/KDOQI/guidelines_bp/guide_1.htm. Accessed March 15, 2010.

44. Saag K, Teng G, Patkar N, et al. American College of Rheumatology 2008 recommendations for the use of nonbiologic and biologic disease-modifying antirheumatic drugs in rheumatoid arthritis. Arthritis Rheum. 2008;59:762–768.


© 2011 Association of American Medical Colleges
