Baby boomers are nearing the age when they can expect to face the health consequences of aging, including multiple chronic illnesses and a decline in general physical functioning. This presents a societal challenge of providing for an increased number of older adults with numerous illnesses. A recent report from the Institute of Medicine highlights the inadequacy of current health care systems in the United States to meet the health care needs of today's elderly—let alone the increased numbers who will soon need care.1 Current trends in career choices of medical students and internal medicine (IM) residents with respect to addressing the needs of the elderly are grim: Only 2% of students in a recent study report that they plan to pursue general IM,2 and only 2% of IM residents are planning to pursue fellowship training in geriatrics.3 While only a few physicians become geriatricians, most will spend considerable portions of their professional time caring for older adults. All physicians (as well as other health care professionals) need to know how to meet the specific health care needs of older adults, with the goals of reducing morbidity and associated health care costs and enhancing the quality of the latter years of life.
Previous work has shown that considerable variability and significant gaps exist in the quality of care for patients seen in residency program clinics.4–6 Little is known about the quality of care for older adults provided by residents in IM and family medicine (FM) training programs even though IM and FM constitute the two disciplines that not only are prerequisites to training in geriatrics but also produce the majority of physicians who, as primary care doctors or subspecialists, care for older adults.
The purpose of this study is to report (1) the baseline level of care provided to older patients in IM and FM residency clinics, (2) the characteristics of the practice systems that produce these results, and (3) the relationship between specific practice system elements and the quality of care for older adults. In addition, we compare performance of residents with that of practicing physicians (board-certified geriatricians and general internists, a group that is likely to represent the high end of the spectrum of performance in care of older adults) on the Care of the Vulnerable Elderly (CoVE) practice improvement module (PIM) (described below) for selected processes of care provided to older patients. We compare underlying systems that support the delivery of these processes in the residency clinic and practicing physician groups. We use the practicing physicians' performance as a standard against which to benchmark the quality of care provided in the residency clinics. (An additional purpose of the study was to examine the effects of completing the CoVE PIM in IM and FM residency programs; these results will be published separately.)
The American Board of Internal Medicine (ABIM) developed PIMs as a tool for physicians who are completing their maintenance of certification to self-assess the quality of care they provide for specific conditions or patient populations.7 PIMs have also been successfully used in residency settings to provide experiential learning in a particular subject area (such as diabetes or hypertension), as well as to teach practice-based learning and improvement and systems-based practice.8,9 In 2005, ABIM released the CoVE PIM to assess care provided to older adults. Like other PIMs, the CoVE PIM is a Web-based self-evaluation tool based on nationally recognized guidelines10 that uses chart abstraction, patient surveys, and a practice system survey known as Examine Systems (all described briefly below) to generate a performance report focused on a key aspect of care—in this case, care provided for patients age 65 and older. The performance report is designed to help physicians first understand their current actual performance as it compares with guidelines for ideal care, and then to identify areas for improvement.
Settings and participants
Residency clinic sample.
In March 2006, the ABIM invited IM and FM residency programs to apply for inclusion in a study to describe the current state of care for older patients seen in IM and FM training programs. We sent one-time invitations by e-mail to 824 IM and FM program directors and 100 geriatric fellowship directors, and we also posted a request for applications to the listservs of the Association of Program Directors in Internal Medicine, the American Academy of Family Physicians, and the American Board of Family Medicine. ABIM offered programs an incentive of $250 per participating resident for successful completion of the study in order to offset the anticipated costs of participation, plus $1,000 per program per audit period to audit 50 to 75 medical records at baseline and follow-up. We assessed 110 training programs for eligibility (affiliation with an Accreditation Council for Graduate Medical Education-accredited training program, sufficient numbers of older patients, and support from the division chief or department chair), and we selected 46 (34 IM and 12 FM programs) to provide a sample that was geographically diverse and representative of both academic and community-based programs. We used baseline data collected via the CoVE PIM to provide an assessment of the current state of care for the elderly. A local medical record abstractor entered chart audit data for 50 to 75 charts per residency program (a minimum of 25 charts per clinic site). Participating programs selected the medical record abstractors, and ABIM provided an abstractor's manual as well as technical and study-related support as needed. Designated faculty leaders at each clinic site completed the Examine Systems survey. All programs received IRB approval before data collection began.
Practicing physician sample.
The comparison sample of practicing physicians consisted of 144 physicians nationally who completed the CoVE PIM by July 2008 as an elective component of their maintenance of certification program (selecting this PIM indicates that these physicians have a special interest in the care of older adults). Practicing physicians generally complete the chart audit section of PIMs on their own. They are encouraged (but not required) to include staff and other physicians in the practice when completing the Examine Systems section. ABIM must receive a minimum of 25 charts and 25 patient surveys to provide a performance report to physicians. The physicians use this report to generate an improvement plan, and the PIM is complete when the ABIM receives a report of the results of making a change in the practice. Patient survey results will be reported in subsequent work.
The chart audit for the CoVE PIM reviews basic medical history information that a chart should capture, including patient demographics, occurrence of chronic conditions important in older adults, and health-related habits (e.g., smoking, alcohol use, physical activity). The chart audit also assesses whether various screens (i.e., for depression, cognitive impairment, fall risk, hearing loss, and urinary incontinence) and examinations (i.e., vision loss, postural hypotension, and problems with gait or balance) were completed. For patients with increased fall risk or urinary incontinence, the audit includes an assessment of what additional evaluations were completed. The chart audit also assesses whether health care providers had documented functional status, ability to provide self-care, and end-of-life preferences. Some of the chart audit measures are largely specific to the care of the elderly, such as evaluating gait and balance and screening for fall risk and cognitive impairment. Some measures assume greater importance in a population of older patients because they are linked with conditions (e.g., vision loss and hearing impairment) far more common in the elderly, or because of the increased likelihood of death and incapacitation (thus the importance of identifying a medical surrogate). We identified these types of process measures as "geriatric-specific." Other process measures included in the chart audit, while important in the care of the elderly, also frequently apply to other populations, such as assessing functional status, documenting use of over-the-counter medications, and providing influenza vaccination; we considered these process measures to be "nongeriatric-specific." We calculated the performance rate for each measure as the percentage of patients who received the process of care.
Thus, the chart audit serves both to describe a sample of older adults in the practice (the demographics section) and to evaluate how often key processes of care are provided (audit of screens and evaluations). An example of a complete chart audit is available at http://www.abim.org/online/pim/demo.aspx.
Examine Systems survey.
The Examine Systems survey is a key component of all PIMs. The ABIM developed the version used in this study, which draws heavily on the theoretical underpinnings of the Wagner Chronic Care Model11 and the Clinical Microsystem.12 The Examine Systems survey is a series of 127 questions designed to help physicians understand the existing systems in their practices for purposes of information management, patient education and activation, access by and communication with patients, safety and efficiency, consultation and referral, team roles and responsibilities, and quality improvement. The Examine Systems survey also helps physicians understand how well their systems are working and what could be improved. Some elements in a practice system directly relate to recommended processes of care, such as whether the medical record system contains a template or reminder to screen for falls or fall risk and whether charts show that a screen for falls or fall risk was completed. Physicians completing the Examine Systems survey report whether a practice element (1) is working well in the microsystem, (2) is available but could use improvement, or (3) is not available or not operational. For example, in response to a question asking whether the medical record system includes a template or reminder to screen for falls or fall risk, physicians can answer that the medical record has a template that is working well, that it has a template but the template could use improvement, or that no template is available. Appendix 1 provides sample questions from the Examine Systems survey.
We used descriptive statistics to summarize characteristics of the physician groups (residents and practicing physicians) and to summarize performance on chart audit measures. We used contrast tests to determine whether there were statistically significant differences between the residency clinic and practicing physician groups with regard to selected patient characteristics.
We averaged performance rates for each process measure across all residency clinic sites and across all physicians' practices to determine their respective mean performance rates. We used multivariate analysis of covariance (MANCOVA) to determine whether there were differences between the groups (residency clinics and practicing physicians) for the set of nongeriatric and geriatric-specific process measures, controlling for patient age and the number of chronic conditions (both of which could affect the likelihood that process measures are performed).13 We then conducted individual univariate tests to determine which specific process measures were performed at significantly different rates between the two groups. We used structure coefficients obtained from descriptive discriminant analysis to assess the impact of each process measure on overall group separation. Structure coefficients are the correlations between the variables in the model and the linear discriminant functions that are computed.14 These are similar to “loadings” from factor analysis, and in this study they indicate how much each process measure contributes to the separation of the two groups.
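The structure coefficients described above can be illustrated with a small sketch: in a two-group descriptive discriminant analysis, they are the correlations between each process measure and the discriminant scores. The example below uses synthetic performance rates (not study data) and a Fisher linear discriminant; the group sizes, means, and variable names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic performance rates on three process measures (columns) for two
# hypothetical groups: 52 residency clinic sites and 144 physician practices.
clinics = rng.normal([0.40, 0.50, 0.30], 0.10, size=(52, 3))
practices = rng.normal([0.60, 0.55, 0.60], 0.10, size=(144, 3))

X = np.vstack([clinics, practices])
# Fisher discriminant weights: pooled within-group covariance Sw, then
# w = Sw^{-1} (difference between the group mean vectors).
Sw = ((len(clinics) - 1) * np.cov(clinics.T)
      + (len(practices) - 1) * np.cov(practices.T)) / (len(X) - 2)
w = np.linalg.solve(Sw, clinics.mean(axis=0) - practices.mean(axis=0))
scores = X @ w  # discriminant score for every site/practice

# Structure coefficients: correlation of each measure with the discriminant
# scores; like factor loadings, they show how much each measure contributes
# to the separation of the two groups.
structure = np.array([np.corrcoef(X[:, j], scores)[0, 1] for j in range(3)])
```

In this simulation the third measure has the largest mean difference between the groups, so its structure coefficient has the largest absolute value.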
We conducted chi-square significance tests to determine whether there were significant differences between the resident clinic and practitioner groups in the proportion of system elements that were reported to be working well in practice, and we computed r² effect sizes to assess the size of the difference between the two groups.
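For a single 2 × 2 comparison, the chi-square statistic and the r² (phi-squared) effect size can be sketched as follows; the counts are hypothetical, chosen only to illustrate the computation.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for the 2x2 table
    [[a, b], [c, d]], its p value (df = 1), and the phi-squared (r^2)
    effect size, chi2 / n."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-square survival function at df = 1
    return chi2, p, chi2 / n

# Hypothetical counts of sites reporting a system element "working well"
# (yes, no): 12/40 in residency clinics vs. 60/84 in physician practices.
chi2, p, r_squared = chi2_2x2(12, 40, 60, 84)
```

With these illustrative counts the difference is significant at the conventional .05 level but not at the stricter .01 alpha used in the study.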
We computed Pearson correlations to determine the relationship between the degree to which both nongeriatric-specific and geriatric-specific system elements were reported to be working well in practice and the actual performance rate for the corresponding process measure. Because practicing physicians in solo practice, compared with those in larger microsystems, may be less likely to have system elements or to perform specific processes, we computed partial correlations controlling for (i.e., partialing out) differences in practice setting and number of physicians in the practice.
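A partial correlation of this kind can be computed by correlating the residuals of the two variables after each is regressed on the covariates. The sketch below uses simulated data; the variable names, coefficients, and sample size are assumptions for illustration, not values from the study.

```python
import numpy as np

def partial_corr(x, y, covariates):
    """Pearson correlation between x and y after regressing both on the
    covariates (ordinary least squares with an intercept)."""
    Z = np.column_stack([np.ones(len(x)), covariates])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residualized x
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residualized y
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
n = 144  # one record per hypothetical practice
practice_size = rng.normal(size=n)   # stand-in for number of physicians
setting = rng.normal(size=n)         # stand-in for practice setting (coded)
# Simulated confounding: larger practices have better system elements,
# and both the element and practice size drive the performance rate.
element_score = 0.5 * practice_size + rng.normal(size=n)
performance = 0.6 * element_score + 0.3 * practice_size + rng.normal(size=n)

r_simple = np.corrcoef(element_score, performance)[0, 1]
r_partial = partial_corr(element_score, performance,
                         np.column_stack([practice_size, setting]))
```

In this simulation the confounding by practice size tends to inflate the simple correlation; partialing out practice size and setting isolates the element-performance relationship.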
Because in some cases multiple significance tests were performed, we assessed statistical significance using an alpha of .01.
We performed all data analyses using SPSS version 12.0 (SPSS, Inc., Chicago, Illinois).
We selected 46 residency programs (34 IM and 12 FM) to participate in the study; 9 were from the West and Hawaii, 8 from the Midwest, 7 from the South, and 22 from the East. Four programs dropped out after stratification but before they collected any baseline data. The remaining 42 included 31 IM and 11 FM programs. The programs comprised 23 community programs, 15 university programs, 2 university-operated but community-based programs, one military program, and one multispecialty program. Seventeen of the 42 programs offered a geriatric fellowship. Trainees comprised 897 postgraduate year 1 (PGY-1) residents, 746 PGY-2 residents, 723 PGY-3 residents, and 23 fellows. Twenty-three programs included more than 25 trainees. There were 54 clinic sites among the 42 programs; 2 training programs each had 1 clinic site (among others) exclusively for geriatrics fellows, and we dropped these sites from the analysis.
Of the 144 practicing physicians included in the study, 78 (54%) were geriatricians, 59 (41%) were general internists, and 7 (5%) were certified in another medical subspecialty. Forty-three (30%) were in solo practice, 27 (19%) practiced in groups of two to five physicians, 22 (15%) in groups of six to ten, and 52 (36%) in larger groups. Sixty-nine (48%) were in private practice, 29 (20%) in an academic faculty practice, another 29 (20%) in a hospital-owned practice, 14 (10%) in military/government practices, and the remainder in other types of practice. Thirty-five (24%) also practiced in long-term care facilities, but we instructed them to exclude patients who were wheelchair-bound or had severe cognitive impairment.
In this report, we include data from 2,216 patient records from 52 clinic sites at the 42 residency programs and from 3,693 patient records submitted by the 144 practicing physicians. The patients seen in the residency clinics were somewhat younger than those cared for by practicing physicians and, overall, appear to have had fewer chronic medical conditions (see Table 1). Patients seen in the residency clinics were more likely, however, to be diagnosed with diabetes and hypertension. There was no difference in the number of medications prescribed; patients seen at the residency clinics and those cared for by practicing physicians both took a mean of eight medications per patient.
Performance on key processes of care
MANCOVA indicated a statistically significant difference between the residency clinics and practicing physicians for the set of nongeriatric-specific process measures (Wilks Λ = 0.52, F(7, 186) = 24.62, P < .001, partial η² = 0.48) and for the set of geriatric-specific processes (Wilks Λ = 0.66, F(9, 183) = 10.58, P < .001, partial η² = 0.34). Univariate tests indicated that, after controlling for patient age and number of chronic conditions, patients cared for in residency clinics were less likely to receive important processes of care than patients cared for by practicing physicians, as shown in Table 2. For nongeriatric-specific process measures, the largest differences were in documenting mobility/functional status and documenting current level of exercise. For geriatric-specific process measures, the largest differences were in performing gait and balance evaluation and in screening for falls or fall risk, cognitive impairment, and depression. We note that the practicing physicians also failed to perform many important processes of care at a high rate; for example, only 52% of their patients had been screened for cognitive impairment. Our analysis does not include all of the measures in the PIM. For two processes, measuring weight and blood pressure, performance in residency clinics equaled that of practicing physicians (results not shown); these measures did not significantly contribute to group separation based on the initial MANCOVA analysis. For the processes of care shown in Table 2 and for some others not shown, patients were more likely to receive recommended services from practicing physicians than from residency clinics.
Practice systems elements
Certain practice system elements directly relate to specific processes of care measured by the chart audit (e.g., templates or reminders in the medical record system to document exercise, and documentation of current exercise in the chart). Table 3 shows how often residency clinics and practicing physicians reported these practice system elements to be functioning well in the practice setting. In all cases, practice system elements were reported to be "working well" in a higher proportion of practicing physicians' offices. Chi-square significance tests indicated that, for nongeriatric-specific elements, the largest differences between the residency clinics and practicing physicians were in reminders to prompt documentation of symptoms/functional status and use of over-the-counter medications. For geriatric-specific elements, the largest differences were in reminders to screen for fall risk, urinary incontinence, memory problems, and depression. We note that the degree to which the geriatric-specific elements are working well in practicing physicians' offices is still low: only 32% of practicing physicians reported that their offices have a medical record system that includes effective reminders to screen patients for falls or fall risk.
Relationship between practice system elements and process of care
Table 4 presents the correlation between the degree to which system elements (e.g., reminders for appropriate vaccinations and receipt of influenza vaccine) are reported to be working well in practice and the actual performance rate for the corresponding process measure. Although some of the practice system elements from the residency clinic sample were modestly correlated with the process measure of interest, none of the correlations were statistically significant. For the practicing physician sample, correlations between system elements and process measures were stronger for both geriatric-specific and nongeriatric-specific measures, and the correlations reached statistical significance.
Congress and the Department of Health and Human Services are considering how to change Medicare payments to adjust the physician workforce to meet the needs of older citizens.15 Beyond efforts to increase the number of primary care physicians and geriatricians, all physicians need to better understand the specific needs of older adults. Others have reported problems with the quality of care for this population16; indeed, as described by Asch and colleagues,17 patients receive a smaller proportion of recommended services as they age. Our results show that in a sample of IM and FM training programs, and in a sample of practicing physicians (who we can assume are particularly motivated to address the needs of older adults), there are significant gaps in the quality of care as measured by reported performance in delivering processes of care important for older adults, including screening for fall risk, cognitive impairment, and depression, evaluating vision and hearing, and identifying end-of-life preferences.
Others have shown an association between practice systems and quality of care.18–20 Recently, Solberg and colleagues21 explored the relationship between performance on the Physician Practice Connections-Readiness Survey (PPC-RS) and the quality of diabetes care among 40 medical groups in Minnesota; they found that the overall score on the PPC-RS for practice systems showed moderate correlation (Pearson correlation of 0.35 to 0.49) with both an overall composite measure for diabetes care outcomes and with some specific process measures (measurement of hemoglobin A1C, LDL cholesterol, and blood pressure). In their study, decision support (guidelines, reminders, and clinical alerts) and the use of performance measurement, data feedback, and formal quality improvement activities seemed to be the most important elements within the practice system. In our study, we looked at the presence of practice system elements designed to support both geriatric-specific and nongeriatric-specific needs of older adults in two settings: residency clinic practices and the practices of experienced physicians engaged in the maintenance-of-certification process. In both settings, the practice system is more likely to include elements that support nongeriatric-specific processes (e.g., use of problem lists, reminders to consider vaccinations) than geriatric-specific processes (e.g., reminders to screen for falls or urinary incontinence). The residency clinics were significantly less likely than the practicing physicians to have well-functioning system elements present for nongeriatric-specific processes relating to documenting symptoms/functional status and use of over-the-counter medications and for all geriatric-specific processes.
Even when the practice system includes elements designed to support the delivery of key processes of care, there is low correlation between the practice system element and performance on the corresponding process measure within the residency clinics (this may be partly due to low power, as this sample comprised only 52 residency clinic sites). This study does not allow us to draw conclusions as to why, but we propose several possibilities that can be studied in future work. First, although a practice system element may be in place and may even be reported to be functioning well, function depends on a human user. Residency clinic practices are particularly complex clinical microsystems in which inexperienced physicians (residents) practice for only a few hours a week, for one to three years. Inexperienced residents may not understand how the practice system can support their work. Faculty supervisors may also only be in the clinic setting for a few hours a week and may be ill equipped to coach the residents in better use of the practice system. Second, a human user (the physician) can easily ignore or override a practice system element such as a reminder to screen a patient for falls and fall risk. This may occur because the physician is overwhelmed by other needs of that patient—or by patients elsewhere in the hospital. Physicians may also disregard a reminder because of a knowledge deficit, such as not knowing the evidence regarding screening for falls or that steps can be taken to reduce the risk of falls.22 Third, few residents plan careers in geriatrics or primary care, and acquiring competence in the care of older adults may not seem relevant to their future plans. Fourth, supervising faculty may lack expertise in geriatrics and therefore may, through intention or oversight, focus on other aspects of patient care. 
Fifth, the delegation of tasks to other health care providers may not occur optimally in residency clinics; for many of the processes of care addressed in this study, the physician need not be the one to deliver the care. It is possible that nonphysician staff in a geriatrician's office are more knowledgeable about the health care needs of the elderly and are, therefore, more involved in the delivery of these processes of care.
Our study has several important limitations. First, research assistants abstracted chart audit data from the residency clinics, and there may be some underreporting because of poor documentation. The practicing physicians themselves generally abstracted their chart audit data; thus, the data are subject to self-report bias. However, a small study of ABIM's diabetes PIM indicates excellent agreement between chart audits performed by physicians and by specially trained abstractors.23 We have no reason to believe that the CoVE PIM would be different. Second, the residency clinic sample includes only 52 clinic sites, and we may not have had adequate power to detect some relevant correlations with the clinic as the unit of analysis (as mentioned above). Third, data from the residency clinic sample are aggregated at the level of the clinic site. Variations in performance between residents cannot be detected. Fourth, some residency clinics are performing at higher levels than others, and we have not studied in detail what these differences are, or why they exist. Important factors may include faculty with special interest or expertise in geriatrics and the presence of programmatic efforts to strengthen geriatrics training. We are undertaking further work to better understand the factors that helped high-performing residency clinics to achieve their results. Fifth, the patients in the residency clinic sample were, on average, younger than those seen by practicing physicians, and they had fewer of the chronic conditions assessed in the chart audit (this difference may be due in part to poor documentation or to underdiagnosis of conditions, especially those seen primarily in the elderly such as arthritis and Parkinson disease).
They were, however, more likely to have diabetes and hypertension, and it may be that the residents' efforts were focused on health care needs associated with these problems, or that their geriatric-specific needs were perceived as less important than they would have been in an older patient group. Finally, our practicing physician sample is likely not representative of the average primary care provider with regard to interest in care of the elderly, and we do not know whether the differences we see in correlations between the presence of practice system elements and corresponding process measures hold for other practicing physicians.
The U.S. health care system is currently not meeting the needs of older adults, and it will be ill prepared to meet the needs of their rapidly increasing numbers in the coming decades unless steps are taken on several levels. Although increasing the numbers of physicians entering geriatrics and primary care will help, physicians with other primary professional interests will always care for older adults. It is thus essential that the training of all physicians include an emphasis on the special needs of the elderly.
We report here that even in the practices of motivated and experienced physicians—those who have elected to complete the CoVE PIM for their maintenance-of-certification program—significant gaps in the quality of care exist. These gaps are much more pronounced when care is delivered in the residency clinic setting. It will be difficult for residents to be prepared to care for the elderly when they are practicing in settings that neither support nor deliver good geriatric care.
Physician knowledge, although essential, is not sufficient, even when coupled with the highest professional aspirations to do right by one's patients. The clinical microsystems in which physicians work must also provide the support that makes safe and timely care for patients, including older adults, possible.24 Indeed, there is currently great interest and even a sense of urgency in redesigning practice systems to lower the costs and improve the quality of health care,25,26 but researchers have much to learn about the interactions between human users of practice systems and the elements of those systems, and a one-size-fits-all model is unlikely to work for either patients or health care providers. We have shown here that practice system elements to support both geriatric-specific and nongeriatric-specific processes of care for older adults perform differently when used in residency clinic settings or in practicing physicians' offices. The next step is to begin to understand why.
The study of the Care of the Vulnerable Elderly practice improvement module was made possible by generous support from the Josiah Macy, Jr. Foundation and from the American Board of Internal Medicine Foundation. The authors wish to thank Halyna Didura for her skilled assistance with data analysis, and Siddharta Reddy for his keen eye in preparing the tables.
1 Committee on the Future Health Care Workforce for Older Americans, Institute of Medicine. Retooling for an Aging America. Washington, DC: The National Academies Press; 2008.
2 Hauer KE, Durning SJ, Kernan WN, et al. Factors associated with medical students' career choices regarding internal medicine. JAMA. 2008;300:1154–1164.
3 Garibaldi RA, Popkave C, Bylsma W. Career plans for trainees in internal medicine residency programs. Acad Med. 2005;80:507–512.
4 Mladenovic J, Shea JA, Duffy FD, Lynn LA, Holmboe ES, Lipner RS. Variation in internal medicine residency clinic practices: Assessing practice environments and quality of care. J Gen Intern Med. 2008;23:914–920.
5 Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach. 2006;28:117–128.
6 Holmboe ES, Prince L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80:571–577.
7 Holmboe ES, Lynn L, Duffy FD. Improving the quality of care via maintenance of certification and the Web: An early status report. Perspect Biol Med. 2008;51:71–83.
8 Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM's practice improvement modules. J Gen Intern Med. 2008;23:927–930.
9 Bernabeo EC, Conforti LN, Holmboe ES. The impact of a preventive cardiology quality improvement intervention on residents and clinics: A qualitative exploration. Am J Med Qual. 2009;24:99–107.
10 Shekelle PG, MacLean CH, Morton SC, Wenger NS. ACOVE quality indicators. Ann Intern Med. 2001;135:653–667.
11 Wagner EH. Chronic disease management: What will it take to improve care for chronic illness? Eff Clin Pract. 1998;1:2–4.
12 Nelson EC, Batalden PB, Huber TP, et al. Microsystems in health care: Part 1. Learning from high-performing front-line clinical units. Jt Comm J Qual Improv. 2002;28:472–493.
13 O'Leary J, Keeler EB, Damberg C, Kerr EA. An overview of risk adjustment. In: McGlynn EA, Damberg CL, Kerr EA, Brook RH. Health Information Systems: Design Issues and Analytic Applications. Santa Monica, Calif: RAND; 1998. Available at: http://www.rand.org/pubs/monograph_reports/2007/MR967.pdf. Accessed August 9, 2009.
14 Huberty CJ, Olejnik S. Applied MANOVA and Discriminant Analysis. 2nd ed. Hoboken, NJ: Wiley-Interscience; 2006.
15 Iglehart JK. Medicare, graduate medical education, and new policy directions. N Engl J Med. 2008;359:643–650.
16 Wenger NS, Solomon DH, Roth CP, et al. The quality of medical care provided to vulnerable community-dwelling older patients. Ann Intern Med. 2003;139:740–747.
17 Asch SM, Kerr EA, Keesey J, et al. Who is at greatest risk for receiving poor-quality health care? N Engl J Med. 2006;354:1147–1156.
18 Feifer C, Ornstein SM, Nietert PJ, Jenkins RG. System supports for chronic illness care and their relationship to clinical outcomes. Top Health Inf Manage. 2001;22:65–72.
19 Fleming B, Silver A, Ocepek-Welikson K, Keller D. The relationship between organizational systems and clinical quality in diabetes care. Am J Manag Care. 2004;10:934–944.
20 Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness. JAMA. 2002;288:1775–1779.
21 Solberg LI, Asche SE, Pawlson LG, Scholle SH, Shih SC. Practice systems are associated with high-quality care for diabetes. Am J Manag Care. 2008;14:85–92.
22 Tinetti ME, Baker DI, King M, et al. Effect of dissemination of evidence in reducing injuries from falls. N Engl J Med. 2008;359:252–261.
23 Holmboe ES, Meehan TP, Lynn L, Doyle P, Sherwin T, Duffy FD. Promoting physicians' self-assessment and quality improvement: The ABIM diabetes practice improvement module. J Contin Educ Health Prof. 2006;26:109–119.
24 Committee on Quality of Healthcare in America, Institute of Medicine. Crossing the Quality Chasm. Washington, DC: The National Academies Press; 2001.
25 Barr MS. The need to test the patient-centered medical home. JAMA. 2008;300:834–835.
26 Iglehart JK. No place like home—Testing a new model of care delivery. N Engl J Med. 2008;359:1200–1202.