The Patient-centered Medical Home (PCMH) model of care has been widely adopted as a means to improve primary care quality and reduce costs. More than 90 commercial plans, almost all state Medicaid programs, and a variety of other practices and organizations have adopted the model,1 placing a renewed focus on primary care and its role in the health system at large. Early evidence shows that PCMH care may deliver on its promise of better care delivery, improved patient and staff experience, and lower utilization of emergency services2; however, questions remain as to the role of PCMH recognition in defining practice changes and how best to support practices in implementing the vast and wide-ranging set of changes that comprise the medical home.3 Policymakers, academics, and primary care practitioners are acutely aware of the need to better understand and disseminate the PCMH model if we are to achieve its full cost-saving and quality improvement potential.4
Despite this awareness and recent policy interest in building quality improvement infrastructure through primary care extension centers, empirical studies evaluating or even documenting technical assistance modalities for large-scale, multimodal primary care transformation efforts are relatively few. Those studies that do examine technical assistance approaches generally show moderate but significant improvements.5–8
In this paper, we will describe the multimodal technical assistance approach used in the Safety-Net Medical Home Initiative (SNMHI or Initiative), a 5-year effort sponsored by The Commonwealth Fund to (1) develop and test a replicable model for supporting PCMH transformation; (2) help 65 safety-net sites in 5 states redesign their clinical and administrative systems to improve patient experience and health; (3) enhance regional capacity to support and sustain practice improvements; and (4) influence health policy by involving Medicaid and other stakeholders in action toward appropriate reimbursement levels to sustain practice efforts.
In addition to describing the technical assistance approach, we will present participating practices’ perceptions of each component’s helpfulness as context for interpreting PCMH implementation data presented elsewhere in this supplement and to contribute to the growing discourse on supporting primary care practice transformation. Although the full range of summative outcome data are not yet available, we do have evidence that practices were actively engaged and substantial transformation activities took place during the SNMHI.9,10
From its inception, the 5-part technical assistance framework was developed as a replicable model.11 Technical assistance was deployed to the 65 participating safety-net practices as a means to help them implement the 8 Change Concepts for Practice Transformation12—Engaged Leadership; Quality Improvement Strategy; Empanelment; Continuous and Team-based Healing Relationships; Organized, Evidence-based Care; Patient-centered Interactions; Enhanced Access; and Care Coordination—that together defined our medical home intervention. Components of the technical assistance framework included: (1) individual site-level coaching provided by local medical home facilitators (MHFs) and supplemented by others providing expert consultation on specific change concepts; (2) regional and national learning communities of participating practices that included in-person meetings and field trips; (3) data monitoring and feedback, including longitudinal feedback on Change Concept implementation as measured by the PCMH-A; (4) written implementation guides, tools, and webinars relating to each of the 8 Change Concepts; and (5) small grant funds to support infrastructure and staff development.
The SNMHI served half a million low-income, uninsured, and underinsured patients in 5 states: Colorado, Idaho, Massachusetts, Oregon, and Pennsylvania (Pittsburgh area). Applicants to the SNMHI were regional organizations skilled in quality improvement and stakeholder engagement, in many cases Primary Care Associations (PCAs), which are clinic membership organizations. Each applicant partnered with 10–15 primary care practices in its region, with whom it often had long-standing relationships. Applicant organizations were selected as Regional Coordinating Centers for the SNMHI based on their history with large-scale quality improvement efforts, readiness data from the partner practices, the experience and credentials of the proposed MHF(s), and geographic diversity. More details on the selection process, the participating Regional Coordinating Centers and practices, and the medical home changes made are presented elsewhere.10
In addition to describing the 5-component technical assistance framework in depth, this paper will share data from a practice survey about the perceived value of technical assistance, overall and by component. The “Practice Site Survey of Technical Assistance Value” was developed collaboratively by the national program team (Qualis Health and the MacColl Center for Health Care Innovation at Group Health Research Institute) and The Commonwealth Fund project officer between January and March of 2013. The survey consisted of 58 questions and was administered via the online survey tool “SurveyMonkey.” The primary SNMHI contact at each site received an email invitation to participate, a $100 honorarium, and a certificate of recognition for participating in the SNMHI. Up to 3 phone call reminders were made to nonrespondents. After 10 weeks, 81% of participating sites had initiated the survey and 78% had completed it. The percentage of sites completing the survey varied across the 5 regions, from 100% of the 13 Colorado sites to 61.5% of the 13 Massachusetts sites. Most surveys (63%) were completed by an individual; the remainder were completed by multidisciplinary teams that collaborated to submit a single response. Front-line staff were represented in 90% of the survey responses. In 4 instances, sites submitted multiple responses; when that occurred, the most complete or most representative survey was used.
Additional data presented in this paper were gathered from a variety of sources. Training, field trip, and mini-grant data were collected from financial records of payments to the sites and cross-checked against postevent summaries. Funds given to sites ranged from $200 to $9763, with a median of $1336. SNMHI sites participated in a total of 167 trips and projects.
Data on MHF training, experience, credentials, and full-time equivalence (FTE) were abstracted from Regional Coordinating Centers’ staffing plans provided in the initial and continuing grant applications (submitted once per year) and from staffing change notices provided in Regional Coordinating Center interim and final reports. FTE allocations were cross-checked against budget documents included in the annual continuing grant applications.
This work was deemed not research by the Group Health Institutional Review Board.
Individual Site-Level Coaching
A key component of the SNMHI implementation approach was direct, site-level coaching provided by local MHFs. These facilitators worked for and were supported by the Regional Coordinating Centers. In many cases, the Regional Coordinating Center role was fulfilled by a PCA, a membership organization with whom the majority (67.7%) of practices had a prior relationship, either individually or organizationally. The role of the MHF was to leverage their local knowledge and expertise to help practices “make meaningful changes designed to improve patients’ outcomes…[and] help physicians and improvement teams develop the skills they need to adapt clinical evidence to the specific circumstance of their practice environment.”13 In the SNMHI, MHFs were generalists who provided some PCMH content expertise; offered guidance on data collection and use; and connected practices with each other, with resources, and with local and national experts, as needed, for additional PCMH content.
The MHFs varied in terms of credentials, expertise, and time spent per practice. Over the course of the project, 27 MHFs in total were part of the SNMHI, ranging from a low of 8 in year 1 to a high of 18 in year 2. Similarly, total MHF FTE varied over the course of the project, averaging 1.52 per region and ranging from 0.7 to 2.18 FTE per year. Each region had between 10 and 15 participating practices. The majority of the MHFs had either a background in nursing, a master’s degree in a health-related field (eg, MPH, MSW, MHA, MBA), or another advanced degree (eg, MD, PhD). The majority were also experienced in quality improvement strategies and had a background working in primary care. Table 1 provides additional information on the characteristics of MHFs.
Most practices received coaching from their MHFs monthly (61.2%) or at least quarterly (22.4%). Coaching took place both in-person and by phone, with geography and distance between the coach and the practice being key determinants of the mode. Although practices regularly received support from MHFs, they also needed content and technical assistance beyond what their individual facilitators could provide. For example, more than half of practices took advantage of consulting support from the national program team for assistance with National Committee for Quality Assurance PCMH Recognition (Table 2).
Two thirds of practices said the coaching they received from their MHF(s) was helpful or very helpful, especially in communicating the changes necessary for becoming a PCMH.
In addition to coaching practices directly, MHFs worked with one another and the national program team to share coaching knowledge and implementation strategies. MHFs met monthly by phone and at least every 6 months in person. As a group, they systematically tracked and reflected on implementation progress at their sites using a shared set of SNMHI tools (see Appendix A, Supplemental Digital Content 1, http://links.lww.com/MLR/A791); they developed their own skills and understanding of the PCMH model by availing themselves of training funds and webinars; and they participated in each other’s quality improvement programming. This MHF network became invaluable as turnover among MHFs exceeded 50% over the course of the Initiative and the training and onboarding of new staff became a major focus.
Learning Communities
The second component of the SNMHI technical assistance approach was the creation and ongoing maintenance of a learning community for practices both within and across the regions. The purpose of the learning community was to create opportunities for in-person and remote interaction among health center peers, including sharing content around effective models of care; offering best practices; brainstorming solutions for common pitfalls for measurement and implementation of medical home components; and sparking and sustaining momentum to continue change efforts. Formal breakthrough series collaborative methods were included as part of the learning community in some, but not all, regions.
In-person regional learning sessions were held on a regular basis, ranging from quarterly to annually. The majority of sites (79.6%) interacted with their peers through regional or national meetings or site visits at least 4 times a year. In collaboration with the MHFs, the national program team experimented with a variety of ways to encourage meaningful interaction among sites around medical home content during these regional learning sessions. Particularly well-received methods included sessions that allowed for open dialog on a topic of choice (open space), rapid-fire sharing of changes by Change Concept to maximize the number of teams sharing their work, small group sessions, and peer-to-peer or clinic-to-clinic mentoring sessions. These methods were designed both to facilitate content sharing and to provide opportunities for relationship development among peers.
In addition to regular regional meetings, a national learning session was held at the midpoint of the Initiative. All sites participated by leading a presentation, creating a storyboard, or facilitating a small group discussion; all practice facilitators attended as well. Practices, with support from their MHFs and the national program team, were encouraged to make their presentations highly interactive. More than 90% of practices found this meeting to be helpful or very helpful, making it one of the highest rated components of technical assistance.
Another highly rated component of technical assistance was field trips, with 91.2% of sites rating them as helpful or very helpful for practice transformation. Both host and visiting sites appeared to value the chance to work together in the clinical setting. Field trips ranged from 1 to 3 days in length, and sites sent on average 2–3 staff members per trip. Sites selected different individuals (eg, administrators, Medical Directors, providers, quality improvement leads, and others) to make field trips depending on the topic and experience desired.
More than half (39/65) of SNMHI sites participated in a field trip during the course of the Initiative: 35 sites visited another site (either another SNMHI site or an external site) and 14 sites hosted another site.
On the basis of quotes from our technical assistance survey, we know that field trips offered at least 3 pathways of support. For practices having difficulty getting started, field trips to high-performing practices provided a way to “see the PCMH in action.” These visits seemed to provide a tangible vision of how PCMH could work in practice, and were described as a “turning point” by several sites. For practices in the middle of implementation, field trips provided a different kind of support. These practices most often wanted to visit sites very much like their own that were just a little further ahead. These field trips focused on sharing very specific tools and techniques. Finally, field trips appeared to provide an “energy boost,” as 1 practice described, by providing exposure to other systems also in the change process—normalizing the challenges of PCMH transformation and providing momentum to continue.
Data Monitoring and Feedback
The third component of our technical assistance approach focused on the selection of clinically meaningful metrics and coaching on how to use data to improve care. Recognizing the multiple aims of measurement, the national program team worked to codevelop a measurement strategy with the participating practices to assure usefulness, feasibility, and ownership of data at the practice level. This resulted in an approach to clinical quality measurement that was not standardized across practices (Fig. 1).
Practices selected their own measures within the following domains: clinical quality, transformation (eg, continuity, access), patient experience, provider/staff satisfaction, and utilization. These measures were reported quarterly to the national program team; some MHFs asked sites to submit them more frequently. In addition, all practices completed the PCMH-A and Key Activities Checklist regularly. Practices received feedback quarterly or more often in the form of coaching calls and 1-page dashboard reports that showed a practice’s suite of measures over time and, when applicable, compared with other sites.
Despite selecting practices with stated experience participating in quality improvement initiatives, not all sites had the capacity to collect and report reliable measures. Metrics reported for external audiences, like the Uniform Data System that many participating Federally Qualified Health Centers must submit to the Bureau of Primary Health Care, were not sufficient to support site-level quality improvement because they were submitted infrequently and were not available at the clinic, team, or provider level. Many practices had constrained capacity to field or analyze surveys of patient or provider experience. Electronic health records (EHRs) were often unable to provide population-level data that could be used for quality improvement or teams were not yet skilled in abstracting available data for this purpose.
Even with these challenges, 83.7% of practices acknowledged that collecting, submitting, and receiving feedback on data helped them implement the process improvements necessary for becoming a medical home.
Implementation Guides, Tools, and Webinars
To define the changes necessary for becoming a medical home and to support MHF coaching and practice implementation, the national program team—in collaboration with practice facilitators, practices, and outside experts—created a library of written and audio resources. These included implementation guides, tools and resources, webinars, and office hours based on the Change Concepts for Practice Transformation. Implementation guides provided an evidence-based overview and rationale for a given Change Concept, described implementation steps and activities, and provided tangible tools and case studies about implementation. Implementation guides and webinars were viewed as helpful both by the practice facilitators, for whom the guides served as a kind of medical home curriculum, and by the practices, the majority of whom made use of these materials regularly. Additional information on the type and number of resources included in the SNMHI Resource Library is published elsewhere10 (Table 3).
Webinars and implementation guides were created during the Initiative in part based on SNMHI practice feedback. When MHFs or national program team members heard repeated requests for technical assistance on a similar topic—such as how to use data for quality improvement—the national program team identified or created new resources.
The resulting SNMHI Resource Library was revised based on practice learning, experience, and feedback and republished in the public domain in 2013.
Small Grant Funds
Forty-nine sites applied for and received small grant funds to work on a variety of special projects. Mini-grants focused on improvement in specific areas to address barriers to PCMH transformation. Most focused on improving team-based care or data collection and reporting capacity, although topics varied. Examples of special projects included developing a staff training video on huddles, developing materials to educate patients on team-based care, hosting a retreat to form care teams, and attending a training to help extract population-level data from an EHR. In addition, all sites received a small one-time grant to support data collection and reporting.
Discussion
In 2008, the SNMHI was launched with the goal of developing a replicable and sustainable model of technical assistance for PCMH transformation.11 Since that time, new regulations have been passed as part of section 5405 of the Affordable Care Act, and organizations and policymakers have become increasingly interested in developing lasting infrastructure to support primary care practice improvement. The 5-component approach outlined above capitalized on both national and local resources. Reflecting on 4 years of technical assistance support and peer partnership, nearly 90% of practices said the multimodal approach used by the SNMHI was helpful or very helpful in supporting their transformation efforts. Although summative evaluation data have not yet been published, our experience highlights several lessons that may be of use to others developing broad-scale practice facilitation programs.
First, the practice survey responses indicate that peer-learning opportunities, including in-person meetings and field trips, were particularly well received by practices and contributed to vitally important components of practice transformation: envisioning change, overcoming implementation obstacles, and renewing momentum and motivation. The regional and national meetings, as well as the field trips, provided opportunities for staff in multiple roles to catch a vision for a new way of delivering care. They also provided a forum for sharing concrete ideas and commiserating about challenges. For a change as comprehensive and wide ranging as the medical home, we suggest that peer interactions, especially in-person visits to the clinic site, may play an outsized role in helping practices envision a new way of delivering care. In fact, practices that participated in a field trip scored higher on our medical home implementation assessment tool, the PCMH-A,9 at the end of the initiative than peers that did not make site visits (9.56 vs. 9.25, P=0.06).
Second, gathering, reporting, and using clinical data for quality improvement continues to be a challenge for many practices, despite robust technical assistance from both the national program team and regional MHFs, including onsite expert coaching at regional meetings, several webinars describing how to build quality improvement capacity, and an implementation guide on using data to support quality improvement. The presence of an EHR did not necessarily facilitate change. Even with large-scale investments in health information technology, practices continue to need support to use EHRs effectively for measurement and to redesign workflows so those data can lead to real practice change.
External stakeholders such as local hospitals, quality improvement organizations, grantmakers, and policymakers are increasingly interested in practice-level data across a variety of other domains, including cost and utilization and patient and staff experience, yet practices have little to no experience or capacity to measure them. By acknowledging a practice’s limited capacity to engage in measurement activities, we recognize the tension between measuring for quality improvement and generating comparable, valid data for program monitoring. Much more work is needed to improve commonly collected, validated measures, such as the Uniform Data System and the Consumer Assessment of Healthcare Providers and Systems, to ensure they are actionable at the site level and helpful for clinical quality improvement. This may entail adjusting the level at which and the frequency with which data are collected.
Finally, there is an important role for both local and national organizations to provide nonduplicative, mutually reinforcing support across practices and regions. Local facilitators with an established relationship with practices brought important assets to practice transformation, including trust, contextual knowledge, and a longitudinal perspective that would have been difficult if not impossible to recreate nationally. The high level of receptivity to the coaching may be due in part to the fact that the MHFs worked for local organizations that shared a history of working on quality improvement efforts with the practices. In fact, most MHFs worked for PCAs, membership organizations to which the majority of SNMHI practices belonged. MHFs working for nonmembership organizations, such as payers, and PCA coaches supporting nonmembers had a more difficult time reaching and connecting with sites. The PCMH-A scores of sites that were and were not PCA members were comparable at baseline, but PCA member sites showed a greater increase in PCMH-A scores over time than non-PCA member sites. This finding suggests that organizational trust between a sponsoring organization and a primary care practice may be a factor in supporting practice transformation. More research is needed (Table 4).
In addition to local support, the national program team provided important cross-regional learning opportunities and spearheaded the development of materials and tools for both sites and facilitators. The national program team supported a robust learning community for the MHFs to receive ongoing support, networking, and training. With turnover rates among practice facilitators at 50% over a 4-year span, the importance of a learning community for MHFs became increasingly clear. Just as teams cited that peer-learning was important in their transformational journey, so too did the practice facilitators need a space to exchange ideas and improve their skills.
Only 15% of primary care practices have been recognized as PCMHs by the National Committee for Quality Assurance14; others involved in PCMH pilots and demonstrations have made great strides as well. The tools and lessons learned from the SNMHI’s technical assistance approach may be useful as we help the remaining majority of practices increase their team-based, proactive, and patient-focused care. Technical assistance efforts must be equipped to support the complexity, breadth, and duration of change required to be a medical home—change that touches everything and everyone in a primary care practice. Indeed, the “epic whole-practice reimagination” of primary care requires epic support for both practices and practice facilitators.15