In 2010, with the introduction of the Affordable Care Act, patient engagement became a major focus of health care improvement efforts nationally.1 Patient engagement falls under the umbrella of patient-centered care and definitions of patient engagement focus on the idea of promoting active patient involvement in health care by supporting patient participation in decisions related to their health in an educational and supportive environment.2 Growing evidence suggests that patient engagement can aid in accomplishing the Institute for Healthcare Improvement’s Triple Aim3—improving the patient experience of care, improving population health, and reducing health care costs. However, patient engagement has not been consistently defined, operationalized, or translated into practice.1
Even with inconsistencies in definition and operationalization, literature suggests minority populations tend to be less engaged with health care4,5 and physician communication with minority patients is less patient-centered.6 Attempts to implement technology-based interventions such as patient portals and personal health records to enhance engagement have been less successful with minority patients.7,8 In addition, quality improvement (QI) efforts intended to target health disparities using patient-centered approaches have not been effective at decreasing disparities in health care quality9 or outcomes.10 There are a number of potential benefits to improving patient engagement in clinical settings. Patients who are more engaged tend to make better use of resources, and have a better awareness and understanding of their conditions and health outcomes.11 Patient engagement has also been shown to improve health care quality and safety.12
There are many obstacles to successful implementation of patient engagement practices.2 Providers face challenges that include limited time and resources and, in some cases, inadequate skills. Compared with whites, minority patients often have disproportionately lower health literacy and higher distrust of the health care system.13 In addition, the power differential between patients and providers is wider for minority patients than for the general population.14 These challenges highlight the need for more robust tools to support patients, providers, and organizations in improving patient engagement, particularly among minority populations.15,16 Kilbourne and colleagues’ model of health disparities identifies key determinants for understanding inequities at the individual, provider, and health care system levels. In this model, organizational culture, attitudes, and communication are important to the successful implementation of disparity-reducing interventions.17
A core tenet of the Patient Centered Medical Home (PCMH) model is engaging patients through the delivery of patient-centered care. The Veterans Health Administration (VHA) began implementing its PCMH model, known as Patient Aligned Care Teams (PACT), in 2010.18 Patient engagement is central to PACT, which aims to “meet patients where they are” by creating partnerships with patients, improving access to care, and utilizing a team-based approach to care.19 The medical home model focuses on improving care delivery for the sickest patients most in need of care coordination (a population with significant numbers of low-income and racial and ethnic minority patients) as well as addressing social determinants of health. However, to date there is little evidence that the medical home model has been successful either in engaging patients or in reducing disparities in patient engagement.20
Our objective was to develop a toolkit to help VHA providers increase patient engagement and reduce disparities. We sought to understand the universe of practices that primary care clinicians and leadership use for patient engagement and, using a Delphi process, narrow the universe by examining how important and feasible these practices were to implement. To achieve this objective we conducted observations and interviews in facilities that serve large minority patient populations. Here we describe the process of creating this toolkit and the practices identified.
We began with a series of qualitative interviews and site visits at VHA primary care sites serving large minority populations to identify patient engagement practices and resources currently in use. We interviewed clinic staff and leadership and shadowed patients to identify facility culture, policies, and activities related to patient engagement. Interviews were coded and analyzed with the goal of compiling a list of patient engagement practices. We then used a modified Delphi process, conducting a series of conference calls and surveys in which key stakeholders helped to narrow down practices to develop a toolkit. This evaluation project was reviewed by the Michael J. Crescenz VA Medical Center Institutional Review Board and deemed to be a QI effort. Evaluators did not interview patients directly regarding their experiences; however, 2 patient representatives participated in the Delphi meetings.
We used a positive deviance approach, stratifying site selection by the concentration of minority patients served and facility performance on a measure of patient engagement. The positive deviance approach, in which high performers are contrasted with low performers, seeks to identify unique practices of positive deviants.21
We categorized sites as high–minority-serving based on the proportion of nonwhite patients served at the site. Sites with at least a 15% nonwhite patient population (the national median) were considered high–minority-serving. To measure patient engagement at each site, we used a modified version of the PACT Implementation Index, a measure developed to assess the implementation of PACT across the VHA.22 This modified measure consists of 4 patient engagement components from a VHA-specific PCMH Survey of Healthcare Experiences of Patients23,24: care comprehensiveness, self-management support, patient-centered communication, and shared decision making. Sites were assigned a 1 or −1 for each domain for which they are in the top or bottom quartile nationally and 0 otherwise. Scores were summed across items resulting in a 9-point scale from −4 to 4.25 Sites with scores above 0 were considered high performing and those 0 and below were low performing.
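As a sketch, the site-level scoring described above can be expressed in a few lines of Python; the domain keys and quartile labels below are illustrative placeholders, not the survey’s actual field names.

```python
def engagement_score(domain_quartiles):
    """Assign +1 for a top-quartile domain, -1 for a bottom-quartile
    domain, and 0 otherwise, then sum across the 4 patient engagement
    domains, yielding a score from -4 to +4."""
    points = {"top": 1, "bottom": -1, "middle": 0}
    return sum(points[q] for q in domain_quartiles.values())

# Hypothetical site: top quartile in 2 domains, bottom quartile in 1.
site = {
    "care_comprehensiveness": "top",
    "self_management_support": "middle",
    "patient_centered_communication": "top",
    "shared_decision_making": "bottom",
}
score = engagement_score(site)   # 1 + 0 + 1 - 1 = 1
high_performing = score > 0      # scores above 0 are high performing
```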
We included a national sample of sites. In preparation for implementation of a regional patient engagement intervention, we oversampled sites located within the regional VHA service network which includes parts of Pennsylvania, Delaware, New Jersey, Ohio, and West Virginia. We selected 32 sites, evenly split in terms of performance, for telephone interviews (20 medical centers with outpatient clinics and 12 community-based outpatient clinics). Half of all sites selected were high–minority-serving. At each site the following people were targeted for interviews: the physician director of primary care services, the nurse director of primary care services, and the patient customer service representatives. In addition, from the 32 sites, 6 regionally diverse sites from across the United States were selected for site visits. Five of these sites were high–minority-serving. Half of the 6 sites visited were high performing on the performance measure while the other half were low performing. At each site visit, we shadowed patients through their appointments and interviewed a wide range of frontline staff including providers on primary care teams (teams are made up of a primary care provider, nurse care manager, clinical associate, and administrative clerk) and ancillary staff.
The interview guide was tailored to the role of the participant and included a definition of patient engagement,26 defined as any practices, policies, or procedures that: (1) involve and support patients (and their families and representatives) as active members of the health care team; and (2) encourage collaborative partnerships between patients, health care providers, and the organization as a whole.27 Respondents were asked to describe the patient engagement efforts occurring at their facilities as well as any barriers and facilitators to implementation. These open-ended, semistructured interviews were confidential, and each lasted approximately 30 minutes. During site visits, additional data were collected through direct observation of patient and staff interactions. Observations included patient shadowing, patient group visits, staff meetings, public space observations, and guided tours. Observations were used to obtain a more complete picture of activities happening on the ground. Interviewees may not have been aware that they were using a patient engagement practice, but evaluators noted it as such when observed.
Data were audio recorded, transcribed, and analyzed using NVivo 10.28 A total of 204 data points were imported into the qualitative database, including 155 interviews, 22 field notes of patient observations, 8 field notes of class observations, and 19 field notes of facility tours and site observations. Four qualitative evaluators used an applied thematic analysis approach29 to formulate a codebook capturing the activities, policies, and procedures that support and hinder patient engagement. Each coder applied the codebook independently to no more than 3 transcripts at a time before testing for inter-rater reliability and resolving discrepancies with the group. A total of 20% of the transcripts were double coded by the 4 evaluators. Estimates of inter-rater reliability produced an average κ statistic of 0.86, with a range of 0.61–1.0.30 Coding discrepancies were resolved through discussion and consensus. The evaluation team initially organized the data by program or resource and by when and how patient interactions occurred. The data were further scrutinized to select items that demonstrated actionable patient engagement practices. Once an initial list of practices emerged from the subcodes, they were separated into 2 lists: patient engagement practices and patient engagement resources. Patient engagement practices were activities that engaged patients directly in their care; patient engagement resources were activities or assets that helped to facilitate patient engagement practices at a facility.
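The inter-rater reliability check can be illustrated with a minimal Cohen’s κ computation for two coders; because the study used 4 coders and reports an average κ, a pairwise-averaged variant would be needed in practice. This is a sketch, not the study’s actual analysis code.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' labels on the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labeled identically.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe = sum(freq_a[k] * freq_b[k] for k in freq_a) / n ** 2
    return (po - pe) / (1 - pe)
```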
The Modified Delphi Method
To narrow the universe of patient engagement practices and resources identified through qualitative analyses, we used a modified Delphi method, a technique for building consensus through surveys and group communication.31 The Delphi method draws on experts on a particular topic to guide researchers in addressing a problem. Because the modified Delphi method does not require group members to be physically present in one location,32 participants called in to an online meeting to view shared content and provided input via confidential online surveys accessed through a survey link.
Participants for the modified Delphi meetings were selected through a purposeful sampling strategy.33 Ten participants were providers of patient-centered health care from our national sample of facilities described above. These included 2 physician directors of primary care services, 3 nurse directors of primary care services, 3 patient customer service representatives, a health behavior coordinator, and a health promotion disease prevention coordinator. Two participants were patients who received their primary care at a VHA facility in an urban location. The group participated in three 90-minute modified Delphi calls to pare down the practices and resources to final lists based on importance and feasibility.
For the first modified Delphi meeting, participants completed confidential online surveys rating each practice and resource. Respondents were first asked “How important is this practice or resource to help patients be more engaged in their health care?” and then “How feasible is this practice or resource to implement?” Items were rated on a scale ranging from 1 (extremely important/extremely feasible) to 5 (not important at all/not feasible at all). Based on these ratings, items were sorted into 4 groups: high importance and feasibility, medium importance and feasibility, low importance and feasibility, and polarized. Items in the high group were retained and not discussed further. Items in the low group were removed and not reconsidered. Items rated medium or polarized were kept for further discussion and rating in the second round.
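The round-1 grouping logic can be sketched as follows. The paper does not report the numeric cutoffs used to distinguish high, medium, low, and polarized items, so the thresholds below are illustrative assumptions only.

```python
import statistics

def classify_item(ratings):
    """Illustrative round-1 grouping of an item's ratings
    (1 = extremely important/feasible .. 5 = not at all).
    The spread and mean cutoffs here are assumptions, not the
    study's actual decision rules."""
    spread = statistics.pstdev(ratings)
    mean = statistics.mean(ratings)
    if spread > 1.5:
        return "polarized"  # raters split between the extremes
    if mean <= 2.0:
        return "high"       # retained without further discussion
    if mean >= 4.0:
        return "low"        # removed and not reconsidered
    return "medium"         # revisited in round 2
```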
For the second modified Delphi meeting, participants were presented with the first round survey results and asked to review and discuss the items that were polarized. After the discussion, participants were asked to rerate the polarized items and then asked to either “keep” or “remove” the items that had previously fallen into the medium group. Polarized items reclassified as high in importance/feasibility remained on the list as well as any medium rated items that were selected as “keep” by at least 50% of the respondents.
For the third modified Delphi meeting, participants were asked to select their preferred items from all of the “high” rated practices and resources from calls 1 and 2 and the “keep” practices and resources selected in call 2. These remaining practices and resources were randomly sorted into groups of approximately 10 for consideration by the participants. Participants were asked to select their “top 3” items from each group. A team of evaluators reviewed the outcomes of the final survey and removed items on the practices list that received fewer than 5 “keep” votes and items on the resources list that received fewer than 4 “keep” votes.
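The final filtering step amounts to a simple vote-count threshold, sketched below; the item names and tallies are hypothetical.

```python
def filter_by_keep_votes(vote_counts, threshold):
    """Retain items whose 'keep' vote count meets the threshold
    (5 for practices and 4 for resources in this evaluation)."""
    return {item: n for item, n in vote_counts.items() if n >= threshold}

# Hypothetical vote tallies for three practice items.
practice_votes = {
    "previsit call": 7,
    "visit summary sheet": 5,
    "waiting room posters": 3,
}
kept = filter_by_keep_votes(practice_votes, threshold=5)
```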
Qualitative Interviews and Site Visit Results
Patient Engagement Practices
Coding resulted in 5 categories of practices: engagement with a patient that occurred right before the visit (previsit), during the visit (visit), right after the visit (postvisit), in between appointments (between visits), and in group settings (classes & clinics). Exemplary quotes discussing practices are included in Table 1.
Before the visit (or “previsit”), respondents described communicating with patients in ways that would make the actual visit more productive. One such practice was to call the patient before the visit to discuss the upcoming appointment and elicit the patient’s visit priorities. To improve the wait-time experience, one facility monitored clinic flow and communicated any delays in real time. Other practices required staff to prepare for the visit by reviewing patient records in advance and asking the patient about the primary goal for the visit. At some facilities, patients were provided with summary sheets of information from their last visit to review while they waited, along with check-in sheets that asked what questions they might have for their provider.
During the visit, staff asked patients if all of their needs had been met and gathered information that helped with understanding the patients’ health contexts—for example, by asking open-ended questions about self-care, making small talk about family and home life, and asking about non–health-related concerns such as food and housing security. To promote patient agency, staff would assist patients by setting SMART (specific, measurable, achievable, realistic, time-related)34 goals with them and asking about their opinions on the available treatment options. To help make patients feel more at ease, staff described paying attention to patient body language, avoiding jargon, actively listening, providing opportunities to ask questions, and personally introducing patients to other providers for warm hand-offs.
At the end of the visit (or “postvisit”) respondents discussed ensuring the patient was clear on the decisions that were made during the visit and any required next steps. This included summarizing everything that happened during the visit, discussing medication changes, providing information on when medications needed to be refilled, and confirming follow-up appointment dates. These practices were either performed verbally or by providing an appointment and medication information sheet during check-out. Anticipating the patient’s needs going forward and educating them about the available programs to meet those needs were also postvisit practices.
In between appointments (or “between visits”) respondents discussed strategies for remaining in contact with patients to keep them engaged. Collaborating and scheduling follow-ups with the extended health team members (such as pharmacists, behavioral health, and social workers) helped patients resolve issues with self-care and medication adherence and also provided opportunities for health education.
To remain in communication with the patient after the visit, staff talked about conducting postdischarge follow-up calls, promoting the use of secure online messaging to reach staff with questions, and utilizing telehealth services where available. Providing patient support materials, such as educational resources on a patient’s chronic conditions, home logs to track health care progress, and contact information for the health care team, was another way to keep the patient engaged in their care.
Group activities for patients under “classes & clinics” revolved around establishing support groups, group clinics, and group classes for specific target areas such as mental health, homelessness, chronic pain, hypertension, or diabetes. Clinics and classes were held by social workers, nutritionists, mental health workers, and other staff. The group sessions included shared medical appointments, which were logged in patients’ charts and provided an opportunity to share experiences, talk to specialists, review lab results, and promote success stories.
Patient Engagement Resources
The initial coding cycle yielded a list of resources that was divided into 3 categories: resources for patients, resources for staff, and resources for both patients and staff. Exemplary descriptions of resources are included in Table 2.
Resources for patients included educational, outreach, and promotional materials. Respondents described a variety of educational materials, in both paper and electronic versions, that could be disseminated to patients. Electronic materials included customized waiting room televisions and electronic bulletin boards, tutorials about the patient portal, and online patient health libraries. Some clinics also promoted their facility’s social media pages to encourage patients to get more involved and to provide access to educational materials. Respondents also discussed paper materials placed in visible locations and common areas or given directly to patients to make them more aware of their health. Some facilities had an in-house “learning center” staffed with employees or volunteers to help patients find educational materials, sign up for the electronic patient portal, and participate in classes. Other venues for providing patients with information included new patient orientations and health education outreach fairs.
In addition to education, some facilities provided patients with opportunities for giving feedback. These “talk-back” sessions and “town hall” meetings were designed to allow patients to ask questions, share their experiences, and discuss complaints. At one facility, patients were invited to participate in quarterly departmental meetings, and at another the center director would field patient complaints through phone meetings.
Resources for staff consisted primarily of training and supports for training. At many facilities, customer service training was made mandatory for all incoming staff. At other sites, customer service training was targeted at departments with a high volume of complaints. Motivational interviewing (MI)35 and TEACH36,37 trainings were implemented widely across most sites and were mentioned by most respondents. MI is designed to motivate patients to make health behavior changes that are congruent with their lifestyle.35 TEACH trains clinicians to coach and educate patients to improve their health outcomes by exploring their preferences and needs and honoring them as equal partners.36,37 Various sites reported different types of patient-centered care training, typically lasting several days.
In addition to training, facilities also addressed staffing, culture, space, and time constraints. Some sites found new, creative ways to utilize staff. For example, clinics would assign a “float staff” whose job it was to manage the unexpected or unscheduled needs of patients. The main goal for most sites was to create a more collaborative team-based approach including extended team members (eg, social workers, pharmacists, psychologists). Many sites discussed creating protected time for training, meetings, huddles, and other administrative tasks. Respondents also discussed addressing space constraints through innovative design, renovation, and organization to support the PACT model.
Many of the resources that would impact both patients and staff were efforts aimed at QI. In some cases, these QI initiatives involved larger, nationwide surveys and programs to respond to feedback from those surveys. In other cases, facilities generated their own QI programs. For example, one site started a “mystery shopper” initiative using “unannounced standardized patients” who arrived clandestinely and role-played various scenarios to look for potential gaps in providers’ practices. At various facilities, respondents also described “steering committees” aimed at monitoring patient satisfaction and discussing improvement.
Modified Delphi Results
Figure 1 provides an overview of the project, from the qualitative data collection, to qualitative results, to the pre-Delphi lists, through the modified Delphi process, to the final toolkit. The lists of 128 practices and 94 resources (222 items total) were shortened during 3 modified Delphi meetings. After the first round, 74 highly rated items were set aside to remain on the list and 32 low rated items were eliminated. Respondents were polarized on the importance and/or feasibility of 60 items, and an additional 56 items were rated as medium in importance and/or feasibility. In the second meeting, of these 60 polarized and 56 medium rated items, 57 were kept and 59 were removed. In total, 131 items remained on the list (74 from round 1 and 57 from round 2); 90 were patient engagement practices and 41 were resources.
At the third meeting, participants rated and selected what they viewed as the top practices and resources from the 131 remaining items. During this meeting each participant selected their 3 preferred practices or resources from 13 randomly sorted sequential groups of 10–11 items. A total of 76 items were eliminated after this step. Items that received a score of 5 or above on the practices list and 4 or above on the resources list remained. The final patient engagement lists for the toolkit contained 36 practices and 19 resources.
Post Delphi Toolkit
Table 3 depicts the postmodified Delphi practices toolkit. The 3 practices in the previsit category involved activating patients for a visit by providing summary information and check-in sheets, and preparing providers by reviewing patients’ records before the visit. The visit section was the largest, with 17 items, including such activities as building rapport with patients through clear and transparent communication. The 2 postvisit items related to summarizing the visit and establishing next steps. The 8 items in the between-visits section included such activities as scheduling follow-ups with extended care team members. The 6 items remaining in the classes and clinics section of the practices list included offering a new patient orientation and other group activities.
Table 4 depicts the postmodified Delphi resources toolkit. After reevaluating the remaining items, we restructured the toolkit into the following categories: environment, training, communication, and feedback. The environment section included 9 items, such as improving phone access and responsiveness. The 5 items in the training section included requiring staff trainings on such topics as interpersonal communication and MI. The 2 items in the communication section included providing customized programming for patients on televisions or electronic bulletin boards. Finally, the 3 items in the feedback section included audio or video recording clinical encounters as a feedback opportunity for staff, establishing a primary care advisory committee, and appointing a patient advocate to manage complaints. This final list of 36 practices and 19 resources was assembled into an online tool (www.visn4.va.gov/VISN4/CEPACT/PE_practices/PE_tools.asp) for dissemination.
We found no differences in patient engagement practices between high-performing and low-performing sites. However, high-performing sites tended to describe more training opportunities and staff feedback mechanisms. Specifically, high-performing sites were more likely to describe requirements for training and refresher training in motivational interviewing. High performers were also more likely to describe feedback mechanisms and QI initiatives at their facilities. In addition, when examining barriers to patient engagement, all sites faced the same barriers. These included communication barriers, care coordination issues, organizational constraints, patient care barriers, space constraints, staffing constraints, and time constraints. However, low-performing and high–minority-serving sites more often reported barriers to implementation of patient engagement practices. No identified practice or resource specifically targeted patient engagement of minorities or addressed disparities.
This paper demonstrates the use of qualitative methods and a modified Delphi approach to develop a toolkit aimed at increasing patient engagement and reducing disparities in engagement. Qualitative interviews and site observations at VHA facilities provided a rich data source from clinical staff and allowed the evaluation team to assemble comprehensive lists of practices and resources successfully used at high–minority-serving institutions with high patient engagement ratings. These exhaustive lists were then narrowed and prioritized using a modified Delphi method, an effective means of building consensus to create a toolkit that is concise and digestible for a target audience.
The elements of the toolkit, comprising patient engagement practices and resources, are not unexpected. Many of the items on the list are fundamental, routine activities related to patient care, most taking place during the visit. Patient engagement activities occurred at various levels within all the organizations we evaluated, regardless of performance; however, those with high performance scores were more likely to describe training opportunities, feedback mechanisms, and a focus on motivational interviewing. We also found that specific resources, such as training and QI initiatives, support and may be necessary for successful implementation of patient engagement practices. Barello and colleagues’ review of over 1000 articles on patient engagement revealed that definitions of patient engagement tend to be narrow, missing components that may hinder or facilitate patient engagement.38 We captured a broader range of activities related to patient engagement, creating a toolkit of both practices and resources that is translatable to any health care setting.
Many of the identified patient-engagement practices centered on providing patients with agency over their own care. These types of activities can be crucial to promoting patient activation, an important step that has been shown to improve health outcomes.39 Patient activation has been found to be particularly important for minority patients.40–42 Activating patients requires providers to fundamentally change the way in which they communicate with patients,43,44 using methods such as motivational interviewing to foster a “collaborative relationship” with patients.45 Collaborative relationships can reduce distrust, which among minority patients can stem from prior experience of racial discrimination in a clinical setting.46 Building trust has also been shown to be a key factor in health care utilization.47
We identified a number of system and local barriers to patient engagement that came up more often at low-performing sites. Similar barriers have been identified in other studies, including a concern among staff about disruption to daily routines,48 time constraints,14,49 and staffing shortages.50 Key facilitators that have been identified for promoting patient engagement include clearer communication, maintaining staff satisfaction, building staff capacity, and a flexible culture dedicated to learning.51 Given the barriers expressed by our respondents, it remains to be seen whether this toolkit will lead to noticeable improvement on performance measures. Kilbourne et al17 emphasize the importance of taking multiple steps to apply, evaluate, and further refine interventions to translate them into practice and reduce disparities. To fill the gaps that currently exist in patient engagement, we plan to translate this toolkit into practice by evaluating a small-scale implementation to learn the best way to disseminate it.
The methods used in the development of this patient engagement toolkit have inherent strengths and weaknesses. While the modified Delphi approach allows for consensus building among diverse stakeholders, it is also possible that pressure from group members may stifle some unique perspectives. It was not possible to anonymize the participants during the online meetings, but complete anonymity cannot be guaranteed in most modified Delphi processes.52 It is also possible that a number of items fell off the list because of limited opportunity for discussion. However, on the whole, the modified Delphi approach is a rigorous and systematic method for contributing to the limited body of knowledge about patient engagement.
We are now disseminating the toolkit created in this process across a local network of facilities. Implementation of these practices is being tracked on an ongoing basis along with facility performance on measures of patient-centered communication, self-management support, mental health support, and shared decision making. As sites implement the toolkit, our evaluation team will connect with them individually and in group settings to provide coaching and monitor progress. This will enable the development of a final set of “best” practices that will have been listed, prioritized, and vetted by clinical staff on the ground.
The authors thank the staff of the VISN 4 Center for Evaluation of Patient Aligned Care Teams (CEPACT).
1. Frosch D, Dardess P, Baade S, et al. Moving from mandate to reality: a roadmap for patient and family engagement. 2015. Available at: http://Healthaffairs.Org/Blog/2015/01/20/Moving-from-Mandate-to-Reality-a-Roadmap-for-Patient-and-Family-Engagement. Accessed October 1, 2016.
2. Carman KL, Dardess P, Maurer M, et al. Patient and family engagement: a framework for understanding the elements and developing interventions and policies. Health Aff (Millwood). 2013;32:223–231.
3. Berwick DM, Nolan TW, Whittington J. The triple aim: care, health, and cost. Health Aff (Millwood). 2008;27:759–769.
4. Hibbard JH, Greene J, Becker ER, et al. Racial/ethnic disparities and consumer activation in health. Health Aff (Millwood). 2008;27:1442–1453.
5. Nelson LA, Mulvaney SA, Gebretsadik T, et al. Disparities in the use of a mHealth medication adherence promotion intervention for low-income adults with type 2 diabetes. J Am Med Inform Assoc. 2016;23:12–18.
6. Johnson RL, Roter D, Powe NR, et al. Patient race/ethnicity and quality of patient-physician communication during medical visits. Am J Public Health. 2004;94:2084–2090.
7. Yamin CK, Emani S, Williams DH, et al. The digital divide in adoption and use of a personal health record. Arch Intern Med. 2011;171:568–574.
8. Jhamb M, Cavanaugh KL, Bian A, et al. Disparities in electronic health record patient portal use in nephrology clinics. Clin J Am Soc Nephrol. 2015;10:2013–2022.
9. Hicks LS, O’Malley AJ, Lieu TA, et al. Impact of health disparities collaboratives on racial/ethnic and insurance disparities in US community health centers. Arch Intern Med. 2010;170:279–286.
10. Trivedi AN, Grebla RC, Wright SM, et al. Despite improved quality of care in the Veterans Affairs health system, racial disparity persists for important clinical outcomes. Health Aff (Millwood). 2011;30:707–715.
11. O’Connor AM, Bennett CL, Stacey D, et al. Decision aids for people facing health treatment or screening decisions. Cochrane Database Syst Rev. 2009.
12. Hinkin J. Hand decontamination: what interventions improve compliance? EDTNA-ERCA J. 2002;28:134–137.
13. Shea JA, Micco E, Dean LT, et al. Development of a revised health care system distrust scale. J Gen Intern Med. 2008;23:727–732.
14. Grande SW, Faber MJ, Durand M, et al. A classification model of patient engagement methods and assessment of their feasibility in real-world settings. Patient Educ Couns. 2014;95:281–287.
15. Say RE, Thomson R. The importance of patient preferences in treatment decisions: challenges for doctors. BMJ. 2003;327:542–545.
16. Burns KK, Bellows M, Eigenseher C, et al. “Practical” resources to support patient and family engagement in healthcare decisions: a scoping review. BMC Health Serv Res. 2014;14:175–189.
17. Kilbourne AM, Switzer G, Hyman K, et al. Advancing health disparities research within the health care system: a conceptual framework. Am J Public Health. 2006;96:2113–2121.
18. Rosland AM, Nelson K, Sun H, et al. The patient-centered medical home in the Veterans Health Administration. Am J Manag Care. 2013;19:e263–e272.
19. Chokshi DA, Schectman G, Agarwal M. Patient-centered innovation: the VA approach. Healthcare. 2013;1:72–75.
20. Belue R, Degboe A, Miranda P, et al. Do medical homes reduce disparities in receipt of preventive services between children living in immigrant and non-immigrant families? J Immigr Minor Health. 2012;14:617–625.
21. Marsh DR, Schroeder DG, Dearden KA, et al. The power of positive deviance. BMJ. 2004;329:1177–1179.
22. Nelson KM, Helfrich C, Sun H, et al. Implementation of the patient-centered medical home in the Veterans Health Administration: associations with patient satisfaction, quality of care, staff burnout, and hospital and emergency department use. JAMA Intern Med. 2014;174:1350–1358.
23. Dyer N, Sorra JS, Smith SA, et al. Psychometric properties of the consumer assessment of healthcare providers and systems (CAHPS(R)) clinician and group adult visit survey. Med Care. 2012;50(suppl):S28–S34.
24. Scholle SH, Vuong O, Ding L, et al. Development of and field test results for the CAHPS PCMH survey. Med Care. 2012;50(suppl):S2–S10.
25. Hausmann LR, Canamucio A, Gao S, et al. Racial and ethnic minority concentration in Veterans Affairs facilities and delivery of patient-centered primary care. Popul Health Manag. 2016.
26. Maurer M, Dardess P, Carman KL, et al. Guide to patient and family engagement: environmental scan report. 2012. Available at: http://www.ahrq.gov/research/findings/final-reports/ptfamilyscan/ptfamilyscan.pdf. Accessed January 17, 2015.
27. Maurer M, Dardess P, Carman KL, et al. Guide to Patient and Family Engagement: Environmental Scan Report. Rockville, MD: Agency for Healthcare Research and Quality; 2012.
28. QSR International Pty Ltd. NVivo qualitative data analysis software, version 11; 2014.
29. Guest G, MacQueen KM, Namey EE. Applied Thematic Analysis. Thousand Oaks, CA: Sage; 2011.
30. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174.
31. Hsu C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess Res Eval. 2007;12:1–8.
32. Mitchell TR. People in organizations: an introduction to organizational behavior. New York: McGraw-Hill; 1982.
33. Patton MQ. Qualitative Research and Evaluation Methods, 3rd ed. Thousand Oaks, CA: Sage; 2002.
34. Doran GT. There’s a SMART way to write management’s goals and objectives. Manage Rev. 1981;70:35–36.
35. Rollnick S, Miller WR, Butler CC. Motivational Interviewing in Health Care: Helping Patients Change Behavior. New York, NY: Guilford Press; 2008.
36. Zapatka SA, Conelius J, Edwards J, et al. Pioneering a primary care adult nurse practitioner interprofessional fellowship. J Nurse Pract. 2014;10:378–386.
37. VA Centers of Excellence in Primary Care Education (CoEPCE). Available at: www.va.gov/OAA/coepce/. Accessed March 29, 2017.
38. Barello S, Graffigna G, Vegni E, et al. The challenges of conceptualizing patient engagement in healthcare: a lexicographic literature review. J Particip Med. 2014;6:e9.
39. Hibbard JH, Greene J. What the evidence shows about patient activation: Better health outcomes and care experiences; fewer data on costs. Health Aff (Millwood). 2013;32:207–214.
40. Alegria M, Polo A, Gao S, et al. Evaluation of a patient activation and empowerment intervention in mental health care. Med Care. 2008;46:247–256.
41. Alegría M, Sribney W, Perez D, et al. The role of patient activation on patient–provider communication and quality of care for US- and foreign-born Latino patients. J Gen Intern Med. 2009;24:534–541.
42. Katz ML, Fisher JL, Fleming K, et al. Patient activation increases colorectal cancer screening rates: a randomized trial among low-income minority patients. Cancer Epidemiol Biomarkers Prev. 2012;21:45–52.
43. Frosch DL, May SG, Rendle KA, et al. Authoritarian physicians and patients’ fear of being labeled “difficult” among key obstacles to shared decision making. Health Aff (Millwood). 2012;31:1030–1038.
44. Entwistle VA, Watt IS. Patient involvement in treatment decision-making: the case for a broader conceptual framework. Patient Educ Couns. 2006;63:268–278.
45. Graves E, Watkins RW. Motivational interviewing: patient engagement as the key to healthy practices, patients, and practitioners. N C Med J. 2015;76:175–176.
46. Armstrong K, Putt M, Halbert CH, et al. Prior experiences of racial discrimination and racial differences in health care system distrust. Med Care. 2013;51:144–150.
47. LaVeist TA, Isaac LA, Williams KP. Mistrust of health care organizations is associated with underutilization of health services. Health Serv Res. 2009;44:2093–2105.
48. Elwyn G, Scholl I, Tietbohl C, et al. “Many miles to go…”: a systematic review of the implementation of patient decision support interventions into routine clinical practice. BMC Med Inform Decis Mak. 2013;13:S14.
49. Gravel K, Légaré F, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: a systematic review of health professionals’ perceptions. Implement Sci. 2006;1:526–535.
50. Weiss ME, Yakusheva O, Bobay KL. Quality and cost analysis of nurse staffing, discharge preparation, and postdischarge utilization. Health Serv Res. 2011;46:1473–1494.
51. Luxford K, Safran DG, Delbanco T. Promoting patient-centered care: a qualitative study of facilitators and barriers in healthcare organizations with a reputation for improving the patient experience. Int J Qual Health Care. 2011;23:510–515.
52. Keeney S, Hasson F, McKenna HP. A critical review of the Delphi technique as a research methodology for nursing. Int J Nurs Stud. 2001;38:195–200.