SOCIAL DETERMINANTS OF HEALTH (SDH) are “the economic and social conditions that influence ... health” (Commission on Social Determinants of Health, 2008). SDH interact with biological factors, disease status, and behavior to affect myriad health outcomes (Berkowitz et al., 2016; Garg et al., 2013), with particularly negative health impacts in socioeconomically vulnerable populations. Yet, in the United States, the appropriate role of primary care in addressing SDH (aka “social needs”) remains poorly defined (Gottlieb et al., 2017), and methods for effectively assessing and addressing these needs in clinical settings are not well understood. This is unfortunate because SDH screening in primary care settings could (i) improve health care teams' ability to understand the “upstream” factors impacting their patients' health and ability to act on care recommendations; (ii) inform clinical care decisions; and (iii) identify patients in need of referral to community resources to address identified needs (Garg et al., 2015; Gottlieb et al., 2013). It could also inform the provision and funding of community resources by providing data showing the need for such services.
Several national efforts now underway support standardizing SDH data collection in electronic health records (EHRs) so as to make SDH data available to care teams. An Institute of Medicine (IOM) committee was convened in 2014 to identify social and behavioral domains that most strongly determine health and to identify the measures of those domains that could be used in EHRs. The committee's 2014 phase 2 report recommended 11 candidate SDH data domains, selected on the basis of (1) association with health; (2) “actionability” when treating patients and developing interventions; (3) availability and standardization of reliable, valid measures; (4) the feasibility of collecting and general accessibility of data; and (5) sensitivity, such as patient comfort with disclosing information (Committee on the Recommended Social and Behavioral Domains and Measures for Electronic Health Records, Board on Population Health and Public Health Practice, Institute of Medicine, 2014, 2015). The Medicare Access & CHIP Reauthorization Act of 2015 (Centers for Medicare & Medicaid Services, 2016), the Centers for Medicare & Medicaid Services' 2016 Quality Strategy (Centers for Disease Control and Prevention, 2016), and the Office of the National Coordinator for Health Information Technology (Health and Human Services Department, 2016) also emphasize the need to standardize recording SDH data in EHRs. Most recently, the Centers for Medicare & Medicaid Services has emphasized screening on 5 SDH domains that can be addressed through community services—housing instability, food insecurity, transportation difficulties, utility assistance needs, and interpersonal safety (Billioux et al., 2017).
Despite this growing national attention, few ambulatory care settings have developed or reported on systematic SDH screening approaches (Chung et al., 2016); thus, lacking standardized workflows/screening tools, existing efforts to assess patients' SDH have typically been ad hoc (Adler & Stead, 2015). Past efforts to bring diverse patient-reported measures (PRMs) into primary care settings (Bryan et al., 2014; Spertus, 2014) faced multiple challenges: the logistical burden of collecting and using these data; inability to bill for time used to interpret PRM data; the need to tailor PRMs to meet clinic priorities; the difficulty of taking action on PRM data with available resources; and lack of clarity as to which PRMs matter most to primary care teams and/or patients (Boyce et al., 2014; Hostetter, 2012; Ivanova et al., 2011; Nelson et al., 2015; Ridgeway et al., 2013). It is very likely that efforts to develop SDH screening tools will encounter similar barriers. This article describes the processes used by 6 early developers of SDH screening tools, and the barriers that these diverse ambulatory care organizations faced, to guide others conducting similar efforts.
- Domain: a specific category of SDH (eg, food or housing)
- Item: an SDH question within an SDH domain
- Measure: a collection of items used to assess a single SDH
- Screening tool: a collection of items and/or measures used as a group
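The terminology above implies a simple containment hierarchy: domains contain items, measures group items, and a screening tool bundles measures and/or standalone items. A minimal sketch of that hierarchy (the class names, example domains, and item wording here are illustrative assumptions, not any organization's actual schema):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    """A single SDH question within an SDH domain."""
    domain: str  # eg, "food" or "housing"
    text: str

@dataclass
class Measure:
    """A collection of items used to assess a single SDH."""
    name: str
    items: List[Item]

@dataclass
class ScreeningTool:
    """A collection of items and/or measures used as a group."""
    name: str
    measures: List[Measure] = field(default_factory=list)
    standalone_items: List[Item] = field(default_factory=list)

    def domains(self) -> set:
        """Return all SDH domains the tool covers."""
        covered = {i.domain for i in self.standalone_items}
        for m in self.measures:
            covered |= {i.domain for i in m.items}
        return covered

# Hypothetical two-domain tool: one measure plus one standalone item
tool = ScreeningTool(
    name="Example SDH screener",
    measures=[Measure("food insecurity", [
        Item("food", "In the past 12 months, did you worry food would run out?"),
    ])],
    standalone_items=[Item("housing", "Are you worried about losing your housing?")],
)
print(sorted(tool.domains()))  # ['food', 'housing']
```

Structuring a tool this way makes the distinctions in the glossary concrete: a tool can mix validated multi-item measures with single items, while still reporting which domains it covers.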
On the basis of team members' knowledge of the field, we identified 6 organizations that had developed tools for ambulatory care–based SDH screening (Table 1). In summer 2016, representatives of these organizations were asked to take part in semistructured, 45- to 60-minute, audio-recorded interviews via phone or in person. This format ensured comparable data across sites while allowing for emergent ideas. The interview guide focused on the following: the purpose for SDH screening in ambulatory primary care; their development processes; and how their tool/strategies were used. One organization (case study 5) chose to draft its own case study based on themes within the interview guide, which was then edited through an iterative process between the lead authors and the author of case study 5. We reviewed each organization's screening tool and supplementary materials. Kaiser Permanente's (KP's) Center for Health Research institutional review board approved this study.
Data management and analysis
The lead author (K.L.) repeatedly listened to all recordings and transcribed passages related to SDH tool development. Because of financial constraints, only 2 of 5 transcripts were fully professionally transcribed. K.L. then reviewed the transcripts and wrote detailed chronological accounts of interviewees' processes of developing and refining SDH screening tools. The team communicated with the interviewees to refine and confirm understanding and then used these texts to identify interviewees' common experiences, processes, and barriers. When a particular element was found in an interview (eg, workflow issues), we examined all other cases for instances of that element. This was first done by K.L. and then discussed and refined with other authors. This synthesis was descriptive and illustrated commonalities of experience; see Tables 2–4. Each interviewee organization approved the final manuscript.
Case study 1: HealthBegins
HealthBegins developed an SDH screening tool in 2011, as colleagues sought guidance on how to collect SDH data.
HealthBegins reviewed the SDH screening literature, compiled an inventory of existing SDH items, and then from this inventory chose domains/items to include in its screening tool, based on collection feasibility and salience to ambulatory primary care. To avoid redundancy, it excluded domains already frequently collected in primary care, such as race/ethnicity (Table 3). It also drew on team members' clinical experience to identify potentially important SDH domains for which no items existed.
HealthBegins initially envisioned creating a Web-based “bank” of validated SDH items. However, as its members felt this interface would be complex to implement and maintain, HealthBegins decided to create an SDH screening tool as a static PDF. This could be used just for SDH screening, but HealthBegins hoped it would also spark discussions among clinicians about incorporating SDH data into patient care. The PDF was developed before the IOM's recommendations were released and then updated in January 2015 to incorporate them; it was the first available SDH risk screening tool to incorporate these recommendations.
HealthBegins' SDH screening tool (Manchanda & Gottlieb, 2015) is now available online as a free PDF. It includes suggestions on screening frequency, scoring instructions, and a “Referral Plan Complete?” checkbox for each domain. HealthBegins' “Get Ready, Get Set, Go Upstream” operational framework provides technical assistance (eg, readiness assessment and other evaluative tools) for clinics seeking to use the tool.
By early 2017, more than 1500 health care professionals representing more than 1000 health systems had downloaded the HealthBegins tool.
Case study 2: WellRx Pilot at University of New Mexico
The Office for Community Health (OCH) at the University of New Mexico (UNM) in Albuquerque had a long-standing interest in integrating SDH into ambulatory care. In 2012, the OCH received a Blue Cross/Blue Shield Community Grant to work with stakeholders (community health workers [CHWs], nurses, clinic directors, physician assistants, and doctors) from 3 family medicine primary care clinics to develop an SDH screening tool.
The OCH team reviewed existing SDH screening tools, prioritizing ease of use and appropriate literacy level. No single tool met these criteria and also captured the SDH domains it sought—existing tools were too lengthy, narrowly focused, or population-specific. Thus, to create its own screening tool, the team developed a preliminary SDH domain list based on stakeholder input and their experience as physicians/researchers. Through an iterative process, it selected 11 SDH domains to include and then wrote 1 item per domain to create the “WellRx Questionnaire.” The team emphasized straightforward items and sought to keep the tool at a third-grade reading level, although some domains were too complex for this. All items had 2 possible responses, “Yes” or “No,” aligned so that unmet needs could be easily identified at a glance. The screening tool was translated into Spanish. It was originally available to UNM clinics in paper format, with paper resource lists for each domain developed and kept in each clinic. In December 2016, the tool was made available in the UNM Hospital system's EHR.
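The WellRx design described above—1 Yes/No item per domain, with any “Yes” flagging an unmet need—lends itself to very simple scoring logic. A hypothetical sketch (the domain names and the fixed list below are illustrative assumptions, not the actual WellRx domains or item wording):

```python
# Illustrative stand-ins for a WellRx-style 11-domain, one-item-per-domain
# questionnaire; any "Yes" response flags an unmet need in that domain.
WELLRX_STYLE_DOMAINS = [
    "food", "housing", "utilities", "transportation", "income",
    "employment", "education", "childcare", "safety", "abuse", "companionship",
]

def unmet_needs(responses: dict) -> list:
    """Return the domains a patient answered 'Yes' to, in questionnaire order."""
    return [d for d in WELLRX_STYLE_DOMAINS if responses.get(d) == "Yes"]

# Example patient: all "No" except food and transportation
responses = {d: "No" for d in WELLRX_STYLE_DOMAINS}
responses["food"] = "Yes"
responses["transportation"] = "Yes"
print(unmet_needs(responses))  # ['food', 'transportation']
```

The binary response format is what made the pilot's headline statistic easy to compute: a patient counts as having an unmet SDH need whenever `unmet_needs` returns a non-empty list.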
The tool was piloted in 3 Albuquerque clinics (Page-Reeves et al., 2016); each chose how it was implemented. For some, a receptionist handed the patient the paper screening tool to complete and placed it into the visit folder that was given to the provider just before the visit or directly to a CHW. If the latter, the CHW entered the information into an EHR free text box to document SDH needs identified and resources provided. In other clinics, the survey was administered verbally by the medical assistant (MA) during rooming. If the patient had an SDH need, the MA entered a diagnosis, “Inadequate Community Resources,” into the EHR. The provider saw this, reviewed the paper survey results, and then asked the patient whether assistance was desired. If yes, the provider sent the CHW an EHR message; in clinics with no CHW, the provider handed the patient an information sheet with relevant resources.
In its 90-day pilot, the WellRx tool was completed by 3048 patients; 2038 were collected by MAs and 1110 by self-administration; 46% of screened patients identified 1 or more unmet SDH needs (Page-Reeves et al., 2016). Since then, UNM mandated using the tool to screen all patients at its 9 primary care clinics. Resource lists are now produced and updated by SHARE New Mexico, based on its statewide Resource Directory. SHARE built a portal through which clinic staff can access this directory and share resource information with patients/other clinic staff.
Case study 3: Mosaic Medical
Mosaic had a “warm handoff” in place whereby providers referred patients with nonmedical needs to CHWs at a clinic visit. However, Mosaic felt that SDH screening prior to visits would yield better assessments. It wanted a standardized tool for screening, tracking progress in remediating patients' SDH, identifying gaps in needed services, and enhancing outreach.
Mosaic's tool was built by its “Population Specialist” who surveyed existing tools and concluded that they were all too long and/or at an inappropriately high literacy level. Thus, the specialist drafted an SDH screening tool including domains selected on the basis of the specialist's knowledge of his or her patient population. The draft tool included 19 items covering 16 domains. Mosaic's nurse care coordinators, behavioral health specialists, and CHWs were then asked whether important domains were missing. Per their feedback, the final version included 43 items covering 14 domains. It was made available in a paper format and as a data entry “flow sheet” within the EHR. Mosaic's Population Specialist also created a workflow for how CHWs should introduce the screening tool to patients and enter data into the EHR.
The tool was piloted by screening all new patients in 1 Mosaic clinic. Before any clinic visit, these patients met with a “Patient Navigator” who provided information about the Mosaic model of care and services per standard intake processes. The SDH tool was then administered verbally by a CHW or completed by the patient on paper, after which the CHW entered the data into the EHR, along with the “Smart Code” for a positive SDH screening and other relevant information, including free text notes. Mosaic used this process to identify SDH needs but not to track referrals.
Mosaic discontinued use of its tool after 2 years in favor of OCHIN's SDH screening tool, described in case study 6. As an OCHIN member, Mosaic provided input toward that tool's development.
Case study 4: Care Management Institute at Kaiser Permanente
KP developed its SDH screening tool because SDH information was considered important for assessing patient/population health; Meaningful Use phase 3 attestation was expected to involve SDH reporting; and several KP programs hoped to use SDH data. The effort was led by KP's Care Management Institute's (CMI's) Center for Population Health. An inter-regional Advisory Group was established to guide these efforts.
The CMI started by reviewing the IOM-recommended SDH domains/items and other available tools. It then sought content advice from stakeholders within KP, public health researchers, and community-based organizations. Its first screening tool included most of the IOM-recommended domains, with slightly altered items. On the basis of stakeholder feedback, the CMI developed a more comprehensive, 32-item tool, “Your Current Life Situation” (YCLS). This was made available to all KP regions for pilot testing and could be administered via paper form, scripted telephone interview, or online through the patient portal. Responses entered via the patient portal went directly into the EHR (HealthConnect). A parallel “ambulatory” screening tool was created to let clinic staff enter responses collected during phone interviews or via paper. The entered responses highlighted “positive” screening results requiring follow-up and were immediately available in the EHR.
KP leaders and staff were generally enthusiastic about the EHR-integrated YCLS tool but concerned about how long it would take to administer. Thus, the CMI developed a short YCLS (Tables 3 and 4) and a supplemental “item bank” of additional items with further information. The item bank includes curated items organized by domain; to create it, the CMI surveyed KP leaders and program staff about prioritized domains. The shorter YCLS can be completed using the same mechanisms as the longer version, described earlier. The tool and the supplemental item bank were translated into several languages, although the online version is currently available only in English. The CMI also assists KP programs seeking to adopt the YCLS.
KP's SDH screening is intended to be part of integrated total health assessment, customized to ask patients only what is relevant to their circumstances. As more programs use the YCLS, the CMI continues to seek feedback and to help regional programs develop and share best practices for SDH follow-up/referrals, coding/other tracking, and integrating SDH data into care.
Case study 5: National Association of Community Health Centers
In September 2013, the National Association of Community Health Centers, the Association of Asian Pacific Community Health Organizations, the Oregon Primary Care Association, and the Institute for Alternative Futures launched a national effort to develop a standardized SDH screening tool for community health centers (CHCs). This coalition created, piloted, and disseminated the “Protocol for Responding to and Assessing Patient Assets, Risks, and Experiences” (PRAPARE) (National Association of Community Health Centers, 2016) with support from the Kresge Foundation, Kaiser Permanente East Bay Community Benefit Foundation, and Blue Shield of California Foundation. PRAPARE was developed to help CHCs document and address patient SDH-related risks; enhance team members' knowledge of patients' circumstances and use that knowledge to better integrate social service interventions and resources into care; identify gaps in the availability of community “upstream” resources; and generate data that could help CHCs demonstrate the value they bring to patients, communities, and payers.
The PRAPARE coalition first reviewed 50 existing SDH screening tools, few of which had been incorporated into EHRs or validated. SDH domains included in the PRAPARE tool were selected on the basis of evidence that they predicted health outcomes and costs; national SDH initiatives (eg, Healthy People 2020) (Robert Wood Johnson Foundation, 2016); IOM recommendations; and stakeholder input. Specific items for each domain were selected on the basis of actionability, data collection burden, and other criteria.
To facilitate SDH data capture and reporting, project leaders then worked with 4 pilot teams representing 7 CHCs to build and test the PRAPARE tool in commonly used EHRs: NextGen, eClinical Works, GE Centricity, and Epic. An OCHIN (case study 6) CHC in Oregon tested the Epic version. SDH data already collected by CHCs can be auto-populated into these tools. Response options can be made more granular or used in combination with other risk screening tools. Where available, International Classification of Diseases, Tenth Revision, social risk codes are included.
A PRAPARE Implementation and Action Toolkit was released in September 2016 to guide implementation; it includes the 4 EHR tools, technical resources, best practices, multiple tested workflows, and other resources. A paper-based version of the PRAPARE screening tool is also available.
By January 2017, more than 260 organizations/multiorganization coalitions had downloaded a PRAPARE tool. Other health centers have built PRAPARE into IT-enabled solutions beyond these 4 EHRs. PRAPARE tools are now being built for other EHRs, and tablets and patient portals for collecting PRAPARE data are being tested.
Case study 6: OCHIN
The Clinical Operations Review Committee (CORC), OCHIN's member-led clinician advisory group, wanted to expand on PRAPARE by integrating the IOM-recommended domains and creating a suite of EHR-based tools. Its tool development was conducted in conjunction with a National Institutes of Health–funded pilot study assessing how to integrate SDH screening into primary care workflows using EHRs (R18DK105463).
The development team (researchers working with CORC clinicians) first reviewed the PRAPARE tool and IOM-recommended SDH domains. They chose to include all IOM-recommended domains plus others targeted to CHC patients (either from PRAPARE or recommended by the CORC). OCHIN's tool is considered the official Epic version of PRAPARE, referred to as “PRAPARE-plus” because of the inclusion of additional domains beyond the original PRAPARE questionnaire.
Building on PRAPARE, the team sought to identify optimal strategies for using EHR tools for collecting and presenting SDH data, using an iterative process described elsewhere (Gold et al., 2017). To enable SDH screening in diverse workflows, the tools included a data collection “flow sheet” accessible at check-in, rooming, or postvisit; a screening tool in the patient portal that could be sent to patients before visits; and a paper version of the screening tool that could be completed by patients at the clinic and then entered into the flow sheet by clinic staff. An SDH data summary tool was built to bring in data from these routes/other places in the EHR where relevant data are collected (eg, race/ethnicity is pulled from Vitals).
For 3 pilot CHCs, another tool was developed to facilitate referrals to local social services for several SDH domains. CHC clinicians, patient advocates, and social workers were asked to identify high-priority SDH domains, as well as community agencies they had worked with before. This information was used to populate “preference lists” of local resources in the EHR. When a referral is made from these lists, the resource contact information and instructions are added to the after-visit summary.
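The preference-list referral flow described above—curated lists of local agencies per SDH domain, with contact information and instructions copied into the after-visit summary when a referral is made—can be sketched as follows (the agency names, phone numbers, and field names are invented placeholders, not OCHIN's actual schema or resources):

```python
# Hypothetical per-domain "preference lists" of local community resources,
# mirroring the EHR structure described in the text.
PREFERENCE_LISTS = {
    "food": [{"agency": "Local Food Bank", "phone": "555-0100",
              "instructions": "Walk-ins welcome weekdays 9-4."}],
    "housing": [{"agency": "Housing Assistance Office", "phone": "555-0101",
                 "instructions": "Call to schedule an intake appointment."}],
}

def refer(domain: str, agency: str, after_visit_summary: list) -> None:
    """Append the chosen resource's contact info and instructions to the AVS."""
    for resource in PREFERENCE_LISTS.get(domain, []):
        if resource["agency"] == agency:
            after_visit_summary.append(
                f"{resource['agency']} ({resource['phone']}): {resource['instructions']}"
            )
            return
    raise ValueError(f"No '{agency}' in the {domain} preference list")

avs = []  # the patient's after-visit summary, built up during the encounter
refer("food", "Local Food Bank", avs)
print(avs[0])  # Local Food Bank (555-0100): Walk-ins welcome weekdays 9-4.
```

Keeping the lists per-domain matches the workflow in the text: staff identify a high-priority domain first, then pick a known agency, and the patient leaves with actionable contact details rather than a bare referral code.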
In June 2016, the SDH data collection/summary tools were made available in the EHR shared by all OCHIN member clinics (97 sites in 18 states). Preliminary data show variation in screening adoption and workflows. The SDH preference lists are now being pilot-tested; next steps are to assess uptake of the tools and identify barriers to their use, needed adaptations, and how to optimally integrate SDH information into patient care.
Common experiences and challenges
The organizations all started by reviewing existing tools, found that none met their needs (eg, tools were inappropriate for a given organization's structure, preferences, or patients), and sought to develop their own by writing their own items or selecting specific items/domains to include. When development processes occurred after the IOM report became available, many of its recommended domains were included. However, customization was the norm. For example, where the IOM included “Financial Resource Strain” as a single domain, the interviewees often included more precise domains (Table 4). Most interviewees wanted SDH tools that fit into existing workflows and avoided redundant data collection, so they usually excluded from their screening tools the IOM-recommended domains on alcohol use, stress, depression, and physical activity (which are often already captured).
Another common concern was how to administer SDH screening in ambulatory primary care workflows. As shown in Table 2, the organizations we interviewed produced tools using differing degrees of technology: all were accessible via paper, most in an EHR, and half in patient-facing portals. This demonstrates the variability of primary health care settings' approaches to integrating SDH data tools into their health information technology structures. Each approach has trade-offs: developing and updating EHR-based tools requires resources and infrastructure support, whereas SDH data collected on paper must be manually entered into the EHR, consuming staff time. Most interviewees used the combination of modalities that worked best for their setting. This need for flexibility was universal.
Interviewees commonly voiced concerns about item sensitivity, yet they encountered little patient discomfort with SDH screening. KP was concerned about the impact of collecting potentially sensitive data via phone. Mosaic was concerned about the sensitivity of intimate partner violence and substance use items and omitted these. OCHIN debated which intimate partner violence item to include and finally chose a more general item, rather than the IOM-recommended 4-item measure.
All interviewees were concerned about care teams being unable to address positive SDH screenings because of limited staff time, lack of local resources, etc. Mosaic and PRAPARE developers believed that even if they could not address every SDH need, SDH data could help identify unmet community needs, thus supporting advocacy. Although the WellRx team had concerns that SDH screening would be burdensome, its physicians reported that patients received more holistic care, which lessened workloads and improved care quality. Another concern was how best to communicate with local agencies, track outcomes of past referrals, and—if resource lists were created—keep them updated.
The processes used by these interviewees to develop SDH tools varied depending on organizational perspectives, needs, and goals. Their efforts highlight considerations that may help other organizations develop their own SDH screening plans.
The IOM-recommended SDH domains were often included in the screening tools. These recommendations, along with those recently released by the Centers for Medicare & Medicaid Services, could facilitate standardization of SDH screening, providing data on how SDH impact health and how clinic-based interventions mitigate these impacts. However, there is a clear need to customize SDH screening tools to specific settings/populations. Thus, development of SDH tools must consider how data might be comparable across populations, yet customized to local needs. Local resource availability should also be considered. Some organizations were reluctant to ask patients about needs that they/their community could not address. Clinics hoping to refer patients to community resources to address SDH needs will require accurate lists of available resources, and maintenance of these lists likely requires designated staff time and standardized processes for integrating community resource information.
Despite concerns about patient willingness to share SDH information, our interviewees' actual experience aligns with prior research showing low refusal rates (Giuse et al., 2017; Page-Reeves et al., 2016; Pinto et al., 2016). Interviewees highlighted the importance of “messaging” SDH screening to patients in a way that builds trust. As SDH screening becomes more widespread, it will be important to maintain awareness of how different patient populations respond.
Diverse methods were used to integrate SDH screening into clinic workflows while minimizing burden to care teams. There is a need to collect SDH data expeditiously, without harming the patient's trust. Research is needed to determine which screening modalities best address this (Garg et al., 2005, 2007, 2015) and the feasibility of leveraging technology (eg, tablets, smartphones) to collect SDH data (Tung & Peek, 2015).
EHR-based approaches to SDH screening require organizational capacity to build and maintain EHR tools. Such approaches are likely to increase, as EHRs are now in common use (The Office of the National Coordinator for Health Information Technology, 2015) and national initiatives emphasize EHR-based SDH data. Institutional support for integration of SDH into primary care and EHRs, in addition to recommended domains and items, could help standardize and focus SDH data collection. However, as indicated in Table 3, IOM recommendations were followed loosely, and all organizations interviewed added items to suit their patient populations. This suggests an unresolved tension between tailoring and standardizing data collection in diverse settings.
Our interviewees are not necessarily representative of all current SDH screening development processes, and all interviewed organizations were based in the United States, so the processes outlined reflect both researchers and patient populations within the United States. However, much can be learned from these organizations' pioneering efforts, and we encourage others conducting similar activities to publish on their efforts. A further limitation is that we primarily interviewed individuals who developed and refined screening tools, rather than those who used the tools (with one exception). We sought to provide insight into successes and barriers in tool development; tool implementation is beyond the scope of this article but deserves similar description.
Despite the importance of SDH data, and recent policy emphasis on health care providers collecting and acting on SDH data, the adoption of EHR-based SDH data collection has been gradual. Adoption of SDH data collection in primary care may be impacted by barriers similar to those that have slowed the uptake of other types of PRMs, as described earlier. This article summarizes how 6 diverse health care organizations sought to address similar challenges as they developed SDH screening tools to guide others hoping to design and implement such tools. More research is needed to assess how to implement these approaches in diverse care settings and how to use SDH data once collected.
References
Adler N. E., Stead W. W. (2015). Patients in context—EHR capture of social and behavioral determinants of health. The New England Journal of Medicine, 372(8), 698–701. doi:10.1056/NEJMp1413945
Berkowitz S. A., Hulberg A. C., Hong C., Stowell B. J., Tirozzi K. J., Traore C. Y., Atlas S. J. (2016). Addressing basic resource needs to improve primary care quality: A community collaboration programme. BMJ Quality & Safety, 25(3), 164–172. doi:10.1136/bmjqs-2015-004521
Billioux A. V., Verlander K., Anthony S., Alley D. (2017). Standardized screening for health-related social needs in clinical settings: The Accountable Health Communities screening tool. Retrieved from https://nam.edu/wp-content/uploads/2017/05/Standardized-Screening
Boyce M. B., Browne J. P., Greenhalgh J. (2014). The experiences of professionals with using information from patient-reported outcome measures to improve the quality of healthcare: A systematic review of qualitative research. BMJ Quality & Safety, 23(6), 508–518. doi:10.1136/bmjqs-2013-002524
Bryan S., Davis J., Broesch J., Doyle-Waters M. M., Lewis S., McGrail K., Sawatzky R. (2014). Choosing your partner for the PROM: A review of evidence on patient-reported outcome measures for use in primary and community care. Healthcare Policy, 10(2), 38–51.
Centers for Disease Control and Prevention. (2016). CMS timeline of important MU dates. Meaningful Use. Retrieved February 28, 2017, from https://www.cdc.gov
Centers for Medicare & Medicaid Services. (2016). MACRA: Delivery system reform, Medicare payment reform. Retrieved September 9, 2016, from https://www.cms.gov
Chung E. K., Siegel B. S., Garg A., Conroy K., Gross R. S., Long D. A., Fierman A. H. (2016). Screening for social determinants of health among children and families living in poverty: A guide for clinicians. Current Problems in Pediatric and Adolescent Health Care, 46(5), 135–153. doi:10.1016/j.cppeds.2016.02.004
Commission on Social Determinants of Health. (2008). Closing the gap in a generation: Health equity through action on the social determinants of health: Commission on Social Determinants of Health final report. Geneva, Switzerland: World Health Organization.
Committee on the Recommended Social and Behavioral Domains and Measures for Electronic Health Records, Board on Population Health and Public Health Practice, Institute of Medicine. (2014). Capturing social and behavioral domains in electronic health records: Phase 1. Washington, DC: National Academies Press.
Committee on the Recommended Social and Behavioral Domains and Measures for Electronic Health Records, Board on Population Health and Public Health Practice, Institute of Medicine. (2015). Capturing social and behavioral domains and measures in electronic health records: Phase 2. Washington, DC: National Academies Press.
Garg A., Butz A. M., Dworkin P. H., Lewis R. A., Thompson R. E., Serwint J. R. (2007). Improving the management of family psychosocial problems at low-income children's well-child care visits: The WE CARE Project. Pediatrics, 120(3), 547–558. doi:10.1542/peds.2007-0398
Garg A., Jack B., Zuckerman B. (2013). Addressing the social determinants of health within the patient-centered medical home: Lessons from pediatrics. JAMA, 309(19), 2001–2002.
Garg A., Toy S., Tripodis Y., Silverstein M., Freeman E. (2015). Addressing social determinants of health at well child care visits: A cluster RCT. Pediatrics, 135(2), e296–e304.
Garg A. X., Adhikari N. K., McDonald H., Rosas-Arellano M. P., Devereaux P. J., Beyene J., Haynes R. B. (2005). Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: A systematic review. JAMA, 293(10), 1223–1238.
Giuse N. B., Koonce T. Y., Kusnoor S. V., Prather A. A., Gottlieb L. M., Huang L. C., Stead W. W. (2017). Institute of Medicine measures of social and behavioral determinants of health: A feasibility study. American Journal of Preventive Medicine, 52(2), 199–206. doi:10.1016/j.amepre.2016.07.033
Gold R., Cottrell E., Bunce A., Middendorf M., Hollombe C., Cowburn S., Melgar G. (2017). Developing electronic health record (EHR) strategies related to health center patients' social determinants of health. Journal of the American Board of Family Medicine, 30(4), 428–447. doi:10.3122/jabfm.2017.04.170046
Gottlieb L., Sandel M., Adler N. E. (2013). Collecting and applying data on social determinants of health in health care settings. JAMA Internal Medicine, 173(11), 1017–1020.
Gottlieb L. M., Wing H., Adler N. E. (2017). A systematic review of interventions on patients' social and economic needs. American Journal of Preventive Medicine. Advance online publication. doi:10.1016/j.amepre.2017.05.011
Health and Human Services Department. (2016). 2015 Edition Health Information Technology (Health IT) certification criteria, 2015 edition base electronic health record (EHR) definition, and ONC Health IT certification program modifications. Retrieved November 17, 2016, from https://www.federalregister.gov
Hostetter M. K. S. (2012). Using patient-reported outcomes to improve health care quality. New York, NY: The Commonwealth Fund.
Institute of Medicine. (2014). Recommended social and behavioral domains and measures for electronic health records. Retrieved from http://www.iom.edu
Ivanova J. I., Birnbaum H. G., Schiller M., Kantor E., Johnstone B. M., Swindle R. W. (2011). Real-world practice patterns, health-care utilization, and costs in patients with low back pain: The long road to guideline-concordant care. Spine Journal, 11(7), 622–632.
Manchanda R., Gottlieb L. (2015). Upstream risks screening tool & guide V2.6. Retrieved February 27, 2017, from https://www.aamc.org
National Association of Community Health Centers. (2016). PRAPARE. Retrieved February 23, 2017, from http://www.nachc.org
Nelson E. C., Eftimovska E., Lind C., Hager A., Wasson J. H., Lindblad S. (2015). Patient reported outcome measures in practice. BMJ, 350, g7818.
Page-Reeves J., Kaufman W., Bleecker M., Norris J., McCalmont K., Ianakieva V., Kaufman A. (2016). Addressing social determinants of health in a clinic setting: The WellRx Pilot in Albuquerque, New Mexico. Journal of the American Board of Family Medicine, 29(3), 414–418. doi:10.3122/jabfm.2016.03.150272
Pinto A. D., Glattstein-Young G., Mohamed A., Bloch G., Leung F. H., Glazier R. H. (2016). Building a foundation to reduce health inequities: Routine collection of sociodemographic data in primary care. Journal of the American Board of Family Medicine, 29(3), 348–355. doi:10.3122/jabfm.2016.03.150280
Ridgeway J. L., Beebe T. J., Chute C. G., Eton D. T., Hart L. A., Frost M. H., Sloan J. A. (2013). A brief Patient-Reported Outcomes Quality of Life (PROQOL) instrument to improve patient care. PLoS Medicine, 10(11), e1001548. doi:10.1371/journal.pmed.1001548
Robert Wood Johnson Foundation. (2016). County Health Rankings & Roadmaps. Retrieved February 24, 2017, from http://www.countyhealthrankings.org
Spertus J. (2014). Barriers to the use of patient-reported outcomes in clinical care. Circulation: Cardiovascular Quality and Outcomes, 7(1), 2–4. doi:10.1161/circoutcomes.113.000829
The Office of the National Coordinator for Health Information Technology. (2015). Office-based physician electronic health record adoption. Retrieved February 25, 2017, from https://dashboard.healthit.gov/quickstats/pages/physician-ehr-adoption-trends.php
Tung E. L., Peek M. E. (2015). Linking community resources in diabetes care: A role for technology? Current Diabetes Reports, 15(7), 45. doi:10.1007/s11892-015-0614-5