Medical Care: September 2012 - Volume 50
doi: 10.1097/MLR.0b013e3182652482
Health Literacy

Development and Evaluation of CAHPS® Survey Items Assessing How Well Healthcare Providers Address Health Literacy

Weidmer, Beverly A. MA*; Brach, Cindy MA†; Hays, Ron D. PhD‡


Author Information

*RAND Corporation, Santa Monica, CA

†Center for Delivery, Organization, and Markets, Agency for Healthcare Research and Quality, Rockville, MD

‡UCLA Department of Medicine, Division of General Internal Medicine & Health Services Research, Los Angeles, CA

Supported by a contract from the Agency for Healthcare Research and Quality (HHSP233200600332P). R.D.H. was also supported in part by grants from AHRQ (U18 HS016980), NIA (P30AG021684), and the NIMHD (2P20MD000182).

The authors declare no conflict of interest.

Reprints: Beverly A. Weidmer, MA, RAND Corporation, 1776 Main Street, P.O. Box 2138, Santa Monica, CA 90407-2138. E-mail: beverly_weidmer@rand.org.


Abstract

Background: The complexity of health information often exceeds patients’ skills to understand and use it.

Objective: To develop survey items assessing how well healthcare providers communicate health information.

Methods: Domains and items for the Consumer Assessment of Healthcare Providers and Systems (CAHPS)® Item Set for Addressing Health Literacy were identified through an environmental scan and input from stakeholders. The draft item set was translated into Spanish and pretested in both English and Spanish. The revised item set was field tested with a randomly selected sample of adult patients from 2 sites using mail and telephone data collection. Item-scale correlations, confirmatory factor analyses, and internal consistency reliability estimates were used to assess how well the survey items performed and to identify composite measures. Finally, we regressed the CAHPS global rating of the provider item on the CAHPS core communication composite and the new health literacy composites.

Results: A total of 601 completed surveys were obtained (52% response rate). Two composite measures were identified: (1) Communication to Improve Health Literacy (16 items); and (2) How Well Providers Communicate About Medicines (6 items). Each composite was uniquely and significantly associated with the global rating of the provider (Communication to Improve Health Literacy: b=0.28, P<0.001; Communication About Medicines: b=0.04, P=0.02). The 2 composites and the CAHPS core communication composite accounted for 51% of the variance in the global rating of the provider. A 5-item subset of the Communication to Improve Health Literacy composite accounted for 90% of the variance of the original 16-item composite.

Conclusions: This study provides support for the reliability and validity of the CAHPS Item Set for Addressing Health Literacy. These items can serve as a measure of whether healthcare providers have communicated effectively with their patients and as a tool for quality improvement.

Health literacy is the capacity to obtain, communicate, process, and understand basic health information and services to make health decisions.1 A nationally representative assessment of English literacy among American adults aged 16 and older estimated that only 12% of US adults have proficient health literacy.2 Over a third of US adults (77 million) have difficulty with common health tasks, such as following directions on a prescription drug label or adhering to a childhood immunization schedule using a standard chart.3

The complexity of health information and the communication skills of healthcare providers affect patient understanding.4,5 Reducing health literacy demands has recently emerged as a national health priority,6,7 and provider-patient communication objectives are included in the national health promotion and disease prevention program Healthy People 2020.8 Although health literacy has long been recognized as an important healthcare issue,9–12 attention has now turned to healthcare providers’ role in improving patient understanding.

The Consumer Assessment of Healthcare Providers and Systems (CAHPS)® Clinician and Group 12-month survey includes a question that asks, “How often did your provider explain things in a way that was easy to understand?” Responses to this item, however, do not indicate which aspects of communication are problematic or how providers and their practices can improve the quality of their communications. The CAHPS Item Set for Addressing Health Literacy was developed both as a measure of whether healthcare providers have succeeded in reducing the health literacy demands they place on patients and as a tool for quality improvement.


METHODS

Item Development

We followed the standard CAHPS approach for developing surveys (Fig. 1). We conducted an environmental scan to identify domains of interest and relevant survey items. A Call for Measures was issued in the Federal Register to obtain additional measures, but few responses were submitted and no additional measures were obtained. We interviewed health literacy experts and held stakeholder meetings—including representatives from government agencies, healthcare providers, health literacy experts and advocates, and consumers—to prioritize the domains the item set should cover and learn about other potential sources of survey items. We mapped the survey items that we collected to the domains, and modified or adapted measures in the public domain to make them “CAHPS-like.” We drafted new survey items for the domains for which we were unable to identify appropriate existing items.

Figure 1

Translation

We translated the health literacy items into Spanish using the CAHPS guidelines for translation13 to produce a Spanish version that was conceptually equivalent to the English, easy to understand, and understood by Spanish speakers from different countries. The CAHPS approach involves 2 forward translations followed by a bilingual committee review to resolve translation issues by consensus. The translators and bilingual reviewers who participated in producing the Spanish version were selected using the CAHPS guidelines for selecting translators and reviewers.14 In recent years, this translation-by-committee approach has been shown to yield superior translations and has come to be regarded as the recommended approach.15–17

Cognitive Interviews

Cognitive interviewing is a technique used to evaluate how survey respondents understand, mentally process, and respond to survey items, and to use this information to modify and refine survey measures.18,19 This approach has been used routinely to evaluate CAHPS surveys in both English and Spanish.20–22 We conducted 2 rounds of cognitive interviews in both English and Spanish. We recruited interviewees who represented a mix of participants in terms of sex, race/ethnicity, insurance status, country of origin, and educational level, with the majority of respondents having <12 years of schooling.

Field Testing

We integrated the health literacy items into the CAHPS Clinician and Group 12-month survey, interspersing the health literacy items among the core survey items. Table 1 shows the 30 health literacy items that were field-tested.

Table 1

Two sites participated in the field test: a health plan located in New York City and an outpatient clinic based in an academic medical center in the southern United States. In recruiting field test partners, we targeted sites that would provide racial and ethnic diversity, sufficient numbers of Spanish speakers, and patients with limited health literacy skills. Both the health plan and the clinic serve a largely low-income, minority population and provide free or low-cost health coverage to patients who are on Medicaid or who have no insurance at all.

The sample frame for the field test included 1200 randomly selected adult patients who had at least 1 outpatient visit in the prior 12 months (600 per site). We used a combination of mail and phone survey administration, which involved a notification letter, 2 survey mailings, a reminder letter, and multiple follow-up phone calls to nonrespondents. To maximize response rates among Spanish speakers, Hispanic respondents were mailed materials in both English and Spanish. Respondents who completed a survey were mailed a thank you letter with a check for $10.

Composite Development

One of the goals of the field test was to assess whether reliable and valid composites could be constructed. A composite is composed of ≥2 survey items that are closely related conceptually and statistically. Composites are useful for both internal and public reporting of survey results because they summarize a large amount of data in a concise manner.23 Analyses were conducted to estimate the reliability and validity of the composites.24,25

As shown in Table 1, we placed health literacy items that were conceptually related into 6 groupings (multi-item composites): (1) patient/provider communication (10 items); (2) communication about health problems or concerns (2 items); (3) disease self-management (6 items); (4) communication about medicines (6 items); (5) communication about test results (2 items); and (6) communication about forms (3 items in English and 4 items in Spanish). We then ran tests of these 6 hypothesized composites to see whether the items in each “hung” together and could therefore be scored as a composite. The cutoff for inclusion of an item in a composite measure was an item-rest correlation of ≥0.30. We also examined the correlation of each item to the composite it was not hypothesized to represent to assess whether the composites represent statistically unique aspects of communication to improve health literacy. Through an iterative process, we revised the placement of items into composites, taking into account correlations between the items and item content, and reran the correlation analysis to assess the fit of each individual item into the composite it was hypothesized to represent. We also conducted categorical confirmatory factor analysis in Mplus26 to assess the fit of the composite structure. In addition, we identified a short subset of items from a final 16-item composite (Communication to Improve Health Literacy) by regressing the composite score on the items in it to identify the subset of items that accounted for most of the variance in this composite. We used a maximum R² forward selection procedure (MaxR option in SAS PROC REG) that tests the effects of switching different combinations of items on the total amount of variance explained. Our goal was to identify a subset of items that accounted for 90% of the variance in the 16-item composite score.
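
To make the item-rest correlation screen concrete, the sketch below shows one way it could be computed in Python under stated assumptions: responses sit in a pandas DataFrame with one column per item, and the composite assignments and column names are illustrative placeholders rather than the study’s actual data or code; the 0.30 cutoff follows the text.

```python
# A minimal sketch of the item-rest correlation screen, assuming responses
# live in a pandas DataFrame with one column per survey item. The composite
# assignments and column names below are illustrative placeholders, not the
# study's data or code; the 0.30 cutoff follows the text.
import pandas as pd


def item_rest_correlations(df: pd.DataFrame, composite_items: list[str]) -> pd.Series:
    """Correlate each item with the mean of the *other* items in its composite."""
    out = {}
    for item in composite_items:
        rest = df[[c for c in composite_items if c != item]].mean(axis=1)
        out[item] = df[item].corr(rest)  # pairwise-complete Pearson correlation
    return pd.Series(out)


def flag_weak_items(df: pd.DataFrame, composites: dict[str, list[str]],
                    cutoff: float = 0.30) -> dict[str, list[str]]:
    """List the items in each hypothesized composite whose item-rest correlation falls below the cutoff."""
    return {
        name: [i for i, r in item_rest_correlations(df, items).items() if r < cutoff]
        for name, items in composites.items()
    }


# Hypothetical usage:
#   composites = {"patient_provider_communication": ["q10", "q13", "q18"], ...}
#   flag_weak_items(survey_df, composites)  # -> candidates for reassignment or removal
```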

We also wanted to check that the health literacy composites had an effect on global ratings of the provider distinct from the effect of an existing communication composite composed of 5 items from the CAHPS Clinician and Group survey. We therefore regressed the global rating of provider item (a 0–10 rating, where 0=worst possible provider and 10=best possible provider) on the new composites and the CAHPS core communication composite.
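
The regression check described above could be sketched as follows; this is not the study’s code, and the column names (global_rating, hl_communication, med_communication, core_communication) are hypothetical placeholders for per-respondent composite scores already computed on a 0–100 scale.

```python
# A sketch of the validity check described above: regress the 0-10 global rating
# on the new health literacy composites plus the CAHPS core communication
# composite. Not the study's code; the column names are hypothetical placeholders.
import statsmodels.api as sm


def regress_global_rating(df, rating_col="global_rating",
                          composite_cols=("hl_communication", "med_communication",
                                          "core_communication")):
    X = sm.add_constant(df[list(composite_cols)])
    fit = sm.OLS(df[rating_col], X, missing="drop").fit()
    # Unique association of each composite (coefficient and p-value) and the
    # overall explanatory power (adjusted R^2), mirroring the quantities reported.
    return fit.params, fit.pvalues, fit.rsquared_adj
```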


RESULTS

Field Test Response

We obtained 601 completed surveys for a response rate of 52%. Overall, more surveys were completed in English (79%) than Spanish (21%) and more surveys were completed by mail (65%) than by phone (35%). Similar response rates were obtained from each field test site (51% and 53%).

Table 2 shows the demographic characteristics of survey respondents. Blacks were the largest racial group (45%, as compared with 21% each for whites and other). Hispanics constituted 39% of respondents. Most respondents were female (80%), and 61% of the sample was 45 years or older. A significant proportion of respondents (36%) had less than a high school education, 24% had graduated from high school or obtained a General Educational Development (GED) credential, 26% had some college, and 12% had more than a college degree. Forty-two percent reported that their health was fair or poor. Demographic characteristics were not available for those who did not respond to the survey; therefore, nonresponse bias could not be estimated.

Table 2

Item Distributions

We conducted analyses to evaluate the distribution of survey items. The percentage of ceiling effects for the items ranged from 13% (q27) to 98% (q14) with a median of 70%. The percentage of floor effects ranged from 0.43% (q31) to 49% (q27) with a median of 3%. Item 14 was an outlier in terms of its ceiling effect (almost 98% of respondents said that their provider did not use a condescending, sarcastic, or rude tone or manner). However, this item was retained in the item set because endorsement of the item, although rare, can send a strong message to clinicians about how patients perceive them.
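
As an illustration of the ceiling/floor tabulation described above, here is a minimal sketch, assuming numeric response codes in which the highest code is the most positive answer; the DataFrame and item names are illustrative, not the study’s data.

```python
# A minimal sketch of the ceiling/floor tabulation, assuming numeric response
# codes where the highest code is the most positive answer.
import pandas as pd


def ceiling_floor(items: pd.DataFrame) -> pd.DataFrame:
    """Percent of respondents at the best (ceiling) and worst (floor) response category, per item."""
    rows = []
    for col in items.columns:
        valid = items[col].dropna()
        rows.append({
            "item": col,
            "ceiling_pct": 100 * (valid == valid.max()).mean(),
            "floor_pct": 100 * (valid == valid.min()).mean(),
        })
    return pd.DataFrame(rows).set_index("item")
```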

Correlations Between Items and Composites

We analyzed the extent to which items clustered into multi-item composites and the internal consistency reliability of those composites. Appendix A shows the correlations between the items and the 6 hypothesized composites. Items that correlated highly with their own composite were deemed acceptable, even when that correlation was not significantly higher than their correlations with the other composites. These correlations indicate that the data were not consistent with the hypothesized composites: some items did not correlate with the composite they were hypothesized to represent, whereas others related to multiple composites. For example, item 13 correlated only 0.19 with the patient/provider communication composite. Examination of the correlations suggested 2 composites: one on communication to improve health literacy and a smaller composite focused on communication about medicines.

A 2-factor categorical confirmatory factor analysis model representing the revised item configuration fit the data well (Comparative Fit Index=0.958; Tucker-Lewis Index=0.953; Root Mean Square Error of Approximation=0.068). Factor loadings were all statistically significant. Standardized loadings ranged from 0.436 to 0.913 for the communication to improve health literacy factor and from 0.655 to 0.965 for communication about medicines. The estimated correlation between the 2 factors was 0.784. Thus, the confirmatory factor analysis provided support for the 2 new CAHPS composites.

Table 3 provides the item-scale correlations for the 2 composites. We included item 21 in the Communication to Improve Health Literacy composite because it captures a key aspect of patient/provider communication; it was ultimately excluded from the final version of the new item set, however, because it is part of the CAHPS core survey. Item-composite correlations for the 16 remaining items ranged from 0.30 (item 10) to 0.79 (item 18). The composite on communication about medicines, called How Well Providers Communicate About Medicines, included 5 items (plus item 29, a screener from the CAHPS Clinician and Group core items). Only 3 of the 5 items (items 31, 33, and 34), however, were scored, because items 30 and 32 are screeners for the items that follow them. Although item 29 (How often did this provider give you easy to understand instructions about how to take your medicines?) was more highly correlated with the Communication to Improve Health Literacy composite, it remained in the Communication About Medicines composite based on its strong conceptual relationship to the other items in that composite. The item-composite correlations for the 3 items that are not screeners were 0.60 for item 34, 0.52 for item 31, and 0.49 for item 33.

Table 3

Other Items

The items within a composite measure need to be closely related, both statistically and conceptually. Items 13, 14, 27, 36, 41, 42, and 43 did not correlate distinctly with either of the final composites. Although some of these items (27, 36, and 41) had substantial correlations (r≥0.40) with one or both composites, and their patterns of correlations were similar to those of items that were included, they did not fit conceptually into either composite and were therefore excluded.

Five-item Composite on Communication to Improve Health Literacy

The large number of items (16) included in the Communication to Improve Health Literacy composite makes it unlikely that all users would choose to use it. Hence, we conducted additional analyses to identify a shorter version of this composite. We found that 5 items (18, 19, 24, 29, and 37) accounted for 90% of the variance in the 16-item Communication to Improve Health Literacy composite and could constitute a shorter version of this composite.
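
The study used the MaxR option of SAS PROC REG, which also evaluates swaps of variables in and out of the model; the sketch below uses a simpler greedy forward selection on R² only to illustrate the general idea of finding a small item subset that reproduces most of the variance in the full composite score. All names are hypothetical and listwise-complete numeric data are assumed.

```python
# Greedy forward selection on R^2 as a stand-in for SAS PROC REG's MAXR option
# (MAXR additionally tries variable swaps). Illustrative only; not the study's code.
import numpy as np
import pandas as pd


def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 of an ordinary least squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()


def forward_select(items: pd.DataFrame, composite: pd.Series,
                   target_r2: float = 0.90) -> list[str]:
    """Greedily add the item that raises R^2 most until the target is reached."""
    chosen: list[str] = []
    y = composite.to_numpy(dtype=float)
    while len(chosen) < items.shape[1]:
        remaining = [c for c in items.columns if c not in chosen]
        best = max(remaining,
                   key=lambda c: r_squared(items[chosen + [c]].to_numpy(dtype=float), y))
        chosen.append(best)
        if r_squared(items[chosen].to_numpy(dtype=float), y) >= target_r2:
            break
    return chosen
```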

Means, SDs, and Internal Consistency Reliability Estimates

Composite scores were calculated in a 2-step process: linearly transforming the items to a 0–100 possible range and then averaging the items within each composite. The mean for the 16-item Communication to Improve Health Literacy composite (0–100 possible range) was 86 (SD=16). The mean for the 5-item How Well Providers Communicate About Medicines composite was 60 (SD=35). The mean for the 5-item (items 18, 19, 24, 29, 37) Communication to Improve Health Literacy composite was 84 (SD=21).
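
A minimal sketch of this 2-step scoring rule follows, assuming the minimum and maximum response code for each item are known; the item names and the 1–4 example range are illustrative assumptions, this is not the official CAHPS scoring program, and missing-data handling is simplified to averaging whatever items are present.

```python
# Sketch of the 2-step composite scoring rule: rescale each item linearly to
# 0-100, then average the items in each composite per respondent.
import pandas as pd


def score_composite(df: pd.DataFrame, items: dict[str, tuple[float, float]]) -> pd.Series:
    """`items` maps an item column to its (min_code, max_code), e.g. {"q18": (1, 4)}."""
    rescaled = pd.DataFrame({
        col: 100.0 * (df[col] - lo) / (hi - lo)  # step 1: linear 0-100 transform
        for col, (lo, hi) in items.items()
    })
    return rescaled.mean(axis=1, skipna=True)    # step 2: average items within the composite
```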

Internal consistency reliability estimates for the 3 composites were 0.89 (16-item Communication to Improve Health Literacy), 0.71 (How Well Providers Communicate About Medicines), and 0.79 (5-item Communication to Improve Health Literacy). In comparison, the 5 CAHPS core communication items that were administered in the field test had an internal consistency reliability of 0.88. The correlation between the 16-item Communication to Improve Health Literacy composite and the How Well Providers Communicate About Medicines composite was 0.39.
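
The internal consistency estimates reported here are Cronbach’s alpha; a short sketch of the standard formula is shown below, assuming a DataFrame of (rescaled) item responses for one composite and listwise-complete rows. This is the textbook calculation, not the study’s code.

```python
# Cronbach's alpha for one composite's items (standard formula, listwise deletion).
import pandas as pd


def cronbach_alpha(items: pd.DataFrame) -> float:
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of individual item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars / total_var)
```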

Associations With Global Rating of Provider

The correlations of the items and composites with the global rating of the provider indicate the extent to which a composite “drives” the rating. The higher the correlation between a composite and the global rating of the provider, the more an increased score on that composite is associated with a better rating of the provider.

The correlations of the items not included in any composite with the global rating of the provider ranged from −0.08 (P>0.05; item 43) to 0.44 (P<0.001; item 36). Correlations of the individual 5 items in the short Communication to Improve Health Literacy composite with the global rating of the provider ranged from 0.42 (item 37: test results easy to understand) to 0.61 (item 18: provider gave all information wanted about health).

We regressed the global rating of the provider item on the 5-item Communication to Improve Health Literacy composite and the How Well Providers Communicate About Medicines composite. Both composites, the 5-item Communication to Improve Health Literacy composite (b=0.45; P<0.0001) and the How Well Providers Communicate About Medicines composite (b=0.04; P<0.01), were significantly associated with the global rating of the provider and together accounted for 42% of the variance in the rating.

A regression predicting the global rating of the provider revealed significant unique effects of all 3 composites (16-item Communication to Improve Health Literacy composite: b=0.28, P<0.001; 5-item Communication About Medicines composite: b=0.036, P=0.0207; CAHPS 5-item core communication composite: b=0.35, P<0.0001). The adjusted R² indicated that 51% of the variance in the global rating was accounted for by these 3 communication composites. Hence, the composites account for a majority of the variance in overall perceptions of the provider, and each composite contributes important unique information about patients’ healthcare experiences.

Final Version of the CAHPS Item Set for Addressing Health Literacy and Composites

The final version of the item set includes 30 supplemental items designed for use with the CAHPS Clinician and Group survey. The items address 6 main topic areas: (1) communication to improve health literacy; (2) communication about health problems and concerns; (3) communication about medicines; (4) communication about tests; (5) communication about forms; and (6) disease self-management. We identified 2 composites that can be calculated and reported, one that provides a composite score on communication to improve health literacy and one that provides a composite score on communication about medicines. We also identified a short version of the former composite that has similar explanatory power with a third the number of items. The final version of the health literacy item set was cognitively tested again as part of other CAHPS survey development efforts. Revisions to the item set were made based on the results of additional testing and as part of an effort to harmonize various CAHPS supplemental item sets. The final version of the item set can be found on the CAHPS Web site at https://www.cahps.ahrq.gov/Surveys-Guidance/CG.aspx.


DISCUSSION

Recent years have seen an increased awareness of the mismatch between patients’ health literacy skills and the demands that are placed on them. There is a growing recognition that healthcare providers have a responsibility to improve patients’ understanding of health information. The CAHPS Item Set for Addressing Health Literacy is a tool to help healthcare providers identify areas for quality improvement in how they communicate health information to patients. It also serves as a measure of whether healthcare providers have succeeded in reducing the health literacy demands they place on patients.

The results from the analysis of the field test data provide support for the reliability and validity of the CAHPS Item Set for Addressing Health Literacy. They also show that higher ratings on many of the health literacy items and on the composites go hand in hand with more favorable global ratings of providers.

Depending on their focus, survey users have the flexibility to pick and choose items and are not required to field the entire item set or to field all the items within 1 topic area. In addition to the individual items in the item set, we identified 2 composite measures that can be used for both internal and public reporting: How Well Providers Communicate About Medicines consisting of 5 items, and Communication to Improve Health Literacy consisting of a long 16-item version and a short 5-item version. The short version of this composite allows users to score how well healthcare providers are addressing their patients’ health literacy needs without having to use all 16 items in the long version of the composite. Additional information on calculating composite scores can be found in the CAHPS Clinician & Group Surveys and Instructions (https://www.cahps.ahrq.gov/Surveys-Guidance/CG/Get-CG-Surveys-and-Instructions.aspx).

This study has several limitations. First, the health literacy item set was tested exclusively with a population insured by Medicaid managed care plans or Medicare. Further research is needed to test the health literacy item set with other insured populations (eg, those with commercial insurance). Second, the study was limited to English-speaking and Spanish-speaking populations. Further, we were unable to obtain a sufficient number of Spanish language surveys to adequately compare the psychometric properties of the health literacy items and composite measures by language. Additional research is needed to fully assess the Spanish version of the item set and to test the item set with other non–English-speaking populations.

Despite these limitations, the CAHPS Item Set for Addressing Health Literacy can serve as a tool to measure, from the patient’s perspective, how well healthcare providers are meeting their patients’ health literacy needs, and the results can be used for quality improvement purposes. To further aid quality-improvement efforts, each of the items in the item set has been mapped to recommendations made in the American Medical Association’s Health Literacy Educational Toolkit, Second edition.27 This Health Literacy Quality Crosswalk can be found in the document About the CAHPS Item Set for Addressing Health Literacy (https://www.cahps.ahrq.gov/Surveys-Guidance/Item-Sets/Health-Literacy.aspx).

APPENDIX A: Item-scale Correlations for Hypothesized Composite Measures (n=492)

REFERENCES

1. H.R. 3590—111th Congress: Patient Protection and Affordable Care Act. 2009

2. Kutner M, Greenberg E, Jin Y, et al. The Health Literacy of America’s Adults: Results From the 2003 National Assessment of Adult Literacy. 2006. Washington, DC: National Center for Education Statistics.

3. America’s Health Literacy: Why We Need Accessible Health Information. 2009. Rockville, MD: US Department of Health and Human Services, Office of Disease Prevention and Health Promotion.

4. Communicating Health: Priorities and Strategies for Progress. 2003. Washington, DC: US Department of Health and Human Services, Office of Disease Prevention and Health Promotion.

5. Baker DW. The meaning and the measure of health literacy. J Gen Intern Med. 2006;21:878–883

6. Parker R, Ratzan SC. Health literacy: a second decade of distinction for Americans. J Health Commun. 2010;15(Suppl 2):20–33

7. Koh H, Berwick D, Clancy C, et al. New federal policy initiatives to boost health literacy can help the nation move beyond the cycle of costly “crisis care.” Health Aff. 2012;31:434–443

8. Healthy People 2010: Understanding and Improving Health. 2nd ed. 2000. Washington, DC: US Government Printing Office.

9. Adams K, Corrigan JM. Priority Areas for National Action: Transforming Health Care Quality. 2003. Washington, DC: The National Academies Press.

10. Health Literacy: A Prescription to End Confusion. 2004. Washington, DC: The National Academies Press.

11. DeWalt DA, Callahan LF, Hawk VH, et al. Health Literacy Universal Precautions Toolkit. 2010. Rockville, MD: Agency for Healthcare Research and Quality.

12. Berkman ND, DeWalt DA, Pignone MP, et al. Literacy and Health Outcomes. 2004. Rockville, MD: Agency for Healthcare Research and Quality.

13. Weidmer, et al. Guidelines for Translating CAHPS Surveys. 2006. Available at: https://www.cahps.ahrq.gov/Surveys-Guidance/Helpful-Resources/Translating-Surveys.aspx. Accessed May 14, 2012.

14. Solano-Flores, et al. The Assessment and Selection of Translators and Translation Reviewers. 2006. Available at: https://www.cahps.ahrq.gov/Surveys-Guidance/Helpful-Resources/Translating-Surveys.aspx. Accessed May 14, 2012.

15. Harkness JA, Van de Vijver FJR, Mohler PPh. Cross-Cultural Survey Methods. 2003. Hoboken, NJ: John Wiley & Sons Inc.

16. Behling O, Law KS. Translating Questionnaires and Other Research Instruments: Problems and Solutions. Sage University Papers Series on Quantitative Applications in the Social Sciences, 07-131. 2000. Thousand Oaks, CA: Sage.

17. US Census Bureau Methodology and Standards Council. The Translation of Surveys: An Overview of Methods and Practices and the Current State of Knowledge. Census Bureau Guideline: Language Translation of Data Collection Instruments and Supporting Materials. Working Paper. 2000. Washington, DC: US Census Bureau.

18. Hughes KA. Comparing Pretesting Methods: Cognitive Interviews, Respondent Debriefing, and Behavior Coding. Annual Meeting of the Federal Committee on Statistical Methodology. 2003. Arlington, VA: US Census Bureau.

19. Willis G. Cognitive Interviewing: A Tool for Improving Questionnaire Design. 2004. Thousand Oaks, CA: Sage Publications.

20. Harris-Kojetin LD, Fowler FJ, Brown JA, et al. The use of cognitive testing to develop and evaluate CAHPS 1.0 core survey items. Med Care. 1999;37:MS10–MS21

21. Levine RE, Fowler FJ Jr, Brown JA. Role of cognitive testing in the development of the CAHPS Hospital Survey. Health Serv Res. 2005;40(6 pt 2):2037–2056.

22. Weech-Maldonado R, Weidmer B, Morales L, et al. Cross-cultural adaptation of survey instruments: the CAHPS experience. In: Cynamon ML, Kulka RA, eds. Seventh Conference on Health Survey Research Methods (DHHS Publication No. [PHS] 01-1013). Hyattsville, MD: US Dept of Health and Human Services; 2001:75–82.

23. McGee J, Kanouse DE, Sofaer S, et al. Making survey results easy to report to consumers: How reporting needs guided survey design in CAHPS. Med Care. 1999;37(suppl 3):MS32–MS40

24. Solomon LS, Hays RD, Zaslavsky AM, et al. Psychometric properties of a group-level Consumer Assessment of Health Plans Study (CAHPS) instrument. Med Care. 2005;43:53–60

25. Hays RD, Chong K, Brown J, et al. Patient reports and ratings of individual physicians: an evaluation of the DoctorGuide and Consumer Assessment of Health Plans Study provider-level surveys. Am J Med Qual. 2003;18:190–196

26. Muthén LK, Muthén BO. Mplus User’s Guide. 2009. Los Angeles, CA: Muthén & Muthén.

27. Agency for Healthcare Research and Quality. About the CAHPS Item Set for Addressing Health Literacy. Document No. 1311. 2011.

Keywords:

health literacy; CAHPS; patient survey; communication

© 2012 Lippincott Williams & Wilkins, Inc.
