Research Reports

A Comparative Analysis of Telephone and In-Person Survey Administration for Public Health Surveillance in Rural American Indian Communities

English, Kevin C. DrPH; Espinoza, Judith MPH; Pete, Dornell MPH; Tjemsland, Amanda MPH

Journal of Public Health Management and Practice: September/October 2019 - Volume 25 - Issue - p S70-S76
doi: 10.1097/PHH.0000000000001007


The availability of high-quality public health surveillance data is essential for American Indian and Alaska Native (AI/AN) populations to identify strengths and assets, elucidate needs and priorities, monitor trends, measure impact, and drive programmatic investments and policy decisions for community health improvement. However, national public health surveillance systems typically suffer from insufficient inclusion and/or inaccurate identification of AI/AN populations. Instead of providing useful data, surveillance reports often characterize the AI/AN population with an asterisk, or aggregate it into a meaningless “other” category, due to small sample sizes, large margins of error, racial and ethnic misclassification, or other issues related to the validity and statistical significance of data for AI/AN populations.1 This is especially problematic when trying to access reliable public health data at the regional or tribal level. Tribe-specific data are nevertheless a valid request, given the heterogeneity of the 573 federally recognized tribes in size, culture, language, history, geographic location, resource base, and community/societal structure, all of which impact the health and well-being of tribal members. The current dearth of data has significant public health implications for AI/AN communities. Without AI/AN-specific data, adequate funding and infrastructure may not be appropriately allocated at the federal, state, county, and/or tribal level to address critical and emerging health concerns in AI/AN communities. Likewise, tribal leaders lack relevant data to drive timely action and policy development that address the highest priority needs in their respective communities.

Nationally, public health surveillance systems have largely utilized telephone survey administration in response to the rising cost of traditional face-to-face interview methods and the convenience of computer-assisted telephone interviewing.2 Studies have demonstrated, however, that relative to face-to-face samples, telephone samples commonly underrepresent people with low incomes, less educational attainment, and minorities.3 This is due, in part, to the fact that these subgroups may be less likely to own telephones or have reliable telephone coverage. The proliferation of cell phone–only households has further complicated public health surveillance methodologies and noncoverage concerns.4 These trends may be particularly problematic for rural AI/AN populations, who often have poorer access to landline telephones and more sporadic mobile telephone coverage than non-Hispanic whites.3 In addition, minorities and people with low incomes are more likely to refuse to participate in telephone surveys than face-to-face surveys.5

Consequently, there is a need for increased methodological research on the effect of public health surveillance system mode upon response of specific groups, especially underrepresented populations such as AI/ANs. To date, no studies have explored the impact of survey administration mode upon AI/AN participation, despite the heavy burden of health disparities witnessed among AI/ANs and the lack of AI/AN-specific data necessary to stimulate responsive public health action. Understanding factors that influence participation among AI/ANs in public health surveillance is critical to attaining representative, high-quality data that can assist tribes in directing local resources to areas of greatest need, developing relevant policy and program supports, and holding state and federal governments accountable to meet the needs of AI/AN people.

The present study aimed to fill this knowledge gap and inform best practices for public health surveillance among rural AI/AN populations by comparing the differential impact of telephone (cell phone and landline) versus face-to-face administration of a health risk behavior survey in 3 rural tribes in New Mexico. Analyses examined the influence of survey administration mode upon response rates, participant demographics, and costs. The study was led by the Albuquerque Area Southwest Tribal Epidemiology Center (AASTEC), 1 of 12 tribal epidemiology centers (TECs) serving the nation's 12 Indian Health Service Areas and Urban Indian health programs. The TECs were established in 1996, under the reauthorization of the Indian Health Care Improvement Act, amid growing concern about the lack of adequate public health surveillance and data for disease control for AI/AN populations.6 The TECs are defined by the Indian Health Care Improvement Act as public health authorities (25 USCA § 1621m(e)(1)). A cornerstone of TECs is to provide a vital link to AI/AN health data and technical assistance to support tribal communities in gaining greater knowledge, empowerment, and influence over their own health care and wellness. The AASTEC was therefore well-situated to conduct the present study to inform best practices for public health surveillance within rural AI/AN populations.


Methods

Research setting

This study took place in 3 neighboring AI/AN tribes in rural New Mexico. These tribes were selected on the basis of the criteria that they shared a telephone prefix and were interested in conducting a comprehensive community health assessment. Before initiating the study, the research team presented the study protocol to tribal leadership in all 3 tribes and obtained formal approval via tribal resolution. The study protocol was also approved by a tribal institutional review board. Each participating tribe also appointed a local task force composed of tribal health directors and staff to refine the survey instrument, discuss sampling strategies, and establish procedures for survey administration. This high level of community involvement was essential to promote community ownership of the project and ensure that final products aligned with the explicit needs of the participating tribes. All 3 tribes also agreed to allow AASTEC to aggregate final survey data and conduct comparative analyses to assess the impact of survey administration mode upon participation.

Survey instrument

The survey instrument utilized for this study was a modified Behavioral Risk Factor Surveillance System (BRFSS) instrument. The BRFSS is a national health survey that collects data about US residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services.7 The tribal BRFSS instrument encompassed previously validated BRFSS modules selected by the community task forces based upon tribal health priorities, rather than unilateral adherence to the core and state-added BRFSS modules selected for the New Mexico BRFSS. Key modules included health status, health care access, nutrition, physical activity, substance use, violence, mental health, environmental health, chronic disease management, and preventive screening. Questions within each module were largely unchanged to maximize comparability. Participating communities also requested the inclusion of an additional scale measuring community capacity, which has been previously validated in other tribal communities in New Mexico.8

Two separate instruments were developed for the study—an abbreviated instrument and a full instrument. The abbreviated instrument was utilized in the telephone administration phase of the study upon recommendation by our subcontractor with prior experience conducting statewide BRFSS surveillance via telephone. It was suggested that a truncated instrument was necessary to thwart participant fatigue and attrition during telephone administration. The full instrument was utilized during the in-person, household administration phase of the study. This ensured that at least some data were collected for all modules that were prioritized by the tribes.


Sampling

Our combined target sample size was n = 900 (approximately n = 300 participants per tribe), with half of the participants expected to complete the survey via telephone and the other half via in-person administration. This sample size was based upon a power analysis conducted to determine the number of surveys that needed to be completed within each tribe to conduct tribe-specific analyses at the 95% confidence level. The sample size was therefore more than sufficient to conduct the analyses with the aggregate data set presented here.
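As a rough illustration of the power analysis described above, the sketch below computes the sample size needed to estimate a proportion at the 95% confidence level, applying a finite population correction. The tribal population size (1200 adults), margin of error (±5%), and conservative p = 0.5 are illustrative assumptions, not figures reported by the study.

```python
from math import ceil

def sample_size_proportion(population, p=0.5, moe=0.05, z=1.96):
    """Sample size for estimating a proportion at the 95% confidence level
    (z = 1.96), with a finite population correction.
    p = 0.5 is the most conservative (largest-sample) assumption."""
    n0 = (z ** 2 * p * (1 - p)) / moe ** 2          # infinite-population size (~385)
    return ceil(n0 / (1 + (n0 - 1) / population))   # finite population correction

# A hypothetical tribe of 1200 enrolled adults yields a target close to
# the ~300 completed surveys per tribe used in this study.
print(sample_size_proportion(1200))  # → 292
```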

The 2 administration modes necessitated distinct sampling strategies. For the telephone sample, a random-digit-dialing sample of landline telephone numbers was generated based upon the telephone prefix shared by all 3 participating tribes. The cell phone random-digit-dialing sample was drawn using a systematic random sample, stratified by county and service provider, from a frame of blocks built from activated wireless phone numbers.

Sampling for the in-person survey followed completion of the telephone administration to ensure that the study was adequately powered to not only conduct the comparative analyses with the aggregate data set but also ensure attainment of a representative sample of participants from each tribe to engender tribe-specific prevalence estimates. Samples for the in-person administration study phase were drawn at random from enrollment/census rosters in each tribe, inclusive of tribal members aged 18 years and older currently living on tribal lands. All sampling (telephone and in-person household) was conducted without replacement.

Survey administration

Telephone interviews were all conducted in English by trained interviewers working for the state BRFSS survey unit using a computer-assisted telephone interviewing system. Per state and Centers for Disease Control and Prevention BRFSS protocol, interviewers made up to 15 calls per selected number at various time points during weekdays, evenings, and weekends. Calls to a selected number were discontinued if any of the following conditions occurred: (1) 15 attempts were made; (2) the selected phone number was disconnected; (3) the number was associated with a business; or (4) the participant refused. Participation was anonymous and consent was given verbally. Incentives were not provided to avoid requiring participants to release identifiers (ie, name and mailing address). The telephone administration phase occurred over a 2-month period until all randomly selected telephone numbers were exhausted (n = 1667 landlines and n = 1080 cell phones). On average, telephone surveys were completed in 25 minutes.

For the in-person survey, AASTEC staff trained local interviewers in each tribe to conduct the surveys. All interviewers participated in a 3-day training, which included a review of the study protocol, survey instrument and skip patterns, and several mock interview sessions. Additional training content included informed consent, participant confidentiality, and strategies to reduce interviewer bias. A full-time survey coordinator at AASTEC conducted weekly meetings with community interviewers to provide technical assistance, collect completed surveys and consent forms, monitor adherence to the survey protocol, troubleshoot challenges, and offer tangible and social support. The survey was advertised in the community through local newsletters, and interviewers carried a letter of survey endorsement signed by tribal leadership.

Selected study participants were assigned to the interviewers at random. Each question was read aloud to the participant in English or the native language (<5% of all surveys). On average, the in-person surveys were completed in 45 minutes. All participants signed a consent form and received a small incentive to honor their participation. Each interviewer was instructed to make up to 4 home visits to complete the survey. Interviewers ceased attempts to survey a selected participant if any of the following criteria were met: (1) participant refusal, (2) completed 4 visits without reaching the participant, (3) participant was determined to no longer live on tribal lands during the survey period (ie, moved, hospitalized, or incarcerated), or (4) the participant was determined to be deceased.

Data analysis

Survey response rates were calculated separately by administration mode. For the in-person survey, a simple response rate calculation was performed (number of completed interviews/total sample) × 100. For the telephone survey, each telephone number in the sample was assigned a disposition code to indicate a particular result of calling the number, that is, (1) a completed interview; or (2) a determination that (a) a household was eligible to be included but an interview was not completed; or (b) a telephone number was ineligible or could not have its eligibility determined. The final disposition codes were then used to calculate response rates, in accordance with the Centers for Disease Control and Prevention BRFSS standards set by the American Association for Public Opinion Research, which accounted for key assumptions of eligibility among potential respondents or households that were not interviewed.9
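As a simplified sketch of this calculation (the full CDC/AAPOR formula distinguishes many more disposition categories), the response rate divides completed interviews by the estimated number of eligible units, where a factor e apportions the numbers of unknown eligibility. All counts below are hypothetical, not the study's actual tallies.

```python
def response_rate(completed, eligible_noninterview, unknown_eligibility, e):
    """Simplified AAPOR-style response rate.
    e: estimated proportion of unknown-eligibility numbers assumed to be
    truly eligible (the key eligibility assumption noted in the text)."""
    estimated_eligible = completed + eligible_noninterview + e * unknown_eligibility
    return completed / estimated_eligible

# Hypothetical disposition counts for a telephone sample:
print(f"{response_rate(164, 150, 700, 0.2):.1%}")  # → 36.1%
```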

Next, the 2 data sets (telephone and in-person) were merged and a new variable was created to assign the applicable survey administration mode to each participant. SAS 9.4 was utilized to perform the analyses. Descriptive statistics were first utilized to characterize the population completing the survey via each administration mode. Bivariate analyses and significance testing (ie, χ2) were then performed to determine whether any significant differences were observed across participant demographics (ie, age, sex, income, educational attainment, and employment) by survey administration mode. The data were also analyzed separately by tribe to determine prevalence estimates for each survey indicator, which were then incorporated into tribe-specific reports and disseminated directly to each participating community.
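The bivariate tests can be illustrated in pure Python (the study itself used SAS 9.4). This sketch computes the Pearson χ2 statistic for the age-group counts reported in Table 2; with 2 degrees of freedom, a statistic above the 0.05 critical value of 5.99 indicates a significant difference by mode, consistent with the P < .01 reported for age groups.

```python
def chi_square(table):
    """Pearson chi-square statistic of independence for a contingency table
    (rows = demographic categories, columns = administration modes)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Age-group counts (18-29, 30-49, 50+) by mode, from Table 2
# (in-person vs telephone):
table = [[105, 19], [195, 51], [232, 93]]
print(round(chi_square(table), 2))  # → 10.41, well above 5.99 on 2 df
```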

Finally, a basic cost analysis was performed to determine the average cost per completed survey by survey administration mode. This was accomplished by calculating the total costs of survey administration per mode, divided by the total number of completed surveys.
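The cost-per-survey figures can be reproduced with simple division, using the totals reported in Table 3:

```python
# Cost per completed survey = total administration cost / completed surveys,
# using the totals reported in Table 3:
in_person_cost = 103_268 / 538   # in-person administration
telephone_cost = 35_000 / 166    # telephone administration (state BRFSS contract)
print(round(in_person_cost), round(telephone_cost))  # → 192 211
```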


Results

Response rate by survey administration mode

For the telephone sample, 2747 unique telephone numbers (1667 landline and 1080 cell phone) were selected at random for participation. A combined total of 9081 call attempts were made to these numbers. These attempts resulted in 896 answered calls (434 landline and 462 cell phone). Of the answered calls, 302 persons were members of one of the participating tribes and 183 began the survey. At the conclusion of the survey administration, a total of 164 surveys were completed by telephone; notably, only 2 of these were completed by cell phone. Using the Centers for Disease Control and Prevention's BRFSS response rate calculation, which accounts for different eligibility factors (interviewed, eligible/noninterview, unknown eligibility, not eligible), the telephone sample response rate was calculated at 35.7% (combined landline and cell phones).

Because a separate aim of the study was to generate tribe-specific BRFSS data for each tribe, and the needed sample size was calculated at approximately n = 300 per tribe, it was necessary to oversample participants for in-person administration. A total of 781 households across all 3 tribes were therefore randomly selected to participate via this administration mode. At the conclusion of the 6-month in-person survey administration period, the total number of completed surveys was 538, which reflected a 68.9% response rate.

As demonstrated in Table 1, in-person survey administration yielded a much higher response rate than telephone survey administration. In fact, the in-person response rate (68.9%) was nearly double the telephone response rate (35.7%). It is also important to note that the in-person response rates were relatively similar across the 3 tribes (78.4%, 60.0%, and 66.4%). Response rates could not be calculated by tribe for the telephone sample, because it was not feasible to determine the tribal affiliation of nonrespondents.

TABLE 1 - Response Rate
Administration Mode | Unique Participants | Call Attempts | Completed Surveys | Response Rate
Landline phone | 1667 | 5767 | 162 |
Mobile phone | 1080 | 3314 | 2 |
Telephone (combined) | 2747 | 9081 | 164 | 35.7%
In-person | 781 | N/A | 538 | 68.9%
Abbreviation: N/A, not applicable.

Demographic characteristics of sample

The demographic profile of both samples by age, sex, income, employment, and educational attainment is outlined in Table 2. Statistically significant differences were observed by age, income, and educational attainment.

TABLE 2 - Demographic Characteristics of Participants
Characteristic | In-Person Survey, N | % (95% CI) | Telephone Survey, N | % (95% CI) | P
Sex | | | | | .66
  Male | 247 | 46.5 (41.9-51.1) | 67 | 50.3 (42.0-58.6) |
  Female | 285 | 53.5 (48.9-58.1) | 96 | 49.7 (41.4-58.0) |
Age, y | | | | | <.01
  Average age | 532 | 47.1 (45.6-48.6) | 163 | 51.3 (48.8-53.8) |
Age groups, y | | | | | <.01
  18-29 | 105 | 19.7 (16.3-23.1) | 19 | 11.7 (6.7-16.6) |
  30-49 | 195 | 36.6 (32.5-40.7) | 51 | 31.3 (24.1-38.5) |
  50+ | 232 | 43.7 (39.5-47.9) | 93 | 57.1 (49.4-64.7) |
Educational attainment | | | | | <.01
  Non–high school graduate | 93 | 17.6 (14.3-20.8) | 15 | 9.0 (4.6-13.4) |
  High school graduate | 394 | 74.5 (70.8-78.2) | 129 | 77.7 (71.3-84.1) |
  College graduate | 42 | 7.9 (5.6-10.3) | 22 | 13.3 (8.0-18.5) |
Employment status | | | | | .12
  Employed | 280 | 55.3 (50.7-59.9) | 73 | 46.5 (38.1-55.0) |
  Unemployed | 67 | 15.4 (11.8-19.0) | 27 | 18.4 (11.7-25.1) |
  Other | 167 | 29.3 (25.2-33.3) | 58 | 35.1 (27.0-43.2) |
Income | | | | | <.05
  <$10 000 | 149 | 33.3 (28.9-37.7) | 28 | 20.4 (13.6-27.3) |
  $10 000 to $19 999 | 115 | 25.7 (21.7-29.8) | 36 | 26.3 (18.8-33.7) |
  $20 000 to $34 999 | 105 | 23.5 (19.5-27.4) | 47 | 34.3 (26.3-42.4) |
  $35 000 to $49 999 | 40 | 8.9 (6.3-11.6) | 12 | 8.8 (4.0-13.6) |
  ≥$50 000 | 38 | 8.5 (5.9-11.1) | 14 | 10.2 (5.1-15.4) |
Abbreviation: CI, confidence interval. For the average age row, values are means (years) rather than percentages.

Compared with those completing a telephone-administered survey, the sample of participants completing the survey via in-person administration was younger (mean age = 47.1 years, 95% confidence interval [CI] [45.6-48.6] vs 51.3 years, 95% CI [48.8-53.8]). In-person survey participants also encompassed a greater percentage of AI/AN adults who (a) had not graduated from high school (17.6%, 95% CI [14.3-20.8] vs 9.0%, 95% CI [4.6-13.4]); and (b) had household incomes lower than $10 000 (33.3%, 95% CI [28.9-37.7] vs 20.4%, 95% CI [13.6-27.3]). A greater percentage of females completed the survey via in-person administration (53.5%) than via telephone administration (49.7%); however, this difference was not statistically significant. Likewise, significant differences were not observed by employment status.

Cost analysis

Total costs for the telephone survey administration were $35 000, equivalent to the value of the contract with the state BRFSS survey unit. Costs for the in-person survey administration included AASTEC labor (epidemiologists, community interviewers, and data entry technicians), participant incentives, and interviewer mileage reimbursement. These costs totaled $103 268 (Table 3). Although the in-person survey administration incurred higher costs, the low response rate observed in the telephone administration sample resulted in a lower cost per completed survey by in-person administration ($192), compared with telephone administration ($211).

TABLE 3 - Cost Analysis
Measure | In-Person Survey | Telephone Survey
Total cost of survey | $103 268 | $35 000
Total number of surveys completed | 538 | 166
Cost per survey | $192 | $211


Discussion

This study is the first to demonstrate significant differences in survey response rates among rural AI/AN adults according to administration mode. In fact, in-person administration of a health risk behavior survey to AI/AN adults yielded a response rate that was nearly double the response rate observed during telephone administration (landline and cell phones combined). Differences were also evident by telephone type: only 2 participants out of a sample of 1080 unique cell phone numbers completed a survey. These findings have significant public health implications, given the shift of national public health surveillance systems toward telephone-only survey administration and the growing reliance on cell phones.

One argument commonly made in favor of this transition to telephone administration is that it reduces cost. However, for the rural AI/AN population, this may not be the case. As evidenced in this study, when comparing costs per completed survey, the in-person survey proved less expensive to administer, although both administration modes incurred high costs to attain the target sample size.

At the same time, this study demonstrated significant differences in demographic characteristics by survey administration mode. The AI/AN adults who completed the survey in person were, on average, younger and had lower household incomes and educational attainment than those who completed the survey via telephone. Thus, AI/ANs who do participate in public health surveillance via telephone may be systematically different from the AI/AN population completing the survey in person. Moreover, based on comparisons to US Census data, the subgroup of AI/AN adults who completed the survey in person was more demographically similar to the rural AI/AN population that the survey was intended to reach.10

There are a few limitations that must be considered when interpreting the findings of this study. First, a nominal incentive was offered to participants completing the in-person survey only. Because the telephone survey did not collect identifying information (ie, name and mailing address), it was not feasible to disseminate incentives to participants. It is also important to note that the in-person survey instrument was significantly longer (45 minutes vs 25 minutes); thus, the investment of time for completing the survey in-person was greater. Interviewers were also trained to not mention the incentive until the completion of the in-person survey to reduce any bias that might be incurred from incentivizing participation. Nevertheless, the inclusion of an incentive in only 1 survey administration mode must be considered when interpreting the findings of this study. Second, there was potential for crossover of the 2 samples. To address this concern, interviewers were instructed to note whether the participant indicated at any time during the survey that he or she had previously participated in the survey by telephone. To our knowledge, crossover did not occur between the samples in this study. Finally, the study population included 3 rural tribes in New Mexico and therefore cannot be generalized to all federally recognized tribes throughout the country, or the urban American Indian population. Nevertheless, it is important to note that a similarly high response rate was observed for in-person survey administration across all 3 participating tribes.

Implications for Policy & Practice

The findings from this study have important implications for public health surveillance within rural American Indian/Alaska Native (AI/AN) populations, including the following:

  • Telephone survey administration is unlikely to yield sufficient coverage of the rural AI/AN population. This discovery is particularly disconcerting, given the fact that face-to-face interviewing has largely been replaced by telephone interviewing (and increasingly mobile phones) as the dominant mode of health survey data collection in the United States.5
  • The AI/ANs who do currently participate in telephone surveys may be systematically different, on average, from those AI/ANs who actually live on rural tribal lands, as evidenced in this study.
  • If there is a genuine interest in ensuring that useful data are available for the rural AI/AN population, additional resources and innovation are needed to fully engage the AI/AN population within national public health surveillance systems. Specifically, there is a need to use mixed-mode strategies for public health surveillance that weave together in-person household administration with telephone and/or other administration modes. Recent studies have demonstrated the success of such blended approaches in the general population.11
  • There is a need for additional research on the potential benefits and/or limitations of other survey administration modes (ie, online or mailed surveys) in rural AI/AN populations, as well as exploration of new and alternative survey administration modes that may better fit the unique context of rural AI/AN populations.
  • With sufficient resources, tribal epidemiology centers are well situated to fill the current gaps in public health surveillance in partnership with AI/AN populations. As evidenced in this study, all 3 participating communities successfully obtained tribe-specific health data following the administration of a mixed-mode survey led by the Albuquerque Area Southwest Tribal Epidemiology Center.
  • Greater attention and resources to integrate tribes and tribal epidemiology centers into the national public health surveillance system are needed to ensure the highest quality of data for the AI/AN population at not only the national and state levels but also the tribal level.

The World Health Organization defines public health surveillance as the continuous, systematic collection, analysis, and interpretation of health-related data needed for the planning, implementation, and evaluation of public health practice.12 The pervasive absence of such data for AI/AN populations is a clear and present threat to the self-determination of tribal nations to advance the health and wellness of their own people. Without change and innovation, the AI/AN population will continue to be underrepresented in national public health surveillance systems, further challenging capacity to document and address persistent disparities and inequities witnessed among AI/ANs nationwide.


1. National Congress of American Indians (NCAI) Policy Research Center. Data disaggregation. Accessed November 5, 2018.
2. Greenfield TK, Midanik LT, Rogers JD. Effects of telephone versus face-to-face interview modes on reports of alcohol consumption. Addiction. 2000;95(2):277–284.
3. Ellis CH, Krosnick JA. Comparing telephone and face-to-face surveys in terms of sample representativeness: a meta-analysis of demographic characteristics. NES Technical Report Series, No. nes010871; 1999.
4. Blumberg SJ, Luke JV. Wireless substitution: Early release of estimates from the National Health Interview Survey, July-December 2016. National Center for Health Statistics. Published 2017. Accessed October 1, 2018.
5. Holbrook AL, Krosnick JA, Pfent AM. The causes and consequences of response rates in surveys by the news media and government contractor survey research firms. In: Lepkowski J, Harris-Kojetin B, Lavrakas PJ, et al, eds. Advances in Telephone Survey Methodology. New York, NY: Wiley; 2007.
6. 25 U.S.C.A § 1621m(e)(1). Published March 23, 2010. Accessed October 5, 2018.
7. Centers for Disease Control and Prevention. Behavioral Risk Factor Surveillance System (BRFSS). Accessed November 5, 2018.
8. Oetzel J, Wallerstein N, Solimon A, et al. Creating an instrument to measure people's perception of community capacity in American Indian communities. Health Educ Behav. 2011;38(3):301–310.
9. Centers for Disease Control and Prevention. Behavioral Risk Factor Surveillance System 2016 summary data quality report. Published June 29, 2017. Accessed September 12, 2018.
10. US Census Bureau. Published 2010. Accessed October 1, 2018.
11. Mauz E, Lippe EV, Allen J, et al. Mixing modes in a population-based interview survey: comparison of a sequential and a concurrent mixed-mode design for public health research. Arch Public Health. 2018;76(1). doi:10.1186/s13690-017-0237-1.
12. World Health Organization. Public health surveillance. Accessed November 1, 2018.

Keywords: American Indians; data; epidemiology; health surveys; public health surveillance; tribal epidemiology centers

© 2019 Wolters Kluwer Health, Inc. All rights reserved.