From Westat, Bethesda, MD.
Address correspondence to: Jack Cahill, Westat, 1650 Research Blvd. TB 330, Bethesda, MD 20850. E-mail: email@example.com.
The commentary by Patricia Hartge1 in this issue makes an important contribution by summarizing recent issues regarding study participation. Her report underscores the need to evaluate a study's design approaches and to find solutions that allow for analysis with sufficient confidence in the results.
One approach to increasing response rates is to examine the costs and benefits of adopting new procedures. Hartge describes a variety of approaches to increasing response rates, including incentives, use of advance letters, and multiple contacts. Real-world budget constraints also need to be considered: do the benefits of the marginal increases discussed by Hartge outweigh the costs?
In terms of study design, the best approach may be to find a balance between study costs and the survey measures used to reduce nonparticipation. This balancing act requires serious examination of whether a particular survey procedure will improve data quality and whether such improvement is worth the associated costs. There are also ways to limit costs and at the same time improve the response rate, such as refusal subsampling (ie, recontacting a subsample of people who refused). A mail questionnaire with telephone follow-up may be more cost-effective than a single-mode design. A dual-frame approach to sampling may improve response rates and also be cheaper. Other approaches might include methodologic experiments and the conduct of a nonresponse bias survey. Epidemiologists, in general, should be encouraged to analyze prior studies critically to see what really works and to integrate that information with budgetary considerations to make wise decisions.
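The logic of refusal subsampling can be made concrete with a small numerical sketch. The illustration below is hypothetical (the sample sizes, subsampling fraction, and outcome values are invented for demonstration); it assumes the standard double-sampling estimator, in which the converted subsample of refusals is weighted up to represent all refusals.

```python
import random

random.seed(1)

# Hypothetical outcome data (1 = has the attribute of interest, 0 = does not).
responders = [random.randint(0, 1) for _ in range(700)]  # responded at first contact
refusals = [random.randint(0, 1) for _ in range(300)]    # refused at first contact

# Recontact only a random subsample of refusals (fraction f) with
# intensive follow-up, rather than pursuing all 300.
f = 0.25
sub = random.sample(refusals, int(f * len(refusals)))

# Double-sampling estimator: the subsample mean stands in for all
# refusals, so it is weighted by the full refusal count.
n = len(responders) + len(refusals)
estimate = (sum(responders) + len(refusals) * (sum(sub) / len(sub))) / n
print(round(estimate, 3))
```

The design choice here is the fraction `f`: a smaller subsample cuts follow-up costs but increases the variance contributed by the refusal stratum, which is the cost-quality tradeoff discussed above.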
Hartge correctly points out that the use of random digit dialing as a sampling frame is quickly becoming a tool of the past. One type of sampling frame recently available in the United States but not discussed in her commentary is the national listing of addresses available from the U.S. Postal Service (USPS). The file contains all delivery-point addresses serviced by the USPS, which allows contact of households without telephones or with only cell phones. Link and colleagues2,3 found that using this sampling frame with appropriate follow-up achieved a higher response rate than random digit dialing. This sampling frame has the added advantages of yielding a sample more reflective of the general population at lower cost than random digit dialing and of being adaptable to a mixed-mode approach. Clearly, tradeoffs in the length and complexity of the questionnaire must be weighed against the needs of the survey. For a mail survey, it is also difficult to tightly control respondent selection within the targeted household. One approach would be to send a mailed questionnaire with core questions for everyone in the household, with telephone follow-up to interview nonresponders or to ask additional questions of selected individuals. Whatever the design approach, an evaluation of this sampling frame and mode of interview would be of value to the epidemiologic community.
Additionally, because there is an ongoing reluctance among the U.S. population to participate in studies, epidemiologists are faced with increased uncertainty about the performance of their study at any particular point in time. This problem requires careful monitoring and the ability to change the approach during the course of the data collection. Because mixed-mode approaches are the likely future of epidemiologic studies, the skills needed by epidemiologists are changing, and simply doing what worked in the past may not be good enough. Epidemiologists need to understand different methodologies and the tools required for combining multiple modes into a single design. Working in close collaboration with the survey methodologist offers some advantages for the epidemiologist. The survey methodologist can provide the epidemiologist with new methods for dealing with nonresponse, which the epidemiologist can balance against the scientific and practical needs of the study.
ABOUT THE AUTHOR
JACK CAHILL is a Vice President and Study Area Director at Westat. He has 32 years of experience in planning, implementing, and evaluating a variety of clinical and epidemiologic studies in the areas of cancer, nutrition, and health intervention at both the domestic and international levels.
1. Hartge P. Participation in population studies [Commentary]. Epidemiology. 2006;17:252–254.
2. Link M, Battaglia MP, Frankel MR, et al. Effectiveness of an Address-Based Sampling Frame Alternative to RDD: 2005 BRFSS Mail Survey Pilot Results. Paper presented at the Annual Meeting of the American Statistical Association, 2005.
3. Link M, RTI International, Mokdad A. Are Web and Mail Modes Feasible Options for the Behavioral Risk Factor Surveillance System? Paper presented at the Health Survey Research Methods Conference, 2004.
© 2006 Lippincott Williams & Wilkins, Inc.