Our Policy on Policy



The long path from data to policy usually begins with the publication of study results in a peer-reviewed journal. While publication is no guarantee of quality, it suggests the results are worth further consideration. Agencies that make policy (for example, the U.S. Environmental Protection Agency) often require that an article be accepted in a peer-reviewed journal before findings can be considered in the regulatory process.

Should epidemiologists discuss the policy implications of their data within the scientific report itself? Many do.1 Under the editorship of Kenneth Rothman, Epidemiology has discouraged this practice.2 Rothman’s position was that “…it is simply too facile to toss off a policy recommendation in the closing paragraph of a scientific paper without giving the implicit decision analysis the due consideration it deserves. Making good health policy is complicated…. Our editorial policy is intended to avoid trivializing a complex process and to increase the likelihood that policy discussions are treated with the seriousness and depth of understanding that they deserve.”2 Despite some criticism,3 this policy has stood through the journal’s first decade.

As the new editors of Epidemiology, we find the question worth revisiting. We have invited three respected observers to comment on this question, and their reflections are presented in this issue.4–6 Daniel Greenbaum is a former policy maker and current head of a research organization. Stephen Teret is a researcher on the volatile public health issues of firearms and violence. (It was Teret who questioned Rothman’s policy in 1993.3) Noel Weiss has made substantial contributions to research on a range of chronic diseases.

One point on which our commentators agree is that few epidemiologists really understand the process of policy-making. Greenbaum sets out key differences between the worlds of research and policy, and describes the tensions at their interface.6 Weiss offers the practical observation that a single study seldom provides the full range of evidence needed to determine policy,4 while Teret argues that epidemiologists abrogate their duty if they limit themselves to a narrow discussion of their data.5

After reflection on all these points, we have decided to maintain Epidemiology’s editorial policy that discourages public health policy recommendations in research reports. At the same time, we see opportunities for epidemiologists to take more deliberate account of the needs of policy makers in the presentation and discussion of data.7 Epidemiologic research does not have to be relevant to policy in order to be sound, but attention to policy relevance can improve epidemiologic research.

Sometimes the policy relevance of epidemiologic data can be improved through changes in the way results are presented.8 For example, data from a given study may be too sparse to determine dose-response relationships or to identify susceptible subgroups, and in a published paper the presentation of such data might be criticized. However, questions of dose-response or susceptible subgroups are highly relevant to policy, and can eventually be addressed through the accumulation of data from many studies. With the advent of electronic publication, it is now possible for authors to provide supplemental analyses and tables as potential resources for future meta-analyses and evidence-based policy making. We encourage authors to use Epidemiology’s electronic space liberally for such purposes.

All epidemiologists know the importance of placing their findings in the context of existing information, although their thoroughness in doing so varies. Context is especially important for policy applications. Do our data nudge the collective weight of evidence in one direction or another? If so, how far? In Bayesian terms, what do the current findings add to “prior odds”?9 This assessment extends beyond merely statistical considerations. Our confidence in our findings rests not so much on statistical precision as on the level of trust in our tools and assumptions. Sensitivity analysis allows this confidence to be quantified.10 A thorough assessment of uncertainty – best done by the investigators themselves – enhances the value of data for development of policy.

In sum, Epidemiology’s past policy on policy will continue unchanged. Implications for public policy belong in commentaries and not in the last paragraphs of research reports. We welcome commentaries that discuss policy, and we ask our authors and reviewers to advise us when they think commentary is warranted. More broadly, we encourage our authors to keep in mind the utility of their data for policy purposes – not only for today’s policy-makers, but also for tomorrow’s. This may affect how data are presented and how they are discussed. Meanwhile (and with the help of our hardworking cadre of reviewers), we will do our best to select for Epidemiology those papers most likely to advance science, and influence policy.


References

1. Jackson LW, Lee NL, Samet JM. Frequency of policy recommendations in epidemiologic publications. Am J Public Health 1999; 89:1206–1211.
2. Rothman KJ. Policy recommendations in Epidemiology research papers. Epidemiology 1993; 4:94–95.
3. Teret S. So what? Epidemiology 1993; 4:93–94.
4. Weiss NS. Policy emanating from epidemiologic data: what is the proper forum? Epidemiology 2001; 12:373–374.
5. Teret S. Policy and science: should epidemiologists comment on the policy implications of their research? Epidemiology 2001; 12:374–375.
6. Greenbaum DS. Epidemiology at the edge. Epidemiology 2001; 12:376–377.
7. Samet JM, Schnatter R, Gibb H. Epidemiology and risk assessment. Am J Epidemiol 1998; 148:929–936.
8. Nurminen M, Nurminen T, Corvalán CF. Methodologic issues in epidemiologic risk assessment. Epidemiology 1999; 10:585–593.
9. Goodman S. Of P-values and Bayes: a modest proposal. Epidemiology 2001; 12:295–297.
10. Greenland S. Basic methods of sensitivity analysis of biases. Int J Epidemiol 1996; 25:1107–1116.
© 2001 Lippincott Williams & Wilkins, Inc.