Tuma, Rabiya S., PhD
The rapid decline in hormone use by postmenopausal women following the Women's Health Initiative (WHI) study accounts for less than half of the decline in breast cancer incidence that occurred around the same time, according to new research presented at the American Association for Cancer Research Frontiers in Cancer Prevention meeting. Among other noteworthy findings, investigators also reported on studies examining the use of mammography in women under 40, neighborhood-level health disparities, and dietary factors that alter cancer risk.
Since 2002, there has been a dramatic decline in breast cancer incidence in the United States. Many researchers have attributed this decline to the drop in hormone therapy use that followed the 2002 publication of WHI data showing a 1.5-fold increased breast cancer risk associated with hormone use. “Because the changes happened at the same time, people just assumed that they were related and that the drop in hormone use could explain the whole drop in incidence, but no one had put numbers to it, to see if it made sense that it accounted for the entire drop in incidence,” said Brian Sprague, PhD, a postdoctoral fellow at the University of Wisconsin, who led the new study.
When Dr. Sprague and colleagues used data from the literature to estimate the drop in hormone use, however, they found that it likely accounted for just 42% of the decline. “Based on that data we would expect a 3.5% decline in breast cancer, which is substantial and equates to about 6,000 fewer cases per year, but there was actually a 7% decline in breast cancer in just the year following the Women's Health Initiative. So there seemed to be a substantial proportion of the decline that could not be attributed to changes in hormone use,” Dr. Sprague said.
When asked what might account for the remainder of the decline, Dr. Sprague said it was unclear. Other risk factors have not changed as rapidly and some risk factors, like obesity, have moved in the wrong direction such that they would more likely cause an increase in incidence. One possibility, he said, is that screening mammography detected cancers earlier than they otherwise would have been found, and that change in timing led to the apparent decline in new cases.
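The back-of-envelope logic behind the group's estimate can be sketched as follows. The article does not give the exact inputs Dr. Sprague's team used, so the round numbers below are illustrative assumptions drawn from the figures quoted above:

```python
# Illustrative attributable-fraction arithmetic (inputs are assumptions,
# taken from the figures quoted in the article, not from the study itself).

expected_decline = 3.5   # % decline in incidence expected from reduced hormone use
observed_decline = 7.0   # % decline actually observed in the year after the WHI report

# Share of the observed decline explained by the drop in hormone use
explained_fraction = expected_decline / observed_decline
print(f"Explained by hormone-use changes: {explained_fraction:.0%}")

# The group's own, more detailed estimate was 42%, leaving the majority
# of the observed decline unexplained.
unexplained = 1 - 0.42
print(f"Unexplained (per the reported 42% estimate): {unexplained:.0%}")
```

With these round numbers the simple ratio comes out near one half; the team's reported 42% reflects a more careful calculation, but either way a substantial remainder is left unaccounted for.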
Mammography in Women under Age 40
In another study, Julie M. Kapp, PhD, MPH, Assistant Professor of Family and Community Medicine at the University of Missouri-Columbia, presented data showing that the majority of women under 40 who had mammograms did so for screening purposes. Using data from the NCI Breast Cancer Surveillance Consortium, Dr. Kapp and colleagues examined the first mammograms of women aged 18 to 39 that were performed between 1996 and 2005. Overall, the team identified 73,353 screening mammograms and 26,262 diagnostic mammograms.
In this first portion of their work, the investigators focused on racial and ethnic differences and found that the false-positive rate of screening mammography varied modestly across groups, ranging from 10.4% to 14.1%. Among African American women, 363 had to be screened to detect one cancer, while the number needed to screen among white women was 623.
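The number-needed-to-screen figures above are simply the reciprocal of the cancer detection rate per screening mammogram. A minimal sketch (the detection rates here are back-calculated from the article's quoted figures, not reported directly):

```python
# "Number needed to screen" (NNS) is the reciprocal of the cancer
# detection rate per screening mammogram. The detection rates below are
# back-calculated from the NNS figures quoted in the article.

def number_needed_to_screen(detection_rate: float) -> float:
    """Average number of screens required to detect one cancer."""
    return 1.0 / detection_rate

# African American women: roughly 1 cancer detected per 363 screens
print(round(number_needed_to_screen(1 / 363)))
# White women: roughly 1 cancer detected per 623 screens
print(round(number_needed_to_screen(1 / 623)))
```

The lower NNS in the African American group reflects a higher detection rate per screen in this dataset, not a judgment about screening policy.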
Looking at the diagnostic mammograms, the researchers also found a high false-positive rate, ranging from 8.7% for Caucasian women to 18.2% for Asian women. The risk of a true positive in the diagnostic mammograms was less than 1% for all racial and ethnic groups.
“It is important to note that the risk of false positives for a mammogram in these women is much greater than finding a cancer,” Dr. Kapp said. “We are hoping to shed more light on mammography use in younger women in general and, in particular, sensitivity toward racial and ethnic differences. These women should be aware that false positive rates are very high in the younger group.”
In ongoing studies, Dr. Kapp and colleagues plan to evaluate the overall sensitivity and specificity of screening mammograms in this young population. She declined to comment on those data as they are currently under review.
People living in more deprived neighborhoods are at increased risk of death and have worse overall health, according to a study presented by Chyke Doubeni, MD, MPH, Assistant Professor of Family and Community Health and Assistant Vice Provost for Diversity at the University of Massachusetts Medical School in Worcester. The excess risk persisted even after the researchers accounted for dietary and lifestyle factors.
“The study points out that there is more to neighborhood effects than just the diet, physical activity, and smoking risk factors that we all think of,” Dr. Doubeni said.
To evaluate the impact of neighborhood environment on health and mortality, the team used data from the NIH-AARP Diet and Health Study, which enrolled more than half a million subjects between 1995 and 1996 and collected detailed data on diet, lifestyle, and medical history via questionnaires. In the current analysis, Dr. Doubeni and colleagues found that a larger proportion of individuals who lived in the most deprived neighborhoods reported worse general health, higher body mass index, and poorer diet, with lower adherence to a Mediterranean-style diet. However, even after controlling for these known risk factors, people living in the most deprived quintile of neighborhoods had a 23% (women) or 22% (men) higher risk of death than those living in the least deprived quintile.
“The common wisdom is that if you account for the risk factors like smoking, diet, and exercise, you shouldn't see any differences in neighborhood risks, so that is surprising,” Dr. Doubeni said.
Pistachios, Coffee, Omega-3 Fatty Acids
Numerous studies at the meeting reported changes in cancer risk associated with a variety of dietary factors. In a small randomized controlled study, Ladia M. Hernandez, MS, RD, LD, Senior Research Dietitian in the Department of Epidemiology at the University of Texas M. D. Anderson Cancer Center, found that daily consumption of 68 grams of pistachios increased serum levels of gamma-tocopherol, a form of vitamin E that has been associated with a reduced risk of some cancers. Following a four-week intervention, the participants randomized to eat pistachios had a 26% increase in serum gamma-tocopherol compared with baseline levels. The researchers hypothesize that incorporating pistachios into the diet of individuals at high risk of lung cancer may reduce their risk.
In a second diet study, Kathryn M. Wilson, PhD, a postdoctoral fellow at the Channing Laboratory of Harvard Medical School and the School of Public Health, used data from the Health Professionals Follow-Up Study to assess the impact of coffee consumption on the risk of advanced prostate cancer. Coffee is known to affect insulin and glucose metabolism as well as sex hormone levels, supporting the idea that there could be an association.
Men who drank six or more cups of coffee a day had a 59% reduction in risk of lethal or advanced cancer. There was a weaker trend for reduction in risk of any kind of prostate cancer. Interestingly, the risk reduction for advanced cancer was similar for decaffeinated and caffeinated coffee, suggesting that caffeine itself was not the most important risk component in the coffee.
Finally, researchers at the National Institute of Environmental Health Sciences found in a case-control study that individuals who consumed more long-chain omega-3 fatty acids had a reduced risk of large bowel cancer. Specifically, they recruited 1,509 Caucasians (716 cases and 787 controls) and 369 African Americans (213 cases and 156 controls) using the State Cancer Registry and the Department of Motor Vehicles records. Using dietary intake questionnaires, Sangmi Kim, PhD, a postdoctoral fellow, and colleagues assessed participants’ intake of 19 different polyunsaturated fatty acids over the previous 12 months.
Individuals in the quartile with the highest dietary intake of omega-3 fatty acids had a 39% reduction in risk of bowel cancer compared with those in the quartile with the lowest intake. The association was seen in white study participants but not in African Americans. Dr. Kim said the team was surprised that the association differed between the Caucasian and African American populations and could not identify a reason for the difference.
© 2010 Lippincott Williams & Wilkins, Inc.