- Meta-analyses do fairly well at correctly stating their conclusions. Meta-analyses are quantitative literature reviews that combine results from across studies, essentially increasing the sample size and breadth (a numerical sketch of the pooling idea follows this list).
- For example, Schoenfeld and Ioannidis looked at 9 meta-analyses showing an association between consuming particular foods (think of foods such as fruits and vegetables) and reduced risk of cancer. For all 9, the results were statistically significant at conventional levels (p<.05).
- They also looked at 4 meta-analyses showing an association between consuming particular foods (think of foods such as processed meats) and increased risk of cancer. For 3 of the 4, the results were statistically significant at conventional levels (p<.05).
- While the meta-analyses did well at reporting results, individual studies sometimes reported results in their abstracts even when they were not statistically significant. Some of these reported results may reflect random happenstance rather than real effects on cancer risk. This is why it is unwise to change eating habits with every new study. It is wiser to rely on the balance of scientific evidence connecting particular foods to cancer risks and benefits.
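To make the pooling idea in the first bullet concrete, here is a minimal sketch of a fixed-effect (inverse-variance) meta-analysis in Python. The relative risks and confidence intervals below are made up purely for illustration; they are not numbers from Schoenfeld and Ioannidis.

```python
import math

# Hypothetical per-study results for one food/cancer pair: relative risk and 95% CI.
# These numbers are illustrative only, not taken from any actual meta-analysis.
studies = [
    {"rr": 0.85, "ci": (0.70, 1.03)},
    {"rr": 0.90, "ci": (0.78, 1.04)},
    {"rr": 0.88, "ci": (0.75, 1.03)},
]

def pooled_relative_risk(studies):
    """Fixed-effect (inverse-variance) pooling on the log relative-risk scale."""
    weights, weighted_logs = [], []
    for s in studies:
        log_rr = math.log(s["rr"])
        # Standard error recovered from the 95% CI: (log(upper) - log(lower)) / (2 * 1.96)
        se = (math.log(s["ci"][1]) - math.log(s["ci"][0])) / (2 * 1.96)
        w = 1.0 / se ** 2  # more precise studies get more weight
        weights.append(w)
        weighted_logs.append(w * log_rr)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

rr, ci = pooled_relative_risk(studies)
print(f"Pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

In this made-up example, each individual study's confidence interval crosses 1.0, but the pooled interval does not; that extra precision is the sense in which a meta-analysis "increases the sample size."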
But journalists reporting on the new study will tell you something completely different from the above. In the endless search for novelty, journalists write each day that everything you previously believed about diet and health is mistaken.
For example, Sarah Kliff at the Washington Post's Wonkblog writes today under the headline: "Pretty much everything you eat is associated with cancer. Don’t worry about it."
Food industry public relations folks will love this message.
To make things worse, Kliff garbles the statistical material (see the comments to the Wonkblog post):
Don’t panic yet, though: The vast majority of those studies, Schoenfeld and Ioannidis found, showed really weak associations between the ingredient at hand and cancer risk. A full 80 percent of the studies had shown statistical relationships that were “weak or nominally significant,” as measured by the study’s P-values.

This description seems to ignore the meta-analyses and to treat perfectly fine statistically significant results (p<.05) as if they were “weak or nominally significant.”
Perhaps the authors share some of the responsibility. Kliff quotes the author:
“I was constantly amazed at how often claims about associations of specific foods with cancer were made, so I wanted to examine systematically the phenomenon,” e-mails study author John Ioannidis. “I suspected that much of this literature must be wrong. What we see is that almost everything is claimed to be associated with cancer, and a large portion of these claims seem to be wrong indeed.”

As with climate change skepticism, people twist genuine heterogeneity in scientific results to cast doubt on marginal claims and widely accepted claims alike. I wish that neither journalists nor authors would cavalierly say that most of the literature is wrong, when even the new study shows quite trustworthy results for the authoritative meta-analyses that actually merit attention from the public.
Just for example, the balance of scientific evidence from the World Cancer Research Fund (WCRF) and the American Institute for Cancer Research (AICR) suggests that fruits and vegetables reduce risk and that red meats and processed meats increase risks of certain cancers. Such conclusions based on meta-analyses fare well in the new AJCN study.