
“There does not appear to be evidence that chocolate should be avoided in terms of impact on cardiovascular risk.” So conclude the authors of a report published Monday in the august medical journal Heart. Their takeaway is decidedly unsexy (takeaways from well-conducted health studies, and the aforementioned investigation appears to be one, often are), but this has not prevented the Telegraph (“Two bars of chocolate a day ‘lowers risk of stroke and heart disease’”); the Independent (“Two chocolate bars a day ‘reduce risk of heart attack and stroke’”); the Mirror (“Two chocolate bars a day can SLASH the risk of heart disease and stroke”); and others from overselling the study’s finding with their headlines. Don’t believe them.
The Science Media Centre, an independent press office that gathers expert reaction to science in the news, gives a good, measured summary of the study’s findings. Note the thoughtful hedging, the careful weighing of evidence, the absence of superlatives:
The paper shows that people in Norfolk who admitted to consuming more chocolate squares, bars or hot chocolate in a questionnaire administered once in the 1990s were, according to their answers to that questionnaire, younger, with lower BMI and blood pressure, less likely to have diabetes, more likely to be physically active, and more likely to smoke. During follow-up until 2008, they were also less likely to die from cardiovascular disease (although not less likely to suffer cardiovascular disease).
It is hard to know if the lower risk comes from chocolate or those other factors. The authors have tried to account for these as far as possible, but the nature of the study means that it is not possible to do that perfectly. Therefore, it is possible that the protective effect might be because of something else – not chocolate.
The study is well conducted observational research, but the limitations of the study design mean that the study can only generate hypotheses for evaluation in further research.
Which is to say the study has made incremental progress towards a greater understanding of how chocolate’s ingredients affect the complex biochemical system that is the human body. Behold, the slow, frustrating, and occasionally ambiguous march of science! (Now there’s a headline.)
So here’s a question: How did something so modest become “Two chocolate bars a day can SLASH the risk of heart disease and stroke”? Who or what is to blame for this hype?
When John Bohannon lifted the lid on his “Chocolate Sting” here on io9 last month, the list of guilty parties included faulty experimental design, gimmicky statistics, predatory open-access publishers, unreliable peer review, a hyped press release, and the uncritical parroting of that press release by media outlets. These all rank among the biggest problems plaguing the research-media complex, and Bohannon’s hoax hinged on the exploitation of every single one of them. But in the case of the present study, the list of offenders is much shorter.
The present chocolate study was, unlike Bohannon’s, well-designed. Being an observational study, it was subject to a number of caveats—and, to their credit, lead researcher Phyo Myint and his colleagues go to great lengths to highlight those limitations. The paper was published in a journal that seems likely to have performed rigorous peer review (and it will, at the very least, receive far more scrutiny, by virtue of Heart’s sizable readership alone).
Science Media Centre notes one major misstep on the part of the press release issued by the University of Aberdeen, where Myint is a professor. The first line reads: “Eating up to 100g of chocolate every day is linked to lowered heart disease and stroke risk, according to research carried out by scientists at the University of Aberdeen.” SMC says this introduction “could lead people to infer that chocolate has a protective effect against [cardiovascular disease],” but it is arguably a health/science journalist’s job to know that “link” ≠ “causation.”
What’s more, the rest of the press release does an uncharacteristically good job of highlighting the study’s limitations. It includes explicit warnings about causation versus correlation, the unreliability of food questionnaires, and even reverse causation! A recent survey of observational studies found that these sorts of limitations are rarely addressed in press releases and associated news stories. How rarely? Here’s the relevant excerpt from last week’s issue of JAMA Internal Medicine:
Any study limitation was mentioned in 70 of 81 (86%) source article Discussion sections, 26 of 48 (54%) accompanying editorials, 13 of 54 (24%) journal press releases, 16 of 81 (20%) source article abstracts (of which 9 were published in the Annals of Internal Medicine), and 61 of 319 (19%) associated news stories. An explicit statement that causality could not be inferred was infrequently present: 8 of 81 (10%) source article Discussion sections, 7 of 48 (15%) editorials, 2 of 54 (4%) press releases, 3 of 81 (4%) source article abstracts, and 31 of 319 (10%) news stories contained such statements.
The University of Aberdeen press release calls attention not to one limitation but to several, including an explicit statement that causality cannot be inferred from an observational study like this one. Here’s the direct quote:
This is an observational study so no definitive conclusions about cause and effect can be drawn. And the researchers point out that food frequency questionnaires do involve a certain amount of recall bias and underestimation of items eaten.
Reverse causation—whereby those with a higher cardiovascular disease risk profile eat less chocolate and foods containing it than those who are healthier—may also help to explain the results, they say.
The upshot of all this is that the study, the researchers, and the press release all earn relatively high marks for quality and forthrightness. Which leaves the media outlets.
Many outlets have failed by inflating the first line of the press release into a sensational headline and lead (though even the worst offenders go on to list caveats further down in their pieces, presumably because those caveats appear in the press packet). Others have split the difference; the Washington Post tacked what may be the most ubiquitous two-word caveat in history onto the end of its headline: “Good news for chocolate lovers: The more you eat, the lower your risk of heart disease, study suggests.”
The best headline/lead combo I’ve seen comes from Agence France-Presse. “More evidence that chocolate may be good for the heart, say researchers,” the headline reads. The opening sentence: “New research has added to tentative evidence that eating chocolate in modest quantities may be good for the heart.”
But nobody tops LiveScience for sheer pithiness. The opening line of its coverage of Myint’s team’s research could serve, with only minor edits, as the intro to coverage of countless health studies. “Chocolate is good for your heart,” it reads, “sort of, maybe.”
As Virginia Hughes wrote last year, on the heels of a newly published study of resveratrol (a compound with ambiguous longevity-boosting/disease-fighting properties that, like chocolate, the health media often reports on breathlessly): “The science of health is so, so confusing, I almost wonder if it wouldn’t be better for journalists to stop writing about health altogether. Or at least to dramatically change the way we do it.”
Contact the author at rtgonzalez@io9.com and @rtg0nzalez.