To give you a brief example of the kind of misreporting that occurs frequently in mainstream science journalism, particularly with nutrition, I’m going to pick apart a 2013 article from the BBC news website. Before I start, I’d like to say that I’m one of the BBC’s biggest fans. However, in the pursuit of true, unblemished objectivity, I can’t exempt them from my unbiased criticism.
Titled “Processed meat ‘early death’ link”, the article discusses the results of a study relating processed meat consumption to early death, drawn from population data on half a million people. Innocent enough. However, the article itself is symptomatic of the substandard way in which science journalism is presented to the public. At the time of writing, it is the most recent in a long line of similar articles relaying the latest in the world of nutritional epidemiology: large studies looking at levels of different parameters in large populations, in an attempt to elucidate trends for further research. Here, they looked at processed meat consumption and the prevalence of cardiovascular disease, cancer, and ultimately, early death. Early death is an essential outcome to consider, as it has the final say on the usefulness of any specific finding. For example, a food might increase your chances of getting cancer but be protective against heart disease, so your lifespan eating it might be roughly the same as if you didn’t indulge. Therefore, what we’re interested in from a study like this is whether consuming a food will shorten your life compared with not consuming it. The absolutely critical thing to remember with any epidemiological study like this, which the article fails to point out at any stage, is its inherent inability to establish a causal relationship. That is to say, from these studies you can’t conclude that eating this causes that, because the study shows an association, not a direct cause. “Correlation does not imply causation”, so the famous saying goes.
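To make that point concrete, here is a toy simulation, with every number invented purely for illustration. A hidden “careless lifestyle” factor raises both the chance of eating processed meat and the chance of early death, while the meat itself does nothing at all, yet the raw population data still show a striking association:

```python
import random

random.seed(0)

# Toy model (all probabilities invented for illustration):
# a hidden lifestyle factor raises both the chance of eating processed
# meat and the chance of early death; meat itself has zero effect.
n = 100_000
eats_meat = []
dies_early = []
for _ in range(n):
    careless = random.random() < 0.3
    eats_meat.append(random.random() < (0.7 if careless else 0.2))
    dies_early.append(random.random() < (0.15 if careless else 0.05))

def death_rate(meat_flag):
    group = [d for m, d in zip(eats_meat, dies_early) if m == meat_flag]
    return sum(group) / len(group)

print(f"early-death rate, meat eaters:     {death_rate(True):.3f}")
print(f"early-death rate, non meat eaters: {death_rate(False):.3f}")
```

Despite meat having no causal effect whatsoever in this model, meat eaters die early noticeably more often, purely because the hidden factor is unevenly distributed between the two groups. That is exactly the trap the famous saying warns about.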
The article begins with precious little discussion of the actual study before motoring forward with recommendations based on the study’s supposed implications. “Sausages, ham, bacon and other processed meats appear to increase the risk of dying young, a study of half a million people across Europe suggests.” [emphasis mine] What is critical to the credibility of the author here is the deliberate use of the words I have italicised. Blink and you’ll miss them, but they save the article from factual misdemeanour. Here’s my own, more candid version of the above quote: “According to one study, sausages, ham, bacon and other processed meats were related to a risk of dying young, but the results do not constitute evidence and should not be taken without a suitable level of skepticism.” How’s that for a tagline? Not very appealing, basically. That’s why it’s so rare. It will come as no surprise that the media prefer to sensationalise. It makes the story juicier. But sensationalism is not acceptable when it compromises the message, especially when the message is as important and wide-reaching as nutrition and health research. Few readers know to analyse the wording of these articles with such pedantry, or are able to, and this is the crux of the issue. The article goes on: “[The study] concluded diets high in processed meats were linked to cardiovascular disease, cancer and early deaths.” Again, my emphasis. “The researchers, writing in the journal BMC Medicine, said salt and chemicals used to preserve the meat may damage health.” Need I go on? Add to the article’s semantic subterfuge the bolstering of the tagline with the weighty phrase “half a million people”. Wow, this study must be right if so many people were involved. Yes, bigger numbers are more reliable, but that doesn’t make spurious and unrelated conclusions drawn from the data they throw up any more valid.
After no more than three sentences, the article moves directly into how to act on this information, which is the one thing a reader has no grounds to do. “The British Heart Foundation suggested opting for leaner cuts of meat.” But hang on, where’s the evidence that this will help prevent an early death?
Scrolling down, we stumble upon a redeeming feature of the article, one which should serve to fuel the discerning reader’s skepticism: “[The study] showed people who ate a lot of processed meat were also more likely to smoke, be obese and have other behaviours known to damage health.” So we don’t know whether the early deaths they’re talking about are actually caused by the smoking habits of those eating said processed meat. Everyone knows about the severely health-damaging effects of smoking, so you can begin to see the problem with this study. You could just as validly write an article reporting the “Processed meat ‘smoking’ link”, where “Sausages, ham, bacon and other processed meats appear to increase the risk of smoking, a study of half a million people across Europe suggests.”
If you’re just about ready to debunk the study, as you should be, the researchers have a rebuttal. The article continues: “However, the researchers said even after those risk factors were accounted for, processed meat still damaged health.” Oh, OK, so it definitely was the processed meat then. Well no, not exactly. What about every other factor, lifestyle- and nutrition-related, that the researchers did not measure but that can impact the health of people who tend to eat more processed meat? Perhaps these processed meat eaters are more likely to wash their food down with a sugary drink? (There was still a weak association after adjusting for alcohol.) Or how about that white bread bun around the processed meat, or the fries that accompany it? What if processed meat eaters are more partial to jumping out of a plane than your average Joe? We’re looking at all-cause mortality, aren’t we? We already know they’re more likely to pop out for a ciggy after breakfast. Perhaps the kinds of people who eat more processed and red meat, despite the nutritional zeitgeist of the last 40 years being to reduce meat consumption, are the kinds of people who don’t care about their health as much as others. This conjecture wouldn’t constitute a great leap of faith. To this end, the study points out that there were no pernicious associations between poultry and health, which ties in nicely with the prevailing health-conscious mantra of eating low-fat, white meat. This is the healthy user bias, which the researchers tried to account for by adjusting for several variables (smoking, alcohol, etc.), but you can’t adjust for every factor, known and unknown. Here, you’ll also notice the patronising way the author introduces the reader to the scientists’ interpretation of the research.
“The researchers said” is a phrase reinforcing the idea that scientists are some infallible authority, and that what they discuss is shrouded in a mystic coat of scientific esotericism, beyond the intellectual analysis of the layperson. Placing “scientists” on an authoritative pedestal is part of the problem with this kind of reporting, as it quietly leaves readers to assume that they are under-equipped to interpret the data for themselves. Often, they are not under-equipped at all.
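Returning to the researchers’ adjustment for known risk factors, the weakness is easy to demonstrate with another invented sketch. Here smoking is measured and “adjusted for” by stratifying on it, but a second unhealthy habit is never recorded, and the spurious meat signal survives the adjustment intact:

```python
import random

random.seed(1)

# Invented model: smoking is recorded; a second unhealthy habit is not.
# Both raise processed-meat consumption and early death; meat is harmless.
rows = []
for _ in range(200_000):
    smokes = random.random() < 0.25
    hidden = random.random() < 0.3                    # never measured
    meat = random.random() < 0.15 + 0.3 * smokes + 0.3 * hidden
    dies = random.random() < 0.03 + 0.08 * smokes + 0.05 * hidden
    rows.append((smokes, meat, dies))

# "Adjust" for smoking by stratifying: compare meat vs no meat within
# smokers and within non-smokers separately.
rates = {}
for s in (False, True):
    stratum = [(m, d) for sm, m, d in rows if sm == s]
    meat_rate = sum(d for m, d in stratum if m) / sum(1 for m, _ in stratum if m)
    none_rate = sum(d for m, d in stratum if not m) / sum(1 for m, _ in stratum if not m)
    rates[s] = (meat_rate, none_rate)
    print(f"smoker={s}: death rate with meat {meat_rate:.3f}, without {none_rate:.3f}")
```

Even with smoking fully controlled, meat eaters still die more often in both strata, because the unmeasured habit remains lopsided between the groups. You can only adjust for what you thought to measure.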
The article goes on: “Dr Rachel Thompson, from the World Cancer Research Fund, said: ‘This research adds to the body of scientific evidence highlighting the health risks of eating processed meat’.” As a doctor, she should well know that the one thing this study cannot constitute is evidence. I might assume that the “body of scientific evidence” she’s referring to is as faulty as this one, which would seriously undermine the case against processed meat. After all, as the article points out, the Italians are one population that eats a huge amount of processed meat yet suffers few of the ill effects this article suggests. You might take this as “evidence” that the study is a load of [insert expletive]. Again, the point is, as with any prospective study like the one here, that we don’t know for sure, so we can’t draw any conclusions or make any legitimate recommendations.
I urge you, as the news reporter should, to read the actual study yourself. This is certainly not beyond the means of the average citizen (it’s written in English, after all). You’ll find that the reported increase in deaths was relatively minor, and (God forbid) maybe even make your own informed decision on the dangers of processed meat. Here’s the article again.
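To see why “relatively minor” matters, here is the arithmetic for turning a relative-risk headline into an absolute risk. The figures below are hypothetical, not the study’s own:

```python
# Hypothetical figures, not taken from the study:
baseline_risk = 0.02        # assumed risk of early death over the study period
relative_risk = 1.4         # a "40% increased risk" headline

exposed_risk = baseline_risk * relative_risk
absolute_increase = exposed_risk - baseline_risk

print(f"risk rises from {baseline_risk:.1%} to {exposed_risk:.1%}")
print(f"absolute increase: {absolute_increase:.1%}, "
      f"about 1 extra early death per {round(1 / absolute_increase)} people")
```

A 40% relative increase sounds alarming; eight extra deaths per thousand people, even taken entirely at face value, sounds rather different.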
So why can’t this type of study draw any firm conclusions? Because no variables have been isolated, you essentially can’t be sure what is causing what; too many things are influencing the situation. Theoretically, to work out whether eating processed meat truly leads to early death, you would have to study two identical populations consuming identical diets, except that one population would eat processed meat and the other an unprocessed variety as close in composition as possible. Ideally, these people wouldn’t be told about the difference in the diets, or what the experiment was for, and both populations would be consuming isocaloric and isometabolic diets (the same number of calories, and the same proportion of macronutrients). The problem is that this kind of study never happens, for obvious reasons: it is totally impractical. This is the major roadblock in nutrition research: the lack of proper, randomised controlled trials. All we have to go on are weak associations drawn from vast swathes of population data. Voilà, epidemiology. And look where that got us: nutrition science is a mess.
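The difference randomisation makes can be sketched with one more toy model (all numbers invented). The single change that matters is that exposure is now assigned by coin flip, independently of lifestyle:

```python
import random

random.seed(2)

# Invented model: a lifestyle factor drives early death; meat is harmless.
# Unlike the observational case, meat assignment is randomised.
counts = {True: [0, 0], False: [0, 0]}      # arm -> [deaths, participants]
for _ in range(100_000):
    careless = random.random() < 0.3
    assigned_meat = random.random() < 0.5    # coin flip, ignores lifestyle
    dies = random.random() < (0.15 if careless else 0.05)
    counts[assigned_meat][0] += dies
    counts[assigned_meat][1] += 1

rate = {arm: deaths / total for arm, (deaths, total) in counts.items()}
print(f"early-death rate, meat arm:    {rate[True]:.3f}")
print(f"early-death rate, no-meat arm: {rate[False]:.3f}")
```

The coin flip guarantees that every confounder, measured or not, is evenly split between the two arms, so any remaining difference can be pinned on the exposure itself. That guarantee is precisely what epidemiology lacks.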
I feel obliged to say that, of course, processed meat consumption may well lead to an early death. The point is that this study cannot, by its very design, show such a link. That is the very underwhelming truth behind these studies. So the next time you see an uncritical piece on epidemiology, remember the weakness of these studies, and remember it well.
Have your say, comments are always welcome.