When it comes to our health, the media wields enormous influence over what we think. They tell us what’s good, what’s bad, what’s right and wrong, what we should and shouldn’t eat. When you think about it, that’s quite some responsibility. But do you really think that a sense of philanthropic duty is the driving force behind most of the health ‘news’ stories that you read? Who are we kidding? It’s all about sales, of course, and all too often that means the science plays second fiddle. Who wants boring old science getting in the way of a sensation-making headline?
When it comes to research – especially the parts we’re interested in, namely food, diet and nutrients – there’s a snag. The thing is, these matters are rarely, if ever, clear-cut. Let’s say there are findings from some new research that suggest a component of our diet is good for our health. Now academics and scientists are generally a pretty cautious bunch – they respect the limitations of their work and don’t stretch their conclusions beyond their actual findings. Not that you’d think this when you hear about it in the media. News headlines are in your face and hard-hitting. Fluffy uncertainties just won’t cut it. An attention-grabbing headline is mandatory; relevance to the research is optional. Throw in a few random quotes from experts – as the author Peter McWilliams put it, the problem with ‘experts’ is you can always find one ‘who will say something hopelessly hopeless about anything’ – and boom! You’ve got the formula for some seriously media-friendly scientific sex appeal, or as we prefer to call it, ‘textual garbage’.
The reality is that a lot of the very good research into diet and health ends up lost in translation. Somewhere between its publication in a respected scientific journal and the moment it enters our brains via the media, the message gets a tweak here, a twist there and a dash of sensationalism thrown in for good measure, leaving us floundering in a sea of half-truths and misinformation. Most of it should come with the warning: ‘does nothing like it says in the print’. And don’t get us wrong: we’re not just talking about newspapers and magazines here; the problem runs much deeper. Even the so-called nutrition ‘experts’, the health gurus who sell books by the millions, are implicated. We’re saturated in health misinformation.
Quite frankly, many of us are sick of this contagion of nutritional nonsense. So, before launching headlong into the rest of the book, take a step back and see how research is actually conducted, what it all means and what to watch out for when the media deliver their less-than-perfect messages. Get your head around these and you’ll probably be able to make more sense of nutritional research than most of our cherished health ‘gurus’.
Rule #1: Humans are different from cells in a test tube
At the very basic level, researchers use in-vitro testing, in which they isolate cells or tissues of interest and study them outside a living organism in a kind of ‘chemical soup’. This allows substances of interest (for example, a vitamin or a component of food) to be added to the soup to see what happens. So they might, for example, add vitamin C to some cancer cells and observe its effect. We’re stating the obvious now when we say that what happens here is NOT the same as what happens inside human beings. First, the substance is added directly to the cells, so they are often exposed to concentrations far higher than would normally be seen in the body. Second, humans are highly complex organisms, with intricately interwoven systems of almost infinite processes and reactions. What goes on within a few cells in a test tube or Petri dish is a far cry from what would happen in the body. This type of research is an important part of science, but scientists know its place in the pecking order – as an indispensable starting point of scientific research. It can give us valuable clues about how stuff works deep inside us, what we might call the mechanisms, before going on to be more rigorously tested in animals, and ultimately, humans. But that’s all it is, a starting point.
Rule #2: Humans are different from animals
The next logical step usually involves animal testing. Studying the effects of a dietary component in a living organism, not just a bunch of cells, is a big step closer to what might happen in humans. Mice are often used, due to convenience, consistency, a short lifespan, fast reproduction rates and a genome and biology closely shared with humans. In fact, some pretty amazing stuff has been shown in mice. We can manipulate a hormone and extend life by as much as 30%1. We can increase muscle mass by 60% in two weeks. And we have shown that certain mice can even regrow damaged tissues and organs.
So, can we achieve all of that in humans? The answer is a big ‘no’ (unless you happen to believe the X-Men are real). Animal testing might be a move up from test tubes in the credibility ratings, but it’s still a long stretch from what happens in humans. You’d be pretty foolish to make a lot of wild claims based on animal studies alone.
To prove that, all we need to do is take a look at pharmaceutical drugs. Vast sums of money (we’re talking hundreds of millions) are spent trying to get a single drug to market. But the success rate is low. Of all the drugs that pass in-vitro and animal testing to make it into human testing, only 11% will prove to be safe and effective enough to hit the shelves5. For cancer drugs the rate of success is only 5%5. In 2003, the President of Research and Development at pharmaceutical giant Pfizer, John La Mattina, stated that ‘only one in 25 early candidates survives to become a prescribed medicine’. You don’t need to be a betting person to see these are seriously slim odds.
Strip it down and we can say that this sort of pre-clinical testing never, ever, constitutes evidence that a substance is safe and effective. These are research tools to try and find the best candidates to improve our health, which can then be rigorously tested for efficacy in humans. Alas, the media and our nutrition gurus don’t appear to care too much for this. Taking research carried out in labs and extrapolating the results to humans sounds like a lot more fun. In fact, it’s the very stuff of many a hard-hitting newspaper headline and bestselling health book. To put all of this into context, let’s take just one example of a classic media misinterpretation, and you’ll see what we mean.
Rule #3: Treat headlines with scepticism
Haven’t you heard? The humble curry is right up there in the oncology arsenal – a culinary delight capable of curing the big ‘C’. At least that’s what the papers have been telling us. ‘The Spice Of Life! Curry Fights Cancer’ decreed the New York Daily News. ‘How curry can help keep cancer at bay’ and ‘Curry is a “cure for cancer”’ reported the Daily Mail and The Sun in the UK. Could we be witnessing the medical breakthrough of the decade? Best we take a closer look at the actual science behind the headlines.
The spice turmeric, which gives some Indian dishes a distinctive yellow colour, contains relatively large quantities of curcumin, which has purported benefits in Alzheimer’s disease, infections, liver disease, inflammatory conditions and cancer. Impressive stuff. But there’s a hitch when it comes to curcumin. It has what is known as ‘poor bioavailability’. What that means is, even if you take large doses of curcumin, only tiny amounts of it get into your body, and what does get in is cleared quickly. From a curry, the amount absorbed is so minuscule that it is not even detectable in the body.
So what were those sensational headlines all about? If you had the time to track down the academic papers being referred to, you would see it was all early stage research. Two of the articles were actually referring to in-vitro studies (basically, tipping some curcumin onto cancer cells in a dish and seeing what effect it had).
Suffice it to say, this is hardly the same as what happens when you eat a curry. The other article referred to an animal study, in which mice with breast cancer were given a diet containing curcumin. Even allowing for the obvious differences between mice and humans, surely that was better evidence? The mice ate curcumin-containing food and absorbed enough for it to have a beneficial effect on their cancer. Sounds promising, until we see the mice had a diet that was 2% curcumin by weight. With the average person eating just over 2kg of food a day, 2% is a hefty 40g of curcumin. Then there’s the issue that the curcumin content of the turmeric powder used in a typical curry is a mere 2%, which means you’d need around 2kg of curry powder a day to match the mice. Now, whoever’s out there conjuring up a curry containing 2kg of curry powder, please don’t invite us over for dinner anytime soon.
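For anyone who wants to check our maths, here is the back-of-the-envelope calculation as a short Python sketch. The figures are the rough, illustrative ones quoted above (2kg of food a day, 2% curcumin in the mouse diet, 2% curcumin in curry powder), not precise dietary data:

```python
# Back-of-the-envelope check of the curry maths (illustrative figures only).
DAILY_FOOD_G = 2000          # a person eats roughly 2 kg of food per day
MOUSE_DIET_CURCUMIN = 0.02   # the mouse diet was 2% curcumin by weight
POWDER_CURCUMIN = 0.02       # curry powder is roughly 2% curcumin by weight

curcumin_needed_g = DAILY_FOOD_G * MOUSE_DIET_CURCUMIN   # 40 g of curcumin a day
powder_needed_g = curcumin_needed_g / POWDER_CURCUMIN    # 2,000 g of curry powder

print(f"Curcumin to match the mouse dose: {curcumin_needed_g:.0f} g/day")
print(f"Curry powder required: {powder_needed_g / 1000:.1f} kg/day")
```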
This isn’t a criticism of the science. Curcumin is a highly bio-active plant compound that could possibly be formulated into an effective medical treatment one day. This is exactly why these initial stages of research are being conducted. But take this basic-stage science, translate it into public health advice, and you can easily come up with some far-fetched conclusions. Let us proffer our own equally absurd headline: ‘Curry is a Cause of Cancer’. Abiding by the same rules of reporting used by the media, we’ve taken the same type of in-vitro and animal-testing evidence and conjured up a completely different headline. We can do this because some studies of curcumin have found that it actually causes damage to our DNA, and in so doing could potentially induce cancer.
Concerns have also been raised about diarrhoea, anaemia and interactions with drug-metabolizing enzymes. You see how easy it is to pick the bits you want in order to make your headline? Unfortunately, the problem is much bigger than just curcumin. It could just as easily be resveratrol from red wine, omega-3 from flaxseeds, or any number of other components of foods you care to mention that make headline news.
It’s rare to pick up a newspaper or nutrition book without seeing some new ‘superfood’ or nutritional supplement being promoted on the basis of less than rigorous evidence. The net result of this shambles is that the real science gets sucked into the media vortex and spat out in a mishmash of dumbed-down soundbites, while the nutritional messages we really should be taking more seriously get lost in a kaleidoscope of pseudoscientific claptrap, peddled by a media with about as much authority to advise on health as the owner of the local pâtisserie.
Rule #4: Know the difference between association and causation
If nothing else, we hope we have shown that jumping to conclusions based on laboratory experiments is unscientific, and probably won’t benefit your long-term health. To acquire proof, we need to carry out research that involves actual humans, and this is where one of the greatest crimes against scientific research is committed in the name of a good story, or to sell a product.
A lot of nutritional research comes in the form of epidemiological studies. These involve looking at populations of people and observing how much disease they get and seeing if it can be linked to a risk factor (for example, smoking) or some protective factor (for example, eating fruit and veggies). And one of the most spectacular ways to manipulate the scientific literature is to blur the boundary between ‘association’ and ‘causation’. This might all sound very academic, but it’s actually pretty simple.
Confusing association with causation means you can easily arrive at the wrong conclusion. For example, a far higher percentage of visually impaired people have Labradors compared to the rest of the population, so you might jump to the conclusion that Labradors cause sight problems. Of course we know better, that if you are visually impaired then you will probably have a Labrador as a guide dog. To think otherwise is ridiculous.
But apply the same scenario to the complex human body and it is not always so transparent. Consequently, much of the debate about diet and nutrition is of the ‘chicken versus egg’ variety. Is a low or high amount of a nutrient a cause of a disease, a consequence of the disease, or simply irrelevant?
To try and limit this confusion, researchers often use what’s known as a cohort study. Say you’re interested in studying the effects of diet on cancer risk. You’d begin by taking a large population that are free of the disease at the outset and collect detailed data on their diet. You’d then follow this population over time, let’s say ten years, and see how many people were diagnosed with cancer during this period. You could then start to analyse the relationship between people’s diet and their risk of cancer, and ask a whole lot of interesting questions. Did people who ate a lot of fruit and veggies have less cancer? Did eating a lot of red meat increase cancer? What effect did drinking alcohol have on cancer risk? And so on.
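To make that arithmetic concrete, here is a toy sketch in Python of the basic comparison such a study allows – disease rates in one dietary group versus another. Every number is invented purely for illustration:

```python
# Toy cohort-study arithmetic (all numbers invented for illustration).
# Compare cancer incidence in a hypothetical high-red-meat group with a
# hypothetical low-red-meat group, then take the ratio of the two rates.

high_meat_cases, high_meat_total = 150, 10_000  # hypothetical high-meat group
low_meat_cases, low_meat_total = 100, 10_000    # hypothetical low-meat group

incidence_high = high_meat_cases / high_meat_total
incidence_low = low_meat_cases / low_meat_total
relative_risk = incidence_high / incidence_low

print(f"Incidence, high-meat group: {incidence_high:.1%}")  # 1.5%
print(f"Incidence, low-meat group:  {incidence_low:.1%}")   # 1.0%
print(f"Relative risk: {relative_risk:.2f}")  # 1.50, i.e. 50% higher risk
```

A relative risk of 1.50 is an association, nothing more. Whether the meat actually caused the extra cancers is precisely the question this kind of study cannot settle on its own, as we’re about to see.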
The European Prospective Investigation into Cancer and Nutrition (EPIC), which we refer to often in this book, is an example of a powerfully designed cohort study, involving more than half a million people in ten countries. These studies are a gold mine of useful information because they help us piece together dietary factors that could influence our risk of disease.
But, however big and impressive these studies are, they’re still observational. As such, they can only show us associations; they cannot prove causality. So if we’re not careful about the way we interpret this kind of research, we run the risk of drawing some wacky conclusions, just like we did with the Labradors. Let’s get back to some more news headlines, like this one we spotted: ‘Every hour per day watching TV increases risk of heart disease death by a fifth’.
When it comes to observational studies, you have to ask whether the association makes sense. Does it have ‘biological plausibility’? Are there harmful rays coming from the TV that damage our arteries, or is it that the more time we spend on the couch watching TV, the less time we spend being active and improving our heart health? The latter is true, of course: there’s an ‘association’ between TV watching and heart disease, not ‘causation’.
So even with cohorts, the champions of the epidemiological studies, we can’t prove causation, and that’s all down to what’s called ‘confounding’. This means there could be another variable at play that causes the disease being studied, at the same time as being associated with the risk factor being investigated. In our example, it’s the lack of physical activity that increases heart disease and is also linked to watching more TV.
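If you’d like to see confounding in action, here is a minimal simulation sketch in Python. The numbers are invented and only the structure matters: physical activity drives both TV hours and heart-disease risk, while TV itself does nothing – yet TV and disease still come out strongly associated:

```python
# A minimal simulation of confounding (all numbers invented).
# Low physical activity both increases TV hours and raises heart-disease
# risk; TV itself has no effect in this model.
import random

random.seed(42)
population = []
for _ in range(100_000):
    active = random.random() < 0.5                   # half the population is active
    tv_hours = random.gauss(2 if active else 5, 1)   # inactive people watch more TV
    risk = 0.05 if active else 0.15                  # inactivity alone drives disease
    disease = random.random() < risk
    population.append((tv_hours, disease))

heavy = [d for tv, d in population if tv >= 4]       # heavy viewers (4+ hours)
light = [d for tv, d in population if tv < 4]        # light viewers
print(f"Disease rate, heavy viewers: {sum(heavy)/len(heavy):.1%}")
print(f"Disease rate, light viewers: {sum(light)/len(light):.1%}")
# Heavy viewers show more than double the disease rate, even though TV
# has no causal effect here: the association is pure confounding.
```

In this made-up world, a naive analyst would conclude that television damages the heart; the real culprit, lack of physical activity, never makes the headline.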
This issue of confounding variables is just about the biggest banana skin of the lot. Time and time again you’ll find nutritional advice promoted on the basis of the findings of observational studies, as though this type of research gives us stone cold facts. It doesn’t. Any scientist will tell you that. This type of research is extremely useful for generating hypotheses, but it can’t prove them.
Rule #5: Be on the lookout for RCTs (randomized controlled trials)
An epidemiological study can only generate a hypothesis, and when it offers up some encouraging findings, these then need to be tested in what’s known as an intervention, or clinical, trial before we can talk about causality. Intervention trials aim to test the hypothesis by taking a group of people who are as similar to each other as possible, applying an intervention to a proportion of them over a period of time and observing how it influences the measured outcome.