Set aside time and space for your own mental health

Work places huge demands on modern life. Over the generations those demands have only increased. Generations ago, the traditional job for most people meant a five-day working week; the song “9 to 5” by Dolly Parton more or less captured the essence of work at the time. (Unfortunately, it is still played often enough that people in non-Western societies assume we still work eight-hour days, five days a week, and spend our free time sunning ourselves on the beach.) Nowadays people work longer hours and travel further for work. The total time spent travelling and working each day can easily amount to twelve hours, and it is not as if the commute is downtime – we still have to catch up on emails and admin, typing away busily on the laptop. We could easily spend sixty hours a week doing work-related things.

And the weekends? Forget the weekends. These days there is no distinction between a weekday and a weekend. Work has steadily extended its talons: where an hourly-rated individual used to get 1.5 or two times the normal rate for working on a weekend, these days it is the same rate. Employers realise that in an economy with job shortages, they can get away with offering lower rates and still not be short of takers.

The problem with all this is that we don’t really have much of a choice when it comes to establishing our work boundaries, or exercising our rights when we realise we are being pushed beyond them. We’re made to feel that in these times we are lucky to hold down a job, and if we complain about its increasing demands, and how higher managers try to force more work on us without increasing our pay, we might be told to take a hike – leaving us in the more difficult situation of having no job, commitments to uphold, and having to start over. Lots of people are trapped in jobs where they have to take on more and more as the years go by, and have every ounce of work and every free hour extracted from them for little pay. This places increasing mental demands on the individual: not just coping with the work itself, but facing the possibility of being made redundant if he or she shows weakness by admitting an inability to cope any more. It is a no-win situation.

Is it a surprising statistic that mental illness is on the rise? Hardly.

Nowadays people are working more to live and living to work more.

What can you do to preserve some semblance of mental health?

The first thing you can do for yourself is to establish boundaries within the home. Establish a space where work does not intrude. A good choice is often the bedroom; at the very least, have a rule that you will not work on the bed. Working on your laptop in bed will do you no good – keep at least one physical space for yourself.

Also try to set aside a time each day for yourself if possible. An hour each day is possibly unrealistic in the modern climate, but twenty minutes to half an hour would be a good idea. Use this time to wind down in your personal space doing something you enjoy that is different from work. You may think you cannot really afford the time, but it is important to dissociate yourself from work for the sake of your long-term health. Think of it as enforced rest. If it works better for you, take your enforced rest in the middle of the working day. You don’t necessarily have to be doing something; use it to rest or catch a power nap.

Every now and again, such as on a weekend, do something different from work. Do a yoga class, learn an instrument like the piano, or play a game of tennis. The possibilities for leisure are endless. But don’t bring your work approach to your leisure. Don’t start charting your tennis serve percentage, or do anything that makes your leisure activity look like work in a different form. The only thing you must do with a businesslike approach is keep this leisure appointment, so that your life does not revolve around one continuous stretch of work.

We can moan about it, but the nature of work will never revert to how it was in the past. Those of us who long for the good old days will only make our own lives miserable with wishful thinking. Those of us who insist on working five-day weeks will find it insufficient to sustain modern living in the twenty-first century. We will all end up working longer and harder in the current economic climate, and even if times improve, employers are unlikely to return to pre-existing forms of remuneration once workers have been accustomed and conditioned to work at a certain level, because it is more cost-effective to hire fewer employees who do more work than to have the same work done by more employees. Employees have to recognise that adapting to increasing workloads is a working-life skill, and that taking steps to counter increasing pressures is an essential part of maintaining our own mental health and well-being.

Why mental health problems will never go away

Many people will experience mental health difficulties at some point in their lives. As people go through life the demands on them increase, and over a prolonged period these can cause difficulty and ill health. These problems can manifest themselves in both mental and physical ways.

What kind of demands do people experience? One of these can be work-related. People may experience the stresses of looking for work, having to work in jobs which do not test their skills, or being in occupations which require skills that are seemingly difficult to develop. Another common theme with adults is the stress of a job which increasingly demands more of them but does not remunerate them accordingly. In other words, they have to work more for less and accept a gradual lowering of work conditions, yet they cannot leave and start afresh – they have invested too many working years, and a mortgage to pay off and a young family to provide for mean they cannot start on a lower rung in a new occupation. Over a prolonged period, this can cause severe unhappiness.

Is it surprising that suicide particularly affects men in their thirties and forties? This is a period when work demands more of a man, the mortgage needs paying, and the family demands more of his time and energy. It is unsurprising that, having spent long periods in this sort of daily struggle, some men develop mental health problems which lead them to attempt suicide. But mental ill health does not just affect men. Among the struggles some women have to deal with are bringing up children, managing the work-life balance, the unfulfilled feeling of not using their skills, and isolation.

One of the ways mental ill health develops is when people are pushed too hard for too long. Put under these kinds of demands, the body shuts down as a self-preservation measure. But the demands on the person don’t just go away. You may want a break from work, but this may not be possible or practical. In fact, the lack of an escape when you are aware you need one may be an even greater trigger of mental illness, because it increases the feeling of being trapped.

It is little wonder, then, that when people go through periods of mental ill health, an enforced period of short-term rest allows them to reset their bearings and continue at work, or return to work with an appropriate level of support. But this is only temporary.

With mental ill health problems, lifestyle adjustments need to be made for sufficient recovery.

Under the Equality Act (2010), your employer has a legal duty to make “reasonable adjustments” to your work.

Mental ill health sufferers could ask about working flexibly, job sharing, or a quiet room, a government report suggests.

In practice, however, this means more cost to the employer in making adjustments to accommodate the employee, and unless the employee is a valued one whom the employer would like to keep, the case is often that they will be gradually phased out of the organisation.

In fact, when an employee attains a certain level of experience within an organisation, employers often ask more of them, because they know these employees are locked in to their jobs and have to accept the extra demands grudgingly, or risk losing their jobs – something they cannot do if they have dependents and financial commitments.
And you know the irony of it? The mental ill health sufferer already knows that. Which is why they don’t speak out for help in the first place.

If these employees complain, employers simply replace them with younger employees, who cost less and who are willing to take on more responsibilities just to have a job. Any responsibilities the redundant employee had simply get divided up between the remaining colleagues, who are in turn asked to take on more. They are next in line in the mental illness queue.

And what if you are self-employed, and have to work to support yourself and your dependents? The day-to-day demands are huge and don’t seem to go away.

You can see why mental health is perceived as a ticking time bomb. Organisations are not going to change to accommodate their employees, because of the cost; they keep pressing them to increase productivity without extra pay, knowing that they cannot say no, and when all the life and juice has been squeezed out of them, they can be chucked away and replaced with the next dispensable employee.

A ticking time bomb.

The financial considerations of investing in medicine and medical research

BBC News reports that a drug that would reduce the risk of HIV infection would result in cost savings of over £1bn over 80 years. Pre-exposure prophylaxis, or Prep, would reduce infection and hence lower the treatment costs for patients in the long term.

The catch? There is one. It’s the long term.

The cost of the treatment and prevention is such that its provision for the first twenty years – bundling together the cost of medical research and production of medicine – would result in a financial loss, and parity would only be achieved after a period of about thirty to forty years; this period is hard to define because it is dependent on what the drug would cost in the future.

Prep combines two anti-HIV drugs, emtricitabine and tenofovir. The medical trials behind it have concluded that it protects more than four in five men who have unprotected sex with men from HIV infection. The exact figure is close to 86%.

Prep can be used either on a daily basis, or on what has been termed a sexual event basis – using it for two days before, during and after periods of unprotected sex.

The research model analysed the potential impact of Prep and found that it could reduce infection rates by over a quarter. The cost of the treatment itself, compared with the cost of treating infection, would result in a saving of over one billion pounds over 80 years.

However, it does raise a few ethical questions. If the National Health Service is aiming to be a sustainable one – and one of the aims of sustainability is to empower citizens to take responsibility for their own health – shouldn’t it be thinking less about how it will balance the books and spending more on education for prevention in the first place? The cost of providing Prep on the NHS would be £19.6 billion over 80 years, while the estimated savings from treatment would be £20.6 billion over the same period. Educating people not to have unprotected sex with those at risk of HIV would arguably result in a higher saving over a shorter period. Perhaps the NHS should consider ways of reducing cost more significantly, rather than latching on to a cheaper prevention drug immediately. If consumer behaviour is not going to change, symptoms are still going to surface, and the provision of Prep on the NHS may only encourage less self-regulation and awareness.
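To see how the headline £1bn figure falls out of those numbers, here is a minimal back-of-the-envelope sketch; the two £bn figures are the ones quoted above, while spreading them evenly per year is our own simplifying assumption, since in reality the spend is front-loaded, which is why parity only arrives after thirty to forty years.

```python
# Headline figures quoted above, in £bn over 80 years.
prep_cost_bn = 19.6   # cost of providing Prep on the NHS
savings_bn = 20.6     # estimated savings on HIV treatment
years = 80

net_saving_bn = savings_bn - prep_cost_bn
print(f"Net saving over {years} years: £{net_saving_bn:.1f}bn")  # £1.0bn

# Simplifying assumption: an even per-year spread (the real spend is
# front-loaded, so the early decades run at a loss).
print(f"Average net saving per year: £{net_saving_bn / years * 1000:.1f}m")
```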

Pressures faced by the NHS

The good news is that every day, the vast majority of the 63.7 million people eligible to use the NHS don’t need to. And every day you don’t need to use the NHS, someone else benefits. Most people are very capable of looking after themselves most of the time, self-treating for minor ailments and knowing when and where to seek help for serious illness. Ninety per cent of symptoms are self-treated, but an estimated fifty-two million visits to general practice each year are still for conditions that would get better with time. Self-care is likely to improve further when those who want it are given access to and control over their medical records and data, and technology is better used to direct you to the right information when you need it. In the meantime, a friendly pharmacist can often save you a trip to the GP.

The bad news is that demand in many areas outstrips both the supply of services and the funding for them. Patients who need care are having to wait longer for it, and too many referrals are rejected as not urgent enough, when the NHS should be doing its utmost to prevent emergencies.

There is a very, very big mental illness iceberg out there and it’s showing no signs of melting.

Life is tough enough for NHS staff, but imagine what it’s like for children and their carers who can’t get any care at all. The pattern of services struggling – or simply not being able to cope safely – with the demands placed on staff is common across the NHS. Waiting times are creeping up, emergency departments are overflowing, people struggle to get a GP appointment, services are being restricted and rationed, and lots of people are having to fend for themselves. The technology and choices patients now face can be very complex, but the strength of the NHS lies in its humanity and the ethos that as a society we should pool our resources to care for and protect the most vulnerable.

The NHS is nearly always buckling under the demands placed on it, partly because it’s a victim of its own success. Improvements in public health, wealth and healthcare since the NHS was founded sixty-seven years ago have been stunning. In 1948, half the population died before the age of sixty-five. Now, the average life expectancy is over eighty. One in three children born today will live to one hundred, but one in four boys born in Glasgow still won’t make it to sixty-five. The UK is still a very unequal society, and the rich not only live fifteen years longer than the poor, but they have up to twenty years more healthy living. Life is very, very unfair, which is why we need to fight poverty and build the confidence, courage and resilience in our children to live well, as well as improve and fund the NHS. Those who pay for it most often use it least. It’s the badge of honour for living in a humane society.

And we nearly all need it eventually if we want help or treatment. One in two people in the UK will get cancer, one in three will get diabetes and nearly everyone will get heart disease. Many of these diseases will be contained rather than cured. Obesity appears unstoppable. Liver disease, kidney disease, lung disease, musculoskeletal disease, depression and anxiety are all on the increase. Mental illnesses cost the UK over £70 billion a year, one in three of us experiences mental health problems every year and one in three people over sixty-five will die with dementia. Many people with dementia live for many years, even if they haven’t been diagnosed and treated. Dementia alone already costs the economy more than cancer and heart disease put together.

These chronic diseases account for 70 per cent of the NHS budget, although many can be delayed if not prevented by a healthier lifestyle. Those with three or more incurable diseases are usually on multiple medications, the combined effects of which are unknown. Many older patients on multiple drugs struggle to take them properly, and there’s a delicate balance between benefit and harm. Loneliness is often a far bigger problem.

The NHS and social care system is crucially dependent on millions of unpaid volunteers and carers, and many very dedicated but poorly paid care workers. The round-the-clock pressures and responsibilities they face are huge. If carers went on strike, the NHS and social care service would collapse overnight. Keeping it all afloat is a massive, collaborative effort and we are far too reliant on institutionalized care, rather than supporting people in their homes.

More women give birth in hospital than need or want to be there, so those who really need to have hospital births don’t always get safe care. Far too many frail elderly patients, many with dementia, end up in acute hospitals, often the most frightening and disorientating place they can be. Far too many people with mental illness end up in police custody and far too many people die in hospital when they want to die at home. We can change this, if services join up, and patients and carers receive the right training and support. Having chemotherapy or dialysis at home can transform not just your healthcare but your whole life. It doesn’t happen nearly enough.

Fixing the NHS and social care system will not be quick or easy, even if we put more money in. In many instances, it would often be kinder to have less high-tech, expensive intervention than more. If all we ever did in the NHS was capture the ideas and feedback from frontline staff, patients and carers, and use it to continuously improve a stable system that everyone understood, the NHS would be out of sight as the world’s best. We have to spend every available penny supporting and joining up the frontline – the NHS is not about the bricks and mortar, it’s about mobilizing and motivating a brilliant workforce to serve patients and give you as much control as you want over your care. And to do that, you need to find your voice and we need to listen to you.

Research done by the Health Foundation, When Doctors and Patients Talk, found that NHS staff are often as anxious and fearful as you are during consultations. They are anxious and frightened of missing an important diagnosis, not being able to give patients what they are entitled to, not being able to practise the standards of care they’d like to, having to deal with justifiable anger, missing a target they have been ordered to hit, being asked to do something they do not feel competent to do, or having to look after so many patients in such a short space of time they just do not feel safe. The ever-present fear is that they simply cannot cope safely with the demand. Just as we shouldn’t blame people for being ill or old or overweight, we shouldn’t blame NHS staff for not being able to always provide the highest standards of care. Praise, kindness and understanding are much better motivators.

And there’s plenty to be thankful for. The Commonwealth Fund in America compares the health systems in eleven countries and ranks them according to eleven measures: quality of care, effective care, safe care, coordinated care, patient-centred care, access, cost-related problems, timeliness of care, efficiency, equity, and healthy lives. You might expect Austria, Canada, France, Germany, the Netherlands, New Zealand, Norway, Sweden, Switzerland or America to thrash us. In fact, the 2014 ranking (based on 2011 data) puts the UK top of the healthcare table overall, and first in eight of the eleven categories. It came second in equity to Sweden, and third behind Switzerland and the Netherlands for timeliness of care. The NHS is far from perfect, but we should celebrate and publicize the amazing care it often gives, try to improve the good care it usually gives and quickly address the poor care it occasionally gives, so further harm is prevented.

To improve, the NHS needs to be simplified so that anyone can understand it. We pretend to distinguish between healthcare and social care, but it’s all ‘care’, and it should be joined into one care system, with those with the greatest need treated by one team with one named person responsible for coordinating your care. And we must all do everything we can to live well. In the NHS, the staff spend so much time diving into the river of illness, pulling people out and trying to put them back together, that no-one has time to wander upstream and look at who’s pushing them in.

Media’s Marvellous Medicine

When it comes to our health, the media wields enormous influence over what we think. They tell us what’s good, what’s bad, what’s right and wrong, what we should and shouldn’t eat. When you think about it, that’s quite some responsibility. But do you really think that a sense of philanthropic duty is the driving force behind most of the health ‘news’ stories that you read? Who are we kidding? It’s all about sales, of course, and all too often that means the science plays second fiddle. Who wants boring old science getting in the way of a sensation-making headline?

When it comes to research – especially the parts we’re interested in, namely food, diet and nutrients – there’s a snag. The thing is, these matters are rarely, if ever, clear-cut. Let’s say there are findings from some new research that suggest a component of our diet is good for our health. Now academics and scientists are generally a pretty cautious bunch – they respect the limitations of their work and don’t stretch their conclusions beyond their actual findings. Not that you’ll think this when you hear about it in the media. News headlines are in your face and hard hitting. Fluffy uncertainties just won’t cut it. An attention-grabbing headline is mandatory; relevance to the research is optional. Throw in a few random quotes from experts – as the author Peter McWilliams stated, the problem with ‘experts’ is you can always find one ‘who will say something hopelessly hopeless about anything’ – and boom! You’ve got the formula for some seriously media-friendly scientific sex appeal, or as we prefer to call it, ‘textual garbage’. The reality is that a lot of the very good research into diet and health ends up lost in translation. Somewhere between its publication in a respected scientific journal and the moment it enters our brains via the media, the message gets a tweak here, a twist there and a dash of sensationalism thrown in for good measure, which leaves us floundering in a sea of half-truths and misinformation. Most of it should come with the warning: ‘does nothing like it says in the print’. Don’t get us wrong: we’re not just talking about newspapers and magazines here, the problem runs much deeper. Even the so-called nutrition ‘experts’, the health gurus who sell books by the millions, are implicated. We’re saturated in health misinformation.

Quite frankly, many of us are sick of this contagion of nutritional nonsense. So, before launching headlong into the rest of the book, take a step back and see how research is actually conducted, what it all means and what to watch out for when the media deliver their less-than-perfect messages. Get your head around these and you’ll probably be able to make more sense of nutritional research than most of our cherished health ‘gurus’.

Rule #1: Humans are different from cells in a test tube
At the very basic level, researchers use in-vitro testing, in which they isolate cells or tissues of interest and study them outside a living organism in a kind of ‘chemical soup’. This allows substances of interest (for example, a vitamin or a component of food) to be added to the soup to see what happens. So they might, for example, add vitamin C to some cancer cells and observe its effect. We’re stating the obvious now when we say that what happens here is NOT the same as what happens inside human beings. First, the substance is added directly to the cells, so they are often exposed to concentrations far higher than would normally be seen in the body. Second, humans are highly complex organisms, with intricately interwoven systems of almost infinite processes and reactions. What goes on within a few cells in a test tube or Petri dish is a far cry from what would happen in the body. This type of research is an important part of science, but scientists know its place in the pecking order – as an indispensable starting point of scientific research. It can give us valuable clues about how stuff works deep inside us, what we might call the mechanisms, before going on to be more rigorously tested in animals, and ultimately, humans. But that’s all it is, a starting point.

Rule #2: Humans are different from animals
The next logical step usually involves animal testing. Studying the effects of a dietary component in a living organism, not just a bunch of cells, is a big step closer to what might happen in humans. Mice are often used, due to convenience, consistency, a short lifespan, fast reproduction rates and a closely shared genome and biology with humans. In fact, some pretty amazing stuff has been shown in mice. We can manipulate a hormone and extend life by as much as 30%. We can increase muscle mass by 60% in two weeks. And we have shown that certain mice can even regrow damaged tissues and organs.

So, can we achieve all of that in humans? The answer is a big ‘no’ (unless you happen to believe the X-Men are real). Animal testing might be a move up from test tubes in the credibility ratings, but it’s still a long stretch from what happens in humans. You’d be pretty foolish to make a lot of wild claims based on animal studies alone.

To prove that, all we need to do is take a look at pharmaceutical drugs. Vast sums of money (we’re talking hundreds of millions) are spent trying to get a single drug to market. But the success rate is low. Of all the drugs that pass in-vitro and animal testing to make it into human testing, only 11% will prove to be safe and effective enough to hit the shelves. For cancer drugs the rate of success is only 5%. In 2003, the President of Research and Development at pharmaceutical giant Pfizer, John La Mattina, stated that ‘only one in 25 early candidates survives to become a prescribed medicine’. You don’t need to be a betting person to see these are seriously slim odds.

Strip it down and we can say that this sort of pre-clinical testing never, ever, constitutes evidence that a substance is safe and effective. These are research tools to try and find the best candidates to improve our health, which can then be rigorously tested for efficacy in humans. Alas, the media and our nutrition gurus don’t appear to care too much for this. Taking research carried out in labs and extrapolating the results to humans sounds like a lot more fun. In fact, it’s the very stuff of many a hard-hitting newspaper headline and bestselling health book. To put all of this into context, let’s take just one example of a classic media misinterpretation, and you’ll see what we mean.

Rule #3: Treat headlines with scepticism
Haven’t you heard? The humble curry is right up there in the oncology arsenal – a culinary delight capable of curing the big ‘C’. At least that’s what the papers have been telling us. ‘The Spice Of Life! Curry Fights Cancer’ decreed the New York Daily News. ‘How curry can help keep cancer at bay’ and ‘Curry is a “cure for cancer”’ reported the Daily Mail and The Sun in the UK. Could we be witnessing the medical breakthrough of the decade? Best we take a closer look at the actual science behind the headlines.

The spice turmeric, which gives some Indian dishes a distinctive yellow colour, contains relatively large quantities of curcumin, which has purported benefits in Alzheimer’s disease, infections, liver disease, inflammatory conditions and cancer. Impressive stuff. But there’s a hitch when it comes to curcumin. It has what is known as ‘poor bioavailability’. What that means is, even if you take large doses of curcumin, only tiny amounts of it get into your body, and what does get in is got rid of quickly. From a curry, the amount absorbed is so minuscule that it is not even detectable in the body.

So what were those sensational headlines all about? If you had the time to track down the academic papers being referred to, you would see it was all early stage research. Two of the articles were actually referring to in-vitro studies (basically, tipping some curcumin onto cancer cells in a dish and seeing what effect it had).

Suffice to say, this is hardly the same as what happens when you eat a curry. The other article referred to an animal study, where mice with breast cancer were given a diet containing curcumin. Even allowing for the obvious differences between mice and humans, surely that was better evidence? The mice ate curcumin-containing food and absorbed enough for it to have a beneficial effect on their cancer. Sounds promising, until we see the mice had a diet that was 2% curcumin by weight. With the average person eating just over 2kg of food a day, 2% is a hefty 40g of curcumin. Then there’s the issue that the curcumin content of the average curry/turmeric powder used in curry is a mere 2%. Now, whoever’s out there conjuring up a curry containing 2kg of curry powder, please don’t invite us over for dinner anytime soon.
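As a quick sanity check on that arithmetic, here is the calculation spelled out, using only the figures quoted above:

```python
# Figures quoted above.
daily_food_g = 2000           # average person eats just over 2kg of food a day
diet_curcumin_frac = 0.02     # the mice's diet was 2% curcumin by weight
powder_curcumin_frac = 0.02   # curry/turmeric powder is roughly 2% curcumin

curcumin_needed_g = daily_food_g * diet_curcumin_frac        # 40g of curcumin
powder_needed_g = curcumin_needed_g / powder_curcumin_frac   # 2,000g of powder

print(f"Curcumin needed per day: {curcumin_needed_g:.0f}g")      # 40g
print(f"Curry powder required: {powder_needed_g / 1000:.0f}kg")  # 2kg
```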

This isn’t a criticism of the science. Curcumin is a highly bio-active plant compound that could possibly be formulated into an effective medical treatment one day. This is exactly why these initial stages of research are being conducted. But take this basic stage science and start translating it into public health advice and you can easily come up with some far-fetched conclusions. Let us proffer our own equally absurd headline: ‘Curry is a Cause of Cancer’. Abiding by the same rules of reporting used by the media, we’ve taken the same type of in-vitro and animal-testing evidence and conjured up a completely different headline. We can do this because some studies of curcumin have found that it actually causes damage to our DNA, and in so doing could potentially induce cancer.

As well as this, concerns about diarrhoea, anaemia and interactions with drug-metabolizing enzymes have also been raised. You see how easy it is to pick the bits you want in order to make your headline? Unfortunately, the problem is much bigger than just curcumin. It could just as easily be resveratrol from red wine, omega-3 from flaxseeds, or any number of other components of foods you care to mention that make headline news.

It’s rare to pick up a newspaper or nutrition book without seeing some new ‘superfood’ or nutritional supplement being promoted on the basis of less than rigorous evidence. The net result of this shambles is that the real science gets sucked into the media vortex and spat out in a mishmash of dumbed-down soundbites, while the nutritional messages we really should be taking more seriously get lost in a kaleidoscope of pseudoscientific claptrap, peddled by a media with about as much authority to advise on health as the owner of the local pâtisserie.

Rule #4: Know the difference between association and causation
If nothing else, we hope we have shown that jumping to conclusions based on laboratory experiments is unscientific, and probably won’t benefit your long-term health. To acquire proof, we need to carry out research that involves actual humans, and this is where one of the greatest crimes against scientific research is committed in the name of a good story, or to sell a product.

A lot of nutritional research comes in the form of epidemiological studies. These involve looking at populations of people and observing how much disease they get and seeing if it can be linked to a risk factor (for example, smoking) or some protective factor (for example, eating fruit and veggies). And one of the most spectacular ways to manipulate the scientific literature is to blur the boundary between ‘association’ and ‘causation’. This might all sound very academic, but it’s actually pretty simple.

Confusing association with causation means you can easily arrive at the wrong conclusion. For example, a far higher percentage of visually impaired people have Labradors compared to the rest of the population, so you might jump to the conclusion that Labradors cause sight problems. Of course we know better, that if you are visually impaired then you will probably have a Labrador as a guide dog. To think otherwise is ridiculous.

But apply the same scenario to the complex human body and it is not always so transparent. Consequently, much of the debate about diet and nutrition is of the ‘chicken versus egg’ variety. Is a low or high amount of a nutrient a cause of a disease, a consequence of the disease, or simply irrelevant?

To try and limit this confusion, researchers often use what’s known as a cohort study. Say you’re interested in studying the effects of diet on cancer risk. You’d begin by taking a large population that are free of the disease at the outset and collect detailed data on their diet. You’d then follow this population over time, let’s say ten years, and see how many people were diagnosed with cancer during this period. You could then start to analyse the relationship between people’s diet and their risk of cancer, and ask a whole lot of interesting questions. Did people who ate a lot of fruit and veggies have less cancer? Did eating a lot of red meat increase cancer? What effect did drinking alcohol have on cancer risk? And so on.

The European Prospective Investigation into Cancer and Nutrition (EPIC), which we refer to often in this book, is an example of a powerfully designed cohort study, involving more than half a million people in ten countries. These studies are a gold mine of useful information because they help us piece together dietary factors that could influence our risk of disease.

But, however big and impressive these studies are, they’re still observational. As such they can only show us associations, they cannot prove causality. So if we’re not careful about the way we interpret this kind of research, we run the risk of drawing some whacky conclusions, just like we did with the Labradors. Let’s get back to some more news headlines, like this one we spotted: ‘Every hour per day watching TV increases risk of heart disease death by a fifth’.

When it comes to observational studies, you have to ask whether the association makes sense. Does it have ‘biological plausibility’? Are there harmful rays coming from the TV that damage our arteries, or is it that the more time we spend on the couch watching TV, the less time we spend being active and improving our heart health? The latter is true, of course, and there’s an ‘association’ between TV watching and heart disease, not ‘causation’.

So even with cohorts, the champions of the epidemiological studies, we can’t prove causation, and that’s all down to what’s called ‘confounding’. This means there could be another variable at play that causes the disease being studied, at the same time as being associated with the risk factor being investigated. In our example, it’s the lack of physical activity that increases heart disease and is also linked to watching more TV.

This issue of confounding variables is just about the biggest banana skin of the lot. Time and time again you’ll find nutritional advice promoted on the basis of the findings of observational studies, as though this type of research gives us stone cold facts. It doesn’t. Any scientist will tell you that. This type of research is extremely useful for generating hypotheses, but it can’t prove them.
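To see how a confounder manufactures an association out of thin air, here is a small illustrative simulation. The numbers are entirely made up: physical activity drives both TV hours and heart risk, while TV itself has no effect at all.

```python
import random

random.seed(1)
n = 10_000

tv_hours, risks = [], []
for _ in range(n):
    activity = random.uniform(0, 10)  # hours of exercise per week
    # Less active people watch more TV (plus some noise)...
    tv = max(0.0, 8 - 0.6 * activity + random.gauss(0, 1))
    # ...and heart risk depends ONLY on activity, never on TV.
    risk = 0.3 - 0.02 * activity + random.gauss(0, 0.02)
    tv_hours.append(tv)
    risks.append(risk)

# Naive observational comparison: heavy vs light TV watchers.
heavy = [r for t, r in zip(tv_hours, risks) if t > 5]
light = [r for t, r in zip(tv_hours, risks) if t <= 5]
print(f"Mean risk, heavy watchers: {sum(heavy) / len(heavy):.3f}")
print(f"Mean risk, light watchers: {sum(light) / len(light):.3f}")
# Heavy watchers come out riskier even though TV never entered the
# risk formula: a textbook association without causation.
```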

Rule #5: Be on the lookout for RCTs (randomized controlled trials)
An epidemiological study can only form a hypothesis, and when it offers up some encouraging findings, these then need to be tested in what’s known as an intervention, or clinical, trial before we can talk about causality. Intervention trials aim to test the hypothesis by taking a population that are as similar to each other as possible, testing an intervention on a proportion of them over a period of time and observing how it influences your measured outcome.
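Continuing the made-up TV example from above, here is a minimal sketch of why randomized assignment is what lets a trial speak to causality: allocating the intervention at random balances hidden traits such as activity across the groups, so any difference in outcome can be pinned on the intervention itself.

```python
import random

random.seed(2)
n = 10_000

# Each participant has a hidden trait (activity) we cannot fully measure.
participants = [random.uniform(0, 10) for _ in range(n)]

# Random assignment to intervention or control.
random.shuffle(participants)
intervention, control = participants[: n // 2], participants[n // 2:]

def mean(group):
    return sum(group) / len(group)

print(f"Mean activity, intervention: {mean(intervention):.2f}")
print(f"Mean activity, control:      {mean(control):.2f}")
# The two means come out nearly identical, so the would-be confounder
# no longer tracks group membership: any difference in the measured
# outcome now reflects the intervention, not a lurking variable.
```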