Why mental health problems will never go away

Many people will experience mental health difficulties at some point in their lives. As people go through life the demands on them increase, and over a prolonged period these can cause difficulty and ill health. These problems can manifest in both mental and physical ways.

What kind of demands do people experience? One of these can be work-related. People may experience the stresses of looking for work, having to work in jobs which do not stretch their skills, or being involved in occupations which require skills that are seemingly difficult to develop. Another common source of stress for adults is having to work in a job which increasingly demands more of them but does not remunerate them accordingly. In other words, they have to work more for less and accept the gradual erosion of their working conditions, yet they cannot leave and start afresh: they have invested years in the job, and a mortgage to pay off and a young family to provide for mean they cannot start on a lower rung in a new occupation. Over a prolonged period, this can cause severe unhappiness.

Is it surprising, then, that suicide particularly affects men in their thirties and forties? This is a period when work demands more of a man, the mortgage needs paying, and the family demands more of his time and energy. It is unsurprising that, having spent long periods in this sort of daily struggle, some men develop mental health problems which lead them to attempt suicide. But mental ill health does not just affect men. Among the things some women have to deal with are the struggles of bringing up children, the work-life balance, the unfulfilled feeling of not utilising their skills, and isolation.

One of the ways mental ill health develops is when people are pushed too hard for too long. Put under these kinds of demands, the body shuts down as a self-preservation measure. But the demands on the person don’t just go away. You may want a break from work, but this may not be possible or practical. In fact, the lack of an escape when you are aware you need one may be a greater trigger of mental illness, because it increases the feeling of being trapped.

It is little wonder that when people go through periods of mental ill health, an enforced period of short-term rest can allow them to reset their bearings and continue at work, or return to work with some level of appropriate support. But this is only temporary.

With mental ill health problems, lifestyle adjustments need to be made for sufficient recovery.

Under the Equality Act 2010, your employer has a legal duty to make “reasonable adjustments” to your work.

Mental ill health sufferers could ask about working flexibly, job sharing, or a quiet room, a government report suggests.

In practice, however, these adjustments cost the employer money, and unless the employee is a valued one whom the employer would like to keep, the reality is that they will often be gradually phased out of the organisation.

In fact, when an employee attains a certain level of experience within an organisation, employers often ask more of them, knowing these employees are locked in to their jobs and must accept the extra demands grudgingly or risk losing them, which they cannot afford to do if they have dependents and financial commitments.
And you know the irony of it? The mental ill health sufferer already knows that. Which is why they don’t speak out for help in the first place.

If these employees complain, employers simply replace them with younger employees, who cost less and who are willing to take on more responsibilities just to have a job. Any responsibilities the redundant employee had simply get divided up between their remaining colleagues, who are in turn asked to take on more. They are next in line in the mental ill health queue.

And what if you are self-employed, and have to work to support yourself and your dependents? The demands of the day-to-day are huge and don’t seem to go away.

You can see why mental health is perceived as a ticking time bomb. Organisations are not going to change to accommodate their employees, because of cost, but will keep pressing them to increase productivity without extra pay, knowing that they cannot say no; and when all the life and juice has been squeezed out of them, they can be chucked away and replaced with the next dispensable employee.

A ticking time bomb.

The role of pharmacy in healthcare

Pharmacists are experts on the actions and uses of drugs, including their chemistry, their formulation into medicines and the ways in which they are used to manage diseases. The principal aim of the pharmacist is to use this expertise to improve patient care. Pharmacists are in close contact with patients and so have an important role both in assisting patients to make the best use of their prescribed medicines and in advising patients on the appropriate self-management of self-limiting and minor conditions. Increasingly this latter aspect includes over-the-counter (OTC) prescribing of effective and potent treatments. Pharmacists also work closely with other members of the healthcare team – doctors, nurses, dentists and others – and are able to give advice on a wide range of issues surrounding the use of medicines.

Pharmacists are employed in many different areas of practice. These include the traditional ones of hospital and community practice as well as more recently introduced advisory roles at health authority/ health board level and working directly with general practitioners as part of the core, practice-based primary healthcare team. Additionally, pharmacists are employed in the pharmaceutical industry and in academia.

Members of the general public are most likely to meet pharmacists in high street pharmacies or on a hospital ward. However, pharmacists also visit residential homes (see Ch. 49), make visits to patients’ own homes and are now involved in running chronic disease clinics in primary and secondary care. In addition, pharmacists contribute to the care of patients through their dealings with other members of the healthcare team in the hospital and community setting.

Historically, pharmacists and general practitioners have a common ancestry as apothecaries. Apothecaries both dispensed medicines prescribed by physicians and recommended medicines for those members of the public unable to afford physicians’ fees. As the two professions of pharmacy and general practice emerged, this remit split, so that pharmacists became primarily responsible for the technical, dispensing aspects of the role. With the advent of the NHS in the UK in 1948, and the philosophy of free medical care at the point of delivery, the advisory function of the pharmacist further decreased. As a result, pharmacists spent more of their time dispensing medicines – and derived an increased proportion of their income from it. At the same time, radical changes in the nature of dispensing itself occurred, as described in the following paragraphs.

In the early years, many prescriptions were for extemporaneously prepared medicines, either following standard ‘recipes’ from formularies such as the British Pharmacopoeia (BP) or British Pharmaceutical Codex (BPC), or following individual recipes written by the prescriber (see Ch. 30). The situation was similar in hospital pharmacy, where most prescriptions were prepared on an individual basis, alongside some small-scale manufacture of a range of commonly used items. In both situations, producing the medicines demanded manipulative skill and considerable time from the pharmacist. Thus a wide range of preparations was made, including liquids for internal and external use, ointments, creams, poultices, plasters, eye drops and ointments, injections and solid dosage forms such as pills, capsules and moulded tablets (see Chs 32–39). Scientific advances have greatly increased the effectiveness of drugs but have also rendered them more complex, potentially more toxic and requiring more sophisticated use than their predecessors. The pharmaceutical industry developed in tandem with these drug developments, contributing to further scientific advances and producing manufactured medicinal products. This had a number of advantages. For one thing, there was increased reliability in the product, which could be subjected to suitable quality assessment and assurance. This led to improved formulations, modifications to drug availability and increased use of tablets, which offer greater convenience for the patient. Some doctors did not agree with the loss of flexibility in prescribing which resulted from having to use predetermined doses and combinations of materials. From the pharmacist’s point of view there was a reduction in the time spent in the routine extemporaneous production of medicines, which many saw as an advantage. Others saw it as a reduction in the mystique associated with the professional role of the pharmacist. There was also an erosion of the technical skill base of the pharmacist. A look through copies of the BPC from the 1950s, 1960s and 1970s will show the reduction in the number and diversity of formulations included in the Formulary section; that section has been omitted from the most recent editions. However, some extemporaneous dispensing is still required, and pharmacists remain the only professionals trained in these skills.

The changing patterns of work of the pharmacist, in community pharmacy in particular, led to an uncertainty about the future role of the pharmacist and a general consensus that pharmacists were no longer being utilized to their full potential. If the pharmacist was not required to compound medicines or to give general advice on diseases, what was the pharmacist to do?

The need to review the future for pharmacy was first formally recognized in 1979 in a report on the NHS which had the remit to consider the best use and management of its financial and manpower resources. This was followed by a succession of key reports and papers, which repeatedly identified the need to exploit the pharmacist’s expertise and knowledge to better effect. Key among these reports was the Nuffield Report of 1986. This report, which included nearly 100 recommendations, led the way to many new initiatives, both by the profession and by the government, and laid the foundation for the recent developments in the practice of pharmacy, which are reflected in this book.

Radical change, as recommended in the Nuffield Report, does not necessarily happen quickly, particularly when regulations and statute are involved. In the 28 years since Nuffield was published, several different agendas have come together and between them facilitated the paradigm shift for pharmacy envisaged in the report. These agendas are briefly described below. They have finally resulted in extensive professional change, articulated in the definitive statements about the role of pharmacy in the NHS plans for pharmacy in England (2000), Scotland (2001) and Wales (2002) and the subsequent new contractual frameworks for community pharmacy. In addition, other regulatory changes have occurred as part of government policy to increase convenient public access to a wider range of medicines on the NHS (see Ch. 4). These changes reflect general societal trends to deregulate the professions while keeping in place a framework to ensure safe practice, and a recognition that the public are increasingly well informed through widespread access to the internet. For pharmacy, therefore, two routes for the supply of prescription only medicines (POMs) have opened up. Until recently, POMs were available only on the prescription of a doctor or dentist, but as a result of the Crown Review in 1999, two significant changes emerged.

First, patient group directions (PGDs) were introduced in 2000. A PGD is a written direction for the supply, or supply and administration, of a POM to persons generally, by named groups of professionals. So, for example, under a PGD, community pharmacists can supply a specific POM antibiotic to people with a confirmed diagnosis of infection, e.g. azithromycin for Chlamydia.

Second, prescribing rights for pharmacists, alongside nurses and some other healthcare professionals, have been introduced, initially as supplementary prescribers and, more recently, as independent prescribers.

The council of the Royal Pharmaceutical Society of Great Britain (RPSGB) decided that it was necessary to allow all members to contribute to a radical appraisal of the profession, what it should be doing and how to achieve it. The ‘Pharmacy in a New Age’ consultation was launched in October 1995, with an invitation to all members to contribute their views to the council. These were combined into a subsequent document produced by the council in September 1996 called Pharmacy in a New Age: The New Horizon. This indicated that there was overwhelming agreement among pharmacists that the profession could not stand still.

The main output of this professional review was a commitment to take forward a more proactive, patient-centred clinical role for pharmacy using pharmacists’ skills and knowledge to best effect.

Why Asians are more prone to Type 2 diabetes than Westerners

Thirty-four-year-old Alan Phua is what you might describe as a typical Chinese man. He exercises three to five times a week in a country that places a high emphasis on healthy lifestyles. He also carefully observes what he eats and is strict about his diet.

Alan lives in Singapore, where male citizens serve two and a half years of military service from the age of eighteen, followed by annual reservist training of two weeks until they turn forty. Failing to meet targets for physical exercises such as chin-ups, standing broad jumps, sit-ups, shuttle runs and a 1.5 mile run means remedial physical training every few months until these standards are met. Not all is negative, though: meeting or exceeding these targets is rewarded with financial incentives. In other words, living in Singapore as a man means there is a strong push to get fit and stay fit.

The reasons for this are very clear. Singapore is a small country flanked by two large neighbours, Malaysia and Indonesia. With a population of five million citizens, it must, like Israel, rely on a citizen reservist force should the threat of war ever loom. Most citizens seem to be of the mindset that war would never actually break out, since the country is so small that any military action would damage its infrastructure and paralyse it, and the military is only a deterrent force. Even so, readiness for military action gives leverage in negotiations between nations. For example, if the countries disagree over the supply of water that Malaysia sends Singapore to refine, and the discussions escalate towards a military standoff, having a reservist army puts the country in a better negotiating position. But while many may say that war is hypothetical, there is a simpler reason for maintaining fitness: a fitter population means less stress on the healthcare system. Singapore has the kind of sustainable healthcare system that many countries are seeking to adopt.

Like many others in Singapore, Alan’s body does not produce enough insulin. As a result, sugar accumulates in the bloodstream. The lack of insulin leads to other health issues, such as general fatigue, infections, and effects such as the failure of wounds to heal. However, all is not lost. Eating properly and maintaining a good level of exercise can prevent the blood glucose level from rising and developing into diabetes.

Local researchers from the country’s National University Hospital (NUH), working together with Janssen Pharmaceuticals, have discovered that the reason why Asians are more susceptible than Westerners to developing Type 2 diabetes is the inability of their bodies to produce high enough levels of insulin.

Even though the finding was based on a small sample of only 140 mostly Chinese participants, the data, if expanded and refined, could point the way to helping patients manage diabetes better, not just locally but also within the region. Doctors believe that better dietary advice and a better selection of drugs would help patients to treat diabetes. The preliminary findings are part of the country’s largest diabetes study, launched last year. The five-year ongoing study has recruited around 1,300 participants and aims eventually to nearly double that.

The researchers did, however, note that the ethnic profile of the participants was fairly restricted, and that more participants from a wider range of backgrounds will be needed before the results can be applied to the general population.

Current statistics show that one in three Singaporeans is at risk of developing diabetes, and one in fourteen is already diabetic. Type 2 diabetes comes about because insufficient insulin is produced by the pancreas, or because the body has developed insulin resistance.

A previous study found that 8 per cent of Chinese people with a Body Mass Index (BMI) of 23 have diabetes. A BMI of 23 is within the normal weight range for Caucasians, yet at that BMI the rate of diabetes among Chinese people is four times that of their European counterparts. The researchers said this highlighted the importance of avoiding too much high-glucose food, such as foods rich in simple carbohydrates like white rice and sugar.
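For readers unfamiliar with the measure, BMI is simply weight in kilograms divided by the square of height in metres. A minimal sketch of the calculation; the figures below are illustrative only, not taken from the study:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Illustrative figures only: a 66 kg person who is 1.70 m tall
# has a BMI of about 22.8, just under the threshold of 23
# discussed above.
print(round(bmi(66, 1.70), 1))  # 22.8
```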

The findings could also lay the foundation for efforts to test whether therapies that target insulin secretion and the ability to make more insulin could be more effective in the local population, and lead to customised diabetes treatment.

What bearing does this have on us, and what action can we take? A good start would be to manage our diet and avoid eating high-glucose foods such as white rice too often. Also, try adopting a more active lifestyle!

Revising Traditional Antibiotic Advice

What do you do when you have a cold and feel under the weather? Perhaps you decide to tough it out, and head to work as usual. You grin and bear it, because as far as you are concerned, it’s just a common cold and you can’t do anything about it.

But suppose you don’t get any better after a week, when you expected that the cold would have already run its course. You decide to stay at home to rest, and after a further two days when no improvement is seen, you go to visit the doctor.

The doctor’s advice? A course of antibiotics. Two tablets three times a day after meals, and, by the way, finish the course even when you feel better.

This is the advice that has been dispensed to patients for decades: finish the whole prescription of antibiotics. And as patients, we put our trust in doctors, so whatever they said went. Who were we to argue with seven years of medical training?

But what would you say if this medical advice turned out to be incorrect? I know what I’d think. The sceptic in me would say medical advice is fickle and flows with what is fashionable at the time. At times, medicine seems subservient to politics and economics. Remember the case of red wine? When the economy was flagging, a glass of red wine was said to be good for you. Yet when the NHS was under strain, this so-called health benefit was reversed.

In this day and age it is also fashionable for everyone to carve a niche for themselves, and for many the way to do so is to turn traditional advice on its head and revise or reformat existing information. With this in mind, it is unsurprising that we learn of yet another study claiming that the rule that patients must finish their course of antibiotics is wrong.

The new slant on the old problem is that patients should stop taking the prescribed medication when they feel better, rather than when doctors have traditionally told them to.

The panel of experts suggests that the long-embedded rule is incorrect, because continuing to take medication after we feel better only lowers the body’s resistance in the long run. They argue that if the body already feels better, giving it medication it does not need is counter-productive.

This differs from the advice that doctors have traditionally given, which is based on the idea that bacteria remain in our bodies even after we feel better, and that these bacteria may adapt to antibiotics if they are not fully killed off. In other words, if you have not fully eliminated the bacteria, they develop tolerance to the drug that only partially fended them off, and ultimately the antibiotic’s effectiveness is negated.

Imagine two ancient armies: Trojans and Greeks. One day the Trojans manage to get inside the Greek city walls and wreak havoc (according to the Greeks, anyway) with their torches, spears and swords. But the Greeks have a special weapon, say, for argument’s sake, an M16 with a laser sight. If the Greeks completely defeat the Trojans, the effectiveness of their weapon is guaranteed against successive waves of Trojan attacks. But if the Greek army stops to celebrate the moment the battle swings in their favour, retreating Trojans may bring back information about the weapon and how it works, and plan successive attacks that limit its effectiveness or destroy it completely.

Martin Llewelyn, professor in infectious diseases at Brighton and Sussex Medical School, and his colleagues have called for a re-examination of the traditional advice. In an analysis in the British Medical Journal, they say “the idea that stopping antibiotic treatment early encourages antibiotic resistance is not supported by evidence, while taking antibiotics for longer than necessary increases the risk of resistance”.

In other words, stop taking the medicine the moment you feel better.

In the past, the theory supporting the completion of a course of antibiotics has been that too short a course would allow the bacteria causing the disease to mutate and become resistant to the drug.

For certain diseases, bacteria can clearly become resistant if the drugs are not taken for long enough to completely eradicate them. One such example of this is tuberculosis.

But the large majority of the bacteria that cause illness are found in the environment around us and have no impact until they get into the bloodstream or the gut. The case for stopping medication once the patient’s health improves is that the longer the bacteria in the body are exposed to antibiotics, the higher the chance that resistance will develop.

The hypothesis put forth by Professor Llewelyn has not been without its backers.

Peter Openshaw, president of the British Society for Immunology, said he had always considered the notion that stopping antibiotic treatment early makes organisms more drug-resistant to be rather “illogical”.

He supported the idea of a more sparing use of antibiotics, because the evidence linking completion of a full course to any benefit was tenuous.

He dismissed claims that not finishing a course of antibiotics would lead to bacteria gaining antibiotic resistance; indeed, he thought the reverse more likely to be true. “Far from being irresponsible, shortening the duration of a course of antibiotics might make antibiotic resistance less likely.”

The great British authority Prof Harold Lambert made the suggestion as far back as 1999, in a Lancet article entitled “Don’t keep taking the tablets”. Even though the idea was broached then, it was not taken seriously, and with hindsight it is surprising that nearly two decades later the medical world has not fully investigated the alternatives, and that the optimum duration of antibiotic courses or doses for many conditions remains largely unresearched.

Jodi Lindsay, a professor of microbial pathogenesis at St George’s, University of London, said that the new research by Professor Llewelyn was good in principle, and that the previous advice to complete a course of antibiotics may have been based on a fear of under-treatment. Nevertheless, she cautioned against over-reacting to the findings. “The evidence for shorter courses of antibiotics being equal to longer courses, in terms of cure or outcome, is generally good, although more studies would help and there are a few exceptions when longer courses are better – for example, TB.”

To complicate matters, the ideal length of a course of antibiotics varies in individuals depending on what antibiotics they have taken in the past. Hospitalised patients can be tested to find out when the drugs can be stopped. Outside of a hospital setting, this testing is not feasible.

The World Health Organisation’s advice is still based on the pre-existing guidelines and has not changed.

The Royal College of GPs, however, expressed caution over the findings. “Recommended courses of antibiotics are not random,” said its chair, Prof Helen Stokes-Lampard. She elaborated that antibiotic courses were already being tailored to individual conditions, and that it would be dangerous for patients to take it upon themselves to cut the prescribed period short, stopping when they felt better, because feeling slightly better does not necessarily mean the infection has been completely eradicated. She also stressed that it was important for patients to have clear guidelines to adhere to, and that any adjustment based on how one feels might be confusing.

The National Institute for Health and Care Excellence is currently developing guidance for managing common infections, which will look at all available evidence on appropriate prescribing of antibiotics.

The cynics among us might ask: has this review of current guidelines been made with the objective of cutting the cost of medical care? It is well known that the health budget is ever dwindling, and one cannot help but feel that the review of existing antibiotic guidelines has been made with an eye to saving on the cost of medicine rather than putting patient health first.

The health service is currently riding the trend of developing sustainability in infrastructure and treatment, and this revision of traditional guidelines may seem to be a reframing of the evidence to suit a pre-determined outlook.

Let us return to the example of the Greeks and Trojans. If the battle is raging within the Greek city walls and the tide turns against the Trojans, should the Greeks fire their ammunition at the retreating Trojans until they all fall to the ground? Ammunition in the form of gunpowder and metal casings costs money, and if it could be used sparingly, there would be more money to funnel towards daily activities like farming and livestock. The question we are being asked to address is the equivalent of this hypothetical situation: should the Greeks keep firing their weapons until all the Trojans fall before they manage to retreat beyond the Greek city walls, or should the Greeks save the cost of a few rounds of ammunition if they are certain the Trojans are so heavily wounded they would never survive the escape to their own city walls to pass on what they know about the secret weapon?

You may decide, as I did, that the cost of a few extra rounds of ammunition outweighs all the mental confusion of wondering “what if …?” for the next few months. “What if I didn’t take the medication long enough? What if the bacteria has mutated?”

You can see why, when it comes to health, it is easier to be cautious and not customise. Don’t experiment on the one life you’ve got!

Sustainable healthcare is not as clear-cut as it seems

Sustainable healthcare is thought of as the provision of future approaches to health, care and wellbeing in an increasingly environmentally and financially sustainable manner, and one which also makes intelligent use of our abundant social and human resources.

Proponents of the sustainable healthcare approach point out that for the present and for the future – our future, the future of the generations after us, and the earth’s future itself – healthcare institutions need to make minimal impact on the environment, so that there is an environment left for future generations, not one that has been plundered of its resources.

The thinking behind the sustainable healthcare approach is that operating with a minimal-impact focus carries an added benefit: operating costs can supposedly be minimised as well. The two areas are perceived to overlap, and avoiding unnecessary environmental harm while reducing waste can also save money. One quoted instance of this is the use of ultra-low energy lighting in hospitals, which reduces power demands and allows more efficient distribution of electricity from the power plants; traced all the way back, this makes less environmental impact whilst saving on electricity costs.

But how much of this interest in sustainability is genuine, and how much of this is merely a governmental facade to mask or sidestep the issue of reduced health budgets?

Yes, ultra-low energy lighting can minimise running and environmental costs, but is the manufacturing process of the lighting itself sustainable? And will the ultra-low energy lighting pay for itself over its lifespan?

Let’s consider an example. Suppose it costs £20,000 to replace the outdated lighting systems in a hospital with the new lighting. Throw in an added £5,000 for labour costs. Assume the new lighting lasts 20 years before it needs to be replaced. Will the £25,000 cost be recouped over those twenty years to warrant the installation in the first place?
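The arithmetic is a simple break-even test. A minimal sketch: the outlay and lifespan come from the example above, while the £1,000 annual electricity saving is an assumed figure, purely for illustration:

```python
# Break-even test for the hypothetical lighting upgrade above.
capital_cost = 20_000 + 5_000    # equipment plus labour, in pounds
lifespan_years = 20
annual_saving = 1_000            # assumed figure, for illustration only

total_saving = annual_saving * lifespan_years
payback_years = capital_cost / annual_saving

print(f"Saved over lifespan: £{total_saving:,}")      # £20,000
print(f"Payback period: {payback_years:.0f} years")   # 25 years
# At this assumed saving the lights never pay for themselves within
# their 20-year life; the answer flips only if the true annual
# saving exceeds £1,250 (25,000 / 20).
```

On those assumed figures the upgrade falls £5,000 short of paying for itself, which is exactly why the question deserves careful evaluation rather than a rush to look sustainable.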

Whether or not it is possible to recoup the cost, it is arguable that even if it isn’t, being viewed in the public eye as sustainable is more PR-friendly, so organisations will rush to choose measures which may be seen to be sustainable rather than carefully evaluate their options. This only opens things up to abuse, as suppliers jump on the bandwagon and market products under the guise of being sustainable.

Take, for example, the case of Salford City Council. Years ago, when the trend for recycling was at its peak, the council unceremoniously dumped five wheelie bins on every household without consultation. The move was supposedly to encourage recycling. But what resulted was a stealthy way to cut the frequency of bin collections from weekly to fortnightly, and to cut the number of waste collection workers by forcing residents to sort their own recycling. Where did the money for four extra bins per household come from, apart from higher council tax charges? And did the extra financial cost, plus the added time of a rubbish truck emptying five bins instead of one, causing road congestion, save money in the long run? Probably not. In the rush to be seen as eco-friendly, it is ironic that it cost more to be viewed as such than not to be eco-friendly at all. The extra cost went into buying a perception. We must not make the same mistake with sustainability.

Proponents of sustainability in healthcare suggest that health is also won or lost outside formal health and social care settings. Hospitals and other care organisations do not govern the lives of individuals. Individuals can empower themselves to live better lives, making better choices so as to require less medical care and ultimately make fewer demands on the healthcare system and the environment. Sustainable healthcare, they say, means a thorough examination of how we are living: how we eat better, how we move our own bodies more, how we develop new ways of protecting and improving health. It is about empowering the individual to lead a pro-active lifestyle.

Another idea put forth is that the quality of resilience needs to be instilled in people, families and communities, especially when you consider increasingly frequent extreme weather. Is this a subtle way of saying, “in times of harsh weather conditions – like extreme cold or heat – instead of getting in touch with your GP, just tough it out”?

It is suggested that what is needed is a collective effort in supporting and growing effective networks within communities, so that the health system works to provide support and services alongside people rather than just to people. How does this help the system to be sustainable? The claim is that pre-existing logistical setups are already in place and aid can be delivered swiftly to recipients: the cost of setting up a delivery system is avoided by using one that already exists.

The proposed sustainable health strategy is based on three principles and was launched jointly by leaders from the NHS, the social care system, local government, and Public Health England.

Firstly, a healthy society depends on a healthy environment: clean air to breathe, green spaces for children to play in, safe places to walk and cycle, and a radical reduction in our greenhouse gas emissions.

Secondly, the health and care system is increasingly aware of the benefit of helping to develop resilient communities: resilience that is fundamental to health and wellbeing, both in times of relative stability, and in times of crisis.

Thirdly, the health and care system can take every opportunity to work with people to prevent the preventable and manage the manageable. This means helping us all improve our understanding of, and control over, our own health and illnesses, and take pro-active steps within our homes and communities. The traditional model of citizens being well, falling ill, being treated and getting better is increasingly outdated; in essence, it passes responsibility for our own health to the state. We need to make increasingly wise choices about our health and manage ourselves with the support and guidance of the health and care system, using improved information, integration, collaboration and technology.

However, this third principle requires a cultural shift for the public, patients and particularly professionals. We have business models that are built on the foundations of people getting ill. We pay insurance premiums and for private medical treatments, and existing business models rely on poor health to function rather than improved health. What would happen to hospitals and staff if people lived better? We would have a surplus and would have to close them down. Cynics could be forgiven for thinking that sustainability is the guise under which the NHS meets budgets through cuts. The existing models we have depend on certain numbers of people being ill in order to function. But in the light of diminished financial resources, this rethink may come sooner and be seen more positively. We may need more diverse business models for providers of care. We could reward care providers for the amount by which they reduce death rates or health inequalities, improve survival times, or simply improve the experiences of patients. But will this only complicate matters by shifting traditional costs elsewhere?

Those who hold responsibility for commissioning healthcare are increasingly choosing to focus on outcomes as the marker for remuneration. There are many examples already in existence in our society where care is focused less on a hospital setting and more on the community. This can be through more community-based programmes, through interventions in partnership with the voluntary sector, or – through the use of technological opportunities – even care in the home. A societal focus on rewarding providers of care for outcomes rather than just activity might uncover more creative, cost-effective and appropriate ways of keeping people informed, independent and healthy. It is a way of allocating resources to encourage positive ends. Cynics may suggest that this is a subtle way of shifting government responsibilities down to the community in the light of budget cuts. But it remains to be seen whether individuals in need get help sooner within their community; redistributing care resources down the chain to a community that is already in place, rather than waiting for government initiatives to filter through after legislation and debate, should make for a system that is more responsive and more agile.

Devolving responsibility and dividing care provision requires overall coordination to ensure all the facets run smoothly. If what was amalgamated under a traditional NHS is now devolved to various third parties, who will be in charge of overall co-ordination? Will the cost of that co-ordination actually prove more expensive in the process? Probably not, but why? Because the responsibility for co-ordination will probably fall to a computer system, and any failings of this system will be explained away by the traditional excuses we ascribe to technology; and without a person to blame, we will have to grin and bear it. Be resilient.

This move towards a sustainable health service is partly technological, but economic, social and cultural factors also come into play. It will only be achievable through shared values: honesty, partnership, social involvement. Business models and technology widely used in other facets of society will have to be used creatively in order to deliver a safer, fairer future. One of the challenges is that sustainability and commerce do not mix easily, and the transition must be carefully managed. Sustainable models are efficient and usually low-cost, and as sources of revenue and employment they do not offer as much as commerce-based models. And as organisations such as the health service move towards a lean, agile and efficient mode of service, it is unfortunately likely that individuals employed in traditional sectors will require re-training to find continued employment.

This strategy for the future has been shaped and supported by partners across the system, not just by a single organisation. This is vital because, although we know much about what needs to be done, we really are not yet certain how to do it together, ensuring our collective efforts add more value than the sum of our individual approaches. Society as a whole needs to move forward together, inching our way collectively so that we can all adapt to the various stages of the transition. Binding all of these approaches into a sustainable approach focusses us on the truth that a liveable community for our children is more important than a growing economy.

The health and care system should not entertain the thought that it is separate from the challenges (and opportunities) presented by a rapidly changing world. A future-focused health and care system is the most obvious representation of a collective effort for the common good. But the scale and pace of this move need to be realistic and functional, and it extends beyond specific health areas alone. Moving towards a sustainable health model is not just about making the NHS more efficient and environmentally friendly. It is about creating a sustainable world for future generations through its constituent institutions, and fostering shared capabilities so that not only is a sense of community and a shared world an attainable vision, but working demands are shared to minimise wastage and the tax on the environment. The NHS is one such constituent that can institute this change in thinking and move it forward.

Healthcare is a fine example of a sector that can set clear examples of our collective responsibility to the future. In the current financial climate of constraint there is opportunity for this evolution of healthcare to go forward. What is being proposed is good for the purse, good for our families, and good for the future. Ultimately it reduces the cost of state healthcare, empowers individuals to take charge of their own health, and ensures that whatever resources are available are delivered efficiently to those who need them.

What does all this mean for the NHS? Among other things, it means the health service must:

1) Operate sustainably, minimising its impact on the environment. This means the physical infrastructure of the NHS has to be sourced by means that are as environmentally friendly as possible.

2) Continue to source its needs from suppliers that behave sustainably. For example, if the NHS has two pharmaceutical companies offering the same drug at similar prices, it engages the one with the more environmentally friendly policies. This promotes the use of sustainable measures among its partners.

3) Impart knowledge through community-based programmes that encourage personal responsibility for health. The NHS can support local programmes, for example by setting up stalls at school fairs to promote health advice or encourage health testing, so that tests routinely conducted at hospitals or GPs are done elsewhere, saving time, and so that constant local reminders within the community framework encourage individuals to maintain responsibility for their own health.

4) Conduct constant research to find better medicines – “better” not necessarily in the sense that the cost is lower, but in the sense that they can be made from more natural sources and hence place fewer demands on the environment during production. In this, the NHS may have to turn to various alternative therapies such as homeopathy, Chinese medicine, or preventative therapies such as the Alexander Technique.

5) Collaborate with partner agencies that can provide long-term non-medication solutions. For example, research on mental health has suggested that while medication for serious conditions has benefit, milder forms of mental ill health are equally well addressed with cognitive therapy, which also has a more lasting impact and a lower chance of relapse than medication. In the long run it is quite conceivable that a sustainable health service would cultivate a society less dependent on medication as a quick-fix remedy, one that encourages its citizens to reflect closely on how they are living their lives in order to live free of medication as far as possible.

The challenge in implementing all this, as we have previously mentioned, is that as organisations and processes become more streamlined, cost-effective and environmentally friendly, economies which depend on growth – essentially, ALL of them – and governments which raise finance from taxation – again, ALL of them – start to suffer. Sustainable organisations employ fewer people, and the more sustainable an organisation becomes, the fewer people it employs. Even the large organisations that focus on saving the environment draw the majority of their workforce from volunteers. This means a huge chunk of the government budget raised from taxes on work is lost, and an equally worrying time-bomb is large numbers of unemployed with no financial means of securing a place to live.

Taxes on profits will also fall, because the silent partner in sustainable schemes is low cost. Would you pay more for the same product or service if it carried a promise to invest sustainably? Would you pay more for Fairtrade products? Most people, apart from those with more disposable income, would probably go cheaper, given that society will have to live with financial constraints for years to come. The tax on a lower-cost product is less than that on a higher-cost product, which means the government takes less revenue.

We arrive at a situation where the government has less money to distribute, and organisations have less funding and must find ever more ingenious ways to use or grow that forever-dwindling income. At the same time the number of jobless will continue to grow, many will remain in low-wage jobs, and property will continue to spiral out of reach. It is not an economically sustainable situation. It begs the question: are we simply pushing the demands we made on the environment, while we were focused on growth, back onto the economy?

A sustainable health service, and living sustainably, is essentially a dismantling of the economy that we have come to build.

The future is uncertain. It is scary and will require careful negotiation. We are sure we need to live sustainably and make fewer demands on the environment, but we haven’t quite worked out how to transition there, nor what we will do when we actually arrive.

What antibiotics in agriculture are really about

There is widespread concern over the use of antibiotics in the agricultural world and what its wider bearings are. The general consensus is that farmers need to reduce their use of antibiotics dramatically, as there are fears that drug-resistant bacteria could pass up the food chain through consumption and environmental contamination.

The concerns take several forms. Firstly, just as humans can develop resistance after prolonged use of medicines, there is the concern that long-term antibiotic use in agricultural settings may create antibiotic resistance in the animals and crops which receive these antibiotics. Secondly, even if the crops and animals themselves do not develop resistance, the prolonged consumption of the vegetables or the meat from these farm animals could breed resistance in the humans who consume them. There may also be other side effects of which we are as yet unaware.

Antimicrobial drugs, which include antibiotics, antifungals and antiparasitic drugs, are commonly used in farming. They are used to prevent damage to crops, kill parasites and keep livestock healthy. The long-term aim of antimicrobial drugs in the context of farming is to maximise crop production and livestock yields. A field of crops lost to infestation is months of work for nothing. A farmer with a field of cows suffering from disease has lost not just capital but production possibilities as well. As in the case of mad-cow disease in the 1990s, farmers who had their cows put down not only lost the money they had invested in buying and breeding those cows, but also the income from sales of milk and beef.

And in many cases, the losses from a brief period of crop infestation or animal disease could significantly affect a farmer’s income, or make such a dent in their livelihood that it either forces them to take on additional debt to cover the losses, or proves so insurmountable that it forces them out of business.

There might be those who argue against the use of antibiotics, but the truth is that they are necessary. They are one form of insurance for a sector that has to combat various problems, including the uncertainties of the weather. When your crops – your livelihood – are subject to the whims of weather, infestation, and perhaps human vandalism and theft, you have to take steps to minimise risks on all fronts. You cannot simply leave things to chance and hope for divine favour – that would merely be masking a lack of responsibility.

Pests and viruses do not restrict their infestation to selected fields. Left unchecked, they would merely spread from unprotected fields and livestock, and then infect further unprotected areas. Antibiotics are medical city walls that keep away marauding invaders, and prevent them from invading territories and conscripting the local population into their armies to do further damage.

Resistance to the antibiotics, antifungal and antiparasitic drugs used in agriculture is collectively known as antimicrobial resistance (AMR).

An independent body chaired by the British economist Jim O’Neill looked specifically at antibiotic use in the environment and agriculture. Among other things, it examined the roles that regulation and financial measures, such as taxation and subsidies, could play in reducing the risks associated with the agricultural use of antimicrobials and environmental contamination.

The data from the report suggests the amount of antimicrobials used in food production internationally is at least the same as that in humans, and in some places is higher. For example, in the US more than 70% of antibiotics that are medically important for humans are used in animals.

What does all that mean? It means that drugs normally meant for humans are already being used in animals. If human beings consume the meat of these animals over prolonged periods, their bodies can develop tolerance to the antibiotics used in the animals. If those people later need these same antibiotics as human medicines, the drugs may have little or no effect. And as we have seen before, ineffective long-term medication may only create dependence on drugs and pain relief medication.

The report drew on peer-reviewed research: of the 139 articles examined, 72% found evidence of a link between antibiotic consumption in animals and resistance in humans. That is impetus enough for policy makers to argue for a global reduction of antibiotics in food production to a more appropriate level.

But while the evidence suggests that we should reduce the usage of these antibiotics, antimicrobial usage is unfortunately likely to rise, driven by economic growth and increasing wealth and food consumption in the emerging world.

A considerable amount of antibiotics is used in healthy animals to prevent infection or speed up their growth. This is particularly the case in intensive farming, where animals are kept in confined conditions and an infection could easily spread between them. Further to this, some animals receive antibiotics so that natural limiters to their size are killed off and their growth is accelerated. If you sell meat by weight, it makes sense to produce as big an animal as you can in order to maximise your profits.

The report highlighted three main risks associated with high levels of antimicrobial use in food production. The first was the concern that drug-resistant strains could be transmitted through direct contact between humans – particularly farmers – and the animals on their farms. Secondly, transmission of drug-resistant strains could also result from contact during the preparation of the meat, or from its consumption. Thirdly, the excrement of the animals might contain drug-resistant strains and antimicrobial residues, which then pass into the environment.

There was also concern raised about the possibility of contaminating the natural environment. For example, if factories that manufacture these antimicrobials do not dispose of by-products properly, these may pollute the natural environment such as water sources. Already we have seen that fish near waste-treatment plants, which treated urine tinged with chemicals from birth control pills, developed abnormal characteristics and behaviour.

The review made three key recommendations for global action to reduce the risks described. The first was that there should be a global target for reducing antibiotic use in food production to a recognised and acceptable level in livestock and fish. It also recommended that restrictions be placed on the use of antibiotics in those animals heavily consumed by humans.

Currently there are no guidelines on the disposal of antimicrobial manufacturing waste into the environment, and the report urged that these be established quickly so that pollution of the environment can be minimised and the disposal of by-products and active ingredients regulated.

The report also urged more monitoring of these problem areas against agreed global targets, because legislation without the means of enforcement is useless.

Is it possible to limit the production and use of antimicrobials? One cannot help but be cynical. As long as we inhabit a world where sales drive rewards, it is inconceivable that farmers would cut back on their own initiative. We would certainly need legislation and some method of ensuring compliance.

But what form of legislation should we have? Should we focus on imposing penalties for non-compliance or incentives to encourage the reduced use of antimicrobials?

Some may argue that the latter is more effective in this case. If farmers are offered financial subsidies, so that they receive more money for the price of meat, for example, they would be more inclined to reduce the usage of antimicrobials. But how would this be monitored? Could the meat for sale be tested to ensure the density of antimicrobials falls under established guidelines, for example, so that a farmer who had been relying on antibiotics to increase the size of livestock is compensated for the reduction in size arising from their reduced use?

Unfortunately the difficulty lies in reconciling the need for, and the established economic system of, growth on the one hand with sustainability on the other. How is farm produce sold? When you buy a bag of salad, a cut of meat or a bottle of milk, it is sold by weight or volume. You may buy eggs in cartons of six, but they too are graded by size and weight. For the direct producer – the farmer – size, volume and growth are what bring greater profits, although those profits may be barely above the threshold for subsistence. And after making allowances for damage due to weather, theft, low market demand and all the other variables that threaten an already low-profit industry, asking a farmer to reduce the use of antimicrobials is akin to asking him not to take measures to protect his livelihood. If the use of antimicrobials bothers you, then you have to compensate the farmer for not using them, by being willing to pay higher prices for farm products.

Why do organic or free range eggs cost twice the price for half the size? Aha!

While antimicrobials are also used on free range produce, and the case of organic farming is not entirely parallel, the same issue is at work: you are paying more for the process than the product, and in doing so the extra payment you make goes to the farmer for the farming practices you are seeking to promote.

A farmer can get more produce by rearing battery hens, but if you are concerned about animal welfare, you pay extra per animal for the farmer to rear it with more space, and hence more welfare. Your free range chicken costs more not because it is bigger, or necessarily healthier, but because it has been afforded more space, which you consider to be ethical. Farmers may switch to organic or free range farming if there is enough demand, and for some this may even be preferable: rearing fewer hens while earning the same overall income as from battery hens may, in the grand scheme of things, be a more favourable arrangement.

In trying to promote reduced use of antimicrobials, we have to make up the farmer’s perceived loss of earnings. So it is not incorrect to say that if we are concerned about the use of antimicrobials in agriculture, we have to pay more for our farm produce. Are you prepared to do that? For families with high disposable income, the increase may represent only a small additional fraction. But for families on smaller incomes, the increase may be too steep to be feasible. In other words, while the need for a reduction in agricultural antibiotics is recognised, in practical terms it may remain an aspirational ideal except for those who can afford it.

Can people be convinced – even if the cost is high – that in the long term it is better for human health? If the continued use of antimicrobials means that human medication in the future may become less effective as resistance builds, should we, despite our reservations about the cost, make the leap towards a sustainable future? And if low-income families cannot afford to pay more in their weekly shop to get less, ridiculous as it might sound, should higher earners step in to fill the shortfall?

It is strange how the wider discussion about the use of antimicrobials in society leads to a discussion about income distribution and political sensitivities.

What has arisen in the course of that evaluation, however, is that expecting citizens alone to cover the production shortfall arising from reduced use of antimicrobials, by paying more for their farm produce, is not going to work. While some can afford to, many cannot, and those that can may not necessarily want to pay for those that cannot. There are, however, other measures to reduce the use of antimicrobials.

Governments could introduce legislation to prevent environmental contamination by antimicrobial products and by-products, with harsh penalties for offenders. At the moment there are no such rules in place, and it is of increasing concern that legislation be developed quickly.

Governments could also offer tax subsidies and support for farmers who continue to reduce antimicrobial usage. These could be introduced at the end of the first year, when farmers need the most support during the initial stages of conversion, then at thirty months, and then at progressively longer-spaced intervals. The gaps between payments could follow an arithmetic progression – payments at the end of one year, two-and-a-half years, four-and-a-half years, seven years and so on – so there is a continued incentive to maintain reduced antimicrobial usage.
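To make the arithmetic concrete, here is a minimal sketch of such a schedule – the starting gap of eighteen months and the half-year increment are assumptions taken from the example above, not figures from any actual scheme:

```python
def subsidy_milestones(n_payments, first_year=1.0, first_gap=1.5, gap_step=0.5):
    """Payment years where the gap between successive payments grows by a
    fixed step, i.e. the intervals form an arithmetic progression."""
    years = [first_year]
    gap = first_gap
    for _ in range(n_payments - 1):
        years.append(years[-1] + gap)
        gap += gap_step  # each interval is half a year longer than the last
    return years

# Reproduces the schedule in the text: payments at 1, 2.5, 4.5, 7, 10, ... years
print(subsidy_milestones(5))  # [1.0, 2.5, 4.5, 7.0, 10.0]
```

A widening schedule like this front-loads support when conversion costs bite hardest, while still rewarding farmers for staying the course.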

The only problem is: where would the money for these subsidies come from? If the government receives less tax from farm produce transactions because less has been sold, and it has also received less tax from antimicrobial companies because it has made them limit their production, where will it make up the shortfall? Through an environmental tax on its citizens?

Therein lies the problem.

The conundrum is this: the threat of antibiotic resistance in the future means we have to lower the level of antimicrobials we currently use. Yet if we do so, we are looking at reduced economic output. And as long as we have an economic system reliant on growth and increased production, asking producers to slow down is economic suicide.

You may ask: “What if we re-evaluated the economic system, and created one based on sustainability?”

I am sorry to say it but that is wishful, idealistic thinking.

The problem with switching to a sustainability-based economy can be described as follows.

Imagine there is a children’s party. At this party there is a table with a gigantic bowl of sweets. The children who are first to arrive eagerly stuff their faces and pockets with sweets, and as the party progresses, the bowl gradually looks emptier and emptier. The parents present chastise their kids if they continue to head for the sweet bowl, remonstrating with them to leave some for the kids who have not yet arrived at the party. Some of these children, perhaps the older ones, might reduce their trips to the bowl and the number of sweets they take. But some children will continue to plunder the bowl, stuffing their faces before the sweets run out, recognising that they are a dwindling resource and that if they want to eat them they’d best take as many as they can. And a third group, while recognising the sweets will soon run out, are equally keen to get hold of as many as they can – not to eat them, but because they realise that when one of the latecomers arrives and finds there are no sweets left, their parents may offer incentives to trade, to appease the desperate child. “Charlie didn’t get many sweets because he was late. If you let Charlie have two of the sweets you already have, I’ll buy you an ice-cream later.” This third group not only recognises the impending scarcity, but contributes to it, stockpiling resources to use for later leverage. And they may even make the loudest noises about how everyone should stop taking sweets, only so that they can make the biggest grabs when no one is looking.

Who are the losers in this situation? The obvious ones are those who arrived late at the party. But the not-so-obvious losers are the ones from the first group, who amended their behaviour to ensure there were still sweets left for the later arrivals. In being principled and holding on to ideals, they were left worse off materially, and their only consolation was the knowledge that they had made the effort to leave some sweets for the late group – whether the latecomers actually got any is another question. The sweets ran out eventually.

The problem with thinking about sustainable economic measures is that the first to attempt the switch on ethical or aspirational grounds will be among the ones to lose out, because subsequent groups will still make a grab for whatever is left. Some will grab to get as much of the remaining resource as they can, while others will grab so that when scarcity comes – and scarcity drives up prices – they have plenty of the resource to profit from. So while everyone is making the right noises about economic sustainability, everyone is really just holding back, waiting for someone else to make the first move.

So this is what antibiotics in agriculture really tells you: too much can create problems later, through antibiotic resistance and improper disposal. We need to cut down on the use of antimicrobials. But reduced antimicrobial use means reduced output, and for that to work we must be prepared to pay higher prices for less produce, to compensate the farmer so that he may still earn a living. The government can introduce penalties to govern the disposal of antimicrobial-related products, to limit the damage to the environment, alongside incentives to limit the use of antimicrobials. But it will have problems funding those incentives. Because what it is proposing is economic slowdown, in order to have an economy at all for later generations – while the current generations are too concerned with their own interests and survival, stealthily making a grab for the remnants as the first few leave the economic arena.

How long-term medication does harm – and why nothing may be done about it

In looking at mental health, we have previously examined the idea that while medication offers short-term relief, long-term change is brought about through lasting measures such as cognitive therapy. We have also seen that medication is more effective in individuals with more severe forms of mental ill health, while milder forms can be dealt with through non-medical measures. We can summarise by saying that the role of medication is to offer immediate relief and, over the longer term, to stabilise the individual to a state where pressures or stressors can be managed and lived with, while the root cause of the problems is examined.

The underlying causes are usually not medical; they can be extrinsic factors such as the working environment or lifestyle. Medication is hence insufficient to deal with them, because it cannot act on them. The root of the problem is what patients on medication ultimately need to address. Unfortunately, patients taking prescription medicines often assume that if a certain pharmaceutical drug has been prescribed to address a particular problem, then more of it, even within limits, can eventually resolve it. That is a mistaken assumption. Overdosing on medication does not address the root of the problem. It only lulls the body into a relaxed state, blinding us to our immediate surroundings, so that while we feel calm, relaxed or “high”, the feeling is only temporary.

Medications and the prescription of medication are reactive, not proactive. They treat symptoms that have manifested, but do not treat the cause of the symptoms.

These views of medicine are not limited to mental health problems; they extend into the physical realm. Take eczema, for example. A doctor may prescribe creams containing hydrocortisone and paraffin for you to manage the itchy, flaring red skin we usually see in eczema sufferers. However, these creams may only offer temporary relief. As soon as you stop using them, your eczema may return. Advocates of TCM, or Traditional Chinese Medicine, suggest that eczema results from an overactive liver, and that the trapped “heat” in the body, when seeking release, manifests itself as flared red patches on the skin. Barrier creams such as paraffin may even be viewed as counterproductive, because they prevent the internal heat from escaping and so make the eczema worse. Have you ever encountered anyone who, upon applying a cream for eczema, reported that it only worsened the itch? If you visit a TCM practitioner, you will probably be prescribed a menthol-based cream for external use, oral medicine for your eczema, and the advice that in order to deal with the root cause you have to make changes in your diet – specifically, not to over-consume foods such as fried food or chocolate, and to avoid alcohol and coffee.

It would be great if the immediate, short-term relief brought about by medication could be extended for long periods. If you were suffering from a serious illness such as severe depression, the difference you felt at the onset of medication would be very noticeable. However, medication is only a short-term stress suppressant, buying time for longer-term (usually non-medical) measures to take effect. It is not the intention of any prescriber – be it a GP or a pharmacist – that a patient be on medication for a prolonged period. While it might be good financially to have such patients, it is unethical to keep patients unwell as a constant source of revenue. In that situation the health of the patient has become secondary to the financial benefit he or she can bring, and that is against the ethics of the medical profession.

It is unwise to be on medication for long periods. First and foremost, the body adapts to the dosage, and in time the effects the medicine initially brought are diminished, to the point that either a higher dosage is required, or the patient is switched to a new, more potent medicine. In both cases, if medication is seen as the cure, rather than as a way of buying immediate relief, the patient will merely keep taking the medicine in the hope that one day it will completely cure his or her problems, and the potential for addiction to ever-higher dosages results. This is how many addictions begin, and it is unfortunate when patients find that medication has not only dealt with their initial symptoms but layered on a secondary problem of addiction to painkillers.

Addiction is only one of the problems brought about by long-term medication. There is the possibility, too, that the body adapts to the new chemicals and is slowly malformed. The negative impact of medication can remain unnoticed until it reaches a tipping point and the consequences are made apparent by a catastrophic event. With smoking, for example, constant exposure to the chemicals damages and malforms the lungs, but people often only sit up and try to take corrective action when irreparable damage has set in and lung cancer has developed. Medication sits at the opposite end of the scale to smoking – it is taken at the onset to cure rather than to harm – but it has the same potential to change the human body when taken over prolonged periods.

But the changes are not necessarily experienced by patients on medication alone. Research scientists at the University of Exeter, for example, found that male fish of certain species were becoming transgender – displaying female characteristics and behaviours such as developing female organs, being less aggressive, and even laying eggs. The fish had come into contact with chemicals in the water near waste-treatment plants. Chemicals contained in birth-control pills, flushed down the toilet in urine, were cited as a particular source of contamination.

Long-term medication is also not a good idea for children. If hyperactive children are embarking on activities that require focus, such as school or piano lessons, it may not be wise for them to be on prolonged medication. It may be better to treat the underlying causes first and teach the child management strategies, rather than merely treating the outwardly visible effects.

When it comes to mental health problems, the best approaches are a mixture of medication and therapy. Given that medication is meant to be short-term, it is important that therapy be as effective as possible, so that patients can entrust it to heal them fully, rather than depending on medication. This is of course more appropriate in instances of mental illness than in physical illness involving pain relief. Nevertheless, in the latter case, some have suggested therapies such as hypnosis and acupuncture as long-term substitutes for pain medication.

It is worth the NHS examining such therapies in order to study the scientific evidence behind them, to glean any insight that could be applied to other treatments, or to find more cost-effective, longer-lasting treatments that will contribute to the NHS being a sustainable health service. Already, the current model of the state as a mere provider and source of medicines and advice to its citizens cannot carry on. The cost of patient care will rise and drain its resources, and it would be more cost-effective to spend resources encouraging citizens to take active responsibility for their own health, and hence lessen the burden on the health service, rather than merely looking to it as a provider of medication.

There are also other reasons why the NHS has to prime itself for a move towards being a sustainable health service. It has to limit its carbon footprint in order to minimise the impact it has on the environment.

The impact of prescribing long-term medication can ultimately be traced back to the environment. Constituents of medication are either obtained from natural ingredients grown on land, or manufactured in factories, which again commandeer land. The process of turning them into medication requires power and electricity, which either use up fossil fuels and produce fumes and greenhouse gases that drive global warming and extreme weather, or draw on renewable energy – wind farms that still use up land, or solar cells whose manufacture might itself have been unsustainable. Waste from the manufacturing process and from the disposal of the medical product enters landfill or pollutes natural resources.

Land is a limited resource. More specifically, land that can grow useful crops is a limited resource. So even if the current level of pharmaceutical manufacturing remains the same – perhaps by some freak balance where the number of people newly prescribed medication equals the number of deaths – the land, along with the space available for landfill, can never be replenished on that basis. It might not make an immediate difference to you, but every individual has a civic responsibility, as a global citizen, to preserve the earth and keep it habitable for future generations, to avoid killing off the human race.

Essentially, we need to lower our dependency on medication to avoid this impact on the environment, so that future generations have a habitable world.

The problem is in convincing pharmaceutical companies to embrace this thinking. These companies depend on sales, and if sales were to fall, so would profits and share prices. Pharmaceutical companies are accountable to their shareholders, and need to raise their share prices and create growth. The moment they start thinking about sustainability, they are looking at reducing their growth, and their share price would stagnate. Would you invest in a company with stagnant growth? Thought not. And if a company reports less profit, the government raises less revenue through tax and has to make up the shortfall somehow.

Being on long-term medication harms the body, among other things by creating changes in the body and fostering dependency. Ultimately, it also has a significant bearing on the environment. The challenge is for us to wean ourselves off long-term medication, using it only in the short term while we address the root causes of our problems through therapy. On a wider scale, we need to create new business models, because current ones actually depend on a sizeable number of people being unwell in order for the economy to function. Surely that last statement must raise incredulity – that in this day and age we are not trying to heal people, but to maintain the threshold of well and unwell people that is economically beneficial!