A short history of non-medical prescribing

It had long been recognised that nurses spent a significant amount of time visiting general practitioner (GP) surgeries and/or waiting to see the doctor in order to get a prescription for their patients. Although this practice produced the desired result of a prescription being written, it was not an efficient use of either the nurses’ or the GPs’ time. Furthermore, it was an equally inefficient use of their skills, exacerbated by the fact that the nurse had usually assessed and diagnosed the patient themselves and decided on an appropriate treatment plan.

The situation was formally acknowledged in the Cumberlege Report (Department of Health and Social Security 1986), which initiated the call for nurse prescribing and recommended that community nurses should be able to prescribe from a limited list, or formulary. Progress was somewhat measured, but The Crown Report of 1989 (Department of Health (DH) 1989) considered the implications of nurse prescribing and recommended that suitably qualified registered nurses (district nurses (DN) or health visitors (HV)) should be authorised to prescribe from a limited list, namely the nurse prescribers’ formulary (NPF). Although a case for nurse prescribing had been established, progress relied on legislative changes to permit nurses to prescribe.

Progress continued to be cautious with the decision made to pilot nurse prescribing in eight demonstration sites in eight NHS regions. In 1999, The Crown Report II (DH 1999) reviewed more widely the prescribing, supply and administration of medicines and, in recognition of the success of the nurse prescribing pilots, recommended that prescribing rights be extended to include other groups of nurses and health professionals. By 2001, DNs and HVs had completed education programmes through which they gained V100 prescribing status, enabling them to prescribe from the NPF. The progress being made in prescribing reflected the reforms highlighted in The NHS Plan (DH 2000), which called for changes in the delivery of healthcare throughout the NHS, with nurses, pharmacists and allied health professionals being among those professionals vital to its success.

The publication of Investment and Reform for NHS Staff – Taking Forward the NHS Plan (DH 2001) stated clearly that working in new ways was essential to the successful delivery of the changes. One of these new ways of working was to give specified health professionals the authority to prescribe, building on the original proposals of The Crown Report (DH 1999). Indeed, The NHS Plan (DH 2000) endorsed this recommendation and envisaged that, by 2004, most nurses should be able to prescribe medicines (either independently or as supplementary prescribers) or supply medicines under patient group directions (PGDs) (DH 2004). After a consultation in 2000 on the potential to extend nurse prescribing, changes were made through the Health and Social Care Act 2001.

The then Health Minister, Lord Philip Hunt, provided detail when he announced that nurse prescribing was to include further groups of nurses. He also detailed that the NPF was to be extended to enable independent nurse prescribers to prescribe all general sales list and pharmacy medicines prescribable by doctors under the NHS, together with a list of prescription-only medicines (POMs) for specified medical conditions within the areas of minor illness, minor injury, health promotion and palliative care. In November 2002, proposals were announced by Lord Hunt concerning ‘supplementary’ prescribing (DH 2002).

The proposals were to enable nurses and pharmacists to prescribe for chronic illness management using clinical management plans. The success of these developments prompted further regulation changes, enabling specified allied health professionals to train and qualify as supplementary prescribers (DH 2005). From May 2006, the nurse prescribers’ extended formulary was discontinued, and qualified nurse independent prescribers (formerly known as extended formulary nurse prescribers) were able to prescribe any licensed medicine for any medical condition within their competence, including some controlled drugs.

Further legislative changes allowed pharmacists to train as independent prescribers (DH 2006), with optometrists gaining independent prescribing rights in 2007. The momentum of non-medical prescribing continued: in 2009 a scoping project of allied health professional prescribing recommended extending prescribing to further professional groups within the allied health professions and introducing independent prescribing for the existing allied health professional supplementary prescribing groups, particularly physiotherapists and podiatrists (DH 2009).

In 2013, legislative changes enabled independent prescribing for physiotherapists and podiatrists. As the benefits of non-medical prescribing are demonstrated in the everyday practice of different professional groups, the potential to expand this continues, with consultation currently under way to consider the potential for enabling other disciplines to prescribe.

The bigger issues that come with preventing hearing loss

Is there cause for optimism when it comes to preventing hearing loss? Certainly the latest research into this suggests that if positive effects experienced by mice could be transferred to humans and maintained for the long term, then hereditary hearing loss could be a thing of the past.

It has often been assumed that hearing loss is simply down to old age. The commonly held view is that as people grow older, their muscles and body functions deteriorate with time to the point that muscle function is impaired and eventually lost. But hearing loss is not necessarily down to age, although there are cases where constant exposure to loud noise, over time, causes reduced sensitivity to aural stimuli. Over half of hearing loss cases are actually due to faulty genes inherited from parents.

How do we hear? The hair cells of the cochlea, in the inner ear, respond to vibrations, and these signals are sent to the brain to interpret. The brain processes these signals in terms of frequency, duration and timbre in order to translate them into sounds we recognise.

For example, if we hear a high frequency sound of short duration that is shrill, our brain interprets these characteristics and then runs through a database of audio sounds, an audio library in the brain, and may come up with the suggestion that it has come from a whistle and may signify a call for attention.

What happens when you carry a gene for hereditary hearing loss? The hairs of the inner ear do not grow back, and consequently sound vibrations from external stimuli do not get passed on to the brain.

With progressive hearing loss too, the characteristics of sound also get distorted. We may hear sounds differently to how they are produced, thereby misinterpreting their meaning. Sounds of higher and lower frequency may be less audible too.

How does that cause a problem? Imagine an alarm. It is set on a high frequency so that it attracts attention. If your ability to hear high frequencies is gradually dulled then you may not be able to detect the sound of an alarm going off.

As hearing gradually deteriorates, the timbre of a sound changes. Sharper sounds become duller, and in the case of the alarm, you may hear it, but it may sound more muted and the brain may not be able to recognise that it is an alarm being heard.

Another problem with hearing loss is the loss of perception of volume. You may be crossing the road and a car might sound its horn if you suddenly encroach into its path. But if you cannot hear that the volume is loud, you may perceive it to be from a car far away and may not realise you are in danger.

The loss of the hairs in the inner ear is a cause of deafness in humans, particularly those for whom hearing loss is genetic. Humans suffering from hereditary hearing loss lose the hairs of the inner ear, which results in the difficulties mentioned above. But there is hope. In a research experiment, scientists successfully delayed the loss of the hairs in the inner ear of mice using a technique that edited away the genetic mutation that causes the loss of the hairs in the cochlea.

Mice were bred with the faulty gene that caused hearing loss. But using a technology known as CRISPR, the faulty gene was replaced with a healthy normal one. After about eight weeks, the hairs in the inner ears of mice with a genetic predisposition to hearing loss flourished, compared with similar mice which had not been treated. The genetic editing technique had removed the faulty gene which caused hearing loss. The treated mice were assessed for responsiveness to stimuli and showed positive gains.

We could be optimistic about the results, but it is important to stress the need for caution.

Firstly, the research was conducted on mice and not humans. It is important to state that certain experiments that have been successful in animals have not necessarily had similar success when tried on humans.

Secondly, while the benefits in mice were seen in eight weeks, it may take longer in humans, if at all successful.

Thirdly, we should remember that the experiment worked for the mice which had the genetic mutation that would eventually cause deafness. In other words, they had their hearing at birth but were susceptible to losing it. The technique prevented degeneration of hearing in these mice, but it would not help mice that were deaf at birth gain hearing they never had.

Every piece of research carries ethical issues, and this one was no different. Firstly, one ethical issue is the recurring one of whether animals should ever be used for research. Should mice be bred for the purposes of research? Are all the mice used? Are they accounted for? Is there someone from Health and Safety going around with a clipboard accounting for the mice? And what happens to the mice when the research has ceased? Are they put down, or released into the ecosystem? “Don’t be silly,” I hear you say, “it’s only mice.” That’s the problem. The devaluation of life, despite the fact that it belongs to another, is what eventually leads to a disregard for other life and human life in general. Would research scientists, in the quest for answers, eventually take to conducting research on beggars, those who sleep rough, or criminals? Would they experiment on orphans or unwanted babies?

The second ethical issue, when it comes to genetics, is whether genetic experimentation furthers good or promotes misuse. The answer, I suppose, is that the knowledge empowers, but one cannot govern its control. The knowledge that genetic mutations can be edited away is good news, perhaps, because it means we could remove disabilities or life-threatening diseases from the outset. But this, on the other hand, may promote the rise of designer babies, where mothers genetically select features such as blue eyes for their unborn child to enhance their features from birth, and this would promote misuse in the medical community.

Would the use of what is probably best termed genetic surgery be more prominent in the future? One can only suppose so. Once procedures have become more widespread, it is reasonable to conclude that more such surgeons will become available, to cater for the rich and famous. It may be possible to delay the ageing process by genetic surgery, perhaps by removing the gene that causes skin to age, instead of using botox and other external surgical procedures.

Would such genetic surgery ever be available on the NHS? For example, if the cancer gene were identified and could be genetically snipped off, would patients request this instead of tablets and other external surgical procedures? One way of looking at it is that the NHS is so cash-strapped that under QALY rules, where the cost of a procedure is weighed against the number of quality life years it adds, genetic surgery would be limited to the more serious illnesses, and certainly not offered for those further down the rung. But perhaps for younger individuals suffering from serious illnesses, such as depression, a lifetime’s cost of anti-depressants, anti-psychotics or antibiotics may far outweigh the cost of a one-off surgical procedure. If you could pinpoint a gene that causes a specific pain response, you might alter it to the point you may not need aspirin, too much of which causes bleeds. And if you could genetically locate what causes dementia in another person, would you not be considered unethical if you let the gene remain, thereby denying others the chance to live a quality life in their later years?
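To make the QALY weighing concrete, here is a minimal worked sketch. The formula is the standard cost-per-QALY calculation, but the figures (and the roughly £20,000–£30,000 per QALY threshold often cited for NICE decisions) are illustrative assumptions rather than numbers from the text:

\[ \text{cost per QALY} \;=\; \frac{\text{cost of treatment}}{\text{quality-adjusted life years gained}} \]

On these assumed numbers, a one-off £30,000 genetic procedure securing 40 quality years works out at £750 per QALY, whereas £2,000 a year of medication over the same 40 years totals £80,000, or £2,000 per QALY, which is why the argument for younger patients can favour the one-off intervention.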

Genetic editing may be a new technique for the moment, but if there is sufficient investment in infrastructure and the corpus of genetic surgery information widens, don’t be surprised if we start seeing more of it in the next century. The cost of genetic editing may be outweighed by the cost of lifelong medication and its side effects, and it may prove to be not just more sustainable for the environment but more agreeable to the limited NHS budget.

Most of us won’t be around by then, of course. That is unless we’ve managed to remove the sickness and death genes.

Physical and Mental Contamination

Is there a need to start worrying about your kitchen? I don’t mean in the home improvement context; never mind that the island unit is looking a bit worse for wear, that your swanky appliances need upgrading so you can have two ovens to cook for your little army, or that you are thinking you could expand beyond the microwave and gas cooker. Or perhaps you are considering the option of creating an open plan kitchen. Whatever the physical changes you are considering, they are beyond the scope of this discussion. Danger lurks in your kitchen.

It doesn’t come in the form of masked strangers brandishing kitchen implements. Or ruthless critics in the form of master chefs or children. No, the hidden danger in your kitchen is more subtle, softer, yet potentially more lethal.

The kitchen sponge.

Scientists estimate that the kitchen sponge contains a higher concentration of bacteria than anywhere else in the house. On the face of things, this is not an unrealistic statement. The kitchen sponge is in contact with remnants of food as it passes over the crockery and cutlery, and while the latter are left clean, microscopic elements of food have merely been transferred to the sponge. And even if you take the effort to rinse out the sponge, or go a step further by microwaving it, trace elements of food bacteria will remain.

According to the Mail Online, one of the more sensationalist newspapers in the United Kingdom, there are 54 billion bacterial cells residing on the humble sponge. But of course the Mail Online would say that – it is taking a simple fact and blowing it out of proportion in order to create a purchasing headline. (And what is a purchasing headline? It is one that intrigues you enough to make a purchase to discover more, or hooks you in enough to commit your time to reading more, never mind that the headline was slightly manipulated in the first place.) The fact is, bacteria exist all around us. They are on the surfaces of things around us. But it is important to distinguish between good and bad bacteria. The majority of bacteria around us are harmless. The remaining bacteria can do us harm if they enter our bodies, which is why it is a good idea to wash hands before eating. This ensures the harmful bacteria on our hands, whether from touching door knobs or taps or other contaminated surfaces, do not end up on food that we ingest. It is also a good idea to cover up exposed cuts so that bacteria do not enter the bloodstream.

Bacteria are all around us, but we can’t live life in fear of them.

Can you imagine if the word bacteria were substituted with the word humans? It would give a better perspective.

The headline would read something like “A higher concentration of criminals found in [name of city]”. But you can’t live as if every human in that city will do you harm. You can only take the necessary steps to avoid being negatively affected.

Just like bacteria.

The current guidelines around hand washing recommend that we wash our hands with water and soap for at least 20 seconds after instances such as using the toilet or handling raw food like meat and vegetables. It is also advised that we wash our hands before eating and after contact with animals.

Does washing with specialist soaps make any difference? A study by Rutgers University and GOJO Industries in the US found that there was little difference, which suggests the science between Brand X and Brand Y is as manufactured as the products themselves.

The study involved twenty volunteers and examined variables of hand washing such as brand, volume and time elapsed. A non-harmful strain of the bacterium E. coli was placed on the hands of the volunteers, and their hands were then examined after washing to see how much remained.

The study found that there was little to distinguish between normal soaps and anti-microbial formulations. In fact, as long as volunteers washed their hands with soap for thirty seconds, the differences in results after washing were negligible.

There were a couple of minor limitations to the study conducted by Rutgers and GOJO Industries.

Firstly, the sample size was small. Secondly, volunteers could not ethically be asked to handle deadly bacteria, so the results may only be applicable to that particular strain of bacteria.

There was a major stumbling block to the research, however. GOJO Industries manufactures hand soaps.

We have already examined in the past how it is not a good idea for pharmaceutical companies to run their own tests, because the authenticity cannot be guaranteed completely if there is a bias from the outset. If a pharmaceutical company or any other manufacturing company is going to invest time, money and effort into production, it is going to favour results which have a positive bias, rather than those with a negative one, which would either force further research, impacting on production time and costs, or cause the results to be abandoned completely.

Is there anything we can trust anymore? The dilemmas we have are that the media distorts reporting, and research is funded with an agenda which produces an expected outcome. It is difficult to secure funding for research if there is no meaningful purpose behind it beyond sales.

Returning to the original issue of bacteria, as long as we take the necessary precautions, that is the best we can do. These precautions include replacing the sponge regularly, not leaving unwashed dishes in the kitchen, and washing our hands to avoid contamination.

And take in what you read and hear about health and news with a pinch of objectivity. Avoid contaminating your mind too!

An overview of mental health

Mental illness continues to be one of the most misunderstood, mythologised and controversial of issues. Described for as long as human beings have been able to record thoughts and behaviours, it is at once a medical, social and at times political issue. It can lead to detention against one’s will and has its very own Act of Parliament, and yet we really know very little about it.

Societies through the ages have responded to this mystery by the locking up of people whose sometimes bizarre behaviour was deemed dangerous, unsuitable or just plain scandalous. Only within the relatively recent past have the tall, thick walls of the asylum been dismantled and those who remained institutionalised and hidden allowed out into the community.

Little wonder then that mental health and mental disorder remain misunderstood by most, and frightening to many. Recent reports suggest that stigma is on the decline (Time to Change 2014), but progress has been slow. Despite the best efforts of soap scriptwriters, high-profile celebrities ‘coming clean’ about mental illness, and the work of mental health charities and support groups in demystifying diagnoses such as depression, we still see and hear many examples of discrimination and myth.

Given the sheer ubiquity of mental illness throughout the world, the stigma and mystery are surprising. The most recent national survey confirms the now well-known statistic that just under one in four English adults are experiencing a diagnosable mental disorder at any one time (McManus et al. 2009). Depression is identified by the World Health Organization as the world’s leading cause of years of life lost due to disability (WHO 2009).

Relatively few of those experiencing mental health problems will come to the attention of a GP, let alone a mental health professional. This is especially so in the developing world where initiatives to develop local mental health interventions are gaining considerable ground after generations of cultural stigma and ignorance (WHO 2009). But even in parts of the world where people have ready access to medical help, many suffer alone rather than face the apparent shame of experiencing mental health problems.

Perhaps part of our reluctance to accept mental illness lies with difficulties determining mental health. We are made aware of factors that determine positive mental health. Connecting with people, being active, learning new things, acts of altruism and being aware of oneself (NHS 2014) have been evidenced as ways of promoting our well-being, but mental order remains rather more loosely defined than mental disorder.

So what are the systems used to categorise and define mental illness? In the United Kingdom, mental health professionals often refer to an ICD-10 diagnosis to describe a patient’s condition. This is the World Health Organization’s (WHO) diagnostic manual, which lists all recognised (by WHO at least) diseases and disorders, including the category ‘mental and behavioural disorders’ (WHO 1992). The Diagnostic and Statistical Manual of Mental Disorders (better known as DSM-5) is more often used in the United States and elsewhere in the world (American Psychiatric Association 2013). These two manuals are intended to provide global standards for the recognition of mental health problems for both day-to-day clinical practice and clinical researchers, although the tools used by the latter group to measure symptoms often vary from place to place and can interfere with the ‘validity’ of results, or in other words the ability of one set of results to be compared with those from a different research team.

ICD-10 ‘mental and behavioural disorders’ lists 99 different types of mental health problem, each of which is further sub-divided into a variety of more precise diagnoses, ranging from the relatively common and well known (such as depression or schizophrenia) to more obscure diagnoses such as ‘specific developmental disorders of scholastic skills’.

The idea of using classification systems and labels to describe the highly complex vagaries of the human mind often meets with fierce resistance in mental health circles. The ‘medical model’ of psychiatry – diagnosis, prognosis and treatment – is essentially a means of applying the same scientific principles to the study and treatment of the mind as physical medicine applies to diseases of the body. An X-ray of the mind is impossible, a blood test will reveal nothing about how a person feels, and fitting a collection of psychiatric symptoms into a precise diagnostic category does not always yield a consistent result.

In psychiatry, symptoms often overlap with one another. For example, a person with obsessive compulsive disorder may believe that if they do not switch the lights on and off a certain number of times and in a particular order then a disaster will befall them. To most, this would appear a bizarre belief, to the extent that the inexperienced practitioner may label that person as ‘delusional’ or ‘psychotic’. Similarly, a person in the early stages of Alzheimer’s disease may often experience many of the ‘textbook’ features of clinical depression, such as low mood, poor motivation and disturbed sleep. In fact, given the tragic and predictable consequences of dementia it is unsurprising that sufferers often require treatment for depression, particularly while they retain the awareness to know that they are suffering from a degenerative condition with little or no improvement likely.

Psychiatry may often be a less-than-precise science, but the various diagnostic terms are commonplace in health and social care and have at least some descriptive power, although it is also important to remember that patients or clients may experience a complex array of feelings, experiences or ‘symptoms’ that may vary widely with the individual over time and from situation to situation.

Defining what is (or what is not) a mental health problem is really a matter of degrees. Nobody could be described as having ‘good’ mental health every minute of every day. Any football supporter will report the highs and lows encountered on an average Saturday afternoon, and can easily remember the euphoria of an important win or the despondency felt when their team is thrashed six-nil on a cold, wet Tuesday evening. But this could hardly be described as a ‘mental health problem’, and for all but the most ardent supporters their mood will have lifted within a short space of time.

However, the same person faced with redundancy, illness or the loss of a close family member might encounter something more akin to a ‘problem’. They may experience, for example, anger, low mood, tearfulness, sleep difficulties and loss of appetite. This is a quite normal reaction to stressful life events, although the nature and degree of reaction is of course dependent on a number of factors, such as the individual’s personality, the circumstances of the loss and the support available from those around them at the time. In most circumstances the bereaved person will recover after a period of time and will return to a normal way of life without the need for medical intervention of any kind. On the other hand, many people will experience mental health problems serious enough to warrant a visit to their GP.

The majority of people with mental health problems are successfully assessed and treated by GPs and other primary care professionals, such as counsellors. The Improving Access to Psychological Therapies (IAPT) programme is a now well-established approach to treating mental health problems in the community. GPs can make an IAPT referral for depressed and/or anxious patients who have debilitating mental health issues but who don’t require more specialised input from a psychiatrist or community mental health nurse. Most people receiving help for psychological problems will normally be able to carry on a reasonably normal lifestyle either during treatment or following a period of recovery. A small proportion of more severe mental health issues will necessitate referral to a Community Mental Health Team (CMHT), with a smaller still group of patients needing in-patient admission or detention under the Mental Health Act.

Mental health is a continuum at the far end of which lies what professionals refer to as severe and enduring mental illness. This is a poorly defined category, but can be said to include those who suffer from severely debilitating disorders that drastically reduce their quality of life and that may necessitate long-term support from family, carers, community care providers, supported housing agencies and charities. The severe and enduring mentally ill will usually have diagnoses of severe depression or psychotic illness, and will in most cases have some degree of contact with mental health professionals.

Revising Traditional Antibiotic Advice

What do you do when you have a cold and feel under the weather? Perhaps you decide to tough it out, and head to work as usual. You grin and bear it, because as far as you are concerned, it’s just a common cold and you can’t do anything about it.

But suppose you don’t get any better after a week, when you expected that the cold would have already run its course. You decide to stay at home to rest, and after a further two days when no improvement is seen, you go to visit the doctor.

The doctor’s advice? A course of antibiotics. Two tablets three times a day after meals, and, by the way, finish the course even when you feel better.

This is the advice that has been dispensed to patients for decades. Finish the whole prescription of antibiotics. And as patients, we put our trust in doctors, so whatever they said went. Who were we to argue with seven years of medical training?

But what would you say if this medical advice turned out to be incorrect? I know what I’d think – firstly, the sceptic in me would say medical advice is fickle and flows with what is fashionable at the time. At times, medicine seems subservient to politics and the economy. Remember the case of red wine? When the economy was flagging, a glass of red wine was said to be good for you. Yet when the NHS was under strain, this so-called health benefit was reversed.

In this day and age it is also fashionable for everyone to carve a niche for themselves, and for many the way to do so is to turn traditional advice on its head and revise or reformat existing information. And so, with these in mind, it is unsurprising that we learn of yet another study claiming that the rule that patients must finish their course of antibiotics is wrong.

The new slant on the old problem is that patients should stop taking the prescribed medication when they feel better, rather than completing the course as doctors have traditionally recommended.

The new panel of experts suggests that the long-embedded rule is incorrect, because continuing to take medication after we have started to feel better only lowers the body’s resistance in the long run. They argue that if the body already feels better, giving it medication it does not need has counter-productive effects.

This differs from the advice that doctors have traditionally given, which is based on the idea that bacteria remain in our bodies even though we feel better, and these bacteria may adapt to antibiotics if they are not fully killed off. In other words, if you have not fully killed off the bacteria, they develop tolerance and immunity to the drug which partially fended them off, and ultimately the antibiotics’ effectiveness is negated.

Imagine two medieval armies: Trojans and Greeks. One day the Trojans manage to get inside the Greek city walls and wreak havoc (according to the Greeks anyway) with their torches, spears and swords. But the Greeks have a special weapon, say for argument’s sake, an M16 with a laser sight. If the Greeks completely defeat the Trojans, the effectiveness of their weapon is guaranteed against successive waves of Trojan attacks. But if the Greek army stops to celebrate the moment the city battle swings in their favour, retreating Trojans may bring back information about the weapon and how it works, and plan successive attacks that limit the effectiveness of the weapon or destroy it completely.

Martin Llewelyn, professor in infectious diseases at Brighton and Sussex Medical School, has called for a re-examination of the traditional advice. In an analysis in the British Medical Journal, he and his colleagues say that “the idea that stopping antibiotic treatment early encourages antibiotic resistance is not supported by evidence, while taking antibiotics for longer than necessary increases the risk of resistance”.

In other words, stop taking the medicine the moment you feel better.

In the past, the theory supporting the completion of a course of antibiotics has been that too short a course would allow the bacteria causing the disease to mutate and become resistant to the drug.

For certain diseases, bacteria can clearly become resistant if the drugs are not taken for long enough to completely eradicate them. One such example of this is tuberculosis.

But a large majority of the bacteria that cause illnesses are found in the environment around us and have no impact until they get into the bloodstream or the gut. The case for stopping medication once the patient’s health improves is that the longer the bacteria within the body are exposed to antibiotics, the higher the chance of developed resistance.

The hypothesis put forth by Professor Llewelyn has not been without its backers.

Peter Openshaw, president of the British Society for Immunology, said he had always considered the notion that stopping antibiotic treatment early would make organisms more drug-resistant rather “illogical”.

He supported the idea of a more sparing use of antibiotics because the evidence of a link between long-term complete use and benefit was tenuous.

He dismissed claims that not finishing a course of antibiotics would lead to bacteria gaining antibiotic resistance, and thought the reverse was more likely to be true. “Far from being irresponsible, shortening the duration of a course of antibiotics might make antibiotic resistance less likely.”

A great British authority, Prof Harold Lambert, had made the suggestion as far back as 1999 in a Lancet article entitled “Don’t keep taking the tablets”. Even though the idea had been broached then, it had not been taken seriously, and with hindsight it is surprising that nearly two decades later the medical world has not investigated the alternatives fully and that the optimum duration of antibiotic courses or doses for many conditions remains largely uninvestigated.

Jodi Lindsay, a professor of microbial pathogenesis at St George’s, University of London, stated that the new research by Professor Llewelyn was good in principle, and that the previous advice to complete a course of antibiotics may have been based on a fear of under-treatment. Nevertheless, she cautioned against over-reaction to the findings. “The evidence for shorter courses of antibiotics being equal to longer courses, in terms of cure or outcome, is generally good, although more studies would help and there are a few exceptions when longer courses are better – for example, TB.”

To complicate matters, the ideal length of a course of antibiotics varies in individuals depending on what antibiotics they have taken in the past. Hospitalised patients can be tested to find out when the drugs can be stopped. Outside of a hospital setting, this testing is not feasible.

The World Health Organisation advice is still based on the pre-existing guidelines and has not changed.

The Royal College of GPs, however, expressed caution over the findings. “Recommended courses of antibiotics are not random,” said its chair, Prof Helen Stokes-Lampard. She further elaborated that antibiotic treatment courses were already being customised according to individual conditions, and that if patients took it upon themselves to adjust the prescribed periods, stopping when they felt better, it would be dangerous, because feeling slightly better did not necessarily demonstrate the complete eradication of the disease. Professor Stokes-Lampard also stressed that it was important for patients to have clear guidelines to adhere to, and that any adjustment using how they feel as an indicator might be confusing.

The National Institute for Health and Care Excellence is currently developing guidance for managing common infections, which will look at all available evidence on appropriate prescribing of antibiotics.

The cynics among us might ask, has such a review of current guidelines been made with the objective of cutting the cost of medical care? It is well known that the health budget is ever dwindling, and one cannot help but feel that the review of existing antibiotic guidelines has been made with an objective to save on the cost of medicine rather than to put patient health first.

The health service is currently riding the trend of developing sustainability in infrastructure and treatment, and this revision of traditional guidelines may seem to be a reframing of the evidence to suit a pre-determined outlook.

Let us return to the example of Greeks and Trojans. If the battle is raging within the Greek city walls and the tide turns against the Trojans, should the Greeks fire their ammunition at the retreating Trojans until they all fall to the ground? Ammunition in the form of gunpowder and metal casings costs money, and if the ammunition could be used sparingly, then there is more money to funnel towards other daily activities like farming and livestock. The question we are being asked to address is the equivalent of this hypothetical situation: should the Greeks keep firing their weapons until all the Trojans fall before they manage to retreat and leave the Greek city walls, or should the Greeks try to save the cost of a few rounds of ammunition if they are certain the Trojans are so heavily wounded they would never survive the escape and make it to their own city walls to pass on what they know about the secret weapon?

You may decide, as I did, that the cost of a few extra rounds of ammunition outweighs all the mental confusion of wondering “what if …?” for the next few months. “What if I didn’t take the medication long enough? What if the bacteria has mutated?”

You can see why it is easier, when it comes to health, to be cautious and not customise. Don’t experiment on the one life you’ve got!

What antibiotics in agriculture are really about

There is widespread concern over the use of antibiotics in the agricultural world and what its wider implications are. The general consensus is that the use of antibiotics in agriculture needs to be minimised dramatically by farmers, as there are fears that drug-resistant bacteria could pass up the food chain through consumption and environmental contamination.

The concerns take on many forms. Firstly, just as humans can develop resistance to medicines after prolonged use, there is the concern that long-term antibiotic use in agricultural settings may create antibiotic resistance in the animals and crops which receive these antibiotics. Secondly, even if these crops and animals do not themselves develop resistance to antibiotics, the prolonged consumption of the vegetables or the meat from these farm animals could breed resistance in the humans who consume them. There may also be other side effects we are as yet unaware of.

Antimicrobial drugs, which include antibiotics, antifungals and antiparasitic drugs, are commonly used in farming. They are used to prevent damage to crops, kill parasites and keep livestock healthy. The long-term aim of antimicrobial drugs in the context of farming is to maximise crop production and livestock farming. A field of crops lost to infestation is months of work for nothing. A farmer with a field of cows suffering from disease has lost not just capital but production possibilities as well. As with the case of mad-cow disease in the 1990s, farmers who had their cows put down not only lost the money they had invested in buying and breeding these cows, but also the income from the sale of milk and beef.

And in many cases, the losses from a brief period of crop infestation or animal disease could significantly affect a farmer’s income, or make such a dent in their livelihood that it either forces them to take on additional debt to cover the losses, or proves so insurmountable that it forces them out of business.

There might be those that argue against the use of antibiotics, but the truth is that they are necessary. They are one form of insurance for a sector that has to combat various problems, including the uncertainties of weather. When, for example, your crops – your livelihood – are subject to the whims of weather, infestation, and perhaps human vandalism and theft, you have to take steps to minimise risks on all fronts. You cannot simply leave things to chance and hope for divine favour or faith – that would merely be masking a lack of responsibility.

Pests and viruses do not restrict their infestation to selected fields. Left unchecked, they would merely spread from unprotected fields and livestock, and then infect further unprotected areas. Antibiotics are medical city walls that keep away marauding invaders, and prevent them from invading territories and conscripting the local population into their armies to do further damage.

Resistance to the antibiotics, antifungal and antiparasitical drugs used in agriculture is collectively known as antimicrobial resistance (AMR).

An independent body chaired by the British economist Jim O’Neill looked specifically at antibiotic use in the environment and agriculture. Among other things, this body examined the role that regulation and financial measures such as taxation and subsidies could play in reducing the risks associated with the agricultural use of antimicrobials and environmental contamination.

The data from the report suggests the amount of antimicrobials used in food production internationally is at least the same as that in humans, and in some places is higher. For example, in the US more than 70% of antibiotics that are medically important for humans are used in animals.

What does that all mean? It means that drugs normally intended for humans are already used in animals. If human beings consume the meat of these animals over prolonged periods, their bodies can develop tolerance to the antibiotics that were used in the animals. If human beings later have a need for these antibiotics in medicines for humans, these forms of medication will have little or no effect. And as we have seen before, ineffective long-term medication may only create addiction to drugs and pain relief medication.

The report reviewed 139 peer-reviewed research articles, of which 72% found evidence of a link between antibiotic consumption in animals and resistance in humans. That is enough impetus for policy makers to argue for a global reduction of antibiotics in food production to a more appropriate level.

But while the evidence suggests that we should reduce the usage of these antibiotics, antimicrobial usage is unfortunately likely to rise because of economic growth and increasing wealth and food consumption in the emerging world.

A considerable amount of antibiotics is used in healthy animals to prevent infection or speed up their growth. This is particularly the case in intensive farming, where animals are kept in confined conditions and an infection could easily spread between organisms. Further to this, some animals receive antibiotics so that natural limiters to growth are killed off and their growth is accelerated. If you sell meat by weight, it makes sense to try to produce as big an animal as you can so that you can maximise your profits.

The report highlighted three main risks connected with high levels of antimicrobial use in food production. The first was the concern that drug-resistant strains could be transmitted through direct contact between humans and the animals on their farms, particularly in the case of farmers. Secondly, transmission of drug-resistant strains could also result from contact during the preparation of the meat, or from its consumption. Thirdly, the excrement of the animals might contain the drug-resistant strains and the antimicrobials, and these could therefore pass into the environment.

There was also concern raised about the possibility of contaminating the natural environment. For example, if factories that manufacture these antimicrobials do not dispose of by-products properly, these may pollute the natural environment such as water sources. Already we have seen that fish near waste-treatment plants, which treated urine tinged with chemicals from birth control pills, developed abnormal characteristics and behaviour.

The review made three key recommendations for global action to reduce the risks described. The first was that there should be a global target for the minimisation of antibiotic use in food production to a recognised and acceptable level in livestock and fish. There were also recommendations that restrictions be placed on the use of antibiotics in the animals that are heavily consumed by humans.

Currently there are no guidelines surrounding the disposal of antimicrobial manufacturing waste into the environment and the report urged the quick establishment of these in order that pollution of the environment could be minimised and the disposal of by-products and active ingredients be regulated.

The report also urged more monitoring of these problematic areas in line with agreed global targets, because legislation without a means of enforcement is useless.

Is it possible that the production and use of antimicrobials can be limited? One cannot help but be cynical. As long as we inhabit a world where sales drive rewards, it is inconceivable that farmers would cut back their usage on their own initiative. We would definitely need legislation and some form of method to ensure compliance.

But what form of legislation should we have? Should we focus on imposing penalties for non-compliance or incentives to encourage the reduced use of antimicrobials?

Some may argue that the latter is more effective in this case. If farmers are offered financial subsidies so that they receive more money for the price of meat, for example, they would be more inclined to reduce their usage of antimicrobials. But how would this be monitored? Could the meat for sale be tested to ensure the density of antimicrobials falls under established guidelines, for example, so that if the farmer has been relying on antibiotics to increase the size of livestock, he is then recompensed for the reduction in size arising from the reduction in antibiotics?

Unfortunately the difficulty is in reconciling the need for growth, and the economic system established around it, on the one hand, with sustainability on the other. How is farm produce sold? When you buy a bag of salad, a cut of meat, or a bottle of milk, all this is sold by weight or volume. You may buy eggs in cartons of six, but they are also graded by size and weight. For the direct producer – the farmer – size, volume and growth are what bring about greater profits, although these profits may be barely above the threshold for subsistence. And after making allowances for damage due to weather, theft, low market demand and all the other variables that threaten an already low-profit industry, asking a farmer to reduce the use of antimicrobials is akin to asking him not to take measures to protect his livelihood. If the use of antimicrobials bothers you, then you have to compensate the farmer not to use them, by being willing to pay higher prices for farm products.

Why do organic or free range eggs cost twice the price for half the size? Aha!

While antimicrobials are also used on free range produce, and the case of organic farming is not entirely relevant here, the same issue applies. You are paying more for the process than the product, and in doing so the extra payment you make goes to the farmer for the farming practices you are seeking to promote.

A farmer can get more produce by rearing battery hens, but if you are concerned about animal welfare, you pay extra per animal for the farmer to rear it with more space and hence more welfare for the animal. Your free range chicken costs more not because it is bigger, or necessarily healthier, but because it has been afforded more space, which you consider to be ethical. Farmers may switch to organic farming if there is enough demand for it, and for some this may even be more favourable, because rearing fewer hens while earning the same return as from battery hens may, in the grand scheme of things, be seen by the farmer as a more favourable solution.

In trying to promote less use of antimicrobials, we have to make up the farmer’s perceived loss of earnings. So it is not incorrect to say that if we are concerned about the use of antimicrobials in agriculture, we have to pay more for our farm produce. Are you prepared to do that? For families with high disposable income, the increase may only represent a small additional fraction. But for families on smaller incomes, the increase may be too steep to be feasible. In other words, while the need for a reduction in agricultural antibiotics is recognised, in practical terms it may only remain an aspirational ideal except to those who can afford it.

Can people be convinced – even if the cost is high – that in the long term it is better for human health? If the continued use of antimicrobials means that human medication in the future may become less effective as our resistance is tempered, should we, despite our reservations about the cost, make the leap towards maintaining a sustainable future? And if low-income families cannot afford to pay more in the cost of their weekly shop to get less – ridiculous as it might sound – should higher income earners step in to fill the shortfall?

It is strange how the wider discussion about the use of antimicrobials in society leads to a discussion about income distribution and political sensitivities.

What has arisen in the course of that evaluation, however, is the fact that expecting citizens alone to make up the production shortfall arising from a reduced use of antimicrobials, by paying more for their farm produce, is not going to work. While some can afford to, many cannot, and those that can may not necessarily want to pay for those that cannot. There are, however, other measures to reduce the use of antimicrobials.

Governments could also introduce legislation to prevent environmental contamination by antimicrobial products and by-products, with harsh penalties for breaches. At the moment there are no rules in place, so it is increasingly important that such legislation is developed quickly.

Governments could also offer tax subsidies and support for farmers who continue to reduce antimicrobial usage. These could be introduced at the end of the first year, when farmers need the most support in the initial stages of conversion, then at thirty months, and at further, longer-spaced periods. Subsidies or incentives could follow an arithmetic progression, paid at the end of one year, two-and-a-half years, four-and-a-half years, seven years and so on (see the sketch below), so there is a continued incentive to maintain reduced antimicrobial usage.
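To make that schedule concrete, here is a minimal sketch, assuming (as the listed dates suggest) that it is the gaps between payments, rather than the payment dates themselves, that grow by a fixed half-year each time:

\[ t_1 = 1, \qquad t_{k+1} = t_k + \left(1 + \tfrac{k}{2}\right) \text{ years}, \]

which gives payment points at 1, 2.5, 4.5, 7, 10, … years; the intervals 1.5, 2, 2.5, 3, … form the arithmetic progression, so the farmer must sustain the reduction for progressively longer to earn each further instalment.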

The only problem is, where would the money for these subsidies come from? If the government receives less tax from farm produce transactions because less has been sold, and it has also received less from antimicrobial companies in the form of tax, because it has made them limit their production, where will it make up the shortfall? Through an environment tax on its citizens?

Therein lies the problem.

The conundrum is this: the threat of antibiotic resistance in the future means we have to lower the level of antimicrobials we currently use. Yet if we do so, we are looking at reduced economic output. And as long as we have an economic system that is reliant on growth and increased production, asking to slow down production is economic suicide.

You may ask: “What if we re-evaluated the economic system and created one based on sustainability?”

I am sorry to say it but that is wishful, idealistic thinking.

The problem with switching to a sustainability-based economy can be described as follows.

Imagine there is a children’s party. At this party there is a table with a gigantic bowl of sweets. The children who are first to arrive eagerly stuff their faces and pockets with sweets, and as the party progresses, the bowl gradually looks emptier and emptier. The parents present chastise their kids if they continue to head for the sweet bowl, remonstrating with them to leave some for the kids who have not yet arrived at the party. Some of these children, perhaps the older ones, might reduce their trips to the bowl and the number of sweets they take. But some children will continue to plunder the bowl of its sweets before they all run out and stuff their faces, recognising the sweets are a dwindling resource and if they want to eat them they’d best take as many as they can. And a third group, while recognising the sweets will soon run out, are equally keen to get hold of as many as they can, not to eat the sweets, but because they realise that when one of the latecomers arrives and finds there are no sweets left, their parents may offer them incentives to trade to appease the desperate child. “Charlie didn’t get many sweets because he was late. If you let Charlie have two of the sweets you already have, I’ll buy you an ice-cream later.” This third group recognises not just the impending scarcity, but contributes to it by stockpiling their own resources to use for later leverage. And they may even make the loudest noises about how everyone should stop taking sweets, only so that they can make the biggest grabs when no one is looking.

Who are the losers in this situation? The obvious ones are those who arrived late at the party. But the not so obvious losers are the ones from the first group, who amended their behaviour to ensure that there were still sweets left for the later groups to come. In being principled and holding on to ideals, they became worse off materially, and the only consolation was the knowledge that they had made the effort to leave some sweets for the late group – whether the latecomers actually got any or not is another question. The sweets ran out eventually.

The problem with thinking about sustainable economic measures is that the first to make an attempt to switch on ethical or aspirational grounds will be among the ones to lose out, because subsequent groups will still make a grab for whatever is left. Some will make a grab to get as much of the remaining resource as they can, while others will make a grab so that when there is scarcity – and scarcity drives up prices – they have plenty of the resource to benefit from. So while everyone is making the right noises about economic sustainability, everyone is really just holding back, waiting for someone else to make the first move.

So this is what antibiotics in agriculture really tell you: too much can create problems later, through antibiotic resistance and improper disposal. We need to cut down on the use of antimicrobials. But reduced antimicrobials means reduced output, and for that to work we must be prepared to pay higher prices for less produce to compensate the farmer, in order that they may earn a living. The government can introduce penalties to govern the disposal of antimicrobial-related products to limit the damage to the environment, alongside incentives to limit the use of antimicrobials. But it will have problems funding the incentives. Because what it is proposing is economic slowdown, in order to have an economy at all in later generations – but the current generations are too concerned with their own interests and survival, stealthily making a grab for the remnants after the first few leave the economic arena.

The need for cautious antibiotic usage

Antibiotics are medicines which are used to treat forms of bacterial infection or prevent their spread. As the name “antibiotics” suggests, they act against bacteria, working by killing them or preventing them from reproducing and spreading.

That all sounds impressive. But unfortunately antibiotics don’t work for everything. For example, antibiotics don’t work for viral infections such as colds and flu, and most coughs and sore throats. Someone suffering from these infections usually gets better without the use of antibiotics. The use of antibiotics to treat these is actually counter-productive, as taking antibiotics when you don’t need them encourages dangerous bacteria that live inside you to become resistant. Over time, this will mean that when you require the help of antibiotics most, they may not work for you, as you may have actually been encouraging the tolerance of bacteria by suppressing your body’s ability to fight them.

So don’t use antibiotics for common ailments that can get better on their own. In these situations, what you need is pain relief, and there are many options to choose from. However, antibiotics may be used to treat bacterial infections in cases such as when bacteria could infect others unless treated or infections are not likely to clear up without antibiotics. In other words, if there is further risk of infection to others, or complications which may arise from a lack of treatment, then a course of antibiotics is best followed.

The doses of antibiotics vary but if you are prescribed a course, then take the antibiotics as directed on the packet or the patient information leaflet that comes with the medication. If in doubt then seek advice from the pharmacist.

Antibiotics can be administered in various ways. The most common antibiotics are oral ones, in the form of capsules, tablets or liquid, which are commonly used to treat mild to moderate infections. There are also topical antibiotics – creams, lotions, sprays or drops – which are often administered for skin infections.

Topical and oral antibiotics are for less-serious infections. More serious infections, where the medicine has to be absorbed more quickly into the bloodstream, have to be treated by antibiotics administered through injection or drip infusion.

It is essential to finish taking a prescribed course of antibiotics, even if you feel better before the course has ended. The prescribed course is based on the estimated time it will take to completely kill off the bacteria. Midway through a course, you may have killed off enough bacteria no longer to be under the effect of the infection, but stopping the course of antibiotics then can allow the remaining bacteria to become resistant to the antibiotic.

But what if you miss a dose of antibiotics? If that happens, it is advisable to take that dose as soon as you remember and then continue to take your course of antibiotics as normal. However, if you have missed a dose and only remember it when it is nearly time for the next dose, it is preferable simply to skip the missed dose and continue your regular dosing schedule. Taking a double dose only encourages the body to anticipate needing the double dosage in order to fight the infection, and upsets the body’s resistance levels.

Furthermore, there is a higher risk of side effects if you take two doses closer together than recommended. You may experience effects such as pain in your stomach, diarrhoea, and feeling or being sick. Most side effects are gastrointestinal, and overdosing on antibiotics may cause bloating, indigestion and diarrhoea.

Some people may have an allergic reaction to antibiotics, especially penicillins and a group called cephalosporins. In very rare cases, this can lead to a serious allergic reaction (anaphylaxis), which is a medical emergency. Sufferers carry an adrenaline auto-injector, such as an EpiPen, so that the drug can be administered quickly by injection.

Antibiotics are not over-the-counter medicines, and you should never use tablets left over from someone else’s incomplete course, as you may experience a different reaction to the drug. Some antibiotics are also not suitable for people with certain medical conditions, or for women who are pregnant or breastfeeding, as they may, for example, adversely affect the lining of the stomach. You should only ever take antibiotics prescribed for you, and you should never pass them on to someone else.

Antibiotics are still chemicals and, depending on their constituents, some can react unpredictably with other medications, such as the oral contraceptive pill, and with alcohol. It’s important to read the information leaflet that comes with your medication carefully and discuss any concerns with your pharmacist or GP.

There are hundreds of different types of antibiotics, but most of them can be broadly classified into six groups. These are outlined below.

Penicillins (such as penicillin and amoxicillin) – widely used to treat a variety of infections, including skin infections, chest infections and urinary tract infections

Cephalosporins (such as cephalexin) – used to treat a wide range of infections, but some are also effective for treating more serious infections, such as septicaemia and meningitis

Aminoglycosides (such as gentamicin and tobramycin) – tend to only be used in hospital to treat very serious illnesses such as septicaemia, as they can cause serious side effects, including hearing loss and kidney damage; they’re usually given by injection, but may be given as drops for some ear or eye infections

Tetracyclines (such as tetracycline and doxycycline) – can be used to treat a wide range of infections, but are commonly used to treat moderate to severe acne and rosacea

Macrolides (such as erythromycin and clarithromycin) – can be particularly useful for treating lung and chest infections, or an alternative for people with a penicillin allergy, or to treat penicillin-resistant strains of bacteria

Fluoroquinolones (such as ciprofloxacin and levofloxacin) – broad-spectrum antibiotics that can be used to treat a wide range of infections

The use of antibiotics, especially for conditions that aren’t serious, has led to a rise in the number of highly resistant infections, or superbugs. These superbugs have a high tolerance to many antibiotics and include:

methicillin-resistant Staphylococcus aureus (MRSA)
Clostridium difficile (C. diff)
the bacteria that cause multi-drug-resistant tuberculosis (MDR-TB)
carbapenemase-producing Enterobacteriaceae (CPE)

Ridding the world of these types of infections can be challenging, and these superbugs are becoming an increasing cause of disability and death across the world. The biggest worry is that new strains of bacteria may emerge with higher levels of resistance that can’t be effectively treated by any existing antibiotics. So we have to be wary in how we use them and, when we suffer from minor infections, let the body try to fight off the infection instead of relying on antibiotics, which may weaken the body’s immunity in the long run.