A short history of non-medical prescribing

It had long been recognised that nurses spent a significant amount of time visiting general practitioner (GP) surgeries and/or waiting to see the doctor in order to get a prescription for their patients. Although this practice produced the desired result of a prescription being written, it was not an efficient use of either the nurses’ or the GPs’ time. Furthermore, it was an equally inefficient use of their skills, exacerbated by the fact that the nurse had usually already assessed and diagnosed the patient and decided on an appropriate treatment plan.

The situation was formally acknowledged in the Cumberlege Report (Department of Health and Social Security 1986), which initiated the call for nurse prescribing and recommended that community nurses should be able to prescribe from a limited list, or formulary. Progress was somewhat measured, but The Crown Report of 1989 (Department of Health (DH) 1989) considered the implications of nurse prescribing and recommended that suitably qualified registered nurses (district nurses (DNs) or health visitors (HVs)) should be authorised to prescribe from a limited list, namely the nurse prescribers’ formulary (NPF). Although a case for nurse prescribing had been established, progress relied on legislative changes to permit nurses to prescribe.

Progress continued to be cautious, with the decision made to pilot nurse prescribing in eight demonstration sites in eight NHS regions. In 1999, The Crown Report II (DH 1999) reviewed the prescribing, supply and administration of medicines more widely and, in recognition of the success of the nurse prescribing pilots, recommended that prescribing rights be extended to include other groups of nurses and health professionals. By 2001, DNs and HVs had completed education programmes through which they gained V100 prescribing status, enabling them to prescribe from the NPF. The progress being made in prescribing reflected the reforms highlighted in The NHS Plan (DH 2000), which called for changes in the delivery of healthcare throughout the NHS, with nurses, pharmacists and allied health professionals being among those professionals vital to its success.

The publication of Investment and Reform for NHS Staff – Taking Forward the NHS Plan (DH 2001) stated clearly that working in new ways was essential to the successful delivery of the changes. One of these new ways of working was to give specified health professionals the authority to prescribe, building on the original proposals of The Crown Report II (DH 1999). Indeed, The NHS Plan (DH 2000) endorsed this recommendation and envisaged that, by 2004, most nurses should be able to prescribe medicines (either independently or as supplementary prescribers) or supply medicines under patient group directions (PGDs) (DH 2004). After consultation in 2000 on the potential to extend nurse prescribing, changes were made through the Health and Social Care Act 2001.

The then Health Minister, Lord Philip Hunt, provided detail when he announced that nurse prescribing was to include further groups of nurses. He also detailed that the NPF was to be extended to enable independent nurse prescribers to prescribe all general sales list and pharmacy medicines prescribable by doctors under the NHS, together with a list of prescription-only medicines (POMs) for specified medical conditions within the areas of minor illness, minor injury, health promotion and palliative care. In November 2002, Lord Hunt announced proposals concerning ‘supplementary’ prescribing (DH 2002).

The proposals were to enable nurses and pharmacists to prescribe for chronic illness management using clinical management plans. The success of these developments prompted further regulation changes, enabling specified allied health professionals to train and qualify as supplementary prescribers (DH 2005). From May 2006, the nurse prescribers’ extended formulary was discontinued, and qualified nurse independent prescribers (formerly known as extended formulary nurse prescribers) were able to prescribe any licensed medicine for any medical condition within their competence, including some controlled drugs.

Further legislative changes allowed pharmacists to train as independent prescribers (DH 2006), with optometrists gaining independent prescribing rights in 2007. The momentum of non-medical prescribing continued: 2009 saw a scoping project on allied health professional prescribing, which recommended extending prescribing to other professional groups within the allied health professions and introducing independent prescribing for existing allied health professional supplementary prescribing groups, particularly physiotherapists and podiatrists (DH 2009).

In 2013, legislative changes enabled independent prescribing for physiotherapists and podiatrists. As the benefits of non-medical prescribing are demonstrated in the everyday practice of different professional groups, the potential to expand this continues, with consultation currently under way to consider the potential for enabling other disciplines to prescribe.

The bigger issues that come with preventing hearing loss

Is there cause for optimism when it comes to preventing hearing loss? Certainly the latest research suggests that if the positive effects seen in mice could be reproduced in humans and maintained over the long term, then hereditary hearing loss could become a thing of the past.

It is often assumed that hearing loss is simply down to old age. The commonly held view is that as people grow older, their muscles and body functions deteriorate with time to the point that function is impaired and eventually lost. But hearing loss is not necessarily down to age, although there are cases where constant exposure to loud noise over time causes reduced sensitivity to aural stimuli. Over half of hearing loss cases are actually due to faulty genes inherited from parents.

How do we hear? Hair cells in the part of the inner ear called the cochlea respond to vibrations, and these signals are sent to the brain to interpret. The brain processes these signals in terms of frequency, duration and timbre in order to translate them into sounds we recognise.

For example, if we hear a shrill, high-frequency sound of short duration, our brain interprets these characteristics, runs through a database of sounds, an audio library in the brain, and may suggest that the sound has come from a whistle and signifies a call for attention.

What happens when you carry a gene for hereditary hearing loss? The hair cells of the inner ear do not grow back, and consequently sound vibrations from external stimuli are not passed on to the brain.

With progressive hearing loss, too, the characteristics of sound become distorted. We may hear sounds differently from how they are produced, thereby misinterpreting their meaning. Sounds of higher and lower frequencies may be less audible too.

How does that cause a problem? Imagine an alarm. It is set on a high frequency so that it attracts attention. If your ability to hear high frequencies is gradually dulled then you may not be able to detect the sound of an alarm going off.

As hearing gradually deteriorates, the timbre of a sound changes. Sharper sounds become duller, and in the case of the alarm, you may hear it, but it may sound more muted and the brain may not be able to recognise that it is an alarm being heard.

Another problem with hearing loss is the loss of perception of volume. You may be crossing the road when a car sounds its horn because you have suddenly encroached into its path. But if you cannot hear that the volume is loud, you may perceive the car to be far away and not realise you are in danger.

The loss of the hairs in the inner ear is a cause of deafness in humans, particularly those for whom hearing loss is genetic. Humans suffering from hereditary hearing loss lose the hairs of the inner ear, which results in the difficulties mentioned above. But there is hope. In a research experiment, scientists successfully delayed the loss of the inner-ear hairs in mice using a technique that edited away the genetic mutation that causes the loss of the hairs in the cochlea.

Mice were bred with the faulty gene that causes hearing loss. Then, using a technology known as CRISPR, the faulty gene was replaced with a healthy, normal one. After about eight weeks, the hairs in the inner ears of mice with a genetic predisposition to hearing loss flourished, compared with similar mice that had not been treated. The genetic editing technique had removed the faulty gene that caused hearing loss. The treated mice were assessed for responsiveness to stimuli and showed positive gains.

We could be optimistic about the results, but it is important to stress the need for caution.

Firstly, the research was conducted on mice, not humans. Experiments that have been successful in animals have not necessarily had similar success when tried in humans.

Secondly, while the benefits in mice were seen within eight weeks, the effects may take longer in humans, if they appear at all.

Thirdly, we should remember that the experiment worked for mice which had the genetic mutation that would eventually cause deafness. In other words, they had their hearing at birth but were susceptible to losing it. The technique prevented degeneration in hearing, but it would not help mice that were deaf at birth gain hearing they never had.

All research carries ethical issues, and this study was no different. Firstly, there is the recurring question of whether animals should ever be used for research. Should mice be bred for the purposes of research? Are all the mice used? Are they accounted for? Is there someone from Health and Safety going around with a clipboard accounting for the mice? And what happens to the mice when the research has ceased? Are they put down, or released into the ecosystem? “Don’t be silly,” I hear you say, “it’s only mice.” That’s the problem. The devaluation of life, despite the fact that it belongs to another, is what eventually leads to a disregard for other life and human life in general. Would research scientists, in the quest for answers, eventually take to conducting research on beggars, those who sleep rough, or criminals? Would they experiment on orphans or unwanted babies?

The second issue, when it comes to genetics, is whether genetic experimentation furthers good or promotes misuse. The answer, I suppose, is that the knowledge empowers, but one cannot govern its control. The knowledge that genetic mutations can be edited away is good news, perhaps, because it means we could remove disabilities or life-threatening diseases from the outset. On the other hand, it may promote the rise of designer babies, where mothers genetically select features such as blue eyes for their unborn child to enhance their features from birth, and this would promote misuse in the medical community.

Would the use of what is probably best termed genetic surgery be more prominent in the future? One can only suppose so. Once procedures become more widespread, it is reasonable to conclude that more such surgeons will become available, catering for the rich and famous. It may be possible to delay the ageing process by genetic surgery, perhaps by removing the gene that causes skin to age, instead of using botox and other external surgical procedures.

Would such genetic surgery ever be available on the NHS? For example, if the cancer gene were identified and could be genetically snipped off, would patients request this instead of tablets and other external surgical processes? One way of looking at it is that the NHS is so cash-strapped that, under QALY rules, where the cost of a procedure is weighed against the number of quality life years it adds, genetic surgery would be limited to the more serious illnesses, and certainly not offered for those further down the rung. But perhaps for younger individuals suffering from serious illnesses, such as depression, a lifetime’s cost of anti-depressants, anti-psychotics or antibiotics may far outweigh the cost of a one-off surgical procedure. If you could pinpoint a gene that causes a specific pain response, you might alter it to the point where you no longer need aspirin, too much of which causes bleeds. And if you could genetically locate what causes dementia in another person, would you not be considered unethical if you let the gene remain, thereby denying them the chance to live a quality life in their latter years?
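To make the QALY arithmetic concrete, here is a minimal sketch in Python. Every figure in it is an invented assumption for illustration (NICE’s threshold is commonly cited at around £20,000 to £30,000 per QALY, and real procedure and drug costs would differ):

```python
# Illustrative QALY comparison: all figures below are invented assumptions,
# not real costs or outcomes.

def cost_per_qaly(total_cost: float, qalys_gained: float) -> float:
    """Cost-effectiveness expressed as cost per quality-adjusted life year."""
    return total_cost / qalys_gained

# Hypothetical one-off genetic procedure for a younger patient.
procedure_cost = 40_000.0      # one-off cost (assumed)
procedure_qalys = 25.0         # quality life years gained (assumed)

# Hypothetical lifelong medication for the same condition.
annual_drug_cost = 1_500.0     # per year (assumed)
years_of_treatment = 50        # (assumed)
medication_qalys = 20.0        # (assumed)
medication_cost = annual_drug_cost * years_of_treatment

print(f"Procedure:  £{cost_per_qaly(procedure_cost, procedure_qalys):,.0f} per QALY")
print(f"Medication: £{cost_per_qaly(medication_cost, medication_qalys):,.0f} per QALY")
```

With these made-up numbers, the one-off procedure works out at £1,600 per QALY against £3,750 per QALY for lifelong medication: the younger the patient, the more quality life years a one-off fix can accumulate, which is the intuition behind the argument above.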

Genetic editing may be a new technique for the moment, but if there is sufficient investment in infrastructure and the corpus of genetic surgery information widens, don’t be surprised if we start seeing more of it in the next century. The cost of lifelong medication and its side effects may outweigh the cost of genetic editing, which may prove to be not just more sustainable for the environment but more agreeable to the limited NHS budget.

Most of us won’t be around by then, of course. That is unless we’ve managed to remove the sickness and death genes.

Physical and Mental Contamination

Is there a need to start worrying about your kitchen? I don’t mean in the home improvement context: never mind that the island unit is looking a bit worse for wear, or that your swanky appliances need upgrading so you can have two ovens to cook for your little army; maybe you are thinking you could expand beyond the microwave and gas cooker, or are considering an open-plan kitchen. Whatever physical changes you are considering, they are beyond the scope of this discussion. Danger lurks in your kitchen.

It doesn’t come in the form of masked strangers brandishing kitchen implements, or ruthless critics in the form of master chefs or children. No, the hidden danger in your kitchen is more subtle, softer, yet potentially more lethal.

The kitchen sponge.

Scientists estimate that the kitchen sponge contains a higher concentration of bacteria than anywhere else in the house. On the face of things, this is not an unrealistic statement. The kitchen sponge is in contact with remnants of food as it passes over the crockery and cutlery, and while the latter come away clean, microscopic elements of food are merely transferred to the sponge. And even if you make the effort to rinse out the sponge, or go a step further and microwave it, traces of food bacteria will remain.

According to the Mail Online, one of the more sensationalist newspapers in the United Kingdom, there are 54 billion cells of bacteria residing on the humble sponge. But of course the Mail Online would say that – it is taking a simple fact and blowing it out of proportion in order to create a purchasing headline. (And what is a purchasing headline? It is one that intrigues you enough to make a purchase to discover more, or hooks you in enough to commit your time to reading more, never mind that the headline was slightly manipulated in the first place.) The fact is, bacteria exist all around us. They are on the surfaces of things around us. But it is important to distinguish between good and bad bacteria. The majority of bacteria around us are harmless. The remainder can do us harm if they enter our bodies, which is why it is a good idea to wash our hands before eating. This ensures the harmful bacteria on our hands, picked up from door knobs, taps or other contaminated surfaces, do not end up on food that we ingest. It is also a good idea to cover up exposed cuts so that bacteria do not enter the bloodstream.

Bacteria are all around us, but we can’t live life in fear of them.

Can you imagine if the word ‘bacteria’ were substituted with the word ‘humans’? It would give a better perspective.

The headline would read something like “A higher concentration of criminals found in [name of city]”. But you can’t live as if every human in that city will do you harm. You can only take the necessary steps to avoid being negatively affected.

Just like bacteria.

The current guidelines around hand washing recommend that we wash our hands with soap and water for at least 20 seconds after instances such as using the toilet or handling raw food like meat and vegetables. It is also advised that we wash our hands before eating and after contact with animals.

Does washing with specialist soaps make any difference? A study by Rutgers University and GOJO Industries in the US found that there was little difference, which suggests the science separating Brand X from Brand Y is as manufactured as the products themselves.

The study involved twenty volunteers and examined variables of hand washing such as soap brand, volume and washing time. A non-harmful strain of the bacterium E. coli was placed on the hands of the volunteers, which were then examined after washing to see how much remained.

The study found that there was little to distinguish normal soaps from anti-microbial formulations. In fact, as long as volunteers washed their hands with soap for thirty seconds, the differences in results after washing were negligible.

There were a couple of minor limitations to the study conducted by Rutgers and GOJO Industries.

Firstly, the sample size was small. Secondly, volunteers could not ethically be asked to handle deadly bacteria, so the results may only be applicable to that particular strain of bacteria.

There was a major stumbling block to the research, however: GOJO Industries manufactures hand soaps.

We have already examined in the past why it is not a good idea for pharmaceutical companies to run their own tests: the authenticity of the results cannot be completely guaranteed if there is bias from the outset. If a pharmaceutical company, or any other manufacturing company, is going to invest time, money and effort into production, it is going to favour results with a positive bias over those with a negative one, which would either force further research, impacting on production time and costs, or cause the complete abandonment of the work.

Is there anything we can trust any more? The dilemma we face is that the media distorts reporting, and research is funded with an agenda that produces an expected outcome. It is difficult to secure funding for research if there is no meaningful purpose behind it beyond sales.

Returning to the original issue of bacteria: as long as we take the necessary precautions, that is the best we can do. These precautions include replacing the sponge regularly, not leaving unwashed dishes in the kitchen, and washing our hands to avoid contamination.

And take in what you read and hear about health and news with a pinch of objectivity. Avoid contaminating your mind too!

An overview of mental health

Mental illness continues to be one of the most misunderstood, mythologised and controversial of issues. Described for as long as human beings have been able to record thoughts and behaviours, it is at once a medical, social and at times political issue. It can lead to detention against one’s will and has its very own Act of Parliament, and yet we really know very little about it.

Societies through the ages have responded to this mystery by locking up people whose sometimes bizarre behaviour was deemed dangerous, unsuitable or just plain scandalous. Only within the relatively recent past have the tall, thick walls of the asylum been dismantled and those who remained institutionalised and hidden been allowed out into the community.

Little wonder, then, that mental health and mental disorder remain misunderstood by most, and frightening to many. Recent reports suggest that stigma is on the decline (Time to Change 2014) but progress has been slow. Despite the best efforts of soap scriptwriters, high-profile celebrities ‘coming clean’ about mental illness, and the work of mental health charities and support groups in demystifying diagnoses such as depression, we still see and hear many examples of discrimination and myth.

Given the sheer ubiquity of mental illness throughout the world, the stigma and mystery are surprising. The most recent national survey confirms the now well-known statistic that just under one in four English adults are experiencing a diagnosable mental disorder at any one time (McManus et al. 2009). Depression is identified by the World Health Organization as the world’s leading cause of years of life lost due to disability (WHO 2009).

Relatively few of those experiencing mental health problems will come to the attention of a GP, let alone a mental health professional. This is especially so in the developing world where initiatives to develop local mental health interventions are gaining considerable ground after generations of cultural stigma and ignorance (WHO 2009). But even in parts of the world where people have ready access to medical help, many suffer alone rather than face the apparent shame of experiencing mental health problems.

Perhaps part of our reluctance to accept mental illness lies with difficulties determining mental health. We are made aware of factors that determine positive mental health. Connecting with people, being active, learning new things, acts of altruism and being aware of oneself (NHS 2014) have been evidenced as ways of promoting our well-being, but mental order remains rather more loosely defined than mental disorder.

So what are the systems used to categorise and define mental illness? In the United Kingdom, mental health professionals often use an ICD-10 diagnosis to refer to a patient’s condition. This is the World Health Organization’s (WHO) diagnostic manual, which lists all recognised (by WHO at least) diseases and disorders, including the category ‘mental and behavioural disorders’ (WHO 1992). The Diagnostic and Statistical Manual of Mental Disorders (better known as DSM-5) is more often used in the United States and elsewhere in the world (American Psychiatric Association 2013). These two manuals are intended to provide global standards for the recognition of mental health problems, both for day-to-day clinical practice and for clinical researchers, although the tools used by the latter group to measure symptoms often vary from place to place and can interfere with the ‘validity’ of results, or in other words the ability of one set of results to be compared with those from a different research team.

ICD-10 ‘mental and behavioural disorders’ lists 99 different types of mental health problem, each of which is further sub-divided into a variety of more precise diagnoses, ranging from the relatively common and well known (such as depression or schizophrenia) to more obscure diagnoses such as ‘specific developmental disorders of scholastic skills’.

The idea of using classification systems and labels to describe the highly complex vagaries of the human mind often meets with fierce resistance in mental health circles. The ‘medical model’ of psychiatry – diagnosis, prognosis and treatment – is essentially a means of applying the same scientific principles to the study and treatment of the mind as physical medicine applies to diseases of the body. An X-ray of the mind is impossible, a blood test will reveal nothing about how a person feels, and fitting a collection of psychiatric symptoms into a precise diagnostic category does not always yield a consistent result.

In psychiatry, symptoms often overlap with one another. For example, a person with obsessive compulsive disorder may believe that if they do not switch the lights on and off a certain number of times and in a particular order then a disaster will befall them. To most, this would appear a bizarre belief, to the extent that the inexperienced practitioner may label that person as ‘delusional’ or ‘psychotic’. Similarly, a person in the early stages of Alzheimer’s disease may often experience many of the ‘textbook’ features of clinical depression, such as low mood, poor motivation and disturbed sleep. In fact, given the tragic and predictable consequences of dementia it is unsurprising that sufferers often require treatment for depression, particularly while they retain the awareness to know that they are suffering from a degenerative condition with little or no improvement likely.

Psychiatry may often be a less-than-precise science, but the various diagnostic terms are commonplace in health and social care and have at least some descriptive power. It is also important to remember, though, that patients or clients may experience a complex array of feelings, experiences or ‘symptoms’ that vary widely with the individual, over time and from situation to situation.

Defining what is (or what is not) a mental health problem is really a matter of degrees. Nobody could be described as having ‘good’ mental health every minute of every day. Any football supporter will report the highs and lows encountered on an average Saturday afternoon, and can easily remember the euphoria of an important win or the despondency felt when their team is thrashed six-nil on a cold, wet Tuesday evening. But this could hardly be described as a ‘mental health problem’, and for all but the most ardent supporters their mood will have lifted within a short space of time.

However, the same person faced with redundancy, illness or the loss of a close family member might encounter something more akin to a ‘problem’. They may experience, for example, anger, low mood, tearfulness, sleep difficulties and loss of appetite. This is a quite normal reaction to stressful life events, although the nature and degree of reaction is of course dependent on a number of factors, such as the individual’s personality, the circumstances of the loss and the support available from those around them at the time. In most circumstances the bereaved person will recover after a period of time and will return to a normal way of life without the need for medical intervention of any kind. On the other hand, many people will experience mental health problems serious enough to warrant a visit to their GP.

The majority of people with mental health problems are successfully assessed and treated by GPs and other primary care professionals, such as counsellors. The Improving Access to Psychological Therapies (IAPT) programme is a now well-established approach to treating mental health problems in the community. GPs can make an IAPT referral for depressed and/or anxious patients who have debilitating mental health issues but who don’t require more specialised input from a psychiatrist or community mental health nurse. Most people receiving help for psychological problems will normally be able to carry on a reasonably normal lifestyle either during treatment or following a period of recovery. A small proportion of more severe mental health issues will necessitate referral to a Community Mental Health Team (CMHT), with a smaller still group of patients needing in-patient admission or detention under the Mental Health Act.

Mental health is a continuum at the far end of which lies what professionals refer to as severe and enduring mental illness. This is a poorly defined category, but can be said to include those who suffer from severely debilitating disorders that drastically reduce their quality of life and that may necessitate long-term support from family, carers, community care providers, supported housing agencies and charities. The severe and enduring mentally ill will usually have diagnoses of severe depression or psychotic illness, and will in most cases have some degree of contact with mental health professionals.

Revising Traditional Antibiotic Advice

What do you do when you have a cold and feel under the weather? Perhaps you decide to tough it out, and head to work as usual. You grin and bear it, because as far as you are concerned, it’s just a common cold and you can’t do anything about it.

But suppose you don’t get any better after a week, when you expected that the cold would have already run its course. You decide to stay at home to rest, and after a further two days when no improvement is seen, you go to visit the doctor.

The doctor’s advice? A course of antibiotics. Two tablets three times a day after meals and, by the way, be sure to finish the course even when you feel better.

This is the advice that has been dispensed to patients for decades. Finish the whole prescription of antibiotics. And as patients, we put our trust in doctors, so whatever they said went. Who were we to argue with seven years of medical training?

But what would you say if this medical advice turned out to be incorrect? I know what I’d think – firstly the sceptic in me would say medical advice is fickle and flows with what is fashionable at the time. At times, medicine seems also subservient to politics and economy. Remember the case with red wine? When the economy was flagging, a glass of red wine was said to be good for you. Yet when the NHS was under strain this so-called health benefit was reversed.

In this day and age it is also fashionable for everyone to carve a niche for themselves, and for many the way to do so is to turn traditional advice on its head and revise or reformat existing information. And so, with this in mind, it is unsurprising that we learn of yet another study claiming that the rule that patients must finish a course of antibiotics is wrong.

The new slant on the old problem is that patients should stop taking the prescribed medication when they feel better, rather than when the course runs out, as doctors previously recommended.

The new panel of experts suggests that the long-embedded rule is incorrect, because continuing to take medication after we feel better only increases the risk of antibiotic resistance in the long run. They argue that if the body already feels better, giving it medication it does not need is counter-productive.

This differs from the advice that doctors have traditionally given, which is based on the idea that bacteria remain in our bodies even when we feel better, and that these bacteria may adapt to antibiotics if they are not fully killed off. In other words, if you have not fully killed off the bacteria, they develop tolerance and immunity to the drug that partially fended them off, and ultimately the antibiotic’s effectiveness is negated.

Imagine two medieval armies: Trojans and Greeks. One day the Trojans manage to get inside the Greek city walls and wreak havoc (according to the Greeks, anyway) with their torches, spears and swords. But the Greeks have a special weapon: say, for argument’s sake, an M16 with a laser sight. If the Greeks completely defeat the Trojans, the effectiveness of their weapon is guaranteed against successive waves of Trojan attacks. But if the Greek army stops to celebrate the moment the battle swings in their favour, retreating Trojans may bring back information about the weapon and how it works, and plan successive attacks that limit its effectiveness or destroy it completely.
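To see how the two arguments pull in opposite directions, here is a toy simulation in Python. It is a sketch only: the starting populations, kill rates and regrowth rates are all invented, and real pharmacodynamics are far more complicated.

```python
# A toy, purely illustrative model of the competing arguments about course
# length. All rates and populations below are invented assumptions.

def simulate(days_of_treatment: int, total_days: int = 14):
    sensitive, resistant = 1_000_000.0, 10.0   # assumed starting populations
    for day in range(total_days):
        if day < days_of_treatment:
            sensitive *= 0.2    # antibiotic kills most sensitive bacteria
            resistant *= 0.9    # resistant ones are barely affected
        else:
            sensitive *= 1.5    # both regrow once dosing stops
            resistant *= 1.5
    return sensitive, resistant

for course in (3, 7, 14):
    s, r = simulate(course)
    print(f"{course:>2}-day course: {s + r:>12,.0f} survivors, "
          f"{100 * r / (s + r):5.1f}% resistant")
```

With these made-up numbers, a short course leaves a large, regrown and mostly drug-sensitive population (the traditional worry), while a full course leaves only a handful of survivors, almost all of them resistant (the concern the new analysis raises). Which effect dominates in real infections is exactly what the studies dispute.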

Martin Llewelyn, professor in infectious diseases at Brighton and Sussex Medical School, and his colleagues have called for a re-examination of the traditional advice. In an analysis in the British Medical Journal, they say that “the idea that stopping antibiotic treatment early encourages antibiotic resistance is not supported by evidence, while taking antibiotics for longer than necessary increases the risk of resistance”.

In other words, stop taking the medicine the moment you feel better.

In the past, the theory supporting the completion of a course of antibiotics was that too short a course would allow the bacteria causing the disease to mutate and become resistant to the drug.

For certain diseases, bacteria can clearly become resistant if the drugs are not taken for long enough to eradicate them completely. One such example is tuberculosis.

But the large majority of the bacteria that cause illness are found in the environment around us and have no impact until they get into the bloodstream or the gut. The case for stopping medication once the patient’s health improves is that the longer the bacteria are exposed to antibiotics within the body, the higher the chance of resistance developing.

The hypothesis put forward by Professor Llewelyn has not been without its backers.

Peter Openshaw, president of the British Society for Immunology, said he had always considered the notion that stopping antibiotic treatment early would make organisms more drug-resistant rather “illogical”.

He supported the idea of a more sparing use of antibiotics because the evidence of a link between long-term complete use and benefit was tenuous.

He dismissed claims that not finishing a course of antibiotics would lead to bacteria gaining antibiotic resistance, and thought the reverse was more likely to be true. “Far from being irresponsible, shortening the duration of a course of antibiotics might make antibiotic resistance less likely.”

A great British authority, Prof Harold Lambert, had made the suggestion as far back as 1999 in a Lancet article entitled “Don’t keep taking the tablets”. Even though the idea had been broached then, it was not taken seriously, and with hindsight it is surprising that nearly two decades later the medical world has not investigated the alternatives fully, and that the optimum duration of antibiotic courses or doses for many conditions remains largely uninvestigated.

Jodi Lindsay, a professor of microbial pathogenesis at St George’s, University of London, said that the new research by Professor Llewelyn was good in principle, and that the previous advice to complete a course of antibiotics may have been based on a fear of under-treatment. Nevertheless, she cautioned against over-reaction to the findings. “The evidence for shorter courses of antibiotics being equal to longer courses, in terms of cure or outcome, is generally good, although more studies would help and there are a few exceptions when longer courses are better – for example, TB.”

To complicate matters, the ideal length of a course of antibiotics varies in individuals depending on what antibiotics they have taken in the past. Hospitalised patients can be tested to find out when the drugs can be stopped. Outside of a hospital setting, this testing is not feasible.

The World Health Organization’s advice is still based on the pre-existing guidelines and has not changed.

The Royal College of GPs, however, expressed caution over the findings. “Recommended courses of antibiotics are not random,” said its chair, Prof Helen Stokes-Lampard. She elaborated that antibiotic courses were already being tailored to individual conditions, and that it would be dangerous for patients to take it upon themselves to adjust the prescribed periods, stopping when they felt better, because a slight improvement in how a patient feels does not necessarily demonstrate complete eradication of the infection. Professor Stokes-Lampard also stressed that it was important for patients to have clear guidelines to adhere to, and that any adjustment using feel as an indicator might be confusing.

The National Institute for Health and Care Excellence is currently developing guidance for managing common infections, which will look at all available evidence on appropriate prescribing of antibiotics.

The cynics among us might ask: has such a review of current guidelines been made with the objective of cutting the cost of medical care? It is well known that the health budget is ever dwindling, and one cannot help but feel that the review of the existing antibiotic guidelines has been made with an eye to saving on the cost of medicine rather than putting patient health first.

The health service is currently riding the trend of developing sustainability in infrastructure and treatment, and this revision of traditional guidelines may seem to be a reframing of the evidence to suit a pre-determined outlook.

Let us return to the example of the Greeks and Trojans. If the battle is raging within the Greek city walls and the tide turns against the Trojans, should the Greeks fire their ammunition at the retreating Trojans until they all fall to the ground? Ammunition in the form of gunpowder and metal casings costs money, and if it can be used sparingly, there is more money to funnel towards other daily activities like farming and livestock. The question we are being asked to address is the equivalent of this hypothetical situation: should the Greeks keep firing their weapons until all the Trojans fall before they manage to retreat beyond the Greek city walls, or should the Greeks save the cost of a few rounds of ammunition if they are certain the Trojans are so heavily wounded that they would never survive the escape to their own city walls and pass on what they know about the secret weapon?

You may decide, as I did, that the cost of a few extra rounds of ammunition outweighs all the mental confusion of wondering “what if …?” for the next few months. “What if I didn’t take the medication long enough? What if the bacteria has mutated?”

You can see why, when it comes to health, it is easier to be cautious rather than customise. Don’t experiment on the one life you’ve got!