A short history of non-medical prescribing

It had long been recognised that nurses spent a significant amount of time visiting general practitioner (GP) surgeries and/or waiting to see the doctor in order to get a prescription for their patients. Although this practice produced the desired result of a prescription being written, it was not an efficient use of either the nurses’ or the GPs’ time. It was an equally inefficient use of their skills, exacerbated by the fact that the nurse had usually already assessed and diagnosed the patient and decided on an appropriate treatment plan themselves.

The situation was formally acknowledged in the Cumberlege Report (Department of Health and Social Security 1986), which initiated the call for nurse prescribing and recommended that community nurses should be able to prescribe from a limited list, or formulary. Progress was somewhat measured, but The Crown Report of 1989 (Department of Health (DH) 1989) considered the implications of nurse prescribing and recommended that suitably qualified registered nurses (district nurses (DNs) or health visitors (HVs)) should be authorised to prescribe from a limited list, namely the nurse prescribers’ formulary (NPF). Although a case for nurse prescribing had been established, progress relied on legislative changes to permit nurses to prescribe.

Progress continued to be cautious with the decision made to pilot nurse prescribing in eight demonstration sites in eight NHS regions. In 1999, The Crown Report II (DH 1999) reviewed more widely the prescribing, supply and administration of medicines and, in recognition of the success of the nurse prescribing pilots, recommended that prescribing rights be extended to include other groups of nurses and health professionals. By 2001, DNs and HVs had completed education programmes through which they gained V100 prescribing status, enabling them to prescribe from the NPF. The progress being made in prescribing reflected the reforms highlighted in The NHS Plan (DH 2000), which called for changes in the delivery of healthcare throughout the NHS, with nurses, pharmacists and allied health professionals being among those professionals vital to its success.

The publication of Investment and Reform for NHS Staff – Taking Forward the NHS Plan (DH 2001) stated clearly that working in new ways was essential to the successful delivery of the changes. One of these new ways of working was to give specified health professionals the authority to prescribe, building on the original proposals of The Crown Report (DH 1999). Indeed, The NHS Plan (DH 2000) endorsed this recommendation and envisaged that, by 2004, most nurses should be able to prescribe medicines (either independently or as supplementary prescribers) or supply medicines under patient group directions (PGDs) (DH 2004). After consultation in 2000 on the potential to extend nurse prescribing, changes were made through the Health and Social Care Act 2001.

The then Health Minister, Lord Philip Hunt, provided detail when he announced that nurse prescribing was to include further groups of nurses. He also detailed that the NPF was to be extended to enable independent nurse prescribers to prescribe all general sales list and pharmacy medicines prescribable by doctors under the NHS. This was together with a list of prescription-only medicines (POMs) for specified medical conditions within the areas of minor illness, minor injury, health promotion and palliative care. In November 2002, proposals were announced by Lord Hunt concerning ‘supplementary’ prescribing (DH 2002).

The proposals were to enable nurses and pharmacists to prescribe for chronic illness management using clinical management plans. The success of these developments prompted further regulation changes, enabling specified allied health professionals to train and qualify as supplementary prescribers (DH 2005). From May 2006, the nurse prescribers’ extended formulary was discontinued, and qualified nurse independent prescribers (formerly known as extended formulary nurse prescribers) were able to prescribe any licensed medicine for any medical condition within their competence, including some controlled drugs.

Further legislative changes allowed pharmacists to train as independent prescribers (DH 2006) with optometrists gaining independent prescribing rights in 2007. The momentum of non-medical prescribing continued, with 2009 seeing a scoping project of allied health professional prescribing, recommending the extension of prescribing to other professional groups within the allied health professions and the introduction of independent prescribing for existing allied health professional supplementary prescribing groups, particularly physiotherapists and podiatrists (DH 2009).

In 2013, legislative changes enabled independent prescribing for physiotherapists and podiatrists. As the benefits of non-medical prescribing are demonstrated in the everyday practice of different professional groups, the potential to expand this continues, with consultation currently under way to consider the potential for enabling other disciplines to prescribe.

The bigger issues that come with preventing hearing loss

Is there cause for optimism when it comes to preventing hearing loss? Certainly the latest research into this suggests that if positive effects experienced by mice could be transferred to humans and maintained for the long term, then hereditary hearing loss could be a thing of the past.

It has often been assumed that hearing loss is simply down to old age. The commonly held view is that as people grow older, their muscles and body functions deteriorate with time to the point that muscle function is impaired and eventually lost. But hearing loss is not necessarily down to age, although there are cases where constant exposure to loud noise, over time, causes reduced sensitivity to aural stimuli. Over half of hearing loss cases are actually due to faulty genes inherited from parents.

How do we hear? Hair cells in the part of the inner ear called the cochlea respond to vibrations, and these signals are sent to the brain to interpret. The brain processes these signals in terms of frequency, duration and timbre in order to translate them into signals we know.

For example, if we hear a high frequency sound of short duration that is shrill, our brain interprets these characteristics and then runs through a database of audio sounds, an audio library in the brain, and may come up with the suggestion that it has come from a whistle and may signify a call for attention.

What happens when you have a genetic hearing loss gene? The hairs of the inner ear do not grow back, and consequently sound vibrations from external stimuli are not passed on to the brain.

With progressive hearing loss too, the characteristics of sound also get distorted. We may hear sounds differently to how they are produced, thereby misinterpreting their meaning. Sounds of higher and lower frequency may be less audible too.

How does that cause a problem? Imagine an alarm. It is set on a high frequency so that it attracts attention. If your ability to hear high frequencies is gradually dulled then you may not be able to detect the sound of an alarm going off.

As hearing gradually deteriorates, the timbre of a sound changes. Sharper sounds become duller, and in the case of the alarm, you may hear it, but it may sound more muted and the brain may not be able to recognise that it is an alarm being heard.

Another problem with hearing loss is the loss of perception of volume. You may be crossing the road and a car might sound its horn if you suddenly encroach into its path. But if you cannot hear that the volume is loud, you may perceive it to be from a car far away and may not realise you are in danger.

The loss of the hairs in the inner ear is a cause of deafness in humans, particularly those for whom hearing loss is genetic. Humans suffering from hereditary hearing loss lose the hairs of the inner ear, which results in the difficulties mentioned above. But there is hope. In a research experiment, scientists successfully delayed the loss of the hairs in the inner ear of mice using a technique that edited away the genetic mutation that causes the loss of the hairs in the cochlea.

Mice were bred with the faulty gene that caused hearing loss. But using a technology known as CRISPR, the faulty gene was replaced with a healthy normal one. After about eight weeks, the hairs in the inner ears of mice with a genetic predisposition to hearing loss flourished, compared to similar mice which had not been treated. The genetic editing technique had removed the faulty gene which caused hearing loss. The treated mice were assessed for responsiveness to stimuli and showed positive gains.

We could be optimistic about the results but it is important to stress the need to be cautious.

Firstly, the research was conducted on mice and not humans. It is important to state that certain experiments that have been successful in animals have not necessarily had similar success when tried on humans.

Secondly, while the benefits in mice were seen in eight weeks, it may take longer in humans, if at all successful.

Thirdly, we should remember that the experiment worked for the mice which had the genetic mutation that would eventually cause deafness. In other words, they had their hearing at birth but were susceptible to losing it. The technique prevented degeneration in hearing in mice but would not help mice that were deaf at birth from gaining hearing they never had.

Every research project carries ethical issues, and this one was no different. The first is the recurring question of whether animals should ever be used for research. Should mice be bred for the purposes of research? Are all the mice used? Are they accounted for? Is there someone from Health and Safety going around with a clipboard accounting for the mice? And what happens to the mice when the research has ceased? Are they put down, or released into the ecosystem? “Don’t be silly,” I hear you say, “it’s only mice.” That’s the problem. The devaluation of life, despite the fact that it belongs to another, is what eventually leads to a disregard for other life and human life in general. Would research scientists, in the quest for answers, eventually take to conducting research on beggars, those who sleep rough, or criminals? Would they experiment on orphans or unwanted babies?

The second, when it comes to genetics, is whether genetic experimentation furthers good or promotes misuse. The answer, I suppose, is that the knowledge empowers, but one cannot govern its control. The knowledge that a genetic mutation can be edited out is good news, perhaps, because it means we could remove disabilities or life-threatening diseases at the outset. But this, on the other hand, may promote the rise of designer babies, where mothers genetically select features such as blue eyes for their unborn child to enhance their features from birth, and this would promote misuse in the medical community.

Would the use of what is probably best termed genetic surgery be more prominent in the future? One can only suppose so. Once procedures have become more widespread, it is reasonable to conclude that more such surgeons will become available, to cater for the rich and famous. It may be possible to delay the ageing process by genetic surgery, perhaps by removing the gene that causes skin to age, instead of using botox and other external surgical procedures.

Would such genetic surgery ever be available on the NHS? For example, if the cancer gene were identified and could be genetically snipped off, would patients request this instead of tablets and other external surgical processes? One way of looking at it is that the NHS is so cash-strapped that under QALY rules, where the cost of a procedure is weighed against the number of quality life years it adds, genetic surgery would be limited to the more serious illnesses, and certainly not those further down the rung. But perhaps for younger individuals suffering from serious illnesses, such as depression, a lifetime’s cost of anti-depressants, anti-psychotics or other medication may far outweigh the cost of a one-off surgical procedure. If you could pinpoint a gene that causes a specific pain response, you might alter it to the point you may not need aspirin, too much of which causes bleeds. And if you could genetically locate what causes dementia in another person, would you not be considered unethical if you let the gene remain, thereby denying others the chance to live a quality life in their latter years?
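The QALY trade-off described above is, at heart, simple arithmetic: divide the cost of an intervention by the quality life years it adds, and compare. A minimal sketch of that calculation follows; all figures are invented for illustration and are not real NHS costings, and the £20,000–£30,000 per QALY range mentioned in the comments is only the benchmark commonly cited in connection with NICE appraisals.

```python
# Illustrative sketch of the QALY logic described above.
# All figures are invented; they are not real NHS costings.

def cost_per_qaly(cost: float, qalys_gained: float) -> float:
    """Cost of an intervention divided by the quality life years it adds."""
    return cost / qalys_gained

# Hypothetical comparison: a one-off genetic procedure vs lifelong medication,
# assuming both deliver the same benefit of 5 quality-adjusted life years.
surgery_cost = 40_000.0             # one-off procedure (invented figure)
medication_cost_per_year = 2_000.0  # annual prescription cost (invented figure)
years_of_treatment = 40             # a younger patient's remaining years

lifetime_medication_cost = medication_cost_per_year * years_of_treatment  # 80,000

surgery = cost_per_qaly(surgery_cost, 5.0)                 # 8,000 per QALY
medication = cost_per_qaly(lifetime_medication_cost, 5.0)  # 16,000 per QALY

# Against a commonly cited benchmark of roughly 20,000-30,000 GBP per QALY,
# the one-off procedure looks the better value in this invented scenario.
print(surgery, medication)  # 8000.0 16000.0
```

In this toy scenario the one-off procedure wins precisely because the medication cost recurs for decades, which is the point made above about younger patients.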

Genetic editing may be a new technique for the moment but if there is sufficient investment into infrastructure and the corpus of genetic surgery information widens, don’t be surprised if we start seeing more of that in the next century. The cost of genetic editing may outweigh the cost of lifelong medication and side effects, and may prove to be not just more sustainable for the environment but more agreeable to the limited NHS budget.

Most of us won’t be around by then, of course. That is unless we’ve managed to remove the sickness and death genes.

Health umbrella reviews mask the real issues

You have to wonder why the breakfast tea doesn’t get the same level of attention. Or perhaps whether in France, the humble croissant is elevated to the same status. Or maybe the banana could soon be the star of another media show. But unfortunately it is coffee that headlines tomorrow’s fish and chip papers.

“Drinking three or four cups of coffee a day could have benefits for your health”. As we have seen previously, this kind of headline bears the hallmarks of a media health report:

1) repackaging of common information requiring little or no specialist examination;

2) use of a modal auxiliary verb (could) to conveniently justify or disclaim an attention-grabbing headline – which, by the way, is point number three.

The health reports in the media also incorporate:

4) a statistically small group of trial participants, whose results are then blown up as if they were representative of the 7 billion people on the planet.

5) Assumptions. A media report about health could simply include assumptions.

Why dwell on coffee? For starters, it is a commonly consumed drink and so any meaningful research would potentially have bearings on millions of people. It is common media practice to focus on common food and activities because of the relevance to daily life.

But if you examine this carefully, why not tea? Why not write about tea? While conspiracy theories may be slightly far-fetched, it is possible that – unless it is a speciality tea – coffee costs more, and any potential health benefits would lead people to spend more, hence generating more for the economy in the form of tax. Perhaps this is why media writers don’t waste too much ink on researching the potential life-saving benefits of bananas, even though they are widely consumed. The research isn’t going to drive people to buy bananas in bulk, and even so, the extra revenue generated from a low-priced item isn’t going to raise much extra tax.

Are there any notable similarities or differences in style across different countries? One wonders whether Parisian newspapers, on a regular basis, churn out headlines such as:

“Eating two or more croissants a day could reduce your chances of heart disease.”

“Pain aux raisins linked with dementia”.

The research in question was an umbrella review, undertaken to examine whether further research should be carried out into the effects of coffee and its role in preventing liver cancer. In an umbrella review no new research is undertaken; instead, existing research is examined and analysed to glean insights.

The problem with umbrella reviews is that they are very generalised, no actual research is done, and they are only brief analyses of existing research. This means that first of all, an umbrella review could arrive at a particular conclusion, but in no way should that be taken as the final conclusion.

In fact, the findings of an umbrella review are only a preliminary to more detailed investigation. If an umbrella review suggests that drinking coffee could prevent cancer, then what it is saying is that more research needs to be undertaken, and the media needs to be ethically responsible by not reporting “Coffee prevents Cancer”, because there are people who treat newspapers and television as the source of their information and assume that because something has been released into the public domain, it must be true. Who could conceive that newspapers spend time and resources publishing trivial information, or that television is pure rubbish?

The second problem with umbrella reviews is that the outcomes are only as good as the original sources. If someone gave you a set of grainy photos and asked you to make a collage with them, your collage is only going to be as good as the grainy photos allow. If the original sources were not thorough or exact in their investigation, are any subsequent findings based on them merely a waste of time?

The third issue with umbrella reviews is that, under closer scrutiny, the overall picture can be distorted by over-focusing on small statistical variances; sometimes minute errors are magnified and lead one down the wrong path.

If you took a picture on your phone and then blew it up to the size of a mural covering the side of your house, the picture becomes very dotty. You might see big patchy squares. But if you started looking for that big patchy square from the image in your phone… one has to wonder what the purpose of that is.

The fourth issue is that because umbrella reviews are a prelude to a more thorough investigation, their end results are slightly skewed from the outset. If an umbrella review is expected to provide a few avenues for later time-consuming research, then it is fundamentally biased towards having to provide one in the first place. Why, in that case, have such reviews at all? Some may point out that the flaw in the system is that umbrella reviews are relied on by those in academia and research to warrant the continued longevity of their positions. In other words, if researchers had nothing to research, they might be out of a job, so they had best find something to stick their noses into.

Have you ever read the London newspaper Metro and come across some research news such as:

“Going to bed angry can wreck your sleep” (25 Sept 2017)

It is the sort of headline that makes you think “Why bother doing the research in the first place?”

It is likely that you have read a media report of an umbrella review.

What were the findings of the original coffee review?

Drinking coffee was consistently linked with a lower risk of death from all causes and from heart disease. The largest reduction in relative risk of premature death was seen in people consuming three cups a day, compared with non-coffee drinkers.

Now, when an umbrella review mentions drinking coffee is linked with a lower risk of death, it is important to be clear about what it specifically means. And what it is stating is that those who had a lower risk of death all happened to drink coffee. It might have nothing to do with the coffee itself. It might have been that they took a break to slow down a fast-paced lifestyle, and the taking of a break gave them a lower risk of death. By that logic of association, tea could also be linked with a lower risk of death.
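The "lower risk" figures such reviews report are relative risks: the risk of an outcome among coffee drinkers divided by the risk among non-drinkers. A minimal sketch of how such a figure is computed, using invented counts, also shows why it can only ever express association, not causation:

```python
# A minimal sketch of the "relative risk" figure an umbrella review reports.
# The counts below are invented purely for illustration.

def relative_risk(events_exposed: int, n_exposed: int,
                  events_unexposed: int, n_unexposed: int) -> float:
    """Risk of the outcome in the exposed group divided by the risk in the unexposed group."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical cohort: deaths among coffee drinkers vs non-drinkers.
rr = relative_risk(events_exposed=83, n_exposed=1000,
                   events_unexposed=100, n_unexposed=1000)
print(round(rr, 2))  # 0.83, i.e. a 17% lower *relative* risk of death

# Note: the calculation only compares counts between the two groups.
# Nothing in it distinguishes "coffee caused the difference" from
# "coffee drinkers differ in other ways" - which is exactly the
# association-versus-causation point made above.
```

Swap the coffee drinkers for tea drinkers, or for people who take regular breaks, and the same arithmetic would produce the same kind of headline.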

Coffee was also associated with a lower risk of several cancers, including prostate, endometrial, skin and liver cancer, as well as type-2 diabetes, gallstones and gout, the researchers said. The greatest benefit was seen for liver conditions such as cirrhosis of the liver.

Again, to be clear, the above link means that those who were at lower risk of those cancers happened to drink coffee. But it is not necessarily stating the coffee had anything to do with it.

And coffee is such a commonly consumed drink, that it is easy to use it to draw links to anything.

If people who died from car accidents happened to drink coffee, an umbrella review might state that drinking coffee is linked with higher incidences of car accidents.

The findings were summarised by a health analyst:

“Does coffee prevent chronic disease and reduce mortality? We simply do not know. Should doctors recommend drinking coffee to prevent disease? Should people start drinking coffee for health reasons? The answer to both questions is ‘no’.”

We should perhaps add a further third question: Did the umbrella review produce any actionable findings, and should it have been undertaken in the first place?

Probably not.

Night time eating? Heart disease coming

That late night snack may be comforting and the perfect end to a day. However, if the research proves right, it could be the cumulative cause of heart disease.

Scientists have always known that night shift workers face greater health risks than those who work regular patterns. That is why, if you divide the pay shift workers receive by the hours worked, you will find they have a higher hourly rate than those who do the same job during normal hours. That extra pay compensates for what is commonly perceived as the extra demand of working during the night, at a time when your body is looking to shut down for a rest. The external pressures of going against your body, over a prolonged period, can take a toll.

Scientists in Mexico researching the links between diet and the human body tested their hypotheses on rats. The rats were fed at a time when their bodies would normally be at rest, and the results showed that the fats from the food remained as triglycerides in the bloodstream for longer, because the rats’ bodies were in a resting state and not primed to break down food.

Bearing in mind that the research was done on rats, and while some results may have bearing on humans and some may not, what points could we take from these research results?

Having high levels of triglycerides in one’s bloodstream means that the risk of cardiovascular diseases such as heart attacks is significantly increased. Hence, if you are eating late at night, you may be at greater risk. Although the research is only in its infancy, the results could suggest that the body is better at processing fats when it is in its most active state, as food then arrives at a more natural time.

What can you do if you work shifts? You may not have much control over the food you eat, but you can take steps towards eating a healthier diet and make time for regular exercise so that the overall risk of heart disease is lowered. And if you do not work shifts but work during the day, you too would do best to avoid a big meal late at night.

The role of pharmacy in healthcare

Pharmacists are experts on the actions and uses of drugs, including their chemistry, their formulation into medicines and the ways in which they are used to manage diseases. The principal aim of the pharmacist is to use this expertise to improve patient care. Pharmacists are in close contact with patients and so have an important role both in assisting patients to make the best use of their prescribed medicines and in advising patients on the appropriate self-management of self-limiting and minor conditions. Increasingly, this latter aspect includes over-the-counter (OTC) prescribing of effective and potent treatments. Pharmacists also work closely with other members of the healthcare team – doctors, nurses, dentists and others – and are able to give advice on a wide range of issues surrounding the use of medicines.

Pharmacists are employed in many different areas of practice. These include the traditional ones of hospital and community practice as well as more recently introduced advisory roles at health authority/health board level and working directly with general practitioners as part of the core, practice-based primary healthcare team. Additionally, pharmacists are employed in the pharmaceutical industry and in academia.

Members of the general public are most likely to meet pharmacists in high street pharmacies or on a hospital ward. However, pharmacists also visit residential homes (see Ch. 49), make visits to patients’ own homes and are now involved in running chronic disease clinics in primary and secondary care. In addition, pharmacists will also be contributing to the care of patients through their dealings with other members of the healthcare team in the hospital and community setting.

Historically, pharmacists and general practitioners have a common ancestry as apothecaries. Apothecaries both dispensed medicines prescribed by physicians and recommended medicines for those members of the public unable to afford physicians’ fees. As the two professions of pharmacy and general practice emerged, this remit split so that pharmacists became primarily responsible for the technical, dispensing aspects of the role. With the advent of the NHS in the UK in 1948, and the philosophy of free medical care at the point of delivery, the advisory function of the pharmacist further decreased. As a result, pharmacists spent more of their time in the dispensing of medicines – and derived an increased proportion of their income from it. At the same time, radical changes in the nature of dispensing itself, as described in the following paragraphs, occurred.

In the early years, many prescriptions were for extemporaneously prepared medicines, either following standard ‘recipes’ from formularies such as the British Pharmacopoeia (BP) or British Pharmaceutical Codex (BPC), or following individual recipes written by the prescriber (see Ch. 30). The situation was similar in hospital pharmacy, where most prescriptions were prepared on an individual basis. There was some small-scale manufacture of a range of commonly used items. In both situations, pharmacists required manipulative and time-consuming skills to produce the medicines. Thus a wide range of preparations was made, including liquids for internal and external use, ointments, creams, poultices, plasters, eye drops and ointments, injections and solid dosage forms such as pills, capsules and moulded tablets (see Chs 32–39).

Scientific advances have greatly increased the effectiveness of drugs but have also rendered them more complex, potentially more toxic and requiring more sophisticated use than their predecessors. The pharmaceutical industry developed in tandem with these drug developments, contributing to further scientific advances and producing manufactured medical products. This had a number of advantages. For one thing, there was an increased reliability in the product, which could be subjected to suitable quality assessment and assurance. This led to improved formulations, modifications to drug availability and increased use of tablets, which have a greater convenience for the patient. Some doctors did not agree with the loss of flexibility in prescribing which resulted from having to use predetermined doses and combinations of materials. From the pharmacist’s point of view there was a reduction in the time spent in the routine extemporaneous production of medicines, which many saw as an advantage. Others saw it as a reduction in the mystique associated with the professional role of the pharmacist.
There was also an erosion of the technical skill base of the pharmacist. A look through copies of the BPC in the 1950s, 1960s and 1970s will show the reduction in the number and diversity of formulations included in the Formulary section. That section has been omitted from the most recent editions. However, some extemporaneous dispensing is still required and pharmacists remain the only professionals trained in these skills.

The changing patterns of work of the pharmacist, in community pharmacy in particular, led to an uncertainty about the future role of the pharmacist and a general consensus that pharmacists were no longer being utilized to their full potential. If the pharmacist was not required to compound medicines or to give general advice on diseases, what was the pharmacist to do?

The need to review the future for pharmacy was first formally recognized in 1979 in a report on the NHS which had the remit to consider the best use and management of its financial and manpower resources. This was followed by a succession of key reports and papers, which repeatedly identified the need to exploit the pharmacist’s expertise and knowledge to better effect. Key among these reports was the Nuffield Report of 1986. This report, which included nearly 100 recommendations, led the way to many new initiatives, both by the profession and by the government, and laid the foundation for the recent developments in the practice of pharmacy, which are reflected in this book.

Radical change, as recommended in the Nuffield Report, does not necessarily happen quickly, particularly when regulations and statute are involved. In the 28 years since Nuffield was published, several different agendas have come together and between them facilitated the paradigm shift for pharmacy envisaged in the Nuffield Report. These agendas will be briefly described below. They have finally resulted in extensive professional change, articulated in the definitive statements about the role of pharmacy in the NHS plans for pharmacy in England (2000), Scotland (2001) and Wales (2002) and the subsequent new contractual frameworks for community pharmacy. In addition, other regulatory changes have occurred as part of government policy to increase convenient public access to a wider range of medicines on the NHS (see Ch. 4). These changes reflect general societal trends to deregulate the professions while having in place a framework to ensure safe practice, and a recognition that the public are increasingly well informed through widespread access to the internet. For pharmacy, therefore, two routes for the supply of prescription-only medicines (POMs) have opened up. Until recently, POMs were available only on the prescription of a doctor or dentist, but as a result of the Crown Review in 1999, two significant changes emerged.

First, patient group directions (PGDs) were introduced in 2000. A PGD is a written direction for the supply, or supply and administration, of a POM to persons generally by named groups of professionals. So, for example, under a PGD, community pharmacists could supply a specific POM antibiotic to people with a confirmed diagnosis, e.g. azithromycin for Chlamydia infection.

Second, prescribing rights for pharmacists, alongside nurses and some other healthcare professionals, have been introduced, initially as supplementary prescribers and more recently, as independent prescribers.

The council of the Royal Pharmaceutical Society of Great Britain (RPSGB) decided that it was necessary to allow all members to contribute to a radical appraisal of the profession, what it should be doing and how to achieve it. The ‘Pharmacy in a New Age’ consultation was launched in October 1995, with an invitation to all members to contribute their views to the council. These were combined into a subsequent document produced by the council in September 1996 called Pharmacy in a New Age: The New Horizon. This indicated that there was overwhelming agreement from pharmacists that the profession could not stand still.

The main output of this professional review was a commitment to take forward a more proactive, patient-centred clinical role for pharmacy using pharmacists’ skills and knowledge to best effect.