One cigarette a day can cost a lot

According to recent newspaper headlines, teenagers should be kept away from cigarettes because of one worrying statistic.

A survey of over 216,000 adults found that over 60% of them had been offered and tried a cigarette at some point, and of these, nearly 70% went on to become regular smokers. The conclusion drawn was that there are strong links between trying a cigarette once, perhaps to be sociable, and going on to develop a smoking habit.

This of course ended up in the newspapers with headlines such as “One cigarette is enough to get you hooked”. The Mail Online, Britain’s go-to newspaper for your important health news (and I’m being ironic here), went a step further, saying one puff from a cigarette was enough to get you hooked for life. Never mind if you had one draw of a cigarette, felt the nicotine reach your lungs, then coughed in revulsion at the bitter aftertaste and swore that you would never try a cigarette again. The Mail Online bets you would return to the lure of the dark side, seduced by its nicotine offers.

I digress.

While we all know that any event, repeated many times, becomes a habit, the statistics in this case are a little dubious.

The study was conducted by Queen Mary University (nothing dubious in itself), but among the various concerns was what you might call the high conversion rate. Nearly 70% of those who tried a cigarette once went on to smoke regularly as a habit.

I’m not sure why the 70% is worrying. In fact, I wonder why it is not 100%! Surely, if you asked a habitual smoker, “Have you smoked a cigarette before?”, the answer would be a resounding “Yes”!

Unless you have caught someone in the act of sneakily smoking his virgin cigarette. But he wouldn’t yet be a habitual smoker.

Let’s establish the facts of the matter again.

216,000 adults were surveyed.

130,000 of them (60% of the adults) had tried a cigarette before.

86,000 (40%) had never smoked before.

Of the 130,000 who had tried a cigarette before, 81,000 (70%) went on to become regular smokers.

49,000 (30%) of those who tried a cigarette before either did not go on to smoke at all or did not smoke regularly.

Another way of looking at the data would be as follows:

216,000 adults surveyed.

135,000 adults do not smoke regularly or at all. Some did try once in the past.

81,000 adults smoke regularly and these people have obviously tried a cigarette before.

Suddenly the data doesn’t look sexy anymore.
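
The reframing above amounts to a couple of lines of arithmetic. Here is a sketch in Python, taking the post's rounded figures at face value rather than re-deriving them from the original study:

```python
# Figures as rounded in this post (taken at face value, not re-derived
# from the original study).
surveyed = 216_000   # adults surveyed
tried = 130_000      # ~60% had tried a cigarette at least once
regular = 81_000     # went on to smoke regularly

# Headline framing: conversion among those who ever tried a cigarette.
# Alternative framing: regular smokers as a share of everyone surveyed.
share_of_all = regular / surveyed
not_regular = surveyed - regular   # never-triers plus one-off triers

print(f"regular smokers overall: {share_of_all:.1%}")   # 37.5%
print(f"not smoking regularly:   {not_regular:,}")      # 135,000
```

Same numbers, but as a share of everyone surveyed, the "nearly 70% get hooked" story becomes "just over a third of adults smoke regularly".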

The study was an umbrella study, which means the data was pooled from existing sources rather than created from scratch through surveys. As previously examined, the final outcome is therefore also dependent on the integrity of the original sources.

Bias can also creep in because the data has not been directly obtained and inferences have been drawn.

For example, the influence of e-cigarettes and vaping on the results has not been scrutinised, because some of the pooled data may predate them.

Before we leave it at this, here is another example of data bias:

216,000 adults were surveyed.

130,000 of them (60% of the adults) had tried a cigarette before.

86,000 (40%) had never smoked before.

We can conclude that 100% of the 86,000 who have never smoked a cigarette in the past have never smoked a cigarette.

You can see the absurdity more clearly when it’s spelt out in words rather than in numbers.

If research is costly, in terms of money and time, then why is it wasted on studies like these?

One reason is that it keeps academics and researchers in their jobs: producing findings that are financially low-cost can stave off questions about what they actually do, and what their purpose is.

This kind of research is the academic version of the newspaper filler article, the kind columnists generate from the littlest of information in order to fill the papers with “news” that actually masks the fact that they are there to sell advertising space. And in this, columnists and researchers are at times colluding for the same purpose. Vultures who tear at the carcass of a small rodent and then serve up the bits as a trussed-up main meal.

Unethical? Who cares, it seems. Just mask the flawed process and don’t make it too obvious.

A short history of non-medical prescribing

It had long been recognised that nurses spent a significant amount of time visiting general practitioner (GP) surgeries and/or waiting to see the doctor in order to get a prescription for their patients. Although this practice produced the desired result of a prescription being written, it was not an efficient use of either the nurses’ or the GPs’ time. Furthermore, it was an equally inefficient use of their skills, exacerbated by the fact that the nurse had usually themselves assessed and diagnosed the patient and decided on an appropriate treatment plan.

The situation was formally acknowledged in the Cumberlege Report (Department of Health and Social Security 1986), which initiated the call for nurse prescribing and recommended that community nurses should be able to prescribe from a limited list, or formulary. Progress was somewhat measured, but The Crown Report of 1989 (Department of Health (DH) 1989) considered the implications of nurse prescribing and recommended that suitably qualified registered nurses (district nurses (DN) or health visitors (HV)) should be authorised to prescribe from a limited list, namely, the nurse prescribers’ formulary (NPF). Although a case for nurse prescribing had been established, progress relied on legislative changes to permit nurses to prescribe.

Progress continued to be cautious with the decision made to pilot nurse prescribing in eight demonstration sites in eight NHS regions. In 1999, The Crown Report II (DH 1999) reviewed more widely the prescribing, supply and administration of medicines and, in recognition of the success of the nurse prescribing pilots, recommended that prescribing rights be extended to include other groups of nurses and health professionals. By 2001, DNs and HVs had completed education programmes through which they gained V100 prescribing status, enabling them to prescribe from the NPF. The progress being made in prescribing reflected the reforms highlighted in The NHS Plan (DH 2000), which called for changes in the delivery of healthcare throughout the NHS, with nurses, pharmacists and allied health professionals being among those professionals vital to its success.

The publication of Investment and Reform for NHS Staff – Taking Forward the NHS Plan (DH 2001) stated clearly that working in new ways was essential to the successful delivery of the changes. One of these new ways of working was to give specified health professionals the authority to prescribe, building on the original proposals of The Crown Report II (DH 1999). Indeed, The NHS Plan (DH 2000) endorsed this recommendation and envisaged that, by 2004, most nurses should be able to prescribe medicines (either independently or as supplementary prescribers) or supply medicines under patient group directions (PGDs) (DH 2004). After consultation in 2000 on the potential to extend nurse prescribing, changes were made to the Health and Social Care Act 2001.

The then Health Minister, Lord Philip Hunt, provided detail when he announced that nurse prescribing was to include further groups of nurses. He also detailed that the NPF was to be extended to enable independent nurse prescribers to prescribe all general sales list and pharmacy medicines prescribable by doctors under the NHS. This was together with a list of prescription-only medicines (POMs) for specified medical conditions within the areas of minor illness, minor injury, health promotion and palliative care. In November 2002, proposals were announced by Lord Hunt concerning ‘supplementary’ prescribing (DH 2002).

The proposals were to enable nurses and pharmacists to prescribe for chronic illness management using clinical management plans. The success of these developments prompted further regulation changes, enabling specified allied health professionals to train and qualify as supplementary prescribers (DH 2005). From May 2006, the nurse prescribers’ extended formulary was discontinued, and qualified nurse independent prescribers (formerly known as extended formulary nurse prescribers) were able to prescribe any licensed medicine for any medical condition within their competence, including some controlled drugs.

Further legislative changes allowed pharmacists to train as independent prescribers (DH 2006) with optometrists gaining independent prescribing rights in 2007. The momentum of non-medical prescribing continued, with 2009 seeing a scoping project of allied health professional prescribing, recommending the extension of prescribing to other professional groups within the allied health professions and the introduction of independent prescribing for existing allied health professional supplementary prescribing groups, particularly physiotherapists and podiatrists (DH 2009).

In 2013, legislative changes enabled independent prescribing for physiotherapists and podiatrists. As the benefits of non-medical prescribing are demonstrated in the everyday practice of different professional groups, the potential to expand this continues, with consultation currently under way to consider the potential for enabling other disciplines to prescribe.

The bigger issues that come with preventing hearing loss

Is there cause for optimism when it comes to preventing hearing loss? Certainly the latest research into this suggests that if positive effects experienced by mice could be transferred to humans and maintained for the long term, then hereditary hearing loss could be a thing of the past.

It has always been assumed that hearing loss is down to old age. The commonly held view is that as people grow older, their muscles and body functions deteriorate with time, to the point that muscle function is impaired and eventually lost. But hearing loss is not necessarily down to age, although there are cases where constant exposure to loud noise over time causes reduced sensitivity to aural stimuli. Over half of hearing loss cases are actually due to faulty genetic mutations inherited from parents.

How do we hear? Hair cells in the part of the inner ear called the cochlea respond to vibrations, and these signals are sent to the brain to interpret. The brain processes these signals in terms of frequency, duration and timbre in order to translate them into signals we know.

For example, if we hear a high frequency sound of short duration that is shrill, our brain interprets these characteristics and then runs through a database of audio sounds, an audio library in the brain, and may come up with the suggestion that it has come from a whistle and may signify a call for attention.

What happens when you have a gene for hereditary hearing loss? The hairs in the inner ear do not grow back, and consequently sound vibrations from external stimuli do not get passed on to the brain.

With progressive hearing loss too, the characteristics of sound also get distorted. We may hear sounds differently to how they are produced, thereby misinterpreting their meaning. Sounds of higher and lower frequency may be less audible too.

How does that cause a problem? Imagine an alarm. It is set on a high frequency so that it attracts attention. If your ability to hear high frequencies is gradually dulled then you may not be able to detect the sound of an alarm going off.

As hearing gradually deteriorates, the timbre of a sound changes. Sharper sounds become duller, and in the case of the alarm, you may hear it, but it may sound more muted, and the brain may not recognise it as an alarm.

Another problem with hearing loss is the loss of perception of volume. You may be crossing the road and a car might sound its horn if you suddenly encroach into its path. But if you cannot hear that the volume is loud, you may perceive it to be from a car far away and may not realise you are in danger.

The loss of the hairs in the inner ear is a cause of deafness in humans, particularly those for whom hearing loss is genetic. Humans suffering from hereditary hearing loss lose the hairs of the inner ear, which results in the difficulties mentioned above. But there is hope. In a research experiment, scientists successfully delayed the loss of the hairs in the inner ear in mice, using a technique that edited away the genetic mutation that causes the loss of the hairs in the cochlea.

Mice were bred with the faulty gene that caused hearing loss. But using a technology known as CRISPR, the faulty gene was replaced with a healthy normal one. After about eight weeks, the hairs in the inner ears of mice with a genetic predisposition to hearing loss flourished, compared to similar mice which had not been treated. The genetic editing technique had removed the faulty gene which caused hearing loss. The treated mice were assessed for responsiveness to stimuli and showed positive gains.

We could be optimistic about the results, but it is important to stress the need for caution.

Firstly, the research was conducted on mice and not humans. It is important to state that certain experiments that have been successful in animals have not necessarily had similar success when tried on humans.

Secondly, while the benefits in mice were seen in eight weeks, it may take longer in humans, if at all successful.

Thirdly, we should remember that the experiment worked for the mice which had the genetic mutation that would eventually cause deafness. In other words, they had their hearing at birth but were susceptible to losing it. The technique prevented degeneration in hearing in mice but would not help mice that were deaf at birth from gaining hearing they never had.

All research carries ethical issues, and this study was no different. Firstly, there is the recurring question of whether animals should ever be used for research. Should mice be bred for the purposes of research? Are all the mice used? Are they accounted for? Is there someone from Health and Safety going around with a clipboard accounting for the mice? And what happens to the mice when the research has ceased? Are they put down, or released into the ecosystem? “Don’t be silly,” I hear you say, “it’s only mice.” That’s the problem. The devaluation of life, despite the fact that it belongs to another, is what eventually leads to a disregard for other life and human life in general. Would research scientists, in the quest for answers, eventually take to conducting research on beggars, those who sleep rough, or criminals? Would they experiment on orphans or unwanted babies?

The second, when it comes to genetics, is whether genetic experimentation furthers good or promotes misuse. The answer, I suppose, is that the knowledge empowers, but its use cannot be fully governed. The knowledge that genetic mutations can be edited away is good news, perhaps, because it means disabilities or life-threatening diseases could be removed at the onset. But on the other hand, it may promote the rise of designer babies, where parents genetically select features such as blue eyes for their unborn child to enhance their features from birth, and this would promote misuse in the medical community.

Would the use of what is probably best termed genetic surgery be more prominent in the future? One can only suppose so. Once procedures have become more widespread, it is reasonable to conclude that more such surgeons will become available, catering for the rich and famous. It may be possible to delay the ageing process by genetic surgery, perhaps by removing the gene that causes skin to age, instead of using botox and other external surgical procedures.

Would such genetic surgery ever be available on the NHS? For example, if the cancer gene were identified and could be genetically snipped off, would patients request this instead of medication and other external surgical processes? One way of looking at it is that the NHS is so cash-strapped that under QALY rules, where the cost of a procedure is weighed against the number of quality life years it adds, genetic surgery would only be funded for the more serious illnesses, and certainly not for those further down the rung. But perhaps for younger individuals suffering from serious illnesses, such as depression, the cost of a surgical procedure may be far outweighed by a lifetime’s cost of anti-depressants, anti-psychotics or antibiotics. If you could pinpoint a gene that causes a specific pain response, you might alter it to the point where you no longer need aspirin, too much of which causes bleeds. And if you could genetically locate what causes dementia in another person, would you not be considered unethical if you let the gene remain, thereby denying them the chance to live a quality life in their latter years?
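
The QALY calculation mentioned above is simple enough to sketch. The treatment figures below are invented for illustration, and the £20,000–£30,000 per QALY range is the threshold commonly cited for NICE funding decisions, not a number from this piece:

```python
def cost_per_qaly(cost, qalys_gained):
    """Cost of a treatment divided by the quality-adjusted life years it adds."""
    return cost / qalys_gained

THRESHOLD = 30_000  # £/QALY, upper end of the commonly cited NICE range

# Hypothetical one-off genetic procedure vs hypothetical lifelong medication.
surgery = cost_per_qaly(120_000, 10)    # £12,000 per QALY
medication = cost_per_qaly(200_000, 8)  # £25,000 per QALY

# Both pass at this threshold, but the one-off procedure looks better value.
print(surgery <= THRESHOLD, medication <= THRESHOLD)  # True True
```

On numbers like these, a one-off procedure for a young patient can beat decades of medication, which is the argument being made above.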

Genetic editing may be a new technique for the moment, but if there is sufficient investment in infrastructure and the corpus of genetic surgery information widens, don’t be surprised if we start seeing more of it in the next century. The cost of genetic editing may be outweighed by the cost of lifelong medication and its side effects, and it may prove to be not just more sustainable for the environment but more agreeable to the limited NHS budget.

Most of us won’t be around by then, of course. That is unless we’ve managed to remove the sickness and death genes.

Migraines could be a headache of the past

Is there hope for the many millions of migraine sufferers in the United Kingdom and around the world? Researchers at King’s College Hospital certainly believe that this is the case. While they are cautious about the findings of their latest research, the results certainly point towards optimism for migraine sufferers.

It is estimated that there are over 190,000 migraine attacks every day in the UK. This figure comes from the Migraine Trust, and it was probably obtained by taking a sample of the population, counting the number of migraine attacks experienced within that group, and then scaling up to the general population of the United Kingdom. This of course means two things: firstly, the figure was proposed by a group that has an interest in promoting awareness of migraines, and hence may be biased towards over-estimation. Secondly, bearing in mind that the UK population is over 66 million, and that the Trust is unlikely to have surveyed a million people – or anywhere near that – any sampling differences could have been amplified many times over.
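
To see how sampling differences get amplified, consider a sketch with invented numbers. The Migraine Trust's actual sample size and method are not given here, so the figures below are purely illustrative of the scaling effect, not a reconstruction of the 190,000 estimate:

```python
# Illustrative only: sample size and attack count are invented.
uk_population = 66_000_000
sample_size = 10_000      # hypothetical survey sample
attacks_observed = 29     # hypothetical daily attacks seen in that sample

scale = uk_population / sample_size   # each sampled person "stands for" 6,600
estimate = attacks_observed * scale   # 191,400 attacks per day

# A single miscounted attack in the sample shifts the estimate by `scale`.
print(f"estimate: {estimate:,.0f} per day; each miscount shifts it by {scale:,.0f}")
```

With a sample this size, being off by just one attack moves the national estimate by 6,600 a day, which is the amplification effect described above.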

What is the difference between a migraine and a normal headache? A migraine is a headache which happens frequently. Migraines themselves are classed into two types: headaches which happen on more than 15 days a month are known as chronic migraine, while episodic migraine describes headaches which happen on fewer than 15 days a month.

The research uncovered that a chemical in the brain is involved both in the feeling of pain and in sensitivity to sound and light. This chemical is known as calcitonin gene-related peptide, or CGRP. If CGRP is neutralised, or if the part of a brain cell which it interacts with is blocked, then pain receptors are dulled and migraines are reduced.

There are currently four drug companies in the race to develop a CGRP neutraliser.

Race is an accurate term, for the company that develops and trials the drug successfully may win the patent to develop and market it exclusively for twenty years. Pharmaceutical companies are normally granted that period to reward them for the time and cost invested in research.

One such company, Novartis, trialled an antibody, erenumab, on episodic migraine sufferers. Those who took part in the trial suffered migraines on an average of eight days a month.

955 patients took part in the trial, and half of those who received injections of erenumab successfully halved their number of migraine days per month. 27% of patients given a placebo also halved their number of migraine days. The results suggest that the drug was successful, particularly as it worked for over 450 people, and that if it were used for those with chronic migraine it might be equally successful. Even if the same percentages were maintained (50% vs 27%), the number of working days saved by migraine prevention could mean significant savings for the economy.

Another pharmaceuticals company, Teva, produced another antibody, fremanezumab, and trialled it on 1,130 patients. Unlike Novartis’s trial, the participants in Teva’s were those with chronic migraine, with 15 or more attacks each month. In the Teva trial, 41% of patients reportedly halved the number of days on which they suffered migraine attacks, against 18% of those given a placebo, so the gap between drug and placebo is substantial and suggests a genuinely positive effect.
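
Put side by side, the two trials' headline figures can be compared on a placebo-adjusted basis. A quick sketch, using the response rates quoted above; the number-needed-to-treat calculation is my own addition, not part of either trial's reporting:

```python
def absolute_benefit(drug_rate, placebo_rate):
    """Share of patients who halved their migraine days thanks to the drug,
    over and above those who would have done so anyway."""
    return drug_rate - placebo_rate

erenumab = absolute_benefit(0.50, 0.27)      # Novartis trial, episodic migraine
fremanezumab = absolute_benefit(0.41, 0.18)  # Teva trial, chronic migraine

# Number needed to treat: patients treated per one extra responder.
print(f"erenumab:     +{erenumab:.0%}, NNT ~ {1 / erenumab:.1f}")
print(f"fremanezumab: +{fremanezumab:.0%}, NNT ~ {1 / fremanezumab:.1f}")
```

Interestingly, both trials show the same 23-percentage-point absolute benefit: roughly one extra responder for every four to five patients treated.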

The study is very important and useful because of the understanding it offers into treating migraine, and because the resulting products can reduce the frequency and severity of headaches. It makes for fewer days lost to the disease and more positive, functioning people.

Besides CGRP antibodies, there are other current treatments for migraine such as epilepsy and heart disease pills. Even botox is sometimes used. However, all three come with side-effects and are not necessarily the best for everyone.

The hope is that CGRP antibodies, which are traditionally more expensive to manufacture, will in the long term be available at a more affordable cost, and would benefit those who currently get no benefit from existing therapies.

If the estimate that one in seven people live with regular migraine is accurate, migraine reduction could have significant life-improving effects. Chronic migraine is among the top seven disabling conditions, and improvements in understanding and managing it would not only improve the quality of life of those who suffer with it, but also reduce the number of working days lost to the economy. But the benefits do not stop with migraine sufferers. Having to live with chronic disabling conditions often leads to other problems such as depression. Who knows? Perhaps CGRP antibodies may even relieve depression as a secondary effect. Those who suffer from migraine alongside depression may not even require treatment for the latter if the CGRP antibodies prove effective.

Can you imagine a world without anti-depressants? At the moment millions live on pain-relief medication of some sort. It would be great if it could be phased out. Although it might not be so great for the economy!

Should we be excited about the results? Well, yes. The combined sample size of both studies, over 2,000 migraine sufferers, gave the findings some weight compared to if – for example – they had been based on only one hundred participants. Secondly, while the research was undertaken by pharmaceutical companies, the outcome was actionable, meaning it produced a result that was useful rather than one that merely formed the prelude to a more extensive study. In previous posts I demonstrated how some research – such as the coffee umbrella review – did not produce any significantly useful outcome. But we know from this particular research that it may be possible to neutralise CGRP, or to lessen its interaction with particular brain cells, in order to lower the effect of migraine.

Did the media have a field day with this? Unsurprisingly, no. You see, good research does not lend itself to sensationalist headlines.

The financial considerations of investing in medicine and medical research

BBC News reports that a drug that would reduce the risk of HIV infection would result in cost savings of over £1bn over 80 years. Pre-exposure prophylaxis, or Prep, would reduce infection and hence lower the treatment costs for patients in the long term.

The catch? There is one. It’s the long term.

The cost of the treatment and prevention is such that its provision for the first twenty years – bundling together the cost of medical research and production of medicine – would result in a financial loss, and parity would only be achieved after a period of about thirty to forty years; this period is hard to define because it is dependent on what the drug would cost in the future.

Prep combines two anti-HIV drugs, emtricitabine and tenofovir. The medical trials behind it have concluded that it is effective more than four times out of five when it comes to protecting men who have unprotected sex with men from HIV infection. The exact figure is close to 86%.

Prep can be used either on a daily basis, or on what has been termed a sexual event basis – using it for two days before, during and after periods of unprotected sex.

The research model analysed the potential impact of Prep and found that it could reduce infection rates by over a quarter. The cost of the treatment itself, compared with the cost of treating infection, would result in a saving of over one billion pounds over 80 years.
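
On the reported figures the margin is thin. A back-of-envelope sketch, using the 80-year totals quoted in this piece (£19.6bn cost against £20.6bn savings); the per-year average is my own arithmetic:

```python
# Back-of-envelope on the reported 80-year totals for Prep on the NHS.
cost_total = 19.6     # £bn, cost of providing Prep over 80 years
savings_total = 20.6  # £bn, treatment costs averted over the same period

net = savings_total - cost_total   # ~£1bn over the whole period
per_year = net / 80                # average net saving per year, £bn

print(f"net saving: ~£{net:.1f}bn over 80 years (~£{per_year * 1000:.1f}m per year on average)")
```

Averaged out, the projected advantage is of the order of £12.5m a year, a small margin at NHS scale, which is part of why the question of timescales matters here.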

However, it does raise a few ethical questions. If the National Health Service is aiming to be a sustainable one – and one of the aims of sustainability is to empower citizens to take responsibility for their own health – shouldn’t it be thinking less about how it will balance the books, and spending more on education for prevention in the first place? The cost of providing Prep on the NHS would be £19.6 billion over 80 years, while the estimated savings from treatment would be £20.6 billion over the same period. Educating people not to have unprotected sex with those at risk of HIV arguably would result in a higher saving over a shorter period. Perhaps the NHS should consider ways of reducing cost more significantly, rather than latching on to a cheaper prevention drug immediately. If consumer behaviour is not going to change, symptoms are still going to surface, and the provision of Prep on the NHS may only encourage less self-regulation and awareness.