Antibiotics: A Journey from Good, to Bad, to Ugly

As I hope you are already aware, medicine is currently facing a crisis which could prove to be a bigger killer than cancer: antibiotic resistance. Bacterial infection seems an archaic issue, one that we apparently solved years ago and have since stopped thinking of as a major threat to human health, but it's making a comeback. Before we get into all that, however, let's first look at what antibiotics are and how they came about in the first place.

Antibiotics are a group of drugs which treat or prevent bacterial infections by killing bacteria or stopping their growth. There are hundreds of different antibiotics in circulation today, but they all stem from just a few drugs and so can be categorised into six groups. One of these groups is the penicillins, which includes the drugs penicillin and amoxicillin, and this is where the antibiotics' journey begins. Penicillin was discovered in 1928 by Alexander Fleming and is generally thought of as the world's first antibiotic. Its initial discovery is quite well known: Fleming accidentally uncovered the antibacterial properties of a mould called Penicillium notatum when he returned from holiday to find that, wherever the mould had grown, the normal growth of Staphylococci had been prevented. Fleming then grew more of the mould to confirm his findings; he had discovered something that would not only prevent bacterial growth but could be harnessed to fight infection, changing the world in its wake. However, Fleming didn't develop penicillin into what it is today. To do that, the active ingredient would need to be isolated, purified, tested and then produced on a grand scale.

It was 10 years later that work on this began, when Howard Florey came across Fleming's findings and started developing them further with his team at Oxford University. Florey and one of his colleagues, Ernst Chain, managed to produce extracts of penicillin culture fluid. With these, they experimented on 50 mice which they had infected with streptococci, treating only half with penicillin injections. Whilst the untreated mice died from sepsis, those which had been treated survived, demonstrating the effectiveness of penicillin. In 1941 the first human test was conducted, with injections given over 5 days to an infected patient, Albert Alexander. Alexander began to recover, but unfortunately Florey and Chain didn't have enough pure penicillin to treat the infection completely and he ultimately died. This was now their biggest problem: making enough penicillin. It took 2,000 litres of culture fluid to extract enough pure penicillin to treat just one human case, so you can see how treating an entire population would frankly be impossible if they continued in this way. It was in the US, whilst looking for a solution to this problem, that Florey and Chain happened upon a different, more prolific fungus called Penicillium chrysogenum. This yielded 200 times more penicillin than the previously used species, and with a few mutation-causing X-rays yielded 1,000 times more. This allowed 400 million units of penicillin to be produced for use during the war in 1942, reducing the death rate from bacterial infection to less than 1%. And so began the antibiotic revolution in medicine, which changed the world into what it is today.

Now, over 70 years later, we're faced with a huge issue. Bacteria have become resistant to antibiotics because of their overuse. Antibiotic resistance arises when a bacterium mutates so that an antibiotic can no longer kill it, and it is simply natural selection at work. From previous knowledge of natural selection you might expect this to be a slow process across many generations, relying on the mutation being an advantage in order to spread. Yet when you use antibiotics, rather than having one mutated bacterium amongst millions of non-mutated ones, all of the non-mutated bacteria are killed, leaving a 100% resistant population. Admittedly that population might be exactly one cell, but bacteria multiply and start causing havoc very quickly. Those same antibiotics then have no effect on the new culture, leading to longer hospital stays, more expensive medication and increased mortality rates.
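To make that selection effect concrete, here is a deliberately oversimplified toy model (all of the numbers are mine and purely illustrative, not clinical data): a single resistant mutant sits among a million susceptible bacteria, a course of antibiotics kills everything except that mutant, and the survivor simply multiplies back up.

```python
# Toy model of antibiotic selection. Every number here is made up for illustration only.
susceptible = 1_000_000   # bacteria the antibiotic can kill
resistant = 1             # a single mutant the antibiotic cannot kill

# A course of antibiotics wipes out the susceptible majority...
susceptible = 0

# ...but the lone resistant survivor keeps dividing, roughly doubling each generation.
generations = 20
for _ in range(generations):
    resistant *= 2

print(f"After {generations} generations: {resistant:,} bacteria, all of them resistant")
# -> After 20 generations: 1,048,576 bacteria, all of them resistant
```

The exact figures don't matter; the point is the shape of the outcome: the antibiotic does the selecting for us, so the next infection starts from an entirely resistant stock.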

Given how easily everyone is now able to travel around the world, this issue is not localised to one country or continent but is on a global scale. Because tackling antibiotic resistance requires investment in finding new antibiotics and in using the more expensive medications already on hand, developing countries could be hit much harder than developed ones. Meanwhile, our struggling NHS will experience even more pressure as patients have longer hospital stays. In the worst-case scenario, if this issue isn't addressed we could be returned to a time before antibiotics, where common infections are once again fatal, killing an estimated 10 million people a year by 2050. Clearly not a desirable outcome, so how can we tackle antibiotic resistance?

Prevention and control of antibiotic resistance requires change at several different levels of society. The World Health Organisation has made recommendations for what we as individuals, healthcare professionals, policy makers and people in the agriculture sector can each do. Amongst the advice for individuals is not sharing leftover antibiotics, not demanding antibiotics when they aren't considered necessary, and preparing food hygienically to prevent infections in the first place. Meanwhile, the healthcare industry needs to invest in research into new antibiotics, as so far most drugs in development have only been modifications of existing antibiotics. These don't stay effective for long, and of the 51 new antibiotics currently in development only 8 have been classed by the WHO as innovative and capable of contributing meaningfully to the issue.

Research into new drugs is of the utmost importance, but the responsibility appears to fall to governments to fund it, as pharmaceutical companies are reluctant to do so. Some suggest this is because it is more lucrative for drug companies to treat chronic conditions, in which patients rely on their drugs for a lifetime, than to provide short-term (and may I point out, life-saving) treatments. Thankfully, this year countries including Switzerland, South Africa and the UK pledged a combined $82 million to support the Global Antibiotic Research and Development Partnership, which was set up by the WHO and the Drugs for Neglected Diseases initiative. Less encouraging is the estimate by the director of the WHO Global Tuberculosis Programme that more than $800 million is needed annually to fund research into new anti-tuberculosis medicines.

I know. It's feeling dismal. But what can we do, on the metaphorical shop floor? Stop overusing antibiotics, follow through on any prescribed course of antibiotics even if you're feeling better, and make other people aware of the issue. The bigger the issue becomes in the eyes of the public, the better the government and other organisations will address it, so talk about it to anyone who will listen! I am optimistic that we as a species will survive this latest challenge to our health, but only if we pull our socks up and start addressing the situation.

 

Thank you for reading, remember to rate and share!

Vampire or Victim?

Brasov in Transylvania, Romania

Following my recent trip to Transylvania in Romania for work experience, I have decided to dedicate this post to the monsters and myths which stem from the region: specifically vampires. Like many historical ideas and beliefs, the creation of such a supernatural being likely served as an attempt to explain ailments that couldn’t otherwise be explained, before the emergence of scientific understanding. In this post, I will describe the medical condition most popularly believed to have led to belief in vampires.

The illness most commonly used to explain away vampirism is porphyria, a group of inherited metabolic disorders in which the production of haem (the pigment used in haemoglobin) is disrupted. Haem is made from a chemical called ALA in several steps, and each step requires an enzyme. In people suffering from porphyria, one of those enzymes is faulty because of an inherited mutation in the gene which codes for it. This means that the production of haem is slowed or even stopped, and as a result the intermediate chemicals made in the stages between ALA and haem, known as porphyrins, can build up to harmful levels. On top of this, the limited haem production can mean that not enough healthy red blood cells are made. There are different types of porphyria depending on which enzyme is dysfunctional, and most produce different symptoms with some overlap.
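As a rough way of picturing that bottleneck, here is a tiny back-of-the-envelope simulation (a sketch with entirely made-up rates, not real biochemistry): when one enzyme in the chain works at a fraction of its normal speed, the intermediates pile up behind it and the final haem output drops.

```python
# Sketch of a blocked production line: ALA -> porphyrin intermediates -> haem.
# The rates below are invented purely to illustrate the bottleneck effect.
ala_input_per_hour = 100     # precursor entering the pathway each "hour"
healthy_enzyme_rate = 100    # how much a normal enzyme converts per hour
faulty_enzyme_rate = 20      # the mutated enzyme works far more slowly

def run_pathway(late_enzyme_rate, hours=24):
    porphyrins = 0   # intermediates queuing up at the affected step
    haem = 0
    for _ in range(hours):
        porphyrins += min(ala_input_per_hour, healthy_enzyme_rate)  # early steps keep up
        converted = min(porphyrins, late_enzyme_rate)               # the bottleneck step
        porphyrins -= converted
        haem += converted
    return porphyrins, haem

print("healthy  :", run_pathway(healthy_enzyme_rate))  # (0, 2400): no build-up, plenty of haem
print("porphyria:", run_pathway(faulty_enzyme_rate))   # (1920, 480): porphyrins accumulate, haem falls
```

Crude as it is, it shows the two consequences described above in one go: porphyrins building up to harmful levels, and far less haem coming out of the end of the pathway.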

The most common form, and the one most easily related to vampires, is porphyria cutanea tarda (PCT), which affects the skin. PCT causes photosensitivity, in which exposure to sunlight can leave skin painful, blistered and itchy… sound like something you've heard of before? A well-known characteristic of vampires is that they burn in sunlight, hence must stay out of the sun and so have a dramatically pale complexion. Similarly, many porphyria sufferers are indeed pale because, naturally, they avoid sunlight due to their photosensitivity. What's more, healing after this reaction to sunlight is often slow and, with repeated damage, can cause the skin to tighten and shrink back. If this shrinking causes the gums to recede, you can imagine that the canines may start to resemble fangs.

Another general symptom of porphyria is that when the accumulated porphyrins are excreted, the resulting faeces may turn a purple-red colour. Whilst the same conclusion may not be reached in modern times, historically this may have given the impression that the sufferer had been drinking blood, which is another vampire hallmark. Interestingly, drinking blood could (and I say this tentatively) actually relieve some symptoms of porphyria. Whilst the haemoglobin would be broken down, the haem pigment itself could survive digestion and be absorbed from the small intestine, meaning that in theory drinking blood could relieve symptoms in the same way a blood transfusion would. Finally, garlic. Seemingly the most random trait of vampires is their aversion to garlic, yet even this could be explained by porphyria. Garlic can stimulate red blood cell and haem production, which, for a person with porphyria, could worsen their symptoms as more porphyrins build up. This could lead to an acute attack in which abdominal pain, vomiting and seizures may occur. Seems like an extreme reaction, but perhaps…

So does porphyria explain how the legends of vampires came about? I would say so, but some folklorists and experts would disagree. They suggest that porphyria doesn't accurately explain the original characteristics of vampires so much as the more recent fictional adaptations. Folkloric vampires weren't believed to have any issues with sunlight at all and were described as looking healthy, 'as they were in life', which contradicts the pale complexion and blistering skin seen in PCT. Furthermore, it is still unclear whether drinking blood would truly relieve the symptoms of porphyria, and even if it did, how would those affected know to try it, with no understanding of their disease and no craving for blood? It all seems rather unlikely. Speaking of probability, reports of vampires were rampant in the 18th century, yet porphyria is relatively rare and its severity ranges from full-on vampire to no symptoms developing at all, making it seem even less probable that such an apparently widespread phenomenon could be the result of PCT.

Whether you believe that porphyria caused the vampire myths or not, it certainly is an interesting disease that sadly has no cure (so far), is difficult to diagnose and generally relies on management rather than treatment. Here's hoping that future developments in gene therapy, and even research spun off from the use of 'vampire plant' models, could lead to improvements some day.

 

Read it? Rate it!

A Look Back at Anaesthesia

An anaesthetic is a substance that induces insensitivity to pain; anaesthesia literally means 'without sensation'. Anaesthesia is used on a day-to-day basis in modern medicine, during tests and surgical operations, in order to numb areas of the body or induce sleep. The types you are probably most aware of, local and general, are the two most common, but there are others such as regional anaesthetics and sedatives.

Anaesthetics may not be perfect, with roughly 10 deaths for every million anaesthetics given in the UK, but developments over the years have made anaesthesia very safe, with serious problems being rare. I'm taking a look back in time to see how anaesthesia, general anaesthesia in particular, has changed over the last two centuries.

 
Starting in the 1820s, Henry Hickman explored the use of carbon dioxide as an anaesthetic to perform painless surgery on animals. Carbon dioxide could effectively suffocate the animals out of consciousness for long enough to perform the surgery. This is considered to have been a major breakthrough in surgery and anaesthesia; however, carbon dioxide was never widely used because of the risks associated with partial asphyxiation, and Hickman's work was heavily criticised at the time. In more recent times, carbon dioxide was reportedly used medically in the USA during the 1950s, and it is currently used to stun animals before slaughter in numerous slaughterhouses.

 

Ether had a reputation as a recreational drug during the 1800s, but in 1842 it was used for the first time as an anaesthetic by the American physician William Clarke, in a tooth extraction. That same year, Crawford Long used it while removing a tumour from a man's neck, although he did not publish an account of it until several years later, in which he described how the patient felt nothing throughout the procedure.

William Morton popularised the use of ether in 1846, when he administered it for the removal of a tumour from a man's jaw. News of the operation travelled around the world quickly and ether became a widely adopted method, with Morton often credited as the pioneer of general anaesthesia. Ether did have its drawbacks though, causing coughing in patients and being highly flammable, meaning research into anaesthesia continued to develop.

 

Humphry Davy discovered that nitrous oxide could dull pain back in 1799, but its use as an anaesthetic wasn't fully realised until 1844. Horace Wells attended a public demonstration of nitrous oxide by Gardner Colton, and the very next day Wells himself underwent the first reported painless tooth extraction while Colton administered the gas. Nitrous oxide, which you may know better as 'laughing gas', is still used today in dentistry and childbirth, and even recreationally: it was the 4th most used drug in the UK according to the Global Drug Survey 2015. Of course, nitrous oxide only dulls pain, so it could not and cannot be used in major surgery unless combined with other anaesthetics.

 

You may have heard that Queen Victoria used chloroform during the birth of her 8th child in 1853, but it had actually been around for some 6 years beforehand and had been met with a lot of opposition before that event. The Scottish obstetrician James Simpson was the first to use chloroform to relieve the pain of childbirth, in 1847, and it widely replaced ether as it was quicker acting, didn't smell as pungent and had fewer side effects. Nevertheless, it was met with opposition, mainly because of deaths and religion.

Some religious people believed it was God's intention for women to feel pain during childbirth, so such pain should be endured, not relieved. At the time religion held a lot of power, and this scared many God-fearing people away from chloroform. Meanwhile, administration required great skill, as the person giving it had to be experienced enough to judge the right dose; the first reported death from chloroform overdose came in 1848. Chloroform fatalities were widely publicised but were mainly caused by poor administration, a problem largely overcome when John Snow invented the chloroform inhaler, which controlled the dosage to make the anaesthetic safer and more effective. When Snow used the inhaler to anaesthetise Queen Victoria, the positive publicity left little opposition remaining. The use of chloroform has since been discontinued, as it was realised that it could cause liver and heart damage.

 

Skipping forward to 1934, when sodium thiopental became the first intravenous anaesthetic (one injected into the bloodstream). It was developed by Ernest Volwiler and Donalee Tabern, both working for Abbott Laboratories, and a clinical trial of thiopental was conducted at the Mayo Clinic 3 months later. It rapidly entered common practice because it was fast and short acting (taking effect in around 30 seconds and lasting 4-7 minutes), and the fact that it was intravenous allowed for more precise dosing. Volwiler and Tabern were inducted into the National Inventors Hall of Fame in 1986, and thiopental is still used alongside or before other anaesthetics, although not alone, because its effects do not last long enough to be of practical value on their own. Since the 1980s thiopental has slowly been replaced by propofol, which is likewise fast and short acting but also has antiemetic properties (it prevents nausea and vomiting).

 

Halogenated inhaled agents are routinely used today, and it could be argued that their emergence transformed anaesthesia as much as chloroform did over 100 years earlier. Halothane was first synthesised by C. Suckling in 1951 and first used clinically by Dr Johnstone in Manchester 5 years later; it is still widely used in developing countries and in veterinary surgery because of its low cost. Following halothane came enflurane (1966), isoflurane (1979), sevoflurane (1990) and finally desflurane (1990s). These halogenated inhaled agents have the beneficial properties of low solubility (meaning they take effect rapidly), minimal cardiorespiratory depression and non-flammability. Despite these characteristics, halogenated agents only cause loss of consciousness and do not relieve pain, so they are used in conjunction with other anaesthetics.

 

Curare was traditionally used on poison darts and arrows by indigenous peoples of South America, and in 1942 it became the first non-depolarising muscle relaxant (one that blocks the natural agonist, acetylcholine, from binding to its receptors). Neuromuscular blocking drugs like curare can be used to produce paralysis, allowing intubation of the trachea and optimising the surgical field. Endotracheal intubation is important as it is used to administer gases like the halogenated inhaled agents and to aid ventilation of the lungs. Because muscle relaxants essentially paralyse the muscles, the muscles cannot contract, enabling surgery to be performed as effectively and safely as possible. Patients given muscle relaxants alone can still feel pain at full intensity but cannot move, so anaesthetics and analgesics (pain-relieving drugs) are also given to prevent pain and anaesthesia awareness.

 

As you can see, anaesthetics have changed a lot in the last 200 years, and anaesthesia is now an entire branch of medicine in its own right. Nowadays, a general anaesthetic cannot be given to a patient without a trained anaesthetist, dosage is carefully controlled, and a combination of anaesthetics can be used to achieve the ideal effect.

 

Read it? Rate it!

Making the Burr Hole

Trepanned Skull

'Making a burr hole' is just another way of saying trepanning. What is trepanning? Basically, it's drilling a hole in the skull, and it is one of the oldest surgical procedures for which we have archaeological evidence.

I have recently been to London, and during my trip I spent an afternoon in the Science Museum, which I would recommend to any science lover, prospective medic or history buff (and it's great for kids). While there, I visited the 'Journeys Through Medicine' exhibition, which was super interesting and sparked my interest in a lot of areas to do with medicine and its history. It's amazing to see how ideas and practices have changed over time, and this is just one such area that I'd like to talk about.

 

So, back to trepanning. It dates all the way back to prehistory, with evidence suggesting it was being used as far back as 6,500 BC. And the motivation? Medicine! You'd assume that such a method, at a time before even writing had been invented, would never actually help, and I'm sure it didn't, yet people back then clearly believed it did, because trepanning was used right up until the Renaissance.

During prehistory, it may have been used to remove evil spirits which were believed to be causing pain, for example from headaches or cranial injuries. However, given the lack of hard evidence about its purpose, much of this is speculation, and some historians have found evidence to suggest that trepanning was used as part of a ritual, so it was perhaps not related to medicine at all.

What is really interesting is that in many of the trepanned skulls found in archaeological digs, the bone shows signs of healing, which proves the 'patient' survived the procedure. That says something about prehistoric people, right?

 

Trepanning continued through the ancient period, with evidence of skull scrapings being used to make potions in Ancient Egypt and mentions of trepanning by Hippocrates in Greece, by which time the instrument used was similar to the modern trephine. Trepanning is also seen in Ancient Roman times, with Galen spelling out the risks associated with it:

“If pressed too heavily on the brain, the effect is to render the person senseless as well as incapable of all voluntary motion”.

Over time we see the method become slightly more refined. For example, during the Ancient Roman period, because of the frequency of war as the Romans conquered more lands, army doctors gained lots of practice in basic first aid and surgery on wounded soldiers, leading to improvements in technique. What's more, during the ancient era the focus of trepanning began to shift towards dealing with trauma as opposed to superstitious beliefs.

During the Medieval period, however, there were few developments and even some regression, so a medieval surgeon would typically carry out trephining on an epileptic patient for much the same reasons as in prehistoric times.

Finally, details of 'old-school' trepanning appear during the Renaissance, when Paré's notes describe it as the most commonly used procedure for treating skull fractures. He even provided an image of what Renaissance trepanning tools looked like!

 

Now, I know I said trepanning was used up until the Renaissance, but in fact it is still used in the modern day. For example, trepanation is used in some modern eye surgeries, such as corneal transplants. It is also used in intracranial pressure monitoring and in surgery to treat subdural and epidural haematomas (blood clots), in which a craniotomy or burr hole is made to remove the blood. See the connection?

 

Thousands of years, and trepanning is still around in one form or another. That’s a lot of history, huh?

 

Thank you for reading, remember to rate and share!