The Tuskegee Experiments

A few weeks ago, I was reading through my lecture notes and came across a single bullet point casually mentioning the Tuskegee Experiments, a study on African-American men who were never given informed consent. My reaction was two-fold – first, what?! Second – why have they not gone into more detail about this?? So I went and found out more. And what I found was a shocking example of abhorrent abuse of power by trusted healthcare professionals, one that deserved its own lecture rather than a passing comment. As a result I find myself writing this post, because I think everyone should know their history and see what can happen when the medical profession isn't held accountable to the people it serves.

In 1932 the US Public Health Service launched a study following the progression of untreated syphilis in black men in Macon County, Alabama. The study was called the “Tuskegee Study of Untreated Syphilis in the Negro Male” – named Tuskegee because the Public Health Service conducted the study with the Tuskegee Institute in Macon County. 

At the time syphilis was a massive public health issue, with 300,000 new cases every year. Pre-1932, the study had originally been 1 of 6 pilot projects in the South aiming to diagnose and treat syphilis in poor black communities. But this program was discontinued when the Public Health Service rather quickly realised they had underestimated the cost. During the course of that program, however, they did discover that an incredibly high proportion of people in Macon County were infected with syphilis: 35%!

In this poor, black and highly infected community Dr Taliaferro Clark saw a so-called "unparalleled scientific opportunity". He proposed a 6-month study of untreated syphilis with the primary aim of investigating whether the disease affected black people in the same way as it did white people. And just like that, patients became experimental subjects…

The study involved 600 black men; 399 with syphilis and 201 without. The participants were primarily sharecroppers, many of whom had never before visited a doctor. This study took advantage of medically and socially disadvantaged, vulnerable people. These men trusted the Public Health Service and were unlikely to ever question the doctors. In return for taking part in the study they were promised free medical care, free meals and burial insurance, but they were never truly informed of the nature of the study.

The nature of the study being… really quite horrific. To start, the researchers told the men they were being treated for "bad blood", an ambiguous term which could've meant syphilis but could also have meant a number of other things. And throughout the study the men were never told that it was syphilis they were being treated for. Not that they actually were being treated, despite being told they were.

The researchers not only didn't treat the men but actually went to the effort of pretending to treat them, handing out placebos – vitamins, aspirin and tonics which did nothing to help syphilis, as the researchers knew full well. In order to determine the prevalence of neurological syphilis, researchers would conduct spinal taps on subjects under the guise of giving "very special free treatment". This level of dishonesty is shameful: omitting the truth is bad enough, but they explicitly lied time and time again to the unwitting participants.

The data from those initial 6 months showed that black patients did suffer the same complications as white patients infected with syphilis. And those complications weren't exactly trivial. They included blindness, insanity and death. But instead of shutting down the study and treating the participants for this life-changing disease, this news only propelled the researchers forward. Raymond Vonderlehr, one of the chief originators of the Tuskegee study, pushed to make it an open-ended project, and as a result a study that started out as 6 months actually lasted nearly 40 years.

And in those 40 years, not only was treatment denied but the lengths the researchers went to to prevent treatment were pretty extreme. In 1938 a nationwide campaign to treat syphilis was introduced – offered everywhere except Macon County. What's more, if participants sought out help they were stopped; one man described attending a clinic in another town, only to be tracked down by a nurse from the study and sent home before he could access treatment. And during World War II, the scientists even appealed to the draft board to exclude Tuskegee men from service, because enlisted men were treated for syphilis as a matter of policy. At this point I guess it won't surprise you that penicillin also wasn't offered to the men when it became the established and widely available treatment for syphilis. But it does disappoint, doesn't it? To think that on so many occasions these men could have been treated and saved from their suffering, and an active decision was made not to.

Now you may be wondering, how did something like this go on for 40 years without anyone noticing? The sad fact is that the Tuskegee experiments weren't exactly a secret. The results of the study were periodically published in major medical journals throughout the study period and even reported to Congress. But nobody questioned the ethics of the study because – simply put – nobody cared about the lives of these poor black men. The awful truth is that treating African-American individuals as test subjects whose lives were somehow worth less was a completely normalised practice. And coming to that realisation is probably what upset me most about this whole sordid affair. I have to say that I assumed, even with my knowledge of what society was like back then, that had the wider medical profession known about the study they would've condemned it. I might only be one third of a doctor but even I know that this study was unethical on so many levels. It doesn't take a genius to see that it goes against the very first medical principle we're taught: 'do no harm'.

It's amazing, in a terrible way, that the study ran for as long as it did. Even when ethical principles for human experimentation were laid out, like the Nuremberg Code in 1947, the study continued. The director of the experiment supposedly failed to see how the Tuskegee experiment was in direct violation of it. Alright then, let's have a look. "Voluntary informed consent". No. "Risk must be weighed against expected benefit". Strike two. "Unnecessary pain and suffering must be avoided". I think you get the picture. Even against the backdrop of the civil rights movement in the 60s, when we saw massive legislative gains in the rights of African-Americans, this study prevailed. Social change was happening and the researchers totally disregarded it. So often bad behaviour is given a 'Get Out of Jail Free' card because "it was ok at the time". I don't believe that is a very strong argument to start with, but when you consider the context there was literally no excuse for this study to continue, because it wasn't ok at the time.

So when, and how, did the study come to an end? Let's just say they didn't go down easily. In the mid-1960s a venereal disease investigator called Peter Buxtun heard a story about a study participant, driven insane by late-stage syphilis, who was rushed to a physician and treated, only for the local medical society to punish the physician in response. Buxtun expressed his concerns about the Tuskegee study to his superiors, and so the Public Health Service formed a committee to review it. Great, sorted! Nope. The panel voted to continue the study and to keep withholding treatment, with the goal of tracking the participants until they had all died. Their reasoning was that the men had been untreated for so long that penicillin might not help anyway, and "telling them would only upset them". Ridiculous, is my response to that. Might not help? Sounds like one way of saying MIGHT HELP THEM YOU SOCIOPATHS. Sorry, calm. I. Am. Calm. As for the whole "telling them would upset them" thing, I think that's beside the point. Yes, it would upset them. Rightly so. They deserved to be shown some respect, told upfront what had happened, and given an apology. Plus reparations. A whole lot of reparations. Back to Peter Buxtun, who I have profound respect for – not only for speaking up in the first place but for then leaking the story to a reporter following the PHS' decision.

The Washington Star broke the story in July 1972, prompting public outrage and causing the government to launch an official investigation. In October 1972 the panel advised stopping the study at once and a month later the end of the Tuskegee Study was announced. Over the course of the Tuskegee study, 28 participants died from syphilis. 100 more passed away from related complications. At least 40 spouses had been diagnosed with it and the disease had been passed on to 19 children at birth. It’s important to keep that in mind, to understand that no amount of reparations would take back what happened here. In 1973 a class-action lawsuit was filed and a $10 million settlement was reached. As part of the settlement, the US government promised to give lifetime medical benefits and burial services to all living participants and in 1975 wives, widows and offspring were added to the program. The last study participant died in January 2004. 

I wish I could say that this was an isolated incident, but it wasn't. Upon researching this I came across the details of so many unethical experiments carried out in the name of science, and it makes me sick. I've always prided myself on my association (or future association) with medicine, thinking of it as a noble profession where you can do some good. That hasn't completely changed, but I am left a little less starry-eyed about the medical world than when I started. Medicine has a dark past like everything else; I guess it was silly of me to think it could be exempt. What's important is that we learn from our history and strive to be better. A whole lot better.

When I originally thought of writing this blog post it was not quite as topical as it is today. In the past several weeks we’ve seen a massive Black Lives Matter movement and I absolutely stand by it. When I was younger and learnt about the American Civil Rights Movement and what society had looked like back then I would think ‘wow that’s awful, at least it’s nothing like that now’. As I’ve grown up I’ve come to see that we’ve not come nearly as far as I thought we had and to say I’m disappointed would be an understatement. My little sister said something to me that’s been playing on repeat in my head ever since. “Wow. I used to wish I wasn’t brown because people assume we’re terrorists but now I’m just glad I wasn’t born black.” There is so much wrong with this. It says so much about the world we live in. My sisters deserve better. The Tuskegee participants deserved better. George Floyd. Breonna Taylor. Elijah McClain. They deserved better. And countless others. 

Antibiotics; A journey from good, to bad, to ugly

As I hope you are already aware, medicine is facing a crisis at the moment which could prove to be a bigger killer than cancer: antibiotic resistance. Bacterial infection seems an archaic issue, one that in the public eye we solved years ago and have since forgotten as a major threat to human health, but it's making a comeback. Before we get into all that, however, let's first look at what antibiotics are and how they came about in the first place.

Antibiotics are a group of drugs which can treat or prevent bacterial infections by inhibiting or stopping the growth of bacteria. There are hundreds of different antibiotics in circulation today, but they all stem from just a few drugs and so can be categorised into six groups. One of these groups is the penicillins, which includes the drugs penicillin and amoxicillin, and this is where the antibiotics' journey begins. Penicillin was discovered in 1928 by Alexander Fleming and is generally thought of as the world's first antibiotic. Its initial discovery is quite well known: Fleming accidentally uncovered the antibacterial properties of a mould called Penicillium notatum when he returned from a holiday to find that, where the mould had grown, the normal growth of Staphylococci was prevented. Fleming then grew more of the mould to confirm his findings; he had discovered something that would not only prevent growth but could be harnessed to fight bacterial infection, changing the world in its wake. However, Fleming didn't develop penicillin into what it is today. To do this, the active ingredient would need to be isolated, purified, tested and then produced on a grand scale.

It was 10 years later that endeavours into this began, when Howard Florey came across Fleming's work and began developing it further with his team at Oxford University. Florey, and one of his employees Ernst Chain, managed to produce penicillin culture fluid extracts. From this, they experimented on 50 mice that they had infected with streptococcus, treating only half with penicillin injections. Whilst the untreated half of the infected mice died from sepsis, the ones which had been treated survived, thus proving the effectiveness of penicillin. In 1941 the first human test was conducted, with injections given for 5 days to an infected patient, Albert Alexander. Alexander began to recover, but unfortunately Florey and Chain didn't have enough pure penicillin to completely treat the infection, and so Alexander ultimately died. This was now their biggest problem: making enough penicillin. It took 2,000 litres of culture fluid to extract enough pure penicillin to treat just 1 human case, so you can see how treating an entire population would frankly be impossible if they continued in this way. It was in the US, whilst Florey and Chain were looking for a solution to this problem, that they happened upon a different, more prolific fungus called Penicillium chrysogenum. This yielded 200 times more penicillin than the previously used species, and with a few mutation-causing X-rays yielded 1,000 times more. This allowed 400 million units of penicillin to be produced for use during the war in 1942, reducing the death rate from bacterial infection to less than 1%. And so began the antibiotic revolution in medicine which changed the world into what it is today.
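To put those yield improvements in perspective, here's a quick back-of-the-envelope calculation using only the figures quoted above (2,000 litres per case, 200x and 1,000x yields); it's a sketch of the arithmetic, not independent data:

```python
# Litres of culture fluid needed to treat one case, at each stage.
litres_per_case_notatum = 2000  # Penicillium notatum: 2,000 L per case

# P. chrysogenum reportedly yielded 200x more penicillin per litre,
# and its X-ray-induced mutants 1,000x more, so the fluid needed
# per case shrinks by the same factors.
litres_per_case_chrysogenum = litres_per_case_notatum / 200
litres_per_case_mutant = litres_per_case_notatum / 1000

print(litres_per_case_chrysogenum)  # 10.0 litres per case
print(litres_per_case_mutant)       # 2.0 litres per case
```

From 2,000 litres down to 2 per patient – that's the difference between a laboratory curiosity and wartime mass production.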

Now, over 70 years later, we're faced with a huge issue. Bacteria have become resistant to antibiotics due to their overuse. Antibiotic resistance occurs when a bacterium mutates so that the antibiotic can no longer kill it, and is part of natural selection. However, from previous knowledge of natural selection you might know that it is a slow process across many generations, and relies on the mutation being an advantage in order for it to become widespread. Yet when you use antibiotics, rather than having 1 mutated bacterium amongst millions of non-mutated ones, all the non-mutated bacteria are killed, leaving a 100% resistant population. Albeit that population is exactly 1, but bacteria can multiply and start causing havoc very quickly. And then those same antibiotics no longer have any effect on the new culture, leading to longer hospital stays, more expensive medication and increased mortality rates.
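The selection process described above can be sketched as a toy simulation – all the numbers and names here are illustrative, not epidemiological data:

```python
def apply_antibiotic(population):
    """The antibiotic kills every susceptible cell; resistant cells survive."""
    return [cell for cell in population if cell == "resistant"]

def regrow(population, generations):
    """Survivors double each generation (a toy model with no limits)."""
    for _ in range(generations):
        population = population + population
    return population

# Start: one resistant mutant amongst a million otherwise susceptible bacteria.
population = ["susceptible"] * 999_999 + ["resistant"]

survivors = apply_antibiotic(population)
print(len(survivors))  # 1 -- a tiny population, but 100% resistant

# A few generations later the culture is back, and the drug is now useless.
new_culture = regrow(survivors, generations=10)
print(len(new_culture))  # 1024, every one of them resistant
```

The antibiotic does exactly what natural selection normally takes generations to do: it removes all the competition for the resistant mutant in one go.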

Given how easily everyone is now able to travel around the world, this issue is not localised to 1 country or continent but is on a global scale. Given that tackling antibiotic resistance requires investment in finding new antibiotics and using the more expensive medications on hand, developing countries could be hit much harder than developed ones. Meanwhile, our struggling NHS will experience even more pressure as patients have longer hospital stays. Worst-case scenario, if this issue isn't addressed we could return to a time before antibiotics where common infections are once again fatal, killing 10 million people a year by 2050. Clearly not a desirable outcome, so how can we tackle antibiotic resistance?

Prevention and control of antibiotic resistance requires change at several different levels of society. The World Health Organisation has made recommendations for what we as individuals, healthcare professionals, policy makers and people in the agriculture sector can do. Amongst the advice for individuals is not sharing leftover antibiotics, not demanding antibiotics when they aren't considered necessary, and preparing food hygienically to prevent infections. Meanwhile, the healthcare industry needs to invest in research into new antibiotics, as so far most drugs being developed have only been modifications of existing antibiotics. They don't work for long, and of the 51 new antibiotics currently in development only 8 have been classed by WHO as innovative and capable of contributing meaningfully to the issue.

Research into new drugs is of the utmost importance, but the responsibility appears to fall to governments to fund it, as pharmaceutical companies are reluctant to do so. Some suggest that this is because it is more lucrative for drug companies to treat chronic conditions, in which patients rely on their drugs for a lifetime, than to provide short-term (and may I point out, life-saving) treatments. Thankfully, this year countries including Switzerland, South Africa and the U.K. pledged a combined $82 million to support the Global Antibiotic Research and Development Partnership, which was set up by WHO and the Drugs for Neglected Diseases Initiative. Less encouraging is the estimate by the director of the WHO Global Tuberculosis Programme that more than $800 million is needed annually to fund research into new anti-tuberculosis medicines.

I know. It's feeling dismal. But what can we do, on the metaphorical shop floor? Stop overusing antibiotics, follow through on any prescribed course of antibiotics even if you're feeling better, and make other people aware of the issue. The bigger the issue becomes in the eyes of the public, the better the government and other organisations will address it, so talk about it to anyone who will listen! I am optimistic that we as a species will survive this most recent challenge to our health, but only if we pull our socks up and start addressing the situation.


Thank you for reading, remember to rate and share!

Vampire or Victim?

Brasov in Transylvania, Romania

Following my recent trip to Transylvania in Romania for work experience, I have decided to dedicate this post to the monsters and myths which stem from the region: specifically vampires. Like many historical ideas and beliefs, the creation of such a supernatural being likely served as an attempt to explain ailments that couldn’t otherwise be explained, before the emergence of scientific understanding. In this post, I will describe the medical condition most popularly believed to have led to belief in vampires.

The most commonly referenced illness used to explain away vampirism is porphyria, a group of inherited metabolic disorders in which haem (used in haemoglobin) production is disrupted. The production of haem from a chemical called ALA involves several steps, and each step requires the use of an enzyme. In people suffering from porphyria, one of those enzymes is faulty due to the inheritance of a mutated gene which codes for the synthesis of that enzyme. This means that the production of haem is slowed or even stopped, and as a result the ‘transitional chemicals’ made in the stages between ALA and haem, known as porphyrins, can build up to harmful levels. As well as this, the limited haem production can mean that not enough healthy red blood cells can be made. There are different types of porphyria depending on which enzyme is dysfunctional, most of which produce different symptoms with some overlap.

The most common form, and the one best related to vampires, is porphyria cutanea tarda (PCT), which affects the skin. PCT causes photosensitivity, in which exposure to sunlight can cause painful, blistering and itchy skin… sound like something you've heard of before? A well-known characteristic of vampires is that they burn in sunlight, hence must stay out of the sun and so have a dramatically pale complexion. Similarly, many porphyria sufferers are indeed pale as, naturally, they avoid sunlight due to their photosensitivity. What's more, healing after this reaction to sunlight is often slow and, with repeated damage, can cause the skin to tighten and shrink back. If this shrinking causes gums to recede, you can imagine that the canines may start to resemble fangs.

Another general symptom of porphyria is that when the accumulated porphyrins are excreted, the resultant urine may turn a purple-red colour. Whilst the same conclusion may not be reached in modern times, historically this may have given the impression that the sufferer had been drinking blood, which is another vampire hallmark. Interestingly, drinking blood could – and I say this tentatively – actually relieve some symptoms of porphyria. Whilst the haemoglobin would be broken down, the haem pigment itself could survive digestion and be absorbed from the small intestine, meaning in theory that drinking blood could relieve symptoms in the same way a blood transfusion would. Finally, garlic. Seemingly the most random trait of vampires is their aversion to garlic, however even this could be explained by porphyria. Garlic can stimulate red blood cell and haem production which, for a person with porphyria, could worsen their symptoms as more porphyrins build up. This could lead to an acute attack in which abdominal pain, vomiting and seizures may occur. Seems like an extreme reaction, but perhaps…

So does porphyria explain how the legends of vampires came about? I would say so, but some folklorists and experts would disagree. They suggest that porphyria doesn't accurately explain the original characteristics of vampires so much as the more recent fictional adaptations. Folkloric vampires weren't believed to have issues with sunlight at all, and were described as looking healthy, 'as they were in life', which contradicts the pale complexion and blistering skin seen in PCT. Furthermore, it is still unclear whether or not drinking blood would truly relieve symptoms of porphyria – and even if it did, how those affected would have known to try it, with no understanding of their disease and no craving for blood, makes it all seem rather unlikely. Speaking of probability, reports of vampires were rampant in the 18th century, yet porphyria is relatively rare and its severity ranges from full-on vampire to no symptoms at all, making it seem even less probable that such an apparently widespread phenomenon could be the result of PCT.

Whether you believe that porphyria caused the vampire myths or not, it certainly is an interesting disease that sadly has no cure (so far), is difficult to diagnose and generally relies on management rather than treatment. Here's hoping that future developments in gene therapy, and even research spun off from 'vampire plant' models, could lead to improvements some day.


Read it? Rate it!

A Look Back at Anaesthesia

An anaesthetic is a substance that induces insensitivity to pain, with anaesthesia literally meaning 'without sensation'. Anaesthesia is used day to day in modern medicine, during tests and surgical operations, to numb areas of the body or induce sleep. The types you are probably most aware of, local and general, are the two most common, but there are others such as regional anaesthetics and sedatives.

Anaesthetics may not be perfect, with around 10 deaths for every million anaesthetics given in the UK, however developments over the years have made anaesthesia very safe, with serious problems being rare. I'm taking a look back in time to see how anaesthesia – more specifically general anaesthesia – has changed over the last 2 centuries.



Starting in the 1820s, Henry Hickman explored the use of carbon dioxide as an anaesthetic to perform painless surgery on animals. Carbon dioxide could effectively suffocate the animals out of consciousness for long enough to perform the surgery. This is considered to have been a major breakthrough in surgery and anaesthesia, however carbon dioxide was never widely used due to the risks associated with partial asphyxiation, and Hickman's work was heavily criticised at the time. In more recent times, carbon dioxide was reportedly used medically in the USA during the 1950s, and it is currently used before slaughter in numerous slaughterhouses.


Ether had a reputation as a recreational drug during the 1800s, but in 1842 it was used for the first time as an anaesthetic by American physician William Clarke in a tooth extraction. That same year, Crawford Long used it to remove a tumour from a man's neck, although he did not publish an account of it until several years later, in which he described that the patient felt nothing throughout the procedure.

William Morton popularised the use of ether in 1846 when he administered it during the removal of a tumour from a man's jaw. News of the operation travelled round the world quickly and ether became a widely adopted method, with Morton often being credited as the pioneer of general anaesthesia. Ether did have its drawbacks though, causing coughing in patients and being highly flammable, meaning research into anaesthesia continued to develop.


Humphry Davy discovered that nitrous oxide could dull pain in 1799, however its use as an anaesthetic wasn't fully realised until 1844. Horace Wells attended a public demonstration of nitrous oxide by Gardner Colton, and the very next day Wells himself underwent the first ever reported painless tooth extraction while Colton administered the gas. Nitrous oxide – you may know it better as 'laughing gas' – is still used today in dentistry and childbirth, and even recreationally as the 4th most used drug in the UK (according to the Global Drug Survey 2015). Of course, nitrous oxide only dulls pain, so it could not and cannot be used in major surgery unless combined with other anaesthetics.


You may have heard that Queen Victoria used chloroform during the birth of her 8th child in 1853, but it had actually been around for 6 years beforehand and was met with a lot of opposition before that event. Scottish obstetrician James Simpson was the first to use chloroform to relieve the pain of childbirth, in 1847, and it widely replaced ether as it was quicker acting, didn't smell as pungent and had fewer side effects. Nevertheless, it was met with opposition, mainly due to deaths and religion.

Some religious people believed it was God's intention for women to feel pain during childbirth, so such pain should be endured, not relieved. At the time religion held a lot of power, so this scared many God-fearing people away from chloroform. Meanwhile, administration required great skill, as surgeons had to be experienced enough to give the right dose; the first reported death from chloroform overdose came in 1848. Chloroform fatalities were widely publicised but were mainly caused by poor administration, which was overcome when John Snow invented the chloroform inhaler, controlling the dosage to make the anaesthetic safer and more effective. When Snow used the inhaler to anaesthetise Queen Victoria, the positive publicity left little opposition remaining. The use of chloroform has since been discontinued, as it was realised that it could cause liver and heart damage.


Skipping forward to 1934, when sodium thiopental was made as the first intravenous anaesthetic (injected into the bloodstream). It was developed by Ernest Volwiler and Donalee Tabern, both working for Abbott Laboratories, and a clinical trial of thiopental was conducted at the Mayo Clinic 3 months later. It rapidly entered common practice as it was fast and short acting (taking effect in around 30 seconds and lasting 4-7 minutes), and the fact that it was intravenous allowed for more precise dosage. Volwiler and Tabern were inducted into the National Inventors Hall of Fame in 1986, and thiopental is still used alongside or before other anaesthetics, although not alone because its effects do not last long enough to be of practical value. Since the 1980s thiopental has been slowly replaced by propofol, which is also fast and short acting but has antiemetic properties too (it prevents nausea and vomiting).


Halogenated inhaled agents are routinely used today, and it could be argued that their emergence transformed anaesthesia as much as chloroform did over 100 years prior. Halothane was first synthesised by C. Suckling in 1951 and first used clinically by Dr Johnstone in Manchester 5 years later; it is still widely used in developing countries and veterinary surgery because of its low cost. Following halothane came enflurane (1966), isoflurane (1979), sevoflurane (1990) and finally desflurane (1990s). These halogenated inhaled agents have the beneficial properties of low solubility (meaning they take effect rapidly), minimal cardiorespiratory depression and non-flammability. Despite these characteristics, halogenated agents only cause lack of consciousness and do not relieve pain, so they are used in conjunction with other anaesthetics.


Curare was traditionally used on poison darts and arrows by indigenous South American peoples and became the first non-depolarising muscle relaxant (one that blocks the agonist from binding to receptors) in 1942. Neuromuscular blocking drugs like curare can be used to produce paralysis, allowing intubation of the trachea and optimising the surgical field. Endotracheal intubation is important as it is used to administer gases like the halogenated inhaled agents and to aid ventilation of the lungs. Because muscle relaxants essentially paralyse the muscles, the muscles cannot contract, enabling surgery to be performed as safely and effectively as possible. Patients can still feel pain at full intensity under muscle relaxants but cannot move, so analgesics (pain-relieving drugs) and anaesthetic agents are also given to prevent pain and anaesthesia awareness.


As you can see, anaesthetics have changed a lot in the last 200 years, and anaesthesia is now an entire branch of medicine in its own right. Nowadays, a general anaesthetic cannot be given to a patient without a trained anaesthetist, dosage is carefully controlled, and a combination of anaesthetics can be used to achieve the ideal effect.


Read it? Rate it!

Making the Burr Hole

Trepanned Skull

'Making a Burr Hole' is just another way of saying trepanning. What is trepanning? Basically, it's drilling a hole in the skull, and it is one of the oldest surgical procedures for which there is archaeological evidence.

I have recently been to London, and during my trip spent an afternoon in the Science Museum, which I would recommend to any science lover, prospective medic or history buff – and it's great for kids too. During my visit I went around the 'Journeys Through Medicine' exhibition, which was super interesting and sparked my interest in many areas of medicine and its history. It's amazing to see how ideas and practices have changed over time, and this is just one such area that I'd like to talk about.


So, back to trepanning. It dates all the way back to prehistory, with evidence that it was being used as far back as 6,500 BC. And the motivation? Medicine! You'd surely assume that such a method, from a time before even writing had been invented, would never actually help – and I'm sure it didn't – yet people back then clearly believed it did, because trepanning was used right up until the Renaissance.

During Prehistory, it may have been used to release evil spirits which were believed to be causing pain, for example from headaches or cranial injuries. However, given the lack of hard evidence about its purpose, much is left to speculation, and some historians have found evidence to suggest that trepanning was used as part of a ritual, so perhaps it wasn't related to medicine at all.

What is really interesting is that many of the trepanned skulls found in archaeological digs show signs of bone healing, which proves the 'patient' survived the procedure – which says something about prehistoric people, right?


Trepanning continued through the Ancient period, with evidence of skull scrapings being used to make potions in Ancient Egypt, and mentions of trepanning by the Greek physician Hippocrates, by which time the instrument used was similar to the modern trephine. Trepanning is also seen in Ancient Roman times, with Galen specifying the risks associated with it:

“If pressed too heavily on the brain, the effect is to render the person senseless as well as incapable of all voluntary motion”.

Over time we see the method become slightly more refined. For example, during the Ancient Roman period, due to the frequency of war as the Romans conquered more lands, army doctors gained lots of practice treating wounded soldiers in basic first aid and surgery, leading to improvements in technique. What's more, during the Ancient era the focus of trepanning began to shift towards dealing with trauma as opposed to superstitious beliefs.

However, during the Medieval period there were few developments and even some regression, so typically a medieval surgeon would carry out trephining on an epileptic patient for reasons similar to those of prehistoric times.

Finally, details of 'old-school' trepanning appear during the Renaissance, as Paré's notes describe trepanning as the most commonly used procedure to treat skull fractures. He even provided an image of what Renaissance trepanning tools looked like!


Now, I know I said it was a long time ago, but trepanning is in fact still used in the modern day. For example, trepanation is used in some modern eye surgeries, such as corneal transplants. It is also used in intracranial pressure monitoring, and in surgery to treat subdural and epidural haematomas (blood clots), in which a craniotomy or burr hole is made to remove the blood. See the connection?


Thousands of years, and trepanning is still around in one form or another. That’s a lot of history, huh?


Thank you for reading, remember to rate and share!