Vampire or Victim?

August 3, 2017 in History, The Science Behind...

 

Brasov in Transylvania, Romania

Following my recent trip to Transylvania in Romania for work experience, I have decided to dedicate this post to the monsters and myths which stem from the region: specifically vampires. Like many historical ideas and beliefs, the creation of such a supernatural being likely served as an attempt to explain ailments that, before the emergence of scientific understanding, couldn’t otherwise be accounted for. In this post, I will describe the medical condition most popularly believed to have led to belief in vampires.

The most commonly referenced illness used to explain away vampirism is porphyria, a group of inherited metabolic disorders in which the production of haem (the pigment used in haemoglobin) is disrupted. Haem is made from a chemical called ALA in several steps, and each step requires its own enzyme. In people with porphyria, one of those enzymes is faulty because the gene coding for it is mutated. This means that the production of haem is slowed or even stopped, and as a result the ‘transitional chemicals’ made in the stages between ALA and haem, known as porphyrins, can build up to harmful levels. On top of this, the limited haem production can mean that not enough healthy red blood cells are made. There are different types of porphyria depending on which enzyme is dysfunctional, and most produce different symptoms with some overlap.
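
To picture why one faulty enzyme causes porphyrins to pile up, here is a tiny toy simulation in Python. It is purely illustrative: the step names and numbers are invented, and real haem synthesis involves eight enzymes across several cell compartments. Material enters as ALA and moves along a chain of steps, and a single blocked step makes the intermediate before it accumulate while no haem gets made.

```python
# Toy sketch only, not real biochemistry: a chain of steps from ALA towards
# haem, each needing a working enzyme. One faulty enzyme blocks its step.
steps = ["ALA", "intermediate 1", "intermediate 2", "intermediate 3", "haem"]
faulty_enzyme_step = 2            # enzyme converting intermediate 2 -> 3 is faulty

levels = {name: 0 for name in steps}

for _ in range(100):              # feed in one unit of ALA each "tick"
    levels["ALA"] += 1
    # work backwards so material moves along one step per tick
    for i in range(len(steps) - 2, -1, -1):
        if i == faulty_enzyme_step:
            continue              # blocked step: nothing is converted here
        amount = levels[steps[i]]
        levels[steps[i]] = 0
        levels[steps[i + 1]] += amount

print(levels)                     # intermediate 2 piles up; no haem is made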

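```

Running it, the intermediate just before the faulty step grows with every tick while haem stays at zero, which is essentially the double problem in porphyria: too many porphyrins and too little haem.
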
The most common form, and the one best related to vampires, is porphyria cutanea tarda (PCT), which affects the skin. PCT causes photosensitivity, in which exposure to sunlight can cause painful, blistering and itchy skin… sound like something you’ve heard of before? A well-known characteristic of vampires is that they burn in sunlight, hence must stay out of the sun and so have a dramatically pale complexion. Similarly, many porphyria sufferers are indeed pale because they naturally avoid sunlight due to their photosensitivity. What’s more, healing after this reaction to sunlight is often slow and, with repeated damage, can cause the skin to tighten and shrink back. If this shrinking causes gums to recede, you can imagine that the canines may start to resemble fangs.

Another general symptom of porphyria is that when the accumulated porphyrins are excreted, the resulting faeces may turn a purple-red colour. Whilst the same conclusion may not be reached in modern times, historically this may have given the impression that the sufferer had been drinking blood, which is another vampire hallmark. Interestingly, drinking blood could- and I say this tentatively- actually relieve some symptoms of porphyria. Whilst the haemoglobin would be broken down, the haem pigment itself could survive digestion and be absorbed from the small intestine, meaning that in theory drinking blood could relieve symptoms in much the same way as a blood transfusion. Finally, garlic. Seemingly the most random trait of vampires is their aversion to garlic, yet even this could be explained by porphyria. Garlic can stimulate red blood cell and haem production which, for a person with porphyria, could worsen their symptoms as more porphyrins build up. This could lead to an acute attack in which abdominal pain, vomiting and seizures may occur. Seems like an extreme reaction, but perhaps…

So does porphyria explain how the legends of vampires came about? I would say so, but some folklorists and experts would disagree. They suggest that porphyria doesn’t really explain the original characteristics of vampires so much as the fictional adaptations that have appeared more recently. Folkloric vampires weren’t believed to have issues with sunlight at all and were described as looking healthy, ‘as they were in life’, which contradicts the pale complexion and blistering skin seen in PCT. Furthermore, it is still unclear whether drinking blood would truly relieve the symptoms of porphyria, and even if it did, it is hard to see how sufferers would know to try it with no understanding of their disease and no craving for blood. Speaking of probability, reports of vampires were rampant in the 18th century, yet porphyria is relatively rare and its severity ranges from full-on vampire to no symptoms developing at all, making it seem even less probable that such an apparently widespread phenomenon could be the result of PCT.

Whether you believe that porphyria caused the vampire myths or not, it certainly is an interesting disease that sadly has no cure (so far), is difficult to diagnose and generally relies on management rather than treatment. Here’s hoping that future developments using gene therapy, and even research spun off from the use of ‘vampire plant’ models, could lead to improvements some day.

‘Carbon dating’ cancer? What’s that all about?

May 16, 2017 in In the News

 

Last week, the Institute of Cancer Research announced that scientists have been able to precisely pinpoint the timing in which different stages of a patient’s cancer developed. This could result in some interesting progress in the treatment and understanding of cancer and I will explain how researchers are doing this, but first- what is cancer?

Cancer is a group of diseases caused by the uncontrollable division of damaged cells. Cells can become damaged in this way due to a mutation in their DNA which interferes with the regulation of mitosis (cell division). More specifically, two types of gene, proto-oncogenes and tumour suppressor genes, can become mutated. Proto-oncogenes trigger division, however mutated ones (known as oncogenes) trigger mitosis to happen at a much faster rate than normal. Meanwhile, mutated tumour suppressor genes fail to inhibit cell division as they should. Both mutations, as you can imagine, lead to the fast and furious growth of a cell into a tumour.

Mutations naturally occur quite frequently, but not all mutations cause cancer, and in fact it often requires more than one mutation in a cell in order for it to become cancerous. Most of the time either the mutation is relatively harmless, the DNA is repaired or the cell ‘kills itself’ before it can do any harm (apoptosis). However, in cancer cells the signals telling them to undergo apoptosis can be overridden so that the damaged cell continues to divide, producing even more damaged cells.

There are multiple methods currently in use to treat cancer, but the three most common treatments are surgery, chemotherapy and radiotherapy. In surgery, the tumour is simply removed from the body; however, this is only a complete cure if the cancer is contained in one area and hasn’t spread. Surgery is often used in combination with other treatments, such as chemotherapy given beforehand to shrink the tumour (neoadjuvant treatment).

Chemotherapy is the use of drugs to treat cancer, usually by stopping cells from dividing. The drugs do this by either preventing DNA from replicating (which occurs in the time preceding mitosis) or by interrupting mitosis during the metaphase stage. Chemotherapy is most effective against rapidly dividing cells like cancer cells, but it can also affect other cells which divide frequently, such as hair-producing cells. This explains some of the side effects, such as loss of hair, that can occur during chemotherapy.

Radiotherapy works by using high-energy rays (normally x-rays) to damage the DNA in dividing cells. Seems confusing, doesn’t it, given that cancer itself occurs due to damage to DNA in the first place? Radiation is different because the way in which it damages the cells means that they can’t grow or divide anymore. That damage can generally (though not always) be repaired in normal cells, which is why radiotherapy has unwanted side effects, but cancer cells cannot fix themselves so they die over time.

So now that you are clued up on how cancer occurs and can be stopped, back to carbon dating. Carbon dating is used to determine the age of organic matter by measuring the amount of carbon-14 it contains, but here’s the burn- I’m not really talking about carbon dating. Sorry! Don’t leave just yet, because what these scientists did to find out when the various stages of a patient’s cancer progressed is still pretty interesting. The researchers took the genetic analysis and mathematical models normally used in evolutionary biology and applied them to cancer instead. In evolutionary biology, genetic data from current species can be used in combination with carbon-dated fossils of ancestral species to estimate when the current species- or species in between ‘now’ and ‘then’- arose throughout history. Now you can perhaps see where the carbon-dating link comes in.

These methods could only be applied, however, thanks to a needle tract tumour which occurred when a biopsy of the patient’s tumour was taken. What this means is that a sample of the cancer was taken using a needle and, where that needle was removed, some of the cancer cells contaminated the needle tract. These cancer cells grew into a metastatic tumour (a tumour which has spread from the primary site of cancer to a different area), but because the scientists knew exactly when this tumour arose, the genetic data from these cancer cells could be analysed and compared so that a timeline of how the cancer had started and spread could be built.
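
To make the timeline idea a little more concrete, here is a minimal sketch of the kind of ‘molecular clock’ calculation this relies on. Every number below is invented for illustration and this is not the study’s actual model: the needle tract tumour, whose founding date is known exactly, calibrates how fast mutations accumulate, and that rate is then used to estimate when other, undated events (such as a metastasis diverging from the primary tumour) happened.

```python
# Illustrative molecular-clock sketch; every figure below is an assumption
# made for the example, not data from the study.

# Calibration point: the needle tract tumour, whose founding date is known.
mutations_in_calibration_tumour = 60     # mutations unique to that tumour (assumed)
age_of_calibration_tumour_years = 2.0    # time since the biopsy seeded it (assumed)

# Assume mutations accumulate at a roughly constant rate.
mutations_per_year = mutations_in_calibration_tumour / age_of_calibration_tumour_years

# Date an undated event, e.g. when a metastasis diverged from the primary tumour.
mutations_since_divergence = 300         # assumed
estimated_age_years = mutations_since_divergence / mutations_per_year

print(f"Calibrated rate: {mutations_per_year:.0f} mutations per year")
print(f"Estimated time since divergence: {estimated_age_years:.1f} years")
```

In evolutionary biology the calibration point is a carbon-dated fossil; here it is a tumour of known age, which is why the carbon-dating analogy holds.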

This timeline is useful because it could help with diagnosis and treatment, not only directly but also through what else the scientists found. The researchers discovered that the cancer spread faster during the first year, but after metastasis this progression slowed. This suggests that the degree of genetic instability may play a more important part in the deadliness of a cancer than how far it has spread. This could be used to determine a patient’s prognosis more accurately, and could help doctors when evaluating how well a treatment might- or might not- work. Furthermore, tracking a cancer’s progression could enable doctors to better predict the cancer’s behaviour in the future, thus influencing the strategy for treating it.

So that’s a bit of information about what cancer is, how it can be treated and a recent research development in the field. Some of what I have shared in this post, I learnt at a ‘medicine insight day’ hosted by a group of medical students at Oxford University. They shared with us tips on how to get into university, explained some of the science in cancer and educated us about ovarian and testicular cryopreservation- a method to preserve the fertility of teenagers who have survived cancer.

This preservation is necessary because if a young person survives cancer, the treatment is often so aggressive that an ‘early menopause’ is triggered, meaning the patient becomes infertile (unable to have children). The Future Fertility Trust is a charitable trust fund which offers cryopreservation, in which ovarian or testicular tissue is collected, stored and re-implanted after cancer treatment. This enables young cancer survivors to have children in later life; however, the work is not funded by the NHS, so it relies on donations and fundraising. I would like to see ovarian and testicular cryopreservation help more young people and hope that it might become available to all young people on the NHS in the future, which could be possible if enough cases are funded to show the usefulness and success of the technique. If you would like to find out more about cryopreservation or the Future Fertility Trust, or to donate, go to http://www.futurefertilitytrustuk.org.

Peanut or Pea-not?

April 29, 2017 in The Science Behind...

 

The humble peanut is much loved around the world, making up 67% of all nut consumption in the US, and can be found as an ingredient in a ridiculous number of foods or eaten as a savoury snack. Unfortunately, peanut allergy is also one of the most common food allergies, with about 1 in 100 people in the UK being allergic.

My interest in this topic arose when someone broke the news to me that the peanut is not actually a nut. Okay, so maybe not the most ground-shaking piece of information, but a surprise to me all the same. It turns out the peanut is actually a legume. This is because a nut is considered to be a fruit whose ovary wall becomes very hard at maturity, hence the phrase “tough nut to crack”. Also, nuts tend to grow on trees. In contrast, peanut pods split open when ripe and they grow underground. In fact, they are scientifically classified as Arachis hypogaea, with hypogaea meaning ‘under the earth’.

Peanuts actually play a lot of roles in health and medicine. In terms of nourishment, peanuts are a great source of protein. They are richer in protein than soy beans, and just two tablespoons of peanut butter contain more protein than an egg. They are also nearly 50% fat- but don’t worry! 70% of that fat is unsaturated, so whilst that still makes them rather calorific, they are a very good source of energy. Peanuts are also proportionately rich in minerals such as potassium, phosphorus, iron, zinc, copper and magnesium. Finally, peanuts contain a lot of B vitamins.

In relation to medicine, peanuts are rich in resveratrol, which has antioxidant properties that help control cholesterol and prevent heart disease. Peanuts also have an abundance, it seems, of genistein, which is said to combat PMS and to hinder the development of cancer cells. One of the many amino acids in peanuts is tyrosine, which our bodies use to make dopamine, one of the ‘happy hormones’. Lack of dopamine is actually associated with ADHD (attention deficit hyperactivity disorder) in children, which can cause problems at school. However, as with everything in life, I would not recommend a huge intake of peanuts just for these health benefits, as anything in excess can be harmful.

Speaking of harmful, what is a peanut allergy and why does it happen? A peanut allergy occurs when the body’s immune system mistakenly believes that peanuts, or something in them, are harmful, so it produces an immune response to attack those allergens. During an allergic reaction, histamine is released by mast cells, which are found in connective tissues like the skin. One of the effects of histamine is to make capillaries more permeable, allowing white blood cells through; fluid also leaks out of the capillaries, creating symptoms like watery eyes and runny noses.

Antihistamines can be used to relieve the symptoms of a mild allergic reaction, but in a severe reaction anaphylactic shock can occur, in which case emergency medical treatment or an injection of epinephrine (adrenaline), as advised by a doctor, is required. The best treatment is simply to steer clear of peanuts if you’re allergic. Although the peanut is not a nut, as we recently found out, if you have a nut allergy you are likely allergic to peanuts too (and vice versa). This is because the allergens are similar, so the body responds in the same way.

So that’s a bit about peanuts and nut allergies, but also about some of the benefits of the superfood too. With over 40 million tonnes of shelled peanuts produced worldwide, peanuts are indeed a very popular foodstuff, whether they are a nut or not…

The Secret to Getting Good Sleep

April 21, 2017 in Everyday Medicine

 

Now I’m not particularly athletically inclined, try as I might, but the one thing that I could win at is getting good sleep. Of course practice makes perfect, but a few tips on how to fall asleep faster and sleep better can really make all the difference…

 

First of all, what should you be aiming for? It varies, but in general teenagers need around 9 hours and adults need about 8 hours. That’s the first step to winning: make sure you actually give your body enough time to grow, repair and do everything else mentioned in my previous post, ‘The Science Behind Sleep’.

The next key aspects are the two R’s: regularity and routine. If you regularly go to bed and wake up at the same time every day, your internal body clock will become ‘synchronised’ with your timings, which will promote better sleep. And when I say every day, I mean it! Weekend lie-ins can skew this schedule, so try to wake up as close to your regular time as you can. This may seem like a real sacrifice, but if you are able to improve your sleep quality and get enough sleep on weekdays, then weekend lie-ins will become redundant anyway.

Establishing a nightly routine before bed will indicate to your body that it’s time to wind down. Your routine could consist of a warm bath, relaxation exercises like yoga, reading a book or listening to music. Watching TV or using electronics, however, could hinder your sleep, as the blue-wavelength light of bright screens can trick your body into thinking it’s daytime. This in turn delays the release of the hormones involved in falling asleep, so it is recommended you avoid such screens in the last 30 minutes before bed.

Make sure your sleeping environment is optimal, with a comfortable mattress and pillow. The room should be dark, quiet, cool (between 18 and 24°C) and relaxing. Your diet should also work to your advantage when it comes to falling asleep. Avoid stimulants like caffeine and nicotine in the hours before sleep, and limit alcohol intake, as too much alcohol before bed can disrupt sleep later in the night.

Stress not only spoils the daytime, but can also cause insomnia by keeping you distracted and awake at night. It is important that you find ways to manage stress. The most obvious way of doing this would be to remove yourself from whatever is causing the stress but I’m well aware that it is not always that simple. Take basic steps to ensure at least some stress is relieved by being organised, allowing yourself to take breaks, eating well and doing exercise. Also, make time for hobbies and being with friends and do not be afraid to talk about your problems. Another good tip is to write a to-do list of what needs to be done the next day before you go to bed.

As little as 10 minutes of daily aerobic exercise can promote sleep, alongside the numerous other health benefits of exercising, although in general you should avoid strenuous exercise close to bedtime. Finally, try to cap daytime sleeping at a maximum of 30 minutes. Even if you haven’t had enough sleep in the night, a daytime nap cannot make up for it. That said, a power nap of 20-30 minutes in the afternoon can improve alertness and mood.

The Science Behind Sleep

April 16, 2017 in Everyday Medicine, The Science Behind...

 

We spend about a third of our lives sleeping, and many aspects of sleep remain a mystery to scientists, but what they do know is that it is very important for brain development, muscle repair, memory consolidation and growth.

Historically, sleep was thought to be a way of conserving energy; however, the energy actually saved is minimal- sleeping for 8 hours only saves about 50 kcal, the same amount of energy as in a slice of toast! Another theory is that the sleep period keeps animals safe at the time of day most dangerous in terms of predator encounters. However, the lack of consciousness and response to stimuli leaves sleeping animals vulnerable, so this theory is also not very strong.

The more widely accepted theory is that physical restoration occurs during sleep. During REM sleep, the majority of what happens is brain repair, restoration and development, whilst non-REM sleep is mainly devoted to body repair and restoration. Many studies also show how sleep improves long-term memory processing, converting short-term memories into long-term ones.

 

Sleep is generally split into REM and non-REM (NREM) sleep, with NREM further split into three or four stages. In adults, NREM makes up about 75% of sleep and REM the rest; infants spend closer to 50% of their sleep time in REM.

Stage 1 of NREM is Light Sleep, a state between asleep and awake. In light sleep, muscle activity slows down, breathing and heart rate begin to slow and people can be easily awoken. In stage 2, sometimes known as True Sleep, breathing and heart rate are regular, body temperature drops (by about 1°C) and awareness of surroundings begins to fade. A sleeper spends more time in stage 2 than in any other.

Stages 3 and 4 are often lumped together as Deep Sleep. Breathing and heart rate reach their lowest levels and responsiveness to the environment reduces even further. There is no eye movement or muscle activity, and most information processing and memory consolidation takes place in deep sleep- although it does happen to some extent in stage 2 and REM. Deep sleep is also where tissue growth and repair happen and where hormones such as growth hormone are released. Children may experience night terrors, bed-wetting or sleepwalking during deep sleep.

Following Deep Sleep we move into REM, which stands for Rapid Eye Movement. These side-to-side eye movements are intermittent and thought to be due to images seen internally during dreaming. The majority of dreams happen during REM, although scientists do not know why we dream. Unlike in NREM, heart rate and blood pressure increase and breathing becomes faster and irregular. What’s more, most muscles become temporarily paralysed during REM as the brain impulses which control movement are suppressed. This is called atonia, and it is thought to prevent us from acting out our dreams and possibly hurting ourselves. This theory was developed by Michel Jouvet, who stopped this atonia from occurring in an experiment on cats and consequently observed that the cats would physically run, jump and stalk prey during their dreams.

The first period of REM comes around 90 minutes after falling asleep, before the whole cycle begins again. The REM periods become longer, whilst periods of deep sleep become shorter, over the course of the night.

 

So that’s what happens each night when you fall asleep. It’s not as simple as just ‘being unconscious’, as your body takes that opportunity to store memories, heal and dream. Watch this space for a follow-up post on how to get that much needed sleep!

 


The Science Behind the Sun

March 31, 2017 in The Science Behind...

 

As we approach Easter, many of us will be noticing the blooming flowers and longer, sunnier days- a reason to rejoice! The better weather definitely seems to cheer people up and boost ice cream sales, and whilst most people know to some degree that catching a bit of sun can both benefit and harm health, sun protection advice is increasingly ignored by teens wanting to tan and children wanting to play out in the sun. So I’m here to educate, and tentatively advise, a little bit about the sun.

 

As most people can confidently tell me, sunlight is a great- the best, in fact- source of vitamin D. But what is vitamin D? For one thing, it’s not technically a vitamin. Vitamins are generally defined as organic chemicals that must be obtained from a person’s diet because they’re not produced by the body. Yet vitamin D is produced by the body, and about 90-95% of it is obtained through sunlight. Also, it is hardly found in any natural foods, the main exceptions being oily fish and egg yolks. Still, old habits die hard, so it’s referred to as a vitamin even so.

Vitamin D is obtained from sunlight by using the sun’s ultraviolet B energy to turn a chemical in your skin into vitamin D3. D3 is then carried to your liver and then your kidneys, each time picking up oxygen and hydrogen atoms, to finally become 1,25(OH)2D, also known as calcitriol or active vitamin D.

Now, what vitamin D actually does is a bit more interesting. Its best-known role is to keep bones healthy, which it does by increasing the amount of calcium that can be absorbed in the intestines. Without enough vitamin D, the body only absorbs 10-15% of the calcium in our diets, whilst 30-40% can be absorbed with the right amount of it. It also helps the body to absorb the phosphate in our diet, which is likewise required for bone health.
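
As a rough worked example of what that difference means in practice (the 1,000 mg daily intake below is just an assumed figure for illustration, not a recommendation):

```python
# Rough illustration of the absorption figures above; the intake value is an
# assumption made for the example.
daily_calcium_mg = 1000

without_vitamin_d = daily_calcium_mg * 0.125   # midpoint of the 10-15% range
with_vitamin_d = daily_calcium_mg * 0.35       # midpoint of the 30-40% range

print(f"Absorbed without enough vitamin D: ~{without_vitamin_d:.0f} mg")
print(f"Absorbed with adequate vitamin D:  ~{with_vitamin_d:.0f} mg")
```

On those assumed numbers, adequate vitamin D roughly triples the calcium actually taken up from the same diet.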

Without sufficient vitamin D, bones can become soft and weak leading to bone deformities like rickets in children. Rickets is no longer as common as it used to be, but it causes bone pain, poor growth and deformities of the skeleton.

 

Sunlight has many other benefits, such as mood improvement. Exposure to sunlight can increase the brain’s release of serotonin, a hormone associated with boosting mood, and a deficit of serotonin can lead to depression. Fewer deaths from heart disease also occur in summer than in winter, suggesting that the sun may help reduce heart disease. UV radiation from sun exposure can be used to treat eczema (dry, itchy skin), jaundice (yellowing of the skin and whites of the eyes) and acne, and is sometimes recommended by doctors if they think light treatment would help. Finally, a moderate amount of sunlight may prevent cancer. According to a study in Environmental Health Perspectives, people who live in areas with fewer hours of sun and daylight are more likely to have a variety of cancers, including ovarian, pancreatic and colon cancer. However, too much sun can also cause cancer, so it’s important to get the right balance.

 

As too many people have experienced first-hand, staying out in the sun too long without protection can cause sunburn, which is not only painful but also increases your chance of getting skin cancer. Sunburn is caused by the UV light from the sun, which can damage the DNA in cells. As a result, a cell with damaged DNA ‘commits suicide’ (apoptosis). Cancer can occur if cells with damaged DNA do not die as they should, but instead continue to multiply. According to the Skin Cancer Foundation, people who have had five or more sunburns have twice the risk of developing skin cancer. Another thing to be aware of is that you can still get sunburnt in the UK and/or when it’s cloudy.

 

Heat exhaustion and heatstroke are two other serious conditions that can happen when you get too hot, sometimes from being in direct sun but at other times just from being in a hot climate. Heat exhaustion describes the condition in which you become very hot and begin to lose water and/or salt from your body, leading to weakness, dizziness, sickness and various other symptoms.

If heat exhaustion is not treated it can lead to heatstroke, which is when your body can no longer cool itself so your body temperature becomes dangerously high. This puts a strain on multiple organs including the brain, heart and lungs and can be life-threatening. If you have heatstroke, symptoms of heat exhaustion can develop into more serious symptoms like seizures (fits) and loss of consciousness.

If a person displays signs of heat exhaustion, you should try to cool them down by moving them to a shaded or air conditioned area, using a wet flannel to cool their skin and rehydrating them. However, the best advice that can be given is to not get heat exhaustion, heatstroke or sunburnt in the first place.

 

To ensure you are safe in the sun, you should spend time in the shade when the sun is at its strongest (between 11am and 3pm in the UK) and use suncream of at least factor 15. Even if you are wearing water-resistant suncream, it should be reapplied after you’ve been in water or if you’ve been sweating. You should protect your eyes using sunglasses with the CE mark and wear a wide-brimmed hat to shade your face and neck. Children are especially at risk as their skin is more sensitive than adult skin, so they should be encouraged to play in the shade, cover up in loose cotton clothes and wear plenty of suncream. Finally, you should not spend longer in the sun wearing suncream than you would normally spend without it- my mum’s general rule of thumb is to limit it to 20 minutes of direct sunlight at any one time.

 

Whilst the sun does present some dangers, its warmth is also essential for human life to exist at all, and I still encourage you to enjoy it this spring and summer. All I ask is that you do so sensibly, because no sane person enjoys sunburn and heatstroke.

Does the 5-second rule really work?

March 24, 2017 in Everyday Medicine, In the News, The Science Behind...

 

I’m sure you’re familiar with and maybe even ‘daring’ enough to use the 5-second rule, but a news article this week has brought this question to my attention as Professor Anthony Hilton has decreed that it’s indeed true, to a degree. For those of you who actually don’t know what it is, the 5-second rule suggests that if food is dropped on the floor, it can still be eaten if it is picked up within a window of 5 seconds.

 

Notably, since 2003 scientists have been making attempts to prove or disprove this theory, with Jillian Clarke starting the proceedings by showing that food will be contaminated, even with brief exposure, when dropped on a floor inoculated with E. coli. She did, however, also find little evidence that public floors are in fact heavily contaminated. In 2006, another study found that bacteria could survive under dry conditions for over a month and that contamination does increase the longer the food is left on the floor.

Researchers at Rutgers University tested extensively, using different surfaces and foods across a total of 2,560 measurements, and found that wet foods pick up more contaminants than dry ones, and that carpet, surprisingly, transfers fewer bacteria than steel or tile. Lead researcher Professor Schaffner states, “Bacteria can contaminate instantaneously”, and the evidence agrees, but does that answer the question?

As previously mentioned, Anthony Hilton at Aston University led a study in 2014 which found much the same as Schaffner’s, yet he suggests such results support the 5-second claim. Whilst he accepts that bacteria are inevitably picked up and that eating food from the floor is never “entirely risk-free”, he also points out that the research shows food is unlikely to pick up harmful bacteria in the few seconds it spends on the floor. Furthermore, he has said there should be little concern about food that has touched the floor for such a short time. I think this conclusion rings true with more of the general public than Schaffner’s, with 79% of 2,000 people admitting to eating food that had fallen on the floor.

 

My view is that most people do not truly believe zero bacteria are picked up in those precious 5 seconds, but assume that the amount is negligible and neither numerous nor dangerous enough to cause any harm. The science does show that the longer food is on the floor, the more bacteria are picked up, and that in those first 5 seconds any harm from said bacteria is unlikely. Therefore, I would argue that the 5-second rule does work, but really it is up to personal preference and circumstance. But if you’ve dropped a slice of watermelon (roughly 92% water) on a visibly dirty tile, I’d say give it a miss- it’s just common sense…

A Look Back at Anaesthesia

March 18, 2017 in History

 

An anaesthetic is a substance that induces insensitivity to pain, with ‘anaesthesia’ literally meaning ‘without sensation’. Anaesthesia is used every day in modern medicine, during tests and surgical operations, to numb areas of the body or induce sleep. The types you are probably most aware of, local and general, are the two most common, but there are other types such as regional anaesthetics and sedatives.

Anaesthetics may not be perfect, with around 10 deaths for every million anaesthetics given in the UK; however, developments over the years have made anaesthesia very safe, with serious problems being rare. I’m taking a look back in time to see how anaesthesia- general anaesthesia in particular- has changed over the last two centuries.

 

 

Starting in the 1820s, Henry Hickman explored the use of carbon dioxide as an anaesthetic to perform painless surgery on animals. Carbon dioxide could be used to effectively suffocate the animals out of consciousness for long enough to perform the surgery. This is considered to have been a major breakthrough in surgery and anaesthesia; however, carbon dioxide wasn’t widely used due to the risks associated with partial asphyxiation, and Hickman’s work was heavily criticised at the time. In more recent times, carbon dioxide was reportedly used in medicine in the USA during the 1950s, and it is currently used before slaughter in numerous slaughterhouses.

 

Ether had a reputation as a recreational drug during the 1800s, but in 1842 it was used for the first time as an anaesthetic by the American physician William Clarke in a tooth extraction. That same year, Crawford Long used it to remove a tumour from a man’s neck, although he did not publish an account of it until several years later, in which he described that the patient felt nothing throughout the procedure.

William Morton popularised the use of ether when, in 1846, he anaesthetised a patient so that a tumour could be removed from the man’s jaw. News of the operation travelled around the world quickly and ether became a widely adopted method, with Morton often being credited as the pioneer of general anaesthesia. Ether did have its drawbacks though, causing coughing in patients and being highly flammable, meaning research in anaesthesia continued to develop.

 

Humphry Davy discovered that nitrous oxide could dull pain in 1799; however, the potential of nitrous oxide as an anaesthetic wasn’t fully realised until 1844. Horace Wells attended a public demonstration of nitrous oxide by Gardner Colton, and the very next day Wells himself underwent the first ever reported painless tooth extraction while Colton administered nitrous oxide. Nitrous oxide is still used today- you may know it better as ‘laughing gas’- in dentistry and childbirth, and even recreationally as the 4th most used drug in the UK (according to the Global Drug Survey 2015). Of course, nitrous oxide only dulls pain, so it could not and cannot be used in major surgery unless combined with other anaesthetics.

 

You may have heard that Queen Victoria used chloroform during the birth of her 8th child in 1853, but it had actually been around for 6 years beforehand and was met with a lot of opposition before that event. The Scottish obstetrician James Simpson was the first to use chloroform to relieve the pain of childbirth, in 1847, and it widely replaced ether as it was quicker acting, didn’t smell as pungent and had fewer side effects. Nevertheless, it was met with opposition, mainly on the grounds of deaths and religion.

Some religious people believed it was God’s intention for women to feel pain during childbirth, so such pain should be endured, not relieved. At the time religion held a lot of power, and this scared many God-fearing people away from chloroform. Meanwhile, administration required great skill, as surgeons had to be experienced enough to give the right dose; the first reported death from chloroform overdose came in 1848. Chloroform fatalities were widely publicised but were mainly caused by poor administration, which was overcome when John Snow invented the chloroform inhaler, controlling the dosage to make the anaesthetic safer and more effective. When Snow used the inhaler to anaesthetise Queen Victoria, the positive publicity left little opposition remaining. The use of chloroform has since been discontinued, as it was realised that chloroform could cause liver and heart damage.

 

Skipping forward to 1934, when sodium thiopental became the first intravenous anaesthetic (injected into the bloodstream). It was developed by Ernest Volwiler and Donalee Tabern, both working for Abbott Laboratories, and a clinical trial of thiopental was conducted at the Mayo Clinic 3 months later. It rapidly entered common practice as it was fast and short acting (taking effect in about 30 seconds and lasting only 4-7 minutes), and the fact that it was intravenous allowed for more precise dosage. Volwiler and Tabern were inducted into the National Inventors Hall of Fame in 1986, and thiopental is still used before or alongside other anaesthetics, although not alone because its effects do not last long enough to be of practical value. Since the 1980s thiopental has slowly been replaced by propofol, which is also fast and short acting but has antiemetic properties too (it prevents nausea and vomiting).

 

Halogenated inhaled agents are routinely used today, and it could be argued that their emergence transformed anaesthesia as much as chloroform did over 100 years earlier. Halothane was first synthesised by C. Suckling in 1951 and first used clinically by Dr Johnstone in Manchester 5 years later; it is still widely used in developing countries and in veterinary surgery because of its low cost. Following halothane came enflurane (1966), isoflurane (1979), sevoflurane (1990) and finally desflurane (1990s). These halogenated inhaled agents have the beneficial properties of low solubility (meaning they take rapid effect), minimal cardiorespiratory depression and non-flammability. Despite these characteristics, halogenated agents only cause lack of consciousness and do not relieve pain, so they are used in conjunction with other anaesthetics.

 

Curare was traditionally used on poison darts and arrows by aboriginal peoples and, in 1942, became the first non-depolarising muscle relaxant (one that blocks the agonist from binding to its receptors). Neuromuscular blocking drugs like curare can be used to produce paralysis, allowing intubation of the trachea and optimising the surgical field. Endotracheal intubation is important as it is used to administer gases like the halogenated inhaled agents and to aid ventilation of the lungs. Because muscle relaxants essentially paralyse the muscles, the muscles cannot contract, enabling surgery to be performed as safely and effectively as possible. Patients under muscle relaxants can still feel pain at full intensity but cannot move, so analgesics (pain-relieving drugs) are often also given to prevent anaesthesia awareness.

 

As you can see, anaesthetics have changed a lot in the last 200 years, and anaesthesia is now an entire branch of medicine in its own right. Nowadays, a general anaesthetic cannot be given to a patient without a trained anaesthetist, dosage is carefully controlled and a combination of anaesthetics can be used to achieve the ideal effect.

The Science Behind Clinodactyly

March 10, 2017 in The Science Behind...

 

The word ‘clinodactyly’ stems from the Ancient Greek meaning “to bend” and “digit”, and that is basically what this genetic abnormality is. I first became interested in this when I noticed my younger sister’s little finger was bent inwards, although until now I’d always assumed that it was nothing…

 

Clinodactyly is the medical term used to describe when a finger or toe is curved or bent at an angle, usually with an incline between 15° and 30°. The condition affects about 10% of the population and is passed on through inheritance. It may present either as an isolated anomaly or as part of an associated syndrome, for example a significant percentage of individuals with Down syndrome also have clinodactyly. The condition occurs more in boys than girls and is visible as soon as the child is born.

In most cases, clinodactyly is caused by the growth plate in the hand (or foot) having an abnormal shape or orientation, so the bones do not grow at 90° to the finger axis. Treatment is only necessary if the digit is bent enough to cause disability or emotional distress; in most cases a person with clinodactyly can use their hands or feet normally. If surgery is required due to interference with function, the procedure involves making a small incision on the affected finger or toe and cutting the bone to correct the deformity. The finger is then stabilised until the bone and soft tissue have healed. Most of the time surgery is successful, however there is a risk that the digit reverts, resulting in the need for further surgery in the future.

 

So that’s just a short and simple post about a minor abnormality that you’re likely to see everywhere, now that you know what you’re looking for.