Tuesday, 25 April 2017

Talk : Food Allergies: What Are They? Why Do We Have Them?

For April's Cafe Sci, Dr Marcos Alcocer from the School of Biosciences at the University of Nottingham comes to talk about Food Allergies: What Are They? Why Do We Have Them? @Gav Squires was there and has kindly written this guest post summarising the event, with some linkage added by NSB.

With the end of our own vegetable gardens, in our industrialised world, not everything is under our control any more. These days, wherever you go, it almost seems a "fashion" to have a food allergy. These adverse reactions to food can range from dizziness and itching to vomiting and diarrhoea. Non-toxic reactions to food can be enzymatic (e.g. intolerance to lactose or alcohol, galactosemia) or pharmacological (e.g. intolerance to caffeine, acids, tyramine alkaloids, histamine, monosodium glutamate, salicylates or benzoates).


Food hypersensitivity can also be immune related and, unlike something like lactose intolerance, which is merely upsetting and painful, it can kill you. Food allergy is immunity going wrong: the immune system recognises things in food as being toxic. In our gut we have a "sampling device", and everything that we eat is sampled and checked by the immune system. In most cases the immune system doesn't respond, but if it sees a problem then it activates.

The first time that you encounter an allergen, your body will produce antibodies. For example, for peanuts it will recognise one of the proteins in the peanut. Cells will now be loaded with antibodies and, when these recognise future instances of that allergen, histamine is released. Amongst other things, histamine causes dilation of blood vessels (reducing blood pressure) and constriction of the bronchial tubes (which results in difficulty in breathing). In extreme cases this can result in anaphylactic shock.

Structure of an allergy antibody 

It is impossible for your GP to tell you what you are actually allergic to just from your symptoms, as they bear no relation to the food that caused them. The first paper on food allergies was only published relatively recently - we are only just starting to make the connections. So, while food allergies seem to be a modern fad, they have always been around; people just didn't realise what they were.

There were 37 fatalities in the UK between 1992 and 1998:

Peanut - 10
Nuts - 10
Walnut - 5
Uncertain - 4
Seafood - 3
Milk - 2
Chickpea - 1
Nectarine - 1
Banana - 1

We also have the breakdown of where they ate the food that killed them:

Restaurant/Bar - 13
Take-away - 6
Home - 6
Other - 5
Canteen - 3
School - 2
Party food - 2

Roasted Peanuts

Is the prevalence increasing? Yes, it is, but there are problems with the data acquisition. Between 0 and 4 years old there is a massive incidence of food allergy, but most people grow out of it. So, should mothers avoid eating foods that contain allergens while pregnant or breastfeeding? Should parents delay feeding solid food? We don't know. However, there does appear to be one very clear three-month window, between 3 and 6 months of age, in which to make children tolerant.

Atopy, which is inherited hypersensitivity, is increasing. One of the more popular hypotheses about why this is happening is the hygiene hypothesis - the cleaner you are, the more at risk you are. The theory is that too much cleaning can create an immune system that is not used to real life. For example, studies were done in East and West Germany when they were still separate countries. In clean, modern West Germany, atopy rates were 37%, whereas in East Germany they were just 17%.

However, there are other risk factors, such as geography/environment and genetics. For example, Jewish children in the UK were 10 times more likely to suffer from food allergies than those from the same family when they were still in Israel.

There is also a hypothesis that looks at the gut microbiota. The variety of species in the gut is a good indicator of gut health - the more, the better. Allergy appears to interact with the bacteria in the gut, and we can test this hypothesis by transferring faeces from one animal to another.

Gut Bacteria

What makes an allergen is not well defined - why is one protein an allergen and another not? For example, in a Brazil nut there is one protein that causes a reaction, but in a peanut it can be any one of 16.

In the last 10 years, technology advances have allowed us to sequence everything. We can sequence peanut proteins and so we can tell if people will cross-react. Scandinavians and Southern Europeans are both often allergic to apples; however, it is a different protein that causes the reaction in each case. Scandinavians actually get their allergy from birch trees, which contain a protein that is also found in apples. The problem with sampling is the time that it takes: a human has around 100,000 proteins, and wheat is eight times more complex.

Despite all of the research, we don't have a cure yet. The only treatment is an adrenaline shot, which will keep you alive long enough to get to hospital.

Café Sci returns to The Vat & Fiddle on the 8th of May at 8:00 where Sara Goodacre will talk on Arachnoglobia: Long Journeys by 8-Legged Travellers & Other Stories. For more information, check out the Café Sci MeetUp page - https://www.meetup.com/nottingham-culture-cafe-sci/

https://en.wikipedia.org/wiki/File:Histamine_3D_ball.png https://en.wikipedia.org/wiki/File:Antibody_IgG2.png https://en.wikipedia.org/wiki/File:EscherichiaColi_NIAID.jpg

Sunday, 2 April 2017

Talk - Bits And Bytes - When Horses Meet Computers

For March's University of Nottingham Public Lecture Series talk, Dr Mandy Roshier, from the School of Veterinary Medicine and Science, and Dr Steve North, from the School of Computer Science, join forces to talk about Bits And Bytes - When Horses Meet Computers. @Gav Squires was there and has kindly written this guest post summarising the event, with some linkage added by NSB.

Mandy and Steve are looking at how horses feel and what they want, using computers to measure their behaviour in an objective manner so that we can improve our understanding and the horses' welfare.

The research has focused on animal-computer interaction. Animals have interacted with our technology throughout human history and they now interact with our computer-based systems, whether they know it or not. The lack of an animal perspective on system design can have a negative effect on both animal users and the purpose for which the technology was developed. Animal-computer interaction is a recent field of study and is still very small.

Prehistoric and more recent attempts at depicting horse motion

However, horses have been interacting with our technology for a long time. Before 4000BCE, horses were only used as food by humans. Then, between 4000-3000BCE, humans started to use horses for traction and transport. While we still ate horses during this period, this was the beginning of horses having to interact with our technology. These were "hard" technologies: the saddle, bridle, reins, bit, halter, whip, working collar, harness, chariot, cart and plough. Then there was the most significant development - the "hot shod" horse shoe. As the horse enabled humans to travel great distances, trade, carry cargo and share culture and language, it could be argued that the horse was a primary driver of human technological development.

Horse-computer interaction can include environmental control, physical training, health and sport performance analysis, and more general health monitoring such as pregnancy monitors and webcams. Some of these techniques can be invasive. However, touch screen computers are changing the way that scientists carry out equine research, as they can take out the risk of human influence in equine decision making and actions. There is also the Aktiv stable, which controls a horse's environment: microchips attached to the horse control feeding, watering and social interaction between horses. There are benefits to the horse in that it allows a more natural lifestyle, but it reduces human interaction.

How do horses respond to their environment? How does this differ between sports horses, horses kept in regular stables and feral horses? (There is no such thing as a wild horse, only feral ones.) Behaviour is a part of welfare, although it is one that vets can often overlook in favour of the physiological side of animal care. In the 70s, the Farm Animal Welfare Council came up with their five freedoms, which included a reference to behaviour:

* Freedom from hunger and thirst
* Freedom from discomfort
* Freedom from pain, injury and disease
* Freedom to express normal behaviour
* Freedom from fear and distress

Animal research can include psychology (the study of the mind), ethology (the study of behaviour) and physiology (for example, measuring something like cortisol levels). The physiology and the ethology need to be considered together so that you can tell whether a rise in cortisol levels is due to increased stress or increased excitement, for example. Within these three areas, you can record observable actions and interpret them. Computers help here as they can cope with large volumes of data. This has led to research such as EquiFACS, which looks at equine anatomy by recording minute movements of muscle. This work initially started by looking at humans but has moved on to other animals. Similarly, the Horse Grimace Scale was born from research that started with rodents but now looks at a way of measuring how much pain a horse is in. Meanwhile, equine motion has fascinated us for years, from the earliest cave paintings to the present day.

Previously, there have only been cumbersome ways of analysing horse motion

Following behaviour is when horses follow each other in a group. There are a number of instances of this:

* Following as reproductive behaviour between a mare and a foal, parenting, early development
* Following as intermale interaction
* Trek as maintenance behaviour - locomotion
* Parallel prance as intermale behaviour
* Chase as either intermale behaviour or as play
* Fleeing in response to a threat or unfamiliar stimulus - stampede

Why is this of interest? If we can learn the intricacies of horse-horse following and horse-human following, it could improve the training of horses. How is movement initiated when horse follows horse? When horses follow and mirror the speed and direction of humans, does the horse think that the human is another horse? Is this imprinting or is it learned?

Mandy and Steve are working on HABIT - the Horse Automated Behaviour Identification Tool. This is considering the use of technology to automatically recognise horse behaviours. The goal is to automatically produce YouTube quality videos of automatic analysis of horse-horse and horse-human behaviours. From this it should be possible to assess equine behaviour from a welfare perspective and answer the questions, "Are horses interacting with humans as if they were other horses?" and "Are horses in training behaving 'normally'?" From here the programme aims to inform, increase people's knowledge base and aid training. There is a focus on low stress handling, welfare and safety, and a consideration of species interactions.

There are five bedrocks of HABIT

1. Know your species. Where did it come from originally? How did it evolve? How was it domesticated? What is its social life like?
2. Sensory capabilities. How do they view the world? Humans and horses see colour differently because humans are tri-chromatic while horses are only di-chromatic. Horses also have very mobile ears
3. Communication. Reading body language - what do these expressions mean?
4. Our verbal and body language. What's in a name? Our words can make a difference in how others interact with a horse
5. Consider the individual, both nature and nurture

There are many breeds of horse

Using computer vision and machine learning, the system performs video based behaviour identification. It features helper apps that provide a smooth workflow and process longer videos into shorter clips ready for analysis by the main system. Computerised tracking and reliability are difficult in the field, as is automatically identifying different horses.

Computer vision can be inflexible. Real time processing can also cause some issues compared to retrospective processing: some simple behaviours can be processed in real time, but more complex behaviours need to be processed retrospectively. Longer videos are processed into shorter ones by first identifying segments that contain horses. The video is then summarised in a similar way to instant highlights in sport. The hardest thing is to identify those sections that contain horse action, and this happens frame by frame using Haar cascades.

Haar cascades were introduced by Viola and Jones in 2001 and are named after Alfréd Haar for his early-20th-century work on wavelets. It's a machine learning approach in which software is trained on many positive and negative images and then used to detect objects in other images. It combines increasingly complex classifiers into a cascade. It's used for face detection, fingerprint recognition and number plate detection on motorways.

Haar-like features used in image processing

To train the system, the computer moves each member of a set of graphical shapes, called features, across the test image at different scales and orientations. Each of these features consists of contrasting regions of black and white rectangles. At each position, the code checks if the image contains a similar range of contrast to the current feature. The best features in a specific position, scale, etc are retained as useful classifiers. The best classifiers are separated into ever more complex stages; these sequential stages are the cascade. A test image can be rejected at an early stage, using a small number of classifiers, thus saving computing power.
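As an illustration of the early-rejection idea, here is a toy sketch in Python. The stage thresholds and window scores are invented for illustration; a real detector evaluates trained Haar-feature classifiers at each stage rather than a single score.

```python
# Toy illustration of the cascade idea behind Viola-Jones detection:
# each stage is a cheap test, and a window is rejected as soon as any
# stage fails, so most non-matching windows cost only a stage or two.

def make_stage(threshold):
    def stage(window_score):
        return window_score >= threshold
    return stage

# Increasingly strict (and, in a real detector, increasingly costly) stages.
cascade = [make_stage(t) for t in (0.1, 0.3, 0.6, 0.9)]

def detect(window_score):
    """Return True only if the window passes every stage of the cascade."""
    for stage in cascade:
        if not stage(window_score):
            return False  # early rejection saves evaluating later stages
    return True

print(detect(0.95))  # a strong candidate passes all stages
print(detect(0.2))   # a weak candidate is rejected at the second stage
```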

For an efficient detector for HABIT, there needs to be unique features of horses that are readily identifiable and visible from multiple viewpoints. The horse "ear detector" has proven to be quite efficient and when this is combined with detectors for "legs" and "side view", reliability increases.

The HABIT Ear Detector, detecting ears.

Next, we need to analyse the clips for behaviours. In the field of computer vision, this is sometimes called action spotting. Firstly, you train a classifier to decide if a behaviour is present in the test video clip. Then you repeat for all horse behaviours, of which there are lots. Then you can test a video clip with all of the classifiers and report one or more behaviours identified. To train the classifier, you have to extract key frames to build up a bag of visual words. From here histograms are built that enable you to identify examples of the behaviour that you're looking for.
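A minimal sketch of that histogram step, with an invented two-dimensional vocabulary standing in for real visual words (in practice the words come from clustering many image descriptors):

```python
# Bag-of-visual-words sketch: each frame descriptor is assigned to its
# nearest "visual word", and a clip is summarised as a histogram of
# word counts. Vocabulary and descriptors here are toy values.

def nearest_word(descriptor, vocabulary):
    """Index of the vocabulary entry closest to the descriptor."""
    return min(range(len(vocabulary)),
               key=lambda i: sum((d - v) ** 2
                                 for d, v in zip(descriptor, vocabulary[i])))

def clip_histogram(descriptors, vocabulary):
    """Count how often each visual word appears in a clip."""
    counts = [0] * len(vocabulary)
    for d in descriptors:
        counts[nearest_word(d, vocabulary)] += 1
    return counts

# Toy 2-D vocabulary of three visual words and one clip's descriptors.
vocab = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
clip = [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.0, 0.2)]
print(clip_histogram(clip, vocab))  # → [2, 1, 1]
```

A classifier for a behaviour is then trained on these histograms rather than on raw frames.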

The next stages of the research are continuing the development of the HABIT system, especially the behaviour identification module. Then work needs to be done on collecting a video of specific behaviours for training and testing. From there, it might be possible to start applying HABIT-like approaches to behaviour identification in other species.

The Public Lecture Series returns on the 20th of April at 6pm where Professor Philip Moriarty will talk about When the Quantum Uncertainty Principle Goes Up to 11. For more information, visit the Public Lecture Series website: http://www.nottingham.ac.uk/physics/outreach/science-public-lectures.aspx

Image Sources
All via Gav Squires

Talk : Visual illusions reveal that the world is different from what we think

Following on from last year's success, the Nottinghamshire branch of the British Science Association again put on a series of talks at Science in the Park 2017, held at Wollaton Hall. Professor Peter Mitchell from the University of Nottingham gives a talk on "Our eyes deceive us: visual illusions reveal that the world is different from what we think". @Gav Squires was there and has kindly written this guest post summarising the event, with some linkage added by NSB.

How do we mis-perceive the world? The way that we see the world is inaccurate, and this is demonstrated in the drawings that we produce of it. However, some people do see the world in a more accurate way, and they are able to better represent what's actually there.

Drawing has always been a core human activity. Even as far back as 30,000 years ago, humans were producing art on the walls of caves. Their knowledge of the world influenced what they drew. By the Renaissance, artists had learned the laws of perspective by creating an invisible eye line. This enabled the three dimensional world to be represented on a two dimensional surface. The way that we draw the world is influenced by the way that we see the world.

Cave Art

When drawing something that we've seen in a photograph, we are guilty of "boundary extension" - putting in details that are outside the frame of the photo. We extrapolate from what we've actually seen. Is this because we want to draw complete objects? No: even when there are no cropped items in the photograph, we will still extend the boundary. This is because of our inherent knowledge of the world - we know that there is more than just what is in the photograph.

Despite the invention of perspective, we have difficulty depicting three dimensions. For example, if we are drawing an object from a photograph and we know it has 90 degree angles, then we will depict them as such even if they don't appear to be 90 degrees in the photograph. Our knowledge is contaminating our perception. Even someone like Raphael was guilty of this. We default to our knowledge.

What about the drawings of children? These tend to focus on what children find important so when they are drawing a car, they will make a big deal of the boot as that's where they put their toys on journeys, for example. However, there is an example of a schoolchild of 11 drawing the reading room at the British Library. He only looked at the scene for ten minutes and took neither notes nor sketches but drew the whole thing from memory. He had autism, which explains why he could see the world more accurately and with more objectivity. He even drew the inside of the dome with correct perspective, which is incredibly difficult to do.

Library Drawing

If we look at visual illusions, they reveal how we mis-perceive the world. The devil's triangle appears to be an impossible shape because of the way that we perceive 3D cues. The Shepard's table illusion shows two tables that appear to be different sizes but are, in fact, the same. People with autism are fooled less by this illusion than people without.

Visual Illusions

Shepard Illusion

A lot of time and effort has gone into finding out what is "wrong" with autistic people and trying to "fix" them. However, they also have strengths. We should be looking to build on them and help them achieve their potential.

Image Sources
All via Gav Squires

Talk - Photobiology - Effects of UV Radiation on Normal Skin

Graham Harrison formerly of Photobiology Dept at St John's Institute of Dermatology, King's College London and now of the University of Nottingham comes to Café Sci to talk about Photobiology - Effects of UV Radiation on Normal Skin. @Gav Squires was there and has kindly written this guest post summarising the event, with some linkage added by NSB.

Visible light has a wavelength between 400 and 700 nanometers. Around 1000 nanometers, you're into the infrared while down at 100 nanometers, you're into the ultraviolet. The shorter the wavelength, the more energy it contains.
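That relationship is just the photon energy formula E = hc/λ; a quick sketch using standard physical constants and illustrative wavelengths:

```python
# Photon energy E = h*c / wavelength: shorter wavelength means more energy.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon, in electronvolts."""
    joules = H * C / (wavelength_nm * 1e-9)
    return joules / 1.602e-19  # convert J to eV

for label, nm in [("UVC", 250), ("UVB", 300), ("UVA", 365), ("visible", 550)]:
    print(f"{label:8s} {nm:4d} nm -> {photon_energy_ev(nm):.2f} eV")
```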

There are three types of ultraviolet radiation - UVA (longer wavelength), UVB and UVC (shorter wavelength). All of the UVC in sunlight is blocked by the atmosphere. Only 5% of the sunlight that reaches the earth is ultraviolet, and only 5% of that is UVB. UVA penetrates much deeper into the skin, but UVB is responsible for 80% of sunburn.

In fact, UVB is at least 1,000 times more powerful than UVA when it comes to causing sunburn. UVB is also responsible for the production of vitamin D, and as the ozone layer thins, more of it comes through the atmosphere.

We can measure the UV radiation using a radiometer. A broad band one is handheld while a more accurate spectroradiometer costs around £30,000. On the other hand, you can use a biological method and examine skin for levels of sunburn. This is done by examining the Minimum Erythema Dose (MED) - the point at which the skin starts to burn. This MED will change depending on the skin type. There are six skin types in all, ranging from Type-I (white), which has a high risk of sunburn and cancer to Type-VI (black), which has a low risk of both.

Skin Types and their reaction to UV

In all of the interactions between UV and skin, photochemistry precedes photobiology. The sunlight is absorbed by a molecule and its energy changes the molecule. This leads to a multitude of effects, from tanning to sunburn to cell death to vitamin D photosynthesis. When the UV reaches the DNA, it causes photodamage; it binds to the DNA and can cause a mutation. It's possible to stain for the antibodies that are evidence of this damage. There was a time when you would have to do a biopsy to look at the scale of the damage, but now it's possible to measure the excretion products in urine.

Photoageing is caused when the tissue is damaged by sun exposure. It's actually damage to the collagen in the skin and is called solar elastosis; it is thought to be a UVA effect. DNA damage is also responsible for tanning, which happens when the pigment producing cells (melanocytes) in the skin are activated. Of course, the more serious outcome is skin cancer. The UV goes into the skin and causes a mutation where the DNA is repaired erroneously. The P53 gene usually stops tumours, but if this is mutated you can get abnormal cell growth (dysplasia), then immunosuppression, and this can lead to cancer. However, there are a number of factors that can play a part in a cancer forming, including physical environment, behavioural causes, non-behavioural causes and any prevention measures taken. With skin cancer, melanomas are only around 10% of the total skin cancers but they are the ones that kill you.

The P53 Protein

As well as humans, dolphins can get sunburnt. UV radiation can also damage your eyes; it leads to cataracts. Glass protects against UVB, so glasses wearers are partly protected, but you can still get a tan standing in a greenhouse. Plastic, meanwhile, will block all UV radiation. Sand, on the other hand, reflects UV, which is why you can get sunburnt particularly badly at the beach.

UV radiation concentration is greatest at noon as the sunlight has to get through less of the atmosphere. There is also much more UV radiation in the summer. Although you do need to be careful because even on a cloudy day, there is UV damage happening to the skin. Most indoor workers get 50% of their annual UV exposure over a span of just 33 days. This usually includes their summer holiday.

Sunscreens are specifically designed to stop sunburn rather than any of the other effects of UV such as ageing. Hence they are only interested in stopping the UVB. However they are generally not used as they should be. The recommended thickness is 2mg/cm2 of skin. This would require 32g of sunscreen to cover a woman's body and 38g to cover a man's. Since you are supposed to re-apply every three hours, your bottle isn't going to last very long and it's going to get very expensive very quickly.
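Those figures follow directly from the recommended dose, assuming typical adult body surface areas of roughly 1.6 m² for women and 1.9 m² for men:

```python
# Rough check of the sunscreen quantities quoted above.
DOSE_MG_PER_CM2 = 2  # recommended application thickness

def full_body_dose_g(surface_m2):
    """Grams of sunscreen needed to cover the given body surface area."""
    cm2 = surface_m2 * 10_000  # 1 m^2 = 10,000 cm^2
    return DOSE_MG_PER_CM2 * cm2 / 1000  # mg -> g

print(full_body_dose_g(1.6))  # ~32 g
print(full_body_dose_g(1.9))  # ~38 g
```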

But what does the Sun Protection Factor (SPF) on a sunscreen actually mean? Well, if you could usually spend 20 minutes in the sun before burning, then SPF6 would allow you to spend 6 times as long, 120 minutes, in the sun before you burnt. However, these tests are based on thick applications, which isn't how people use it. You will still get some benefit from it, though: even SPF2 blocks 50% of the burning UV. SPF4 blocks 75% and SPF50 blocks 98%, which is why SPFs higher than 50 add very little. Sunscreen is also tested with a very artificial sunlight - equivalent to sunlight at the top of a mountain at the equator.
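The blocking percentages come from the fact that an SPF-n sunscreen lets through roughly 1/n of the burning UV:

```python
# Fraction of burning UV blocked by a sunscreen of a given SPF,
# using the rule of thumb that SPF n transmits about 1/n of it.
def fraction_blocked(spf):
    return 1 - 1 / spf

for spf in (2, 4, 50):
    print(f"SPF{spf}: blocks {fraction_blocked(spf):.0%}")
```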

Titanium Dioxide, a popular sunscreen ingredient

But the sun isn't all bad. As well as vitamin D creation, there is a feelgood factor from UV radiation.

Café Sci returns to The Vat & Fiddle on April the 10th at 8pm when Dr Marcos Alcocer will talk on Food Allergies - What Are They And Why Do We Have Them? For more information, visit the Café Sci MeetUp page: https://www.meetup.com/nottingham-culture-cafe-sci/

Image Sources
P53, Tiox, Skin Types via Gav Squires.

Tuesday, 28 March 2017

Talk - Gravitational Waves and Black Holes

Dr Thomas Sotiriou from the University of Nottingham recently gave a Café Sci (or Café Scientifique et Cultural to give it its full name) talk on Gravitational Waves and Black Holes - Einstein's Amazing Legacy. @Gav Squires was there and has kindly written this guest post summarising the event, with some linkage added by NSB.

Dr Sotiriou began by describing how scientific theories are replaced with better ones, starting with Newton's law of universal gravitation, which describes the gravitational forces between two bodies in terms of their masses and the distance between them - multiplied by a factor called the Gravitational Constant.
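In symbols, the law described above is (with G the gravitational constant):

```latex
% Newton's law of universal gravitation: the attractive force F between
% two bodies of masses m_1 and m_2 a distance r apart.
F = G \frac{m_1 m_2}{r^2},
\qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}}
```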

Newton's law of universal gravitation

Gravity doesn't just attract things, it determines how objects move in space. In Newton's time his theory unified how we understood gravity both on the scale of the solar system and also how it works on Earth. It was a theory that proved useful for 200 years. Then, in the 19th century, it was observed that the planet Mercury didn't really obey this theory. Its orbit was very slightly different to what was predicted but this didn't really concern anybody. At this time the outermost planet that had been discovered was Uranus and its orbit didn't match Newton's theory either. The observed orbit hinted that there was another planet that was affecting it - this is how Neptune was discovered. It was then thought that something similar must be happening to Mercury and so an innermost planet called Vulcan was predicted. In reality, it was Newton's theory that was incorrect.

Dr Thomas Sotiriou, with visual aid

Einstein was very interested in light (his Nobel prize was for the discovery of the photoelectric effect). At the start of the 20th century it was known that observers moving at different speeds who are measuring the speed of light get the same value. This is counter-intuitive - if you're running straight at the light then surely you'd get a different speed. Einstein knew that the only way to explain this was that people moving at different speeds must have a different view of time and distance. This was the basis for his theory of special relativity. Einstein realised that space and time were not independent; this is where his idea of spacetime came from. To pin down an event you need to know where and when it happened. This revolutionised the way that people thought about physics. The theory of electromagnetism worked really well with special relativity but Newton's theory of gravity did not.

Special relativity only relates to observers moving at constant speed. Einstein knew that to say something about observers that were accelerating, he would have to say something about gravity. For 10 years, he tried to formulate a theory that included both accelerating observers and gravity. The result was his theory of general relativity, which explains how matter curves spacetime. If we know what the matter distribution is then we know how spacetime will curve, and this curvature tells matter how to move. This theory accounted for the deviations in Mercury's orbit.

This didn't fully convince scientists - they wanted a prediction of something unknown that existed only in the theory. Einstein predicted the bending of light rays and, a few years after the theory was published, Eddington confirmed it during an eclipse. This was not the only ground-breaking prediction. Special relativity told us that nothing could travel faster than light, and general relativity showed that light is affected by gravity. If light feels this pull and has finite speed, then if there were something of huge mass in a very small space, not even light could escape its pull. This is what we call a black hole, and it was predicted by general relativity. People didn't believe it for a long time.

The day light was shown to be affected by gravity

What happens when objects move around in spacetime? When a boat moves in water it causes ripples - the same thing happens in spacetime, and this is where we get gravitational waves. These waves carry away energy, and this loss of energy causes the objects to get closer together. This is happening with the Earth and the sun, but it would take billions of years for the Earth to plunge into the sun. These emissions of energy are very small, but when you come to black holes the gravitational waves are much larger. It took four decades to develop the technology required to detect gravitational waves. The LIGO detector discovered the gravitational waves caused by the collision of two black holes. The power emitted from the collision was more than the light from all of the stars in the universe at that moment. Even so, the movement that LIGO detected was the size of an atom over 4km.

The LIGO Black Hole collision

Unlike Newton's theory, general relativity has nothing to do with mass or forces - this is why it works with photons. We know that energy and mass are related (E=mc^2), but we don't need mass to have energy: photons have kinetic energy. It is actually energy that causes the curvature of spacetime. The famous equation E=mc^2 actually only applies to mass at rest.

General relativity is better than Newton's theory, but could we eventually have an even better one? Are dark energy and dark matter the equivalent of the precession of Mercury for general relativity? General relativity isn't a quantum theory, so it's possible that at some point we will get a new theory of gravity or maybe even a new theory of matter.

Café Sci returns to The Vat & Fiddle on the 13th of March at 8pm where Graham Harrison from the University of Nottingham will talk on Photobiology - Effects Of UV Radiation On Normal Skin. For more information check out the MeetUp site: https://www.meetup.com/nottingham-culture-cafe-sci/

Sunday, 12 March 2017

Talk : Things That Go Bang In The Night (Sky)

The University of Nottingham Science Public Lecture Series had their February talk presented by Julian Onions on the subject of Things That Go Bang In The Night (Sky). @Gav Squires was there and has kindly written this guest post summarising the event, with a few additions from NSB who was also at the event.

Julian Onions

Of everything in the universe, what would go off with the biggest bang? At these scales, timescales are in the region of billions of years - astronomy is a slow science. And it's big: we talk about things in terms of solar masses, where one solar mass is approximately 2x10^30kg.

First, a bit of stellar theory. A big ball of gas collapses down, the pressure makes it hotter and hotter, and in the centre nuclear fusion starts. So there is gravity pushing inwards and energy pushing out, and at some point these two forces reach equilibrium. The sun is around 15 million kelvin at the centre. A red dwarf is around half a solar mass and is like a boiling pot: it lasts for a long time and is very efficient. The sun is a boiling mass at the edge, but from around a third of the way out from the centre it is being held up by radiative pressure (light). Giant stars, from around five solar masses, are held up by light alone.

When the sun runs out of fuel, the outward pressure will stop and gravity will win. It will contract down to around the size of the Earth and will become a white dwarf. Then it will glow for hundreds of billions of years. In fact, no white dwarf formed since the beginning of the universe has gone out yet.

Novae are "new stars" - things that suddenly brighten in the night sky. A white dwarf starts to steal material from a binary companion, building up a hydrogen shell. This warms the star up and can cause nuclear reactions to start again. It burns very fiercely, almost instantaneously, and we get a burst of light. The brightening lasts for between 25 and 80 days and then it dims. This can happen several times.

A white dwarf is around the size of the Earth and is less than 1.5 solar masses. A neutron star is between 1.5 and 5 solar masses and is around 20km wide. A black hole is larger than 5 solar masses and has an event horizon that is around 30km across. Is there anything between a neutron star and a black hole? In a white dwarf, matter goes into a fiery, squashed state called degenerate matter, which is very odd stuff: if you add more matter, it gets smaller. In a regular star, matter is around 0.1kg/cm3. Degenerate matter is 10,000kg/cm3 while neutron star matter is 10^14 kg/cm3.
Masses of different star types compared

At this point it's time to introduce a new unit of measure, the foe. It's a unit of supernova power. One foe is equivalent to 10^44 Joules, or around the same amount of energy as the mass-energy of 186 Earths. To give that a little context in terms of "bang", the biggest hydrogen bomb we've ever developed released around 10^17 Joules.
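As a quick sanity check on those numbers, here are a few lines of Python (the constants are rounded, so treat the results as order-of-magnitude):

```python
# 1 foe = 10^44 joules; check the "186 Earths" figure via E = mc^2
FOE_J = 1e44
C = 3e8                  # speed of light in m/s (rounded)
EARTH_MASS_KG = 5.97e24  # mass of the Earth

earth_energy_j = EARTH_MASS_KG * C**2  # mass-energy of one Earth
earths_per_foe = FOE_J / earth_energy_j
print(f"1 foe ~ mass-energy of {earths_per_foe:.0f} Earths")

# The biggest hydrogen bomb (~10^17 J) as a fraction of a foe
print(f"Biggest H-bomb ~ {1e17 / FOE_J:.0e} foe")
```

which comes out at around 186 Earths, matching the figure quoted in the talk.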

A kilonova is around 10 foe. It happens when 2 neutron stars orbit each other, losing energy through gravitational waves. There is a huge explosion when the two stars become one, and this is one of the major ways that heavier elements in the universe are created. Kilonovae create short gamma ray bursts but, because neutron stars are difficult to see, they are hard to track down.

There are several types of supernova - 1a/1ax/1b/1c and 2l/2p/2n/2b. Zwicky originally had other types but these have since been subsumed into the ones above. Type 1a supernovae detonate, while 1b, 1c and all type 2 suffer from core collapse.

Comparing supernova types 

Type 1a is around 1 foe in energy and is very similar to the nova. However, the white dwarf is a bit more advanced. It still steals material from a companion but, rather than just the outer layers burning off, the whole thing lights up. The temperature gets to around 100,000,000 Kelvin and then it explodes. This happens at the same point in every white dwarf - when it reaches around 1.5 solar masses. These are a favourite of astronomers as they almost always give off the same amount of light, so it is easy to measure distance. They act almost like a "standard candle" for measuring the universe.
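To see why a standard candle is so useful, here is a minimal sketch using the standard distance modulus relation. The peak absolute magnitude used below (-19.3) is a commonly quoted value for Type 1a supernovae, not a figure from the talk:

```python
# Distance from apparent brightness, assuming every Type 1a peaks at
# (roughly) the same absolute magnitude M.
# Distance modulus: m - M = 5*log10(d_parsecs) - 5
M_TYPE_1A = -19.3  # assumed peak absolute magnitude

def distance_parsecs(apparent_mag):
    return 10 ** ((apparent_mag - M_TYPE_1A + 5) / 5)

# A Type 1a observed peaking at apparent magnitude +5:
d = distance_parsecs(5.0)
print(f"~{d:.2e} parsecs")
```

Because the intrinsic brightness is (nearly) fixed, measuring the apparent brightness alone gives the distance - that is what makes these supernovae so valuable.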

SN1987a - a recent Supernova

Type 1ax was only discovered in 2013. It's where a white dwarf that has lost nearly all of its outer layer of hydrogen and helium goes supernova. Energy wise it's at most half a foe, and probably around a third of supernovae are of this type.

In types 1b, 1c and 2 the hydrogen in the centre of the star is burnt off. Then the star starts burning the less efficient helium. To give some context, the sun will burn for 10 billion years using its hydrogen but only for an extra billion by burning its helium. By the time it reaches silicon, the star is getting desperate, and when it reaches iron it is using more energy to burn it than is being given off. With the power off, there is nothing to counteract the force of gravity. The star contracts at a third of the speed of light. The whole thing then stops, shudders and explodes, but no-one knows exactly why. One theory is that the collapse creates neutrinos - most of the energy comes out in neutrinos rather than light. The centre that is left is now a neutron star or a black hole.

"Onion Burning"

In a type 1b, the star loses its outer layer of hydrogen so its surface is just helium. Type 1c loses both its hydrogen and its helium, so it has carbon and oxygen at its outer layers. Type 2l is between 5 and 100 foe but you don't see the actual explosion as it doesn't give out light; there is then a peak of luminosity, which slowly fades. Type 2p has a peak of luminosity and then a plateau before the fade. Types 2n and 2b are all pretty much the same.

A hypernova is a much bigger star that explodes, and these generate long gamma ray bursts. Again, they are caused by core collapse. For stars between 8 and 10 solar masses, electron capture takes away the power that was supporting the star. The temperature gets up to around 10^10 Kelvin and then the whole thing catches fire. Between 10 and 140 solar masses, the star suffers from iron core collapse. Between 140 and 250 solar masses, the star suffers from pair instability: very high energy gamma rays are produced and the energy coming out goes into creating matter rather than supporting the star. These are very rare. Over 250 solar masses you get photodisintegration - the star turns in on itself and the iron is turned into helium. This then turns into a black hole. The size of a star and the amount of heavy elements in it determine its fate.

What of other "bangs" in the night sky? The recent detection of gravitational waves was caused by two black holes colliding. Three solar masses' worth of energy was given off, around 5300 foe. So, what if two super massive black holes collided? These are around 1,000,000 solar masses each, and a collision could happen when two galaxies merge. This is a very rare occurrence and it would also be quite a drawn out affair - the two super massive black holes would orbit each other for a billion years. Super massive black holes give off 10^9 foe of energy anyway, but this is emitted steadily over millions of years.
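As a rough check on that figure, here is the conversion of three solar masses of radiated energy into foe (rounded constants, so expect something in the 5,000-5,500 foe range):

```python
# Energy radiated by the black hole merger, E = mc^2, expressed in foe
SOLAR_MASS_KG = 1.989e30
C = 2.998e8   # speed of light, m/s
FOE_J = 1e44  # 1 foe = 10^44 J

energy_j = 3 * SOLAR_MASS_KG * C**2
print(f"~{energy_j / FOE_J:.0f} foe")
```

which is consistent with the ~5300 foe quoted in the talk.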

Then of course there is the big one, literally. The big bang gave off 10^25 foe of energy. It took around 20 minutes and then the universe went into a decline for the next 300,000 years. That's a one-off though, and the likelihood of a local black hole collision is very low. So, the supernova is the winner. If a supernova goes off in our galaxy it will probably be visible during the day, and Betelgeuse is a candidate to go off in the not too distant future.

So, which big bang are we most likely to see....

The Public Lecture series returns on the 16th of March where Dr Mandy Roshier and Dr Steve North will be talking about Bits & Bytes - When Horses Meet Computers. For more information visit the UoN website: https://www.nottingham.ac.uk/physics/outreach/science-public-lectures.aspx

Image sources
All courtesy of Gav Squires from the talk

Talk : Thinking outside the (pill) box: alternative drug delivery strategies

The University of Nottingham Science Public Lecture Series started 2017 with a talk by Claire Sycamore entitled "Thinking Outside the (Pill) Box - Alternative Drug Delivery Strategies". Claire is a PhD student in Prof Neil Thomas's research group in the UoN Faculty of Science. @Gav Squires was there and has kindly written this guest post summarising the event, with a few additions from NSB who was also at the event.

Claire Sycamore

Local vs Systemic delivery
Pharmacology is the superhero of our time, different from other treatments such as surgery or radiotherapy. It first came to prominence in the 1930s following the discovery of penicillin in 1928. These days a person will take, on average, 14,000 prescription pills and 40,000 non-prescription pills over their lifetime. The three most popular non-prescription drugs are all non-targeted and you can actually take quite large maximum doses in a day:

Paracetamol - 4.0g
Ibuprofen - 1.2g
Aspirin - 3.6g

Drug delivery systems are all about the interaction at the point that the drug is taken. By working on these systems you can improve the efficacy and the safety of the drugs and control the rate and location of the drug being released. A drug delivery system is something that is given at the same time as the drug.

Ibuprofen has a pKa of 4.4, is not very well absorbed and can lead to stomach ulcers. It acts on a fatty hormone called prostaglandin H2 and has two forms, "R" and "S". Only the "S" form actually works as a painkiller (although R can be converted into S in your body over time).

Ideally, we would have something that works locally, not just systemically. For example, the anti-fungal drug Terbinafine can be applied as a cream to the affected area or taken as a tablet. When you take a tablet, the whole body is flooded with the drug. This can lead to strong side effects such as problems with the kidney and the liver.

Common painkillers and their max allowed dosages

So, we need to look at routes of delivery - how the drug gets into the body: orally, by inhalation, by injection and so on. One of the latest inventions is the microneedle (see also here). Needles in general are a great way of getting a drug into the body quickly. They are easy to use and cheap to produce. However, not everyone likes needles and there can be issues with training people to use them properly, for example with diabetes patients. Microneedles avoid pain entirely - you don't actually feel them piercing the skin. There's less to be fearful of, no training is required and they give precise localisation. They can even be used to deliver drugs straight into the eye. The only real issue with microneedles is that they can only be used for injectable drugs.

For drugs that can't easily be delivered by microneedles, a key area of research is delivery vehicles - getting the drugs to the places that they need to go. Nanotechnology and nanoparticles are the big thing here, allowing controlled targeting and greatly reducing side effects.

But why nanoparticles?

Due to their size, nanoparticles have a greater surface-to-mass ratio. They also have some quantum properties, in that they act more like a wave in some respects. They also have the ability to absorb and carry other compounds. But can we assume that something that works at the "bulk" scale will be just as effective at the nano level?
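The size argument can be made concrete with a quick sketch: for a sphere, surface area divided by volume is 3/r, so shrinking a particle a million-fold increases the ratio a million-fold:

```python
import math

# Surface-area-to-volume ratio of a sphere:
# (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r
def surface_to_volume(radius_m):
    return (4 * math.pi * radius_m**2) / ((4 / 3) * math.pi * radius_m**3)

ratio_mm = surface_to_volume(1e-3)  # a 1mm particle
ratio_nm = surface_to_volume(1e-9)  # a 1nm particle
print(f"Shrinking 1mm -> 1nm multiplies the ratio by {ratio_nm / ratio_mm:.0e}")
```

More surface per unit mass means more of the particle is available to interact with, or carry, other compounds.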

Microneedles (Copyright: Ryan Donnelly, Queen's University Belfast)

Getting drugs to the target areas is particularly important in cancer treatment, where the drugs are designed to kill cells and have harsh side effects. These side effects are one reason that an estimated 50% of cancer patients do not comply with their medication pathways. If we can target just the tumour then we can reduce these side effects and make treatment better for patients.

This can be done by using something called a pro-drug. These are drugs which are inactive when administered and are converted within the body, often by an enzyme, into a therapeutic drug.

Prodrugs have been tested on mice where the enzyme is added to a clostridia bacteria and then spores (dormant forms of the bacteria) are taken. These spores are injected into a mouse and then allowed to grow for a couple of weeks. Critically, clostridia bacteria (and the enzymes they carry) will only grow in a low oxygen environment - like a tumour. Then, when the pro-drug is injected it will only activate in the tumour because that is the only place where the bacteria (and hence enzymes) are. You can read more on this research here and here.

Polymer delivery systems
Another problem is the rise in antibiotic resistance. For example, an American woman died in January despite being given all 26 available antibiotics.

According to the World Health Organisation, "Without effective antimicrobials for prevention and treatment of infections, medical procedures such as organ transplantation, cancer chemotherapy, diabetes management and major surgery (for example, caesarean sections or hip replacements) become very high risk."

A potential answer to the threat of antibiotic resistance is to use plastics for drug delivery. Plastics are a type of polymer (incidentally so is DNA) and polymers have a number of advantages in the body:

Easy to prepare
Reduces dosing frequency
Maintain therapeutic concentration with one dose
Reduced side effects
Improved stability
Prolonged release

However, we need to consider what happens to this plastic long term. How long is it acceptable to leave it in the body? So, we need to find a biodegradable polymer. This isn't as straightforward as it sounds, as you need the right enzymes and bacteria to degrade the polymer. For example, a "biodegradable" polymer wouldn't actually degrade in landfill because it is too dry and there is too little oxygen, so the enzymes and bacteria can't survive there.

There are some very specific requirements for this plastic. It has to be bio-compatible, non-toxic, permeable, biodegradable, pure and with a high tensile strength. There are three plastics being looked at: PLA, PGA and PTMC. The latter seems to be the best choice as it is resistant to hydrolysis, which means that it sticks around longer, and it isn't brittle.

Polymers for drug delivery

How can we alter the properties of PTMC to make it into the delivery system that we want? Through using technical processes such as cross-linking, copolymerization and functionalization to incorporate functional side chains. The idea is to attach antibiotics into the basic structure of the PTMC. The antibiotics Gentamicin and Clindamycin are both being looked at with regard to this process as they cause severe side effects (Gentamicin can cause permanent deafness). You can read more about this research here.

Different delivery vehicles to get drugs into the eye are also being looked at. 95% of a dose placed in the eye using a dropper is washed away. Is there a better way? Work has started on a contact lens that would have an antibiotic imprinted into it. That way the drug is trapped between the contact lens and the cornea - see UoN's research here and also some work by Harvard here.

Another big area of research is on the cargo - the drug itself. Does it have to be a small molecule? For example, even though there is no human-human transmission at the moment, there are huge fears about H5N1 influenza, also known as bird flu. It has a 60% mortality rate and would be a massive issue if it became pandemic. So a nanovaccine has been created, which is preventative rather than curative (some background can be found in this UoN pdf presentation and this research from the US).

Flu Virus 

Virus Like Particles
The final area of research is targeting strategies. The exterior of a virus is often a protein cage known as a capsid. So-called "virus like particles" mimic these capsids and trigger an antibody response that protects the vaccine recipient from later infection. An example of this technology is the Gardasil HPV vaccine.

You can also make these biological cages from things such as Ferritin, a storage protein for iron. The cage can be opened and closed by varying the pH of the environment - while the cage is open, the iron can be replaced by other things such as cancer drugs.


Final Comments
Of course there are crossovers between lots of these areas of research. It may take a while for some to reach the public but these are exciting times in the field of drug delivery strategies.

Overall, it is clear that the direction of travel is for new drugs to be highly targeted so that only milligram dosages are required - aspirin certainly would not be licensed today!

The Public Lecture Series returns to the University of Nottingham on the 16th of February at 6:00pm where Julian Onions will talk about Things That Go Bang In The Night (Sky). For more information, please visit the Public Lecture Series site: https://www.nottingham.ac.uk/physics/outreach/science-public-lectures.aspx

Image Sources
Microneedles - Copyright: Ryan Donnelly, Queen's University Belfast
Images from Talk - Copyright : Gav Squires
Flu Virus

Monday, 6 March 2017

Truss Me

A buddy I shall call Dr K tipped NSB off about an iOS and Android app called "Truss Me" in which you are challenged to build support structures for increasingly difficult combinations of weights and base fixing points. It's a great combination of game and educational experience, and really provides a feel for what kind of structures work....and which ones don't!

The game awards a rating of 1-3 "bolts" to your design, depending on how lightweight it is. The app also helpfully shows which beams are in tension and which are in compression - useful to know, as beams in compression tend to buckle if made too thin. NSB is working through the game with the aim of getting 3 bolts at each stage! [Update 25th March - now completed all 24 levels, 3 bolts all the way!]

You can read more about Truss Me on the developer's page at http://www.scientificmonkey.com/software.html.

BFTF's scores are shown below - can you beat them by designing a lighter structure?


Challenge 1 : 799 points, 3.1kg
Challenge 2 : 249 points, 10.0kg
Challenge 3 : 283 points, 8.8kg
Challenge 4 : 268 points, 9.3kg
Challenge 5 : 94 points, 26.6kg
Challenge 6 : 87 points, 28.7kg
Challenge 7 : 278 points, 27.0kg
Challenge 8 : 98 points, 25.4kg
Challenge 9 : 211 points, 35.6kg
Challenge 10 : 74 points, 33.9kg
Challenge 11 : 214 points, 23.4kg
Challenge 12 : 54 points, 46.2kg
Challenge 13 : 230 points, 43.5kg
Challenge 14 : 88 points, 28.3kg
Challenge 15 : 63 points, 79.4kg
Challenge 16 : 177 points, 42.4kg
Challenge 17 : 111 points, 22.6kg
Challenge 18 : 98 points, 76.6kg
Challenge 19 : 349 points, 28.7kg
Challenge 20 : 245 points, 61.2kg
Challenge 21 : 874 points, 11.4kg
Challenge 22 : 320 points, 46.8kg
Challenge 23 : 201 points, 87.3kg
Challenge 24 : 306 points, 24.5kg

Saturday, 18 February 2017

Talk : Sense about Science

Nottingham Cafe Sci recently hosted Leah Fitzsimmons from the University of Birmingham who gave a talk as an Ask For Evidence Ambassador for Sense About Science on the importance of asking for evidence. @Gav Squires was there and has kindly written this guest post summarising the event.

Sense About Science was formed thirteen years ago by people who were sick of the ridiculous news headlines about science. It's based around the idea of putting science and evidence in the hands of the public. They have run a number of campaigns:

- For The Record
- AllTrials
- Don't Destroy Research
- Evidence Matters
- Libel Reform
- Understanding Health Research

Of these, the AllTrials campaign has probably been the largest. It's all about transparency. Dr Ben Goldacre was involved with setting it up and its goal is to see every single clinical trial registered and every single outcome reported. It has resulted in the creation of an automatic trial tracker.

Sense About Science also deals with education. They run workshops and produce publications such as the "I Don't Know What To Believe" leaflet, which has been downloaded more than 500,000 times. They also aim to connect experts with the public through initiatives such as "Voice of Young Science", online public Q&A sessions and "Ask For Evidence".

Ask For Evidence is built around three questions:

What is evidence?
Is it good evidence?
What does the evidence mean?

The idea is that the public can take any claim, for example something they've seen in an advert or something from government policy, and Sense About Science will find an expert in that field and find evidence relating to that claim. Here are some of the things that they've looked into so far:

Ann Summers claimed that Buzz Fresh wipes would help users prevent infection. However, they later admitted that things could be as clean and hygienic using water alone.

Vision Express said that their most expensive contact lenses helped to preserve eyesight the best. However, they admitted that it was a sales ploy and agreed to retrain their staff.

Holland & Barrett said that their detox tea was proven to help reduce weight and provided some research to back it up. Unfortunately, the research was about the effect of green tea on people who are morbidly obese - Holland & Barrett's tea didn't contain green tea though; it contained Oolong.

Wireless Armour claimed to protect your "assets" from radiation. The research they cited didn't actually test the product claims and the Advertising Standards Authority ruled that there was "insufficient evidence", so the claims had to be dropped.

How does Ronseal's "anti-microbial" paint work? When Sense About Science asked, they were informed that Ronseal couldn't tell them as it would "breach confidentiality"

An article in The Sun suggested that frequent gadget use puts kids at higher risk of autism. But the research presented only showed a correlation, not a causal link.

Are old festival wristbands an illness-causing health alert? Not according to the research involved. Yes, they do contain bacteria, but a lot less than other things such as toilet handles.

Speaking of festivals, are those reusable cups that they have, and that are used at several sporting events, actually improving the eco-credentials of these events? Several studies have shown that they are actually better than biodegradable ones. This was even true when they were first introduced and had to be shipped back to France in order to be washed.

The Ask For Evidence campaign also looks at helping turn speculation into evidence. For example, Network Rail were looking at installing blue lights at all train stations as a way to combat suicides. Research has shown that blue lights have an effect on mice but would that translate to humans? So, now Network Rail are carrying out a study. Had they just gone ahead and installed the blue lights, it would have been impossible to tell whether any fluctuation in suicide numbers was down to the lights or was just coincidence.

When Sense About Science was looking into whether investment in alcohol treatment can recoup five times the cost in savings as claimed, DrugsScope tweeted a reply and links to detailed evidence within 8 minutes.

Understanding evidence makes us a more empowered society but no one can critique every bit of research. While there is no magic bullet for transparency, by not asking for evidence, we're letting science off the hook. So, if you've seen a claim, ask for evidence!

Café Sci returns to The Vat & Fiddle at 8:00pm on the 13th of February where Dr Thomas Sotiriou from the University of Nottingham comes to talk about "Gravitational Waves and Black Holes - Einstein's Amazing Legacy". For more information, visit the Café Sci Meet Up site:


Leah and Sense About Science

Sunday, 1 January 2017

Talk : Breast Cancer - A Researcher's Perspective

The University of Nottingham recently hosted Professor Stewart Martin from the School of Medicine for a public lecture on "Breast Cancer - A Researcher's Perspective". @Gav Squires was there and has kindly written this guest post summarising Prof Martin's talk. (NB: some images sourced by NSB, text slightly edited)

The city of Nottingham has an international reputation for its work on breast cancer. The Nottingham Grading System is the international gold standard for the classification of breast cancer. Meanwhile, the Nottingham Prognostic Index is one of the most effective systems for clinical decision support in routine clinical management - it dictates the best support. The first blood test to detect lung cancer was developed in Nottingham and now a similar test is in development for breast cancer. Over the next two years a major breast cancer research centre will be built in Nottingham.

UoN Medical School (source)

Breast Cancer Incidence and Survival Rates
Around two thirds of women who get breast cancer will survive for 21 years or more, 80% survive for 10 years or more and 95% survive for at least a year. This is despite a growing incidence of breast cancer. There are around 15,500 breast cancers diagnosed every year. Screening leads to around 1,300 lives being saved, although 4,000 cancers are over-diagnosed. In the UK, 120,000 women will lose their lives over the next decade, and 10,000 women die globally every week. Breast cancer is the most common form of cancer in women. In addition, there are 350-400 cases in men each year.

Breast cancer incidence by age in women (UK) 2006-08 (by Mikael Häggström)

Breast Cancer Development
Breast cancer can occur either in the breast ducts or in the lobules, but it is more common in the ducts. It happens when normal cells start behaving abnormally.

Early Stage (Stage 1) invasive cancer is where it hasn't spread beyond the breast or the lymph nodes on the same side of the body.

Locally Advanced (Stage 3) is when it still hasn't spread further but is bigger than 5cm across and growing into the skin or muscle of the chest, and/or there is extensive lymph node involvement.

Metastatic (Stage 4) cancer is where it has spread to other parts of the body such as the liver or bones.

Of around 700 patients annually in Nottingham, only 10% are at the locally advanced stage - a tribute to the screening programmes.

How Cancer Spreads (By Jane Hurd )

The Nottingham Prognostic Index
This index looks at three measures of the cancer:

• How big?
• How aggressive?
• How much nodal involvement?
Patients are stratified into three groups - those with a poor outlook, those with an average outlook and those with a good outlook.
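For the curious, the published NPI formula (not given in the talk) combines these three measures as 0.2 × tumour size in cm, plus a 1-3 grade score, plus a 1-3 nodal score. The sketch below uses the commonly quoted cutoffs and is illustrative only, not clinical guidance:

```python
# Nottingham Prognostic Index - illustrative sketch only
def npi(size_cm, grade, node_stage):
    """size_cm: tumour diameter in cm; grade: 1-3; node_stage: 1-3."""
    return 0.2 * size_cm + grade + node_stage

def outlook(score):
    if score <= 3.4:
        return "good"
    elif score <= 5.4:
        return "average"
    return "poor"

score = npi(size_cm=2.0, grade=1, node_stage=1)
print(score, outlook(score))
```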

Cancer Treatment
There has been an explosion in how you can map cancers at the molecular level. It turns out that there are between 15 and 20 types of breast cancer and these fit into four main classes - Luminal A, Luminal B, HER2+ and Triple Negative. Hormone therapy can help with Luminal cancers, while molecular techniques can help with HER2+. Triple Negative cancers are treated with traditional therapies such as chemotherapy.

Around 50% of breast cancers are Luminal A and these have the best prognosis. 15-20% are Triple Negative and these have the worst prognosis. Around 10% of cases are HER2+ and the remaining are Luminal B.

There are currently three big gaps in breast cancer research. The first is early detection - at the moment, only women over 47 can get a screening mammogram, so the plan is to try to develop a blood test. The second is trying to stop the cancer spreading - discovering why it spreads and finding ways to stop it. Finally, there is the need to treat it right - developing targeted treatments for each patient and finding ways to improve conventional treatments.

Blood Test Samples (by GrahamColm)

Cancer Break Up is Bad
When the primary cancer breaks up and makes its way around the body, the survival rate drops. Blood vessels feed the cancer and allow it to get bigger. Meanwhile, lymphatic vessels help to drain the cancer. Ten years ago it was thought that cancer metastasised through the blood vessels. New technology has allowed scientists to differentiate between blood vessels and lymphatic vessels. You don't see that many lymphatic vessels in a cancer but there are a lot of blood vessels, so we would expect to see many more cancer cells in the blood vessels - but actually 97% of the invasion is in the lymphatic vessels. The cancer invades these vessels because it is easier to migrate there and it also causes the immune system to act differently.

Prof Martin (via @GavSquires)

Cancer Mortality (Mikael Häggström, using using reference:Jemal A, Siegel R, Ward E et al. (2008). "Cancer statistics, 2008")

Calpastatin and Calpain
Decreased levels of the protein calpastatin lead to increased levels of one or more of the calpain protein family, which allow cells to start migrating. Patients with triple negative breast cancer (the most dangerous type) have a significantly worse prognosis if their tumours have high expression of calpain-2. Calpain also seems to play a role in regulating therapeutic response - women resistant to certain treatments have high levels of calpain-1. So, looking at calpain levels helps with prognosis and, by targeting it, treatment can be improved and the spread of cancer can be stopped.

Calpain (by Jawahar Swaminathan and MSD staff at the European Bioinformatics Institute)

Radiotherapy uses high-energy radiation to treat cancer. It increases oxidative stress, but cancer cells often have increased anti-oxidants, which could decrease the treatment's effectiveness.

Can we do something to improve this treatment? Yes - one family of proteins stands out, and if we can target these proteins we can improve the treatment. Unfortunately, it takes a long time to test the drugs that could have an impact. However, it seems that metformin, a diabetes treatment that is over 50 years old, can make radiotherapy 100 times more effective in certain cancers.

Metformin (source)


Life Cycle 6 Fundraiser

The public lecture series returns on 19th January at 6pm in Lecture Theatre B1 at the School of Physics and Astronomy where Claire Sycamore will talk about "Thinking Outside the (Pill) Box: Alternative Drug Delivery Strategies". For more information visit: http://www.nottingham.ac.uk/physics/outreach/science-public-lectures.aspx

Related Posts
Targeted Drug Therapeutics