A Snail in the Ginger Beer


Donoghue v Stevenson 1932 and our duty of care


Paisley may be known for its patterns and its shawls, but this town to the west of Glasgow has a much greater claim to fame, at least according to those interested in the law.  For it was there, 90 years ago, that one of the most important legal decisions in history, and one that still has implications for medical practice today, had its origins.

Mrs Donoghue’s ginger beer

On the hot summer evening of 26 August 1928, Mrs May Donoghue, a shop assistant from the East End of Glasgow, took a tram ride to Paisley.  There, she met a friend at the Wellmeadow Café and they decided to quench their thirst. Her friend bought the refreshments and Mrs Donoghue was served a glass containing ice cream, over which the waiter poured some ginger beer from a bottle, making an iced drink or float.  After Mrs Donoghue drank some, her friend added more ginger beer from the dark glass bottle that had been left on the table.  As she poured, Mrs Donoghue noticed something fall into the glass, which she recognised as a decomposing snail.  Understandably, she immediately felt sick and became “shocked and ill.”  She was clearly affected by the event, for she sought medical treatment three days later from her own doctor and again three weeks later in mid-September at Glasgow Royal Infirmary.

The legal case

Because it was her friend who had purchased the ginger beer, Mrs Donoghue had no contractual relationship with the café owner.  She would later learn that the only one she might sue was the manufacturer of the drink, David Stevenson, whose name was clearly written on the dark glass bottle in large white lettering.  Moreover, she would have to prove negligence on his part if she was to recover any damages, and that claim of negligence would require there to be a duty of care between her and Stevenson.  However, at the time, the law supported the existence of a duty of care to people harmed by the negligent acts of others only in very limited circumstances.  These included instances where a contract existed between the parties, where the manufacturer was acting fraudulently, or where the product was inherently dangerous.  In Mrs Donoghue’s case none of these applied, but she was determined to seek such damages and engaged the Glasgow solicitor Walter Leechman.  Interestingly, Leechman’s firm had represented the unsuccessful pursuers in two recent similar “mouse in ginger beer” cases.  It seems more than a coincidence that of all the lawyers Mrs Donoghue might have consulted, the one she chose had both the experience and the resolve to pursue such a case.  Quite how she found him, or was directed to him, remains a mystery.  Leechman issued a writ against Stevenson claiming damages of £500 plus costs and noting that “snails and the slimy trails of snails were frequently found” in the factory where his ginger beer was manufactured and bottled.

Stevenson’s counsel moved the Court of Session to dismiss the claim and were eventually successful.  Leechman then began the process of appealing the decision to the House of Lords.  However, Mrs Donoghue had no money.  Not only did she have to declare herself in writing as a pauper, so she could be absolved of the need to post security to cover any costs if her appeal was unsuccessful, but her counsel had to represent her without any guarantee of payment.

On 10 December 1931, five Law Lords met to hear the first of two days’ arguments in Mrs Donoghue’s case.  Some five months later they delivered their judgement and by a majority of three to two they agreed she did have a case.  Mrs Donoghue, they ruled, was owed a duty of care by the manufacturer and bottler of the ginger beer and she could bring an action against him.  This duty of care was founded on the “neighbour principle” eloquently expounded by one of the Law Lords, Lord Atkin.  He summarised this principle in the ruling as follows:

“The rule that you are to love your neighbour becomes in law, you must not injure your neighbour; and the lawyer’s question, Who is my neighbour? receives a restricted reply. You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour. Who, then, in law, is my neighbour? The answer seems to be—persons who are so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question.”

The “lawyer’s question” referred to was the one asked of Jesus in Luke’s Gospel, and which prompted Christ to tell the parable of the Good Samaritan.  Indeed, Lord Atkin’s ruling was firmly based on his reading of Judeo-Christian scripture.

Our duty of care today

The importance of this ruling lay in its implications for our understanding of negligence.  In law, for there to be medical negligence three criteria must be met.  First, a doctor must owe a duty of care to the patient in question. Second, there must be a breach of that duty of care.  And third, the breach must result in harm to the patient.  Thus, the concept of “duty of care” is central to our understanding of negligence, and it may be defined as an obligation we hold to take care to prevent harm being suffered by others.   In defining to whom we owe this duty of care as our “neighbour”, Lord Atkin created a new basis for the law of negligence, and of course his wide definition of neighbour would certainly include any doctor’s patient.


Not long after the House of Lords had ruled that Mrs Donoghue would be entitled to recover damages if she could prove what she alleged regarding the snail, David Stevenson died.  His executors agreed an out-of-court settlement of £200 (almost £10,000 today) and as such the case never went to trial.  May Donoghue died in a mental hospital in 1958, likely unaware of the global impact that summer evening trip to Paisley had had some 30 years earlier.

Today, in Paisley you will find a small park, a bench and a memorial stone at the corner of Well St and Lady Lane where the café once stood.  Most locals know little of its significance, but occasionally you will see a stranger standing reading the inscription on the stone.  Often they will be lawyers, sometimes doctors, who have made the pilgrimage from England, from North America or from even further afield, just so they can stand on the spot where our duty of care and the concept of negligence began.

The site of the Wellmeadow Café today


Post script

In everything that has been written about this case, including all of the original legal documents, the bottle in question is said to have contained “ginger beer”.  What is perhaps not widely known is that “ginger” is a Glasgow colloquialism for any fizzy drink.  It is possible that in reality the bottle did not contain true ginger beer but some other form of flavoured, aerated water, such as orangeade, referred to by Mrs Donoghue and her friend as “ginger”. Whether the bottle contained a snail at all is also a subject of controversy.  Some have argued that Mrs Donoghue’s claim was simply a hoax to extort compensation.  Whatever the truth of the contents of that bottle, it remains the foundation of our understanding of negligence around the world.


Detail from the adjacent mural painted by local artist Kevin Cantwell showing the famous snail and the ginger beer bottle.


© Allan Gaw 2018

This article was originally published in the MDDUS’ Insight


If you are interested in reading more about medical history take a look at my books:

‘Trial by Fire’ and ‘On Moral Grounds’ — paperback copies of these are available from my website:


and e-copies are available for Kindle on Amazon.



Unravelling the magic — the history behind placebos in research

TODAY, we take the use of placebos in clinical trials for granted, often assuming that this is a relatively recent innovation. The truth, however, is more interesting and begins in pre-revolutionary France.

In 1778, Parisians who were sick and rich could try a novel treatment from a charismatic physician called Franz Anton Mesmer, who had recently arrived from Vienna. His clinic was in the exclusive Place Vendôme in Paris. There you would enter a dimly-lit room and join others seated in concentric circles. At the centre of the room was a wooden tub filled with ground glass, iron filings and bottles of magnetised water along with metal rods. You would be invited to hold one of these rods on your affected body part. In the background there would be hushed silence punctuated by the ethereal sounds of the glass harmonica – a newly invented musical instrument sounding like a wet finger stroking the rim of a wine glass.

The scene set, Mesmer would appear in a lilac silk coat carrying a metal wand. He would sit en rapport with some patients – knees touching and gazing intently into their eyes. His assistants, reported to have been young and handsome, would also help the magnetic flux by massaging the knees, backs and breasts of patients. This combination of sensory stimuli caused many patients to become entranced or mesmerised and some to faint or convulse. And, of course, many claimed to be cured.

But what was really happening here? Lighting, music, costume, drama and sensuality – what was going on was more ritual than medicine, more suggestion than treatment, a little more Dumbledore than doctoring. Perhaps in a pre-Enlightenment era, this would simply have been viewed as magic. But this was the 1780s – the world had moved on. Now, this magic had to have a rational scientific basis, and Mesmer provided it. He believed magnetic fluid flowed into us from the stars and that disease was the result of an obstruction to this flow. His treatment was designed to realign this animal magnetism.

Mesmer’s treatments soon became the height of fashion, but he was not without critics, and the establishment would have nothing to do with him. Indeed, the King himself stepped in and appointed a commission to investigate, asking the elderly American Ambassador to France to take the lead. This was none other than Benjamin Franklin. Today we remember Franklin as an elder statesman, but in his lifetime he was among the most celebrated scientists and it was in this capacity that the King sought his help.

Franklin and his colleagues devised a series of experiments using placebos for the first time. Subjects were presented with magnetised objects and with sham objects that looked the same but were untreated. The patients were unable to distinguish the two and variably reported the effects. As a result of these placebo-controlled experiments, the commission was able to conclude that there was no basis to Mesmer’s claims. Instead, they explained that animal magnetism “owed its apparent efficacy to the power of suggestion in susceptible or naïve individuals.”

Although the term placebo did not enter medical parlance until 1785, it is clear that for centuries before, healers had used remedies they knew to be inactive, but which they also knew would appease their patients. Placebo is indeed Latin for “I shall please”. However, Franklin and the Commissioners are credited with being the first to use placebos in a clinical research setting.

Placebos are now an essential part of modern research, used to prevent confounding from the so-called placebo effect, i.e. the effects that an inactive substance, procedure or device may have when administered in a clinical context, over and above the effects observed with no treatment. This effect is complex and still relatively poorly understood, but it is undoubtedly real and can significantly impact our evaluation of different treatments if not taken into account. Whatever the treatment, it may be possible to create a matched but ineffective alternative to act as a control. Benefit may only be claimed if the active treatment produces a significantly greater effect than the placebo.

Thus, without the ingenuity of a group of enlightened French scientists led by an aging American diplomat, perhaps today we would not have the placebo-controlled randomised clinical trial. Perhaps our clinical practice might still be based only on observation and anecdote rather than hard evidence. And perhaps physicians would still have wands.

© Allan Gaw 2018

Originally published in FYi on July 26, 2018






People who look like me


When former First Lady Michelle Obama unveiled her new portrait at the National Gallery of Art in Washington DC on 12 February 2018, she made a speech.  Standing before the striking image painted by Amy Sherald, she said just how important she thought it was for young black girls to visit the gallery and see the painting, to realise what was possible — to see a portrait of someone who looked like them.  Perhaps she hoped the portrait of a black woman who had become First Lady, hanging side by side with the great and the good of America’s history, would inspire.  Or perhaps not quite that; perhaps she hoped it would simply normalise the achievement.

In whatever field we work, we should not underestimate the power of examples, of role models, of trail blazers, of “people who look like me.”  Women dominate the world of clinical research in terms of numbers, but not in terms of leadership.  For example, what percentage of Chief Investigators are women?  In contrast, what percentage of Research Nurses, Administrators and Pharmacists are women?  There are obviously a number of reasons for this imbalance, but might the lack of visible role models be one of the reasons why women are so underrepresented in leadership roles in this field?

When we write books and tell the tale of how Clinical Research has developed, it is inevitably a story of men and white men at that.  I know this for a fact, because I have written several of those books.  How many women innovators in clinical research can you name?  Probably none.  Is that because there weren’t any, or is it simply that their contributions have been overlooked and their stories untold?

Even a cursory glance through the pages of history, especially recent history, would suggest it is the latter. For example, there is Lady Mary Wortley Montagu, who pioneered smallpox inoculation 75 years before Edward Jenner tested his vaccination procedure.

There are Dorothy Irene Height, an African-American civil rights activist, and Patricia King, an African-American lawyer, who both served on the National Commission that produced the Belmont Report in 1979 and defined the three key principles that underpin clinical research ethics in the US and in many other countries to the present day.

And there are three women in Nuremberg — Herta Oberheuser, the German defendant, Wladyslawa Karolewska, the Polish victim, and Vivien Spitz, the American court reporter — who all sat in the same dismal courtroom in 1947.  Their combined story helped shape the outcome of the Doctors’ Trial and the way we would remember it seven decades later.

Against this background of contemporary gender inequality, should we do more to raise awareness of these women and the countless others who have helped forge modern clinical research?  I think we should, and I’m busy trying to do that right now, but in so doing I have come across a further, deeper complication in the work.

My approach has been one of excavation — to identify women who have played a significant part in the story of clinical research and to tell their stories, and thus to inspire and motivate, or perhaps just to normalise their contributions.  But, by specifically focussing on women I have been taken to task by some who feel that the problem is not solved by such positive discrimination.

“Why would you write a book about women in research?” they say. “You should be writing a book about people in research.”  This objection seems to cut to the core of modern feminism — once an inequality is acknowledged, how do you redress it?  Do you restore the balance by giving renewed emphasis to those who have been left out of the story, or is that just a patronising fix that only serves to paper over a much deeper crack?  By focussing on women in this work, do I simply perpetuate, even amplify, a difference that should never have existed in the first place?  I suppose my take, for what it’s worth, is that while it should never have been an issue, it is a real problem now with, I think, real consequences.

I know this is a controversial topic, and I know there is no single answer to it.  I also know that it is not simply my male perspective at odds with a female one, after all not all women think alike, nor do all men.  What it is, however, is an issue that divides rather than unites.  If the ‘solution’ to a problem makes that problem worse, we should take a breath and start again. But if we only think it might make it worse, isn’t it worth a try?

We all need people who look like us whether we are young or old, black or white, male or female, and if they don’t paint pictures of them or write books about them, I think they should start.


© Allan Gaw 2018



Coming Soon…

Now available in paperback…

Testing the Waters: Lessons from the History of Drug Research

What can we learn from the past that may be relevant to modern drug research?


“I cannot recommend it highly enough: even if you read nothing else about the origins of drug research and what it can (should?) teach us, read this….This is a ‘buy’.”  Madhu Davis review in Pharmaceutical Physician May 2016.

My other books currently available on kindle:


In the absence of consent

Modern clinical practice and research are built upon the ethical foundation of voluntary informed consent.  Especially in the world of clinical research, we are compelled to navigate a complex maze of rule and regulation in this respect, and for very good reason.

Much of the legislation and professional regulation that govern the issue of consent in research unfortunately has its origins in tales of atrocity, abuse and exploitation.  All histories of clinical research re-tell the tales of ancient, and not so ancient, experimentation where ruthless investigators probed, incised and manipulated their hapless victims without a thought for their well-being.

While this is true and should never be forgotten, it is far from the whole story.  And if we are to have a rounded view of where we have come from, and if we are to understand the meaning of our modern research governance, it is important that we look beyond these stereotypes.

It is uncommon to find any mention of consent or a consent process in research involving human participants before the turn of the 20th century.  However, prior to that time there are many well-documented examples of experimentation involving interventions, including many that were potentially life-threatening.  Without any mention of consent in these tales, we are instantly on our guard and may condemn these experiments as unlawful and immoral.  But, even if there is no formal informed consent process, at least as we would recognise it today, the alternative is not merely abuse and exploitation.  There is surely a lot of middle ground and much of it might still be governed by common decency, professionalism and basic human respect.  Our forebears in clinical research were not, as they are sometimes depicted, all monsters. In fact, almost none of them were—they just lived in a different time.

The approach they took, when working in a world where the very notion of patient consent was quite alien, may still have been respectful.  Take the following case study as an example.

Edward Jenner, the Gloucestershire doctor from the 18th century, is today best remembered for his contribution to our understanding of vaccination. In the spring of 1796, he performed his first human experiments. A local milkmaid called Sarah Nelmes had contracted cowpox and Jenner found her to have fresh lesions on her hands. On 14 May 1796, Jenner took pus from Nelmes’ lesions and inoculated the son of his gardener.  The boy, James Phipps, was 8 years old at the time, and he subsequently developed a mild fever and discomfort in the axillae, but apart from these minor symptoms quickly recovered. Almost seven weeks after the cowpox inoculation, Jenner inoculated the boy again, this time with pus from a fresh smallpox lesion. Phipps, despite this challenge, did not develop the disease and Jenner concluded that he was immune.  This clinical experiment, along with others, was written up by Jenner, who detailed the procedures but not the informed consent process, for there was none.

There is no record that James Phipps resented his involvement in Jenner’s clinical experiments.  Indeed, the fact that he served as one of Jenner’s pall-bearers some 27 years later at the doctor’s funeral would add weight to the belief that he bore Jenner no ill will as he reflected on his childhood involvement in the vaccination experiments.  Jenner’s experiments would, of course, never find approval from a modern research ethics committee, and his inclusion of minors without any formal parental consent would provoke outrage if carried out today. But, that does not mean he did not respect Phipps and his family.

Of course, we might say that Phipps, as the son of Jenner’s gardener, was of a lower social class and that his parents were in no position to refuse their employer.  All of this is true, but again it does not automatically make Jenner a villain. You may also say that this is a poor example because Phipps was ultimately unharmed by Jenner. Might we not cite the countless other examples of individuals who were maimed in other clinical experiments, who died as a result of their involvement or lived on feeling embittered and used?  Almost certainly we could, but the point of this argument is not to deny that incalculable harm has been done to people without their knowledge and consent in the name of medical research.  It is to argue that past investigators did not universally fail to care for their subjects or treat them with respect.

It is also about the importance of not measuring the past against the standards of the present.  I wonder how we will fare when it is our turn to be judged by history.  In the 22nd and 23rd centuries, what will medical historians think of our current practices and what we chose to concern ourselves with?   Will we be ridiculed for our lack of sophistication, pitied for our ignorance or even vilified for our cruelty?  Will we be viewed as morally reprehensible as well as stupid?  Or will an enlightened historian in the future say that we were doing the best we could with what we had and with what we knew?  With luck, he or she might conclude that we were not fools, nor were we barbarians or monsters; we were just the products of our time.

© Allan Gaw 2018


The Burden of Swallows


Shortly after Seamus Heaney’s death, one lover of the Nobel Laureate’s poetry wrote in the condolence book in Belfast’s Linen Hall Library, ‘Famous Seamus, you’ve up and left us.’  There was affection, familiarity and even poetry in that simple lament.  And it’s the poetry — the poetry of life — that is worthy of reflection.

In his poem The Railway Children Heaney tells us that the wires stretching between the telegraph poles ‘sagged under their burden of swallows.’  What those darting, twilight visitors lack in size, they make up for in number.  As they sit in endless rows, these birds appear to weigh down those lines that are now so busy carrying other forms of twitter.  But, of course that’s the illusion — the poetic explanation of something rather more mundane.  In reality, when the birds rise to flight and vacate their roosts, the wires still sag almost in memory of their recent occupants.  However, that notion of memory is also a poetic one, for wires and telegraph poles, and even swallows, do not share our playfulness with the English language.

Poetry is seen by many as a rather unnecessary luxury in their lives.  It decorates the world with something unseen, but felt.  It creates an added sense that tries to express what we might otherwise struggle with.  But, we don’t need poetry in the way we need basic sustenance and shelter and love.  Nevertheless, expressions of beauty, and of horror, as well as all the emotions we strive so hard to keep away from science, do have a use.

Let me take the poetry out of Heaney and see what we have left.  I could tell you that the average mass of a swallow is 18g and that the average number of swallows per metre of a well-stocked telegraph wire might be about five.  This load, of around 90g per metre, is less than the weight of the wire itself, which of course is the real reason for the sag.  All this is true, but since I first read Heaney’s words, I have never been able to look at a sagging telegraph wire without imagining the flock of swallows that has just up and left us.  A poetic vision of the world does not just add colour, it also brings it into sharper focus.  And it adds context.  After all, we do not carry on our business of discovery in a vacuum, but in a very real world where consequences have hard and sometimes sharp edges.

The scientist has a clear responsibility to the facts and may define his or her endeavours as a search for the truth.  With that sense of responsibility, often comes a sense of derision for poetry. But, it doesn’t have to be like that; one need not necessarily mean the demise of the other.

Science is a world full of ‘how’ and ‘why’ that begins with amazement.  This universe, this cell, this molecule all possessed majesty long before they possessed meaning.  The American astronomer Carl Sagan reminds us that every child who ever lived has gazed up and asked why the sky is blue.  Some, I suspect a tiny minority, receive the answer, while others are simply told to be quiet. Some might be offered a fable or a myth as a substitute for the facts.  But whatever the answer, the question was born of wonder. And wonder is the stuff of poetry.

Another Nobel-prize winner, the immunologist Sir Peter Medawar, reflecting on the very nature of science, said of its paradoxes,

…a scientist must indeed be freely imaginative and yet skeptical, creative and yet a critic. There is a sense in which he must be free, but another in which his thought must be very precisely regimented; there is poetry in science, but also a lot of bookkeeping.

Those who forget to be poets, who see such a way of looking at the world as flimsy and unnecessary, are only seeing half of what the universe has to offer. And scientists who cannot appreciate the poetry of their work and who focus solely on that bookkeeping are only ever doing half the job.

© Allan Gaw 2018




Elastic Glue


In Seven Dials there’s a sign you might miss. Between the first-floor windows above a shop in Earlham Street, you’ll find the red, white and blue, but slightly battered, board for F. W. Collins.  “Who was he?” you might ask.  Well, the sign also lets you know his claim to fame: “Elastic Glue Manufacturer (Sole Inventor 1857)”.  It’s hard to read that and not think there’s a story worth telling.

Apparently, the shop beneath was for generations a hardware shop run by a series of fathers and sons, all named Fred Collins.  Back in mid-Victorian London, one of them had the idea of mixing an adhesive that would stay flexible when set.  They prepared it in the basement of the shop in huge cast-iron cauldrons and sold batches to fix saddles and problem horseshoes.  The innovation allowed the leather of the saddle or the iron of the horseshoe to hold fast, but to accommodate the movement necessary for their use. In short, Fred Collins’ Elastic Glue ensured any such repairs were effective and resilient.

Today, we talk a lot about resilience — the ability to cope and withstand what’s thrown at us in both our personal and professional lives.  The nurturing of resilience is seen as a good thing, an important attribute to acquire and develop and nothing short of a modern-day survival tool.  But what does it entail?  Like everything else to do with people, it’s hard to imagine that any single answer to that question would apply to everyone. But, if there is some common ground in it all, it might be the ability to bend in the wind, but still hold fast.

The young sapling needs a flexible trunk to take the force of the autumn gusts, but it also needs roots to anchor it to the earth.  Only then can it grow, and in time, after a long apprenticeship learning to deal with the wind, can it acquire the heft to push back with equal force.

In the same way, we need to be flexible enough to bend and accommodate the adverse forces put upon us.  But we also need to be able to spring back and hold our position when those forces abate.  Just like the tree, we need roots to hold us to our course.  There might seem to be a paradox in there.  How can we stay in one place and at the same time be expected to move? Resilience requires resolution, for we need to know what we are standing for.  But it also requires the ability to absorb the blows of misfortune and adversity.  Only by yielding to the wind does the young tree save itself from snapping, and only by accommodating might we remain unbroken.  But bending does not mean we have to shift.  Resilience is not about running away, but rather about standing our ground, while the forces that might break us are used to strengthen our base.

So much for metaphor, but when we are tired and vulnerable to insult, the last thing we may be able to do is bend.  That’s where the nurturing of resilience alluded to above comes in.  But it does prompt the question: can resilience really be taught? Or are we merely at the mercy of our DNA, hardwired by our genes to be pessimists or optimists, worriers, the victimised or the downtrodden, or those who take it all in their stride and emerge on the other side of adversity walking even taller and straighter than before?  Many psychologists contend that while some may be naturally resilient, many are not, but they can learn how to be.  However, those same psychologists disagree on quite what is meant by resilience, so understandably there are different views on what we might focus on to achieve it.  That said, there is a lot of common ground between the theories, and here are five points you might want to consider.

1. Be objective

There is a bigger picture in life, which may be difficult to see, especially when we’re standing in a hole, weighed down by what has been piled on our shoulders.  But remember, it’s not all about you — in fact, it’s hardly ever about you.  Things happen, but the universe wasn’t setting out specifically to spoil your day.  It’s easy for the pessimist in all of us to view adversity as inevitable, but it doesn’t need to be seen as personal.  Standing back and viewing the stresses in life — whether they be the impact of the one-off catastrophe or the collective erosive power of day-to-day inconvenience — as having the potential to teach us something can be empowering.  No matter how dark an experience may seem, there is usually something to be learned from it.  That, of course, is easy to say with hindsight, but much harder to acknowledge in the middle of the night.

2. Be grateful

Kindnesses received and given are perhaps the most underestimated drivers of resilience.  Connecting with other people and sharing your gratitude for what you have and what you are able to give can re-centre your view of it all.  Even stealing a moment privately to take stock of your own resources serves to offset any disappointment for all those things you don’t possess.

3. Acknowledge change

We live in flux, where the world and everything in it, including ourselves, grows, flourishes and dies.  We need to find the flexibility of mind to accommodate that reality.  As we age, some of our goals need to be realigned; as we acquire new responsibilities, some dreams need to be re-imagined. Sometimes it’s about making them smaller, but it might also be about making them bigger.  This is not a failure in ourselves or a cause for distress, but an opportunity to keep our aspirations up-to-date and relevant.

4. Stay healthy

The physical resilience that comes with a good diet, ideal body weight, physical exercise and adequate sleep helps a lot with its psychological counterpart.

5. Laugh

And maybe all this advice should begin with a realisation that life is really rather ridiculous.  This recognition comes much more easily with advancing years.  Everything seems so serious in our youth, and it is only with the passing years that we can muster the perspective to see that almost everything we choose to worry about not only doesn’t matter but is often laughable.  A sense of humour is one of the strongest weapons against despair, and those who can laugh first and foremost at themselves, and then at the situations they find themselves in, are much more likely to come out the other side.


It is impossible to live a life free of adversity, large or small.  The difference we can make is how we deal with it.  Do we bend in that wind and grow stronger or do we break under the strain?  Or do we hold fast, but stay flexible — just like old Mr Collins’ elastic glue.

© Allan Gaw 2018


Now available in paperback…

Testing the Waters: Lessons from the History of Drug Research

What can we learn from the past that may be relevant to modern drug research?


“I cannot recommend it highly enough: even if you read nothing else about the origins of drug research and what it can (should?) teach us, read this… This is a ‘buy’.”  Madhu Davis, review in Pharmaceutical Physician, May 2016.

My other books currently available on Kindle:


Churchill’s Tweets

When examples of great writing are sought, one man’s words are often cited.  The words for which he is best remembered were, however, not designed to be read, but to be heard.  They were delivered by their author from the floor of the House of Commons during one of Britain’s darkest periods.  The man was Winston Churchill and the time was one of world war. Two decades later, when John F. Kennedy was proposing to confer Honorary American Citizenship on Churchill, he summed up the achievements of this consummate writer:

In the dark days and darker nights when Britain stood alone—and most men save Englishmen despaired of England’s life—he mobilized the English language and sent it into battle.

Churchill’s style was deliberately stirring.  He set out to rouse, and as Kennedy put it, to mobilize.  Churchill made many speeches throughout a long and varied political career, and was a prolific author.  Indeed, he was awarded the Nobel Prize for Literature in 1953 ‘for his mastery of historical and biographical description as well as for brilliant oratory in defending exalted human values.’

But, why do we regard these words as examples of great writing?  What sets them apart as memorable, and how can we emulate them? To answer these questions we need to look at some of Churchill’s best remembered speeches.

On May 13, 1940, Churchill made his first speech as Prime Minister to the House of Commons, where he pledged,

I would say to the House, as I said to those who have joined this government: ‘I have nothing to offer but blood, toil, tears and sweat.’

After the evacuation of Dunkirk, Churchill once again rose to the dispatch box of the House on June 4, 1940, and laid out the simple but relentless strategy for war without surrender,

We shall go on to the end, we shall fight in France, we shall fight on the seas and oceans, we shall fight with growing confidence and growing strength in the air, we shall defend our Island, whatever the cost may be, we shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender, and even if, which I do not for a moment believe, this Island or a large part of it were subjugated and starving, then our Empire beyond the seas, armed and guarded by the British Fleet, would carry on the struggle, until, in God’s good time, the New World, with all its power and might, steps forth to the rescue and the liberation of the old.

Two weeks later, Churchill roused the country with his almost Shakespearean rhetoric,

Let us therefore brace ourselves to our duties, and so bear ourselves that, if the British Empire and its Commonwealth last for a thousand years, men will still say, “This was their finest hour.”

With the Battle of Britain won in the air, Churchill voiced the gratitude of the nation in the House of Commons on August 20, 1940.

The gratitude of every home in our Island, in our Empire, and indeed throughout the world, except in the abodes of the guilty, goes out to the British airmen who, undaunted by odds, unwearied in their constant challenge and mortal danger, are turning the tide of the world war by their prowess and by their devotion.  Never in the field of human conflict was so much owed by so many to so few.

At the height of the war, while visiting his old school on October 29, 1941, Churchill addressed the boys and reminded them of his mantra,

…never give in, never give in, never, never, never, never—in nothing, great or small, large or petty—never give in except to convictions of honour and good sense. Never yield to force; never yield to the apparently overwhelming might of the enemy.

On November 10, 1942, Churchill delivered his Mansion House speech and, with Field Marshal Rommel’s recent defeat at the Battle of El Alamein, was able to strike a more hopeful chord,

Now, however, we have a new experience. We have victory—a remarkable and definite victory. The bright gleam has caught the helmets of our soldiers and warmed and cheered all our hearts.

But, even then, he counselled caution,

Now, this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.

These speeches were targeted at the British people and were designed to demonstrate leadership at a time of turmoil, and to provide reassurance that, despite the storms ahead, we, the British people, would prevail.  The language used is not complex: many of the words are of a single syllable and most are in common use.  He uses many standard rhetorical devices, such as triads of examples, e.g. ‘in our Island, in our Empire, and indeed throughout the world’; contrasting pairs, e.g. ‘the New World, with all its power and might, steps forth to the rescue and the liberation of the old’; and rhythms built by repetition, e.g. ‘never give in, never give in, never’ and ‘We have victory—a remarkable and definite victory.’

One of the distinctive features of Churchill’s speech writing, in contrast to some of his predecessors, was the length of his sentences.  His average word count per sentence was 24.2, compared with Benjamin Disraeli’s 42.8 and Francis Bacon’s 72.2.  As Jeff Toney recently noted, this means Churchill could have tweeted while Bacon could not.

Since the average word has about 5 characters, Francis Bacon’s sentence length needed at least 360 characters – too long for a Tweet. In contrast, Winston Churchill’s average sentence can fit at about 120 characters – room to spare for Twitter’s limit of 140.
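Toney’s arithmetic is easy to check for yourself.  As a rough sketch — assuming the common rule of thumb of about five characters per word, and ignoring spaces and punctuation — the estimates can be reproduced in a few lines of Python:

```python
# Back-of-envelope check of the sentence-length arithmetic above.
# Assumes roughly 5 characters per word (a common rule of thumb).
AVG_CHARS_PER_WORD = 5
TWEET_LIMIT = 140  # Twitter's original character limit

# Average words per sentence, as quoted in the text
avg_sentence_words = {
    "Churchill": 24.2,
    "Disraeli": 42.8,
    "Bacon": 72.2,
}

for author, words in avg_sentence_words.items():
    chars = words * AVG_CHARS_PER_WORD
    verdict = "fits in a tweet" if chars <= TWEET_LIMIT else "too long to tweet"
    print(f"{author}: about {chars:.0f} characters per sentence ({verdict})")
```

On these figures, only Churchill’s average sentence, at roughly 120 characters, would have fitted within the old 140-character limit.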

This of course raises a fascinating ‘what if?’ What would Churchill’s tweets have looked like?  Perhaps we don’t have to wonder for they are all there before us — sound bites embedded in his most famous speeches.

Blood, toil, tears and sweat @PM

We shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields #Nazis #Hitler

Never in the field of human conflict was so much owed by so many to so few #Few #RAF

We have victory—a remarkable and definite victory 🙂 #ElAlamein

Not the #end – not even beginning of the #end. But perhaps end of the beginning. #warhope

© Allan Gaw 2017


If you would like to read more about how to write well…

