Why do we call it a stent?

 

 

They are a small, select group. Those individuals whose surnames have been transformed over the years into nouns, and even verbs, have certainly achieved immortality of a sort. But, of course, often when we use these words we do not even recognise that they were once names. The French physician Joseph-Ignace Guillotin is a member of the group, as is the American industrialist William H. Hoover. And so is a now largely forgotten English dentist, Charles Thomas Stent.

Stent was born in Brighton in 1807 and practised dentistry in Victorian London. His principal contribution to his field came in 1856, when he successfully modified the material used to make dental impressions. Earlier in the nineteenth century the main impression materials had been beeswax and plaster of Paris. Because these were far from perfect, the English dentist Edwin Truman had introduced the use of gutta percha in 1847. This natural material, derived from the latex of tropical trees, was an improvement over its predecessors but was still found wanting. It had a tendency to distort after removal from the patient’s mouth and would shrink on cooling. To stabilise the gutta percha and improve its plasticity, Stent decided to add stearine derived from animal fat. He also added talc as an inert filler and red colouring. When the new, improved material was introduced it was an instant success, and indeed Stent was lauded by his profession.

Both of Stent’s sons followed him into dentistry and together they founded a company, C. R. and A. Stent, that would manufacture the increasingly successful Stent’s Compound for the next four decades. In 1901, when the second of his sons died, the dental supply company Claudius Ash and Sons of London purchased the rights and continued its manufacture under the Stent name.

But how did the name of a dental impression material find its way into the wider world of surgical devices? The story is convoluted and not without controversy. Stent’s Compound was certainly widely known in dental circles throughout the latter half of the nineteenth century, but the story really begins with its use by a Dutch plastic surgeon in the First World War. That surgeon, J.F. Esser, was trying to find novel ways of repairing serious facial wounds in soldiers from the trenches. He described in 1917 how he used “the mould of denticle mass (Stent’s) in fixation of skin grafts in oral surgical repair of war wounds.” Later in the same article, he referred to the material he used as “stent’s mould.” This pioneering work was cited in a 1920 book on the Plastic Surgery of the Face, in which the author noted, “The dental composition for this purpose is that put forward by Stent and a mould composed of it is known as a ‘Stent’.” Thus, Charles Stent’s surname became a noun for the first time.

Throughout the subsequent decades of the twentieth century, surgeons trained in the UK and US would have been well aware of such material being used in oral and plastic surgery. As there is often significant crossover from one sub-speciality to another, especially as they develop their own identities, the concept of the “Stent” would find its way into diverse fields. In the reconstruction of the common bile duct, polyethylene tubes would be used to maintain the structure’s patency, and in 1954 this was referred to as “a stent for the anastomosis”. In urology, where there is also an obvious need to hold tubes open, the word stent was first used in 1972. However, today the most common use of the word stent is in cardiology. Its first use in that field was not until 1966, when it appeared in the context of heart valve surgery. While the author of that paper was fully aware of the use of the word by plastic and oral surgeons, he later claimed that he thought it was “an all-purpose term for any kind of nonbiological support used to give shape or form to biological tissue.” The notion of an endoarterial tube graft as we know it today is attributed to researchers in the 1960s, and the first coronary stent was implanted in Toulouse in 1986.

Today, the word stent has entered common usage and, particularly because of its wide use in cardiology, is well understood by any informed lay person. Few, however, either in the medical professions or elsewhere, are aware of the origins of the name. One cardiac surgeon noted, “The greatest accolade that can be given to any inventor is to have the initial capital letter dropped from his name, for that is recognition that the word is now in the general language.” Charles Stent, the Victorian London dentist, has certainly achieved immortality, but his contribution to the development of the modern stent has often been overlooked.

 

© Allan Gaw 2018

Now available in paperback…

Testing the Waters: Lessons from the History of Drug Research

What can we learn from the past that may be relevant to modern drug research?


“I cannot recommend it highly enough: even if you read nothing else about the origins of drug research and what it can (should?) teach us, read this….This is a ‘buy’.”  Madhu Davis review in Pharmaceutical Physician May 2016.

My other books currently available on Kindle:



Writing the rules for drug research: Johann Christian Reil (1759-1813)

Today, the name Johann Christian Reil is perhaps best known to neuroanatomists and those working in mental health, for this German physician, who was born over 250 years ago, made seminal contributions to both of these fields. As his legacy, we have several structures in the brain that bear his name, as well as ‘psychiatry’, a specialty that he named. However, earlier in his career, Reil was responsible for another, perhaps even more generally applicable, piece of work, when he set forth the principles on which the modern evaluation of drugs in humans should be based.

His life

Born in Northwest Germany, Reil began his medical education in Göttingen in 1779. He taught at the University of Halle for 22 years, during which time he established himself as an esteemed physician, scientist and educator. During his tenure, he promoted the idea that medical practice should be grounded in physiology, which in turn should have chemistry as its foundation.

In 1810, he moved to the new Berlin University as the Dean of the Medical Faculty, but the continued political unrest in Europe would interrupt his career and bring it to an end. In 1813, during the Napoleonic Wars, Reil volunteered for military service in the Prussian Army. He was commissioned to the field and, possibly as a result of his efforts to control a typhus epidemic then raging in Germany, he contracted the disease himself. Approximately 10% of the German population were infected and, of these, around 10% (some 250,000) died. Reil was one of these unfortunates, dying of the disease at the age of 54.

His approach to pharmacology

In 1799, while he was the Professor of Medicine at Halle, Reil published his Beitrag zu den Prinzipien für jede künftige Pharmakologie (Contribution to the principles of a future pharmacology), in which he proposed a set of rules for the conduct of pharmacological research. These were presented as a theoretical framework on which to build a scientific approach to the study and evaluation of drugs.

As his first rule, Reil chooses to emphasise the overall approach and motivation for pharmacological research and indeed for any form of scientific endeavour.

1) The observer must have good common sense, good understanding, judgement, know how to make observations but also have a healthy degree of scepticism. He should not allow himself to be influenced by egotism, doctrine, an attachment to his school, or any prejudice, but by the simple love of truth.

Next, he states the importance of standardization, both with respect to the drug under test and to the research participants. He notes that the researcher has the power to control the quality and consistency of the drugs used, but that the inter- and intra-individual variation of human subjects can only be acknowledged and taken into account.

2) If the results of experiments, that is the changes brought about in the human body by the drugs, are consistent, they can be considered to be undoubtedly valid only if both the drugs and the human beings used in the series of tests are of a standardized nature. If the reagents which are working on one another are sometimes of one quality, sometimes of another, then the results will be correspondingly inconsistent…Every individual is different from another, and the same individual is not the same all of the time. With human beings a standard, average type has to be established, and before the experiment the test subjects must be assessed to identify exactly how they deviate from the standard so that variations in the results can be compensated for accurately for each individual according to how they deviated from the norm. How difficult this point is and yet how inexact efforts are with regard to it.

His third rule focuses on a similar theme, but here he discusses the importance of accurate diagnosis and previews one of his later themes—the all-importance of consistent nomenclature.

3) If we are experimenting on invalids the same things apply. Their illness must not be hypothetical, but real, identifiable from recognizable symptoms and deemed by the experimenter as really being present…How many mistakes have wormed their way into pharmacology because of confused terms about the nature of diseases and an imperfect semiotic of the same.

The reproducibility of research findings is emphasized in Reil’s fourth rule. The confounding effects of variables that have not been controlled and the cherry-picking of the desired result from a variety of non-standardised experiments are, in Reil’s opinion, reasons for the mistaken advocacy of ineffective treatments.

4) The experiments must be repeated often and under exactly the same conditions and in each repetition the results have to be the same. This alone can convince us that the results are effects of the drug. If, after one or more tests, a given effect sometimes is seen and sometimes not, the possibility remains that it was due not to the use of the drug but some other possible cause.

As an extension to his call for the control of variables within an experiment, Reil makes a call for the study of individual, or simple, drugs in his fifth rule. Here, he draws an important distinction between the study of drugs and their practical use, with his realization that compounds or mixtures will be prescribed. He argues, however, for their components to be tested first separately, and only when understood to be tested in combinations.

5) A drug must be tested on its own, not in conjunction with others, because otherwise it remains uncertain which of the substances used has brought about the effect in question. I am not suggesting, however, that in practice no compound substances should be used… First of all, we must determine the powers of the simple, individual substances in order to be able to work out the effect of compounds of them. The compounded substance must then be thoroughly tested, the same as for a simple, to understand the alteration of its effect which has been caused by its compounding.

The terminology we use to describe drug actions is called into question by Reil in his sixth rule. Here, he calls for specificity and a greater depth to our language that goes beyond mere superficial description.

6) The effects of drugs must be described specifically, not in terms that are too general, as otherwise they are of no practical use…It is not a question of whether the substance simply has an effect, but the ‘what’ and the ‘how’ of this effect.

In Reil’s seventh rule, he states the importance of the scientific method in the study of pharmacology. The critical importance of experiential learning and the process of inductive logic are emphasized.

7) The effects of drugs must be established either through direct experience or from conclusions, which were clearly able to be drawn from direct experience. Their characteristics have to be clearly described… Isolated observations must be collated and general results deduced according to certain rules (e.g. frequency, causality).

In Reil’s eighth and final rule, he returns to his call for improved terminology, with greater precision in the words we use to describe our experimental findings. After stating his rule, he goes on to offer a number of specific examples in support of his argument for a dedicated, technical language of pharmacology.

8) Finally, the terminology used in pharmacology deserves sharp criticism. The meaning we give to words needs to be more precise, more expansive and more accurate. Without this improvement we will remain virtually unintelligible to each other.

His rules in perspective

In the second half of the 18th century, medical practice was in turmoil after a period of relative stagnation lasting more than 500 years. Through numerous discoveries and a wholly different approach, medieval medicine was giving way to a more enlightened system. As part of this, there was a growing desire to place medical practice on a much more solid scientific foundation. The term pharmacology itself was probably coined in the late 17th century, but acquired its modern definition in 1791 from Friedrich Albrecht Karl Gren, a German chemist, physician and friend of Reil.  Gren distinguished the science of the action of drugs from the mere description and collection of drugs. This cataloguing of drugs was materia medica, while pharmacology was a science. A science, however, needed a rational framework and a method, and these Reil sought to provide.

Reil’s rules may be seen as a natural development in this context. However, Reil was not the first to propose a set of rules to formalise the study and evaluation of drugs. The Greek physician and philosopher Galen and the 11th-century Persian polymath Avicenna, as well as the medieval physicians who followed them, did the same.

An obvious question to address is: how much was Reil influenced by his predecessors?

Galen and Avicenna had dominated the medical curriculum for centuries. Galen’s works were widely read and studied in their original Greek and, from the 15th century, in Latin translations. They formed part of the core curriculum in most European medical schools, but by the Renaissance their authority was being questioned. By the mid to late 17th century they were no longer taught in European medical schools.

Similarly, Avicenna’s famous Canon of Medicine was available in Latin throughout the later medieval period and was probably first translated in the second half of the 12th century. The Canon continued to be used in medical schools up until the late 17th to early 18th centuries. However, by the time Reil attended medical school it would have fallen out of favour, to be replaced with more contemporary texts of the medical enlightenment.

In parallel with the upheaval in medical practices in the 18th century, there was a corresponding overhaul of medical education. Medical curricula were changing to accommodate new subjects such as chemistry, botany and physiology. While the classic texts such as those of Galen might still be read, they no longer formed part of the core syllabus.

Thus, Reil would have had access to the works of Galen and Avicenna, but it is questionable whether his medical education in the 1780s would have focussed on, or even included, these classic authors’ works. It is perhaps unsurprising then that Reil’s rules differ significantly in their emphasis from those of the ancients. Like Galen and Avicenna, Reil recognized the need for appropriate experimental design, but he goes much further than his forebears in his calls for scientific rigour and for a new vocabulary to report our findings.

Conclusion

Reil, like many physicians of his day, was not content to specialize in one area. His contributions to several branches of medicine, including pharmacology, are significant, but partly because of his premature death and partly because of his shifting interests, we had to wait until later in the 19th century to see a truly scientifically founded pharmacology.

© Allan Gaw 2018

 

This blog is a shortened version of my paper in the European Journal of Pharmacology

 

If you are interested in reading more about medical history take a look at my books:

‘Trial by Fire’ and ‘On Moral Grounds’ — paperback copies of these are available from my website:

http://www.allangaw.com/sapress.htm

and e-copies are available on Kindle from Amazon.

 

A Snail in the Ginger Beer

 

Donoghue v Stevenson 1932 and our duty of care

Introduction

Paisley may be known for its patterns and its shawls, but this town to the west of Glasgow has a much greater claim to fame, at least according to those interested in the law.  For it was there, 90 years ago, that one of the most important legal decisions in history, and one that still has implications for medical practice today, had its origins.

Mrs Donoghue’s ginger beer

On the hot summer evening of 26 August 1928, Mrs May Donoghue, a shop assistant from the East End of Glasgow, took a tram ride to Paisley. There, she met a friend at the Wellmeadow Café and they decided to quench their thirst. Her friend bought the refreshments, and Mrs Donoghue was served a glass containing ice cream over which the waiter poured some of a bottle of ginger beer, making an iced drink or float. After Mrs Donoghue drank some, her friend added more ginger beer from the dark glass bottle that had been left on the table. As she poured, Mrs Donoghue noticed something fall into the glass, which she recognised as a decomposing snail. Understandably, she immediately felt sick and became “shocked and ill.” She was clearly affected by the event, for she sought medical treatment three days later from her own doctor and again three weeks later, in mid-September, at Glasgow Royal Infirmary.

The legal case

Because it was her friend who had purchased the ginger beer, Mrs Donoghue had no contractual relationship with the café owner. She would later learn that the only one she might sue would be the manufacturer of the drink, David Stevenson, whose name was clearly written on the dark glass bottle in large white lettering. Moreover, she would have to prove negligence on his part if she was to recover any damages, and that claim of negligence would require there to be a duty of care between her and Stevenson. However, at the time, the law supported the existence of a duty of care to people harmed by the negligent acts of others only in very limited circumstances. These included instances where a contract existed between the parties, where the manufacturer was acting fraudulently, or where the product was inherently dangerous. In Mrs Donoghue’s case none of these applied, but she was determined to seek such damages and engaged the Glasgow solicitor Walter Leechman. Interestingly, Leechman’s firm had represented the unsuccessful pursuers in two recent similar “mouse in ginger beer” cases. It seems more than a coincidence that, of all the lawyers Mrs Donoghue might have consulted, the one she chose had both the experience and the resolve to pursue such a case. Quite how she found him, or was directed to him, remains a mystery. Leechman issued a writ against Stevenson claiming damages of £500 plus costs and noting that “snails and the slimy trails of snails were frequently found” in the factory where his ginger beer was manufactured and bottled.

Stevenson’s counsel moved the Court of Session to dismiss the claim and were eventually successful. Thus, Leechman began the process of appealing the decision to the House of Lords. However, Mrs Donoghue had no money. Not only did she have to declare herself in writing as a pauper, so that she could be absolved of the need to post security to cover any costs if her appeal was unsuccessful, but her counsel had to proceed in representing her without any guarantee of payment.

On 10 December 1931, five Law Lords met to hear the first of two days’ arguments in Mrs Donoghue’s case.  Some five months later they delivered their judgement and by a majority of three to two they agreed she did have a case.  Mrs Donoghue, they ruled, was owed a duty of care by the manufacturer and bottler of the ginger beer and she could bring an action against him.  This duty of care was founded on the “neighbour principle” eloquently expounded by one of the law lords, Lord Atkin.  He summarised this principle in the ruling as follows:

“The rule that you are to love your neighbour becomes in law, you must not injure your neighbour; and the lawyer’s question, Who is my neighbour? receives a restricted reply. You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour. Who, then, in law, is my neighbour? The answer seems to be—persons who are so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question.”

The “lawyer’s question” referred to was the one asked of Jesus in Luke’s Gospel, and which prompted Christ to tell the parable of the Good Samaritan.  Indeed, Lord Atkin’s ruling was firmly based on his reading of Judeo-Christian scripture.

Our duty of care today

The importance of this ruling lay in its implications for our understanding of negligence. In law, for there to be medical negligence, three criteria must be met. First, a doctor must owe a duty of care to the patient in question. Second, there must be a breach of that duty of care. And third, the breach must result in harm to the patient. Thus, the concept of “duty of care” is central to our understanding of negligence, and it may be defined as an obligation we hold to take care to prevent harm being suffered by others. In defining to whom we owe this duty of care as our “neighbour”, Lord Atkin created a new basis for the law of negligence, and of course his wide definition of neighbour would certainly include any doctor’s patient.

Conclusion

Not long after the House of Lords had ruled that Mrs Donoghue would be entitled to recover damages if she could prove what she alleged regarding the snail, David Stevenson died. His executors agreed an out-of-court settlement of £200 (almost £10,000 today) and as such the case never went to trial. May Donoghue died in a mental hospital in 1958, likely unaware of the global impact that summer evening trip to Paisley had had some 30 years earlier.

Today, in Paisley you will find a small park, a bench and a memorial stone at the corner of Well St and Lady Lane where the café once stood.  Most locals know little of its significance, but occasionally you will see a stranger standing reading the inscription on the stone.  Often they will be lawyers, sometimes doctors, who have made the pilgrimage from England, from North America or from even further afield, just so they can stand on the spot where our duty of care and the concept of negligence began.

The site of the Wellmeadow Café today

 

Post script

In everything that has been written about this case, including all of the original legal documents, the bottle in question is said to have contained “ginger beer”. What is perhaps not widely known is that the term “ginger” is a colloquialism in Glasgow for any fizzy drink. It is possible that in reality the bottle did not contain true “ginger beer” but some other form of flavoured, aerated water, such as orangeade, referred to by Mrs Donoghue and her friend as “ginger”. Whether the bottle contained a snail at all is also a subject of controversy. Some have argued that Mrs Donoghue’s claim was just a hoax to extort compensation. Whatever the truth of the contents of that bottle, it remains the reason for our understanding of negligence around the world.

 

Detail from the adjacent mural painted by local artist Kevin Cantwell showing the famous snail and the ginger beer bottle.


© Allan Gaw 2018

This article was originally published in the MDDUS’ Insight

 


 

Unravelling the magic — the history behind placebos in research

TODAY, we take the use of placebos in clinical trials for granted, often assuming that this is a relatively recent innovation. The truth, however, is more interesting and begins in pre-revolutionary France.

In 1778, Parisians who were sick and rich could try a novel treatment from a charismatic physician called Anton Mesmer, who had recently arrived from Vienna. His clinic was in the exclusive Place Vendôme in Paris. There you would enter a dimly lit room and join others seated in concentric circles. At the centre of the room was a wooden tub filled with ground glass, iron filings and bottles of magnetised water, along with metal rods. You would be invited to hold one of these rods on your affected body part. In the background there would be hushed silence punctuated by the ethereal sounds from the glass harmonica – a newly invented musical instrument sounding like a wet finger stroking the rim of a wine glass.

The scene set, Mesmer would appear in a lilac silk coat carrying a metal wand. He would sit en rapport with some patients – knees touching and gazing intently into their eyes. His assistants, reported to have been young and handsome, would also help the magnetic flux by massaging the knees, backs and breasts of patients. This combination of sensory stimuli caused many patients to become entranced or mesmerised and some to faint or convulse. And, of course, many claimed to be cured.

But what was really happening here? Lighting, music, costume, drama and sensuality – what was going on was more ritual than medicine, more suggestion than treatment, a little more Dumbledore than doctoring. Perhaps in a pre-enlightenment era, this would simply have been viewed as magic. But, this was the 1780s – the world had moved on. Now, this magic had to have a rational scientific basis and Mesmer provided it. He believed magnetic fluid flowed into us from the stars and that disease was the result of an obstruction to this flow. His treatment was designed to realign this animal magnetism.

Mesmer’s treatments soon became the height of fashion, but he was not without critics, and the establishment would have nothing to do with him. Indeed, the King himself stepped in and appointed a commission to investigate, asking the elderly American Ambassador to France to take the lead. This was none other than Benjamin Franklin. Today we remember Franklin as an elder statesman, but in his lifetime he was among the most celebrated scientists and it was in this capacity that the King sought his help.

Franklin and his colleagues devised a series of experiments using placebos for the first time. Subjects were presented with magnetised objects and with sham objects that looked the same but were untreated. The patients were unable to distinguish the two and variably reported the effects. As a result of these placebo-controlled experiments, the commission was able to conclude that there was no basis to Mesmer’s claims. Instead, they explained that animal magnetism “owed its apparent efficacy to the power of suggestion in susceptible or naïve individuals.”

Although the term placebo did not enter medical parlance until 1785, it is clear that for centuries before, healers had used remedies they knew to be inactive, but which they also knew would appease their patients. Placebo is indeed Latin for “I shall please”. However, Franklin and the Commissioners are credited with being the first to use placebos in a clinical research setting.

Placebos are now an essential part of modern research, used to prevent confounding from the so-called placebo effect, i.e. the effects that an inactive substance, procedure or device may have when administered in a clinical context, over and above the effects observed with no treatment. This effect is complex and still relatively poorly understood, but it is undoubtedly real and can significantly affect our evaluation of different treatments if not taken into account. Whatever the treatment, it may be possible to create a matched but ineffective alternative to act as a control. Benefit may only be claimed if the active treatment produces a significantly greater effect than the placebo.
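To make that last point concrete, here is a minimal, hypothetical sketch in Python (not part of the original article) of how such a placebo-controlled comparison might be analysed. The arm sizes, effect sizes and the choice of a simple two-sample t-test are illustrative assumptions only, not a description of any particular trial.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical improvement scores: both arms improve (the placebo effect),
# but the active arm improves a little more on average.
placebo_arm = rng.normal(loc=5.0, scale=2.0, size=100)
active_arm = rng.normal(loc=6.0, scale=2.0, size=100)

# Compare the two arms with a two-sample t-test.
result = stats.ttest_ind(active_arm, placebo_arm)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")

# Benefit is claimed only if the active arm beats the placebo arm,
# not merely "no treatment".
if result.pvalue < 0.05 and active_arm.mean() > placebo_arm.mean():
    print("Active treatment shows a significantly greater effect than placebo.")
else:
    print("No significant benefit over placebo demonstrated.")

The design point the sketch illustrates is that the comparator is the placebo arm rather than no treatment at all, so any benefit claimed sits over and above the placebo effect.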

Thus, without the ingenuity of a group of enlightened French scientists led by an aging American diplomat, perhaps today we would not have the placebo-controlled randomised clinical trial. Perhaps our clinical practice might still be based only on observation and anecdote rather than hard evidence. And perhaps physicians would still have wands.

© Allan Gaw 2018

Originally published in FYi on July 26, 2018

https://www.mddus.com/resources/publications-library/fyi/issue-20/unravelling-the-magic

 


 

 

People who look like me

 

When former First Lady Michelle Obama unveiled her new portrait at the National Gallery of Art in Washington DC on 12 February 2018, she made a speech.  Standing before the striking image painted by Amy Sherald, she said just how important she thought it was for young black girls to visit the gallery and see the painting, to realise what was possible — to see a portrait of someone who looked like them.  Perhaps she hoped the portrait of a black woman who had become First Lady, hanging side by side with the great and the good of America’s history, would inspire.  Or perhaps not quite that; perhaps she hoped it would simply normalise the achievement.

In whatever field we work, we should not underestimate the power of examples, of role models, of trail blazers, of “people who look like me.” Women dominate the world of clinical research in terms of numbers, but not in terms of leadership. For example, what percentage of chief investigators are women? In contrast, what percentage of research nurses, administrators and pharmacists are women? There are obviously a number of reasons for this imbalance, but might the lack of visible role models be one of the reasons why women are so underrepresented in leadership roles in this field?

When we write books and tell the tale of how clinical research has developed, it is inevitably a story of men, and white men at that. I know this for a fact, because I have written several of those books. How many women innovators in clinical research can you name? Probably none. Is that because there weren’t any, or is it simply that their contributions have been overlooked and their stories untold?

Even a cursory glance through the pages of history, especially recent history, would suggest it is the latter. For example, there is Lady Mary Wortley Montagu, who pioneered smallpox inoculation 75 years before Edward Jenner tested his vaccination procedure.

There are Dorothy Irene Height, an African-American civil rights activist, and Patricia King, an African-American lawyer, who both served on the National Commission that produced the Belmont Report in 1979 and defined the three key principles that would underpin clinical research ethics in the US and in many other countries up to the present day.

And there are three women in Nuremberg: Herta Oberheuser, the German defendant; Wladyslawa Karolewska, the Polish victim; and Vivien Spitz, the American court reporter, who all sat in the same dismal courtroom in 1947. Their combined story helped shape the outcome of the Doctors’ Trial and the way we would remember it seven decades later.

Against this background of contemporary gender inequality, should we do more to raise awareness of these women and the countless others who have helped forge modern clinical research?  I think we should, and I’m busy trying to do that right now, but in so doing I have come across a further, deeper complication in the work.

My approach has been one of excavation — to identify women who have played a significant part in the story of clinical research and to tell their stories, and thus to inspire and motivate, or perhaps just to normalise their contributions. But, by specifically focussing on women, I have been taken to task by some who feel that the problem is not solved by such positive discrimination.

“Why would you write a book about women in research?” they say. “You should be writing a book about people in research.”  This objection seems to cut to the core of modern feminism — once an inequality is acknowledged, how do you redress it?  Do you restore the balance by giving renewed emphasis to those who have been left out of the story, or is that just a patronising fix that only serves to paper over a much deeper crack?  By focussing on women in this work, do I simply perpetuate, even amplify, a difference that should never have existed in the first place?  I suppose my take, for what it’s worth, is that while it should never have been an issue, it is a real problem now with, I think, real consequences.

I know this is a controversial topic, and I know there is no single answer to it.  I also know that it is not simply my male perspective at odds with a female one, after all not all women think alike, nor do all men.  What it is, however, is an issue that divides rather than unites.  If the ‘solution’ to a problem makes that problem worse, we should take a breath and start again. But if we only think it might make it worse, isn’t it worth a try?

We all need people who look like us, whether we are young or old, black or white, male or female, and if no one is painting pictures of them or writing books about them, I think someone should start.

 

© Allan Gaw 2018

 

 


In the absence of consent

Modern clinical practice and research are built upon the ethical foundation of voluntary informed consent.  Especially in the world of clinical research, we are compelled to navigate a complex maze of rule and regulation in this respect, and for very good reason.

Much of the legislation and professional regulation that govern the issue of consent in research unfortunately has its origins in tales of atrocity, abuse and exploitation.  All histories of clinical research re-tell the tales of ancient, and not so ancient, experimentation where ruthless investigators probed, incised and manipulated their hapless victims without a thought for their well-being.

While this is true and should never be forgotten, it is far from the whole story.  And if we are to have a rounded view of where we have come from, and if we are to understand the meaning of our modern research governance, it is important that we look beyond these stereotypes.

It is uncommon to find any mention of consent or a consent process in research involving human participants before the turn of the 20th century.  However, prior to that time there are many well-documented examples of experimentation involving interventions, including many that were potentially life-threatening.  Without the mention of consent in these tales, we are instantly on our guard and may condemn these experiments as immediately unlawful and immoral.  But, even if there is no formal informed consent process, at least as we would recognise it today, the alternative is not merely abuse and exploitation.  There is surely a lot of middle ground and much of it might still be governed by common decency, professionalism and basic human respect.   Our forebears in clinical research were not, as they are sometimes depicted, all monsters. In fact, almost none of them were—they just lived in a different time.

The approach they took, working in a world where the very notion of patient consent was quite alien, may still have been respectful. Take the following case study as an example.

Edward Jenner, the Gloucestershire doctor from the 18th century, is today best remembered for his contribution to our understanding of vaccination. In the spring of 1796, he performed his first human experiments. A local milkmaid called Sarah Nelmes had contracted cowpox, and Jenner found her to have fresh lesions on her hands. On 14 May 1796, Jenner took pus from Nelmes’s lesions and inoculated the son of his gardener. The boy, James Phipps, was eight years old at the time, and he subsequently developed a mild fever and discomfort in the axillae, but apart from these minor symptoms he quickly recovered. Almost seven weeks after the cowpox inoculation, Jenner inoculated the boy again, this time with pus from a fresh smallpox lesion. Phipps, despite this challenge, did not develop the disease, and Jenner concluded that he was immune. This clinical experiment, along with others, was written up by Jenner, who detailed the procedures but not the informed consent process, for there was none.

There is no record that James Phipps resented his involvement in Jenner’s clinical experiments.  Indeed, the fact that he served as one of Jenner’s pall-bearers some 27 years later at the doctor’s funeral would add weight to the belief that he bore Jenner no ill will as he reflected on his childhood involvement in the vaccination experiments.  Jenner’s experiments would, of course, never find approval from a modern research ethics committee, and his inclusion of minors without any formal parental consent would provoke outrage if carried out today. But, that does not mean he did not respect Phipps and his family.

Of course, we might say that Phipps, as the son of Jenner’s gardener, was outclassed and that his parents were in no position to refuse their employer. All of this is true, but again it does not automatically make Jenner a villain. You may also say that this is a poor example because Phipps was ultimately unharmed by Jenner. Might we not cite the countless other examples of individuals who were maimed in other clinical experiments, who died as a result of their involvement or who lived on feeling embittered and used? Almost certainly we could, but the point of this argument is not to deny that incalculable harm has been done to people without their knowledge and consent in the name of medical research. It is to argue that it is not true that every past investigator failed to care for, or to respect, their subjects.

It is also about the importance of not measuring the past against the standards of the present. I wonder how we will fare when it is our turn to be judged by history. In the 22nd and 23rd centuries, what will medical historians think of our current practices and of what we choose to concern ourselves with? Will we be ridiculed for our lack of sophistication, pitied for our ignorance or even vilified for our cruelty? Will we be viewed as morally reprehensible as well as stupid? Or will an enlightened historian in the future say that we were doing the best we could with what we had and with what we knew? With luck, he or she might conclude that we were not fools, nor were we barbarians or monsters; we were just the products of our time.

© Allan Gaw 2018


The Burden of Swallows

 

Shortly after Seamus Heaney’s death, one lover of the Nobel Laureate’s poetry wrote in the condolence book in Belfast’s Linen Hall Library, ‘Famous Seamus, you’ve up and left us.’  There was affection, familiarity and even poetry in that simple lament.  And it’s the poetry — the poetry of life — that is worthy of reflection.

In his poem The Railway Children, Heaney tells us that the wires stretching between the telegraph poles ‘sagged under their burden of swallows.’ What those darting, twilight visitors lack in size, they make up for in number. As they sit in endless rows, these birds appear to weigh down those lines that are now so busy carrying other forms of twitter. But, of course, that’s the illusion — the poetic explanation of something rather more mundane. In reality, when the birds rise to flight and vacate their roosts, the wires still sag, almost in memory of their recent occupants. However, that notion of memory is also a poetic one, for wires and telegraph poles, and even swallows, do not share our playfulness with the English language.

Poetry is seen by many as a rather unnecessary luxury in their lives.  It decorates the world with something unseen, but felt.  It creates an added sense that tries to express what we might otherwise struggle with.  But, we don’t need poetry in the way we need basic sustenance and shelter and love.  Nevertheless, expressions of beauty, and of horror, as well as all the emotions we strive so hard to keep away from science, do have a use.

Let me take the poetry out of Heaney and see what we have left. I could tell you that the average mass of a swallow is 18g and that the average number of swallows per metre of a well-stocked telegraph wire might be about five. This load, of around 90g per metre, is less than the weight of the wire itself, which of course is the real reason for the sag. All this is true, but since I first read Heaney’s words I have never been able to look at a sagging telegraph wire without imagining the flock of swallows that has just up and left us. A poetic vision of the world does not just add colour, it also brings it into sharper focus. And it adds context. After all, we do not carry on our business of discovery in a vacuum, but in a very real world where consequences have hard and sometimes sharp edges.

The scientist has a clear responsibility to the facts and may define his or her endeavours as a search for the truth. With that sense of responsibility often comes a sense of derision for poetry. But, it doesn’t have to be like that; one need not necessarily mean the demise of the other.

Science is a world full of ‘how’ and ‘why’ that begins with amazement. This universe, this cell, this molecule all possessed majesty long before they possessed meaning. The American astronomer Carl Sagan reminds us that every child who ever lived has gazed up and asked why the sky is blue. Some, I suspect a tiny minority, receive the answer, while others are simply told to be quiet. Some might be offered a fable or a myth as a substitute for the facts. But whatever the answer, the question was born of wonder. And wonder is the stuff of poetry.

Another Nobel Prize winner, the immunologist Sir Peter Medawar, reflecting on the very nature of science, said of its paradoxes:

…a scientist must indeed be freely imaginative and yet skeptical, creative and yet a critic. There is a sense in which he must be free, but another in which his thought must be very precisely regimented; there is poetry in science, but also a lot of bookkeeping.

Those who forget to be poets, who see such a way of looking at the world as flimsy and unnecessary, are only seeing half of what the universe has to offer. And scientists who cannot appreciate the poetry of their work and who focus solely on that bookkeeping are only ever doing half the job.

© Allan Gaw 2018
