Every day we make hundreds of decisions – many are small and apparently insignificant, some seem to have more recognisable consequences and very occasionally we find ourselves confronted by the momentous. Our ability to make all these decisions easily and fluently often means the difference between a stressful, unfulfilling day and one where we can go home with our to-do list ticked off.

But how do we make those decisions? Are you the logical kind of decision maker who carefully tallies up the pros and cons and after a little mental arithmetic computes the “best” decision? Are you the type that goes with their gut, not really knowing how the decision has been reached but feeling that this is the “right” choice? Or are you the kind of person who, before making a decision, asks questions such as: is this the way we should do things, is this what I ought to do? Rather than trying to pigeon-hole yourself into one of these categories you should realise that we are all a complex mixture of different decision-making styles. And, moreover, we tend to use different approaches for different kinds of decisions.

Some decisions benefit from the logical approach: for example, choosing a new bank account where you can readily access all the features of the different options and work out which is the best for you. But, while a pros and cons list might be good for making a financial decision, it rarely works for choosing whom to fall in love with. There our guts, or should it be our hearts, have the upper hand. The same is true of buying a new house. The average Briton takes just 21 minutes to choose a new home, while it takes us 284 minutes to decide on which new TV to buy. We use our guts to “just know” whether the house is right, while we use our heads to calculate the best television.

The reason it takes more than ten times longer to pick the TV is, however, largely due to the overload of information we have to deal with, and there is a lesson to be learned here. We tend to regard important life decisions as difficult decisions – and one important consequence of this is that we have the unfortunate habit of also inferring that difficult decisions must be important. That’s where it all goes wrong: just because a decision is difficult does not mean it’s important.

Ironically, this seems to happen when we are confronted with a decision that is unexpectedly difficult – one that we thought should have been easy. It’s almost as if we think: “Oh, I thought this was going to be simple, but it’s not, so that must mean I’ve misunderstood its importance. I’d better work at this. It needs more time, more effort.”

And if you don’t believe this happens, think back to the last time you were standing in a supermarket aisle buying toothpaste. A “simple” task, but now you see there are fifty different varieties to choose from. Some have fluoride, some don’t; some whiten your teeth, some don’t; some are for sensitive teeth, others aren’t. Suddenly, what should have been a trivial decision is elevated by its apparent complexity into a difficult and therefore an important one, worthy of time and attention. But it isn’t. They’re all toothpastes after all; they all clean your teeth, and in the big picture of your life it really doesn’t matter which you choose.

And in life there are many toothpaste decisions like that, where we agonise over the trivial, thinking that the very complexity of the decision means that it’s important. Once you realise that this is not the case, indeed is hardly ever the case, you can turn your attention to those decisions that do matter.

Our ability to make effective decisions is undoubtedly important. Indeed Napoleon said, “Nothing is more difficult, and therefore more precious, than to be able to decide.” But he was talking about deciding whether to invade a country and not which brand to buy in Tesco.

Beware of the trivia and beware of the procrastination that can sometimes occur as a result of our inability to decide. “In any moment of decision,” said Teddy Roosevelt, “the best thing you can do is the right thing, the next best thing is the wrong thing, and the worst thing you can do is nothing.”

Sources

  • Roberts L. The Daily Telegraph, 2 July 2010.
  • Sela A, Berger J. Journal of Consumer Research, August 2012.

© Allan Gaw 2017

This article was originally published in the Spring issue of Practice Manager, and you can view it at this website:

https://www.mddus.com/resources/publications-library/practice-manager/issue-16-spring-2017/practice-matters-decision-making

 

Now available in paperback…

Testing the Waters: Lessons from the History of Drug Research

What can we learn from the past that may be relevant to modern drug research?


“I cannot recommend it highly enough: even if you read nothing else about the origins of drug research and what it can (should?) teach us, read this….This is a ‘buy’.”  Madhu Davis review in Pharmaceutical Physician May 2016.

My other books currently available on kindle:


 

If our understanding of medicine was perfect, there would be no need for research. If no new diseases, like Ebola or Zika, emerged, we could dispense with the need for discovery. If we could prevent or cure all cancers, all infections, all diseases such as diabetes, arthritis and dementia we could consign science to the history books. But everyone knows that this is a dream as yet unfulfilled.

The simple fact is that although we have come a very long way in improving healthcare, there is still much to be done and understood, and there always will be. Diseases change, the characteristics and susceptibilities of the population alter, and even treatments that could once be relied upon no longer work.

For healthcare to be better tomorrow, or even just as good as it is today, we need to keep moving forward and I believe the engine that powers that momentum is clinical research — the process of finding new knowledge and understanding about health and disease that involves people.

Research involving people is thus at the very heart of modern medicine. For this to happen, however, we continually need new, committed researchers and new, willing research participants. There are many misconceptions about how modern clinical research is conducted, and any serious attempt to grow and develop this aspect of medicine, and to allow it to achieve its full potential, must address these misconceptions through the provision of high-quality, accessible education and training opportunities.

Those in healthcare, or contemplating such a career, need to be informed and inspired to take part in research, while those we hope will volunteer for research studies need to be fully research aware and to understand the vital importance of their role and how they fit into the research process.

Thus, the challenge is one of communication and education, and it is a challenge on a grand scale. While there have been many attempts to address this issue, most are relatively small and limited, largely designed to deal with local concerns and to meet statutory training requirements.

In order to explore new educational possibilities in this area, a team at the National Institute for Health Research Clinical Research Network (NIHR-CRN) decided to tackle this challenge by harnessing the power of modern educational technology to offer a Massive Open Online Course or MOOC. The aim of this enterprise was to educate and inform the public, patients and healthcare professionals about clinical research.

 

The result, ‘Improving Healthcare through Clinical Research’, is a four-week online course offered via the NIHR-CRN and their host organisation, the University of Leeds, on the FutureLearn platform. It consists of short tailor-made videos and animations, structured and directed readings and a series of external links. The MOOC also contains short self-assessment exercises and an end-of-course test for those interested in gaining evidence of their satisfactory completion of the course. But a large part of the educational value of the course comes from the discussion boards, where learners are encouraged to post questions and comments and to interact with and learn from each other. The boards are moderated throughout the four weeks of the course by the presenters, and any specific questions on course content are answered.

As I reflect on my involvement with the MOOC, a number of themes emerge. Most education is local and contained — 10 people in a tutorial group, 30 in a classroom, 200 in a lecture theatre. When we step on to a global platform to deliver education in this or in any field, a number of new opportunities present themselves for the first time, along with equally new challenges. We have the opportunity to speak to a diverse and truly international audience and to influence their thinking about clinical research. But we also have a responsibility, especially when talking directly to patients — dealing with their fears, prejudices and misunderstandings, and in some cases managing their overriding search for hope. The international community of learners from a wide range of backgrounds also adds a special dimension to this kind of education, for we are all able to learn from those living and working in completely different healthcare landscapes.

It is a privilege to be part of something on such a global scale. It is humbling to hear first hand the stories of those involved in research, whether as investigators or as participants. And it is a remarkable medium for education, because the MOOC has allowed me to teach more people in a few weeks than I have taught in a lifetime in academia.

Indeed, we are about to deliver this course for the fourth time, starting on May 22, 2017, and over these runs we have reached around 20,000 learners from more than 80 countries. Some are interested members of the public, some are patients, some former research participants, some school children and students and some research professionals already working in healthcare.

If you would like to join that community, why not sign up to take part at https://www.futurelearn.com/courses/clinical-research — it’s free and the feedback so far has been excellent.

 

 

© Allan Gaw 2017

 


For those who believe in a god, the world must seem to be a wonderful evocation of his mind and will. From the golden dawn to the setting scarlet sun; from the blue teeming oceans to the sparkling canopy of stars; from the majesty of the rainforest to the rainbow over the waterfall — these are all taken as ample evidence of their god’s artistry.

For those of us who have no such belief, let me assure you the world is just as colourful and just as dazzling, and perhaps even more wondrous. We see a universe fashioned by the forces of nature rather than the hand of a deity, and humanity as the smallest speck in it all. We strive to understand the workings of this remarkable machine called the Earth and this fragile thing we call life. And, perhaps most importantly of all, we recognise that this world and this life are our greatest treasures, for there are no others.

I was recently pitied by a friend for my ‘atheism’, a word I must admit I detest, as I do not see why I should be defined by something I don’t believe in. There are many things I don’t believe in or subscribe to, like ghosts and day-time television, but I’m not sure that’s how I should be described. “If I didn’t believe in God,” he went on, “I would just do what I liked. I mean, how does anyone who has no god in their life behave morally?” I think the question was genuine, as was the arrant stupidity of the sentiment. The idea that morality has anything to do with religious belief is surely put to rest after even a cursory viewing of the evening news. Now, as in the past, men and women have used their belief in one god or another to justify the most despicable acts imaginable. Abuse, torture, enslavement, mutilation and murder are all carried out by those doing their particular god’s will. No, belief is not a prerequisite for decency, nor is it a necessity for caring about other people, other animals or the planet we call home.

The soaring complexity of creation — and yes, I do believe it was created, just not by a god — leaves me, just like everyone else, in awe. My way into and through this feeling has been science. The artist may attempt to replicate the wonder of it all, even to harness it, but it is the scientist who seeks an explanation. Through our observations we find patterns, and by the careful joining of the dots we craft meaningful pictures that help us understand what we are seeing, hearing, feeling. However, the process of finding out how and why does not destroy the wonder; it is still there, and perhaps even increased by the business of discovery.

And those observations can lead us to unexpected conclusions. As the evidence piles up we are forced to accept the possibility, however unpalatable, that we are not the centre of the universe, or even the solar system. We are compelled to accept that we are one species amongst many on this rather average small, blue planet and that the level of our insignificance in the universal scheme of things is simply unfathomable to our finite minds. But while science reveals our limitations, it simultaneously offers us a view of a further horizon, a more distant shore.

When the biologist JBS Haldane was asked what the study of the works of creation could teach us about the mind of the Creator, he pondered and replied that He must have “an inordinate fondness for beetles”. I rather like the idea of a god tirelessly trying out new designs, until quite suddenly he finds himself overrun by prototypes and embarrassed by the time he has spent at the workbench. Indeed, there are more than 350,000 species of beetle; that’s around one in five of all species of living things, animals and plants. However, the reason there are so many different species of beetle currently on Earth, and who knows how many more since the dawn of life, is not because of an overzealous God, but because there are thousands of different habitats and niches to occupy. No single design can make a living in every environment, so rich diversity is the key to success. Life is there for one reason and one reason only: to thrive and create more life. At least in my opinion, the purpose, even the meaning of life, is life itself.

But what of god — did he weave the strand of DNA that defines you and differentiates you from the microbe, the banana, the haddock or the chimpanzee? Did he craft the molecules from which you as well as the stars are made? Did he shackle the clouds to the sky and the wind to the waves? Did he craft joy as he was etching pain, happiness as he was distilling agony? I think not. For me, the universe, or the vanishingly small portion of it we know about, is quite wondrous enough to comprehend without resorting to a god. Indeed, it was Mervyn Peake who said, ‘To live at all is miracle enough.’ And I think he was right.

© Allan Gaw 2017

 

 


Is it the little things that matter — small kindnesses, inconspicuous acts of generosity and moments of undivided attention? We were all famously told years back not to sweat the small stuff, but I’m not so sure that’s true. Keeping the bigger picture in mind and engineering our lives to ensure that we have the major themes in place is undoubtedly important, but so is the realisation that the big stuff is made up of the detail.

It is in the detail that excellence lies, and inattention to the finer points of anything we do leads us inevitably down the road toward mediocrity. Let me give you an unexpected example.

Whether you are a Harry Potter fan or not, I would defy you not to be impressed by a visit to the original film sets at the Leavesden Studios, north of London. There, as you walk through the great hall and peer into Dumbledore’s Study, the Gryffindor Common Room and the Potions Laboratory, you will of course see immediately recognisable spaces, but you will also see much more. Look closely and you will spy details that could never have been seen on film. The care and attention with which the rooms are dressed and the level of intricate detail are simply breathtaking. On the Common Room notice board every handwritten flyer tells you exactly what to expect of the forthcoming Quidditch practices and where to report any lost toads. In the Potions Laboratory, you can see the benches and the cauldrons and you can almost hear the swish of Professor Snape’s robes, but look on the shelves and there you can start to marvel. Lining shelf upon shelf are literally hundreds of glass jars and vials of the purported magical ingredients, all with intricately handwritten labels. These would never have been visible in the final films, so why bother?

The production crew on the Potter films strove to create a convincing world, and part of that was to ensure that every detail was consistent and believable. The actors and the film crew could see these details even if the cinema-goers could not, and doubtless that was the intention. The detail matters, and getting the detail right is a hallmark of excellence. By taking the effort to make these sets as convincing as possible, the production teams were declaring unequivocally the standards required of everyone involved in the project, and the bar was set firmly high.

 

 

Large scale projects are an enormous challenge, not because they present single big problems, but, rather, because they demand that we do tens of thousands of small things well, and consistently well. This is just as true in the world of clinical research as it is in filmmaking. The quality of our work in clinical research has to be of the highest standard possible, because the stakes are so high. Our findings influence and shape healthcare not just for those involved in the study, but also for countless others across the globe in the years and decades ahead. And the way we ensure this quality standard is to make sure we think about the quality of the detail in our work. Like those British filmmakers, we should remember that just because something will not be obvious does not mean that it can be skimped or done half-heartedly. It is about creating a culture of excellence and attention to detail that pervades the work and which ultimately shapes the finished product.

In our work, we need to have zero tolerance for the mediocre, and we should find the merely ‘good enough’ unacceptable. Instead, we need to replace these approaches with a desire for the best we can do. It may require more effort, sometimes more resources, but always a different attitude. When we understand that excellence is indeed in the detail, we may also come to realise that it is the small things that really do matter.

© Allan Gaw 2017

 


Approaching Whitby along the coastal path there is drama in the air. The ruined Abbey appears like a piece of charred lace against a grey February sky. There is talk of vampires here, and jet — the black fossilised wood found in the nearby cliffs — is carved into expensive jewellery for those who wish to take a little of the gothic home.

As far as the gothic goes, Bram Stoker, the author of Dracula, has a lot to answer for. Setting part of his masterwork in this North Yorkshire port, he forever branded the town as a heaven (or a hell) for goths and others of darker predilections. It’s all good for the tourist trade though and adds another layer, albeit imagined, on to this town, already busy with history. There are Georgian houses and Victorian lighthouses; tales of Viking raiders and smugglers; the legacies of whaling and fossil hunting; and of course civic pride in the young apprentice sailor James Cook who learned his naval ropes here long before he would land on Botany Bay.

Perhaps Whitby is a popular destination because it can cater to your tastes, whatever they may be. Adaptable, it becomes the town you wish it to be. Like a skilled courtesan, it pleases without seeming to try too hard, leaving each visitor feeling satisfied and vowing to return to re-experience the best or to try whatever was left undone through lack of time.

So what was I doing looking out across the harbour from my room in the Pier Inn at dawn one winter morning? Unable to sleep, or perhaps awoken by the bells from the church on the cliff, I pulled back the curtains to see a fishing port coming gently to life. In truth, there are few fishing boats now in what was once one of the busiest and most important ports in England, but there are dog walkers, beachcombers and hooded figures scuffling through the cold morning air to their jobs in hotels and guest houses. And there are delivery vans navigating narrow streets wholly unsuitable for the internal combustion engine. Fresh bread, seafood and newspapers are delivered as well as another commodity that caught my eye — Exotic Fruit for the Catering Trade. For some reason this seemed a little incongruous. The seafood yes, the bread naturally and the papers of course — but exotic fruit?

Whitby, with all its delights, is not really an exotic fruit sort of place. Solid Yorkshire sandstone has been used to build the town and its inhabitants. Lobster pots and tales of angling success are both piled improbably high on the dock sides. There are medieval streets and alleyways, listed buildings that even list and a sense of its own longevity almost as old as the fossils in its rocks. But there is little, if any, need to gild this particular lily with the exotic — it is already special and already golden.

We should take pleasure in the unique and even in the merely unusual, without attempting to smooth its corners and make it fit our ideas. The out of the ordinary may be disconcerting, but it is always interesting, and nowhere more so than in science. ‘Treasure your exceptions,’ counselled the early 20th century Cambridge biologist William Bateson, for he recognised just what could be learned from the unusual. We ignore outliers at our peril for it is in the apparently aberrant that the true story of our data may lie, or at least one complicated aspect of it.

But, worse than discarding the unusual are our often botched attempts at improving upon it. To do so is not only a fruitless task but also a foolhardy one. By taking what is unique and therefore already special and attempting to make it better — to improve upon it — all we end up doing is making it like everything else. In short, making it ordinary. The unique is as special as it gets and that’s the important point.

Whitby, this winter’s dawn, is special and has no need of exotic fruit — by being unique, it is already quite exotic enough.

© Allan Gaw 2017


Stand and look down into the stairwell of the Baltic Arts Centre and your heart will skip a beat. Look up and you will be confronted with an equally breathtaking sense of the infinite, as mirrors below and above you extend the winding stair down into the abyss and up into paradise. This permanent installation by the artist Mark Wallinger is entitled Heaven and Hell, and its simplicity is impressive.

Indeed, there is nothing about this converted flour mill on the banks of the Tyne that does not impress. Remodelled and repurposed into a centre for contemporary art, it stands like an industrial monolith now with panoramic views of Newcastle and Gateshead. Inside, there are large uninterrupted spaces, glass elevators and of course that vertigo inducing stairwell.

‘You’re alright with heights?’ gently probed the gallery guide who greeted me at the door. ‘People usually start at the top and work their way down.’ How different from life, I thought.

I rode the glass elevator to the top floor and wandered through the building, working my way down as directed, and marvelled at the place — the setting for the art on show. While filled with an eclectic mixture of modern art, it is the building that is the real masterpiece. The set of contemporary installations on show the day I visited, albeit transitory, bewildered and bothered in equal measure, but failed, I’m afraid, to bewitch. Or at least, I should say, they failed me. Yes, it is all art; I just thought some of it wasn’t very good art.

Because I had spent more than £20 in the gallery gift shop — it was one of those gift shops in which it was very easy to spend more than £20 — I was rewarded with a free tea voucher for the café. The Baltic Kitchen café, like the rest of the building, is rather lovely, with wonderful views of the Sage Gateshead and the Tyne Bridge through huge plate glass walls. But, like the rest of the gallery, I felt its contents did not quite live up to the container. The languid teenage staff seemed to find it an especial inconvenience to take an order, and the tea came with the teabag still in the cup. I mean, really.

But, I was in a forgiving mood — art does that, even bad art. And so does working your way back down to the start from the heights. Most of the time we are scrambling up the increasingly greasy pole of life to get a better view. Here the view was freely given and the journey on offer was a descent, down a winding and infinite stairway, in order to get your feet back firmly on the ground. And, from that ground, the view up into the building was just as exciting as the view down from the top — soaring white spaces just made to be filled with art.


But glass elevators were not the only means of scaling the heights of the Baltic, and artistic spaces were not all that soared. Outside the building, an inland colony of kittiwakes still nests on its brick ledges just under the stark lettering high on its river facade. They are unaffected by the art on show or the heights to which they have flown. On high is where they dwell. They have no aspiration to climb any higher or indeed to swoop any lower. All they need is here. And they are enjoying the view, unaware that there is any other to be had.

People, however, have a broader perspective on their world and their possibilities. A perspective that leads to ambition and either disappointment or success — or more usually a little of both. We worry about our level, and we waste time on our concerns about the climb. While we usually equate height with status and altitude with success, in the Baltic Arts Centre there was wonder to be had on all levels. Having enjoyed both extremes from the highest heights looking down and from the ground staring up, it was clear that this building might have a lesson to teach.

In this building, as in life, enjoy whatever stage you are at, whatever floor you have reached, and don’t spoil the moment by constantly looking upwards or worrying about the fall.

© Allan Gaw 2017

 


Complex and abstract thoughts are often difficult both to understand and to articulate. When scientists write about their work and detail the interpretations of their findings, they try to offer their readers some coherence, some solidity, but the nature of their subject matter often makes this a challenge. Some authors simply dismiss the challenge and produce turgid prose that is painful and time-consuming to read and ultimately uninspiring. Others, however, pick up the gauntlet and strive to make their ideas understandable on first reading. They are aware of the extra effort needed to make their writing clear and concise. And they appreciate the need to take their readers by the hand and guide them along a logical, well-signposted path of reasoning and argument towards a conclusion.

In addition to this, they also understand the utility of metaphor. For some — especially poorer writers — the notion of metaphor merely conjures up the lyricism of fiction. Metaphor for them represents the unnecessary adornment of language with frills and fripperies. What they fail to understand is that metaphor, used well, is an invaluable piece of instrumentation, and one that can make the difference between clarity and confusion.

Metaphor is a lens through which we can more precisely view an often blurred reality. The use of metaphor can bring an indistinct image into sharper focus. And, of course, in a rather meta way this way of describing metaphor is itself a metaphor to help us understand its use.

A carefully crafted metaphor will do much of the work that is needed in order for a reader to understand our meaning. By presenting a parallel, perhaps more familiar, reality we can align our readers’ thoughts with ours. By offering an alternative, a more solid, example with sharper edges in place of a softer, rather nebulous concept, we can share the fruits of our thought with greater precision.

Whether it be the incomprehensibly large or small, or the dauntingly complex with multiple layers of structure or function that need to be separated to be understood, metaphor has a place. In biochemistry we talk of enzymes and their substrates as ‘keys and locks’; in astrophysics we talk of gravitational ‘waves’; in medicine we talk of a patient suffering ‘depression’. None of these are meant literally, but the metaphor is used in each case to offer clarity to the reader.

What of the structure of the cell? In biology, we study static and often colourful descriptions of the organelles that inhabit the cell and think we know what a mitochondrion or a ribosome looks like. In fact, they are nothing like their cartoons, but that is unimportant when we remember that the metaphor presented is merely a lens to bring the microscopic and inaccessible into focus, giving them line and form and therefore ultimately function.

Beyond this simple and common use of parallel terms or simplified pictures to explain the unfamiliar, we also have throughout science the development of more complex metaphorical systems. These might also be called models, but at their heart they are simply metaphors — the substitution of one thing for another.

In chemistry, the atom is presented as a miniature solar system, with electrons orbiting a star-like cluster of protons and neutrons. While the reality of sub-atomic structure would defy this description, such a model is still useful, especially to help those first learning about atomic structure, before they move on to a more nuanced understanding. In physiology, fluid balance is represented using water tanks and taps, which we can visualise and grasp, when in reality this is a stark, if convenient, simplification of the complexities of this aspect of our homeostasis. Again, however, this model, this metaphor, makes the process immediately accessible to those learning about it for the first time.

We talk of ‘black holes’ and the genetic ‘code’ and the ‘flowering’ of a species; we measure the ‘flow’ of electricity and describe the function of ‘messenger’ RNA; and we even say it all began with a ‘Big Bang’ when of course bangs are noisy, and noise would not have figured at all.

Any model is then a form of metaphor, representing and describing something as something else, and science is full of models to help us rationalise and understand things we can never see and never touch.

As we write about our science, we need metaphor to help make sense of it.   Scientists, for the most part, inhabit an abstract world, and the abstract can be dark. Indeed, the abstract draws a veil across meaning and the good science writer’s job is not to offer simplicity for the sake of analgesia, but rather to provide illumination. Good writing ensures the effective and economical transmission of meaning and does it simply. Simplicity encompasses clarity, brevity and above all metaphor. And metaphor brings with it focus.

© Allan Gaw 2016

 


I like a ruin. I enjoy piecing together the architectural puzzle in the stones that time has left behind. I get excited by standing alone in an abbey transept or in a castle kitchen and just listening to the past. I can hear the long-gone footsteps on flagstones and the crackle of logs in a fireplace; the shuffling of tired monks from their beds in the middle of the night on their way to prayer and the flurry of bread being kneaded and stew being cooked. The stones, no matter how battered and broken, still carry the fingerprints of those who clutched them and the smells of those who lived and died upon them. There is a sense of place in stone that can never be realised on the pages of any history book.

If you like a ruin too, try Jedburgh Abbey. There is a chain of sandstone edifices in various states of dilapidation across the Scottish Borders, but if you can only visit one, try Jedburgh. I say this not necessarily because it is the best or the most interesting or even in the most picturesque location, but rather because it is the one, for me at least, whose stones have the most to say.

There are of course the soaring walls and window traceries with long-lost stained glass. There are the ornate carvings, many admittedly worn and blunted by the passing years, but still sharp enough to make you sigh at the mediaeval stonemason’s craft. There are the remnants of the cloister, thoughtfully recreated with yew hedges, giving more than a suggestion of the contemplative sanctuary it once was. There are all these things and more, but there are also some things you don’t expect to see in a mediaeval church complex, and it’s the unexpected that often catches your breath and makes you think.

There is a small doorway just inside the great west entrance of the abbey leading to a stairway, which is steep and winding, and unfortunately too narrow for my claustrophobic head. I once had a very bad experience climbing the Scott Monument on Edinburgh’s Princes St involving an unfortunately overweight tourist, and spiral stone staircases have largely been out of bounds for me since. That, however, is a story for another day — back to Jedburgh.

The abbey stairway has a lintel that, at first sight, seems to have some scratching on it, perhaps even some Victorian graffiti, but which on closer inspection reveals itself to be Latin. In fact, this is not mediaeval church Latin but the real deal, for the lintel is a Roman altar stone praising a very different god from the one glorified in the rest of the abbey. ‘To Jupiter, best and greatest,’ the inscription begins, and I thought: how interesting. How many monks, I wondered, had walked under that stone while it supported the stairway above since it was placed there in around the year 1200? How many of them had taken the time to look up and, with their facility in Latin, noted the mention of the Roman god? How many of them could see the irony; how many had smiled?

The altar stone had been recycled and repurposed, as well as realigned in a religious sense. An object of pagan worship had found its way into the very heart of a Christian community and been used quite literally to keep the place standing. Although we will never know what those monks thought, the one thing it does tell us is that there was tolerance of its pagan provenance. And there’s a lesson to be learnt from that.

As we reuse and reconfigure what is left behind, we need to have some understanding and some tolerance for its original purpose. If we choose to rely on materials that already exist rather than what we can make anew, we need to accept the quirks of their original design and the echoes of their original intent. Thirteenth century monks knew this and had no problem reusing a perfectly serviceable piece of dressed stone, feeling no need to scrape it clean before incorporating it into their church.

And so it is with ideas. As we strive to make sense of our work, we call upon the thoughts of those who went before to help us. We read what they have written and study how they analysed their findings and presented their thoughts. Ideas are recycled and reused just as easily as sandstone. And with time and care we build new edifices on foundations laid by others, both in stone and in thought.

But, while we study the past we have to be forgiving. It is only with tolerance that we can reuse without recourse to destruction. Ideas that were formed and articulated in a different time do have to be understood in their context, and while their value might be immediately obvious, often their true worth may only be apparent after their original purpose is taken into account.

But there has not always been tolerance in Jedburgh. In the 16th century, reformists could not tolerate the very stones I was standing upon, and the resulting ruined half-abbey I now found myself in was a testament to that. As I strained to read the Latin inscription on that Roman altar stone, which to me epitomised a tolerance of history, I realised I was surrounded by equally stony evidence of intolerance. Like most others, this abbey had been vandalised and brought to ruin as part of the collateral damage from the Reformation.

Yet, in the midst of such intolerance, the Roman altar stone, which was already a thousand years old when it was installed, still bore the weight of the stairway above and still stood as a testament to the tolerance and the pragmatism of an earlier time.

I smiled as I beheld the irony in the stones, just as I am sure my mediaeval forebears had done before me, when they found Jupiter in their abbey.

© Allan Gaw 2016

 


When it comes to medicine, do you think theory or practical experience is more important? This is a question that is central to modern medical education, and it was a conundrum that exercised the mind of one of the most famous physicians in the ancient world, Claudius Galenus, or as we know him today, Galen.

Born in 129 CE, Galen grew up in the Roman Empire at the height of its power and his birthplace, Pergamon (now in modern day Turkey), was at the time a major cultural and intellectual hub. He received his medical training in several centres in addition to Pergamon, including Smyrna, Corinth and Alexandria. At the age of 28, he returned home from his travels and was appointed to the prestigious role of physician to the gladiators. This post allowed him to expand even further his knowledge of trauma and human anatomy. Four years later, Galen decided to travel to Rome where he found fame and success, eventually becoming the personal physician of the imperial family. He continued to practise, write and move in the highest tiers of Roman society until his death in around 216 CE.

Thus, even in his own lifetime, Galen was something of a medical superstar, but his writings were to be his legacy, for they would influence the way medicine would be taught and practised for the next 1,500 years. Indeed, Galen is thought to be the most prolific author of antiquity, producing around 600 books, of which only around one third survive.

In addition to being a practitioner, Galen was also a theoretician and the relative importance of the two was a theme in many of his writings. In one of his many books, On Medical Experience, Galen makes a number of arguments for the value of experience over theory, but perhaps most convincingly he uses two examples that are immediately familiar to readers. First he evokes a piece of hard-won experience to which most will relate — the hangover.

You know that men taken as a whole, of whatever type they may be, do not feel bound to examine into the nature of wine, but that they know perfectly well that too great indulgence in drinking wine is harmful.

In other words, you do not need too much theoretical understanding of the metabolic effects of alcohol on the human body to appreciate its toxicity, if you have ever drunk too much.

‘And so it is with mushrooms,’ he continues, as he draws upon another potential form of poisoning.

One finds that the learned man who discourses on the natures of things, knows their nature. But if any mushrooms are placed before him, he does not know which are edible and which are not, whereas the country-dwellers can distinguish between them since they are familiar with them and see them constantly, and even the children know them, to say nothing of their elders.

Just as with alcohol, when it comes to the toxic properties of fungi, theory is a poor teacher when compared to experience.

Galen concludes his argument for the superiority of practical experience over theory by stating:

And in short, we find that of the bulk of mankind each individual by making use of his frequent observations gains knowledge not attained by another… [E]xperience and vicissitudes have taught men this, and it is from their wealth of experience that men have learned to perform the things they do.

Despite these vivid examples, Galen was also able to present equally compelling arguments for the pre-eminence of theory over practice, and his works, though extensive and varied, draw upon both. Indeed, one of the threads running throughout his vast work is his equal recognition of the importance in medicine of theory and practice. He was a practitioner of medicine and surgery, but he was also a researcher, reliant on both the theoretical work of his forebears, such as Hippocrates, and his own experiential learning and experimental research. Although Galen placed great store in the Hippocratic writings, which were already more than 500 years old when he was studying them, he was keen not to take things at face value. Galen believed that the practice of medicine would advance through a combination of reason and experience, adding for good measure,

The surest judge of all will be experience alone, and those who abandon it and reason on any other basis not only are deceived but destroy the value of the treatise.

As one of his translators concludes: ‘In the ancient conflict between theory and practice, Galen wishes to lay claim to the best of both worlds.’

So, in answer to the question, ‘theory or practice?’ Galen would have said both, but he might also have asked you to consider whose advice you would take when deciding whether to eat a mushroom or not — one well-versed in theory but with no practical experience, or one who’s tried them before?

© Allan Gaw 2016

 


A burnt and twisted steel window frame lifted from the wreckage of the North Tower of the World Trade Center now in the Imperial War Museum, London

 

Our lives only become a story when we can look back and define the beginning, the middle and, most importantly, the end of anything. Life itself, the everyday to and fro of our existence, is not lived as a story. We are not characters in our own narratives while we are living them; we exist in the now, always in the now, and can have no real sense of where it began and where it will all end. The uncertainty of life is perhaps why we, as humans, cling so avidly to stories. The idea that a narrative can have a beginning, a middle and an end is deeply reassuring, especially when we can find no such structure in our own lives.

There is fear and uncertainty when we are actually living through anything, but there can also be excitement as well as dread, joy as well as emptiness. However, when we can look back and manufacture the story from the fragments of memories, and from the artefacts and photographs that seem to fill the gaps between those memories, we have a tale to tell of ourselves. Our memories are fashioned into a series of overlapping stories, but our existence is not. Being is not living in a story. The act of being is not the playing of a role, but living in a single moment that changes, often without our control, from second to second. Even a cursory glance back through the last minutes of our life can allow us to construct a story. But that story ceases to exist, can’t exist, while we are living it. The story simply evaporates when it comes into contact with the now.

We talk of mindfulness and relishing the moment we are living in. This is an attempt to stave off the anxiety of worrying about the unknowns of the unfixed future or the consequences of a painfully fixed past. It is in this moment where we should live, and where we should place our efforts and find ourselves. But we need to realise that this very moment is one that exists outside our story while we are living within it. As long as we are alive, the moment never passes as the twisted thread of time weaves through it and takes us along with it. The moment is always the same moment, from the second of our awakening consciousness till the dying of the light at the end of our lives. It is not a series of connected moments, but a single one. To live in that moment is not a choice but a necessity of consciousness. Mindfulness is not choosing to live in the moment — we have to do that as there is no alternative — but rather choosing to be aware of it.

On September 10, 2001 I arrived in New York in a thunderstorm. As I travelled in a taxi from Newark Airport I could see the looming towers of Manhattan against a wild sky. In a moment of almost biblical drama, a fork of lightning struck the mast atop one of the twin towers. Apparently it happened all the time, so my taxi driver said, but neither he nor I was to know that it would never happen again.

The following morning, while in a breakfast meeting in the basement of the Hilton Hotel in mid-town Manhattan, we received the news that was to make the world stop.

I had already checked out of my room, thinking that after my meeting I would make a leisurely trip to the Museum of Art before catching a taxi back to the airport for my flight home. I realised, however, that I was going nowhere; no one on the island of Manhattan was. The bridges were sealed and US fighter jets were ominously patrolling the sky. I went to reception to ask if I might extend my stay and check back in. The desk clerk sighed and assured me that would not be a problem — no one would be arriving to take my room. I stepped outside into what was a sunny morning with a perfect blue sky. The only evidence that the world was ending was the heavy smell of aircraft fuel that hung in the air. There was no noise, no dust, no sign of the horror that was unfolding 20 blocks away.

I have chequered memories of the next few days as we waited, watching the narrative unfold. I remember walking down the middle of a deserted Fifth Avenue in the afternoon to seek solace in the greenery of Central Park. I remember sleeping with my clothes on, in case I had to leave in a hurry. And I remember trying to phone home to let my wife and children know that I was alive. I knew they must be watching scenes of twisted wreckage and smouldering masonry on TV and I had to let them know I was safe, if not exactly sound. Immediately after the attack, there was no mobile phone service in Manhattan. This was undoubtedly in part due to the fact that the main transmitter masts were now lying in the streets of lower Manhattan, but also, I have since learned, because the authorities would have taken control of such means of communication to minimise further acts of terrorism. What did work were the landlines, but not to the UK. I dialled every number I knew and managed to wake my brother in the middle of the night in Melbourne, Australia. Surprised to hear my voice at such an hour, he asked if everything was alright. I answered simply that I was in New York. Unaware of what was happening, he said that was nice for me, and I urged him to get up, turn his TV on and then phone my family in Scotland.

But if there is one memory that encapsulates those twisted days, it was a moment in the hotel lobby. Like me, everyone in the hotel who had been there on Sept 11 was still there, locked down in the fortress that Manhattan had become. A flip chart in the reception area was being used by the hotel to provide updates regarding transport and in particular airports. That morning—it was perhaps the 12th or 13th—the lobby was full of people milling around, consulting the latest updates. Suddenly, as if the volume control had been turned down, the lobby hushed. Through the revolving door walked a young firefighter in his full protective gear, helmet and breathing apparatus. Like a sculpture he was completely white, covered from helmet to boot in concrete dust. He walked slowly, diagonally across the lobby, through a crowd of people that simply yielded to his presence and parted to create a path to the reception desk. He was clearly exhausted and he slumped on the desk as the clerk checked him in. Firefighters had been drafted in from all surrounding areas and after their shifts were being put up in the city’s hotels. From our perspective in mid-town this was our clearest view of the events that were unfolding so close to us.

On 9/11 I was frightened, and while I knew why, I was surprised. This was a stark moment where I was clearly living outside my narrative. The story of me was still unfolding and I was more than usually aware of it all. Fifteen years on, I still cannot watch the pieces of film that show the planes crashing into the towers. I know they exist, but whenever they are shown I manage to turn my head. But I can now look back and I can now tell the story, for it now has a beginning, a middle and an end. While I was living it, while in my moment, there was no reassurance of such, no sense of a story, and importantly no sense of an ending, happy or sad.

Our journey through time offers us many things, but perhaps the greatest of these is perspective. The cavalcade through a single moment that we call our existence is translated into a story we call our life. That narrative is important, but it is also an illusion; a story that we tell afterwards as if it was real, as if it represents what it was to be in that moment. In truth, the past is gone and the future never comes. All there is, and all there ever can be, is the moment of the present in which we live.

© Allan Gaw 2016

 
