The Sacrifice of Ignorance


Knowledge is bought at the price of ignorance. We must surrender one for the other. A simple transaction, you may think, and one we should be ready to make. However, the loss of ignorance, and the innocence that so often accompanies it, may be more of a wrench than we expect.

Some argue that things have gone too far, that science has wrought a new kind of havoc and that we should resist so-called advances with a neo-Luddite fervour. A simpler world can be conjured in our minds; one without dilemma and expectation and an overload of information, where the ills that currently befall the planet would have been averted. A world where we live simply – simply getting on with it. This is, of course, a fantasy. Life was never simple – it has always been hard, without mercy and often cruel. We were given, or we evolved, our brains – the most sophisticated computers on Earth – to survive. That survival depended on us being able to discover and invent and change. To imagine a pre-scientific world as a better place is to ignore some basic facts. To regard our present day as somehow soiled by innovation and invention and discovery does a gross disservice to many great minds and to the very essence of what it means to be Human.

Our lives before antibiotics were not better. Our hunger before innovations in agriculture was not noble. Our fears of the dark, the unknown and the supernatural were intense and should not be forgotten.

Giving up the benefits of scientific discovery is not a solution; it is just the formulation of a different problem. When we acquire new knowledge we may feel that we have advanced, that we have become better. Alternatively, we may feel that such new knowledge will change us and our societies adversely, and that it should be resisted, even outlawed.

A modern-day example of this may be the continuing controversy over stem cell research. Such research holds much promise to help those with diseases that have, until now, resisted cure. But, stem cell research also poses significant ethical dilemmas for some, to the extent that any potential benefits are, in their eyes, overshadowed by the damage it does to us as Humans with religious and ethical sensitivities. Would it be better to shelve such research? Would it be right to abandon any attempt to find a compromise that would satisfy the zeal for scientific advance on the one hand and a conservatism born of morality on the other? Such debates often drive our scientific agenda, but they are not new.

Galileo was a scientist who watched the stars and the planets. He concluded, through careful observation, that the Earth revolved around the Sun and not, as had been held as the truth, vice versa. He came into conflict with the Church, and in his day that was about as bad as it could get. In temporal matters the Church represented a virtually totalitarian regime holding sway over life and death, and in spiritual matters it held all the cards, deciding not just on your fate in this life, but for all eternity. Galileo was asked to recant – to admit he was wrong – and he did, because he wanted to live.

An interesting counterfactual world can be imagined if this attitude of an authoritarian church had been allowed to persist in its repression of scientific discovery. Kingsley Amis, in his novel “The Alteration”, describes a twentieth-century world where the Reformation has never taken place, because one death has been avoided: that of Prince Arthur, Henry’s elder brother and first husband of Katherine of Aragon. No death, no remarriage for Katherine, no accession by the second son to the throne of England as Henry VIII and no Anne Boleyn. In Amis’ imagined world London is still a semi-rural landscape, there has been no industrial revolution, and electricity has never been harnessed for power. A simpler world, yes – but one where children still die of simple infections and men and women toil in fields with only horses and oxen to help. Not only is there no Internet, there are no books. There is little education and no possibility of turning wonder into discovery. Is that where we want to live?

As Humans we have, it is believed by some, a hard-wired morality. We can distinguish right from wrong from a very early age, and the power of what we ought to do often overwhelms us. Part of that “ought”, I would argue, is to move forward. We are designed for discovery. We ask questions and work out how to find answers. We are designed to walk upright and look up at the stars, not to scrabble in the dirt. We have the capacity to ask for more and work out where to find it. We face our problems by finding their solutions. In short, to deny progress is to deny us our birthright.

Yes, moving forward sometimes takes us into unfamiliar territory. New discoveries raise new spectres as well as provide new hope. As we saw above, we face fresh challenges in medical ethics with the discovery of new techniques and cures. Advances in physics give us new computing powers, but also the bomb. Chemistry provides poisons as well as drugs, and Engineering weapons as well as harvesters. I began this blog with the thought that the price of knowledge was a loss of ignorance and innocence. Perhaps it is more than that. Perhaps discovery is bought at the price of our humanity. Perhaps with every step forward we also step down, and our advance is nothing more than a descent. It was Darwin who coined the term “The Descent of Man” in the 19th century. And it was the polymath Bronowski who preferred the opposite when he created his landmark television series, “The Ascent of Man”, in the 1970s. Our scientific discovery undoubtedly takes us forward, but the question now seems to be: does it simultaneously take us down or does it raise us up?

I have no doubt on this point.  Our field of vision has expanded and our horizons have broadened as we have toiled to achieve the scientific discoveries that have shaped our modern world.  We are higher and we see further now than ever before, not just because we have climbed, but because we have climbed on to the shoulders of giants.  The sacrifice of ignorance and innocence and a simplicity of life that exists only in nostalgia is a very small price to pay for such a view.

© Allan Gaw


The Business of Discovery

Discovery is a rather magical word, don’t you think?  It evokes voyages on high, uncharted seas, the delving into ancient, dusty libraries and the witnessing of previously unseen marvels at the end of a microscope’s lens.  Discovery is my business; indeed it is all our business, for every day we search and seek and try to uncover what is there, but hidden, in many aspects of our lives.  In science, this process of discovery is more rigorous, but no less enchanting and it is scientific discovery that is my business.  Or at least it was my business. Now my aim is to show others how to go about the business of finding out.

When it comes to scientific research, there are rules and acceptable behaviours. Some revel in their ability to break these rules, behaving badly in the headlines and thumbing their noses at convention. But, in science the rules are there for more reason than simply to straighten the fences and corral our thinking. We are operating in a world of creativity and possibility, but one where, if we ignore the rules, we do not discover; we only think we have. Scientific discovery is about revealing the truth through a process that does its best to ensure we don’t get it wrong. Rather like the archaeologist, we have evolved a simple set of tools to scrape and brush away the layers of confusion in order to reveal the answer. Our most important tool is the concept of “control”.

So often we read headlines about how a drug has caused a cancer, how a mining operation has caused an earthquake, how a century of burning fossil fuels has caused changes in our weather.  These stories are convincing because they appeal to our basic human need to find the link between cause and effect.  Something happens and then something else happens, so the first event must have caused the second.  Our primitive ancestors watched the sun rise and set and tended animals and plants.  If they forgot to pray to the gods and the next day their new calf died, it was obvious what had happened.  If they had not made an appropriate sacrifice and there was no rain to drench the crops for the next season, the cause was clear.  We like to think we are more sophisticated, but in truth we have the same brains as our forefathers, and are still dangerously vulnerable to the fallacy of post hoc ergo propter hoc – after it, therefore because of it.  If we are to discover the truth of cause and effect – e.g. did the drug really cause the cancer – we must resist this way of thinking and find a new and perhaps less intuitive approach.  Let us look at an example.

If 10 people have developed a rare form of cancer we will look for reasons. In our investigations we will first look for characteristics they share: where do they live, what do they eat, what drugs have they been taking? The first piece of common ground we discover may be the cause, but equally it may not; it may merely be a coincidence. Scientific method has had to build a system to protect against the power of coincidence. What if these ten unfortunate people were all receiving the same drug because they had all been suffering the same symptoms related to their cancer, a cancer they had developed before they had ever been prescribed the drug? But, if the presumed causal association between the drug and the cancer persists in the minds of others, how can we prove it one way or the other? We can employ a range of observational tools. Other people will have received the drug in question; how many of them have developed this form of cancer? If we examine their medical records we may quickly conclude that very few, or even no, other such cancers have been diagnosed. Is this conclusive proof? No, for critics may rightly say that perhaps these individuals have simply not been diagnosed yet, or that their exposure to the “cancer-causing drug” has not been sufficiently long. The critics may also present more elaborate arguments suggesting that the cancer only develops in a person taking the drug who is exposed to one or more other factors, which our group of ten has not shared. What next? The definitive way to test the hypothesis – for that is what we now have, a hypothesis, which states that the drug causes the cancer – is to conduct a randomised controlled trial or RCT. Here we will select a group of people and we will randomly assign half to receive the drug and the other half to receive a placebo that looks, smells and tastes exactly like the drug.
We will not reveal to the participants in the study or to the professionals caring for them which form of treatment each has been given – in other words, we will “blind” each to their treatment. When both study participants and professionals are blinded to the treatment allocations we refer to such studies as “double-blind”. The purpose of blinding is to eliminate as much bias as possible. We will then follow the study participants, checking them at regular intervals for signs of the cancer in question. After an appropriate period of time, and if we have studied enough people, we will be able to compare the number of newly diagnosed cancers in the drug-treated group with that in the placebo-treated group. Only if the number in the drug group is significantly higher can we confidently conclude that the drug has a causal role in the development of the cancer. The development of this methodology is regarded by many as one of the most important advances in medicine in the 20th century. Without RCTs we are operating in the half-light, with only observational studies and anecdote to guide us. The scientific rigour applied within the RCT allows us to “know” where before we could only “wonder”.
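The allocation and comparison at the heart of an RCT can be sketched in a few lines of Python. This is only an illustration of the principle, not trial software – the participant labels, random seed and list of diagnosed cases below are all hypothetical:

```python
import random

def randomise(participants, seed=0):
    """1:1 random allocation of participants to a 'drug' or 'placebo' arm."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"drug": set(shuffled[:half]), "placebo": set(shuffled[half:])}

def count_cases(arms, cases):
    """Count newly diagnosed cancers in each arm at the end of follow-up."""
    return {arm: len(members & cases) for arm, members in arms.items()}

# A hypothetical cohort of 100 participants.
participants = [f"P{i:03d}" for i in range(100)]
arms = randomise(participants, seed=42)

# Hypothetical follow-up data: participants diagnosed during the trial.
cases = {"P007", "P019", "P055"}
print(count_cases(arms, cases))
```

Because the allocation is random, any factor other than the drug – diet, geography, prior illness – should be spread evenly across both arms, which is exactly what lets us attribute a significantly higher case count in the drug arm to the drug itself.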

But, RCTs are not without their problems. The example given above, while scientifically sound, may present significant ethical problems. If we really believe that a drug might cause cancer, is it right to expose a group of people to that drug only to satisfy our curiosity? Of course, the answer would be “no”, but it is not our curiosity that is at stake here. If the drug has important clinical effects that could benefit many patients, we must be sure when using it that we are not unwittingly causing more harm than good. The only way we can prescribe with any degree of confidence is if we have the data to back it up, and the highest quality data will be derived from a well-conducted RCT.

Discovery is wonderful, but it is also difficult.  Scientific research must be conducted rigorously, ethically and in accordance with many forms of legislation and regulation.  Once conducted, the findings of the research must be communicated effectively to our scientific communities and to the wider public.

Those who wonder need to be taught how to channel that sense: we need to learn how to ask the right questions, how to design strategies to answer those questions, how to conduct our studies with rigour, and within the law, and how to tell the world of our discovery.  A cycle of discovery that begins with a thought, a question, a moment of wonder should end with a revelation that we can all share.  For discovery is for us all: it has allowed us to feed our hungry, light our cities, cure our diseases and it has allowed us to walk from our caves and step on the Moon.

My business is discovery and this blog is about my part in that business – teaching what I have learned to the next generation of those who wonder.

© Allan Gaw

My books are available on Kindle.
