Friday 15 March 2013

Preaching to the unconverted: or how to convey science to the devout

It's said that charity begins at home. Likewise, a recent conversation I had with a pious Mormon started me thinking: just how do you promote science, both the method and the uncomfortable facts, to someone who has been raised to mistrust the discipline? Of course, there is a (hopefully) very small segment of the human race that will continue to ignore the evidence even when it is presented right in front of them, but consider those on the front line - such as biology teachers and 'outed' atheists in the U.S. Bible Belt: how do you present a well-reasoned set of arguments to promote the theory and practice of science? 

It's relatively easy for the likes of Richard Dawkins to argue his case when he has large audiences of professionals or sympathetic listeners, but what is the best approach when endorsing science to a Biblical literalist on a one-to-one basis? The example above involved explaining just how we know the age of the Earth. This not being the first time I've been asked, I was fully prepared to explain the likes of uranium series dating, and not having to mention the 'D' words (Darwin or Dawkins) made this a relatively easy task. To aid any fans of science who might find themselves in a similar position I've put together a small toolkit of ideas, even if the conversation veers into that ultimate of controversial subjects, the evolution of the human race:
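Since that age-of-the-Earth conversation turns on radiometric dating, here is a minimal sketch of the decay-clock arithmetic behind it. The function name is mine, and it assumes no daughter isotope was present when the rock formed; real uranium-series dating is considerably more involved, but the principle is just this logarithm:

```python
import math

def radiometric_age(daughter_parent_ratio, half_life_years):
    """Age from the decay clock: t = ln(1 + D/P) / decay constant."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_parent_ratio) / decay_constant

U238_HALF_LIFE = 4.468e9  # years, for the U-238 -> Pb-206 chain

# A sample containing equal amounts of daughter and parent is
# exactly one half-life old:
print(radiometric_age(1.0, U238_HALF_LIFE))  # ~4.47 billion years
```

The point worth making to a sceptic is that nothing here is guesswork: measure the isotope ratio, and the age follows from the same exponential law that makes smoke detectors and nuclear medicine work.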
  1. A possible starting point is to be diffident, explaining the limitations of science and dispelling the notion that it is the catalogue of sundry facts it is sometimes described as (for example, in Bill Bryson's A Short History of Nearly Everything). It is difficult but nonetheless profitable to explain that once-accepted elements of scientific knowledge can be superseded by later theories, yet remain useful on a special-case basis. A good illustration of this is Newton's Law of Universal Gravitation, which describes the force of gravity but not what creates it. Einstein's General Theory of Relativity provides a solution, but Newton's Law is much easier to use, being accurate enough even to guide spacecraft. And since General Relativity cannot be combined with quantum mechanics, there is probably another theory waiting to be discovered…somewhere. As the British astrophysicist and populariser John Gribbin has often pointed out, elements at the cutting edge of physics are sometimes only describable via metaphor, there being nothing within human experience that can serve as a comparison. Indeed, no-one has ever directly observed a quark, and in the early days of the theory some deemed it just a convenient mathematical model. As for string theory, it's as bizarre as many a creation myth (although you might not want to admit that bit).
  2. Sometimes (as can be seen with Newton and gravity) the 'what' is known whilst the 'why' isn't. Even so, scientists can use partial theories to extrapolate potential 'truths' or even exploit them via technology. Semiconductors require quantum mechanics, a theory that no-one really understands; indeed, no less a figure than Einstein refused to accept many of its implications. There are many competing interpretations, some clearly more absurd than others, but that doesn't stop it being the most successful scientific theory ever in terms of the correspondence between the equations and experimental data. So despite the uncertainty - or should that be Uncertainty (that's a pun, for the quantum-mechanically minded) - the theory is a cornerstone of modern physics.
  3. As far as I know, the stereotype of scientists as wild-haired, lab-coated, dispassionate and unemotional beings may stem from the Cold War, when the development of the first civilisation-destroying weapons led many to point their fingers at the inventors rather than their political paymasters. Yet scientists can be as creative as artists. Einstein conducted thought experiments, often aiming for a child-like simplicity, in order to obtain results. The idea that logic alone makes a good scientist is clearly bunkum. Hunches and aesthetics can prove as pivotal as experimental data or equations.
  4. Leading on from this, scientists are just as fallible as the rest of us. Famous examples range from Fred Hoyle's belief in the Steady State theory (and, strangely, that the original Archaeopteryx fossils were fakes) through to the British scientific establishment's forty-year failure to recognise that the Piltdown Man finds were crude forgeries. However, it isn't always as straightforward as these examples: Einstein's greatest blunder - the cosmological constant - was abandoned after the expansion of the universe was discovered, only to reappear in recent years as a result of dark energy. And of course mistakes can prove more useful than finding the correct answer the first time!
  5. There are numerous examples of deeply religious scientists, from Kepler and Newton via Gregor Mendel, the founder of genetics, to the contemporary British particle physicist the Reverend John Polkinghorne. Unlike the good versus evil dichotomy promoted by Hollywood movies, it's rarely a case of us versus them.
  6. Although there are searches for final theories such as the Grand Unified Theory of fundamental forces, one of the current aspects of science that differs profoundly from the attitudes of a century or so ago is that there is the possibility of never finding a final set of solutions. Indeed, a good experiment should generate as many new questions as it answers.
  7. If you feel that you're doing well, you could explain how easy it is to be fooled by non-existent patterns and that our brains aren't really geared up for pure logic. It's quite easy to apparently alter statistics using left- or right-skewed graphs, or by using a logarithmic scale on one axis. In addition, we recognise correlations that just aren't there but which we would like to think are true; my Mormon colleague, for instance, was entrenched in the notion of UFOs as alien spacecraft! At this point you could even conduct an experiment: make two drawings, one of a constellation and one of evenly-spaced dots, and ask them to identify which one is random. Chances are they will pick the latter. After all, every culture has seen pictures in the random placements of stars in the night sky (or the face of Jesus in a piece of toast).
Image: Ursa Major (see what you like) versus evenly-spaced dots
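The constellation experiment can even be simulated. The sketch below (the setup and function name are my own illustration, not a formal statistical test) compares the closest pair of points in a genuinely random scatter with the closest pair on an evenly-spaced grid. The random set almost always contains a far tighter clump, which is exactly why it looks 'less random' to our pattern-hungry brains:

```python
import random

def min_nearest_neighbour(points):
    """Smallest distance between any pair of points - a crude clumpiness measure."""
    best = float("inf")
    for i, (x1, y1) in enumerate(points):
        for x2, y2 in points[i + 1:]:
            d = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            best = min(best, d)
    return best

random.seed(42)  # reproducible scatter
n = 10
scattered = [(random.random(), random.random()) for _ in range(n * n)]
grid = [(i / n, j / n) for i in range(n) for j in range(n)]

# Truly random points clump: their closest pair is much closer together
# than any pair on the evenly-spaced grid (whose spacing is exactly 0.1).
print(min_nearest_neighbour(scattered) < min_nearest_neighbour(grid))  # True
```

In other words, uniform spacing is the signature of design, and clumping the signature of chance - the precise opposite of most people's intuition.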

So to sum up:
  1. There's a fuzzy line at the cutting edge of physics and no-one understands what most of it means;
  2. We've barely started answering fundamental questions, and there are probably countless more we don't even know to ask yet;
  3. Science doesn't seek to provide comforting truths, only gain objective knowledge, but...
  4. ...due to the way our brains function we can never remove all subjectivity from the method;
  5. No one theory is the last word on a subject;
  6. Even prominent scientists make mistakes;
  7. And most of all, science is a method for finding out about reality, not a collection of carved-in-stone facts.
So go out there and proselytise. I mean evangelise. Err...spread the word. Pass on the message. You get the picture: good luck!

Wednesday 27 February 2013

An index of possibilities: is science prognostication today worthwhile or just foolish?

A few evenings ago I saw the International Space Station. It was dusk, and walking home with the family we were looking at Jupiter when a moving bright light almost directly overhead got our attention. Too high for an aircraft, too large for a satellite: a quick check on the Web when we got home confirmed it was the ISS. 370 kilometres above our heads, a one hundred metre long, permanently crewed construction confirmed everything I read in my childhood: we had become a space-borne species. But given that so few of the other scientific and technological advances I was supposed to be enjoying in adulthood have come to pass, has the literature of science prediction in these areas changed markedly as well?

It is common to hear nowadays that science is viewed as just one of many equally valid methods of describing reality. So whilst on the one hand most homes in the developed world contain a myriad of up-to-date high technology, many of the users of these items haven't got the faintest idea how they work. Sadly, neither do they particularly have any interest in finding out. It's a scary thought that more and more of the key devices we rely on every day are designed and manufactured by a tiny percentage of specialists in the know; we are forever increasing the ease with which our civilisation could be knocked back to the steam age - if not the stone age.

Since products of such advanced technology are now familiar in the domestic environment and not just in the laboratory, why are there seemingly fewer examples of popular literature praising the ever-improving levels of knowledge and application compared to Arthur C. Clarke's 1962 prophetic classic Profiles of the Future and its less critical imitators that so caught my attention as a child? Is it that the level of familiarity has led to the non-scientist failing to find much interest or inspiration in what is now such an integrated aspect of our lives? With scientific advance today frequently just equated with cutting-edge consumerism we are committing an enormous error, downplaying far more interesting and important aspects of the discipline whilst cutting ourselves off from the very processes by which we can gain genuine knowledge.

Therefore it looks as if there's something of an irony: non-scientists either disregard scientific prognostication as non-practical idealism ("just give me the new iPad, please") and/or consider themselves much more tech-savvy than the previous generation (not an unfair observation, if for obvious reasons - my pre-teen children can work with our 4GB laptop whilst my first computer had 48KB of RAM). Of course it's not all doom and gloom. Although such landmark experiments as the New Horizons mission to Pluto have gone largely unnoticed, at least by anyone I know, the Large Hadron Collider (LHC) and Mars Curiosity rover receive regular attention in popular media.

Perhaps the most regularly-occurring theme in science news articles over the past decade or so has been climate change, but with the various factions and exposé stories confusing the public on an already extremely complex issue, could it be that many people are turning their back on reading postulated technological advances as (a) technology may have greatly contributed to global warming; and (b) they don't want to consider a future that could be extremely bleak unless we ameliorate or solve the problem? The Astronomer Royal and former President of the Royal Society Martin Rees is one of many authors to offer a profoundly pessimistic view of mankind's future. His 2003 book Our Final Hour suggests that either by accident or design, at some point before AD2100 we are likely to initiate a technological catastrophe here on the Earth, and the only way to guarantee our species' survival is to establish colonies elsewhere as soon as possible.

But there are plenty of futurists with the opposite viewpoint to Rees and like-minded authors, including the grandly-titled World Future Society, whose annual Outlook reports are written with the aim of inspiring action towards improving our prospects. Most importantly, by including socio-economic aspects they may fare better than Arthur C. Clarke and his generation, whose space cadet optimism now seems hopelessly naïve.

One way near-future extrapolation may increase in accuracy is for specialists to concentrate on their own areas of expertise. To this end, many scientists and popularisers have focused on trendy topics such as nanotechnology, with Ray Kurzweil perhaps the best-known example. This isn't to say that there aren't some generalist techno-prophets still around, but Michio Kaku's work along these lines has proved very mixed in quality, whilst the BBC Futures website is curiously old school, with plenty of articles on macho projects (e.g. military and transport hardware) that are mostly still in the CAD program and will probably remain that way for many years to come.

With so many factors influencing which science and technology projects get pursued, it seems worth asking whether even a little knowledge of current developments might serve prognostication as well as in-depth scientific knowledge, with luck playing the primary role. One of my favourite examples of art-inspired science is the iPad, released to an eager public in 2010 some twenty-three years after the fictional PADD first appeared on Star Trek: The Next Generation (TNG) - although ironically the latter is closer in size to non-Apple tablets. In an equally interesting reversal, there is now a US$10 million prize on offer for the development of a hand-held Wi-Fi health monitoring and diagnosis device along the lines of the Star Trek tricorder. No doubt Gene Roddenberry would have been pleased that his optimistic ideas are being implemented so rapidly; but then even NASA have at times hired his TNG graphic designer!

I'll admit that even I have made my own modest if inadvertent contribution to science prediction. In an April Fools' post in 2010 I light-heartedly suggested that perhaps sauropod dinosaurs could have used methane emissions as a form of self-defence. Well, not quite, but a British study in the May 2012 edition of Current Biology hypothesises that the climate of the period could have been significantly affected by dino-farts. As they say, truth is always stranger than fiction…