Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned. This was how rapidly Sandra Bullock's character acclimatised to the several space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the station and spacecraft cabins have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that some huge, fundamental errors are propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope, technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes the mass so that the polar diameter is less than the equatorial diameter. Earth is only about 0.3% flatter along its polar axis, but if you look at a photograph of Saturn you can see a very obvious squashing.

  2. Continental drift is the same thing as plate tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time whilst the latter is the mechanism now accepted to account for this, with the Earth's crust (strictly, the lithosphere) moving over the slowly-deforming mantle in large segments or plates.

    Alfred Wegener (a meteorologist by training) suggested the former in 1912, but it was largely pooh-poohed until the discovery of sea-floor spreading half a century later provided the mechanism. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately this confuses weather with climate: chaos theory shows that even the minutest of initial changes can cause major differences in outcome, which is why weather forecasting is far from an exact science, whereas climate is the long-term trend beneath that short-term noise.

    However, there are other lines of evidence for the validity of the theory, fossil fuel lobbyists and religious fundamentalists aside. I haven't read anything to verify this, but off the top of my head I would suggest that the warm water currently travelling north-east across the Atlantic from the Gulf of Mexico (which prevents north-western Europe from having winters as cold as those on Canada's eastern seaboard) could be disrupted if glacial meltwater were to interfere with the circulation that drives it. In that case the Isles of Scilly off the Cornish coast might face as frosty a winter as the UK mainland!

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a convincing hypothesis for the mechanism.

    This isn't to say that there weren't attempts to provide one, it's just that none of them fit the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of genetics research that led to the modern evolutionary synthesis that has natural selection at its core.

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The confusion stems from the equivalence of gravity and acceleration, an idea worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts have been - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further from the Earth's surface than the International Space Station (ISS), yet it is kept in orbit by the Earth's pull (okay, so there is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would cause it to fly off into deep space. A rough calculation follows at the end of this list.

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft, a.k.a. the 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there are also the terms 'microgravity' and 'weightlessness', but as it's almost Christmas we'll save those for another day.
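
To put a rough number on 'definitely not zero gravity', here is a minimal sketch using Newton's law of gravitation with standard values for the Earth's mass and radius; the station altitude of roughly 410 km is my own round figure:

    import math

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.972e24     # mass of the Earth, kg
    R_EARTH = 6.371e6      # mean radius of the Earth, m
    ISS_ALTITUDE = 4.1e5   # roughly 410 km, in metres

    r = R_EARTH + ISS_ALTITUDE
    g_surface = G * M_EARTH / R_EARTH**2
    g_station = G * M_EARTH / r**2
    v_orbit = math.sqrt(G * M_EARTH / r)   # speed needed for a circular orbit

    print(f"g at the surface:      {g_surface:.2f} m/s^2")
    print(f"g at station altitude: {g_station:.2f} m/s^2 ({g_station / g_surface:.0%} of surface)")
    print(f"circular orbital speed: {v_orbit / 1000:.1f} km/s")

Gravity at station height comes out at close to ninety per cent of its surface value; the astronauts feel weightless only because they and their station are falling together at almost eight kilometres per second.
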
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off appreciating that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Sunday 30 November 2014

Consumer complexity: engineering the public out of understanding

Last weekend my car stopped working. If a little knowledge is a dangerous thing, then an hour of internet research is probably worse. I had convinced myself it was either the transmission or the gearing; it turned out to be a lack of petrol, the fuel gauge and warning light having simultaneously failed. At this point - breathing a sigh of relief that I wasn't facing an enormous repair bill so soon after an annual service - I realised that my knowledge of cars is extremely limited, despite having driven them for almost thirty years.

Obviously I'm far from being unique in this respect. In years past New Zealanders in particular were renowned for maintaining old cars long after other developed nations had scrapped them, with Australians referring to their neighbour as the place where Morris Minors went to die. However, anti-corrosion legislation put an end to such 'canny Kiwi' tinkering, so the country has presumably lost this resourcefulness when it comes to keeping ancient vehicles on the road.

Of course cars just aren't built to last any more: modern vehicles continue to be ever more fuel efficient and built of lightweight materials, but I suspect few will last as long as the classic cars still running after half a century or more. Built-in obsolescence is partly to blame, but the sophistication of today's designs means that their repair and maintenance is becoming ever more difficult without a complete workshop and diagnostic computer. As a teenager I learnt how to change my car's spark plugs but have since been told this should now only be undertaken by professionals, as the tolerances required cannot be achieved by hand!

It isn't just motor vehicles that are affected by ever increasing complexity: high-tech consumer gadgets, especially those with integrated circuits (which, let's face it, is most of them these days) are seemingly built to prevent tampering or repairs by the end user. Yet this is a fairly recent phenomenon. In my grandparents' generation the most sophisticated item in their house was likely to be a radio that used vacuum tube technology, but a cheaper alternative was available in the form of a do-it-yourself galena or pyrite crystal radio. Even children - Arthur C. Clarke amongst them - were able to build these self-powered devices, which worked rather well except for the fact that they had no speaker and so the user had to listen via headphones. It might seem unlikely that such a device was easy to construct, until you remember that pioneer aircraft were built by bicycle manufacturers!

In contrast, the most advanced technological item my parents would have had until their twenties - when television sets started to become affordable - would have been a mass-produced transistor radio. Unlike the valve-filled sideboard gramophone, these radios had simple problems, such as loose wires, that could be repaired with basic tools: small screwdrivers, needle-nose pliers and a low-wattage soldering iron. Whilst requiring a bit of skill and some understanding of wiring, such repairs were still within the reach of many consumers.

Today, my experience suggests that the expendable consumerism that first became overt in the late 1960s is a key mindset in developed nations, with do-it-yourself work on gadgetry largely absent. In fact, it is frequently cheaper to buy a replacement item than to have it repaired, or to purchase the tools needed to attempt those repairs yourself. The speed with which newer models are released is such that it may even prove impossible to source a replacement part only a few years after the item has been purchased. This inevitably increases our distance from the inner workings of the ever more numerous high-tech consumer gadgets we now surround ourselves with. Surely it is a great irony that despite our ability to operate all of them, the vast majority of users have little idea of the fundamentals of the technologies involved?

My own experience with attempting to fix consumer electronics is rather limited, but I can see that manufacturers are deliberately trying to prevent such repairs by using techniques such as hiding screw heads and using one-way pins, ensuring that any attempt to dismantle an item will snap parts within the casing. Additionally, the more sophisticated the technology, the more sensitive it seems to be. An example from a rather different sphere of activity comes from 1976, when a defecting Soviet Air Force pilot delivered a state-of-the-art fighter jet into the hands of Western intelligence. The MiG-25 'Foxbat' was discovered to be using valve-based rather than solid-state avionics, yet despite its primitive appearance the electronics were both extremely powerful and able to withstand immense physical stress, which is obviously of great importance in such aircraft.

Back to household gadgetry: I've seen an old cathode ray tube television repaired after water was accidentally tipped down the back of it, whilst flat screen computer monitors that were inadvertently cleaned with water - not by me, I hasten to add - were sent straight to the scrap heap. That isn't to say that there aren't a few brave souls who post internet videos on how to disassemble devices such as iPads in order to fix hardware issues, but I think you would either have to be very confident or quite rich before attempting such repairs. There are also websites dedicated to technology hackers, who enhance, customise or otherwise amend consumer gadgets beyond their out-of-the-box capabilities. Again, I don't have the confidence for this sort of thing, especially since there are hidden dangers: a digital camera, for example, contains a flash capacitor that can store a charge at several hundred volts - and deliver it to the unwary. Ouch!

So the next time someone declares their bewilderment with the ever-widening array of consumer gadgetry, or bores you with a piece of New Age nonsense, you should remember that although we are surrounded by some extremely sophisticated devices, various causes have conspired to remove insight into their inner workings. Our consumerist age is geared towards acceptance of such items whilst limiting our involvement to that of end user. And of course I haven't even mentioned the ultimate fundamentals behind all this integrated circuitry, quantum electrodynamics...

Tuesday 28 October 2014

Sandy strandings: the role of contingency in the beach biosphere

At irregular intervals over the past fifteen years I've been visiting the east coast beaches of New Zealand's Northland between Warkworth and Paihia. Although it's frequently good territory for finding shallow marine fauna via rock pools or along the tideline, a recent visit was enhanced by exciting finds unique in my experience. I usually expect to see the desiccated remains of common species such as sand dollars, scallops, whelks and assorted sea snails, but coastal storms just prior to my arrival brought an added bonus. Two days of exploration along three beaches was rewarded with a plethora of live - but presumably disorientated - creatures such as common sea urchins (Evechinus chloroticus) and large hermit crabs (Pagurus novizealandiae), along with some recently-deceased 5- and 7-arm starfish. As you might imagine, several species of seabird, notably terns and gulls, were having a gastronomic time of it with all these easy pickings.

At the nearby Goat Island Marine Discovery Centre run by the University of Auckland I told our marine biologist guide about my two daughters' attempts to save some of the homeless hermit crabs from the gulls by offering suitable shells as new abodes. The biologist responded with a story of a visitor who had thrown live starfish back into the water after a mass stranding. Someone else commented that his actions wouldn't make a difference; our guide said that as he continued throwing them, the man replied "It made a difference to that one...and that one...and that one..."

Sea urchin

Common sea urchin (Evechinus chloroticus)

Of course we cannot hope to make much of a difference with such good intentions: nature, after all, is essentially immune to human morality and empathy, with survival at a genetic level the only true sign of success. But do small-scale events whose aftermath I recently experienced - in this case a few days of stormy weather and the resultant strandings - have any long-term effects on the local ecosystem?

Apart from a mass marooning of the large barrel jellyfish Rhizostoma pulmo on a North Wales beach around thirty years ago, I haven't experienced anything similar before. But then until three years ago I didn't live near the sea, so perhaps that's not surprising! There are fairly frequent news stories from around the world about mass whale or dolphin beachings put down to various causes, some man-made such as military sonar. But it is because these events involve animals larger than humans that they make it onto the news: for smaller creatures such as the crabs and urchins mentioned above, there are unlikely to be any widely-disseminated stories.

7 arm starfish

Australian southern sand star (Luidia australiae)

It may seem improbable that the balance between organisms could be profoundly altered by local events, but it should be remembered that a few minor outside influences over the course of less than a century can wipe out entire species. The story of how a single cat was responsible for the demise of the Stephens Island wren around the start of the Twentieth Century is an oversimplification of events, but it still shows how quickly small causes can have drastic effects; and there is evidence that current human activity is inadvertently causing regional change.

One well-known recent illustration is from the Sea of Cortez, where too much game fishing, especially of sharks, may have led to the proliferation of a new top predator, the rapidly spreading Humboldt squid. Estimates suggest that the current population in the region is over 20 million individuals (which suits the local squid-fishing industry just fine), which is extraordinary considering none were known in the region before about 1950. Two-metre squid may not sound menacing compared to sharks, but the Humboldt squid is a highly-intelligent pack hunter with a razor-sharp beak and toothed suckers on its tentacles, so diving amongst them is probably not for the faint-hearted.

The TV series Cosmos: A Spacetime Odyssey contained a good introduction to the five mass extinctions of the past 450 million years, but it isn't just these great dyings or even El Niño that can upset ecosystems; we may find out too late that relatively minor, local changes are able to trigger a chain reaction at a far wider level. The evolutionary biologist Stephen Jay Gould repeatedly emphasised the importance of historical contingency and the impact of unpredictable, ad-hoc events on natural history. The modern synthesis of evolutionary biology includes the notion that speciation can result from isolation of a population within an 'island'. This latter differs from the strictly geographical definition: a lake, or even an area within a lake, can be an island for some species. If, for example, local changes cause a gap in the ecosystem, then this gap might be filled by an isolated population with the 'fittest' characteristics, in the sense of a jigsaw piece that fits the relevant-shaped hole.

Hermit crab

Hermit crab (Pagurus novizealandiae)

Back to the beach. American marine biologist Rachel Carson's 1951 award-winning classic The Sea Around Us contains an early discussion of the recycling of nutrients within the oceans, but we are now aware that the sea isn't remotely self-contained. My favourite example of an intricate web of land, sea and even aerial fauna and flora centres on the Palmyra Atoll in the Pacific Northern Line Islands. Various seabirds nest in the atoll's high trees, their nutrient-rich guano washing into the sea where it feeds plankton at the base of the offshore food chain. The plankton population feeds larger marine fauna, with certain fish and squid species in turn providing meals for the seabirds, thus completing the cycle. Such a tightly-knit sequence is likely to undergo major restructuring of population densities if just one of the players suffers a setback.

I appear to have followed Stephen Jay Gould's method of moving from the particular to the general and may be a little out of my depth (okay, call it a feeble attempt at a pun), but it certainly gives food for thought when local shallow marine populations appear to suffer after only a few days of mildly inclement weather. If there's a moral to any of this, it's this: if natural events can affect an ecosystem in such unpredictable ways, what havoc could we be causing with our pesticide run-off, draining of water tables, high-energy sonar, over-fishing and general usage of the oceans as a rubbish dump? The details may require sophisticated mathematics, but the argument is plain for all to see.

Wednesday 10 September 2014

Mythbusting: bringing science into the arena

My elder daughter is a big fan of the Discovery Channel show Mythbusters, whose presenters have spent eleven years testing myths (and not a few Hollywood set pieces) via science, technology, engineering and frequent resort to high explosives. Therefore, as a birthday treat I recently took her to the live Behind the Myths tour, fronted by Mythbusters hosts Adam Savage and Jamie Hyneman. Considering how macho the series frequently is - its only female presenter, who has since left the show, is a vegetarian who was made to eat live bugs - it was interesting to see what science would be presented live, and how.

In some respects it lived up to its reputation, with the hosts apologising for the lack of on-stage explosions but claiming their intention was to 'blow the mind' instead of, say, a pick-up truck or hot water cylinder. That's not to say that there weren't some fiery moments, including several montages of explosions and the infamous paintball machine gun aimed at someone wearing a suit of replica armour. Considering a large percentage of the audience consisted of pre-teens with their parents, the big bang elements were very much appreciated. But since the presenters have a special effects rather than a science background, was there anything worthwhile beyond the showmanship?


Apart from a brief introduction to Newton's Second Law of Motion (force equals mass times acceleration, in case you weren't sure) there wasn't much of the classroom about the show. Except that for two hours Hyneman and Savage managed to painlessly convey a lot of scientific ideas. Examples included:
  • Archimedes' quote about using a lever to move the world was demonstrated via a fairground high striker and different sized mallets (the lever arithmetic is sketched just after this list);
  • Perception, thanks to a point of view camera and some comedic cheating;
  • Tessellation and human mechanics, with four interlocked reclining men able to support their own weight when their chairs were taken away;
  • Friction via a circus-like stunt, in which Savage was lifted high above the stage thanks to the strength of interwoven telephone directories.
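
The lever demonstration boils down to the moment-balance rule: effort times effort arm equals load times load arm. A minimal sketch with made-up illustrative numbers (nothing here is measured from the show):

    # Ideal lever: effort * effort_arm = load * load_arm
    def effort_needed(load_newtons, load_arm_m, effort_arm_m):
        return load_newtons * load_arm_m / effort_arm_m

    load = 700.0  # a roughly 70 kg 'world' to move, expressed in newtons
    for effort_arm in (0.5, 2.0, 10.0):
        force = effort_needed(load, load_arm_m=0.25, effort_arm_m=effort_arm)
        print(f"effort arm {effort_arm:4.1f} m -> {force:6.1f} N needed")
    # The longer the effort arm, the smaller the force required - the heart of
    # Archimedes' boast about moving the world.
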
Although it might be quite easy to lose sight of the science behind all the razzmatazz, perhaps that was the point. These demonstrations reminded me of the Royal Institution's Christmas lectures, aimed primarily at 'young people' and barely a decade shy of being two hundred years old. Unlike the television series, which has sometimes revisited experiments - occasionally reversing the original results in the process - the Behind the Myths tour was more a solid grounding in basic physics, with a little chemistry and biology thrown in. If anything, the most obvious outcomes would be to promote curiosity by recognising that science is deeply embedded in everyday life, and that exploring reality can be enormous fun.

The first section of the show had Adam Savage demonstrating juggling whilst explaining how he taught himself the technique. Since his recollection discussed patience, perseverance and learning from your mistakes, you could say he was presenting in microcosm key elements of the scientific enterprise, 'eureka' moments excepted.

I'm uncertain how many in the audience would cotton on to the science-by-the-backdoor aspect of the show. If anything, the children present may be more likely to want a career in movie special effects than in science, but the sense of wonder it generated may have also rubbed off on the adults present. Hyneman and Savage have become well-known enough in their support of STEM subjects and dislike of woolly thinking (take note, Discovery Channel, home of Finding Bigfoot) to have spoken at the 2006 annual convention of the US National Science Teachers Association, as well as presenting a demonstration to President Obama. That's no mean feat for a couple of special effects technicians with no formal science training. Let's hope that some of the audience see beyond the whizz-bangs into the wonderful world that scientific exploration offers!

Saturday 16 August 2014

The escalating armoury: weapons in the war between science and woolly thinking

According to that admittedly dubious font of broad knowledge Wikipedia, there are currently sixteen Creationist museums in the United States alone. These aren't minor attractions for a limited audience of fundamentalist devotees either: one such institution in Kentucky has received over one million visitors in its first five years. That's hardly small potatoes! So how much is the admittance fee and when can I go?

Or maybe not. It isn't just the USA that has become home to such anti-scientific nonsense either: the formerly robust secular societies of the UK and Australia now house museums and wildlife parks with similar anti-scientific philosophies. For example, Noah's Ark Zoo Farm in England espouses a form of Creationism in which the Earth is believed to be a mere 100,000 years old. And of course in addition to traditional theology, there is plenty of pseudo-scientific/New Age nonsense that fails every test science can offer and yet appears to be growing in popularity. Anyone for Kabbalah?

It's thirty-five years since Carl Sagan's book Broca's Brain: Reflections on the Romance of Science summarised the scientific response to the pseudo-scientific writings of Immanuel Velikovsky. Although Velikovsky and his bizarre approach to orbital mechanics - created in order to provide an astrophysical cause for Biblical events - have largely been forgotten, his ideas were popular enough in their time. A similar argument could be made for the selective evidence technique of Erich von Daniken in the 1970s, whose works have sold an astonishing 60 million copies, and to a lesser extent the similar approach of Graham Hancock in the 1990s. But a brief look at that powerhouse of publishing distribution, Amazon.com, shows that today there is an enormous market for best-selling gibberish that far outstrips the lifetime output of a few top-ranking pseudo-scientists:
  • New Age: 360,000
  • Spirituality: 243,000
  • Religion: 1,100,000
  • (Science 3,100,000)
(In the best tradition of statistics, all figures have been rounded slightly up or down.)

Since there hasn't exactly been a decrease of evidence for most scientific theories, the appeal of the genre must be due to changes in society. After writing off the fundamentalist/indoctrinated as an impossible-to-change minority, what has led to the upsurge in popularity of so many publications at odds with critical thinking?

It seems that those who misinterpret scientific methodology, or are in dispute with it due to a religious conviction, have become adept at using the techniques that genuine science popularisation utilises. What used to be restricted to the printed word has been expanded to include websites, TV channels, museums and zoos that parody the findings of science without the required rigorous approach to the material. Aided and abetted by well-meaning but fundamentally flawed popular science treatments such as Bill Bryson's A Short History of Nearly Everything, which looks at facts without real consideration of the science behind them, the public are often left with little understanding of what separates science from its shadowy counterparts. Therefore the impression of valid scientific content that some contemporary religious and pseudo-science writers offer can quite easily be mistaken for the genuine article. Once the appetite for a dodgy theory has been whetted, it seems there are plenty of publishers willing to further the interest.

If a picture is worth a thousand words, then the 'evidence' put forward in support of popular phenomena such as an ancient alien presence or faked moon landings seems all the more impressive. At a time when computer-generated Hollywood blockbusters can even be replicated on a smaller scale in the home, most people are surely aware of how easy it is to be fooled by visual evidence. But it seems that pictorial support for a strongly-written idea can resonate with the search for fundamental meaning in an ever more impersonal technocratic society. And of course if you are flooded with up-to-the-minute information from a dozen sources then it is much easier to absorb evidence from your senses than to unravel the details from that most passé of communication methods, boring old text. Which perhaps fails to explain just why there are quite so many dodgy theories available in print!

But are scientists learning from their antithesis how to fight back? With the exception of Richard Dawkins and other super-strict rationalists, science communicators have started to take on board the necessity of appealing to hearts as well as minds. Despite the oft-mentioned traditional differentiation from the humanities, science is a human construct and so may never be purely objective. Therefore why should religion and the feel-good enterprises beloved of pseudo-scientists hold the monopoly on awe and wonder?

Carl Sagan appears to have been a pioneer in the field of utilising language that is more usually the domain of religion. In The Demon-Haunted World: Science as a Candle in the Dark, he argues that science is 'a profound source of spirituality'. Indeed, his novel Contact defines the numinous outside of conventional religiosity as 'that which inspires awe'. If that sounds like woolly thinking, I'd recommend viewing the clear night sky away from city lights...

Physicist Freeman Dyson's introduction to the year 2000 edition of Sagan's Cosmic Connection uses the word 'gospel' and the phrase 'not want to appear to be preaching'. Likewise, Ann Druyan's essay A New Sense of the Sacred in the same volume includes material to warm the humanist heart. And the Neil deGrasse Tyson-presented reboot of Cosmos likewise seeks to touch the emotions as well as improve the mind, a task at which it sometimes - in my humble opinion - overreaches.

The emergence of international science celebrities such as Tyson is also helping to spread the intentions, if not always the details, of science as a discipline. For the first time since Apollo, former astronauts such as Canadian Chris Hadfield undertake international public tours. Neil deGrasse Tyson, Michio Kaku and Brian Cox are amongst those practising scientists who host their own regular radio programmes, usually far superior to the majority of popular television science shows. Even the seven-Oscar-winning movie Gravity may have helped promote science, with its at times accurate portrayal of the hostile environment outside our atmosphere, far removed from the science fantasy of most Hollywood productions. What was equally interesting was that deGrasse Tyson's fault-finding tweets about the film received a good deal of public attention. Does this suggest that despite the immense numbers of anti-scientific publications on offer, the public is prepared to put trust in scientists again? After all, to paraphrase Monty Python, what have scientists ever done for us?

There are far more important uses for the time and effort that goes into such nonsense as the 419,000 results on Google discussing 'moon landing hoax'. And there's worse: a search for 'flat earth' generates 15,800,000 results. Not that most of these are advocates, but surely very few of us would miss most of the material discussing these ideas ad nauseam?

It should be remembered that scientific knowledge can be progressed by unorthodox thought - from Einstein considering travelling alongside a beam of light to Wegener's continental drift hypothesis that led to plate tectonics - but there is usually a fairly obvious line between an idea that may eventually be substantiated and one that can be disproved by evidence or dismissed on grounds of parsimony. Dare we hope that science faculties might teach their students techniques for combating an opposition that doesn't fight fair, or possibly even how to turn the opposition's own methods back on them? After all, it's time to proselytise!

Tuesday 15 July 2014

An uneasy alliance: science, politics and scientifically-trained politicians

Last April, whilst speaking of the need for technological innovation in order to promote economic growth, President Obama joked that his physics grades made him an unlikely candidate for "scientist in chief". With the recent unease surrounding the (now thankfully dropped) takeover bid for the leading UK pharmaceutical company AstraZeneca by the American firm Pfizer, it seems appropriate to investigate whether science at the national level could be better supported if more politicians had a scientific background or were at least more savvy in science, technology, engineering and mathematics (STEM) subjects. After all, had the Pfizer bid proved successful, the British pharmaceutical sector was predicted to lose out in the long term, both scientifically and economically.

There are many statistics supporting the notion that the past half century has seen a major dumbing down in Western politics, such as the reduction in average sound bite length for US presidential candidates from over forty seconds in the late 1960s to barely seven seconds today. It's quite easy to suggest that politicians are simply following mainstream societal trends, but such lack of substance only serves to further distance politics from science, since the latter rarely offers straightforward yes/no answers, especially in cutting-edge research.

One rather bizarre example of how little science can mean in mainstream politics is President Reagan's reliance on the astrologer Joan Quigley for key policy decisions during most of his term in office. Whilst it is easy to mock the far right wing (and Reagan himself looks increasingly liberal by the standards of the Tea Party), those on the left can be equally guilty of giving short shrift to science, especially if there isn't an immediately obvious benefit to society. A combination of relativism and overdosing on political correctness makes it difficult to proclaim value judgements: if everyone deserves an equal opportunity to air their own pet theory as to how the universe works, then science appears as just another set of beliefs.

If we look back further than the Reagan administration, how do scientifically-inclined American Presidents fare? Here's a brief examination of those with scientific leanings:
  1. Thomas Jefferson made contributions to palaeontology and agricultural technology but perhaps more importantly promoted science as essential to national wealth. However, he was still very much a man of his time, maintaining conventional Christian beliefs that sometimes overrode his scientific sensibility, for instance where evidence questioned the Biblical timescale.
  2. Theodore Roosevelt is well known for what would today be called sustainable development, creating national parks and wildlife refuges at the same time as promoting a balanced exploitation of natural resources. He went on expeditions to Brazil and Africa, ostensibly to find specimens for the Smithsonian National Museum of Natural History, although the results appear more akin to the curious modern phenomenon of scientific whaling (in other words, somewhat lacking in the conservation stakes). Roosevelt also considered a "thorough knowledge of the Bible...worth more than a college education".
  3. Jimmy Carter gained a Bachelor of Science degree and later studied reactor technology and nuclear physics whilst maintaining a conventional Christian faith. During the energy crisis of the late 1970s he seemingly promoted alternative energy, most famously having solar panels installed on the White House roof. However, in some ways he resembled Nineteenth Century Anglican scientists such as the Dean of Westminster William Buckland, particularly in looking for proof of God's existence in nature.
  4. An example from the other side of the Atlantic can be seen in Margaret Thatcher, British Prime Minister from 1979 to 1990, who trained in chemistry under the Nobel laureate Dorothy Hodgkin. Despite her right-wing, monetarist policies (incidentally the political antithesis of Hodgkin), Thatcher has been acclaimed as an active environmentalist: her late 1980s speeches supported action to combat climate change; policies to rapidly phase out CFCs; and the promotion of sustainable development. Yet commentators have viewed Thatcher's concerns for cost-benefit analysis as taking precedence over science, with blue sky thinking getting scant attention. At a practical level, in 1987 she sold the Plant Breeding Institute at Cambridge to Unilever, which has been deemed detrimental in the long-term to British public science.
The only current major Western leader with a scientific background is the German Chancellor Angela Merkel, who has a doctorate in physical chemistry. In contrast, eight out of the nine top government officials in China have backgrounds in STEM subjects. Is it any wonder they already have their own space station and have become the world's largest exporter of high technology, now second only to the USA in terms of annual expenditure on research and development? Yes, the rate of progress has come at enormous environmental and personal cost, but the way in which the Chinese government is clearly imbued with science and technology is to be marvelled at.

From looking at the above examples, it doesn't appear that scientifically-trained national leaders have substantially improved science's output or public standing, and they have on occasion been quite detrimental. The late Stephen Schneider, author of various reports for the Intergovernmental Panel on Climate Change (IPCC), stated that since it is up to governments (and to some extent the general public) rather than scientists to formulate policy, the former need to understand not just the data, but how to interpret it. In the UK, the Department for Business, Innovation and Skills recently launched a public consultation over spending plans for the research infrastructure of the next five years. But scientific endeavours require a certain level of knowledge and that least common of commodities, critical thinking. Science just doesn't adhere to the simple black versus white mentality so beloved of Hollywood.

This is where scientifically-literate politicians hopefully come into their own, being able to accurately represent to the electorate such difficult material as probability statistics, as well as understanding risks and benefits themselves. If anything, science will only fare better if the majority of politicians have a more thorough science education, rather than just relying on the occasional professionally-trained key statesperson. But therein lies an obvious catch-22: how to persuade politicians to invest more funds in science education? I suppose it starts with us voters...

Wednesday 18 June 2014

Opening hearts and minds: Cosmos old, new, borrowed and blue

As a young and impressionable teenager I recall staying up once a week after the adults in my home had gone to bed in order to watch an amazing piece of television: Cosmos, a magical journey in thirteen episodes that resonated deeply with my own personal hopes and dreams. Now that Cosmos: A Spacetime Odyssey has completed its first run it's worth comparing and contrasting the two series, serving as they do as reflections of the society and culture that created them.

Both versions were launched with aggressive marketing campaigns: I was surprised to see, even here in Auckland, a giant billboard promoting the series in as hyped a media operation as any Hollywood blockbuster. But then I assume the broadcasters have to get returns on their massive investments (dare I call it a leap of faith?). Both the original series and the updated / reimagined / homage (delete as appropriate) version have greater scope, more locales and no doubt bigger budgets than most science documentary series, a few CGI dinosaur and David Attenborough-narrated natural history shows excepted.

The aim of the two series is clearly identical and can be summed up via a phrase from Carl Sagan's introduction to the first version's tie-in book: "to engage hearts as well as minds". In addition, both the 1980 and 2014 versions are dedicated to the proposition that "the public are far more intelligent than generally given credit for". However, with the rise of religious fundamentalist opposition to science in general and evolution in particular, there were times when the new version obviously played it safer than the earlier series, such as swapping Japanese crabs for a much more familiar species, the domestic dog. As before, artificial selection was used as a lead-in to natural selection, exactly as per Darwin's On the Origin of Species.

Another example to put the unconverted at their ease in the Neil deGrasse Tyson series is the use of devices that rely on the enormous popularity of science fiction movies and television shows today. Even the title sequence provokes some déjà vu, reminding me of Star Trek: Voyager. But then one of the directors and executive producers is former Star Trek writer-producer Brannon Braga, so perhaps that's only to be expected. In addition, the temple-like interior of Sagan's ship of the imagination has been replaced by something far more reminiscent of the Enterprise bridge. I suppose the intention is to put the scientifically illiterate at their ease before broaching unfamiliar territory.

Talking of science fiction, an echo of the space 'ballet' in 2001: A Space Odyssey can be seen with the use of Ravel's Bolero for the beautiful sequence in episode 11 of the new series. Unfortunately, the commissioned music in the Tyson programme fails to live up to the brilliant selections of classical, contemporary and folk music used in the Sagan version, which were presumably inspired by the creation of the Voyager Golden Record (a truly 1970s project if ever there was one) and with which it shares some of the same material. At times Alan Silvestri's 2014 score is too reminiscent of his Contact soundtrack, which wouldn't in itself be too distracting, but at its most choral/orchestral it is too lush and distinctly overblown. Having said that, the synthesizer cues are more successful, if a bit too similar to some of the specially written material Vangelis composed for the 1986 revised version.

I also had mixed feelings about the animated sequences, the graphic novel approach for the characters seemingly at odds with the far more realistic backgrounds. Chosen over live action primarily for budgetary reasons, these sequences combined overstated music, dramatic lighting and quirks-and-all characterisation heavy on the funny voices, meaning the stories tended to get a bit lost in the schmaltz-fest. I know we are far more blasé about special effects now - the Alexandrian library sequence in the original series blew me away at the time - but I'd rather have real actors green-screened onto digimattes than all this pseudo Dark Knight imagery.

Back to the content, hurrah! For readers of the (distinctly unpleasant) Keay Davidson biography, Carl Sagan, champion of Hypatia, has become known as the feminist ally who never did any housework. He has been left distinctly in the shade by the much greater attention paid to women scientists in the new series. Presumably Ann Druyan is responsible for much of this, although there are some lost opportunities: Caroline Herschel, most obviously; and Rachel Carson wouldn't have gone amiss, considering how much attention was given to climate change. As with the original, the new version made a fair stab at non-Western contributions to science, including Ibn al-Haytham and Mo Tzu.

As to what could have been included in the Tyson version, it would have been good to emphasise the ups-and-downs, trial-and-error nature of scientific discovery. After all, Sagan gave a fair amount of time to the astronomer, astrologer and mystic Johannes Kepler, including his failed hypothesis linking planetary orbits to the five Platonic solids. Showing such failings is good for several reasons: it makes scientists seem as human as everyone else and it helps define the scientific method, not just the results. Note: if anyone mentions that Kepler was too mystical when compared to the likes of Galileo, point them to any modern biography of Isaac Newton...

Neil deGrasse Tyson is an excellent successor to Sagan but at times he seems to almost be imploring the audience to understand. But whereas Sagan only contended with good old fashioned astrology, his successor faces an audience of young Earth creationists, alien abductees, homeopaths and moon landing hoax theorists, so perhaps his less relaxed attitude is only to be expected. Despite the circa 1800 exoplanets that have now (indirectly) been detected, the new series failed to mention this crucial update to the Drake equation. Indeed, SETI played a distinctly backseat role to the messages of climate degradation and how large corporations have denied scientific evidence if it is at odds with profit margins.

All in all I have mixed feelings about the new series. For what should have been the central subject, astronomy at times played second fiddle to the 'poor boy fighting adversity' theme of Faraday, Fraunhofer, et al. Not that there's anything bad about the material per se, but I think a lot more could have been made of the exciting discoveries of the intervening years: dark matter and dark energy, geological activity on various moons other than Io, even exoplanets.

The original 1980 series was a pivotal moment of my childhood and no doubt inspired countless numbers to become scientists (British physicist and presenter Brian Cox, for one), or at least like me, to dabble amateurishly in the great enterprise in our spare time. I'm pleased to add that I'm one degree of separation from Carl Sagan, thanks to having worked with a cameraman from the original series. But we can never go back. Perhaps if we're lucky, Tyson, Druyan and company will team up for some other inspiring projects in the future. Goodness knows we could do with them!

Tuesday 13 May 2014

Digging apart: why is archaeology a humanity and palaeontology a science?

Although my Twitter account only follows scientists and scientific organisations, every day sees the arrival of a fair few archaeology tweets, even from science-orientated sites such as Science News. As someone who has been an amateur practitioner of both archaeology and palaeontology, I thought I'd try to get to grips with why they are categorised so differently. After all, the names themselves don't really help: the word 'archaeology' means 'the study of everything ancient', whilst the common definition of 'palaeontology' is pretty much 'the study of ancient life'. I've even known people with close friends or relatives in one or the other discipline to confuse them: whilst viewing my fossil cabinet, a visitor once told me that her cousin was an archaeologist studying Maori village sites!

Even historically, both fields share many common factors. Not only were they founded by enthusiasts and amateurs, but to this day non-professionals continue to make fundamental contributions. Conversely, amateurs can cause serious deficiencies in the data record through lack of rigour or by deliberately putting financial gain ahead of the preservation of new information. This damage comes in a variety of forms, from crude or overly hasty preparation of fossils, to metal detectorists and site robbers who sell their finds to private collectors without recording the context, or even the material itself.

It is not immediately obvious where the dividing line between the two disciplines lies when it comes to prehistoric human remains. In the 1990s, archaeologist Mark Roberts led a team that excavated the half a million year old Boxgrove site in southern England. Finds included fragmentary remains of Homo heidelbergensis, thus crossing over to what might traditionally be deemed the territory of palaeontologists. In 2001 the multi-phase Ancient Human Occupation of Britain project started, with deliberate collaboration between both sectors, proof that their skills could overlap and reinforce each other.

By and large, neither palaeontology nor archaeology utilises repeatable laboratory experiments and therefore neither can be classified as a ‘hard’ science. Even palaeontology relies to a large extent on historical contingency, both for remains to be fossilised in the first place and then for them to be discovered and recorded using the relevant methodology. As British palaeontologist Richard Fortey has said "Physics has laboratories; systematic biology has collections." Talking of which, re-examination of old evidence in both disciplines can lead to new discoveries: how often do we see headlines pointing to a fundamental discovery...made in a museum archive?

Although archaeologists were not previously known for conducting experiments, the New Archaeology/Processual archaeology that arose in the 1960s included an emphasis on testing hypotheses, one result of which is that archaeology now uses experiments to interpret site data. This includes attempts to recreate artefacts, structures, boats, or even food recipes, based on finds from one or more sites. It may not offer laboratory conditions, but it is still a method of analysis that can reinforce or disprove an idea, a close equivalent of testing a scientific hypothesis.

Attempts to improve the quality of data gleaned from the archaeological record have led to the utilisation of an enormous variety of scientific techniques collectively labelled archaeometry. These include microwear analysis and artefact conservation; numerous physical and chemical dating methods, such as the well-known radiocarbon dating and dendrochronology; geophysical remote-sensing techniques involving radar, magnetometry and resistivity; and DNA analysis, pathology and osteo-archaeology.
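
To give a flavour of the 'hard science' hiding inside that list, here is a minimal sketch of the principle behind radiocarbon dating: the fraction of carbon-14 remaining in a sample halves roughly every 5,730 years, so the measured fraction implies an age. (This is only the bare bones; real laboratories calibrate their raw dates against tree-ring and other records.)

    import math

    HALF_LIFE_C14 = 5730.0  # years

    def radiocarbon_age(remaining_fraction):
        """Uncalibrated age in years for a given fraction of carbon-14 remaining."""
        return HALF_LIFE_C14 * math.log(1.0 / remaining_fraction) / math.log(2.0)

    for fraction in (0.5, 0.25, 0.1):
        print(f"{fraction:.0%} C-14 remaining -> roughly {radiocarbon_age(fraction):,.0f} years old")
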

Teeth of a sand tiger shark
(possibly Odontaspis winkleri)
I found in a wood in Surrey, UK

But there are some major differences between archaeology and palaeontology as well. Although both appear to involve excavation, this is only somewhat true. Not only does archaeology include standing structures such as buildings or ancient monuments, but a project can be restricted to non-invasive techniques such as the geophysical methods mentioned above; excavating a site is the last resort, used to glean information unobtainable in any other way, and especially important if the site is due to be destroyed by development. In contrast, fossils are of no use to science while they remain buried. Having said that, I often find fossils by sifting through pebbles rather than by concerted digging. I have occasionally split rocks or dug through soft sand, but a lot of the time fossils can be found scattered on the surface or prised out of exposed chalk with fingernails. The best way to spot even large finds is to have them already partially exposed through weathering, whilst some archaeology cannot be seen at ground level at all and is only identified via aerial photography or geophysics.

Archaeological sites can prove extremely complex due to what is known as context: for example, digging a hole is a context, backfilling it is another, and any finds contained therein are yet more. Repeated occupation of a site is likely to cause great difficulty in unravelling the sequence, especially if building material has been robbed out. This is substantially different from palaeontology, where even stratigraphy folded by geological processes can be relatively easily understood.
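
For what it's worth, the 'later than' relationships between contexts can be untangled mechanically; the sketch below is a toy example of my own devising (real excavations use the Harris matrix formalism, which rests on the same idea):

    from graphlib import TopologicalSorter  # Python 3.9+ standard library

    # Each context is listed against the contexts it is stratigraphically later than.
    later_than = {
        "pit cut": ["old ground surface"],
        "pit backfill": ["pit cut"],
        "coin within backfill": ["pit backfill"],
        "robber trench": ["pit backfill"],
    }

    # static_order() yields the contexts earliest-first; contexts with no direct
    # relationship (the coin and the robber trench) may appear in either order.
    print(list(TopologicalSorter(later_than).static_order()))
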

Perhaps the most fundamental difference between the disciplines is that of data analysis. As anyone who has spent time on a site excavation knows, there are often as many theories as there are archaeologists. There are far fewer fixed data points than those provided by Linnaean taxonomy, and so there is a reliance on subjectivity, the keyword being 'interpretation'. Even the prior experience of the excavator with sites of a similar period/location/culture can prove crucial in gaining a correct (as far as we can ever be correct) assessment. Where there are no similar previously excavated sites to draw on, an archaeologist may turn to anthropology, extrapolating elements of a contemporary culture to a vanished one, such as British prehistorian Mike Parker-Pearson's comparison between the symbolic use of materials in contemporary Madagascar and Bronze Age Britain. In stark contrast, once a fossil has been identified it is unlikely that its taxonomy will be substantially revised - not that this doesn't still occur from time to time.

As can be seen, not all science proceeds along the hypothesis / mathematical framework / laboratory experiment axis. After all, most of the accounts of string theory that I have read discuss how unlikely it is that the idea can ever be subjected to experiment. The British Quality Assurance Agency Benchmark Statement for Archaeology perhaps comes closest to the true status of the discipline when it lists 'scientific' as one of the four key contexts for higher-level archaeological training. In addition, every edition since 2000 has stated "Where possible, thinking scientifically should be part of the armoury of every archaeologist."

So part historical science, part humanity, archaeology is an interesting combination of methodologies and practice, with more resemblances than differences to palaeontology. As the Ancient Human Occupation of Britain project shows, sometimes the practitioners can even work in (hopefully) perfect harmony. Another nail in the coffin for C.P. Snow's 'Two Cultures', perhaps?

Tuesday 1 April 2014

Dino wars: is that dinosaur Kiwi or Aussie?

It's a cheap piece of rhetoric to invoke the long-running if affectionate New Zealand-Australian rivalry, but what with the current campaign to redesign the New Zealand flag in order to differentiate it more from its trans-Tasman neighbour's, I thought it would be a good opportunity to discuss a science-themed story along these lines. In fact, the account bears some resemblance to the years spent arguing over Otzi, the Copper Age man found preserved in ice on the Austrian-Italian border. In this case, though, the focus of the disagreement isn't as clear-cut, since it concerns ancient remains found in both nations.

Even for a country with a population under five million, New Zealand has a seemingly minimal number of professional palaeontologists. That is, until you consider that the lack of industry applications for the discipline's findings means it's pretty good that there are any practitioners whatsoever. Estimates vary, but figures I have seen for the past few decades range from less than a dozen to thirty or so professionals, most working for universities or state bodies. By comparison France, with twice the geographic area of New Zealand, has around one hundred professionals.

It isn't just the current financial crisis that has caused problems for would-be kiwi fossil hunters: funding has been steadily decreasing for the past half century and the emphasis has shifted towards environmental research. This latter focuses on exploring the (very) long-term changes that have affected not just the landmass as it is today but the largely submerged (90% or so) continent of Zealandia. That is of course extremely timely, but it does reinforce the idea that without much in the way of obvious practical returns, New Zealand palaeontology could dwindle to almost nothing. As it is, the country doesn't have a specialist palaeontological journal or even a dedicated palaeontological society.

The funding issue is claimed to be responsible for the loss of basic knowledge within the discipline, leading to problems such as taxonomic confusion and a backlog of formal descriptions for perhaps some thousands of species new to science. Of course New Zealand's distance from other nations doesn't help either, since the internet frequently has to be relied upon in lieu of direct representation at international conferences and the like. Therefore perhaps it's not surprising that there are only a couple of professional palaeontologists (part-time, at that) working on Mesozoic flora and fauna, including that much-loved clade, the dinosaurs.

Luckily, this lack of professional numbers is partially redressed by dedicated amateurs, some of whom have played a pivotal role in dinosaur discoveries. The most famous is the late Joan Wiffen, who discovered New Zealand's first dinosaur fossils in 1974 after experts had proclaimed it unlikely any would be found (on the basis of the geological history of the current above sea-level land masses). I'm all for amateur fossicking and Joan Wiffen's four decades of dedication is an example to us all.

The heart of this piece concerns the discovery of the ninth dinosaur species found in New Zealand and serves as an instructive example of scientists at work knee-deep in messy reality rather than some unreachable ideal. One specimen that you won't find on FRED - the 95,000+ locality Fossil Record Electronic Database - is the young theropod (carnivorous dinosaur) discovered in 2008 in New Zealand's dinosaur heartland, the Mangahouanga Stream between Taupo and Hawke's Bay. The specimen is only about forty centimetres long and is largely intact: a fully articulated skeleton lacking only a toe and a few tail-end vertebrae. After eighteen months' careful preparation the reptile was in a suitable condition for high-level analysis, having - due to lack of budget - received only cursory examination during the removal of the overlying matrix. With the deposition layer assessed as mid-Cretaceous, the next obvious question was which species it belonged to.

The most likely candidate among species already scientifically described is the 5-6 metre gracile carnivore Australovenator wintonensis, which is known from fragmentary remains in central Queensland. At less than half a metre long, the New Zealand find would have to be a very young individual, which was the original opinion of the preparators. But the brief analysis of a visiting British palaeontologist put this into question, for although the upper jaw is missing from the adult Australovenator specimen, enough was present to suggest that the New Zealand skull is both too deep and too robust to be the same species. In addition, the kiwi specimen has forearms that appear too long when compared to Australovenator, even accounting for variation in growth between youngster and adult.

Then in late 2009 the Australian Journal of Vertebrate Paleontology published an article claiming the New Zealand specimen was just an infant Australovenator. At this point patriotism started to kick in. Even though 'Australo' only means 'south', the word is close enough to the name of the larger nation to provoke the kiwi fossil community into a counter-attack. A core group of Hawke's Bay-based amateur fossil hunters nicknamed the little dinosaur 'Hillaryonyx' (named after Everest pioneer Sir Edmund, of course) and the scene was set for a brontosaurus-sized brouhaha.

Although largely powerless, the passion of the non-professional fossicking community should not be underestimated. Everything that could be done to raise funding for a full analysis of the young reptile was undertaken: web articles were written, t-shirts were printed, and lyrics were even composed for a song called 'He's ours' (to the tune of the folk song 'No Moa!') On the basis of this, questions were asked in the New Zealand Parliament and, as a result (plus a bit of a whip-round by some of the universities), money was found for eight months of part-time analysis by two palaeontologists with some experience of Mesozoic vertebrates. As mentioned previously, the reduction in funding for the discipline meant that there wasn't - and still isn't - a single full-time professional scientist dedicated to the era.

Once the analysis was complete the intention was to have a monograph published by GNS Science, a government-owned research institute, prior to public exhibition of the fossil. Everything seemed to be going smoothly, until several visiting Australian palaeontologists asked to see the prepared slab. They were at first stalled, and then later denied access, even to photographs of the bones. Several arbitrary reasons were given, but the most likely motive for this behaviour was that the kiwi scientists were still assessing the species of the dinosaur - a tricky business, given the loss of taxonomic knowledge mentioned above, if restricted to New Zealand scientists alone. So much so that it took the next two and a half years before anything further was heard.

The latest New Zealand dinosaur fossil

It's not known who was allowed to examine the fossil during this time, but by late 2013 rumours surfaced that the dinosaur had finally been identified as a species new to science. A badly scanned interim report was leaked, containing several figures of the prepared fossil, including the photograph above. More significantly, the report listed eleven points of fundamental anatomical disparity with Australovenator, which have since proved enough to convince the majority of naysayers. The few who are still doubtful are all, needless to mention - but I will anyway - Australian. Until the beginning of this year it seemed the specimen would remain in limbo, but someone, somewhere, perhaps a leading university figure or government official, has pulled their finger out and New Zealand's latest endemic dinosaur species may soon be appearing in the records of the International Commission on Zoological Nomenclature (ICZN).

So not exactly an ideal way to pursue science by any stretch of the imagination. But the story is proof that cuts in funding can cause all sorts of problems for science in the long term, even if the matter appears trivial to the layman.

Oh, and as for the official name for the creature: Stultusaurus aprillis. How appropriate!

Saturday 15 March 2014

Cutting remarks: investigating five famous science quotations

If hearing famous movie lines being misquoted seems annoying, then misquoted or misused science citations can be exasperating, silly or downright dangerous. To this end, I thought that I would examine five well-known science quotations to find the truth behind the soundbite. By placing the accurate (as far as I'm aware) words in the wider context in which they were said/written down/overheard by someone down the hallway, I may be able to recover the intended meaning, rather than the autopilot definition frequently used. Here goes:

1) God does not play dice (Albert Einstein)

Possibly Einstein's most famous line, it sounds like the sort of glib comment that could be used by religious fundamentalists to denigrate science in two opposing fashions: either Einstein is being facetious and therefore sacrilegious; or he supports an old-fashioned version of conventional Judeo-Christian beliefs in which God can be perceived in the everyday world. Talk about having your cake and eating it!

Einstein is actually supposed to have said: "It is hard to sneak a look at God's cards. But that he would choose to play dice with the world...is something that I cannot believe for a single moment." This gives us much more material to work with: it was actually a quote Einstein himself supplied to a biographer. Some years earlier he had communicated with physicist Max Born along similar lines: "Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."

So here is the context behind the quote: Einstein's well-known disbelief in the fundamental nature of quantum mechanics. As I've discussed in a previous post, Einstein's opinions on the most accurate scientific theory ever devised were completely out of step with those of the majority of his contemporaries - and of physicists ever since. Of course we haven't yet got to the bottom of it; speaking as a non-scientist I find the Copenhagen Interpretation nonsense. But then, many physicists have said something along the lines of: if you think you understand quantum mechanics, you haven't understood it. Perhaps at heart Einstein was stuck in a Nineteenth Century mindset, unable to conceive of fundamental limits to our knowledge or that probability lies at the heart of reality. He spent decades looking for a deeper, more obviously comfortable, cause behind quantum mechanics. And as for his interest in the 'Old One', Einstein frequently denied belief in a Judeo-Christian deity and instead referred to himself as an agnostic: the existence of any presence worthy of the name 'God' being "the most difficult in the world". Now there's a quote worth repeating!

2) Science is a way of thinking much more than it is a body of knowledge (Carl Sagan)

As I've mentioned before, Bill Bryson's A Short History of Nearly Everything is chock full of the results of scientific investigation but rarely stops to consider the unique aspects that drive the scientific method, or even define the limits of that methodology. Sagan's full quote is: "Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility. If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along."

It is interesting because it states some rarely discussed aspects of science, such as its subjective rather than objective nature. As human beings, scientists bring emotions, selective memory and personal preferences into their work. In addition, the socio-cultural baggage we carry is hardly ever discussed until a paradigm shift occurs (or just plain, old-fashioned time has passed) and we recognise the idiosyncrasies and prejudices embedded in research. Although we remain subject to our frailties and the zeitgeist, these limitations, once recognised, become part of the strength of the discipline: they allow us, at least eventually, to discover their effect on what was once considered the most dispassionate branch of learning.

Sagan's repeated use of the word sceptical is also of great significance. Behind the multitude of experimental, analytical and mathematical methods in the scientific toolkit, scepticism should be the universal constant. As well as aiding the recognition of the biases mentioned above, the sceptical approach allows parsimony to take precedence over authority. It may seem a touch idealistic, especially for graduate students having to kowtow to senior faculty when seeking research positions, but open-minded young Turks are vital in overcoming the conservative old guard. Einstein's contempt for authority is well known; he made it clear by delineating unthinking respect for authority as the greatest enemy of truth. I haven't read Stephen Jay Gould's Rocks of Ages: Science and Religion in the Fullness of Life, but from what I understand of his ideas, this distinction concerning authority marks a clear boundary worthy of his Non-Overlapping Magisteria.

3) The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain an agnostic (Charles Darwin)

From the original publication of On the Origin of Species in 1859 to the present day, one of the most prominent attacks on natural selection by devoutly religious critics has been the improbability of life starting without divine intervention. If we eventually find microbial life on Mars - or larger organisms on Titan, Europa or Enceladus - this may turn the tide against such an easy target, but one thing is for certain: Darwin did not attempt to detail the origin of life itself. Although he stated in a letter to a fellow scientist: "But if (and Oh! What a big if!) we could conceive in some warm little pond, with all sorts of ammonia and phosphoric salts, light, heat, electricity etc., present, that a protein compound was chemically formed ready to undergo still more complex changes", there are no such broad assumptions in his public writings.

As it turns out, Darwin may have got some of the details correct, although the 'warm little pond' is more likely to have been a deep-sea volcanic vent. But we are still far from understanding the process by which inert chemicals started to make copies of themselves. It's been more than sixty years since Harold Urey and Stanley Miller at the University of Chicago produced amino acids simply by recreating the conditions then thought to have existed on the early Earth. Despite numerous variations on this classic experiment in subsequent decades, we are little closer to comprehending the origin of life. So it was appropriate that Darwin, who was not known for flights of fancy (he once quipped "My mind seems to have become a kind of machine for grinding general laws out of large collections of facts"), kept speculation out of his strictly evidence-based publications.

Just as Darwin has been (at times, deliberately) misquoted by religious fundamentalists determined to undermine modern biology, his most vociferous disciple today, Richard Dawkins, has also been selectively quoted to weaken the scientific arguments. For example, printing just "The essence of life is statistical improbability on a colossal scale" as opposed to the full text from The Blind Watchmaker discussing cumulative natural selection, is a cheap literary device that lessens the critique, but only if the reader is astute enough to investigate the original source material.

4) Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: 'Ye must have faith.' (Max Planck)

Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once wrote that "Science is organized common sense where many a beautiful theory was killed by an ugly fact." But that was back in the Nineteenth Century, when classical physics ruled and scientists predicted a time in the near future when they would understand all the fundamentals of the universe. In these post-modern, quantum mechanical times, uncertainty (or rather, Uncertainty) is key, and common sense goes out of the window with the likes of entanglement, etc.

Back to Planck. It seems fairly obvious that his quote tallies closely with the physics of the past century, in which highly defined speculation and advanced mathematics join forces to develop hypotheses into theories long before hard evidence can be gleaned from the experimental method. Some of the key players in quantum physics have even furthered Copernicus' preference for beautiful mathematics over observation and experiment. Consider the one-time Lucasian Professor of Mathematics Paul Dirac's partiality for the beauty of equations over experimental results, even though he considered humanity's progress in maths to be 'feeble'. The strangeness of the sub-atomic world could be seen as a vindication of these views; another of Planck's quotes is "One must be careful, when using the word, real."

Leaving aside advanced physics, there are examples in the other scientific disciplines that confirm Planck's view. In the historical sciences, you can never know the full story. For example, fossils can provide some idea of how and when a species diverged into two daughter species, but not necessarily the where and why (vis-à-vis ecological 'islands' in the wider sense). Not that this lack of precision should be taken as casting doubt on validity. As evolutionary biologist Stephen Jay Gould once said, a scientific fact is something "confirmed to such a degree that it would be perverse to withhold provisional assent." So what might appear primarily to apply to one segment of the scientific endeavour can be applied across all of science.

5) Space travel is utter bilge (Richard van der Riet Woolley, Astronomer Royal)

In 1956 the then-Astronomer Royal made a prediction that was thoroughly disproved five years later by Yuri Gagarin's historic Vostok 1 flight. The quote has been used ever since as an example of how blind obedience to authority is unwise. But Woolley's complete quote was considerably more ambiguous: "It's utter bilge. I don't think anybody will ever put up enough money to do such a thing...What good would it do us? If we spent the same amount of money on preparing first-class astronomical equipment we would learn much more about the universe...It is all rather rot." He went on to say: "It would cost as much as a major war just to put a man on the moon." In fact, the latter appears to be quite accurate, and despite the nostalgia now aimed at the Apollo era, the lack of any follow-up only reinforces the notion that the race to the moon was simply the ultimate example of Cold War competition. After all, only one trained geologist ever got there!

However, I'm not trying to defend the edited version of Woolley's inopportune statement, since he appears to have been an armchair naysayer for several decades prior to his most famous quote. Back in 1936, his review of Rockets Through Space: The Dawn of Interplanetary Travel by the first president of the British Interplanetary Society (BIS) was even more pessimistic: "The whole procedure [of shooting rockets into space]...presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." Again, it might appear in hindsight that Woolley deserves scorn, were it not for the fact that nearly everyone with some knowledge of space and aeronautics was of a similar opinion, and the opposition consisted of a few 'cranks' and the like, such as BIS members.

The moral of this story is that it is far from difficult to take a partial quote, or a statement out of context, and turn a sensible, realistic attitude (for its time and place) into an easy piece of fun. A recent tweet I saw was a plaintive request to read what Richard Dawkins actually says, rather than what his opponents claim he says. In a worst-case scenario, quote-mining makes it possible to imply the very opposite of an author's intentions. Science may not be one hundred percent provable, but it's by far the best approach we have to finding out that wonderful thing we humans call 'the truth'.

Tuesday 18 February 2014

Discovery FM: science programming on the radio

Considering the large amount of trash on satellite TV documentary channels (yes you, Discovery Channel and National Geographic, with your constant stream of gullible, gibbering 'experts' hunting down Bigfoot, UFOs and megalodon), I thought I'd do a bit of research into science programming on that long side-lined medium, radio.

Having grown up with BBC Radio in the UK I've always listened to a variety of documentaries, particularly on Radio Four. Although I now live in New Zealand, one of the joys of the internet is the ability to listen to a large number of BBC science and natural history documentaries whenever I want. The BBC Radio website has a Science and Nature section with dozens of STEM (Science, Technology, Engineering and Mathematics) programmes, from topical news shows such as Inside Science and Material World to series with specific subject matter such as the environmentally themed Costing the Earth.

A long-running live broadcast BBC series that covers an eclectic variety of both scientific and humanities subjects is novelist and history writer Melvyn Bragg's In Our Time. Over the past sixteen years distinguished scientific guests have explored numerous STEM topics in almost two hundred episodes. Although much of the science-themed material leans towards historical and biographical aspects, there has also been some interesting examination of contemporary scientific thought. The programme is always worth listening to, not least for Bragg's attempt to understand - or in the case of spectroscopy, pronounce - the complexities under discussion.

One of my other favourites is the humorous and wide-ranging The Infinite Monkey Cage, hosted by comedian Robin Ince and physicist/media star Brian Cox. Each episode features a non-scientist as well as several professionals, the former serving as a touchstone to ensure any technicalities are broken down into public-friendly phrasing. Many of the show's topics are already popular outside of science, such as SETI (the Search for Extra-Terrestrial Intelligence) and comparisons of science fiction to fact. The programme is well worth a listen just for the incidental humour: you can almost hear steam coming out of Brian Cox's ears whenever a guest mentions the likes of astrology. Despite a former career as a professional pop keyboard player, the good professor is well known for his disparaging remarks about philosophy and other non-scientific disciplines, cheekily referring to the humanities in one episode as 'colouring in'.

I confess that there are still many episodes I have yet to listen to, although I notice that a fair few of the programme descriptions are similar to topics I would like to discuss in this blog. In fact, an episode from December 2013 entitled "Should We Pander to Pandas?" bears a startling similarity to my post on wildlife conservation from three months earlier! Coincidence, zeitgeist or are the BBC cribbing my ideas? (It wouldn't be the first time, either...)

A final example of an excellent series is the hour-long live talk show The Naked Scientists, covering both topical stories and more general themes. In addition to the programme itself, the related website includes DIY experiments using materials from around the home and an all-embracing forum.

Although consisting of far fewer series, Radio New Zealand also broadcasts a respectable variety of science programming. There are currently thirty or so titles available online in the science and factual section, including some interesting cross-overs. For instance, back in 2006 the late children's author Margaret Mahy discussed her interest in science and the boundaries between fact and fiction in The Catalogue of the Universe. Thanks to the internet, it isn't just radio stations that supply audio programming either: the Museum of New Zealand, Te Papa Tongarewa in Wellington, releases ad-hoc Science Express podcasts. So far I've been very impressed with the range on offer and it's always good to find in-depth discussion of local science stories.

The United States has a decent range of science programmes on various internet streams and the non-profit NPR network, with the related NPR website dividing the material into obvious themes such as the environment, space, energy and health. Most of the programmes are very short - as little as three minutes - and often consist of news items, usually accompanied by a good written précis. NPR also distributes Public Radio International's weekly call-in talk show Science Friday, which is extremely popular as a podcast. The associated website contains videos as well as individual articles from the radio show, although interestingly, the archive search by discipline combines physics and chemistry into one topic but separates nature, biology, and human brain and body into three separate topics.

Planetary Radio is the Planetary Society's thirty-minute weekly programme related to the organisation's interests, namely astronomy, space exploration and SETI. For any fan of Carl Sagan's - and now Neil deGrasse Tyson's - Cosmos, it's pretty much unmissable.

Talking of which, various scientists now take advantage of podcasting for their own personal audio channels. A well-known example is deGrasse Tyson's StarTalk, which, as the name suggests, frequently concentrates on space-related themes. In addition to the serious stuff, there are interviews with performing artists about their opinions on science and, once in a while, some brilliant comedy too: the episode earlier this month in which Tyson speaks to God (who admits that, amongst other divine frivolities, monkeys and apes were created as something to laugh at and that the universe really is just six thousand or so years old) is absolutely priceless.

Physicist Michio Kaku has gone one further by hosting two weekly shows: the live, three-hour Science Fantastic talk show and the hour-long Exploration. The former's website incorporates an archive of videos, some, as might be expected, concentrating on futurology, whilst the talk show itself often covers fruity topics verging on pseudoscience. The latter series is generally more serious, but the programme is slightly spoilt by the frequent book-plugging and over-use of baroque background music.

The good news is that, far from diminishing radio, the internet has fostered a new multi-media approach to traditional broadcasting, with comprehensive archives of material available from a multitude of sources. One thing the US, UK and New Zealand programming has in common is the inclusion of celebrities, especially actors, both to enhance a series' profile and to keep content within the realm of comprehension by a general audience.

All in all, I'm pleasantly surprised by the variety and quality of audio programming emerging from various nations, as opposed to the pandering to new age, pseudoscientific and plain woolly thinking that frequently passes for science television broadcasting. Even book shops aren't immune: I was recently disappointed to notice that a major New Zealand chain bookstore had an 'Inspiration' section twice the size of its STEM material. So the next time you see a team of researchers on a quest for a species of shark that has been extinct for over a million years, why not relax with good old-fashioned, steam-powered radio instead?

Monday 27 January 2014

An index of possibilities: defining science at a personal level

"If a little knowledge is dangerous, where is the man who has so much as to be out of danger?" - T.H. Huxley

With a sense of revitalisation following the start of a new year - and since misconceived notions of the scientific method are legion - I thought I should put my cards on the table and delineate my personal ideas of what I believe science to be.

I suppose you could say it's a self-learning exercise as much as anything. Most people consider science the least comprehensible of all disciplines, removed from everyday experience and only accessible by a select few (a.k.a. an intellectual elite), albeit at the loss of the creativity that drives so many other aspects of our lives. But hopefully the incredible popularity of British physicist Brian Cox and other photogenic scientist-cum-science-communicators is more than a passing fad and will help in the long term to break down this damaging myth. Science is part and parcel of our existence and will only increase in importance as we try to resolve such vital issues as environmental degradation whilst still providing enough food and water for an ever-increasing population (fingers very much crossed on that one, folks!)

So here goes: my interpretation of the scientific method in ten bite-size, easy-to-swallow, chunks.
  1. A large amount of science is not difficult to comprehend
    Granted, theoretical high-energy physics is one of several areas of science difficult to describe meaningfully in a few short sound bites. But amidst the more abstruse volumes aimed at a popular readership there are some gems that break down the concepts to a level that retains the essential details without resorting to advanced mathematics. Evolutionary biologist Stephen Jay Gould noted that the fear of incompetence put many intelligent enthusiasts off learning science as a leisure activity, but with the sheer size of the popular science sections in many bookstores - there are over 840,000 books in Amazon.com's science section - there is no longer an excuse for not dipping a toe. Leaving physics aside, there are plenty of areas of science that are easy to understand, especially in the 'historical' disciplines such as palaeontology (more on that later).
  2. Science is not a collection of facts but a way of exploring reality
    This is still one of the most difficult things to convey. Bill Bryson's prize-winning best seller A Short History of Nearly Everything reminds me of the boys' own bumper books of true facts that were still around when I was a child: Victorian-style progress with a capital 'P', and science as just a compilation of theories and facts akin to, say, history. The reality is of course rather more complicated. The scientific method is a way of examining nature via testable questions that can be resolved to a high degree of certainty using simplified models, either by practical experiments (repeatable and under 'laboratory conditions', and these days including computer simulations) or via mathematics.
  3. Science requires creativity, not just rigour
    The stereotype of scientists as rational, unemotional beings has been broken down over the past thirty years or so, but many non-scientists still have little idea of the creative thinking that can be involved in science, particularly in cutting-edge theorising. From Einstein's thought experiments such as what it would be like to ride alongside a beam of light to the development of string theory - which has little likelihood of experimental evidence in the near future - scientists need to utilise creative thought at least as much as data collation and hard mathematics.
  4. Scientists are only human
    Scientists are far from immune to conditioned paths of thought ingrained via their social and cultural background. Therefore, rather than all scientists being equally adept at developing particular hypotheses, they are subject to the same whims and sense of normality as everyone else. In addition, individual idiosyncrasies can hinder their career. I've discussed previously how Einstein (who famously said his contempt of authority was punished by him becoming an authority himself) refused to accept some of the aspects of quantum theory long after his contemporaries had.
    Scientists could be said then to follow the stereotype visible elsewhere, namely that young radicals frequently evolve into old conservatives.
  5. If there's no proof, is it still science?
    Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once said that the 'deepest sin against the human mind is to believe things without evidence'. Yet scientific hypotheses are sometimes formed prior to any support from nature or real-world experimentation. Although Charles Darwin had plenty of evidence from artificial selection when he wrote On the Origin of Species, the fossil record at the time was extremely patchy and he had no knowledge of Mendelian inheritance. In addition, the most prominent physicists of his day were unaware of nuclear fusion, and so their theories of how stars shone implied a solar system far too young for natural selection to be the primary mechanism of evolution. By sticking to his ideas in spite of these issues, was Darwin a poor scientist? Or is it feasible that many key advances require a leap of faith - a term unlikely to please Richard Dawkins - due to lack of solid, physical evidence?
  6. Are there two schools of science?
    New Zealand physicist Ernest Rutherford once disparagingly remarked something along the lines of physics being the only real science, and that other so-called scientific disciplines are just stamp collecting. I prefer to think of science as being composed of historical and non-historical disciplines, only occasionally overlapping. For instance, cutting-edge technological application of physics requires repeatable and falsifiable experiments, hence the deemed failure of cold fusion. In contrast, the likes of meteorology, evolutionary biology and palaeontology are composed of innumerable historical events and/or are subject to the complexities of chaos theory; as such, they are unlikely to provide duplicate circumstances for testing, or even to be reducible to simplified models that can be accurately tested (see the short sketch after this list).
  7. An accepted theory is not necessarily final
    A theory doesn't have to be the absolute end of a quest. For example, Newton's law of universal gravitation had to wait over two centuries for Einstein's general theory of relativity to explain the mechanism behind the phenomenon. Although quantum mechanics is the most accurate theory ever developed (in terms of the match between theory and experimental results), the root cause is yet to be understood, with wildly varying interpretations offered instead. The obvious problem is that a hypothesis may fit the facts, but without an explanatory mechanism scientists may reject it as untenable. A well-known instance of this scientific conservatism (albeit for good reasons) involved Alfred Wegener's hypothesis of continental drift, which only achieved orthodoxy decades later once plate tectonics was discovered.
  8. Scientific advance rarely proceeds by eureka moments
    Science is a collaborative effort. Few scientists work in a vacuum (except astronauts, of course!) Even the greatest of 'solo' theories such as universal gravitation was on the cards during Newton's lifetime, with contemporaries such as Edmond Halley working along similar lines. Unfortunately, our predilection for simple stories with identifiable heroes means that team leaders and thesis supervisors often receive the credit when many researchers have worked towards a goal. In addition, the priority rule is based on first publication, not when a scientist formulated the idea. Therefore many theories are named after scientists who may not have been the earliest discoverer or formulator. The work of unsung researchers is frequently neglected in favour of this simplified approach that glorifies the work of one pioneer at the expense of many others.
  9. Science is restricted by the necessity of using language to describe it
    Richard Dawkins has often railed against Plato's idealism (a.k.a. Essentialism), using the phrase 'the tyranny of the discontinuous mind'. I recall a primary example of this as a child, whilst contemplating a plastic model kit I had of a Neanderthal. I wondered how the human race had evolved: specifically, how could parents of a predecessor hominid species give birth to a modern human, i.e. a child of a different species? Of course, such discontinuity is nonsense, but it is surprising how frequently our minds interpret the world in this format of neat boundaries. A large part of the problem is how to treat transitional states as the norm when our language is bound up with intrinsic categories. In addition, we rely on metaphor and analogy to describe aspects of the universe that do not conform to everyday experience, the nature of quantum probability being an obvious example. As with the previous point on our innate need for heroes, we are always constructing narratives, thus restricting our ability to understand nature at a fundamental level.
  10. Science does not include a moral dimension
    Science, like nature, is neither moral nor immoral and cannot provide a framework for human behaviour. Of course, this doesn't prevent scientists from being greedy or stupid, or even just naïve: witness British evolutionary biologist J.B.S. Haldane, who recommended the use of poison gas as a war weapon on the grounds that it was more humane than conventional weapons (in terms of the ratio of deaths to temporary incapacitations). This suggests that non-scientists should be involved in the decision-making process for the funding of some science projects, especially those with clear applications in mind. But in order for this to be tenable, the public needs to be considerably more scientifically literate than at present. Otherwise the appalling scare-mongering engendered by the likes of the British tabloid press - think genetically modified crops labelled as 'Frankenstein foods' - will only make matters far worse. GM crops themselves are a perfect example of why the Hollywood approach of clear-cut heroes and villains fails with most of science. Reality is rarely black or white but requires careful analysis of the myriad shades of grey.
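As promised in point 6, here is a quick numerical illustration of why chaotic systems frustrate repeatable experiments. It's a minimal Python sketch of my own (nothing to do with any programme or paper mentioned above), iterating the classic logistic map from two starting values that differ by only one part in a billion; within a few dozen steps the two trajectories bear no resemblance to each other, which is why the likes of weather systems can never offer truly duplicate circumstances for testing.

    # Minimal sketch: sensitive dependence on initial conditions in the logistic map.
    # The parameter r = 4.0 puts the map in its chaotic regime.
    def logistic_map(x0, r=4.0, steps=50):
        """Iterate x -> r * x * (1 - x) and return the whole trajectory."""
        trajectory = [x0]
        x = x0
        for _ in range(steps):
            x = r * x * (1.0 - x)
            trajectory.append(x)
        return trajectory

    run_a = logistic_map(0.300000000)   # baseline starting value
    run_b = logistic_map(0.300000001)   # perturbed by one part in a billion

    for step in (10, 25, 50):
        print(f"step {step:2d}: {run_a[step]:.6f} vs {run_b[step]:.6f}")
    # By roughly step 30 the two runs have diverged completely, even though the
    # initial difference was far smaller than any realistic measurement error.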
In conclusion, it might be said that there are as many variants of science as there are human beings. Unlike in many other disciplines, acknowledging mistakes and ignorance is a clear strength: as Darwin stated in The Descent of Man, 'Ignorance more frequently begets confidence than does knowledge.' Above all, there are aspects of science that are part and parcel of our everyday experience and as such, we shouldn't just consider it as something to save for special occasions.