Showing posts with label Bill Bryson.

Monday 25 January 2021

Ignorance is bliss: why admitting lack of knowledge could be good for science

"We just don't know" might be one of the best phrases in support of the scientific method ever written. But unfortunately it carries an inherent danger: if a STEM professional - or indeed an amateur scientist/citizen scientist - uses the term, it can be used by those wishing to disavow the subject under discussion. Even adding "- yet" to the end of it won't necessarily improve matters; we humans have an unfortunate tendency to rely on gut instinct rather than rational analysis for our world model, hence - well, just about any man-made problem you care to name, now or throughout history.

Even though trust in scientists and the real-world application of their work may have taken an upswing thanks to some rapid vaccine development during the current pandemic, there are many areas of scientifically-gleaned knowledge that are still as unpopular as ever. Incidentally, I wonder whether, were it not for much stricter laws in most countries today, we would have seen far more of the quackery that arose during the 1918 Spanish flu pandemic. During this period low-tech 'cures' included gas inhalation, enemas and blood-letting, the first of these about as safe as last year's suggestion to drink bleach. I've seen very little about alternative cures, no doubt involving crystals, holy water or good old-fashioned prayer, but then I probably don't mix in those sorts of circles (and certainly don't have that type of online cookie profile). But while legislation might have prevented alternative pandemic treatments from being advertised as legitimate and effective, it hasn't helped other areas of science that suffer from widespread hostility. 

Partly this is due to the concept - at least in liberal democracies - of free speech and the idea that every thesis must surely have an antithesis worthy of discussion. Spherical planets not your bag, baby? Why not join the Flat Earth Society. It's easy to be glib about this sort of thing, but there are plenty of more serious examples of anti-scientific thinking that show no sign of abating. The key element that disparate groups opposing science seem to have in common is simple: it all comes down to where science disagrees with the world picture they learnt as a child. In most cases this can be reduced even further to just two words: religious doctrine.

This is where a humble approach to cutting-edge research comes in. Humility has rarely been a key characteristic of fictional scientists; Hollywood, for example, has often depicted (usually male) scientists as somewhere on a crude line between power-crazed megalomaniacs and naive, misguided innocents. The more sensational printed volumes and TV documentaries communicating scientific research to a popular audience likewise frequently eschew ambiguities or dead-ends in favour of a this-is-how-it-is approach. Only, quite often, that isn't how it works at all. Doubts and negative results are not merely incidental to science; they are a fundamental component. Only by discarding failures can a hypothesis (or, if you prefer the description of the brilliant-yet-humble physicist Richard Feynman, a guess) be narrowed down towards an answer. 

There are plenty of examples where even the most accomplished of scientists have admitted they don't know the answer to something in their area of expertise; Sir Isaac Newton, for instance, was unable to resolve the ultimate cause of gravity. As it was, it took over two centuries for another genius - Albert Einstein - to figure it out. Despite all the research undertaken over the past century or so, the old adage remains as true as ever: good science creates as many new questions as it answers. Key issues today that are unlikely to gain resolution in the next few years - although never say never - include the nature of dark energy (and possibly likewise dark, i.e. non-baryonic, matter) and the ultimate theory behind quantum mechanics. 

Of course, these questions, fascinating though they are, hold little appeal for most people; they are just too esoteric and far removed from everyday existence to be bothered about. So what areas of scientific knowledge or research do non-scientists worry about? As mentioned above, usually it is something that involves faith. This can be broken down into several factors:

  1. Disagreement with a key religious text
  2. Implication that humans lack a non-corporeal element, such as an immortal soul
  3. Removal of mankind as a central component or focal point for the universe 

These obviously relate to some areas of science - from a layman's viewpoint - far more than others. Most non-specialists, even religious fundamentalists, don't appear to have an issue with atomic theory and the periodic table. Instead, cosmology and evolutionary biology are the disciplines likely to raise their ire. Neither is in any sense complete; the number of questions still being asked is far greater than the answers so far gleaned from research. The former has yet to understand what 96% of the universe is composed of, while the latter is still piecing together the details of the origin and development of life on our planet, from primordial slime up to Donald Trump (so possibly more of a sideways move, then). 

Herein lies the issue: if scientists claim they are 'certain' about the cause of a particular phenomenon or feature of reality, but further research confirms a different theory, then non-scientists can legitimately ask why the new idea is any more final than the previous one. In addition, the word 'theory' is prone to misinterpretation, implying it is only an idea rather than a hypothesis (a guess, if you like) that hasn't yet failed any tests thrown at it, be they practical experiments, digital simulations or mathematical constructions. Bill Bryson's best-selling A Short History of Nearly Everything is an example of how science can be done a disservice by material meant to promote it, in that the book treats science as if it were an ever-expanding body of knowledge rather than as a collection of methods used to explore answerable questions about life, the universe and, of course, everything.

Perhaps one answer to all this would be for popular science journalism, from books written by professional scientists to short news items, to include elements related to what is not yet known. The simplistic approach that avoids the failures only serves to strengthen the opinion that experts are arrogant believers in their own personal doctrines, as inflexible and uncompromising as holy writ. 

Unfortunately, in efforts to be both concise and easy to comprehend, much science communication renders the discipline in exactly this manner, avoiding dissension and doubt. In addition, the often wonderful - and yet to be resolved - subtleties of research are neglected. For example, the majority of specialists agree that birds are descended from theropod (i.e. carnivorous) dinosaurs, and yet the primary growth axis of the forelimb digits appears to differ between the two groups. This issue has not been satisfactorily answered, but the vast collection of evidence, from both fossils and experimentation, still supports this descent as the most plausible arrangement of this particular phylogenetic tree. Further research, especially in embryology, may one day provide a more complete solution.

Ultimately then, science education would probably benefit from acknowledging the boundaries of uncertainty, where they exist. This may help allay fears that the discipline wants to impose absolutes about everything; in most areas (the second law of thermodynamics excepted) we are still in the early stages of understanding. This doesn't mean that the Earth may be flat or only six thousand years old, but it does mean that science usually works in small steps, not giant paradigm shifts that offer the final say on an aspect of reality. After all, if scientists already knew everything about a subject, there wouldn't be any need for further research. What a boring world that would be!

Friday 21 December 2018

The Twelve (Scientific) Days Of Christmas

As Christmas approaches and we become over-saturated with seasonal pop songs and the occasional carol, I thought it would be appropriate to look at a science-themed variation on this venerable lyric. So without further ado, here are the twelve days of Christmas, STEM-style.

12 Phanerozoic periods

Although there is evidence that life on Earth evolved pretty much as soon as the conditions were in any way suitable, microbes had the planet to themselves for well over three billion years. Larger, complex organisms may have gained a kick-start thanks to a period of global glaciation - the controversial Snowball Earth hypothesis. Although we often hear of exoplanets being found in the Goldilocks zone, it may also take an awful lot of luck to produce a life-bearing environment. The twelve geological periods of the Phanerozoic (literally, well-displayed life) cover the past 542 million years or so and include practically every species most of us have ever heard of. Hard to believe that anyone who knows this could ever consider our species to be the purpose of creation!

11 essential elements in humans

We often hear the phrase 'carbon-based life forms', but we humans actually contain over three times as much oxygen as carbon. In order of abundance by mass, the eleven vital elements are oxygen, carbon, hydrogen, nitrogen, calcium, phosphorus, potassium, sulfur, sodium, chlorine and magnesium. Iron, which you might think to be present in larger quantities, is just a trace mineral; adults have a mere 3 or 4 grams. By comparison, we have about 25 grams of magnesium. In fact, iron and the other trace elements amount to less than one percent of our total body mass. Somehow, 'oxygen-based bipeds' just doesn't have the same ring to it.
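For anyone who fancies checking the arithmetic, here is a minimal Python sketch using rounded, commonly quoted mass fractions - illustrative textbook figures rather than numbers from this post, and they vary from person to person:

```python
# Approximate elemental make-up of the human body by mass fraction.
# Rounded, commonly quoted figures (illustrative only); real values vary
# between individuals.
composition = {
    "oxygen": 0.650, "carbon": 0.185, "hydrogen": 0.095, "nitrogen": 0.032,
    "calcium": 0.015, "phosphorus": 0.010, "potassium": 0.004, "sulfur": 0.003,
    "sodium": 0.002, "chlorine": 0.002, "magnesium": 0.001,
}

# Oxygen really does outweigh carbon several times over.
ratio = composition["oxygen"] / composition["carbon"]
print(f"oxygen : carbon = {ratio:.1f} : 1")

# The eleven 'essential' elements account for nearly everything by mass;
# iron and the other trace elements share what little remains.
eleven_total = sum(composition.values())
print(f"eleven elements: {eleven_total:.1%} of body mass")
print(f"trace elements : {1 - eleven_total:.1%} of body mass")
```

With these figures the oxygen-to-carbon ratio comes out at roughly 3.5 to 1, and the trace elements at a fraction of one percent.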

10 fingers and toes

The evolution of life via natural selection and genetic mutation consists of innumerable, one-off events. This is science as history, although comparative studies of fossils, DNA and anatomy are required instead of written texts and archaeology. It used to be thought that ten digits was canonical, tracing back to the earliest terrestrial vertebrates that evolved from lobe-finned fish. Then careful analysis of the earliest stegocephalians of the late Devonian period such as Acanthostega showed that their limbs terminated in six, seven or even eight digits. The evolution of five-digit limbs seems to have occurred only once, in the subsequent Carboniferous period, yet of course we take it - and the use of base ten counting - as the most obvious of things. Just imagine what you could play on a piano if you had sixteen fingers!

9 climate regions

From the poles to the equator, Earth can be broadly divided into the following climate areas: polar and tundra; boreal forest; temperate forest; Mediterranean; desert; dry grassland; tropical grassland; and tropical rainforest. Mountains are the odd region out, appearing at any latitude that contains the geophysical conditions suitable for their formation. Natural selection leads to the evolution of species suited to the local variations in daylight hours, weather and temperature, but the labels can be deceptive; the Antarctic, for example, contains a vast polar desert. We are only just beginning to understand the complex feedback systems between each region and its biota, at a time when species are becoming extinct almost faster than they can be catalogued. We upset the relative equilibrium at our peril.

8 major planets in our solar system

When I was a child, all astronomy books described nine known planets, along with dozens of moons and numerous asteroids. Today we know of almost four thousand planets in other solar systems, some of a similar size to Earth (and even some of these in the Goldilocks zone). However, since 2006 our solar system has been reduced to eight planets, with Pluto demoted to the status of a dwarf planet. Technically, this is because it fails one of the three criteria of major planets: it has not swept its orbital neighbourhood clear of other bodies (indeed, it sometimes crosses Neptune's orbit). However, as there is at least one Kuiper belt object, Eris, almost as large as Pluto, it makes sense to stick to a definition that won't see the number of planets continually rise with each generation of space telescope. This downgrading appears to have upset a lot of people, so it's probably a good idea to mention that science is as much a series of methodologies as it is a body of knowledge, with the latter being open to change when required - it's certainly not set-in-stone dogma! So as astronomer Neil deGrasse Tyson, author of the best-selling The Pluto Files: The Rise and Fall of America's Favorite Planet, put it: "Just get over it!"

7 colours of the rainbow

This is one of those everyday things that most of us never think about. Frankly, I don't know anyone who has been able to distinguish indigo from violet in a rainbow, and yet we owe this colour breakdown not to an artist but to one of the greatest physicists ever, Sir Isaac Newton. As well as fulfilling most of the criteria of the modern-day scientist, Newton was also an alchemist, numerologist, eschatologist (one of his predictions being that the world will end in 2060) and all-round occultist. Following the mystical beliefs of the Pythagoreans, Newton linked the colours of the spectrum to the notes of the Western musical scale, hence indistinguishable indigo making number seven. This is a good example of how even the best of scientists are only human.

6 mass extinction events

Episode two of the remake of Carl Sagan's Cosmos television series featuring Neil deGrasse Tyson was called 'Some of the Things That Molecules Do'. It explored the five mass extinction events that have taken place over the past 450 million years. Tyson also discussed what has come to be known as the Holocene extinction, the current, sixth period of mass dying. Although the loss of megafauna species around the world has been blamed on the arrival of Homo sapiens over the past 50,000 years, the rapid acceleration of species loss over the last ten millennia is shocking in the extreme. It is estimated that the current extinction rate is anywhere from a thousand to ten thousand times the background rate, resulting in the loss of up to two hundred plant or animal species every day. Considering that two-thirds of our pharmaceuticals are derived from or based on biological sources, we really are shooting ourselves in the foot. And that's without considering the advanced materials that we could develop from nature.
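As a rough back-of-the-envelope check - assuming, purely for illustration, a commonly cited background rate of about one extinction per million species per year and a total of roughly 8.7 million species - the arithmetic behind that 'two hundred a day' figure looks something like this:

```python
# Back-of-the-envelope check of the 'up to two hundred species a day' figure.
# Both inputs are assumptions for illustration: a background rate of roughly
# one extinction per million species per year, and ~8.7 million species in total.
background_rate_per_million_species_years = 1.0
total_species = 8_700_000

# Background extinctions expected per year (about 9 with these inputs).
background_per_year = background_rate_per_million_species_years * total_species / 1_000_000

for multiplier in (1_000, 10_000):
    per_day = background_per_year * multiplier / 365
    print(f"{multiplier:>6} x background = roughly {per_day:.0f} species lost per day")
```

With those assumed inputs, a thousand-fold increase works out at a couple of dozen species per day and a ten-thousand-fold increase at a little over two hundred, which is where the upper end of the oft-quoted figure comes from.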

5 fundamental forces

Also known as interactions, the four established forces, in order from strongest to weakest, are: the strong nuclear force; electromagnetism; the weak nuclear force; and gravity. One of the most surprising finds in late Twentieth Century cosmology was that as the universe expands, it is being pushed apart at an ever-greater speed. The culprit has been named dark energy, but that's where our knowledge of this possible fifth force ends. Although it appears to account for about 68% of the total energy of the known universe, the label 'dark' refers to the complete lack of understanding as to how it is generated. Perhaps the most radical suggestion is that Einstein's General Theory of Relativity is incorrect and that an overhaul of the mechanism behind gravity would remove the need for dark energy altogether. One thing is for certain: we still have a lot to learn about the wide-scale fabric of the universe.

4 DNA bases

Despite being one of the best-selling popular science books ever, Bill Bryson's A Short History of Nearly Everything manages to include a few howlers, including listing thiamine (AKA vitamin B1) as one of the four bases, instead of thymine. In addition to revealing how the bases (adenine, cytosine, guanine and thymine) are connected via the double-helix backbone, the 1953 discovery of DNA's structure also suggested the replication mechanism, in turn leading to the development of the powerful genetic editing tools in use today. The discovery itself also shows how creativity can be used in science: Watson and Crick's model-building technique proved to be a faster way of generating results than the more methodical X-ray crystallography of Rosalind Franklin and Maurice Wilkins - although it should be noted that one of Franklin's images gave her rivals a clue as to the correct structure. The discovery also shows that collaboration is often a vital component of scientific research, as opposed to the legend of the lonely genius.

3 branches of science

When most people think of science, they tend to focus on the stereotypical white-coated boffin, beavering away in a laboratory filled with complex equipment. However, there are numerous branches or disciplines, covering the purely theoretical, the application of scientific theory, and everything in between. Broadly speaking, science can be divided into the formal sciences, natural sciences and social sciences, each covering a variety of categories themselves. The formal sciences include mathematics and logic and have an air of absolutism about them (2+2=4). The natural or 'hard' sciences are what we learn in school science classes and broadly divide into physics, chemistry and biology. These use observation and experiment to develop working theories, but maths is often a fundamental component of the disciplines. The social or 'soft' sciences speak for themselves, with sub-disciplines such as anthropology sometimes crossing over into humanities such as archaeology. So when someone tells you that all science is impossibly difficult, you know they obviously haven't considered just what constitutes science!

2 types of fundamental particles

Named after Enrico Fermi and Satyendra Nath Bose respectively, fermions and bosons are the fundamental building blocks of the universe. The former, for example quarks and electrons, are the particles of matter and obey the Pauli exclusion principle, meaning no two identical fermions can occupy the same quantum state. The latter are the carriers of force, with photons being the best-known example. One problem with these particles and their properties, such as angular momentum or spin, is that most analogies are only vaguely appropriate. After all, we aren't used to an object that has to rotate 720 degrees in order to get back to its original state! In addition, there are many aspects of underlying reality that are far from being understood. String theory was once mooted as the great hope for unifying all the fermions and bosons, but has yet to achieve absolute success, while the 2012 discovery of the Higgs boson is only one potential advance in the search for a Grand Unified Theory of creation.

1 planet Earth

There is a decorative plate on my dining room wall that says "Other planets cannot be as beautiful as this one." Despite the various Earth-sized exoplanets that have been found in the Goldilocks zone of their solar systems, we have little chance in the near future of finding out whether they are inhabited as opposed to merely habitable. Although the seasonal methane on Mars hints at microbial life there, any human colonisation will be a physically and psychologically demanding ordeal. The idea that we can use Mars as a lifeboat to safeguard our species - never mind our biosphere - is little more than a pipedream. Yet we continue to exploit our home world with little consideration for the detrimental effects we are having on it. As the environmental movement says: there is no Planet B. Apart from the banning of plastic bags in some supermarkets, little else appears to have been done since my 2010 post on reduce, reuse and recycle. So why not make a New Year's resolution to help future generations? Wouldn't that be the best present for your children and your planetary home?

Wednesday 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to a single search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally-published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit for exploring reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works, is it any wonder that the plethora of online sources - of immensely variable quality - causes confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model and so culture, gender, politics and even religion play their part as to what is funded and how the results are presented (or even which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs, it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expenses associated with producing popular science books mean that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy-to-find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers (largely on general news sites) and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results), but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So in the spirit of science (okay, so it's hardly comprehensive being just a single trial) I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it would only last another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that, in those early days of the hypothesis, very little could be confirmed with any degree of certainty.

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases lead to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially of all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years, it has been stable for the past 25 million and so is no longer deteriorating. A third, far less common category concerns informal polls of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either category to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find around cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with paragraph text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach is friendly to organic search while amalgamating socio-political concerns with the science.

You might expect the results to show a change in trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far only offering support to the rapid degeneration hypothesis. The reason for this difference is readily apparent: the publication of a Danish study that bolsters support for it. This new report is available online but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have to either rely upon second-hand summaries or, if there was enough time, wait for the next popular science book that discusses it in layman's terms.
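For what it's worth, the tallying exercise itself is trivial to automate. Here is a minimal Python sketch of the method; the records below are invented placeholders, not my actual search results:

```python
# A minimal sketch of the tallying exercise described above. The records are
# invented placeholders purely to show the method - they are NOT the actual
# search results discussed in this post.
from collections import Counter

# (publication year, category) pairs, as one might note them down while skimming:
# 'degenerating' for the rapid-decline camp, 'stable' for the no-longer-shrinking camp.
records = [
    (2009, "degenerating"), (2010, "stable"), (2012, "degenerating"),
    (2014, "stable"), (2016, "degenerating"), (2017, "stable"),
    (2018, "degenerating"), (2018, "degenerating"),
]

tally = Counter(records)
for (year, category), count in sorted(tally.items()):
    print(f"{year}  {category:<13} {count}")
```

Sorting the tally by year makes any shift in the balance between the two camps easy to spot.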

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it only examined the genes of sixty-two Danish men, so I have no idea whether this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better sort of popular science books would answer these queries; such is my faith in the general superiority of print volumes!

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research was only applicable to other types of animals, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hope of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly-resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?), why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Saturday 16 August 2014

The escalating armoury: weapons in the war between science and woolly thinking

According to that admittedly dubious font of broad knowledge Wikipedia, there are currently sixteen Creationist museums in the United States alone. These aren't minor attractions for a limited audience of fundamentalist devotees either: one such institution in Kentucky has received over one million visitors in its first five years. That's hardly small potatoes! So how much is the admittance fee and when can I go?

Or maybe not. It isn't just the USA that has become home to such anti-scientific nonsense either: the formerly robust secular societies of the UK and Australia now house museums and wildlife parks with similar anti-scientific philosophies. For example, Noah's Ark Zoo Farm in England espouses a form of Creationism in which the Earth is believed to be a mere 100,000 years old. And of course in addition to traditional theology, there is plenty of pseudo-scientific/New Age nonsense that fails every test science can offer and yet appears to be growing in popularity. Anyone for Kabbalah?

It's thirty-five years since Carl Sagan's book Broca's Brain: Reflections on the Romance of Science summarised the scientific response to the pseudo-scientific writings of Immanuel Velikovsky. Although Velikovsky and his bizarre approach to orbital mechanics - created in order to provide an astrophysical cause for Biblical events - have largely been forgotten, his ideas were popular enough in their time. A similar argument could be made for the selective evidence technique of Erich von Daniken in the 1970s, whose works have sold an astonishing 60 million copies, and to a lesser extent the similar approach of Graham Hancock in the 1990s. But a brief look at that powerhouse of publishing distribution, Amazon.com, shows that today there is an enormous market for best-selling gibberish that far outstrips the lifetime output of a few top-ranking pseudo-scientists:
  • New Age: 360,000
  • Spirituality: 243,000
  • Religion: 1,100,000
  • (Science 3,100,000)
(In the best tradition of statistics, all figures have been rounded slightly up or down.)

Since there hasn't exactly been a decrease in the evidence for most scientific theories, the appeal of the genre must be due to changes in society. After writing off the fundamentalist/indoctrinated as an impossible-to-change minority, what has led to the upsurge in popularity of so many publications at odds with critical thinking?

It seems that those who misinterpret scientific methodology, or are in dispute with it due to a religious conviction, have become adept at using the techniques that genuine science popularisation utilises. What used to be restricted to the printed word has been expanded to include websites, TV channels, museums and zoos that parody the findings of science without the required rigorous approach to the material. Aided and abetted by well-meaning but fundamentally flawed popular science treatments such as Bill Bryson's A Short History of Nearly Everything, which looks at facts without real consideration of the science behind them, the public are often left with little understanding of what separates science from its shadowy counterparts. Therefore the impression of valid scientific content that some contemporary religious and pseudo-science writers offer can quite easily be mistaken for the genuine article. Once the appetite for a dodgy theory has been whetted, it seems there are plenty of publishers willing to further the interest.

If a picture is worth a thousand words, then the 'evidence' put forward in support of popular phenomena such as an ancient alien presence or faked moon landings seems all the more impressive. At a time when computer-generated Hollywood blockbusters can even be replicated on a smaller scale in the home, most people are surely aware of how easy it is to be fooled by visual evidence. But it seems that pictorial support for a strongly-written idea can resonate with the search for fundamental meaning in an ever more impersonal technocratic society. And of course if you are flooded with up-to-the-minute information from a dozen sources then it is much easier to absorb evidence from your senses than to unravel the details from that most passé of communication methods, boring old text. Which perhaps fails to explain just why there are quite so many dodgy theories available in print!

But are scientists learning from their antithesis how to fight back? With the exception of Richard Dawkins and other super-strict rationalists, science communicators have started to take on board the necessity of appealing to hearts as well as minds. Despite the oft-mentioned traditional differentiation from the humanities, science is a human construct and so may never be purely objective. Therefore why should religion and the feel-good enterprises beloved of pseudo-scientists hold the monopoly on awe and wonder?

Carl Sagan appears to have been a pioneer in the field of utilising language that is more usually the domain of religion. In The Demon-Haunted World: Science As A Candle In The Dark, he argues that science is 'a profound source of spirituality'. Indeed, his novel Contact defines the numinous outside of conventional religiosity as 'that which inspires awe'. If that sounds like woolly thinking, I'd recommend viewing the clear night sky away from city lights...

Physicist Freeman Dyson's introduction to the year 2000 edition of Sagan's Cosmic Connection uses the word 'gospel' and the phrase 'not want to appear to be preaching'. Likewise, Ann Druyan's essay A New Sense of the Sacred in the same volume includes material to warm the humanist heart. And of course the Neil deGrasse Tyson-presented reboot of Cosmos likewise seeks to touch the emotions as well as improve the mind, a task at which it sometimes - in my humble opinion - overreaches.

The emergence of international science celebrities such as Tyson is also helping to spread the intentions, if not always the details, of science as a discipline. For the first time since Apollo, former astronauts such as Canadian Chris Hadfield undertake international public tours. Neil deGrasse Tyson, Michio Kaku and Brian Cox are amongst those practising scientists who host their own regular radio programmes, usually far superior to the majority of popular television science shows. Even the seven-Oscar-winning movie Gravity may have helped promote science, with its at times accurate portrayal of the hostile environment outside our atmosphere, far removed from the science fantasy of most Hollywood productions. What was equally interesting was that deGrasse Tyson's fault-finding tweets about the film received a good deal of public attention. Could this mean that despite the immense number of anti-scientific publications on offer, the public is prepared to put its trust in scientists again? After all, to paraphrase Monty Python, what have scientists ever done for us?

There are far more important uses for the time and effort that go into such nonsense as the 419,000 results on Google discussing 'moon landing hoax'. And there's worse: a search for 'flat earth' generates 15,800,000 results. Not that most of these are advocates, but surely very little of the material discussing these ideas ad nauseam would be missed?

It should be remembered that scientific knowledge can be progressed by unorthodox thought - from Einstein considering travelling alongside a beam of light to Wegener's continental drift hypothesis that led to plate tectonics - but there is usually a fairly obvious line between an idea that may eventually be substantiated and one that can either be disproved by evidence or dismissed via parsimony. Dare we hope that science faculties might teach their students techniques for combating an opposition that doesn't fight fair, or possibly even how to use their opponents' own methods against them? After all, it's time to proselytise!

Saturday 15 March 2014

Cutting remarks: investigating five famous science quotations

If hearing famous movie lines being misquoted seems annoying, then misquoted or misused science citations can be exasperating, silly or downright dangerous. To this end, I thought I would examine five well-known science quotations to find the truth behind the soundbite. By placing the accurate (as far as I'm aware) words in the wider context in which they were said/written down/overheard by someone down the hallway, I may be able to understand the intended meaning, rather than the autopilot definition frequently used. Here goes:

1) God does not play dice (Albert Einstein)

Possibly Einstein's most famous line, it sounds like the sort of glib comment that could be used by religious fundamentalists to denigrate science in two opposing fashions: either Einstein is being facetious and therefore sacrilegious, or he supports an old-fashioned version of conventional Judeo-Christian belief in which God can be perceived in the everyday world. Talk about having your cake and eating it!

Einstein is actually supposed to have said: "It is hard to sneak a look at God's cards. But that he would choose to play dice with the world...is something that I cannot believe for a single moment." This gives us much more material to work with: it was actually a quote Einstein himself supplied to a biographer. Some years earlier he had communicated with physicist Max Born along similar lines: "Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."

So here is the context behind the quote: Einstein's well-known disbelief in the fundamental nature of quantum mechanics. As I've discussed in a previous post, Einstein's opinion of the most accurate scientific theory ever devised was completely out of step with the majority of his contemporaries - and physicists ever since. Of course we haven't yet got to the bottom of it; speaking as a non-scientist I find the Copenhagen Interpretation nonsense. But then, many physicists have said something along the lines of: if you think you understand quantum mechanics, you haven't understood it. Perhaps at heart, Einstein was stuck in a Nineteenth Century mindset, unable to conceive of fundamental limits to our knowledge or that probability lies at the heart of reality. He spent decades looking for a deeper, more obviously comfortable, cause behind quantum mechanics. And as for his interest in the 'Old One', Einstein frequently denied belief in a Judeo-Christian deity, referring to himself instead as an agnostic: the question of the existence of any presence worthy of the name 'God' being "the most difficult in the world". Now there's a quote worth repeating!

2) Science is a way of thinking much more than it is a body of knowledge (Carl Sagan)

As I've mentioned before, Bill Bryson's A Short History of Nearly Everything is chock full of the results of scientific investigation but rarely stops to consider the unique aspects that drive the scientific method, or even define the limits of that methodology. Sagan's full quote is: "Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility. If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along."

It is interesting because it states some aspects of science that are rarely discussed, such as its subjective rather than objective nature. As human beings, scientists bring emotions, selective memory and personal preferences into their work. In addition, the socio-cultural baggage we carry is hardly ever discussed until a paradigm shift occurs (or just plain, old-fashioned time has passed) and we recognise the idiosyncrasies and prejudices embedded in the research. Although subject to our frailties and the zeitgeist, these limitations, once recognised, are part of the strength of the discipline: they allow us, at least eventually, to discover their effect on what was once considered the most dispassionate branch of learning.

Sagan's repeated use of the word sceptical is also of great significance. Behind the multitude of experimental, analytical and mathematical methods in the scientific toolkit, scepticism should be the universal constant. As well as aiding the recognition of the biases mentioned above, the sceptical approach allows parsimony to take precedence over authority. It may seem a touch idealistic, especially for graduate students having to kowtow to senior faculty when seeking research positions, but open-minded young Turks are vital in overcoming the conservative old guard. Einstein's contempt for authority is well known; he made this clear by delineating unthinking respect for it as the greatest enemy of truth. I haven't read Stephen Jay Gould's Rocks of Ages: Science and Religion in the Fullness of Life, but from what I understand of his ideas, the distinction concerning authority marks a clear boundary worthy of his Non-Overlapping Magisteria.

3) The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain an agnostic (Charles Darwin)

From the original publication of On the Origin of Species in 1859 to the present day, one of the most prominent attacks on natural selection by devoutly religious critics has been the improbability of life starting without divine intervention. If we eventually find microbial life on Mars - or larger organisms on Titan, Europa or Enceladus - this may turn the tide against such an easy target, but one thing is for certain: Darwin did not attempt to detail the origin of life itself. Although he stated in a letter to a fellow scientist, "But if (and Oh! What a big if!) we could conceive in some warm little pond, with all sorts of ammonia and phosphoric salts, lights, heat, electricity etc., present that a protein compound was chemically formed ready to undergo still more complex changes", there are no such broad assumptions in his public writings.

As it turns out, Darwin may have got some of the details correct, although the 'warm little pond' is more likely to have been a deep-sea volcanic vent. But we are still far from understanding the process by which inert chemicals started to make copies of themselves. It's been more than sixty years since Harold Urey and Stanley Miller at the University of Chicago produced amino acids simply by recreating the conditions then thought to have existed on the early Earth. Despite numerous variations on this classic experiment in subsequent decades, we are little closer to comprehending the origin of life. So it was appropriate that Darwin, who was not known for flights of fancy (he once quipped "My mind seems to have become a kind of machine for grinding general laws out of large collections of facts"), kept speculation out of his strictly evidence-based publications.

Just as Darwin has been (at times deliberately) misquoted by religious fundamentalists determined to undermine modern biology, his most vociferous disciple today, Richard Dawkins, has also been selectively quoted in order to weaken the scientific arguments. For example, printing just "The essence of life is statistical improbability on a colossal scale", as opposed to the full text from The Blind Watchmaker discussing cumulative natural selection, is a cheap literary device - one that only loses its force if the reader is astute enough to investigate the original source material.

4) Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: 'Ye must have faith.' (Max Planck)

Thomas Henry Huxley (A.K.A. Darwin's Bulldog) once wrote that "Science is organized common sense where many a beautiful theory was killed by an ugly fact." But that was back in the Nineteenth Century, when classical physics ruled and scientists predicted a time in the near future when they would understand all the fundamentals of the universe. In these post-modern, quantum mechanical times, uncertainty (or rather, Uncertainty) is key, and common sense goes out of the window with the likes of entanglement, etc.

Back to Planck. It seems fairly obvious that his quote tallies closely with the physics of the past century, in which highly defined speculation and advanced mathematics join forces to develop hypotheses into theories long before hard evidence can be gleaned from the experimental method. Some of the key players in quantum physics have even furthered Copernicus' preference for beautiful mathematics over observation and experiment. Consider the one-time Lucasian Professor of Mathematics Paul Dirac's partiality for the beauty of equations over experimental results, even though he considered humanity's progress in maths to be 'feeble'. The strangeness of the sub-atomic world could be seen as a vindication of these views; another of Planck's quotes is "One must be careful, when using the word, real."

Leaving aside advanced physics, there are examples in the other scientific disciplines that confirm Planck's view. In the historical sciences, you can never know the full story. For example, fossils can provide some idea of how and when a species diverged into two daughter species, but not necessarily the where and why (vis-à-vis ecological 'islands' in the wider sense). Not that this lack of precision should be taken as doubt of validity. As evolutionary biologist Stephen Jay Gould once said, a scientific fact is something "confirmed to such a degree that it would be perverse to withhold provisional assent." So what might appear to primarily apply to one segment of the scientific endeavour can be applied across all of science.

5) Space travel is utter bilge (Richard van der Riet Woolley, Astronomer Royal)

In 1956 the then-Astronomer Royal made a prediction that was thoroughly disproved five years later by Yuri Gagarin's historic Vostok 1 flight. The quote has been used ever since as an example of how blind obedience to authority is unwise. But Woolley's complete quote was considerably more ambiguous: "It's utter bilge. I don't think anybody will ever put up enough money to do such a thing...What good would it do us? If we spent the same amount of money on preparing first-class astronomical equipment we would learn much more about the universe...It is all rather rot." He went on to say: "It would cost as much as a major war just to put a man on the moon." In fact, the latter appears to be quite accurate, and despite the nostalgia now aimed at the Apollo era, the lack of any follow-up only reinforces the notion that the race to the moon was simply the ultimate example of Cold War competition. After all, only one trained geologist ever got there!

However, I'm not trying to defend the edited version of Woolley's inopportune statement since he appears to have been an armchair naysayer for several decades prior to his most famous quote. Back in 1936, his review of Rockets Through Space: The Dawn of Interplanetary Travel by the first president of the British Interplanetary Society (BIS) was even more pessimistic: "The whole procedure [of shooting rockets into space]...presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." Again, it might appear in hindsight that Woolley deserves scorn, were it not for the fact that nearly everyone with some knowledge of space and aeronautics was of a similar opinion, and the opposition were a few 'cranks' and the like, such as BIS members.

The moral of this story is that it is far from difficult to take a partial quote, or a statement out of context, and turn a sensible, realistic attitude (for its time and place) into an easy piece of fun. A recent tweet I saw was a plaintive request to read what Richard Dawkins actually says, rather than what his opponents claim he says. In a worst-case scenario, quote-mining makes it possible to imply the very opposite of an author's intentions. Science may not be one hundred percent provable, but it's by far the best approach we have to finding out that wonderful thing we humans call 'the truth'.

Monday 27 January 2014

An index of possibilities: defining science at a personal level

"If a little knowledge is dangerous, where is the man who has so much as to be out of danger?" - T.H. Huxley

With a sense of revitalisation following the start of a new year - and since misconceived notions of the scientific method are legion - I thought I should put my cards on the table and delineate my personal ideas of what I believe science to be.

I suppose you could say it's a self-learning exercise as much as anything. Most people consider science the least comprehensible of all disciplines, removed from everyday experience, accessible only to a select few (a.k.a. an intellectual elite), and lacking the creativity that drives so many other aspects of our lives. But hopefully the incredible popularity of British physicist Brian Cox and other photogenic scientist-cum-science-communicators is more than a passing fad and will help in the long term to break down this damaging myth. Science is part and parcel of our existence and will only increase in importance as we try to resolve such vital issues as environmental degradation whilst still providing enough food and water for an ever-increasing population (fingers very much crossed on that one, folks!)

So here goes: my interpretation of the scientific method in ten bite-size, easy-to-swallow, chunks.
  1. A large amount of science is not difficult to comprehend
    Granted, theoretical high-energy physics is one of several areas of science difficult to describe meaningfully in a few short sound bites. But amidst the more abstruse volumes aimed at a popular readership there are some gems that break down the concepts to a level that retains the essential details without resorting to advanced mathematics. Evolutionary biologist Stephen Jay Gould noted that the fear of incompetence put many intelligent enthusiasts off learning science as a leisure activity, but with the enormous size of the popular science sections in many bookstores - there are over 840,000 books in Amazon.com's science section - there is no longer an excuse for not dipping a toe. Leaving physics aside, there are plenty of areas of science that are easy to understand, especially in the 'historical' disciplines such as palaeontology (more on that later).
  2. Science is not a collection of facts but a way of exploring reality
    This is still one of the most difficult things to convey. Bill Bryson's prize-winning best seller A Short History of Nearly Everything reminds me of the genre of boys' own bumper books of true facts that were still around when I was a child: Victorian-style progress with a capital 'P', and science just a compilation of theories and facts akin to, say, history. The reality is of course rather more complicated. The scientific method is a way of examining nature via testable questions that can be resolved to a high degree of certainty using simplified models, either by practical experiments (repeatable and under 'laboratory conditions'), these days including computer simulations, or via mathematics.
  3. Science requires creativity, not just rigor
    The stereotype of scientists as rational, unemotional beings has been broken down over the past thirty years or so, but many non-scientists still have little idea of the creative thinking that can be involved in science, particularly in cutting-edge theorising. From Einstein's thought experiments such as what it would be like to ride alongside a beam of light to the development of string theory - which has little likelihood of experimental evidence in the near future - scientists need to utilise creative thought at least as much as data collation and hard mathematics.
  4. Scientists are only human
    Scientists are far from immune to conditioned paths of thought ingrained via their social and cultural background. Therefore, rather than all scientists being equally adept at developing particular hypotheses, they are subject to the same whims and sense of normality as everyone else. In addition, individual idiosyncrasies can hinder their career. I've discussed previously how Einstein (who famously said his contempt of authority was punished by him becoming an authority himself) refused to accept some of the aspects of quantum theory long after his contemporaries had.
    Scientists could be said then to follow the stereotype visible elsewhere, namely that young radicals frequently evolve into old conservatives.
  5. If there's no proof, is it still science?
    Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once said that the 'deepest sin against the human mind is to believe things without evidence'. Yet scientific hypotheses are sometimes formed prior to any support from nature or real-world experimentation. Although Charles Darwin had plenty of evidence from artificial selection when he wrote On the Origin of Species, the fossil record at the time was extremely patchy and he had no knowledge of Mendelian inheritance. In addition, the most prominent physicists of his day were unaware of nuclear fusion and so their theories of how stars shone implied a solar system far too young for natural selection to be the primary mechanism of evolution. By sticking to his ideas in spite of these issues, was Darwin a poor scientist? Or is it feasible that many key advances require a leap of faith - a term unlikely to please Richard Dawkins - due to the lack of solid, physical evidence?
  6. Are there two schools of science?
    New Zealand physicist Ernest Rutherford once disparagingly remarked something along the lines of physics being the only real science, with the other so-called scientific disciplines being mere stamp collecting. I prefer to think of science as being composed of historical and non-historical disciplines, only occasionally overlapping. For instance, cutting-edge technological applications of physics require repeatable and falsifiable experiments, hence the deemed failure of cold fusion; whilst the likes of meteorology, evolutionary biology and palaeontology are composed of innumerable historical events and/or subject to the complexities of chaos theory, and as such are unlikely to provide duplicate circumstances for testing, or even to be capable of being broken down into simplified models that can be accurately tested.
  7. An accepted theory is not necessarily final
    A theory doesn't have to be the absolute end of a quest. For example, Newton's law of universal gravitation had to wait over two centuries for Einstein's general theory of relativity to explain the mechanism behind the phenomenon. Although quantum mechanics is the most accurate theory ever developed (in terms of the match between theory and experimental results), its root cause is yet to be understood, with wildly varying interpretations offered instead. The flip side is that a hypothesis may fit the facts, but without an explanatory mechanism scientists may reject it as untenable. A well-known instance of this scientific conservatism (albeit for good reasons) involved Alfred Wegener's hypothesis of continental drift, which only achieved orthodoxy decades later once plate tectonics supplied the mechanism.
  8. Scientific advance rarely proceeds by eureka moments
    Science is a collaborative effort. Few scientists work in a vacuum (except astronauts, of course!). Even the greatest of 'solo' theories, such as universal gravitation, was on the cards during Newton's lifetime, with contemporaries such as Edmond Halley working along similar lines. Unfortunately, our predilection for simple stories with identifiable heroes means that team leaders and thesis supervisors often receive the credit when many researchers have worked towards a goal. In addition, the priority rule is based on first publication, not on when a scientist formulated the idea, so many theories are named after scientists who were not the earliest discoverers or formulators. The work of unsung researchers is frequently neglected in favour of a simplified account that glorifies one pioneer at the expense of many others.
  9. Science is restricted by the necessity of using language to describe it
    Richard Dawkins has often railed against Plato's idealism (a.k.a. essentialism), using the phrase 'the tyranny of the discontinuous mind'. I recall a prime example of this from childhood, whilst contemplating a plastic model kit I had of a Neanderthal: I wondered how the human race had evolved - specifically, how could parents of a predecessor hominid species give birth to a modern human, i.e. a child of a different species? Of course, such discontinuity is nonsense, but it is surprising how frequently our minds interpret the world in terms of neat boundaries. A large part of the problem is how to treat transitional states as the norm when our language is bound up with intrinsic categories. In addition, we rely on metaphor and analogy to describe aspects of the universe that do not conform to everyday experience, the nature of quantum probability being an obvious example. As with the previous point on our innate need for heroes, we are always constructing narratives, and this restricts our ability to understand nature at a fundamental level.
  10. Science does not include a moral dimension
    Science, like nature, is neither moral nor immoral and cannot provide a framework for human behaviour. Of course, this doesn't prevent scientists from being greedy or stupid, or even just naïve: witness the British evolutionary biologist J.B.S. Haldane, who recommended the use of poison gas as a weapon of war on the grounds that it was more humane than conventional weapons (in terms of the ratio of deaths to temporary incapacitation). This suggests that non-scientists should be involved in the decision-making process for the funding of some science projects, especially those with clear applications in mind. But for this to be tenable, the public needs to be considerably more scientifically literate than at present. Otherwise the appalling scare-mongering engendered by the likes of the British tabloid press - think genetically modified crops labelled as 'Frankenstein foods' - will only make matters far worse. GM crops themselves are a perfect example of why the Hollywood approach of clear-cut heroes and villains fails with most of science. Reality is rarely black or white but requires careful analysis of the myriad shades of grey.
In conclusion, it might be said that there are as many variants of science as there are human beings. Unlike in many other disciplines, mistakes and admissions of ignorance are clear strengths: as Darwin stated in The Descent of Man, 'Ignorance more frequently begets confidence than does knowledge.' Above all, there are aspects of science that are part and parcel of our everyday experience, and as such we shouldn't treat it as something to save for special occasions.

Friday 15 March 2013

Preaching to the unconverted: or how to convey science to the devout

It's said that charity begins at home, and perhaps science communication does too: a recent conversation I had with a pious Mormon started me thinking about just how you promote science, both the method and the uncomfortable facts, to someone who has been raised to mistrust the discipline. Of course, there is a (hopefully) very small segment of the human race that will continue to ignore the evidence even after it is presented right in front of them, but considering those on the front line - such as biology teachers and 'outed' atheists in the U.S. Bible Belt - how do you present a well-reasoned set of arguments to promote the theory and practice of science?

It's relatively easy for the likes of Richard Dawkins to argue his case when he has large audiences of professionals or sympathetic listeners, but what is the best approach when endorsing science to a Biblical literalist on a one-to-one basis? The example above involved explaining just how we know the age of the Earth. This not being the first time I've been asked, I was fully prepared to explain the likes of uranium-series dating, and not having to mention the 'D' words (Darwin or Dawkins) made it a relatively easy task. To aid any fans of science who might find themselves in a similar position, I've put together a small toolkit of ideas, even if the conversation veers into that ultimate of controversial subjects, the evolution of the human race:
  1. A possible starting point is to be diffident, explaining the limitations of science and dispelling the notion that it is merely the catalogue of sundry facts it is sometimes described as (for example, in Bill Bryson's A Short History of Nearly Everything). It is difficult but nonetheless profitable to explain that once-accepted elements of scientific knowledge can be surpassed by later theories yet remain useful as special cases. A good illustration of this is Newton's law of universal gravitation, which describes the force of gravity but not what creates it. Einstein's general theory of relativity provides a solution, but Newton's law is much easier to use and accurate enough even to guide spacecraft. And since general relativity cannot be combined with quantum mechanics, there is probably another theory waiting to be discovered…somewhere. As the British astrophysicist and populariser John Gribbin has often pointed out, elements at the cutting edge of physics are sometimes only describable via metaphor, there being nothing within human experience that can serve as a comparison. Indeed, no-one has ever directly observed a quark, and in the early days of the theory some deemed it just a convenient mathematical model. As for string theory, it's as bizarre as many a creation myth (although you might not want to admit that bit).
  2. Sometimes (as can be seen with Newton and gravity) the 'what' is known whilst the 'why' isn't. Even so, scientists can use partial theories to extrapolate potential 'truths' or even exploit them via technology. Semiconductors require quantum mechanics, a theory that no-one really understands; indeed, no less a figure than Einstein refused to accept many of its implications. There are many competing interpretations, some clearly more absurd than others, but that doesn't stop it being the most successful scientific theory ever in terms of the correspondence between the equations and experimental data. So despite the uncertainty - or should that be Uncertainty (that's a pun, for the quantum mechanically-minded) - the theory is a cornerstone of modern physics.
  3. The stereotype of scientists as wild-haired, lab-coated, dispassionate beings may well stem from the Cold War, when the development of the first civilisation-destroying weapons led many to point their fingers at the inventors rather than their political paymasters. Yet scientists can be as creative as artists. Einstein conducted thought experiments, often aiming for a child-like simplicity, in order to obtain results. The idea that logic alone makes a good scientist is clearly bunkum: hunches and aesthetics can prove as pivotal as experimental data or equations.
  4. Leading on from this, scientists are just as fallible as the rest of us. Famous examples range from Fred Hoyle's belief in the Steady State theory (and, strangely, that the original Archaeopteryx fossils are fakes) through to the British scientific establishment's forty-year failure to recognise that the Piltdown Man finds were crude forgeries. However, it isn't always as straightforward as these examples: Einstein's self-described greatest blunder - the cosmological constant - was abandoned after the expansion of the universe was discovered, only to reappear in recent years in connection with dark energy. And of course mistakes can prove more useful than finding the correct answer the first time!
  5. There are numerous examples of deeply religious scientists, from Kepler and Newton via Gregor Mendel, the founder of genetics, to the contemporary British particle physicist the Reverend John Polkinghorne. Unlike the good versus evil dichotomy promoted by Hollywood movies, it's rarely a case of us versus them.
  6. Although searches continue for final theories, such as a Grand Unified Theory of the fundamental forces, one way in which science today differs profoundly from the attitudes of a century or so ago is the acceptance that a final set of solutions may never be found. Indeed, a good experiment should generate as many new questions as it answers.
  7. If you feel that you're doing well, you could explain how easy it is to be fooled by non-existent patterns and that our brains aren't really geared up for pure logic. It's quite easy to apparently alter statistics using left- or right-skewed graphs, or by putting a logarithmic scale on one axis. In addition, we recognise correlations that just aren't there but which we would like to think are true. In the case of my Mormon colleague, he was entrenched in the notion of UFOs as alien spacecraft! At this point you could even conduct an experiment: make two drawings, one of a constellation and one of evenly-spaced dots, and ask them to identify which one is random (see the sketch after the image below). Chances are they will pick the latter. After all, every culture has seen pictures in the random placements of stars in the night sky (or the face of Jesus in a piece of toast).
Constellation vs random dots: Ursa Major (see what you like) alongside evenly-spaced dots
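For anyone who would rather not draw the two panels freehand, here is a minimal sketch of the same experiment in Python (my choice of language, using the matplotlib plotting library; the point counts and output file name are arbitrary). It simply plots genuinely random points next to an evenly-spaced grid so you can ask the 'which one is random?' question:

```python
# Two panels: genuinely random points vs a tidy, evenly-spaced grid.
# Most people asked "which is random?" tend to pick the grid.
import random
import matplotlib.pyplot as plt

random.seed(42)          # fixed seed so the demo is reproducible
n = 36

# Panel 1: uniformly random points - note the apparent clumps and voids
random_pts = [(random.random(), random.random()) for _ in range(n)]

# Panel 2: a 6 x 6 evenly-spaced grid - looks 'designed', yet matches what
# many people intuitively expect randomness to look like
grid_pts = [((i + 0.5) / 6, (j + 0.5) / 6) for i in range(6) for j in range(6)]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))
for ax, pts, title in [(ax1, random_pts, "Uniformly random"),
                       (ax2, grid_pts, "Evenly spaced")]:
    ax.scatter([x for x, _ in pts], [y for _, y in pts], s=20, color="black")
    ax.set_xticks([])
    ax.set_yticks([])
    ax.set_title(title)
plt.savefig("random_vs_even.png")
```

The clumps and voids in the left-hand panel are exactly what uniform randomness looks like, which is why so many people plump for the tidy grid instead.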

So to sum up:
  1. There's a fuzzy line at the cutting edge of physics and no-one understands what most of it means;
  2. We've barely started answering fundamental questions, and there are probably countless more we don't even know to ask yet;
  3. Science doesn't seek to provide comforting truths, only to gain objective knowledge, but...
  4. ...due to the way our brains function we can never remove all subjectivity from the method;
  5. No one theory is the last word on a subject;
  6. Prominent scientists easily make mistakes;
  7. And most of all, science is a method for finding out about reality, not a collection of carved-in-stone facts.
So go out there and proselytise. I mean evangelise. Err...spread the word. Pass on the message. You get the picture: good luck!

Wednesday 26 September 2012

Moulds, mildew and mushrooms: living cheek by jowl with fungi

There is a form of life that probably exists in every house, office and workplace on the planet (operating theatres and clinical laboratories largely excepted), yet it is so ubiquitous that it goes chiefly unnoticed. These organisms are stationary yet spread rapidly, comprise numerous species - some of them common foodstuffs - and are neither animal nor plant. In other words, they belong to the third great kingdom of macroscopic life: the fungi. But what are these poor relations of the other two groups, seen as both friend and foe?

Having moved last year from a one-hundred-and-thirty-year-old, centrally-heated and double-glazed terrace house in the UK to a single-glazed, largely unheated detached house less than a quarter that age in New Zealand, I've been able to conduct a comparative domestic mycology experiment. Without sounding too much like a mould-and-spores collector out of a P.G. Wodehouse story, the subject has proved interesting and reasonably conclusive: a family of four moving to a climate on average four degrees warmer but with twice the rainfall has not substantially changed the amount or placement of mould in the home; if anything, it has slightly decreased. But then the amount of bathing, laundry and pans on the hob hasn't changed, so perhaps it's not too surprising. The more humid climate has been tempered by having more windows and doors to open, not to mention being able to dry more of the laundry outside. Mind you, one big plus of the move has been not having to use electric dehumidifiers or salt-crystal moisture traps, so a few degrees of warmth seems to be making a difference after all.

There is a wide range of dubious stories, old wives' tales and assorted urban myths regarding fungi, no doubt due to a general lack of knowledge: after all, if you ask most people about the kingdom they will probably think of edible mushrooms followed by poisonous toadstools. Yet of the postulated 1.5 million species of fungi, only about 70,000 have so far been described. They are fundamentally closer to animals than to plants, but as they live off dead organic matter (and some inorganic substances too) and, unlike plants, do not photosynthesise and so thrive in darkness, their reputation is more than a little sinister. The fact that they will grow on just about any damp surface - hence the kitchen and bathroom mould populations - reinforces their image as unwelcome visitors. So just how bad are they?

Firstly, fungi play a vital role in the nitrogen cycle, supplying nutrients to the roots of vegetation. The familiar fruiting bodies are, as Richard Dawkins describes them, pretty much the tip of the iceberg compared to the enormous network of fungal material under the soil. Even so, they are given short shrift in popular natural history and science books: for example, they warrant only five pages in Richard Fortey's Life: An Unauthorised Biography, whilst Bill Bryson's A Short History of Nearly Everything spends much of its four pages on the subject lamenting the lack of knowledge about the number of species. Of my five Stephen Jay Gould volumes, totalling over two thousand pages, there are just a few short paragraphs. And at least one of my books even refers to fungi as a simple form of plant life! Yet we rely on fungi for so many of our staple foodstuffs; it's just that they are so well hidden we don't consider them if they're not labelled as mushrooms. But if you eat leavened bread, mould-ripened cheese or soy sauce, or drink beer or wine, fungi such as yeasts will have been involved somewhere along the line. On another tack, fungi provide yet another nail in the coffin of human uniqueness, since both ants and termites cultivate them: so much for Man the Farmer.

At this point I could start listing their uses in health cures, from traditional Chinese medicine to penicillin, but my intention has been to look at fungi in the home. Anyone who has seen the fantastic BBC television series Planet Earth might recall the parasitic attack of the genus Cordyceps upon insects, and our much larger species is far from immune to attack either. Minor ailments include athlete's foot and ringworm, whilst more serious conditions such as candidemia, arising from the common Candida yeast, can be life-threatening. The spores are so small that there is no way to prevent them entering buildings, with commonly found species including Cladosporium, Aspergillus, and our old friend Penicillium.

Once they have a presence, moulds and mildew are almost impossible to eradicate. They are extremely resilient: the poisons in Amanita species such as the death cap are not destroyed by heat, and an increasingly well-known example is the toxin of the cereal-infecting ergot, capable of surviving the bread-making process, even the baking. Indeed, ergot has seemingly become a major star of the fungal world, being used in pharmaceuticals while at the same time being nominated the culprit behind many an historic riddle, from the Salem witch trials to the abandonment of the Mary Celeste. Again, our lack of knowledge of much of the fungal world means just about anything can be claimed, with only dubious evidence to support it.

A rogue's gallery of household fungi: varieties of domestic mould

Although we are vulnerable to many forms of fungus, an at least equally wide range attacks our buildings. Whether the material is plaster, timber or fabric, moulds and mildew can rapidly spread across most surfaces containing even a hint of dampness, and are often smelt before they are seen. At the very least, occupants of a heavily infested property can suffer allergies, sinus trouble and breathing problems. As an asthmatic I should perhaps be more concerned, but other than keeping windows and doors open as much as possible there doesn't seem to be much that can be done to counter these diminutive foes. As it is, vinegar is a favourite weapon, particularly on shower curtains and the children's plastic bath toys. Even so, constant vigilance is the watchword, as can be seen from the assorted examples from around the house above. For any mycophobes wondering how large fungi can get indoors: I once worked on a feature film shot in a dilapidated Edwardian hotel in central London that was about to be demolished, and its top floor (saturated with damp thanks to holes in the roof) sported fungal growths the size of dinner plates.

So whether you've played with puffballs or like to dine on truffles, remember that fungi are a fundamental element of our homes, our diet and, if we're unlucky, us too. Seemingly humble they may be, but even in our age of advanced technology, there's just no escape...

Saturday 9 January 2010

Quis custodiet ipsos custodes? (Or who validates popular science books?)

Gandhi once said "learn as if you were to live forever", but for the non-scientist interested in gaining accurate scientific knowledge this can prove rather tricky. Several options are available in the UK, most with drawbacks: there are few 'casual' part-time adult science courses (including the Open University); the World Wide Web is useful but inhibits organised, cohesive learning and there's always the danger of being taken in by some complete twaddle; whilst television documentaries and periodicals rarely delve into enough detail. This only leaves the ever-expanding genre of popular science books, with the best examples often including the false starts and failed hypotheses that make science so interesting.

However, there is a problem: if the book includes mistakes, the general reader is unlikely to know any better. I'm not talking about the usual spelling typos but more serious flaws concerning incorrect facts or, worse still, errors of emphasis and misleading information. Admittedly the first category can be quite fun in a 'spot the mistake' sort of way: to have the particle physicists Brian Cox and Jeff Forshaw inform you that there were Muslims in the second century AD, as they do in Why Does E=mc²? (And Why Should We Care?), helps to make the authors a bit more human. After all, why should a physicist also have good historical knowledge? Then again, this is the sort of fact that is extremely easy to verify, so why wasn't it checked in the editing process? You expect Dan Brown's novels to be riddled with scientific errors, but are popular science book editors blind to non-science topics?

Since the above is a historical error, many readers may well spot the mistake, but the general public will often not be aware of inaccuracies relating to scientific facts and theories. Good examples of the latter can be found in Bill Bryson's A Short History of Nearly Everything, the bestselling popular science book in the UK in 2005. As a non-scientist, Bryson admits that it's likely to be full of "inky embarrassments", and he's not wrong. For instance, he makes several references to the DNA base thymine but at one point calls it thiamine, which is actually vitamin B1. However, since Bryson is presenting themed chapters of facts (his vision of science, rather than any explanation of its methods), these are fairly minor issues and don't markedly detract from the substance of the book.

So far that might seem a bit nitpicky, but there are other works containing more fundamental flaws that give a wholly inaccurate description of a scientific technique. My favourite error of this sort can be found in the late Stephen Jay Gould's Questioning the Millennium and is a howler that continues to astonish me more than a decade after first reading it. Gould correctly states that raw radiocarbon dates are expressed as years BP (Before Present) but then posits that this 'present' relates directly to the year of publication of the work containing the date. In other words, if you read a book published in AD 2010 that refers to the date 1010 BP, the latter year is equivalent to AD 1000; whereas for a book published in AD 2000, 1010 BP would equate to AD 990. It's astounding that Gould, who as a palaeontologist presumably had some understanding of other radiometric dating methods, could believe such a system would be workable. The 'present' in the term BP was fixed at AD 1950 decades before Gould's book was published, so it doubly astonishes that no-one questioned his definition. You have to ask whether his editors were so in awe that they were afraid to query his text, or whether his prominence gave him copy-editing control of his own material. A mistake of this sort, in a discipline so close to Gould's area of expertise, can only engender doubt as to the veracity of his other information.
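To make the fixed-reference convention concrete, here is a minimal sketch in Python (the constant and function names are my own, purely for illustration) showing how a raw BP date converts to a calendar year:

```python
# 'Before Present' in radiocarbon dating counts back from a fixed reference
# year, AD 1950 - not from whenever the book quoting the date was published.
BP_REFERENCE_YEAR = 1950

def bp_to_calendar_year(years_bp: int) -> int:
    """Convert an (uncalibrated) radiocarbon age in years BP to a calendar year AD."""
    return BP_REFERENCE_YEAR - years_bp

# 1010 BP is AD 940, whether you read about it in a book from 2000, 2010 or today.
print(bp_to_calendar_year(1010))  # -> 940
```

Under Gould's reading the answer would drift with every new edition, which is precisely why the reference year had to be pinned down in the first place.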

A more dangerous type of error is when an author misleads the readership through personal bias presented as fact. This is particularly important in books dealing with recent scientific developments, as there will be few alternative sources from which the public can glean the information. In turn, this highlights the difference between professionals' peer-reviewed papers and the popularisations available to the rest of us. There is an ever-increasing library of popular books discussing superstrings and M-theory, but most make the same mistake of promoting this highly speculative branch of physics not just as the leading contender in the search for a unified field theory, but as the only option. Of course, a hypothesis that cannot be experimentally verified is not exactly following a central tenet of science anyway. There has been talk in recent years of a 'string theory mafia', so perhaps this is only a natural extension into print; nonetheless it is worrying to see a largely mathematical framework given so much premature attention. I suppose only time will tell...

It also appears that some publishers will accept material from senior but non-mainstream scientists on the basis of the scientist's stature, even if their hypotheses border on pseudoscience. The late Fred Hoyle was a good example of a prominent scientist with a penchant for quirky (some might say bizarre) ideas such as panspermia, who, although unfairly ignored by the Nobel Committee, seems to have had few problems getting his theories into print. Another example is Elaine Morgan, who over nearly four decades has written a string of volumes promoting the aquatic ape hypothesis despite the lack of supporting evidence in an ever-increasing fossil record.

But whereas Hoyle's and Morgan's ideas have long been viewed as off the beaten track, there are more conventional figures whose popular accounts can be extremely misleading, particularly if they promote the writer's pet ideas over the accepted norm. Stephen Jay Gould himself frequently came in for criticism for overemphasising various evolutionary mechanisms at the expense of natural selection, yet his peers' viewpoint is never discussed in his popular writings. Another problem can be seen in Bryan Sykes's The Seven Daughters of Eve, which received enormous publicity on publication because it gratifies our desire to understand human origins. However, the book includes a jumbled combination of extreme speculation and pure fiction, tailored to maximise interest at the expense of clarity. Some critics have argued that the reason behind Sykes's approach was to promote his laboratory's mitochondrial DNA test, capable of revealing which 'daughter' the customer is descended from. Scientists have to make a living like everyone else, but this commercially-driven example perhaps sums up the old adage that you should never believe everything you read. The Catch-22, of course, is that unless you understand enough of the subject beforehand, how will you know whether a popular science book contains errors?

A final example suggests that some science books aimed at a general audience prove just too complex for comprehensive editing by anyone other than the author. I am talking about Roger Penrose's The Road to Reality: A Complete Guide to the Laws of the Universe. At over one thousand pages, this great tome is marketed with the sentence "No particular mathematical knowledge on the part of the reader is assumed", yet I wonder whether the cover blurb writer had their tongue firmly in their cheek. It is supposed to have taken Penrose eight years to write, and from my occasional flick-throughs in bookshops I can see it might take me that long to read, never mind understand. I must confess all those equations haven't really tempted me yet, at least not until I have taken a couple of maths degrees...