
Wednesday 15 September 2021

Life in a rut: if microbes are commonplace, where does that leave intelligent aliens?

A few years ago I wrote about how Mars' seasonal methane fluctuations suggested - although far from confirmed - that microbial life might be present just under the Martian surface. Now another world in our solar system, the Saturnian moon Enceladus, has ignited discussion along similar lines.

The Cassini probe conducted flybys of Enceladus over a decade, revealing that Saturn's sixth largest moon was venting geyser-like jets of material, including water vapour, from its southern polar region. The material emitted from these vents also included organic compounds and methane, hinting that this distant moon's subsurface ocean may contain alien methane-producing microbes. Whereas Titan and Europa were originally deemed the moons most suitable for life, Enceladus's status has now been boosted to second only to Mars, with conditions not dissimilar to those in the oceans of the early Earth.

Of course, unknown geochemical processes cannot be ruled out, but nonetheless the quality of the evidence is such as to invite further exploration of Enceladus. There have been at least seven potential mission designs proposed by various bodies, including NASA and ESA, to gain more information about the moon and its geysers. Several of these include landers, while others would fly through a plume in order to examine the vented material for biosignatures. However, to date none have received official funding confirmation. As it stands, the first probe to arrive might be billionaire Yuri Milner's privately-funded Breakthrough Enceladus, rather than one from a major organisation. Either way, don't hold your breath: the earliest any of these missions is likely to reach Enceladus is at some point in the 2030s.

What happens if future probes find evidence of microbial life on both Mars and Enceladus? Or even, whenever a method is found to reach it, in the ice-covered ocean of Jupiter's moon Europa? The first key question will be whether such organisms are genetically independent of Earth biota, or whether the panspermia hypothesis - the delivery of microbes via cometary and meteorite impacts - has been proven. If panspermia turns out not to be the case and multiple instances of life arose separately within a single solar system, this has profoundly mixed implications for the search for extraterrestrial intelligence (SETI). After all, if simple life can arise and be sustained on three or even four very different worlds - including bodies far outside their solar system's 'Goldilocks zone' - then shouldn't this also imply a much higher chance of complex alien life evolving on exoplanets?

Yet despite various SETI programmes over the past few decades, we have failed to pick up any signs of extraterrestrial intelligence - or at least any from technological civilisations prepared to communicate via radio, whether in our galactic neighbourhood or further away using super high-powered transmitters. This doesn't mean they don't exist: advanced civilisations might, for instance, signal with laser pulses that our current SETI projects are ill-equipped to detect. But nonetheless, it is a little disheartening that we've so far drawn a blank. If there is microbial life on either Mars or Enceladus - or even more so, on both worlds, never mind Europa - then a continued lack of success for SETI suggests the chances of intelligent life evolving are far lower than the probability of life itself arising.

In effect, this means that life we can only view via a microscope - and therefore somewhat lacking in cognitive ability - may turn out to be common, but intelligence a much rarer commodity. While it might be easy to say that life on Enceladus and Mars wouldn't stand much of a chance of gaining complexity, thanks to the unpleasant environmental conditions that have no doubt existed for much of their history, it's clear that Earth's biota evolved via a complex series of unique events. In other words, the tortuous pathways of history have shaped the evolution of life on Earth.

Whereas the discovery of so many exoplanets in the past decade might imply an optimistic result for the Drake equation, the following factors, being largely unpredictable, infrequent or unique occurrences, suggest that the evolution of complex (and especially sapiens-level intelligent) life is highly improbable (see the sketch after this list):

  • The Earth orbits inside the solar system's Goldilocks zone (bear in mind that some of the planets have moved from the region of space in which they formed) and so water was able to exist in liquid form once the atmospheric pressure became high enough.
  • The size and composition of the planet is such that radioactivity keeps the core molten and so generates a magnetic field to block most solar and cosmic radiation.
  • It is hypothesised that the Earth was hit by another body, nicknamed Theia, that both tilted the planet's axis and caused the formation of the Moon rather than having a catastrophic effect such as tearing our world apart, knocking it on its side (like Uranus) or removing its outer crust (like Mercury).
  • The Moon is comparatively large and close to the Earth and as such their combined gravitational fields help to keep Earth in a very stable, only slightly eccentric orbit. This in turn has helped to maintain a relatively life-friendly environment over the aeons.
  • The Earth's axial tilt causes seasons and as such generates a simultaneous variety of climates at different latitudes, providing impetus for natural selection.
  • The Great Unconformity and the hypothesised near-global glaciation (AKA Snowball Earth) that might have caused it suggest that this dramatic period of climate change led to the development of the earliest large multi-cellular life around 580 million years ago.
  • Mass extinctions caused rapid changes in global biota without destroying all life. Without the Chicxulub impact, for example, it is unlikely mammals would have radiated, given the dominance of reptiles on land.
  • Ice ages over the past few million years have caused rapid climate fluctuations that may have contributed to hominin evolution as East African forests gave way to grasslands.
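
To get a feel for how these contingencies play out, here's a minimal sketch of the Drake equation in Python. Every parameter value below is an illustrative assumption rather than a measurement - exoplanet surveys constrain the first three terms reasonably well, but the fractions for life, intelligence and communication remain guesswork. The point is how sharply the answer collapses if intelligence, rather than life itself, is the bottleneck:

```python
# Drake equation: N = R* x fp x ne x fl x fi x fc x L
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime_years):
    """Estimated number of communicating civilisations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime_years

# Scenario 1 - life and intelligence both common (all values are guesses):
common = drake(r_star=1.5,   # new stars per year
               f_p=1.0,      # fraction of stars with planets
               n_e=0.4,      # habitable worlds per planetary system
               f_l=0.9,      # fraction of those where life arises
               f_i=0.1,      # fraction where intelligence evolves
               f_c=0.1,      # fraction producing detectable technology
               lifetime_years=10_000)

# Scenario 2 - life just as common, intelligence a thousand times rarer:
rare = drake(1.5, 1.0, 0.4, 0.9, 0.0001, 0.1, 10_000)

print(f"Intelligence common: {common:.0f} civilisations")  # ~54
print(f"Intelligence rare:   {rare:.2f} civilisations")    # ~0.05
```

With the more pessimistic intelligence term, we would quite probably be alone in the galaxy - entirely consistent with abundant microbes and silent radio telescopes.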

The evolutionary biologist Stephen Jay Gould often discussed 'contingency', claiming that innumerable historical events had led to the evolution of Homo sapiens and therefore that if history could be re-run, most possible paths would not lead to a self-aware ape. So despite the 4,800 or so exoplanets discovered so far, some within their system's Goldilocks zone, what is the likelihood of a similar concatenation of improbable events occurring on any of them?

Most people are understandably not interested in talking to microbes; for a start, they are unlikely to get a meaningful reply. Yet paradoxically, the more worlds on which microbial life is confirmed, the easier it is - when combined with the distinct failure of our SETI research to date - to be pessimistic: while life might be widespread in the universe, organisms large enough to view without a microscope, let alone communicate with across the vast reaches of interstellar space, may be exceedingly rare indeed. The origin of life might be a far more common occurrence than we used to think, but the evolution of technological species far less so. Having said that, we are lucky to live in this time: perhaps research projects in both fields will resolve this fundamental issue within the next half century. Now wouldn't that be amazing?

Tuesday 23 June 2020

Grey matter blues: why has the human brain been shrinking?

There is a disturbing fact about our species that the public don't appear to know and few specialists seem to want to discuss: over recent millennia, the human brain has been shrinking. There have been plenty of non-scientific warnings about the alleged deleterious effects on IQ of first television and more recently smartphones and tablets, but palaeontological evidence proves that over some tens of thousands of years, the Homo sapiens brain has shrunk by somewhere between ten and seventeen percent.

There are usually two key indicators said to provide an accurate measure of smartness: encephalisation quotient and absolute brain size. Encephalisation quotient, or EQ, is the ratio of an animal's actual brain mass to the brain mass predicted for a typical animal of its body size. Overall size is seen as critical due to the number of neural connections required for complex thought processes; you can only squeeze so many neurons into a given volume. Having said that, there is considerable flexibility around this, thanks to variation in neuron density. The reason that some birds, especially the crow and parrot families, are highly intelligent despite the small absolute size of their brains is their higher neural density compared to mammals.
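
As a rough illustration, here is a sketch of the EQ calculation using Jerison's classic allometric baseline for mammals, where the expected brain mass is approximately 0.12 × (body mass)^(2/3), with both masses in grams. The body and brain figures below are round-number assumptions for illustration only:

```python
# Encephalisation quotient: actual brain mass / expected brain mass
# for an animal of that body size (Jerison's mammalian baseline).

def expected_brain_mass_g(body_mass_g: float) -> float:
    """Jerison's expected brain mass for a mammal of a given body mass."""
    return 0.12 * body_mass_g ** (2 / 3)

def eq(brain_mass_g: float, body_mass_g: float) -> float:
    """EQ > 1 means a bigger brain than body size alone predicts."""
    return brain_mass_g / expected_brain_mass_g(body_mass_g)

# Round-number assumptions: a 1,350 g brain in a 65 kg modern human.
print(f"Modern human EQ: {eq(1350, 65_000):.1f}")   # ~7.0
# A house cat, for comparison: ~30 g brain, ~4 kg body.
print(f"Cat EQ:          {eq(30, 4_000):.1f}")      # ~1.0
```

On this measure humans score around seven times the mammalian expectation, which is why a measurable decline in our brain size is so striking.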

Analysis of data from the examination of thousands of human fossil remains suggests that our species reached a peak in EQ around 70,000 years ago, followed by a gradual decline. The reduction in brain size appears to be due to a loss of the archetypal grey matter itself, rather than the white matter that provides support to the neural architecture. However, one key issue is lack of agreement as to a definitive start date for this decline, with 20,000 to 40,000 years ago being the most commonly cited origin. With such basic points remaining unsettled, it's perhaps not surprising that there is a plethora of opinions as to the cause. Here are some of the more popular hypotheses for the decline in human brain size:

1. Change to body size

The first - and perhaps most obvious, but easily refuted - idea is that human body size has been steadily declining and that cranial capacity has simply kept in step. While it is true that archaic sapiens may have had a greater mass and even stature than modern humans, the reduction in brain size is greater than would be expected from the overall shrinkage alone. The assumption is that the development of material culture, from clothing to weapons, has given humans a less calorie-demanding lifestyle.

This would allow - although not dictate - natural selection to trend towards a smaller body size. It doesn't appear to explain the comparatively greater reduction in brain mass, although we should remember that an overall reduction in body size means a smaller birth canal, which in turn requires a smaller skull at birth; as is well known, the human gestation period is three months less than for similar-sized mammals, but our seemingly premature delivery is necessary for the pelvis to maintain efficient bipedalism.

2. Self-domestication

Another idea is that humanity has become domesticated via the impact of culture upon natural selection. Following the population bottleneck of 70,000 years ago - the cause of which is not yet confirmed, despite attempts to correlate it with the Toba super-volcano - there has been continual growth of the human population.

Just as all our domesticated animal species have brain sizes some 10-15% smaller than their wild cousins and ancestors, so the move to larger group sizes may have led to a more docile humanity, with associated traits such as a smaller cranial capacity carried along with it.

There are several issues with this hypothesis, ranging from a lack of data on the size of gatherer-hunter bands to the biological mechanisms involved. As regards the latter, there has been some speculation concerning neoteny, in which a species no longer grows to the final stage of maturity. The idea is that if adults are more aggressive than juveniles but peaceful collaboration can lead to larger groups, mutual aid and longer lifespans, then unintentional selective breeding for the retention of juvenile characteristics, including smaller brains, may cause a shift away from the fully mature but more aggressive individuals.

Research in recent years has suggested our brains may continue growing into our early thirties rather than ceasing in our teens, so it's possible there could be some truth to this; it would be interesting to seek evidence as to whether the brains of archaic sapiens continued growing for longer than ours do.

3. The impact of culture

Taking this a step further, increased population density allows a more rapid development and transmission of new ideas, including those that lead to better health, longer lifespans and so to an increased birth rate. Culture and sophisticated language may have reduced the need for most people to acquire a wide range of skills - and the higher intellectual capability they demand - as tasks could be shared and specialisation take hold. In effect, larger societies provide a safety net for those who would be less able to cope in smaller groups.

If ideas could be handed down, then individuals wouldn't have to continually 'reinvent the wheel' in each generation, allowing survival despite a smaller brain size and decreased level of intelligence. The problem with this scenario is that we have no proof the 10-17% reduction has led to an associated drop in intellect; it may well be that the size of certain lobes, used in specialist thought processes such as formulating complex speech, far outweighs any decline in less critical areas.

4. The expensive big brain

One possibility with a clear cause-and-effect concerns the energy demands of a larger brain. Although it consumes a quarter of our daily calories, the human brain accounts for less than five per cent of our body weight. There could therefore be a case for an evolutionary competition between smaller-brained individuals who can survive on less food and those who use their larger brains to improve food-collecting strategies. Unfortunately, there are so many variables that it's difficult to judge whether the former would continually trend against the latter - and, given that it clearly occurred, why the larger brain managed to evolve in the first place.
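
The scale of that energy imbalance is easy to put into numbers. The figures below - a 2,000 kcal daily intake, a 1.35 kg brain in a 65 kg body - are round-number assumptions for illustration:

```python
# Back-of-envelope comparison: energy cost per kilogram of brain tissue
# versus the rest of the body. All inputs are round-number assumptions.
daily_kcal = 2000.0
brain_share = 0.25                  # the quarter of daily calories cited above
brain_kg, body_kg = 1.35, 65.0

brain_rate = daily_kcal * brain_share / brain_kg                  # kcal/kg/day
rest_rate = daily_kcal * (1 - brain_share) / (body_kg - brain_kg)

print(f"Brain:        {brain_rate:.0f} kcal/kg/day")   # ~370
print(f"Rest of body: {rest_rate:.0f} kcal/kg/day")    # ~24
print(f"Ratio:        {brain_rate / rest_rate:.0f}x")  # ~16x
```

Gram for gram, brain tissue on these assumptions costs roughly sixteen times as much to run as the rest of the body, so even a modest reduction in brain size buys a real saving in food.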

5. The more efficient brain

Although a smaller brain might have fewer neurons than a larger version with similar architecture, it has been suggested that its shorter pathways would lead to more rapid thought processing than in a larger counterpart. In addition, there might be fewer neural pathways, again increasing the efficiency. This 'nimble thinking' approach certainly seems logical, although again it doesn't explain the evolution of larger EQ in archaic sapiens.

This is certainly a subject ripe for much more research. I've often concluded with a statement along the lines that it wouldn't be surprising if some or all of these factors were involved, since nature rarely conforms to the nice, neat patterns we would like to lay upon it. There is even a possibility that brain size - like so many other aspects of all animal species - fluctuates around a mean value, so that what goes up may come down again, only to later go up again.

At least one anthropological study, covering both African Americans and US citizens of European descent, proposes that over the past few hundred years there may have been an upward drift towards larger brains. Assuming the research is accurate, one possibility is that the superior nutrition available since the Industrial Revolution is allowing such development, thanks to the comparative ease with which the brain's energy demands can now be fulfilled.

It would certainly be interesting to investigate this hypothesis on a global scale, considering the wide differences between the clinically obese nations and those still subject to frequent famine. Whatever the results, they are unlikely to be the simple 'just-so' stories often passed off to the public in lieu of accurate but understandable science communication. The answers may be out there somewhere... I'd certainly love to know what's been happening to the most sophisticated object in the known universe!


Tuesday 12 May 2020

Ancestral tales: why we prefer fables to fact for human evolution

It seems that barely a month goes by without a news article concerning human ancestry. In the eight years since I wrote a post on the apparent dearth of funding in hominin palaeontology there appears to have been an uptick in the amount of research in the field. This is all to the good of course, but what is surprising is that much of the non-specialist journalism - and therefore public opinion - is still riddled with fundamental flaws concerning both our origins and evolution in general.

It also seems that our traditional views of humanity's position in the cosmos are often the source of the errors. It's one thing to make such howlers as the BBC News website did some years back, when it claimed chimpanzees were direct human ancestors, but there are a number of key, more subtle errors that are repeated time and again. What's interesting is that in order to explain evolution by natural selection, words and phrases have become imbued with incorrect meaning or, in some cases, just a slight shift of emphasis. Either way, it seems that evolutionary ideas have been tacked onto existing cultural baggage and, in the process, failed to explain the intended theories; personal and socio-political truths have triumphed over objective truth, as Neil deGrasse Tyson might say.

1) As evolutionary biologist Stephen Jay Gould used to constantly point out, the tree of life is like the branches of a bush, not a ladder of linear progression. It's still fairly common to see the phrase 'missing link' applied to our ancestry, among others; I even saw David Attenborough mention it in a TV series about three years ago. A recent news article described - as if in surprise - that there were at least three species of hominins living in Africa within the past few million years, at the same time and in overlapping regions too. Even college textbooks use such phrases - albeit in quotation marks - among a plethora of others that were once deemed valid, so perhaps it isn't surprising that popular publications continue to use them without qualification.

Evolution isn't a simple, one-way journey through space and time from ancestors to descendants: separate but contemporaneous child species can arise via geographical isolation and then migrate to a common location, all while their parent species continues to exist. An example today would be the lesser black-backed and herring gulls of the Arctic circle, which form either a single, variable species or two clearly distinct species, depending on where you look within their range.

It might seem obvious, but species also migrate and their descendants can then return to the ancestral homeland; the earliest apes evolved in Africa and then migrated to south-east Asia, some evolving into the ancestors of gibbons and orangutans while others returned to Africa to become the ancestors of gorillas and chimpanzees. One probable culprit behind the linear progression model is that some of the examples chosen to teach evolution, such as the horse, have few branches in their ancestry, giving the false impression of a ladder in which a descendant species always replaces an earlier one.

2) What defines a species is also much misunderstood. The standard description doesn't do any favours in disentangling human evolution; this is where Richard Dawkins' oft-repeated phrase 'the tyranny of the discontinuous mind' comes into play. Examine a range of diagrams for our family tree and you'll find distinct variations, with certain species sometimes being shown as direct ancestors and sometimes as cousins on extinct branches.

If Homo heidelbergensis is the main root stock of modern humans but some of us have small amounts of Neanderthal and/or Denisovan DNA, then do all three qualify as direct ancestors of modern humans? Just where do you draw the line, bearing in mind every generation could breed with both the one before and the one after? Even with rapid speciation events between long periods of limited variability (A.K.A. punctuated equilibrium) there is no clear cut-off point separating us from them. Yet it's very rare to see Neanderthals labelled as Homo sapiens neanderthalensis and much more common to see them listed as Homo neanderthalensis, implying a wholly separate species.

Are religious beliefs and easy-to-digest just-so stories blinding us to the complex, muddled background of our origins? Obviously, the word 'race' has profoundly negative connotations these days, with old-school classifications of human variation now known to be plain wrong. For example, there's greater genetic variation in the present-day sub-Saharan African population than in the rest of the world combined, thanks to the region being the homeland of all hominin species and the out-of-Africa migrations of modern humans occurring relatively recently.

We should also consider that species can be separated by behaviour, not just obvious physical differences. Something as simple as a difference in the pitch of mating calls separates some frog species, with experiments proving that the animals can be fooled by artificially changing the pitch. Equally, physical similarity doesn't necessarily mean a close evolutionary relationship: humans and all other vertebrates are far closer to spiny sea urchins and knobbly sea cucumbers than they are to any land invertebrates such as the insects.

3) Since the Industrial Revolution, societies - at least in the West - have become obsessed with growth, progress and advance. This bias has clearly affected the popular conception that evolution always leads to improvements, along the lines of faster cheetahs to catch more nimble gazelles and 'survival of the fittest'. Books speak of our epoch as the Age of Mammals, when by most important criteria we live in the era of microbes; just think of the oxygen-generating cyanobacteria. Many diagrams of evolutionary trees place humans on the central axis and/or at the pinnacle, as if we were destined to be the best thing that over three billion years of natural selection could achieve. Of course, this is no better than what many religions have said, whereby humans are the end goal of the creator and the planet is ours to exploit and despoil as we like (let's face it, for a large proportion of our existence, modern Homo sapiens was clearly less well adapted to glacial conditions than the Neanderthals).

Above all, these charts give the impression of a clear direction for evolution, with mammals as the core animal branch. Popular accounts still describe our distant ancestors, the synapsids, as 'mammal-like reptiles', even though they evolved from a common ancestor with reptiles, not from reptiles per se. Even if this is purely due to lazy copying from old sources rather than fact-checking, doesn't it undermine the main point of the publication? Few general-audience articles admit that all of the earliest dinosaurs were bipedal, presumably because we would like to conflate standing on two legs with more intelligent or 'advanced' (a tricky word to use in a strict evolutionary sense) lineages.

The old ladder of fish-amphibian-reptile/bird-mammal still hangs over us and we seem unwilling to admit to extinct groups (technically called clades) that break our neat patterns. Incidentally, for the past 100 million years or so, about half of all vertebrate species have been teleost fish - so much for the Age of Mammals! No-one would describe the immensely successful but long-extinct trilobites as just being 'pill bug-like marine beetles' or similar, yet when it comes to humans, we have a definite sore spot. There is a deep psychological need to have an obvious series of ever-more sophisticated ancestors paving the way for us.

What many people don't realise is that organisms frequently evolve both physical and behavioural attributes that are subsequently lost and possibly later regained. Some have 'devolved' into far simpler forms, frequently becoming parasites. Viruses are themselves a simplified life form, unable to reproduce without a hijacked cell doing the work for them; no-one could accuse them of not being highly successful - as we are currently finding out to our cost. We ourselves are highly adaptable generalists, but on a component-by-component level it would appear that only our brains make us as successful as we are. Let's face it, physically we're not up to much: even cephalopods such as squid and octopus have a form of camera eye superior to that of all vertebrates.

Even a cursory glance at the natural history of life, using scientific disciplines as disparate as palaeontology and comparative DNA analysis, shows that some lineages proved so successful that their outward physiology has changed very little. Today, there are over thirty species of lancelet that are placed at the base of the chordates and therefore closely related to the ancestors of all vertebrates. They are also extremely similar in appearance to 530-million-year-old fossils of the earliest chordates in the Cambrian period. If evolution were a one-way ticket to progress, why have they not long since been replaced by later, more sophisticated organisms?

4) We appear to conflate success simply with being in existence today, yet our species is a newcomer and barely out of the cradle compared to some old-timers. We recently learned that Neanderthals wove plant fibre to make string and ate a wide variety of seafood. This knowledge brings with it a dwindling uniqueness for modern Homo sapiens. The frequently given explanation of our superiority over our extinct cousins is simply that they aren't around anymore, except as minor components of our genome. But this is a tautology: they are inferior because they are extinct and therefore an evolutionary dead end; yet they became extinct because of their inferiority. Hmmm...there's not much science going on here!

The usual story until recently was that at some point (often centred around 40,000-50,000 years ago) archaic sapiens developed modern human behaviour, principally in the form of imaginative, symbolic thinking. This of course ignores the (admittedly tentative) archaeological evidence of Neanderthal cave-painting, jewellery and ritual - the very behaviours supposed to mark our direct ancestors' unique Great Leap Forward (yes, it was named after Chairman Mao's plan). Not only did Neanderthals show this symbolic behaviour, they appear to have developed it independently of genetically modern humans. This is a complete about-turn from the previous position of them being nothing more than poor copyists.

There are alternative hypotheses to the Great Leap Forward, including:
  1. Founder of the Comparative Cognition Project and primate researcher Sarah Boysen observed that chimpanzees can create new methods for problem solving and processing information. Therefore, a gradual accumulation of cognitive abilities and behavioural traits over many millennia - and partially inherited from earlier species - may have reached a tipping point. 
  2. Some geneticists consider there to have been a sudden paradigm shift caused by a mutation of the FOXP2 gene, leading to sophisticated language and all that it entails.
  3. Other researchers consider that once a certain population size and density was achieved, complex interactions between individuals led the way to modern behaviour. 
  4. A better diet, principally in the form of larger amounts of cooked meat, led to increased cognition. 
In some ways, all of these are partly speculative and as is often the case we may eventually find that a combination of these plus other factors were involved. This shouldn't stop us from realising how poor the communication of evolutionary theories still is and how many misconceptions exist, with the complex truth obscured by our need to feel special and to tell simple stories that rarely convey the amazing evolution of life on Earth.



Tuesday 25 February 2020

Falling off the edge: in search of a flat Earth

It's just possible that future historians will label the 21st century the Era of Extreme Stupidity. In addition to the 'Big Four' of climate change denial, disbelief in evolution by natural selection, young Earth creationism and the anti-vaxxers, there are groups whose oddball ideas have rather less impact on our ecosystem and ourselves. One segment of people that I place in the same camp as UFO abductees, with their probing fixation, is believers in a flat Earth.

Although on the surface this admittedly tiny percentage of people appears more amusing than harmful, their media visibility makes them a microcosm of the appalling state of science education and critical thinking in general. In addition, their belief in an immense, long-running global conspiracy adds ammunition to those with similar paranoid delusions, such as the moon landing deniers. As an example of how intense those beliefs can be (at times there's just a whiff of religious fanaticism), the American inventor and stuntman 'Mad' Mike Hughes was recently killed flying a self-built rocket intended to prove that the Earth is a disc.

I won't bother to describe exactly what the flat Earthers take to be true, except that their current beliefs resemble a description of the late, great Terry Pratchett's fantasy Discworld - albeit without the waterfall around the edge of the disc. For anyone who wants to test the hypothesis themselves rather than rely on authority (the mark of a true scientist) there are plenty of observational methods to try. These include:
  1. Viewing the Earth's shadow on the Moon during a lunar eclipse
  2. Noticing that a sailing ship's mast disappears/reappears over the horizon after/before the hull (a quick calculation follows below)
  3. How certain stars are only visible at particular latitudes
For anyone with a sense of adventure, you can also build a high-altitude balloon or undertake a HAHO skydive to photograph the Earth's curvature - from any point on the planet!
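
For test 2, the geometry is simple enough to check yourself: the distance to the horizon for an eye at height h above a sphere of radius R is approximately √(2Rh), ignoring atmospheric refraction. A minimal sketch, with observer and mast heights chosen purely for illustration:

```python
import math

EARTH_RADIUS_M = 6_371_000  # Earth's mean radius in metres

def horizon_distance_m(height_m: float) -> float:
    """Distance to the geometric horizon for an eye at the given height."""
    return math.sqrt(2 * EARTH_RADIUS_M * height_m)

# Eyes 2 m above the waterline put the horizon about 5 km away...
print(f"Observer's horizon: {horizon_distance_m(2) / 1000:.1f} km")   # ~5.0

# ...while a 20 m masthead stays in view out to roughly 5 + 16 = 21 km,
# long after the hull itself has dipped out of sight.
print(f"Masthead visible to: {(horizon_distance_m(2) + horizon_distance_m(20)) / 1000:.1f} km")
```

On a flat Earth, of course, hull and mast would shrink together with distance and there would be no such cut-off.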

It's easy to suggest that perhaps our brains just aren't up to the task of deciphering the intricacies of a 13.7-billion-year-old universe, but basic experiments and observations made over two thousand years ago were enough for Greek scientists to confirm both the shape and size of our planet. So what has changed in the past century or so to turn back the clock, geophysically speaking?

The modern take on a flat Earth seems to have begun in the late 19th century, with an attempt - similar to that of contemporary mid-Western creationists - to ignore scientific discoveries that disagree with a literal interpretation of the Old Testament. Indeed, the forerunners of today's flat Earthers were anti-science in many respects, also denying that prominent enemy of today's Biblical literalists, evolution by natural selection. However, many of the 21st century's leading adherents to a disc-shaped Earth have more sympathy for, and interest in, scientific discoveries, even accepting such politically contentious findings as rapid, human-induced climate change.

This topic is laden with ironies, few greater than the fact that a large proportion of the evidence for global warming is supplied by space agencies such as NASA. The latter has long been claimed by the Flat Earth Society as a leading conspirator and purveyor of faked imagery in the promotion of a spherical Earth (yes, to all pedants, I know that strictly speaking our planet is an oblate spheroid, not purely spherical).

Today's flat Earth societies follow the typical pseudo-scientific / fringe approach, analysing the latest science theories for material they can cherry pick and cannibalise to support their ideas. In recent years they've even tackled key new developments such as dark energy; in fact, about the only area they are lagging behind in is the incorporation of elements involving quantum mechanics.

But for anyone with an understanding of parsimony, or Occam's razor, the physics of a flat Earth has about as much likelihood as Aristotle's crystalline spheres. It isn't just the special pleading for localised astrophysics (since the other planets are deemed spherical); isn't it obviously absurd that there could be a global conspiracy involving rival nations and potentially hundreds of thousands of people - with no obvious explanation of what the conspirators gain from the deception?

Even for the vast majority of the public with little interest in or understanding of the physics, the most puzzling aspect of the flat Earth hypothesis is presumably this apparent lack of motivation. In a nutshell, what's in it for the conspirators? Until recently, NASA (nicknamed 'Never A Straight Answer') was the main enemy, but with numerous other nations and private corporations building space vehicles, there is now a plethora of conspiracy partners. Going back half a century to the height of the Cold War, why, for example, would the USA and the Soviet Union have agreed to conspire? As yet, there hasn't been anything approaching a satisfactory answer; but as Carl Sagan said: "Extraordinary claims require extraordinary evidence."

Unlike most fringe groups, flat Earthers don't appear to favour other popular conspiracy theories above scientific evidence. Yet somehow, their ability to support ludicrous ideas whilst denying fundamental observations and the laws of physics in the light of so much material evidence is astonishing. Of course our species doesn't have a mental architecture geared solely towards rational, methodical thought processes, but the STEM advances that Homo sapiens has made over the millennia prove we are capable of suppressing the chaotic, emotional states we usually associate with young children.

Whether we can transform science education into a cornerstone topic, as daily-relevant as reading, writing and arithmetic, remains to be seen. Meanwhile, the quest continues for funding a voyage to find the Antarctic ice wall that prevents the oceans falling over the edge of the world. Monty Python, anyone?

Monday 30 January 2012

Sell-by date: are old science books still worth reading?

As an outsider to the world of science I've recently been struck by an apparent dichotomy that I don't think I've ever heard discussed: if science is believed by non-practitioners to work on the basis of new theories replacing earlier ones, then are out-of-date popular science (as opposed to text) books a disservice, if not a positive danger, to the field?

I recently read three science books written for a popular audience in succession, the contrast between them serving as the inspiration for this post. The most recently published was Susan Conner and Linda Kitchen's Science's Most Wanted: the top 10 book of outrageous innovators, deadly disasters, and shocking discoveries (2002). Yes, it sounds pretty tacky, but I hereby protest that I wanted to read it as much to find out about the authors and their intended audience as for the subject material itself. Although only a decade old, the book is already out of date, in a similar way that a list of top ten grossing films would be. In this case the book lists different aspects of the scientific method and those involved, looking at issues ranging from collaborative couples (e.g. the Curies) to prominent examples of scientific fraud such as the Chinese fake feathered dinosaur fossil Archaeoraptor.

To some extent the book is a very poor example of the popular science genre, since I found quite a few easily verifiable factual errors. Even so, it proved to be an excellent illustration of how the transmission of knowledge can suffer in a rapidly-changing, pop-cultural society. Whilst the obsession with novelty and the associated transience of ideas may appear to fit the principle that a more recent scientific theory always replaces an earlier one, this is too restrictive a definition of science. The discipline doesn't hold with novelty for the sake of it, nor does an old theory that is largely superseded by a later one prove worthless. A good example of the latter is the interrelationship between Newton's classical law of gravitation (first published in 1687) and Einstein's general relativity (1916), with the former still used most of the time (calculating space probe trajectories, for instance).
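
As a minimal illustration of why the older theory survives in everyday use, the sketch below applies Newton's law to a satellite in low Earth orbit and then estimates the rough scale of the general-relativistic correction (of order GM/rc²). The particular orbit is an assumption chosen purely for illustration:

```python
import math

GM_EARTH = 3.986e14          # Earth's gravitational parameter, m^3/s^2
C = 2.998e8                  # speed of light, m/s
r = 6.371e6 + 400e3          # illustrative 400 km low Earth orbit, metres

# Newtonian circular-orbit speed: v = sqrt(GM/r)
v_newton = math.sqrt(GM_EARTH / r)

# Rough fractional size of general-relativistic effects at this radius
gr_correction = GM_EARTH / (r * C**2)

print(f"Newtonian orbital speed: {v_newton:.0f} m/s")   # ~7670 m/s
print(f"Relative GR correction:  {gr_correction:.1e}")  # ~7e-10
```

With relativistic effects at well under a part in a billion here, Newton's 'superseded' law remains perfectly adequate for most trajectory work, which is precisely the point: an old theory can be outdated without being worthless.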

The second of the three books discusses several different variants of scientific practice, all a far cry from New Zealand-born physicist Ernest Rutherford's crude summary that 'all science is either physics or stamp collecting'. Stephen Jay Gould's first collection of essays, Ever Since Darwin (1977), contains his usual potpourri of scientific theories, observations and historical research. These range from simple corrections of 'facts' - e.g. Darwin was not the original naturalist on HMS Beagle - to why scientific heresy can serve important purposes (consider the much-snubbed Alfred Wegener, who promoted a precursor to plate tectonics long before the evidence was in), through to a warning of how literary flair can promote poor or even pseudo-science to an unwary public (in this instance, Immanuel Velikovsky's now largely forgotten attempts to link Biblical events to interplanetary catastrophes).

Interestingly enough, the latter element surfaced later in Gould's own career, when his 1989 exposition of the Middle Cambrian Burgess Shale fossils, Wonderful Life, was attacked by Richard Dawkins with the exclamation that he wished Gould could think as clearly as he could write! In this particular instance, the attack was part of a wider critique of Gould's theories of evolutionary mechanisms rather than of material superseded by new factual evidence. However, if I'm a typical member of the lay readership, the account of the weird and wonderful creatures largely outweighs the professional arguments. Wonderful Life is still a great read as descriptive natural history and I suppose serves as a reminder that, however authoritative the writer, you shouldn't accept everything at face value. But then that's a good lesson in all subjects!

But back to Ever Since Darwin. I was surprised by just how much of the factual material had dated, in fields as disparate as palaeontology and planetary exploration, over the past thirty-five years. As an example, Essay 24 promotes the idea that the geophysical composition of a planetary body is solely reliant on the body's size, a hypothesis since firmly negated by space probe data. In contrast, it is the historical material that still shines as relevant and, in the generic sense, 'true'. I've mentioned before (link) that Bill Bryson's bestseller A Short History of Nearly Everything promotes the idea that science is a corpus of up-to-date knowledge, not a theoretical framework and methodology of experimental procedures. By so short-changing science, Bryson's attitude could promote the idea that all old material is essentially worthless. Again, the love of novelty, now so ingrained in Western societies, can cause public confusion about the multi-layered discipline known as science.

Of course, this doesn't mean that everything once considered a classic still has great worth, any more than every single building over half a century old is worthy of a preservation order. But just possibly (depending on your level of post-modernism and/or pessimism) any science book that stands the test of time does so because it contains self-evident truths. The final book of the three is a perfect example of this: Charles Darwin's On the Origin of Species, in this case the first edition of 1859. The book shows that Darwin's genius lay in tying together apparently disparate precursors to formulate his theory; in other words, natural selection was already on the thought horizon (as proven by Alfred Russel Wallace's 1858 manuscript). In addition, the distance between publication and today gives us an interesting insight into the scientist as human being, with all the cultural and linguistic baggage we rarely notice in our contemporaries. In some ways Darwin was very much a man of his time, attempting to soften the non-moralistic side of his theory by subtly suggesting that new can equal better, i.e. a form of progressive evolution. For example, he describes extinct South American megafauna as 'anomalous monsters', yet our overly familiar modern horse only survived via Eurasian migration, dying out completely in its native Americas. We can readily assume that had the likes of Toxodon survived but not Equus, the horse would seem equally 'anomalous' today.

Next, Darwin had limited fossil evidence to support him, whilst nineteenth-century physics negated natural selection by not allowing enough time for it to take effect. Of course, if readers know what has been discovered in the field since, they can begin to get an idea of the author's thought processes and indeed world view, and just how comparatively little data he had to work with. For example, Darwin could only speculate about variations in the sterility of hybrids, whilst we now understand that most mules are sterile because their odd chromosome count (63, against the horse's 64 and the donkey's 62) prevents normal pairing during meiosis. Yet this didn't prevent the majority of mid-Victorian biologists from accepting natural selection, an indication that science can be responsive to ideas backed by only circumstantial evidence; this is a very long way indeed from the notion of an assemblage of clear-cut facts laid out in logical succession.

I think it was the physicist and writer Alan Lightman who said: "Science is an ideal but the application of science is subject to the psychological complexities of the humans who practice it." Old science books may frequently be dated from a professional viewpoint but can still prove useful to the layman for at least the following reasons: understanding the personalities, mind-sets and modes of thought of earlier generations; observing how theories within a discipline have evolved as both external evidence and fashionable ideas change; and the realisation that science as a method of understanding the universe is utterly different from all other aspects of humanity. Of course, this is always supposing that the purple prose doesn’t obscure a multitude of scientific sins...

Saturday 20 March 2010

Come all ye faithful: do faith schools threaten British science education?

With the announcement of a New Life Academy in Hull opening later this year the debate over religious education in Britain has become more intense than ever before. Of course we need to take Richard Dawkins' rhetoric with a pinch of salt, but has the current administration allowed or even provided financial support for fundamentalist organisations to infiltrate the British education system at the expense of science and rational thought?

The Hull Academy will follow the Accelerated Christian Education curriculum that amongst other tenets supports the literal truth of the Bible. So how likely is it that the UK will take on aspects of the American Bible Belt, with critical thinking and enquiry subservient to dogma and absolute belief? One of the main criticisms of the ACE system is its reliance on learning by rote, yet at least in their pre-teens, children are shown to benefit from such a system. It appears to do little to quench their thirst for exploration and discovery, which if anything is largely stamped out by an exam-obsessed education system. If all learning is given via rote there is an obvious problem, but in the vast majority of British faith schools this does not seem to be the case.

Alongside the four Emmanuel Schools Foundation academies, the NLA Academy is an easy target for those fearing religious extremism. But outside of Hollywood, the real world is rarely so easy to divide into good and bad. Not only are the ESF schools open to all faiths, but an Ofsted inspection failed to support the allegations of creation science being taught. Even if these institutions were heading towards US-style fundamentalism, linking their techniques to all faith schools would be akin to arguing that the majority of British Jewish children attend the Yiddish-speaking private schools of North London's Stamford Hill orthodox community. Parents who are desperate to indoctrinate their children will take a do-it-yourself approach if they cannot find a school to deliver their requirements.

Many senior religious figures of various faiths, including the Archbishop of Canterbury Dr Rowan Williams, have stated that they do not want creationism taught in schools. If there is any stereotyping in this subject, it is here: most fundamentalists concentrate solely on evolutionary theories - natural selection and its implicit linking of mankind to other animals - rather than on any other branch of science. Although the age of the Earth (and therefore of the universe in general), as well as the sun-centred solar system, is sometimes denied for its disagreement with the Bible and the Koran, there are few extremists prepared to oppose other cornerstones of modern science. Clearly, would-be chemists should feel safe, potential geo- and astrophysicists less so, and those considering a career in evolutionary biology should not move to the American Midwest (or even Hull!).

More seriously, what of more subtle approaches by the mainstream denominations? A 2004 New Statesman article maligned an Anglican school in Canterbury for its attempts to inculcate infants with religious sensibilities via techniques that sounded more like a New Age cult than the Jesuit approach, but since then there has been little in the way of comparable stories. Whether senior figures in the Church of England see faith schools as a way of replenishing their ever-diminishing flock is unknown, but there is no solid evidence for such a master plan. Britain has a long and, let's face it, fairly proud history of ordained ministers who have dabbled in the sciences, although few who could be compared with the Augustinian monk Gregor Mendel, the father of modern genetics. Although T.H. Huxley (A.K.A. Darwin's bulldog) railed against the ordained amateurs, his main bone of contention concerned Anglican privilege: comfortable sinecures allowing vicars to delve into the sciences whilst the lower social orders, including Huxley, had to fight tooth and claw to establish a paid profession.

There are many examples of religiously devout scientists who can be used to defuse the caricatured 'us and them' mentality, perhaps the best-known current British example being the particle physicist the Reverend John Polkinghorne. Organisations such as the International Society for Science and Religion and the Society of Ordained Scientists, both of which claim Polkinghorne as a member, oppose intelligent design from both a faith and a science perspective. Whilst hardline atheists might deem these groups to be trying to both have their wafer and eat it, there is clearly a wide range of attitudes in support of current scientific theories at the expense of a literal belief in religious texts. But then don't most Christians today express a level of belief as varied as the rituals of the numerous denominations themselves, often far short of accepting literal Biblical truth? Believers find their own way, and so it is with scientists who follow conventional belief systems.

However, one potential danger of teaching science in faith schools may be a relic of Darwin's contemporaries (and of course Darwin himself initially aimed for a church career): namely, the well-intentioned attempt to imbue the discipline with a moral structure. Yet as our current level of knowledge clearly shows, bearing in mind everything from natural selection to asteroid impacts, we cannot ally ethical principles to scientific methods or knowledge. Scientific theories can be used for good or evil, but it is about as tenable to link science to ethics or moral development as it is to blame a cat for torturing its prey. Of course children require moral guidance, but it must be nurtured via other routes. Einstein wrote in 1930 of a sense of cosmic religious feeling that has no need for the conventional anthropomorphic deity and that, to my mind, seems more akin to Buddhism. As such he believed that a key role of science (along with art) is to awaken and preserve this numinous-like feeling. I for one consider this as far as science can go along the road to spirituality, but equally agree with Huxley's term agnosticism: to go beyond this in either direction with our current, obviously primitive, state of understanding is sheer arrogance. If we wish to inculcate an open mind in our children, we must first guarantee such a thought system in ourselves. All else is indoctrination, be it religious or secular.

One of the ironies of faith schools, in a nation where two-thirds of secondary school children do not see themselves as religious practitioners, is that they are generally considered to supply a high standard of education and as such are usually oversubscribed. All in all, though, there is little hard evidence to support this notion, since any oversubscribed institution is presumably able to choose a higher calibre of student whilst claiming to the contrary. Current estimates suggest 15% of British children attend faith schools, with a higher proportion in some regions (over 20% of London's secondary school places) but as low as 5% in more rural areas. Clearly, parents who want a good education for their children are not being put off by the worry of potential indoctrination. As has become obvious over the past few years, there are large increases in attendance at school-affiliated churches just prior to the application period: a substantial number of parents are obviously faking faith in return for what they deem to be a superior education.

For the moment it seems science education in Britain has little to worry about from the fundamentalists, at least compared to the divisiveness and homophobia that the National Secular Society deem the most prominent results of increasing faith-based education. We must be careful to ensure that as taxpayers we do not end up funding creationist institutions, but we can do little to prevent private schools following this approach. On a positive note, the closest faith school to me has a higher level of science attainment than its non-religious rivals. I admit that I attended an Anglican school for three years and appear to have emerged with as plural a stance as could be wished for. Indeed, I look back fondly on the days of dangerous chemistry experiments before health and safety-guaranteed virtual demonstrations began to supplant this fun aspect of school science: if you haven't used a burning peanut to blow the lid off a cocoa tin, you haven't lived!
