
Wednesday 15 September 2021

Life in a rut: if microbes are commonplace, where does that leave intelligent aliens?

A few years ago I wrote about how Mars' seasonal methane fluctuations suggested - although far from confirmed - that microbial life might be present just under the Martian surface. Now another world in our solar system, the Saturnian moon Enceladus, has ignited discussion along similar lines.

The Cassini probe conducted flybys of Enceladus over a decade, revealing that Saturn's sixth largest moon was venting geyser-like jets of material, including water vapour, from its southern polar region. The material being emitted from these vents also included organic compounds and methane, hinting that this distant moon's watery oceans may also contain alien methane-producing microbes. Whereas Titan and Europa were originally deemed the moons most suitable for life, Enceladus's status has now been boosted to second only to Mars, with conditions not dissimilar to those in the oceans of the early Earth.

Of course, unknown geochemical processes cannot be ruled out, but nonetheless the quality of the evidence is such as to invite further exploration of Enceladus. At least seven potential mission designs have been proposed by various bodies, including NASA and ESA, to gain more information about the moon and its geysers. Several of these include landers, while others would fly through a plume in order to examine the vented material for biosignatures. However, to date none has received official funding confirmation. As it stands, the first probe to arrive might be billionaire Yuri Milner's privately-funded Breakthrough Enceladus, rather than one from a major organisation. Either way, don't hold your breath: the earliest any of these missions is likely to reach Enceladus is at some point in the 2030s.

What happens if future probes find evidence of microbial life on both Mars and Enceladus? Or even, whenever a method is found to reach it, in the ice-covered oceans of Jupiter's moon Europa? The first key question will be whether they are genetically independent of Earth biota or whether the panspermia hypothesis - the delivery of microbes via cometary and meteorite impacts - has been proven. If panspermia turns out not to be the case and multiple instances of life arose separately within a single solar system, this has some profoundly mixed implications for the search for extraterrestrial intelligence (SETI). After all, if simple life can arise and be sustained on three or even four very different worlds - including bodies far outside their solar system's 'Goldilocks zone' - then shouldn't this also imply a much higher chance of complex alien life evolving on exoplanets?

Yet despite various SETI programmes over the past few decades, we have failed to pick up any signs of extraterrestrial intelligence - or at least any from technological civilisations prepared to communicate via radio, whether in our galactic neighbourhood or via super high-powered transmitters further away. This doesn't mean they don't exist: advanced civilisations might use laser pulses or frequencies that our SETI projects currently lack the ability to detect. But nonetheless, it is a little disheartening that we've so far drawn a blank. If there is microbial life on either Mars or Enceladus - or even more so, on both worlds, never mind Europa - then a continued lack of success for SETI suggests the chances of intelligent life evolving are far lower than the probability of life itself arising.

In effect, this means that life we can only view via a microscope - and therefore somewhat lacking in cognitive ability - may turn out to be common, but intelligence a much rarer commodity. While it might be easy to say that life on both Enceladus and Mars wouldn't stand much of a chance of gaining complexity thanks to the unpleasant environmental conditions that have no doubt existed for much of their history, it's clear that Earth's biota has evolved via a complex series of unique events. In other words, the tortuous pathways of history have influenced the evolution of life on Earth.

Whereas the discovery of so many exoplanets in the past decade might imply an optimistic result for the Drake equation, the following factors, being largely unpredictable, infrequent or unique occurrences, might suggest that the evolution of complex (and especially sapiens-level intelligent) life is highly improbable (see the sketch after this list):

  • The Earth orbits inside the solar system's Goldilocks zone (bear in mind that some of the planets have moved from the regions of space in which they formed) and so water was able to exist in liquid form once the atmospheric pressure became high enough.
  • The size and composition of the planet is such that radioactivity keeps the core molten and so generates a magnetic field to block most solar and cosmic radiation.
  • It is hypothesised that the Earth was hit by another body, nicknamed Theia, that both tilted the planet's axis and caused the formation of the Moon rather than having a catastrophic effect such as tearing our world apart, knocking it on its side (like Uranus) or removing its outer crust (like Mercury).
  • The Moon is comparatively large and close to the Earth, and as such their combined gravitational fields help to keep Earth in a very stable, only slightly eccentric orbit. This in turn has helped to maintain a relatively life-friendly environment over the aeons.
  • The Earth's axial tilt causes seasons and as such generates a simultaneous variety of climates at different latitudes, providing impetus for natural selection.
  • The Great Unconformity and the hypothesised near-global glaciation (AKA Snowball Earth) that might have caused it suggest that this dramatic period of climate change led to the development of the earliest multi-cellular life around 580 million years ago.
  • Mass extinctions caused rapid changes in global biota without destroying all life. Without the Chicxulub impactor, for example, it is unlikely mammals would have radiated, given the dominance of reptiles on land.
  • Ice ages over the past few million years have caused rapid climate fluctuations that may have contributed to hominin evolution as East African forests gave way to grasslands.
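Because the Drake equation is a simple product, a single near-zero term swamps the optimism in all the others - which is exactly the scenario the list above sketches. Here is a minimal Python illustration; every parameter value is an assumption chosen for demonstration (the variable names follow the equation's conventional symbols), not an estimate I'm defending.

```python
# Minimal sketch of the Drake equation. All values are illustrative
# assumptions: the astronomical terms are loosely in line with modern
# exoplanet surveys, the biological and social terms are pure guesses.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* x fp x ne x fl x fi x fc x L: communicating civilisations."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

shared = dict(R_star=1.5, f_p=1.0, n_e=0.4, f_l=0.9, f_c=0.1, L=10_000)

# If intelligence follows readily from microbial life, the galaxy is noisy:
print(f"{drake(f_i=0.1, **shared):.3g}")   # about 54 civilisations
# If it depends on a chain of flukes like the list above, near-silence:
print(f"{drake(f_i=1e-7, **shared):.3g}")  # about 0.000054
```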

The evolutionary biologist Stephen Jay Gould often discussed 'contingency', claiming that innumerable historical events had led to the evolution of Homo sapiens and therefore that if history could be re-run, most possible paths would not lead to a self-aware ape. So despite the 4,800 or so exoplanets discovered so far, some within their system's Goldilocks zone, what is the likelihood that a similar concatenation of improbable events would occur on any of them?

Most people are understandably not interested in talking to microbes; for a start, they are unlikely to gain a meaningful reply. Yet paradoxically, the more worlds microbial life is confirmed on - when combined with the distinct failure of our SETI research to date - the easier it is to be pessimistic: while life might be widespread in the universe, organisms large enough to view without a microscope, let alone communicate with across the vast reaches of interstellar space, may be exceedingly rare indeed. The origin of life might be a far easier occurrence than we used to think, but the evolution of technological species far less so. Having said that, we are lucky to live in this time: perhaps research projects in both fields will resolve this fundamental issue within the next half century. Now wouldn't that be amazing?

Monday 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how disagreements with religious texts can lead the devout to disagree with key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, that is used to promote Creationist ideas rather than scientifically-gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or overly simplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - although only a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. However, even calling the two men co-discoverers is not completely accurate. Darwin's hypothesis was more complete than Wallace's, in the sense of being entirely scientific and therefore testable. Wallace, on the other hand, maintained there must have been divine intervention in the creation of our species, making us different from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection, but who were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life' as he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense, he is a minor figure compared to the thorough research Darwin undertook to back up his hypothesis. 

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said for Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which although highly speculative contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century contained other combinations of ideas and real-world inspiration that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine the inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider it to be superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on the way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to state that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidnas. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals 187 million years ago; therefore, they have spent approximately three times as long on their own evolutionary journey as they did as part of the wider mammalian lineage. As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. But we still include monotremes amongst the mammals rather than as a hybrid or separate class; both platypus and echidna have fur, are warm-blooded and produce milk (although with a unique delivery system!) This allows their inclusion in Mammalia; does this mean we arbitrarily allow certain traits and discard others?

Would it be fair to say that the boundaries we draw between organisms are more for our convenience than reflections of the underlying reality? Are you happy to label birds as 'avian dinosaurs', and if not, why not? With feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was teeth, a bony tail and a crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which, along with the possible discovery of bird-like air sacs, means that they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including the leatherback turtle, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others endothermic (i.e., fully warm-blooded). In this respect, some meat-eating dinosaurs in particular may have had more in common with us mammals than they did with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate the history of life. Science communication to the public is fundamental to our society but it needs to present the awkward complexities of evolution via all the tortured pathways of natural selection if it is not to fall victim to those who prefer myths of the last few thousand years to the history of countless millennia, as revealed in the genes and rocks waiting for us to explore.


Tuesday 23 June 2020

Grey matter blues: why has the human brain been shrinking?

There is a disturbing fact about our species that the public don't appear to know, and few specialists seem to want to discuss: over recent millennia, the human brain has been shrinking. There have been plenty of non-scientific warnings about the alleged deleterious effects on IQ of first television and more recently smartphones and tablets, but palaeontological evidence proves that over some tens of thousands of years, the Homo sapiens brain has shrunk by somewhere between ten and seventeen percent.

There are usually two key indicators said to provide an accurate measure of smartness: encephalisation quotient and absolute brain size. Encephalisation quotient or EQ is the ratio of an animal's actual brain mass to the brain mass predicted for a typical animal of its body size. Overall size is seen as critical due to the number of neural connections required for complex thought processes; you can only squeeze so many neurons into any given volume. Having said that, there is some considerable flexibility around this, thanks to variation in neuron density. The reason that some birds, especially the crow and parrot families, are highly intelligent despite the small absolute size of their brains is their higher neural density compared to mammals.
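As a concrete aside, here is a minimal sketch of how EQ is usually calculated, assuming Jerison's classic allometric baseline (expected brain mass = 0.12 x body mass^(2/3), with masses in grams); both the baseline and the round-number masses below are illustrative choices rather than definitive figures.

```python
# EQ = actual brain mass / brain mass expected for the body size.
# The 0.12 and 2/3 below are Jerison's commonly cited 1973 values;
# other baselines give somewhat different numbers.

def eq(brain_g: float, body_g: float) -> float:
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

# Round illustrative masses, not precise measurements:
print(f"modern human: {eq(1350, 65_000):.1f}")  # about 7
print(f"chimpanzee:   {eq(400, 45_000):.1f}")   # about 2.6
```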

Analysis of data from the examination of thousands of human fossil remains suggests that our species reached a peak in EQ around 70,000 years ago, followed by a gradual decline. The reduction in brain size appears to be due to a loss of the archetypal grey matter itself, rather than the white matter that provides support to the neural architecture. However, one key issue is lack of agreement as to a definitive start date for this decline, with 20,000 to 40,000 years ago being the most commonly cited origin. With such basic points remaining unsettled, it's perhaps not surprising that there is a plethora of opinions as to the cause. Here are some of the more popular hypotheses for the decline in human brain size:

1. Change to body size

The first and perhaps most obvious - but easily refuted - idea is that human body size has been steadily declining and that cranial capacity has simply kept in step. While it is true that archaic sapiens may have had a greater mass and even stature than modern humans, the reduction in brain size is greater than would be expected from the overall shrinkage alone. The assumption is that the development of material culture, from clothing to weapons, has given humans a less calorie-demanding lifestyle.

This would allow - although not dictate - natural selection to trend towards a smaller body size. This doesn't appear to offer any help with the comparatively greater reduction in brain mass, although we should remember that an overall reduction in body size means a smaller birth canal. This in turn requires a smaller skull at birth; as is well known, the human gestation period is three months less than that of similar-sized mammals, but our seemingly premature delivery is necessary if the pelvis is to maintain efficient bipedalism.

2. Self-domestication

Another idea is that humanity has become domesticated via the impact of culture upon natural selection. Following the population bottleneck of 70,000 years ago - the cause of which is not yet confirmed, despite attempts to correlate it with the Toba super-volcano - there has been continual growth of the human population.

Just as all our domesticated animal species have brain sizes some 10-15% smaller than their wild cousins and ancestors, so the move to larger group sizes may have led to a more docile humanity, with associated traits such as a smaller cranial capacity being carried along with it.

There are several issues with this hypothesis, ranging from a lack of data on the size of gatherer-hunter bands to the biological mechanisms involved. As regards the latter, there has been some speculation concerning neoteny, in which a species no longer grows to the final stage of maturity. The idea is that if adults are more aggressive than juveniles but peaceful collaboration can lead to larger groups, mutual aid and longer lifespans, then unintentional selective breeding for the retention of juvenile characteristics, including smaller brains, may cause a shift away from the fully mature but more aggressive individuals.

Research in recent years has suggested our brains may continue growing into our early thirties rather than cease growing in our teens, so it's possible there could be some truth to this; it would be interesting to seek evidence as to whether the brains of archaic sapiens continued growing for longer than ours do.

3. The impact of culture

Taking this a step further, increased population density allows a more rapid development and transmission of new ideas, including those that lead to better health, longer lifespans and so to an increased birth rate. Culture and sophisticated language may have reduced the need for most people to gain a wide range of skills - courtesy of a higher intellectual capability - as tasks could be shared and specialisation could take hold. In effect, larger societies provide a safety net for those who would be less able to cope in smaller groups.

If ideas could be handed down, then individuals wouldn't have to continually 'reinvent the wheel' in each generation, allowing survival despite a smaller brain size and decreased level of intelligence. The problem with this scenario is that we have no proof the 10-17% reduction has led to an associated drop in intellect; it may well be that the size of certain lobes, used in specialist thought processes such as formulating complex speech, far outweighs any decline in less critical areas.

4. The expensive big brain

One possibility with a clear cause-and-effect concerns the energy demands of a larger brain. Although it consumes a quarter of our daily calories, the human brain accounts for less than five per cent of our body weight. There could therefore be a case for arguing the existence of an evolutionary competition between smaller-brained individuals who can survive on less food and those who use their larger brains to improve food-collecting strategies. Unfortunately, there are so many variables that it's difficult to judge whether the former would continually win out over the latter and - considering it clearly occurred - how the larger brain managed to evolve in the first place.
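Some back-of-envelope arithmetic shows just how expensive the organ is; the inputs here are illustrative round figures (a 70 kg body, a 1.4 kg brain, a 2,000 kcal daily budget), not clinical data.

```python
# Rough cost of running a brain, gram for gram, versus the rest of the body.
body_kg, brain_kg = 70.0, 1.4           # brain is roughly 2% of body mass
daily_kcal, brain_share = 2000.0, 0.25  # yet uses about a quarter of the energy

brain_kcal = daily_kcal * brain_share
per_kg_brain = brain_kcal / brain_kg
per_kg_rest = (daily_kcal - brain_kcal) / (body_kg - brain_kg)
print(f"brain: {per_kg_brain:.0f} kcal/kg/day, rest: {per_kg_rest:.0f}")
# Roughly 357 vs 22: over fifteen times the running cost per kilogram.
```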

5. The more efficient brain

Although a smaller brain might have fewer neurons than a larger version with similar architecture, it has been suggested that its shorter pathways would lead to more rapid thought processing than in a larger counterpart. In addition, there might be fewer neural pathways, again increasing efficiency. This 'nimble thinking' approach certainly seems logical, although again it doesn't explain the evolution of the larger EQ of archaic sapiens.

This is certainly a subject ripe for much more research. I've often concluded with a statement along the lines that it wouldn't be surprising if some or all of these factors were involved, since nature rarely conforms to the nice, neat patterns we would like to lay upon it. There is even a possibility that brain size - like so many other aspects of all animal species - fluctuates around a mean value, so that what goes up may come down again, only to later go up again.

At least one anthropological study of both African Americans and US citizens of European descent proposes that over the past few hundred years there may have been an upward drift towards larger brains. Assuming the research is accurate, one possibility is that the superior nutrition available since the Industrial Revolution is allowing such development, thanks to the comparative ease with which its energy demands can be fulfilled.

It would certainly be interesting to investigate this hypothesis on a global scale, considering the wide differences between the clinically obese nations and those still subject to frequent famine. Whatever the results, they are unlikely to be the simple 'just-so' stories often passed off to the public in lieu of accurate but understandable science communication. The answers may be out there somewhere...I'd certainly love to know what's been happening to the most sophisticated object in the known universe!


Tuesday 12 May 2020

Ancestral tales: why we prefer fables to fact for human evolution

It seems that barely a month goes by without there being a news article concerning human ancestry. In the eight years since I wrote a post on the apparent dearth of funding in hominin palaeontology there appears to have been some uptick in the amount of research in the field. This is all to the good of course, but what is surprising is that much of the non-specialist journalism - and therefore public opinion - is still riddled with fundamental flaws concerning both our origins and evolution in general.

It also seems that our traditional views of humanity's position in the cosmos are often the source of the errors. It's one thing to make such howlers as the BBC News website did some years back, when it claimed chimpanzees were direct human ancestors, but there are a number of subtler errors that are repeated time and again. What's interesting is that in order to explain evolution by natural selection, words and phrases have become imbued with incorrect meaning or, in some cases, just a slight shift of emphasis. Either way, it seems that evolutionary ideas have been tacked onto existing cultural baggage and in the process failed to explain the intended theories; personal and socio-political truths have triumphed over objective truth, as Neil deGrasse Tyson might say.

1) As evolutionary biologist Stephen Jay Gould used to point out constantly, the tree of life is like the branches of a bush, not a ladder of linear progression. It's still fairly common to see the phrase 'missing link' applied to our ancestry, among others; I even saw David Attenborough mention it in a TV series about three years ago. A recent news article reported - as if in surprise - that there were at least three species of hominins living in Africa during the past few million years, at the same time and in overlapping regions too. Even college textbooks use the phrase - albeit in quotation marks - among a plethora of other terms that were once valid, so perhaps it isn't surprising that popular publications continue to use them without qualification.

Evolution isn't a simple, one-way journey through space and time from ancestors to descendants: separate but contemporaneous child species can arise via geographical isolation and then migrate to a common location, all while their parent species continues to exist. An example today would be the lesser black-backed and herring gulls of the Arctic circle, which constitute either a single, variable species or two clearly distinct species, depending on where you look within their range.

It might seem obvious, but species also migrate and their descendants may later return to the ancestral homeland; the earliest apes evolved in Africa and then migrated to south-east Asia, some evolving into the ancestors of gibbons and orangutans while others returned to Africa to become the ancestors of gorillas and chimpanzees. One probable culprit for the linear progression model is that some of the examples chosen to teach evolution, such as the horse, have few branches in their ancestry, giving the false impression of a ladder in which a descendant species always replaces an earlier one.

2) What defines a species is also much misunderstood. The standard description doesn't do any favours in disentangling human evolution; this is where Richard Dawkins' oft-repeated phrase 'the tyranny of the discontinuous mind' comes into play. Examine a range of diagrams for our family tree and you'll find distinct variations, with certain species sometimes being shown as direct ancestors and sometimes as cousins on extinct branches.

If Homo heidelbergensis is the main root stock of modern humans but some of us have small amounts of Neanderthal and/or Denisovan DNA, then do all three qualify as direct ancestors of modern humans? Just where do you draw the line, bearing in mind that every generation could breed with both the one before and the one after? Even with rapid speciation events between long periods of limited variability (A.K.A. punctuated equilibrium) there is no clear cut-off point separating us from them. Yet it's very rare to see Neanderthals labelled as Homo sapiens neanderthalensis and much more common to see them listed as Homo neanderthalensis, implying a wholly separate species.

Are religious beliefs and easy-to-digest just-so stories blinding us to the complex, muddled background of our origins? Obviously, the word 'race' has profoundly negative connotations these days, with old-school categorisations of human variation now known to be plain wrong. For example, there's greater genetic variation in the present-day sub-Saharan African population than in the rest of the world combined, thanks to the continent being the homeland of all hominin species and the out-of-Africa migrations of modern humans occurring relatively recently.

We should also consider that species can be separated by behaviour, not just obvious physical differences. Something as simple as the different pitches of mating calls separate some frog species, with scientific experiments proving that the animals can be fooled by artificially changing the pitch. Also, just because species appear physically similar doesn't necessarily mean a close evolutionary relationship: humans and all other vertebrates are far closer to spiny sea urchins and knobbly sea cucumbers than they are to any land invertebrates such as the insects.

3) Since the Industrial Revolution, societies - at least in the West - have become obsessed with growth, progress and advance. This bias has clearly affected the popular conception that evolution always leads to improvements, along the lines of faster cheetahs to catch more nimble gazelles and 'survival of the fittest'. Books speak of our epoch as the Age of Mammals, when by most important criteria we live in the era of microbes; just think of the oxygen-generating cyanobacteria. Many diagrams of evolutionary trees place humans on the central axis and/or at the pinnacle, as if we were destined to be the best thing that over three billion years of natural selection could achieve. Of course, this is no better than what many religions have said, whereby humans are the end goal of the creator and the planet is ours to exploit and despoil as we like (let's face it, for a large proportion of our existence, modern Homo sapiens was clearly less well adapted to glacial conditions than the Neanderthals).

Above all, these charts give the impression of a clear direction for evolution with mammals as the core animal branch. Popular accounts still describe our distant ancestors, the synapsids, as the 'mammal-like reptiles', even though they evolved from a common ancestor of reptiles, not from reptiles per se. Even if this is purely due to lazy copying from old sources rather than fact-checking, doesn't it belie the main point of the publication? Few general-audience articles admit that all of the earliest dinosaurs were bipedal, presumably because we would like to conflate standing on two legs with more intelligent or 'advanced' (a tricky word to use in a strict evolutionary sense) lineages.

The old ladder of fish-amphibian-reptile/bird-mammal still hangs over us and we seem unwilling to admit to extinct groups (technically called clades) that break our neat patterns. Incidentally, for the past 100 million years or so, about half of all vertebrate species have been teleost fish - so much for the Age of Mammals! No-one would describe the immensely successful but long-extinct trilobites as just being 'pill bug-like marine beetles' or similar, yet when it comes to humans, we have a definite sore spot. There is a deep psychological need to have an obvious series of ever-more sophisticated ancestors paving the way for us.

What many people don't realise is that organisms frequently evolve both physical and behavioural attributes that are subsequently lost and possibly later regained. Some have devolved into far simpler forms, frequently becoming parasites. Viruses are themselves a simplified life form, unable to reproduce without a hijacked cell doing the work for them; no-one could accuse them of not being highly successful - as we are currently finding out to our cost. We ourselves are highly adaptable generalists, but on a component-by-component level it would appear that only our brains make us as successful as we are. Let's face it, physically we're not up to much: even cephalopods such as squid and octopus have a form of camera eye that is superior to that of all vertebrates.

Even a cursory glance at the natural history of life, using scientific disciplines as disparate as palaeontology and comparative DNA analysis, shows that some lineages proved so successful that their outward physiology has changed very little. Today, there are over thirty species of lancelet that are placed at the base of the chordates and therefore closely related to the ancestors of all vertebrates. They are also extremely similar in appearance to 530-million-year-old fossils of the earliest chordates in the Cambrian period. If evolution were a one-way ticket to progress, why have they not long since been replaced by later, more sophisticated organisms?

4) We appear to conflate success simply with being in existence today, yet our species is a newcomer and barely out of the cradle compared to some old-timers. We recently learned that Neanderthals wove plant fibre to make string and ate a wide variety of seafood. This knowledge brings with it a dwindling uniqueness for modern Homo sapiens. The frequently given explanation of our superiority over our extinct cousins is simply that they aren't around anymore, except as minor components of our genome. But this is a tautology: they are inferior because they are extinct and therefore an evolutionary dead end; yet they became extinct because of their inferiority. Hmmm...there's not much science going on here!

The usual story until recently was that at some point (often centred around 40,000-50,000 years ago) archaic sapiens developed modern human behaviour, principally in the form of imaginative, symbolic thinking. This of course ignores the (admittedly tentative) archaeological evidence of Neanderthal cave-painting, jewellery and ritual - the very traits supposed to be evidence of our direct ancestors' unique Great Leap Forward (yes, it was named after Chairman Mao's plan). Not only did Neanderthals display this symbolic behaviour, they appear to have developed it independently of genetically-modern humans. This is a complete about-turn from the previous position of them being nothing more than poor copyists.

There are alternative hypotheses to the Great Leap Forward, including:
  1. Founder of the Comparative Cognition Project and primate researcher Sarah Boysen observed that chimpanzees can create new methods for problem solving and processing information. Therefore, a gradual accumulation of cognitive abilities and behavioural traits over many millennia - and partially inherited from earlier species - may have reached a tipping point. 
  2. Some geneticists consider there to have been a sudden paradigm shift caused by a mutation of the FOXP2 gene, leading to sophisticated language and all that it entails.
  3. Other researchers consider that once a certain population size and density was achieved, complex interactions between individuals led the way to modern behaviour. 
  4. A better diet, principally in the form of larger amounts of cooked meat, led to increased cognition. 
In some ways, all of these are partly speculative and as is often the case we may eventually find that a combination of these plus other factors were involved. This shouldn't stop us from realising how poor the communication of evolutionary theories still is and how many misconceptions exist, with the complex truth obscured by our need to feel special and to tell simple stories that rarely convey the amazing evolution of life on Earth.



Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot-desking environment, I frequently remind colleagues not to overdo the use of antibacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use at work, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot-desking, so something is clearly going wrong, regardless of approach!
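To see how quickly partial killing selects for resistance, here is a toy simulation; the kill rates, population cap and starting frequency are all invented for illustration, not drawn from any study.

```python
# Toy model: each cleaning kills 99% of susceptible bacteria but only 50%
# of a rare resistant strain; survivors then regrow to the same cap,
# keeping their relative proportions. All rates are invented.

def resistant_fraction(cleanings: int, pop_cap: int = 1_000_000) -> float:
    susceptible, resistant = pop_cap - 10, 10  # resistance starts very rare
    for _ in range(cleanings):
        susceptible = int(susceptible * 0.01)  # spray kills 99% of these...
        resistant = int(resistant * 0.50)      # ...but only half of these
        total = susceptible + resistant
        susceptible = int(pop_cap * susceptible / total)  # regrow to the cap
        resistant = pop_cap - susceptible
    return resistant / pop_cap

for n in (1, 3, 5, 10):
    print(f"after {n:2d} cleanings: {resistant_fraction(n):.2%} resistant")
# The resistant strain goes from 0.001% to near-total dominance in a handful
# of cleaning cycles, despite never being introduced from outside.
```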

The best known of the super bugs has to be Methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks unconnected with healthcare environments. Therefore popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter community-associated or CA-MRSA is therefore at least as great a risk as the hospital variant, often affecting younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to totally eradicate by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice I might add, to anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be routinely dosed against infection - frankly, I don't mind paying more for free-range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can be used to promote growth implies that profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics, four in 1989 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive method would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough anymore; after all, if medical practice wants to prevent the spread of resistant bacteria, it needs to minimise the use of antibiotics.

The effects agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its antibacterial properties - guess that proves I'm not a research chemist, then!) In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Monday 30 January 2012

Sell-by date: are old science books still worth reading?

As an outsider to the world of science I've recently been struck by an apparent dichotomy that I don't think I've ever heard discussed: if science is believed by non-practitioners to work on the basis of new theories replacing earlier ones, then are out-of-date popular science (as opposed to text) books a disservice, if not a positive danger, to the field?

I recently read three science books written for a popular audience in succession, the contrast between them serving as the inspiration for this post. The most recently published was Susan Conner and Linda Kitchen's Science's Most Wanted: the top 10 book of outrageous innovators, deadly disasters, and shocking discoveries (2002). Yes, it sounds pretty tacky, but I hereby protest that I wanted to read it as much to find out about the authors and their intended audience as for the subject material itself. Although only a decade old, the book is already out of date, in a similar way that a list of top ten grossing films would be. In this case the book lists different aspects of the scientific method and those involved, looking at issues ranging from collaborative couples (e.g. the Curies) to prominent examples of scientific fraud such as the Chinese fake feathered dinosaur fossil Archaeoraptor.

To some extent the book is a very poor example of the popular science genre, since I found quite a few errors of easily verifiable fact. Even so, it proved to be an excellent illustration of how the transmission of knowledge can suffer in a rapidly-changing, pop-cultural society. Whilst the obsession with novelty and the associated transience of ideas may appear to fit the principle that a more recent scientific theory always replaces an earlier one, this is too restrictive a definition of science. The discipline doesn't hold with novelty for the sake of it, nor does an old theory that is largely superseded by a later one prove worthless. A good example of the latter is the interrelationship between Newton's classical law of gravitation (first published in 1687) and Einstein's general relativity (1916), with the former still used most of the time (calculating space probe trajectories and the like).

The second of the three books discusses several different variants of scientific practice, all far removed from New Zealand particle physicist Ernest Rutherford's crude summary that "physics is the only real science. The rest are just stamp collecting." Stephen Jay Gould's first collection of essays, Ever Since Darwin (1977), contains his usual potpourri of scientific theories, observations and historical research. These range from simple corrections of 'facts' - e.g. Darwin was not the original naturalist on HMS Beagle - to why scientific heresy can serve important purposes (consider the much-snubbed Alfred Wegener, who promoted a precursor to plate tectonics long before the evidence was in), through to a warning of how literary flair can promote poor or even pseudo-science to an unwary public (in this instance, Immanuel Velikovsky's now largely forgotten attempts to link Biblical events to interplanetary catastrophes).

Interestingly enough, the latter element surfaced later in Gould's own career, when his 1989 exposition of the Early Cambrian Burgess Shale fossils, Wonderful Life, was attacked by Richard Dawkins with the exclamation that he wished Gould could think as clearly as he could write! In this particular instance the attack was part of a wider critique of Gould's theories of evolutionary mechanisms, rather than of material being superseded by new factual evidence. However, if I'm a typical member of the lay readership, the account of the weird and wonderful creatures largely outweighs the professional arguments. Wonderful Life is still a great read as descriptive natural history, and I suppose it serves as a reminder that however authoritative the writer, don't accept everything at face value. But then that's a good lesson in all subjects!

But back to Ever Since Darwin. I was surprised by just how much of the factual material had dated, in fields as disparate as palaeontology and planetary exploration, over the past thirty-five years. As an example, Essay 24 promotes the idea that the geophysical composition of a planetary body is solely reliant on the body's size, a hypothesis since firmly negated by space probe data. In contrast, it is the historical material that still shines as relevant and, in the generic sense, 'true'. I've mentioned before that Bill Bryson's bestseller A Short History of Nearly Everything promotes the idea that science is a corpus of up-to-date knowledge, not a theoretical framework and methodology of experimental procedures. But by so short-changing science, Bryson's attitude could promote the idea that all old material is essentially worthless. Again, the love of novelty, now so ingrained in Western societies, can cause public confusion in the multi-layered discipline known as science.

Of course, this doesn't mean that something once considered a classic still has great worth, any more than every single building over half a century old is worthy of a preservation order. But just possibly (depending on your level of post-modernism and/or pessimism) any science book that stands the test of time does so because it contains self-evident truths. The final book of the three is a perfect example of this: Charles Darwin's On the Origin of Species, in this case the first edition of 1859. The book shows that Darwin's genius lay in tying together apparently disparate precursors to formulate his theory; in other words, natural selection was already on the thought horizon (as proven by Alfred Russel Wallace's 1858 manuscript). In addition, the distance between publication and today gives us an interesting insight into the scientist as human being, with all the cultural and linguistic baggage we rarely notice in our contemporaries. In some ways Darwin was very much a man of his time, attempting to soften the non-moralistic side of his theory by subtly suggesting that new can equal better, i.e. a form of progressive evolution. For example, he describes extinct South American megafauna as 'anomalous monsters', yet our overly familiar modern horse only survived via Eurasian migration, dying out completely in its native Americas. We can readily assume that had the likes of Toxodon survived but not Equus, the horse would seem equally 'anomalous' today.

Next, Darwin had limited fossil evidence to support him, whilst nineteenth-century physics negated natural selection by not allowing enough time for the theory to have taken effect. Of course, if readers know what has been discovered in the same field since, they can begin to get an idea of the author's thought processes and indeed world view, and just how comparatively little data he had to work with. For example, Darwin speculates about variations in the sterility of hybrids, whilst we now understand that most mules are sterile because of chromosomal issues. Yet this didn't prevent the majority of mid-Victorian biologists from accepting natural selection, an indication that science can be responsive to ideas backed by only circumstantial evidence; this is a very long way indeed from the notion of an assemblage of clear-cut facts laid out in logical succession.

I think it was the physicist and writer Alan Lightman who said: "Science is an ideal but the application of science is subject to the psychological complexities of the humans who practice it." Old science books may frequently be dated from a professional viewpoint but can still prove useful to the layman for at least the following reasons: understanding the personalities, mind-sets and modes of thought of earlier generations; observing how theories within a discipline have evolved as both external evidence and fashionable ideas change; and the realisation that science as a method of understanding the universe is utterly different from all other aspects of humanity. Of course, this is always supposing that the purple prose doesn’t obscure a multitude of scientific sins...

Saturday 25 June 2011

Amazed rats and super squirrels: urban animal adaptations

If I were the gambling sort I might be tempted to bet that most of the large fauna in my neighbourhood was, like much of London's, restricted to very few species: namely feral pigeons, rats, mice and foxes. The most interesting visitor to my garden is, judging by the size, a female common toad - the wondrously named Bufo bufo - which makes an appearance every couple of years to feast on snails and leave a shell midden behind.

After spotting a small flock of Indian ring-necked parakeets in our local park, I decided to look at the adaptations wildlife has undergone whilst living in an urban environment. After intermittently researching this topic over a month or so, I was surprised to find the BBC Science News website posting an article along similar lines. Synchronicity? I decided to plough ahead, since the subject is too interesting to abandon and I've got my very own experimental data as well, although it's hardly 'laboratory conditions' material.

Your friendly neighbourhood Bufo bufo
It's easy to see why animals are attracted to cities: the ever-present food scraps; the warmer microclimate; and of course plenty of places to use for shelter (my nickname for railway embankments is 'rodent condominiums'). Even the mortar in walls seems to offer smaller birds a mineral supplement (calcium carbonate) and/or mini-gastroliths (A.K.A. stomach grit), judging by the way they peck at it. Then there are the plentiful sources of fresh water, which in my neighbourhood range from birdbaths and guttering to streams and reservoirs. Who can blame animals for coming in from the cold? In the case of the London fox they have been arriving since the 1930s, whilst rodents were probably rubbing their paws together in glee as the first cities were being built many millennia ago in the Fertile Crescent.

There seem to be several obvious behavioural changes that result from urban adaptation, particularly when it comes to judging humans. I have found an astonishing lack of wariness in mice, squirrels and foxes, even in daylight, although rats are usually more circumspect. There are an increasing number of stories concerning foxes biting sleeping humans, including adults, even during the day. I was informed by a Clapham resident of how, having chased a noisy fox down the street at night, it then followed him back to his house, only stopping at the garden gate. Clearly there is some understanding of territorial boundaries here, too. This is supported by the behaviour of foxes in my area, which will happily chase cats in the local allotments even during the day, but once the cat emerges onto the street, the fox doesn't follow. Perhaps they have some understanding of the connection between cats and humans?

City fauna has become more opportunist, prepared to scavenge meals from the enormous range of foodstuffs available in an urban environment, which around my area seems mostly to consist of fried chicken carcasses, usually still in the box. Even birds of prey such as the Red Kite (no small fry, with a wingspan of up to one and three-quarter metres) have recently been seen taking food off unwary children. This follows a period of finding food deliberately left out for them, so an association forms between people and food. This is then a two-way connection, with humans helping to generate changes in urban fauna by their own actions. Less time spent foraging means urban animals expend less physical energy, so there may be a feedback loop at work here: if surplus energy can aid higher cognition, discrimination of humans and the urban environment increases, and thus even less time is required to source food. A facile conclusion perhaps, but read on for a possible real-life example.

My own experiments on grey squirrels took place about ten years ago, probably at least partially inspired by a television lager advertisement. It started when I found that my bird feeder was being misappropriated by a couple of squirrels. My first idea was to add radial spikes around the bird feeder using garden canes, but the squirrels were more nimble than I had thought, so after adding more and more spikes to create an object reminiscent of the Spanish Inquisition, I had to change tack. I next suspended the bird feeder on the end of a long rod that was too thin for the squirrels to climb on, but they managed to dislodge it at the wall end, causing it to drop to the ground for easy consumption. Rounds one and two to the pesky Sciurus carolinensis. My final design was a combination of spikes on the approach to the rod, the rod itself, then the feeder suspended from a long wire at the end of rod. I went off to work with an air of smug satisfaction that no mere rodent was going to get the better of me, only to find on my return that somehow the squirrels had leapt onto the rod and eaten through the wire!

One point to consider is that the bird food itself was in a transparent perspex tube, which is totally unlike any natural material. So when it comes down to it, are some animals, at least mammals and birds, over-endowed with grey matter when it comes to their usual environment, only utilising more of their potential when faced with artificial materials? Or do the challenges and rewards of being an urban sophisticate cause an increase in neurological activity or actual physiology? The latter gets my vote, if only for the evidence that supports this in human development. After all, the archaeological record suggests that modern humans and our ancestral/cousin species experienced an incredibly slow rate of technological development, with rapid increases only coming after disastrous setbacks such as the population bottleneck around 70,000 years ago, probably following a decade-long volcanic winter.

Experiments using rats in mazes over the past eighty years seem to agree with this thesis. However, there are clearly limits to animals' ability to learn new cognitive skills if they don't have time for repeated interactions, which may explain why most young foxes' first encounter with vehicular traffic is also their last. As for the BBC Science News report I mentioned earlier, research shows that birds with comparatively larger brain to body size ratios are those found to thrive in an urban environment. So it isn't all nature red in tooth and claw after all, but at least on occasion a case of brain over brawn for the city slickers.

Finally, I ought to mention a series of scare stories over the past year about another urban coloniser that seems to be returning after half a century's absence, namely the Cimicidae family of bloodsucking insects. With many of us using weaker laundry detergents at lower water temperatures, some researchers are predicting an imminent global pandemic of these unpleasant critters. So please be careful at night, and don't let the bed bugs bite!

Friday 18 March 2011

Animal farm: agricultural revolutions happening in your own garden

Various forms of symbiosis - the mutual interactions between species - have long been recognised, not least the hundreds of microorganisms that co-exist within and upon us Homo sapiens. But going beyond mere symbiosis, there appear to be examples of interactions between species that are nothing less than astonishing. Following a recent spate of television documentaries on the Neolithic period, the time when humans started to farm first animals and then crops, it seemed a good excuse to look at examples of other animals that also farm. Although mostly restricted to arable farmers (technically speaking, fungi culturists) there is also one fascinating case of pastoralism.

The best-known examples are probably insects, with many species of leaf-cutter ant and termites known to farm strains of fungi as a food source. It has been assumed (although I'm not sure on what basis, since farming activity would presumably be invisible to the fossil record) that these insects developed their sophisticated social structures, including caste systems, prior to the adoption of farming. This is the direct reverse of the earliest human farmers, wherein the earliest cities of the Near East, for example, arose after livestock domestication. It's difficult to see how insects started the process, and it raises the interesting question of whether farming offers any superiority over non-farmers of similar genera. After all, in human cultures it appears that early farmers had to work far harder for their daily bread than the gatherer-hunters who preceded them, a way of life that continues in isolated pockets even to this day. So it may not be an improvement on non-farming lifestyles - just different. Another nail in the coffin for any followers of the Victorian notion of progress…

Staying with insects, a diverse group of over three thousand beetle species cultivate ambrosia fungi for food, in a relationship thought to stretch back tens of millions of years. Unlike ants and termites, these beetle species do not all live in large, strictly-organised colonies. Heading for wetter environments, marsh snails have also been found to cultivate a fungus that is 'sown' from spores embedded in their own excrement! Then in the water itself, some species of damselfish farm algae on the remnants of coral they have themselves killed, a process that bears a striking resemblance to Amazonian deforestation for cattle ranching. Unfortunately, the fishing by humans of damselfish predators has had the effect of aiding the population of fishy farmers and thus only increased the rate of coral loss.

Finally, the pastoralist in the pack: our everyday common or garden ant. In a bizarre simulacrum of dairy farming, some ant species control, supervise and 'milk' aphids. Had the species involved been more cuddly (i.e. one of us mammals) then it might have seemed all the more astonishing - a real-life antidote to Beatrix Potter-esque anthropomorphism. As it is, these genuine animal farmers, with individual brains weighing a few thousandths of a gram, will drug aphids, protect them from predators and bad weather, and even use biochemicals to affect their growth patterns. And all in return for the honeydew they extract from the aphids.

You may have noticed the use of very human activities in these descriptions: domestication; caste systems; protection, etc. We are only just beginning to understand the behavioural diversity to be found amongst other species, and in the process we are continuously removing yet more barriers that differentiate ourselves from the rest of the biosphere. It is tempting to suggest this last example of animal farmers includes a form of slavery, with drug-controlled drones and just a whiff of Brave New World. If these examples of non-human farmers were found on another planet, would we possibly consider them to be a sign, incredibly alien to be sure, of intelligence? Clearly, the brain size of the individuals involved doesn't count for much, but a colony of 40,000 ants has the collective number of brain cells of one human. If the ants were able to store information in chemical signatures, something akin to a library, then wouldn't this be a form of hive mind? Speculative nonsense of course, but does anyone remember the 1970s film Phase IV?

It’s difficult to be anything other than dumbfounded as we learn more about animal behaviour, especially at what seems to be a programmed/non-conscious level. If the permutations are like this on Earth, the possibilities on other worlds are seemingly limitless. Again, this questions whether we could even recognise whether another species is intelligent or not. Perhaps Douglas Adams put it best: "Man has always assumed that he was more intelligent than dolphins because he had achieved so much...the wheel, New York, wars and so on...while all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man...for precisely the same reason."

Enough said!