Showing posts with label Homo heidelbergensis. Show all posts

Tuesday 12 May 2020

Ancestral tales: why we prefer fables to fact for human evolution

It seems that barely a month goes by without a news article concerning human ancestry. In the eight years since I wrote a post on the apparent dearth of funding in hominin palaeontology there appears to have been an uptick in the amount of research in the field. This is all to the good of course, but what is surprising is that much of the non-specialist journalism - and therefore public opinion - is still riddled with fundamental flaws concerning both our origins and evolution in general.

It also seems that our traditional views of humanity's position in the cosmos are often the source of the errors. It's one thing to make such howlers as the BBC News website did some years back, when it claimed chimpanzees were direct human ancestors, but there are a number of more subtle errors that are repeated time and again. What's interesting is that in order to explain evolution by natural selection, words and phrases have become imbued with incorrect meanings or, in some cases, just a slight shift of emphasis. Either way, it seems that evolutionary ideas have been tacked onto existing cultural baggage and in the process failed to explain the intended theories; personal and socio-political truths have triumphed over objective truth, as Neil deGrasse Tyson might say.

1) As evolutionary biologist Stephen Jay Gould used to constantly point out, the tree of life is like the branches of a bush, not a ladder of linear progression. It's still fairly common to see the phrase 'missing link' applied to our ancestry, among others; I even saw David Attenborough mention it in a TV series about three years ago. A recent news article described - as if in surprise - that there were at least three species of hominins living in Africa during the past few million years, at the same time and in overlapping regions too. Even college textbooks use the phrase - albeit in quotation marks - among a plethora of other terms that were once valid, so perhaps it isn't surprising that popular publications continue to use them without qualification.

Evolution isn't a simple, one-way journey through space and time from ancestors to descendants: separate but contemporaneous child species can arise via geographical isolation and then migrate to a common location, all while their parent species continues to exist. An example today would be the lesser black-backed and herring gulls of the Arctic circle, which are either a single, variable species or two clearly distinct species, depending on where you look within their range.

It might seem obvious, but species also migrate and their descendants may then return to the ancestral homeland; the earliest apes evolved in Africa and then migrated to south-east Asia, some evolving into the ancestors of gibbons and orangutans while others returned to Africa to become the ancestors of gorillas and chimpanzees. One probable culprit of the linear progression model is that some of the examples chosen to teach evolution, such as the horse, have few branches in their ancestry, giving the false impression of a ladder in which a descendant species always replaces an earlier one.

2) What defines a species is also much misunderstood. The standard description doesn't do any favours in disentangling human evolution; this is where Richard Dawkins' oft-repeated phrase 'the tyranny of the discontinuous mind' comes into play. Examine a range of diagrams for our family tree and you'll find distinct variations, with certain species sometimes being shown as direct ancestors and sometimes as cousins on extinct branches.

If Homo heidelbergensis is the main root stock of modern humans but some of us have small amounts of Neanderthal and/or Denisovan DNA, then do all three qualify as direct ancestors of modern humans? Just where do you draw the line, bearing in mind every generation could breed with both the one before and after? Even with rapid speciation events between long periods of limited variability (A.K.A. punctuated equilibrium) there is no clear cut-off point separating us from them. Yet it's very rare to see Neanderthals labelled as Homo sapiens neanderthalensis and much more common to see them listed as Homo neanderthalensis, implying a wholly separate species.

Are religious beliefs and easy-to-digest just-so stories blinding us to the complex, muddled background of our origins? Obviously, the word 'race' has profoundly negative connotations these days, with old-school classifications of human variation now known to be plain wrong. For example, there's greater genetic variation in the present-day sub-Saharan African population than in the rest of the world combined, thanks to Africa being the homeland of all hominin species and to the out-of-Africa migrations of modern humans occurring relatively recently.

We should also consider that species can be separated by behaviour, not just obvious physical differences. Something as simple as the different pitches of mating calls separates some frog species, with experiments showing that the animals can be fooled by artificially changing the pitch. Also, just because species appear physically similar doesn't necessarily mean a close evolutionary relationship: humans and all other vertebrates are far closer to spiny sea urchins and knobbly sea cucumbers than they are to any land invertebrates such as the insects.

3) Since the Industrial Revolution, societies - at least in the West - have become obsessed with growth, progress and advance. This bias has clearly affected the popular conception that evolution always leads to improvements, along the lines of faster cheetahs to catch more nimble gazelles and 'survival of the fittest'. Books speak of our epoch as the Age of Mammals, when by most important criteria we live in the era of microbes; just think of the oxygen-generating cyanobacteria. Many diagrams of evolutionary trees place humans on the central axis and/or at the pinnacle, as if we were destined to be the best thing that over three billion years of natural selection could achieve. Of course, this is no better than what many religions have said, whereby humans are the end goal of the creator and the planet is ours to exploit and despoil as we like (let's face it, for a large proportion of our existence, modern Homo sapiens was clearly less well adapted to glacial conditions than the Neanderthals).

Above all, these charts give the impression of a clear direction for evolution, with mammals as the core animal branch. Popular accounts still describe our distant ancestors, the synapsids, as the 'mammal-like reptiles', even though they evolved from a common ancestor of reptiles, not from reptiles per se. Even if this is purely due to lazy copying from old sources rather than fact-checking, doesn't it undermine the main point of the publication? Few general-audience articles admit that all of the earliest dinosaurs were bipedal, presumably because we like to associate standing on two legs with more intelligent or 'advanced' (a tricky word to use in a strict evolutionary sense) lineages.

The old ladder of fish-amphibian-reptile/bird-mammal still hangs over us and we seem unwilling to admit to extinct groups (technically called clades) that break our neat patterns. Incidentally, for the past 100 million years or so, about half of all vertebrate species have been teleost fish - so much for the Age of Mammals! No-one would describe the immensely successful but long-extinct trilobites as just being 'pill bug-like marine beetles' or similar, yet when it comes to humans, we have a definite sore spot. There is a deep psychological need to have an obvious series of ever-more sophisticated ancestors paving the way for us.

What many people don't realise is that organisms frequently evolve both physical and behavioural attributes that are subsequently lost and possibly later regained. Some have evolved into far simpler forms, frequently becoming parasites. Viruses are themselves a simplified life form, unable to reproduce without a hijacked cell doing the work for them; no-one could accuse them of not being highly successful - as we are currently finding out to our cost. We ourselves are highly adaptable generalists, but on a component-by-component level it would appear that only our brains make us as successful as we are. Let's face it, physically we're not up to much: even cephalopods such as squid and octopus have a form of camera eye that is superior to that of all vertebrates.

Even a cursory glance at the natural history of life, using scientific disciplines as disparate as palaeontology and comparative DNA analysis, shows that some lineages proved so successful that their outward physiology has changed very little. Today, there are over thirty species of lancelet that are placed at the base of the chordates and therefore closely related to the ancestors of all vertebrates. They are also extremely similar in appearance to 530-million-year-old fossils of the earliest chordates in the Cambrian period. If evolution were a one-way ticket to progress, why have they not long since been replaced by later, more sophisticated organisms?

4) We appear to conflate success simply with being in existence today, yet our species is a newcomer and barely out of the cradle compared to some old-timers. We recently learned that Neanderthals wove plant fibre to make string and ate a wide variety of seafood. This knowledge brings with it a dwindling uniqueness for modern Homo sapiens. The frequently given explanation of our superiority over our extinct cousins is simply that they aren't around anymore, except as minor components of our genome. But this is a tautology: they are inferior because they are extinct and therefore an evolutionary dead end; yet they became extinct because of their inferiority. Hmmm...there's not much science going on here!

The usual story until recently was that at some point (often centred around 40,000-50,000 years ago) archaic sapiens developed modern human behaviour, principally in the form of imaginative, symbolic thinking. This of course ignores the (admittedly tentative) archaeological evidence of Neanderthal cave-painting, jewellery and ritual, all of which are supposed to be evidence of our direct ancestors' unique Great Leap Forward (yes, it was named after Chairman Mao's plan). Not only did Neanderthals have this symbolic behaviour, they appear to have developed it independently of genetically-modern humans. This is a complete about-turn from the previous position of them being nothing more than poor copyists.

There are alternative hypotheses to the Great Leap Forward, including:
  1. Founder of the Comparative Cognition Project and primate researcher Sarah Boysen observed that chimpanzees can create new methods for problem solving and processing information. Therefore, a gradual accumulation of cognitive abilities and behavioural traits over many millennia - and partially inherited from earlier species - may have reached a tipping point. 
  2. Some geneticists consider there to have been a sudden paradigm shift caused by a mutation of the FOXP2 gene, leading to sophisticated language and all that it entails.
  3. Other researchers consider that once a certain population size and density was achieved, complex interactions between individuals led the way to modern behaviour. 
  4. A better diet, principally in the form of larger amounts of cooked meat, led to increased cognition. 
All of these are partly speculative and, as is often the case, we may eventually find that a combination of these plus other factors was involved. This shouldn't stop us from realising how poor the communication of evolutionary theories still is and how many misconceptions exist, with the complex truth obscured by our need to feel special and to tell simple stories that rarely convey the amazing evolution of life on Earth.



Tuesday 13 May 2014

Digging apart: why is archaeology a humanity and palaeontology a science?

Although my Twitter account only follows scientists and scientific organisations, every day sees the arrival of a fair few archaeology tweets, even from science-orientated sites such as Science News. As someone who has been an amateur practitioner of both archaeology and palaeontology, I thought I'd like to get to grips with why they are categorised so differently. After all, the names themselves don't really help: the word 'archaeology' means "the study of everything ancient", whilst the common definition of 'palaeontology' is pretty much "the study of ancient life". I've even known people with close friends or relatives in one or the other discipline to confuse them: whilst viewing my fossil cabinet, a visitor once told me that her cousin was an archaeologist studying Maori village sites!

Even historically, both fields share many common factors. Not only were they founded by enthusiasts and amateurs, but to this day non-professionals continue to make fundamental contributions. Conversely, amateurs can cause serious deficiencies in the data record through lack of rigour or by deliberately putting financial gain ahead of the preservation of new information. This damage takes a variety of forms, from crude or overly hasty preparation of fossils, to metal detectorists and site robbers who sell their finds to private collectors without recording the context, or even the material itself.

It is not immediately obvious where the dividing line between the two disciplines lies when it comes to prehistoric human remains. In the 1990s, archaeologist Mark Roberts led a team that excavated the half-a-million-year-old Boxgrove site in southern England. Finds included fragmentary remains of Homo heidelbergensis, thus crossing over to what might traditionally be deemed the territory of palaeontologists. In 2001 the multi-phase Ancient Human Occupation of Britain project started, with deliberate collaboration between both sectors, proof that their skills could overlap and reinforce each other.

By and large, neither palaeontology nor archaeology utilises repeatable laboratory experiments and therefore neither can be classified as a ‘hard’ science. Even palaeontology relies to a large extent on historical contingency, both for remains to be fossilised in the first place and then for them to be discovered and recorded using the relevant methodology. As British palaeontologist Richard Fortey has said "Physics has laboratories; systematic biology has collections." Talking of which, re-examination of old evidence in both disciplines can lead to new discoveries: how often do we see headlines pointing to a fundamental discovery...made in a museum archive?

Although archaeologists were not previously known for conducting experiments, the New Archaeology/processual archaeology that arose in the 1960s included an emphasis on testing hypotheses, one result of which is that archaeology now uses experiments to interpret site data. This includes attempts to recreate artefacts, structures, boats, or even food recipes, based on finds from one or more sites. It may not be laboratory conditions, but it is still a method of analysis that can reinforce or disprove an idea, in a close equivalent of the scientific hypothesis.

Attempts to improve the quality of data gleaned from the archaeological record have led to the utilisation of an enormous variety of scientific techniques collectively labelled archaeometry. These include microwear analysis, artefact conservation, numerous physical and chemical dating methods such as the well-known radiocarbon dating and dendrochronology; geophysical remote sensing techniques involving radar, magnetometry and resistivity; and DNA analysis, pathology and osteoarchaeology.
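Of these techniques, radiocarbon dating is the easiest to illustrate with a little arithmetic: a sample's age follows from the exponential decay of carbon-14. A minimal sketch, assuming the modern 'Cambridge' half-life of 5,730 years (laboratories actually report conventional ages using the older Libby value, plus calibration against tree-ring data; the function name is purely illustrative):

```python
import math

C14_HALF_LIFE = 5730.0  # years; the 'Cambridge' half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Estimate a sample's age in years from the fraction of its
    original carbon-14 still present. Exponential decay gives
    N/N0 = 0.5 ** (t / half_life), so t = -half_life * log2(N/N0)."""
    return -C14_HALF_LIFE * math.log2(fraction_remaining)

# A sample retaining half its original C-14 is one half-life old:
print(radiocarbon_age(0.5))   # 5730.0
# A quarter remaining means two half-lives have elapsed:
print(radiocarbon_age(0.25))  # 11460.0
```

The rapid shrinkage of the remaining fraction is also why the method fades out beyond roughly 50,000 years: by then less than about 0.2% of the original carbon-14 survives, too little to measure reliably.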

Teeth of a sand tiger shark
(possibly Odontaspis winkleri)
I found in a wood in Surrey, UK

But there are some major differences between archaeology and palaeontology as well. Although both appear to involve excavation, this is only somewhat true. Not only does archaeology include standing structures such as buildings or ancient monuments, but a project can be restricted to non-invasive techniques such as the geophysical methods mentioned above; excavating a site is the last resort, used to glean information unobtainable in any other way, and especially important if the site is due to be destroyed by development. In contrast, fossils are of no use to science while they remain buried. Having said that, I often find fossils by sifting through pebbles rather than concerted digging. I have occasionally split rocks or dug through soft sand, but a lot of the time fossils can be found scattered on the surface or prised out of exposed chalk with fingernails. The best way to spot even large finds is to have them already partially exposed through weathering, whilst some archaeology cannot be directly seen at the site but only identified via aerial photography or geophysics.

Archaeological sites can prove extremely complex due to what is known as context: for example, digging a hole is a context, backfilling it is another, and any finds contained therein are yet more. Repeated occupation of a site is likely to cause great difficulty in unravelling the sequence, especially if building material has been robbed out. This is substantially different to palaeontology, where even folded stratigraphy caused by geological processes can be relatively easily understood.

Perhaps the most fundamental difference between the disciplines is that of data analysis. As anyone who has spent time on a site excavation knows, there are often as many theories as there are archaeologists. There are obviously far fewer fixed data points than those provided by Linnaean taxonomy, and so there is a reliance on subjectivity, the keyword being 'interpretation'. Even the prior experience of the excavator with sites of a similar period/location/culture can prove crucial in gaining a correct (as far as we can ever be correct) assessment. In lieu of similarity to previously excavated sites, an archaeologist may turn to anthropology, extrapolating elements of a contemporary culture to a vanished one, such as British prehistorian Mike Parker-Pearson's comparison between the symbolic use of materials in contemporary Madagascar and Bronze Age Britain. In stark contrast, once a fossil has been identified it is unlikely for its taxonomy to be substantially revised - not that this doesn't still occur from time to time.

As can be seen, not all science proceeds along the hypothesis-mathematical framework-laboratory experiment axis. After all, most of the accounts of string theory that I have read discuss how unlikely it is that it can ever be subjected to experiment. The British Quality Assurance Agency Benchmark Statement for Archaeology perhaps comes closest to the true status of the discipline when it lists 'scientific' as one of the four key contexts for higher-level archaeological training. In addition, every edition since 2000 has stated "Where possible, thinking scientifically should be part of the armoury of every archaeologist."

So part historical science, part humanity, archaeology is an interesting combination of methodologies and practice, with more resemblances than differences to palaeontology. As the Ancient Human Occupation of Britain project shows, sometimes the practitioners can even work in (hopefully) perfect harmony. Another nail in the coffin for C.P. Snow's 'Two Cultures', perhaps?