Tuesday, 28 May 2019

Praying for time: the rise and fall of the New Zealand mantis


While the decline of the giant panda, great apes and various cetacean species has long garnered headlines, our scale prejudice has meant invertebrates have fared rather less well. Only with the worrying spread of colony collapse disorder (CCD) in bee hives have insect-themed stories gained public attention; yet most of the millions of other small critters remain on the sidelines. I've often mentioned that overlooking these small marvels could backfire on us, considering we don't know the knock-on effect their rapid decline - and possible near-future extinction - would have on the environment we rely on.

One such example here in New Zealand is our native praying mantis Orthodera novaezealandiae, which for all we know could be a key player in the pest control of our farms and gardens. Mantid species are often near the apex of invertebrate food webs, consuming the likes of mosquitoes, moth caterpillars and cockroaches. I admit that they are not exactly discriminating and will also eat useful species such as ladybirds or decorative ones like monarch butterflies. However, they are definitely preferable to pesticides, a known cause of CCD today and an acknowledged factor in insect decline since Rachel Carson's pioneering 1962 book Silent Spring.

Of course, we shouldn't just support species due to their usefulness: giant pandas aren't being conserved for any particular practical benefit. From a moral perspective it's much easier to convince the public that we should prevent their extinction than that of the rather uncuddly mantis. We still know so little about many insect species that it's difficult to work out which need to be saved in order to preserve our agribusiness (versus all the others that of course should be preserved regardless). I'm not averse to careful extermination of plagues of locusts or mosquitoes, but indiscriminate destruction due to greed or stupidity is, well, stupid, really.

Down but not out: the New Zealand praying mantis Orthodera novaezealandiae



Back to O. novaezealandiae. I've only seen New Zealand's sole native mantis species three times in the 'wild': twice in my garden in the past two years and once in my local reserve before that. What is particularly interesting is that since initial descriptions in the 1870s, hypotheses regarding its origin appear to have evolved due to patriotic trends as much as to factual evidence. Late Nineteenth Century accounts of its spread suggest an accidental importation from Australia by European sailing ship, since it is a clumsy, short-range flier and seabirds are unlikely to carry the insects - and certainly not their cemented oothecae (egg sacs) - on their feet.

However, some Victorian naturalists thought the insect was incorporated into Maori tradition, implying a precolonial existence. In contrast, A.W.B. Powell's 1947 book Native Animals of New Zealand refers to the native mantis as Orthodera ministralis (which today is only used to describe the Australian green mantis) and the author states it may well be a recent arrival from across the Tasman Sea. So the native species may not be particularly native after all! I find this fascinating, inasmuch as it shows how little we understand about our local, smaller-scale wildlife when compared to New Zealand's birds, bats and even reptiles.

The specimens in my garden have lived up to their reputation for being feisty: they seem to size you up before launching themselves directly towards you, only for their wings to rapidly falter and force the insect into an emergency landing. After the most recent observation, I looked around the outside of the house and found three oothecae, two of which were under a window sill built in 2016. These finds are cheering, as it means that at least in my neighbourhood they must be holding their own.

Perhaps their chief enemy these days is the invasive Miomantis caffra. This inadvertently introduced South African mantis was first seen in 1978 and is rapidly spreading throughout New Zealand's North Island. The intruder - frequently spotted in my garden - has several advantages over O. novaezealandiae: first, it is able to survive through winter; second, it produces rather more nymphs per ootheca, which, combined with hatching over a longer period, presumably leads to a larger number of survivors per year. In addition, and most unfortunately, the native male appears to find the (cannibalistic) South African female more attractive than the female of its own species, frequently resulting in its own demise during mating.

Humans have further aided the decline of the native mantis with the accidental introduction of parasitic wasps and the widespread use of pesticides. After less than a century and a half of concerted effort, European settlers have managed to convert a large proportion of the best land in this corner of the Pacific into a facsimile of the English countryside - but at what cost to the local fauna and flora?

Working from the old adage that we won't save what we don't love and cannot love what we don't know, perhaps what is really required is an education piece disguised as entertainment. Promoting mammals in anthropomorphic form has long been a near-monopoly of children's literature (think Wind in the Willows) but perhaps it is about time that invertebrates had greater public exposure too. Gerald Durrell's 1956 semi-autobiographical best-seller My Family and Other Animals includes a hilarious battle in the author's childhood bedroom between Cicely the praying mantis and the slightly smaller Geronimo the gecko, with the lizard only winning after dropping its tail and receiving other injuries. Perhaps a contemporary writer telling tales in a similar vein might inspire more love for these overlooked critters before it is too late. Any takers?


Monday, 13 May 2019

Which side are you on? The mysterious world of brain lateralisation

There are many linguistic examples of ancient superstitions still lurking in plain sight. Among the more familiar are sinister and dexterous, which are directly related to being left- and right-handed respectively. These words are so commonplace that we rarely consider the pre-scientific thinking behind them. I was therefore interested last year to find out that I am what is known as 'anomalous dominant'. Sounds ominous!

The discovery occurred during my first archery lesson where - on conducting the Miles test for ocular dominance - I discovered that despite being right-handed, I am left-eye dominant. I'd not heard of cross-dominance before, so I decided to do some research. As Auckland Central City Library didn't have any books on the subject I had to resort to the Web, only to find plenty of contradictory information, often of dubious accuracy, with some sites clearly existing so as to sell their strategies for overcoming issues related to the condition.

Being cross-dominant essentially means it takes longer for sensory information to be converted into physical activity, since the dominant senses and limbs must rely on additional transmission of nerve signals between the hemispheres of the brain. One common claim is that the extra time this requires has an effect on coordination and thus affects sporting ability. I'm quite prepared to accept that idea as I've never been any good at sport, although I must admit I got used to shooting a bow left-handed much more quickly than I expected; lack of strength on my left side proved to be a more serious issue than any lack of coordination due to muscle memory.

Incidentally, when I did archery at school in the 1980s, no mention was ever made about testing for eye dominance and so I shot right-handed! I did try right-handed shooting last year, only to find that I was having to aim beyond the right edge of the sight in order to make up for the parallax error caused by alignment of the non-dominant eye.

Research over the past century suggests children with crossed lateralisation could suffer a reduction in academic achievement or even general intelligence as a direct result, although a 2017 meta-analysis found little firm evidence to support this. Archery websites tend to claim that the percentage of people with mixed eye-hand dominance is around 18%, but other sources I have found vary anywhere from 10% to 35%. This lack of agreement over so fundamental a statistic suggests that there is still much research to be done on the subject, since anecdotal evidence is presumably being disseminated due to lack of hard data.

There is another type of brain lateralisation which is colloquially deemed ambidextrous, but this term covers a wide range of mixed-handedness abilities. Despite descriptions of ambidextrous people as lucky or gifted (frequently-named examples include Leonardo da Vinci, Beethoven, Gandhi and Albert Einstein), parenting forums describe serious issues resulting from the lack of a dominant brain hemisphere. Potential problems include dyspraxia, dyslexia and ADHD, even autism or schizophrenia.

While the reporting of individual families can't be considered of the same quality as professional research, a 2010 report by Imperial College London broadly aligns with parents' stories. 'Functional disconnection syndrome' has been linked to learning disabilities and slower physical reaction times, rooted in the communications between the brain's hemispheres. There also seems to be evidence for the opposite phenomenon, in which the lack of a dominant hemisphere causes too much communication between left and right sides, generating noise that impedes normal mental processes.

What I would like to know is why there is so little information publicly available. I can only conclude that this is why there is such a profusion of non-scientific (if frequently first-hand) evidence. I personally know of people with non-dominant lateralisation who have suffered from a wide range of problems from dyslexia to ADHD, yet they have told me that their general practitioners failed to identify root causes for many years and suggested conventional solutions such as anti-depressants.

Clearly this is an area that could do with much further investigation; after all, if ambidexterity is a marker for abnormal brain development that arose in utero (there is some evidence that a difficult pregnancy could be the root cause) then surely there is a clearly defined pathway for wide-scale research? This could in turn lead to a reduction in people born with these problems.

In the same way that a child's environment can have a profound effect on their mental well-being and behaviour, could support for at-risk pregnant women reduce the chance of their offspring suffering from these conditions? I would have thought there would be a lot to gain from this, yet I can't find evidence of any medical research seeking a solution. Meanwhile, why not try the Miles test yourself and find out where you stand when it comes to connectivity between your brain, senses and limbs?

Tuesday, 23 April 2019

Lift to the stars: sci-fi hype and the space elevator

As an avid science-fiction reader during my childhood, one of the most outstanding extrapolations for future technology was that of the space elevator. As popularised in Arthur C. Clarke's 1979 novel, The Fountains of Paradise, the elevator was described as a twenty-second century project. I've previously written about near-future plans for private sector spaceflight, but the elevator would be a paradigm shift in space transportation: a way of potentially reaching as far as geosynchronous orbit without the need for rocket engines.

Despite the novelty of the idea - a tower stretching from Earth (or indeed any planet's surface) to geosynchronous orbit and beyond - the first description dates back to 1895 and the writings of the Russian theoretical astronautics pioneer Konstantin Tsiolkovsky. Since the dawn of the Space Age, engineers and designers in various nations have either reinvented the elevator from scratch or elaborated on Tsiolkovsky's idea.

There have of course been remarkable technological developments over the intervening period, with carbyne, carbon nanotubes, tubular carbon 60 and graphene seen as potential materials for the elevator, but we are still a long way from being able to build a full-size structure. Indeed, there are now known to be many more impediments to the space elevator than first thought, including a man-made issue that didn't exist at the end of the nineteenth century. Despite this, there seems to be a remarkable number of recent stories about elevator-related experiments and the near-future feasibility of such a project.

An objective look at practical - as opposed to theoretical - studies shows that results to date have been decidedly underwhelming. The Space Shuttle programme started tethered satellite tests in 1992. After an initial failure (the first test achieved a distance of a mere 256 metres), a follow-up six years later deployed a tether that was a rather more impressive twenty kilometres long. Then last year the Japanese STARS-Me experiment tested a miniature climber component in orbit, albeit over a minuscule distance of nine metres. Bearing in mind that a real tower would be over 35,000 kilometres long, it cannot be argued that the technology is almost available for a full-scale elevator.
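Incidentally, that 35,000-kilometre figure is easy to verify from first principles. As a quick sketch (the constants below are standard textbook values rather than anything from the elevator studies themselves), Kepler's third law gives the altitude at which an orbit matches Earth's rotation:

```python
import math

# Altitude of a geosynchronous orbit, the minimum height a space
# elevator's cable must reach. Standard reference values:
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
T = 86164.1            # one sidereal day (Earth's rotation period), s
R_EARTH = 6_378_137.0  # Earth's equatorial radius, m

# Kepler's third law for a circular orbit: r^3 = GM * T^2 / (4 * pi^2)
r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(f"Geosynchronous altitude: {altitude_km:,.0f} km")  # about 35,786 km
```

Set against that, the nine-metre STARS-Me test covers roughly one four-millionth of the required distance.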

This hasn't prevented continuous research by the International Space Elevator Consortium (ISEC), which was formed in 2008 to promote the concept and the technology behind it. It's only to be expected that fans of the space elevator would be enthusiastic, but to my mind their assessment that we are 'tech ready' for its development seems to be optimistic to the point of incredulity.

A contrasting view is that of Google X's researchers, who mothballed their space elevator work in 2014 on the grounds that the requisite technology will not be available for decades to come. While the theoretical strength of carbon nanotubes meets the requirements, the total of cable manufactured to date is seventy centimetres, showing the difficulties in achieving mass production. A key stopping point apparently involves catalyst activity probability; until that problem is resolved, a space elevator less than one metre in length isn't going to convince me, at least.

What is surprising then is that in 2014, the Japanese Obayashi Corporation published a detailed concept that specified a twenty-year construction period starting in 2030. Not to be outdone, the China Academy of Launch Vehicle Technology released news in 2017 of a plan to actually build an elevator by 2045, using a new carbon nanotube fibre. Just how realistic is this, when so little of the massive undertaking has been prototyped beyond the most basic of levels?

The overall budget is estimated to be around US$90 billion, which suggests an international collaboration in order to offset the many years before the completed structure turns a profit. In addition to the materials issue, there are various other problems yet to be resolved. Chief among these are finding a suitable equatorial location (an ocean-based anchor has been suggested), capturing an asteroid for use as a counterweight, damping vibrational harmonics, removing space junk, micrometeoroid impact protection and shielding passengers from the Van Allen radiation belts. Clearly, just developing the construction material is only one small element of the ultimate effort required.

Despite all these issues, general audience journalism regarding the space elevator - and therefore the resulting public perception - appears as optimistic as the Chinese announcement. How much these two feed back on each other is difficult to ascertain, but there certainly seems to be a case of running before learning to walk. It's strange that China made the claim, bearing in mind how many other rather important things the nation's scientists should be concentrating on, such as environmental degradation and pollution.

Could it be that China's STEM community have fallen for the widespread hype rather than prosaic reality? It's difficult to say how this could be so, considering their sophisticated internet firewall that blocks much of the outside world's content. Clearly though, the world wide web is full of science and technology stories that consist of parrot-fashion copying, little or no analysis and clickbait-driven headlines.

A balanced, in-depth synthesis of the relevant research is often a secondary consideration. The evolutionary biologist Stephen Jay Gould once labelled the negative impact of such lazy journalism as "authorial passivity before secondary sources." In this particular case, the public impression of what is achievable in the next few decades seems closer to Hollywood science fiction than scientific fact.

Of course, the irony is that even the more STEM-minded section of the public is unlikely to read the original technical articles in a professional journal. Instead, we are reliant on general readership material and the danger inherent in its immensely variable quality. As far as the space elevator goes (currently, about seventy centimetres), there are far more pressing concerns requiring engineering expertise; US$90 billion could, for example, fund projects to improve quality of life in the developing world.

That's not to say that I believe China will construct a space elevator during this century, or that the budget could be found anywhere else, either. But there are times when there's just too much hype and nonsense surrounding science and not enough fact. It's easy enough to make real-world science appear dull next to the likes of Star Trek, but now more than ever we need the public to trust and support STEM if we are to mitigate climate change and all the other environmental concerns.

As for the space elevator itself, let's return to Arthur C. Clarke. Once asked when he thought humanity could build one, he replied: "Probably about fifty years after everybody quits laughing." Unfortunately, bad STEM journalism seems to have joined conservatism as a negative influence in the struggle to promote science to non-scientists. And that's no laughing matter.

Monday, 1 April 2019

The day of the dolphin: covert cetaceans, conspiracy theories and Hurricane Katrina

One of the late, great Terry Pratchett's Discworld novels mentions a failed attempt by King Gurnt the Stupid to conduct aerial warfare using armoured ravens. Since real life is always stranger than fiction, just how harebrained are schemes by armed forces to utilise animals in their activities?

Large mammals such as horses and elephants have long been involved in the darker aspects of human existence, but the twentieth century saw the beginnings of more sophisticated animals-as-weapons schemes, including for example, research into the use of insects as disease vectors.

Some of the fruitier research projects of the 1960s saw the recruitment of marine mammals, reaching an apotheosis - or nadir - in the work of John Lilly. A controversial neuroscientist concerned with animal (and extraterrestrial) communication, Lilly even gave psychedelic drugs to dolphins as part of attempts to teach them human language and logic: go figure!

Whether this work was the direct inspiration for military programmes is uncertain, but both the Soviet and United States navies sought to harness the intelligence and learning capabilities of marine mammals during the Cold War. Besides bottlenose dolphins, sea lions were also trained in activities such as mine detection, hardware retrieval and human rescue. Although the Russians are said to have discontinued their research some years ago, the US Navy's Marine Mammal Research Program is now in its sixth decade and has funding up until at least next year.

Various sources claim that there is a classified component to the program headquartered in San Diego under the moniker the Cetacean Intelligence Mission. Although little of any value is known for certain, researchers at the University of Texas at Austin have been named as one of the groups who have used naval funding to train dolphins - plus design a dolphin equipment harness - for underwater guard duty. A more controversial yet popular claim is for their use as weapon platforms involving remote-controlled knock-out drug dart guns. If this all sounds a bit like Dr. Evil's request for "sharks with lasers" then read on before you scoff.

In the aftermath of the devastation caused by Hurricane Katrina in August 2005, it was discovered that eight out of fourteen bottlenose dolphins that were housed at the Marine Life Oceanarium in Gulfport, Mississippi, had been swept out to sea. Although later recovered by the United States Navy, this apparently innocent operation has a bearing on a similar escape that was given much greater news coverage soon after the hurricane.

Even respected broadsheet newspapers around the world covered the story, generated by a US Government leak, that thirty-eight United States Navy dolphins had also escaped after their training ponds near Lake Pontchartrain, Louisiana, were inundated by Hurricane Katrina. Apart from the concerns of animal rights groups - that dolphins shouldn't be used as weapons platforms, and that they might not cope in the open ocean of the Gulf of Mexico with its busy shipping lanes - another issue was the notion that the dolphins might attack civilian divers or vessels.

It would be quite easy here to veer into the laughable fantasies that the Discovery Channel tries to pass off as genuine natural history, if it weren't for a string of disconcerting facts. The eight dolphins that escaped from the Marine Life Oceanarium were kept by the navy for a considerable period before being returned to Mississippi. This was explained at the time as a health check by navy biologists, but there is a more sinister explanation: what if the dolphins were being examined to ensure that they were not military escapees from Lake Pontchartrain?

The latter half of 2005 into early 2006 saw the resumption of fishing in the Gulf of Mexico, following the destruction of almost ninety per cent of the region's commercial fleet in the hurricane. However, many of the smaller boats that did make it back to sea returned to port with unusual damage, or in some cases, had to be towed after failing to make it home under their own power. Much of this was put down to hasty repairs in order to resume fishing - a key component of the local economy - as soon as possible.

Reports released by boat yards during this period show inexplicable damage to rudders and propellers, mainly to shrimp boats. Fragments of metal, plastic and PVC were recovered in a few cases, causing speculation as to where this material had come from. The National Marine Fisheries Service requested access to the flotsam, which was subsequently lost in the chain of bureaucracy; none of the fragments have been seen since. It may not be on the scale of Roswell, but someone in the US military seems to be hiding something here.

It's been over half a century since Dr. Lilly's experiments inspired such fictional cetacean-centred intrigue as The Day of the Dolphin. Therefore, there has been plenty of time for conspiracy theorists to cobble together outlandish schemes on the basis of threadbare rumours. What is certain is that the enormous reduction in the region's fishing that followed in the wake of Hurricane Katrina would have been a boon for the Gulf of Mexico's fish stocks. This would presumably have carried on up the food chain, allowing dolphin numbers to proliferate throughout 2006 and beyond.

Whether the US Navy was able to recover some or all of its underwater army is not known, but it doesn't take much imagination to think of the dolphins enjoying their freedom in the open ocean, breaking their harnesses upon the underside of anchored fishing vessels, determined to avoid being rounded up by their former keepers. The Gulf in the post-Katrina years would have been a relative paradise for the animals compared to their military careers.

Although the United States Navy is said to have spent less than US$20 million per annum on the Marine Mammal Research Program, a mere drop in the ocean (you know that one's irresistible) compared to the mega-budgets of many Department of Defense projects, the low cost alone suggests the value of attempting to train dolphins for military purposes. Perhaps the truth will emerge one day, once the relevant files are declassified. Or alternatively, a new John Lilly may come along and be finally able to translate dolphinese. In which case, what are the chances that descendants of the Lake Pontchartrain escapees will recall the transition from captivity to freedom with something along the lines of "So long, and thanks for all the fish!"

Wednesday, 20 March 2019

My family & other animals: what is it that makes Homo sapiens unique?

It's a curious thing, but I can't recall ever having come across a comprehensive assessment of what differentiates Homo sapiens from all other animals. Hence this post is a brief examination on what I have found out over the years. I originally thought of dividing it into three neat sections, but quickly discovered that this would be, as Richard Dawkins once put it, 'a gratuitously manufactured discontinuity in a continuous reality.' In fact, I found a reasonably smooth gradation between these segments:
  1. Long-held differences now found to be false
  2. Possible distinctions - but with caveats
  3. Uniquely human traits
Despite the carefully-observed, animal-centered stories of early civilisations - Aesop's fable of The Crow and the Pitcher springs to mind - the conventional wisdom until recently was that animals are primarily automatons and as such readily exploitable by humanity. Other animals were deemed vastly inferior to us by a difference of kind, not just degree, with a complete lack of awareness of themselves as individuals.

The mirror test developed in 1970 has disproved that for a range of animals, from the great apes to elephants, dolphins to New Caledonian crows. Therefore, individuals of some species can differentiate themselves from their kin, leading to complex and fluid hierarchies within groups - and in the case of primates, some highly Machiavellian behaviour.

Man the tool-maker has been a stalwart example of humanity's uniqueness, but a wide range of animals in addition to the usual suspects (i.e. great apes, dolphins and Corvidae birds) are now known to make and use tools on a regular basis. Examples include sea otters, fish, elephants, and numerous bird species, the latter creating everything from fish bait to insect probes. Even octopuses are known to construct fences and shelters, such as stacking coconut shells - but then they do have eight ancillary brains in addition to the main one!

We recognise regional variations in human societies as the result of culture, but some animal species also have geographically-differentiated traits or tools that are the obvious equivalent. Chimpanzees are well known for their variety of techniques used in obtaining food or making tools. These skills are handed down through the generations, remaining different to those used in neighbouring groups.

Interestingly, farming has really only been adopted by the most humble of organisms, namely the social insects. Ants and termites farm aphids and fungi in their complex, air-conditioned cities that have more than a touch of Aldous Huxley's Brave New World about them; in a few species, the colonies may even largely consist of clones!

Although many animals construct nests, tunnels, dams, islets or mounds, these appear to serve purely functional purposes: there is no equivalent of the human architectural aesthetic. Octopus constructions aside, birds for example will always build a structure that resembles the same blueprint used by the rest of their kind.

Many species communicate by aural, gestural or pheromonal languages, but only humans can store information outside of the body and across generations living at different times. Bird song might sound pretty, but again, this appears to be a series of basic, hard-wired, communications. Conversely, humpback whale song may contain artistic values but we just don't know enough about it to judge it in this light.

Birds and monkeys are happy to hoard interesting objects, but there is little aesthetic sense in animals other than that required to identify a high-quality mate. In contrast, there is evidence to suggest that other species in the hominin line, such as Neanderthals and Homo erectus, created art in forms recognisable today, including geometric engravings and jewellery.

Some of our ancestors' earliest artworks are realistic representations, whereas when armed with a paint brush, captive chimps and elephants produce abstract work reminiscent of pre-school children. We should remember that only since the start of the Twentieth Century has abstract art become an acceptable form for professional artists.

Jane Goodall's research on the Gombe chimps shows that humans are not the only animal to fight and kill members of the same species for reasons other than predation or rivalry. Sustained group conflict may be on a smaller scale and have fewer rules than sanctioned warfare, but it still has enough similarity to our own violence to say that humanity is not its sole perpetrator. One interesting point is that although chimps have been known to use sharpened sticks to spear prey, they haven't as yet used their weapons on each other.

Chimpanzees again have been shown to empathise with other members of their group, for example after the death of a close relative. Altruism has also been observed in the wild, but research suggests there is frequently another motive involved as part of a long-term strategy. This is countered with the notion that humans are deemed able to offer support without the expectation of profit or gain in the future; then again, what percentage of such interactions are due to a profitless motivation is open to question.

A tricky area is to speculate on the uniqueness of ritual to Homo sapiens. While we may have usurped the alpha male position in domesticated species such as dogs, their devotion and loyalty seems too far from deity worship to be a useful comparison; surely the idea of organised religion is alien to all other species? Archaeological evidence shows what appears to be Neanderthal rituals centred on cave bears, as well as funereal rites, but the DNA evidence for interbreeding with modern humans doesn't give enough separation to allow religion to be seen as anything other than a human invention. What is probably true though is that we are the only species aware of our own mortality.

One area in which humans used to be deemed sole practitioners is abstract thought, but even here there is evidence that the great apes have some capability, albeit no greater than that of a pre-schooler. Common chimps and bonobos raised in captivity have learnt - in some cases by observation, rather than being directly taught - how to use sign language or lexigrams to represent objects and basic grammar. It's one thing to see a button with a banana on it and to learn that pressing it produces a banana, but to receive the same reward for pressing an abstract symbol shows a deeper understanding of relationship and causality.

A consideration of a potential future is also shared with birds of the Corvidae family, who are able to plan several steps ahead. Where humans are clearly far ahead, it is due to a gain in degree rather than just kind. Namely, we have the ability to consider numerous future paths and act accordingly; this level of sophistication and branch analysis appears to be uniquely human, allowing us to cogitate about possibilities in the future that might occur - or may never be possible. Both prose and poetic literature are likely to be uniquely human; at least until we can decipher humpback whale song.

Finally, there is science, possibly the greatest of human inventions. The multifarious aspects of the scientific endeavour, from tentative hypothesis to experimentation, advanced mathematics to working theory, are unlikely to be understood let alone attempted by any other species. The combination of creative and critical thinking, rigour and repetition, and objectivity and analysis requires the most sophisticated object in the known universe, the human brain. That's not to say there aren't far more intelligent beings out there somewhere, but for now there is one clear activity that defines us as unique. And thank goodness it isn't war!

Sunday, 10 March 2019

Buzzing away: are insects on the verge of global extinction?

It's odd how some of these posts get initiated. For this particular one, there were two driving factors. One was passing a new house on my way to work where apart from the concrete driveway, the front garden consisted solely of a large square of artificial grass; the owners are clearly not nature lovers! The second inspiration was listening to a BBC Radio comedy quiz show, in which the panel discussed the recent report on global insect decline without being able to explain why this is important, apart from a vague mention of pollination.

Insect biologists have long sung the praises of these unrewarded miniature heroes, from JBS Haldane's supposed adage about God being "inordinately fond of stars and beetles" to EO Wilson's 1987 speech that described them as "the little things that run the world." In terms of numbers of species and individuals, invertebrates, especially insects, are the great success story of macroscopic life on our planet. So if they are in serious decline, does that spell trouble for Homo sapiens?

The new research claims that one-third of all insect species are currently endangered, extrapolating to wholesale extinction for the class Insecta over the next century. Although the popular press has started using evocative phrases such as "insect genocide" and even "insectageddon", just how accurate are these dramatic claims?
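As a back-of-envelope check (my own sketch, not part of the report, and assuming the 2.5% annual rate of decline quoted in coverage of the research): compound decline alone would leave under a tenth of today's insects within a century.

```python
# Compound annual decline: fraction of the starting population
# remaining after a given number of years at a fixed yearly rate.
def remaining_fraction(rate: float, years: int) -> float:
    return (1 - rate) ** years

# Assumed 2.5% annual decline, projected over a century.
left = remaining_fraction(0.025, 100)
print(f"Fraction remaining after 100 years: {left:.1%}")
```

That works out to roughly eight percent remaining - not literal extinction, but close enough to justify the alarm, and real-world declines are unlikely to be so conveniently smooth.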

The IUCN Red List currently describes around three hundred insect species as critically endangered and a further seven hundred as vulnerable, but this is a tiny proportion of the total - which is, well, a lot more, at any rate. One oft-quoted figure is around one million insect species, although entomologists' estimates range anywhere from 750,000 up to 30 million, with many species still lacking formal scientific identification. The hyperbole could therefore easily sound like unnecessary scaremongering, until you consider the details.

The new report states that butterflies and caddis flies are suffering the greatest decline, while flies and cockroaches - which, as anyone who has faced a household infestation will know, are likely to remain around until the end of the world - are the least affected orders. So, to paraphrase Monty Python, what have the insects ever done for us?

Pollination is of course of key importance, to both horticulture and unmanaged 'wild' environments. Insects are near the base of many food webs; if their numbers were much reduced, never mind removed entirely, the impact on the rest of the ecosystem would be catastrophic. With the human population set to top ten billion in thirty years' time, we require ever larger regions of productive land for agriculture. Insects may be small at an individual level, but arthropods in general total about seventeen times the mass of all of us H. sapiens. Insects also replenish the soil: alongside bacteria, they break down dead matter and faecal material. So important is this latter function that New Zealand has been trialling non-native dung beetles to aid cattle farmers.

One key way to save fresh water and lessen the generation of methane, a potent greenhouse gas, is to reduce meat consumption in favour of insect protein; if insects are no longer around, we lose that option for reducing environmental degradation. This also ignores the fact that insects are already a component in the diet of many developing nations. Last year I wrote about how scientists have been creating advanced materials derived from animals; again, we are shooting ourselves in the foot if we allow this ready-made molecular library to be destroyed.

What is responsible for this global decline? Perhaps unsurprisingly, it turns out to be the usual suspects. Agricultural chemicals including pesticides have been associated with honey-bee colony collapse disorder (not incidentally, some tests have found honey samples in which neonicotinoids - the most widely-used insecticides - exceed the recommended human dosage), so it seems likely the same culprit is affecting other insects. Fresh waterways, home to many aquatic insect species, are frequently as polluted as the soil, whether from agricultural run-off or industrial contaminants. Wild landscapes are being converted with great haste into farmland and urban sprawl, with an obviously much-reduced biota.

Climate change is playing its part too, with soil acidity increasing just as it is in the oceans. Even areas as remote as central Australia have seen marked decreases in insects, as higher temperatures and lower rainfall outpace their ability to adapt to the new conditions. I've often mentioned the role of invasive species in the decimation of indigenous vertebrates, but insects are equally prone to suffer from the arrival of newcomers. Although New Zealand has very strict biosecurity protocols, the likes of Queensland fruit flies and brown marmorated stink bugs are still occasionally found in or around ports of entry.

Many nations have no such procedures in place, resulting in local species being out-competed or killed by introduced species or pathogens to which they have no resistance. Until fairly recently, even New Zealand had a lax attitude to the issue, resulting in the decline of native species such as carabid beetles. When I conducted a brief survey of my garden in 2017 I found that one-third of the insect species were non-native, most of these being accidental imports since the arrival of European settlers.

If insects are so vital to our survival, why has there been so little interest in their well-being? There are some fairly obvious suggestions here. Firstly, at least in Western cultures, insects have been deemed dirty, ugly things that can be killed without a second thought. Wasps, ants and cockroaches in particular are seen as unwelcome pests, with typical insect-related phrases including "creepy crawlies" and "don't let the bed bugs bite".

It's fairly well known that malaria-carrying mosquitoes are the most dangerous animals to humans in terms of fatalities, and the widespread outbreaks of the Zika virus haven't done insects' reputation any favours either. As Brian Cox's television series Wonders of Life showed, their small size has given them veritable super powers compared to us lumbering mammals, from climbing sheer surfaces (as a praying mantis was doing on my window a few nights ago) to having amazing strength-to-weight ratios. All in all, insects are a bit too alien for their own good!

Clearly, scale prejudice is also a key factor. On a recent trip to Auckland Central Library I only found one book on insects versus dozens on birds. Photographic technology has been a double-edged sword when it comes to giving us a clearer picture of insects: close-ups are often greeted with revulsion, yet until Sir David Attenborough's 2005 BBC series Life in the Undergrowth, there was little attempt to film their behaviour with the same level of detail as say, the lions and antelopes of the Serengeti. It should also be mentioned that when Rachel Carson's ground-breaking book about the dangers of pesticides, Silent Spring, was published in 1962, the resulting environmentalism was largely in support of birds rather than insects.

Among all this doom and gloom, are there any ways to prevent it? One thing is certain: it won't be easy. The agricultural sector would have to make drastic changes for a start, becoming much smarter in its use of chemicals and being held responsible for the local environment, including waterways. Vertical farming and other novel techniques could reduce the need for new agricultural land and water usage, but developing nations would be hard-pressed to fund these themselves.

Before any major undertaking, there's going to have to be either a fundamental crisis, such as food shortages, in a rich nation or a massive public relations exercise to convince people to consider insects in the same light as giant pandas or dolphins. This is not going to be easy, but as David Attenborough put it: "These small creatures are within a few inches of our feet, wherever we go on land - but often, they're disregarded. We would do very well to remember them."

Sunday, 24 February 2019

Core solidification and the Cambrian explosion: did one beget the other?

Let's face it, we all find it easier to live our lives with the help of patterns. Whether it's a daily routine or consultation of an astrology column (insert expletive of choice here) - or even us amateur astronomers guiding our telescopes via the constellations - our continued existence relies on patterns. After all, if we didn't innately recognise our mother's face or differentiate harmless creatures from the shape of a predator, we wouldn't last long. So it shouldn't be any surprise that scientists also rely on patterns to investigate the complexities of creation.

Richard Feynman once said that a scientific hypothesis starts with a guess, which should perhaps be taken with a pinch of salt. Nonetheless, scientists like to use patterns when considering explanations for phenomena; at first glance, this technique matches the principle of parsimony, or Occam's Razor - i.e. the simplest explanation is usually the correct one - excluding quantum mechanics, of course!

An example in which a potential pattern was widely publicised prior to confirmation via hard data was that of periodic mass extinction, the idea being that a single cause might be behind the five greatest extinction events. Four years after Luis Alvarez's team suggested that the 66 million-year-old Chicxulub impactor could have caused the Cretaceous-Paleogene extinction, paleontologists David Raup and Jack Sepkoski published a 1984 paper hypothesising extinctions at regular intervals due to extraterrestrial impacts.

This necessitated the existence of an object that could cause a periodic gravitational perturbation, diverting asteroids and comets into the inner solar system. The new hypothesis was that we live in a binary star system, with a dwarf companion star in a highly elliptical, 26 million-year orbit; this companion would be responsible for the perturbation when near perihelion (i.e. closest approach to the sun).
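For scale (my own calculation, not from the original papers): Kepler's third law gives the size of such an orbit. For a companion much lighter than the Sun, the semi-major axis in astronomical units is simply the orbital period in years raised to the power two-thirds.

```python
# Kepler's third law for a solar orbit: a [AU] = T^(2/3), T in years,
# assuming the companion's mass is negligible next to the Sun's.
T_YEARS = 26_000_000          # proposed Nemesis orbital period
AU_PER_LIGHT_YEAR = 63_241    # astronomical units in one light year

a_au = T_YEARS ** (2 / 3)
print(f"Semi-major axis: {a_au:,.0f} AU "
      f"(~{a_au / AU_PER_LIGHT_YEAR:.1f} light years)")
```

The average distance comes out at well over a light year - deep in the Oort cloud, where such perturbations would indeed matter, but also far enough out that a dim dwarf star could plausibly have escaped earlier surveys.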

What's interesting is that despite the lack of evidence, the hypothesis was widely publicised in popular science media, with the death-dealing star being appropriately named Nemesis after the Greek goddess of retribution. After all, the diversification of mammals was a direct result of the K-T extinction and so of no small importance to our species.

Unfortunately, further research has shown that mass extinctions don't fall into a neat 26 million-year cycle. In addition, orbiting and ground-based telescopes now have the ability to detect Nemesis and yet have failed to do so. It appears that the hypothesis has reached a dead end; our local corner of the universe probably just isn't as tidy as we would like it to be.

Now another hypothesis has emerged that might appear to belong in a similar category, of neat pattern matching taking precedence over solid evidence. Bearing in mind the importance of the subject under scrutiny - the origin of complex life - are researchers jumping the gun in order to gain kudos if proven correct? A report on 565 million-year-old minerals from Quebec, Canada, suggests that at that time the Earth's magnetic field was less than ten percent of its current strength, considerably lower than the earlier estimate of forty percent. The magnetic poles also appear to have reversed far more frequently during this period than they have since.

As this is directly related to the composition of the Earth's core, it has led to speculation that the inner core was then in the final stage of solidification. This would have caused increased movement in the liquid, iron-rich outer core, and thus the rapid generation of a much stronger magnetic field. In turn, the greater the magnetic dipole intensity, the fewer high-energy particles - both cosmic rays and particles from our own sun - reach the Earth's surface. What is particularly interesting about this time is that it is just prior (by about twenty million years) to the so-called Cambrian explosion, which followed three billion years or so of only microbial life. So were these geophysical changes responsible for a paradigm shift in evolution? Before accepting this apparently neat match, we would need to confirm its accuracy.

It's well known that some forms of bacteria can survive in much higher radiation environments than larger-scale life forms like us; extremophiles such as Deinococcus radiodurans have even been found thriving inside nuclear reactors. It might therefore seem obvious that more complex organisms couldn't evolve until the magnetic field was fairly strong. But until circa 430 million years ago there was no life on land (there is now evidence that fungi may have been the first organisms to survive in this harsh environment). If all life was therefore in the sea, wouldn't the deep ocean have provided the necessary radiation protection for early plants and animals?

By 600 million years ago the atmospheric oxygen content was only about ten percent of today's value - conditions of little use to almost any air-breathing animal known to have existed. In addition, the Ediacaran assemblage, albeit somewhat different from most subsequent higher animals, arose no later than this time, with chemical evidence suggesting its development stretched back a further 100 million years. The Canadian magnetic mineral evidence therefore seems to come too late for core solidification, and the resulting stronger magnetic field, to have kick-started a more sophisticated biota.

In addition, we shouldn't forget that it is the ozone layer that acts as an ultraviolet shield; UVB is just as dangerous to many organisms, including near-surface marine life, as cosmic rays and high-energy solar particles. High-altitude ozone is thought to have reached current density by 600 million years ago, with blue-green algae as its primary source. O2 levels also increased at this time, perhaps driven by climate change at the end of a global glaciation.

Although the "Snowball Earth" hypothesis - that at least half of all ocean water was frozen solid during three or four periods of glaciation - is still controversial, there is something of a correlation in time between the geophysical evidence and the emergence of the Ediacaran fauna. As to the cause of this glacial period, it is thought to have been a concatenation of circumstances, with emergent plate tectonics as a primary factor.

How to conclude? Well, we would all like to find neat, obvious solutions, especially to key questions about our own origin. Unfortunately, the hypothesis based on the magnetic mineral evidence appears to selectively ignore the evolution of the Ediacaran life forms and the development of the ozone layer. The correlation between the end of "Snowball Earth" and the Ediacaran biota evolution is on slightly firmer ground, but the period is so long ago that even dating deposits cannot be accurate except to the nearest million years or so.

It's certainly a fascinating topic, so let's hope that one day the evidence will be solid enough for us to finally understand how and when life took on the complexity we take for granted. Meanwhile, I would take any speculation based on new evidence with a Feynman-esque pinch of salt; the universe frequently fails to match the nice, neat parcels of explanation we would like it to. Isn't that one of the factors that makes science so interesting in the first place?