Thursday, 12 October 2017

The zeal in Zealandia: revealing a lost continent

From an outsider's standpoint, geology appears to be a highly conservative science. As I have mentioned on numerous occasions, it seems astonishing that it took over four decades for Alfred Wegener's continental drift hypothesis to be formalised - via the paradigm-shifting discovery of sea floor spreading - into the theory of plate tectonics. I suppose that like evolution by natural selection, the mechanism, once stated, seems blindingly obvious in hindsight.

Regardless, the geological establishment appears to have been stubbornly opposed to the ideas of an outsider (Wegener was a meteorologist) who was unable to provide proof of an exact mechanism. This was despite the fact that the primary alternative, hypothetical submerged (but extremely convenient) land bridges, appears even more far-fetched.

Over the past few decades, geophysical data has been accumulating that should force rewrites of textbooks from the most basic level upwards. Namely, that the islands making up New Zealand are merely the tip of the iceberg, accounting for just six per cent of a mostly submerged 'lost' continent. Once part of the Southern Hemisphere supercontinent Gondwana, the newly recognised continent was given the name Zealandia in 1995. Approximately five million square kilometres in size, it broke away from the Australasian region of Gondwana around 70-80 million years ago.
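
As a quick sanity check - my own back-of-envelope arithmetic, not a figure from the literature - six per cent of roughly five million square kilometres is

    $$0.06 \times 4.9 \times 10^{6}\,\mathrm{km^2} \approx 294{,}000\,\mathrm{km^2}$$

which tallies nicely with the combined land area of New Zealand (about 268,000 km²) and New Caledonia (about 19,000 km²), the two main above-water fragments.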

After a decade or two of fairly lacklustre reporting, 2017 seems to be the year in which Zealandia is taking off in the public domain. First, the Geological Society of America published a paper in February stating that Zealandia should be officially declared a continent. Then in July the drill ship JOIDES Resolution began the two-month-long Expedition 371, a research trip under the International Ocean Discovery Program (IODP). Scientists from twelve countries undertook deep-sea drilling, gaining data on plate tectonics, palaeontology and climate history as well as research directly relevant to understanding the geology of the newest continent.

It is surprising then to learn that geologists first mooted the idea as early as the 1960s, yet apart from some marine core samples collected in 1971, no-one undertook the necessary ocean-based research until very recently. Earth resources satellites have helped somewhat, but nothing could replace the evidence that emerged with deep drilling of the seabed. So what has sparked the sudden interest in an idea that has been around for so long?

One possibility is the sheer quantity of data that the international geological community required to prove the theory beyond doubt, coupled with the fact that this sort of research has little in the way of an obvious immediate practical benefit. Deep-sea drilling is extremely expensive and few vessels are equipped for the purpose. JOIDES Resolution itself will be forty years old next year, having undergone several years of refits to keep it going. Those areas of seabed with potential oil or gas deposits may gain high-fidelity surveying, but compared to fossil fuels, research into fossil biota and seabed strata is very much at the whim of international project funding. In the case of the IODP, governments are cutting budgets on what are deemed non-essential projects, so it remains to be seen whether the intended follow-up trips will occur.

It would be disappointing if there were no further research, as despite the acceptance of Zealandia there is still a great deal of disagreement about what is known as the Oligocene Drowning. I first came across the notion of an eighth continent in the excellent 2007 book In Search of Ancient New Zealand, written by geologist and palaeontologist Hamish Campbell and natural history writer Gerard Hutching. Over ninety per cent of Zealandia is underwater because its continental crust is unusually thin - only 20-30 km - making it far less buoyant than the other continents.
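
A crude way to see why thickness matters is Airy isostasy: continental crust floats on the denser mantle much as an iceberg floats in water. Taking illustrative round-number densities of 2,800 kg/m³ for crust and 3,300 kg/m³ for mantle (my own assumptions, purely for the sketch), the fraction of a crustal column standing above its compensation depth is

    $$1 - \frac{\rho_c}{\rho_m} \approx 1 - \frac{2800}{3300} \approx 0.15$$

so thinning the crust from a typical continental 40 km to Zealandia's 20-30 km lowers the surface by roughly 0.15 × (10 to 20 km), i.e. some 1.5-3 km - more than enough to put most of the continent below sea level.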

But has this submerged percentage varied during the past eighty million years? There are some very divided opinions on this, with palaeontologists, geneticists and other disciplines taking sides with different camps of geologists. These can be roughly summarised as Moa's Ark versus the Oligocene Drowning; or, to be more precise: how much, if any, of New Zealand's unique flora and fauna comprises locally-derived Gondwanan survivors, and how much arrived by sea or air within the past twenty or so million years?

The arguments are many and varied, with each side claiming that the other has misinterpreted limited or inaccurate data. If Zealandia has at any time been entirely submerged, then presumably next to none of the current fauna and flora can have remained in situ since the continent broke away from Gondwana. The evidence for and against includes geology, macro- and micro-fossils, and genetic comparisons, but nothing as yet provides enough certainty for a watertight case in either direction. In Search of Ancient New Zealand examines evidence that all of Zealandia was under water around twenty-three million years ago, during the event known as the Oligocene Drowning. However, Hamish Campbell's subsequent 2014 book Zealandia: Our Continent Revealed (co-written with Nick Mortimer) discusses the finding of land-eroded sediments from this epoch, implying that not all the continent was submerged.

It's easy to see why experts might be reluctant to alter their initial stance, since in addition to the conservative nature of geology there are non-science factors such as patriotism at stake. New Zealand's unusual biota is a key element of its national identity, so for New Zealand scientists it's pretty much a case of challenge it at your peril! In 2003 I visited the predator-free Karori Wildlife Reserve in Wellington. Six years later it was rebranded as Zealandia, deliberately referencing the eighth continent and with more than a hint of support for Moa's Ark, i.e. an unbroken chain of home-grown oddities such as the tuatara (a reptile) and the weta (an insect). With the nation's reliance on tourism and the use of the '100% Pure New Zealand' slogan, a lot rests on the idea of unique and long-isolated wildlife. If the flightless kakapo parrot, for example, turns out not to be very Kiwi after all, then who knows how the country's reputation might suffer.

What isn't well known, even within New Zealand, is that some of the best-known animals and plants are very recent arrivals. In addition to the numerous species deliberately or accidentally introduced by settlers in the past two hundred years, birds such as the silvereye or waxeye (Zosterops lateralis) and the welcome swallow (Hirundo neoxena) are self-introduced, as is the monarch butterfly.

The volcanic island of Rangitoto in Auckland's Hauraki Gulf is only about six centuries old and yet - without any human intervention - has gained the largest pohutukawa forest in the world, presumably all thanks to seeds spread on the wind and by birds. Therefore it cannot be confirmed with any certainty just how long the ancestors of the current flora and fauna have survived in the locality. A number of New Zealand scientists are probably worried that some of the nation's best-loved species may have arrived relatively recently from across the Tasman; a fossil discovered in 2013 suggests that the flightless kiwi is a fairly close cousin of the Australian emu and so is descended from a bird that flew to New Zealand before settling into an ecological niche that didn't require flight.

Other palaeontological evidence supports the Moa's Ark hypothesis: since 2001, work on a lake bed at St Bathans in Central Otago has produced a wide range of 16-million-year-old fossils, including three bones from a mouse-sized land mammal. The diversity of the assemblage indicates that unless there was some uniquely rapid colonisation and subsequent speciation, there must have been above-water regions throughout the Oligocene. In addition, whereas the pro-drowning faction have concentrated on vertebrates, research into smaller critters such as giant land snails (which are unable to survive in salt water) supports the opposite proposition.

So all in all, there is as yet no definitive proof one way or the other. What's interesting about this particular set of hypotheses is the way in which an array of disciplines are coming together to provide a more accurate picture of New Zealand's past. By working together, they also seem to be reducing the inertia that has led geology to overlook new ideas for far too long; Zealandia, your time has come!

Wednesday, 27 September 2017

Cow farts and climate fiddling: has agriculture prevented a new glaciation?

Call me an old grouch, but I have to say that one of my bugbears is the use of the term 'ice age' when what is usually meant is a glacial period. We currently live in an interglacial (i.e. warmer) era, the last glaciation having ended about 11,700 years ago. Both are part of the Quaternary glaciation, which has existed for almost 2.6 million years and is the span genuinely deserving of the name 'Ice Age', with its alternating but irregular cycles of warm and cold. There, that wasn't too difficult now, was it?

What is rather more interesting is that certain geology textbooks published from the 1940s to the 1970s hypothesised that the Earth was overdue for the next glaciation. Since the evidence suggests the last glacial era ended in a matter of decades, the proposed future growth of the ice sheets could be equally rapid. Subsequent research has shown this notion to be flawed, its reliance on extremely limited data leading to over-confident conclusions. In fact, current estimates put interglacial periods as lasting anywhere from ten thousand to fifty thousand years, so even without human intervention in global climate there would presumably be little to panic about just yet.

Over the past three decades or so this cooling hypothesis has given way to the opposing notion of a rapid increase in global temperatures. You only have to read such recent news items as the breakaway of a six-thousand-square-kilometre piece of the Antarctic ice shelf to realise something is going on, regardless of whether you believe it is man-made, natural or a combination of both. But there is a minority of scientists who claim there is evidence that global warming - and an associated postponement of the next glaciation - began thousands of years prior to the Industrial Revolution. This generates two key questions:

  1. Has there been a genuine steady increase in global temperature or is the data flawed?
  2. Assuming the increase to be accurate, is it due to natural changes (e.g. orbital variations or fluctuations in solar output) or is it anthropogenic, that is, caused by human activity?

As anyone with even a vague interest in or knowledge of climate understands, the study of temperature variation over long timescales is fraught with issues, with computer modelling often seen as the only way to fill in the gaps. Therefore, like weather forecasting, it is far from being an exact science (insert as many smileys here as deemed appropriate). Although there are climate-recording techniques involving dendrochronology (tree rings) and coral growth that cover the past few thousand years, and ice cores that go back hundreds of thousands, there are still gaps and assumptions that mean the reconstructions involve variable margins of error. One cross-discipline assumption is that species found in the fossil record thrived in environments - and crucially at temperatures - similar to those favoured by their descendants today. All in all, this indicates that none of the numerous charts and diagrams displaying global temperatures over the past twelve thousand years are completely accurate, being more along the lines of reconstructions via extrapolation.

Having looked at some of these charts, I have to say that to my untrained eye there is extremely limited agreement between them for the majority of the post-glacial epoch. There have been several short-term fluctuations in both directions in the past two thousand years alone, from the so-called Mediaeval Warm Period to the Little Ice Age of the Thirteenth to Nineteenth Centuries. One issue of great importance is just how wide a region these two anomalous periods covered outside Europe and western Asia. Assuming, however, that the gradual warming hypothesis is correct, what are the pertinent details?

Developed in the 1920s, the Milankovitch theory provides a reasonable fit for the evidence of regular, long-term variations in the global climate. It states that cyclical changes in the Earth's orbit and axial tilt are the primary causes of these variations, although the timelines do not provide indisputable correlation. This margin of error has helped lead other researchers towards an anthropogenic cause for a gradual increase in planet-wide warming since the last glaciation.
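
For anyone who wants a feel for why the combined orbital forcing is so irregular, here is a minimal toy sketch in Python (the periods are the usual textbook approximations; the amplitudes are entirely arbitrary, chosen only for illustration):

    import numpy as np

    # Approximate periods of the three main Milankovitch cycles (in kyr):
    # ~100 kyr orbital eccentricity, ~41 kyr axial tilt (obliquity) and
    # ~23 kyr axial precession. The amplitudes below are arbitrary.
    t = np.linspace(0, 800, 8001)  # time in thousands of years
    forcing = (1.0 * np.sin(2 * np.pi * t / 100)
               + 0.7 * np.sin(2 * np.pi * t / 41)
               + 0.4 * np.sin(2 * np.pi * t / 23))
    # The three sine waves drift in and out of phase, so the summed
    # 'forcing' is quasi-periodic - which is one reason the spacing of
    # glacials and interglacials is irregular rather than clockwork.

Plotting 'forcing' against 't' shows peaks and troughs that never quite repeat, loosely mimicking the irregular pacing seen in the ice core record; it is the residual mismatch between such orbital forcing and the proxy data that leaves room for the anthropogenic explanation discussed next.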

The first I heard of this was via Professor Iain Stewart's 2010 BBC series How Earth Made Us, in which he summarised the ideas of American palaeoclimatologist Professor William Ruddiman, author of Plows, Plagues and Petroleum: How Humans Took Control of Climate. Although many authors, Jared Diamond amongst them, have noted the effects of regional climate on local agriculture and indeed on the society engaged in farming, Professor Ruddiman is a key exponent of the reverse: that pre-industrial global warming resulted from human activities. Specifically, he argues that the development of agriculture led to increases in atmospheric methane and carbon dioxide, creating an artificial greenhouse effect long before the burning of fossil fuels became ubiquitous. It is this form of climate change that has been cited as postponing the next glaciation, assuming that the current interglacial is at the shorter end of such timescales. Ruddiman's research identifies two major causes for the increase in these greenhouse gases:

  1. Increased carbon dioxide emissions from burning vegetation, especially trees, as a form of land clearance, i.e. slash and burn agriculture.
  2. Increased methane from certain crops, especially rice, and from ruminant species, mostly cattle, sheep and goats.

There are of course issues surrounding many of the details, even down to accurately pinpointing the start dates of human agriculture around the world. The earliest evidence of farming in the Near East is usually dated to a few millennia after the end of the last glaciation, with animal husbandry preceding the cultivation of crops. One key issue concerns the lack of sophistication in estimating the area of cultivated land and the size of ruminant populations until comparatively recent times, especially outside Western Europe. Therefore the generally unsatisfactory data concerning global climate is accompanied by even less knowledge concerning the scale of agriculture across the planet for most of the time it has existed.

The archaeological evidence in New Zealand proves without a doubt that the ancestors of today's Maori, who probably first settled the islands in the Thirteenth Century, undertook enormous land clearance schemes. Even cultures remote from the primary agricultural civilisations, then, have used similar techniques on a wide scale. The magnitude of these works challenges the assumption that until chemical fertilisers and pesticides were developed in the Twentieth Century, the area of land required per person had altered little since the first farmers. In a 2013 report Professor Ruddiman claims that the level of agriculture practised by New Zealand Maori is just one example of wider-scale agricultural land use in pre-industrial societies.

As for the role played by domesticated livestock, Ruddiman goes on to argue that ice core data shows an anomalous increase in atmospheric methane from circa 3000 BCE onwards. He hypothesises that a rising human population led to a corresponding increase in the scale of agriculture, with rice paddies and ruminants the prime suspects. As mentioned above, the number of animals and the size of cultivated areas remain largely conjectural for much of the period in question. Estimates suggest that contemporary livestock are responsible for 37% of anthropogenic methane and 9% of anthropogenic carbon dioxide, whilst cultivated rice may generate up to 20% of anthropogenic methane. Extrapolating back in time lends the hypothesis credence, despite the lack of exact data.

In addition, researchers both for and against pre-industrial anthropogenic global warming admit that the complexity of feedback loops, particularly with respect to the role of temperature variation in the oceans, further complicates matters. Indeed, such intricacy, including the potential latency between cause and effect, means that proponents of Professor Ruddiman's ideas could be using selective data for support whilst ignoring its antithesis. Needless to say, cherry-picking results is hardly model science.

There are certainly some intriguing aspects to this idea of pre-industrial anthropogenic climate change, but personally I think the jury is still out (as I believe it is for the majority of professionals in this area). There just isn't the level of data to guarantee its validity, and what data is available doesn't provide enough correlation to rule out other causes. I still think such research is useful, since it could well prove essential in the fight to mitigate industrial-era global warming. The more we know about longer-term variations in climate, the better the chance we have of understanding the causes of - and potentially the solutions to - our current predicament. And who knows, the research might even persuade a few of the naysayers to move in the right direction. That can't be bad!

Monday, 11 September 2017

Valuing the velvet worm: noticing the most inconspicuous of species

Most of the recent television documentaries or books I've encountered that discuss extra-terrestrial life include some description of the weirder species we share our own planet with. Lumped together under the term 'extremophiles', these organisms appear to thrive in environments hostile to most other life forms, from the coolant ponds of nuclear reactors to the boiling volcanic vents of the deep ocean floor.

Although this has rightly gained attention for these often wonderfully-named species (from snottites to tardigrades), there are numerous other lifeforms scarcely noticed by anyone other than a few specialists, quietly going about their unassuming business. Yet they may hold a few useful lessons for all of us, including a reminder that rapid yet radical modifications to local environments can generate problems we have not yet recognised.

There is a small, unassuming type of creature alive today that differs little from a marine animal present in the Middle Cambrian period around five hundred million years ago. I first read about onychophorans in Stephen Jay Gould's 1989 exposition on the Burgess Shale, Wonderful Life, and although those fossil marine lobopodians are not definitively onychophorans, they are presumed to be ancestral. More commonly known by one genus, peripatus, or even more colloquially as velvet worms, there are at least several hundred species around today, possibly many more. The velvet component of their name is due to their texture, but they bear more resemblance to caterpillars than to worms. They are often described as the 'missing link' between arthropods and worms, but as is usually the case with that phrase, it is wildly inappropriate in the context of biological classification. The key difference from the Burgess Shale specimens is that today's velvet worms are fully terrestrial: there are no known marine or freshwater species.

Primarily resident in the southern hemisphere, the largely nocturnal peripatus shun bright light and require humid conditions to survive. Although there are about thirty species here in New Zealand, a combination of their small size (under 60 mm long) and loss of habitat means they are rarely seen. The introduction of predators such as hedgehogs - which of course never meet peripatus in their native northern hemisphere range - means that New Zealand's species have even more to contend with. Although I frequently (and very carefully) look under leaf litter and inside damp logs on bush walks in regions known to contain the genus Peripatoides - and indeed where others have told me they have seen them - I have yet to encounter a single specimen.

There appears to be quite limited research, with fewer than a third of New Zealand species fully described. However, enough is known about two species to identify their population status as 'vulnerable'. One forest in the South Island has been labelled an 'Area of Significant Conservation Value' thanks to its population of peripatus, with the Department of Conservation relocating specimens prior to road development. Clearly, they had better luck locating velvet worms than I have had! It isn't just New Zealand that lacks knowledge of its home-grown onychophorans either: in the past two decades Australian researchers have increased the number of their known species from just seven to about sixty.

Their uncanny resemblance to the Burgess Shale specimens, despite the transition from marine to terrestrial environments, has led velvet worms to be described by another well-worn phrase, 'living fossils'. However, is this short-hand in any way useful, or is it a lazy and largely inaccurate term? The recent growth in sophisticated DNA analysis suggests that even when outward anatomy has changed little, the genome itself may vary widely. Obviously DNA doesn't survive in fossils of this age, so any such changes cannot be tracked from the Cambrian specimens, but the genetic variation found in other types of organisms sharing a similar appearance shows that reliance on external anatomy alone can be deceptive.

Thanks to a lack of funding, basic taxonomic research - the bedrock of cladistics - is sadly neglected. In the case of New Zealand, some of the shortfall has been made up for by dedicated amateurs, but there are few new taxonomists learning the skills to continue this work, which is often seen as dull and plodding compared to the excitement of, say, genetics. Most people might ask what interest there could be in such tiny, insignificant creatures as peripatus. After all, how likely would you be to move an ant's nest in your garden before undertaking some re-landscaping? But as shown by the changing terminology from 'food chains' to 'food webs', in most cases we still don't understand how the removal of one species might generate a domino effect on a local ecosystem.

I've previously discussed the over-reliance on 'poster' species such as giant pandas for environmental campaigns, but mere aesthetics don't equate to importance, either for us or for ecology as a whole. It is becoming increasingly clear that by weight the majority of our planet's biomass is microbial. Then come the insects, with the beetles prominent both in number of species and of individuals. We large mammals are really just the icing on the cake, and certainly when it comes to Homo sapiens, the rest of the biosphere would probably be far better off without us, domesticated species aside.

It would be nice to value organisms for themselves, but unfortunately our market economies require the smell of profit before they will lift a finger. Therefore if the usefulness of a species can be ascertained, it might help generate greater financial incentive to support the wider environment. Onychophorans may seem dull, but there are several aspects to them that are both interesting in themselves and might also provide something fruitful for us humans.

Firstly, they have an unusual weapon in the form of a mechanism that shoots adhesive slime at prey. Might this slime, like spider silk, prove an interesting line of research in the materials or pharmaceutical industries? After all, it was the prickly burrs of certain plants that inspired the development of Velcro, whilst current studies of tardigrades (the tiny 'water bears' living amongst the mosses) are investigating their near indestructibility. If even a single, tiny species becomes extinct, that genome is generally lost forever: who knows what insights it might have led to? Although museum collections can be useful, DNA does decay, and contamination leads to immense complexities in unravelling the original organism's genome. All in all, it's much better to have a living population to work on than to rely on what can be pieced together post-extinction.

In addition, for such tiny creatures, velvet worms have developed complex social structures; might analysis of their brains prove useful in computing or artificial intelligence? Admittedly it's a long shot. And extinction is nothing if not natural - but the current rate is far greater than it has been outside of mass extinctions. Losing a large and obvious species such as the Yangtze River dolphin (and that despite it being labelled a 'national treasure') is one thing, but how many small, barely-known plants and animals are going the same way without anyone noticing? Could it be that right now some minute, unassuming critter is dying out, and that we will only find out too late that it was a vital predator of crop-eating pests like snails, or of disease vectors such as cockroaches?

It has been said that ignorance is bliss, but with so many humans needing to be fed, watered and treated for illness, now more than ever we need as much help as we can get. Having access to the complex ready-made biochemistry of a unique genome is surely easier than attempting to synthesise one from scratch or recover it from a long-dead preserved specimen? By paying minimal attention to the smallest organisms that lie all around us, we could be losing so much more than just an unobtrusive plant, animal or fungus.

We can't save every species on the current endangered list, but more attention could be given to the myriad life forms that get side-lined by the cute and cuddly flagship species, usually large animals. Most of us would be upset by the disappearance of the eighteen hundred or so giant pandas still left in the wild, but somehow I doubt their loss would have as great an impact on the surrounding ecosystem as that of some far less well-known flora or fauna. If you think that's nonsense, then consider the vital roles that bees and dung beetles play in helping human agriculture.

Although the decimation of native New Zealand wildlife has led to protective legislation for all our vertebrates and a few famous invertebrates such as the giant weta, the vast majority of other species are still left to their own devices. Not that most other countries give their ecosystems any more support, of course. But without funding for basic description and taxonomy, who knows what is even out there, never mind whether it might be important to humanity? Could this be a new field for citizen scientists to move into?

Needless to say, the drier climes brought on by rising temperatures will not do peripatus any favours, thanks to its need to remain in damp conditions. Whether through widespread use of the poison 1080 (in the bid to create a pest-free New Zealand by 2050) or the accidental importation of non-native pathogens such as those decimating amphibians worldwide and causing kauri dieback in New Zealand, there are plenty of ways that humans could unwittingly wipe out velvet worms and their kin. So next time you watch a documentary on the demise of large, familiar mammals, why not spare a thought for all those wee critters hiding in the bush, going about their business and trying to avoid all the pitfalls us humans have unthinkingly laid for them?

Tuesday, 29 August 2017

Cerebral celebrities: do superstar scientists harm science?

One of my earliest blog posts concerned the media circus surrounding two of the most famous scientists alive today: British physicist Stephen Hawking and his compatriot, the evolutionary biologist Richard Dawkins. In addition to their scientific output, they are known in public circles thanks to a combination of general-readership books, television documentaries and charismatic personalities. The question has to be asked, though: how much of their reputation is due to their being easily-caricatured and therefore media-friendly characters, rather than to what they have contributed to human knowledge?

Social media has done much to democratise the publication of material from a far wider range of authors than previously possible, but the current generation of scientific superstars who have arisen in the intervening eight years appear to be part of a feedback loop that places personality as the primary reason for their media success. As a result, are science heroes such as Neil deGrasse Tyson and Brian Cox merely adding the epithet 'cool' to STEM disciplines as they sit alongside the latest crop of media and sports stars? With their ability to fill arenas usually reserved for pop concerts or sports events, these scientists are seemingly known far and wide for who they are as much as for what they have achieved. It might seem counterintuitive to think that famous scientists and mathematicians could be damaging STEM, but I'd like to put forward five ways in which this could be occurring:

1: Hype and gossip

If fans of famous scientists spend their time reading, liking and commenting at similarly trivial levels, they may miss important material from other, less famous sources. A recent example that caught my eye was a tweet by British astrophysicist and presenter Brian Cox, containing a photograph of two swans he had labelled 'Donald' and 'Boris'. I assume this was a reference to the current US president and the British foreign secretary, but with over a thousand 'likes' by the time I saw it, I wonder what other, more serious, STEM-related stories might have been missed in the rapid ebb and flow of social media.

As you would expect with popular culture fandom, the science celebrities' material aimed at a general audience receives the lion's share of attention, leaving the vast majority of STEM popularisations under-recognised. Although social media has exacerbated this, the phenomenon pre-dates it. For example, Stephen Hawking's A Brief History of Time was first published in 1988, the same year as Timothy Ferris's Coming of Age in the Milky Way, a rather more detailed approach to similar material that was left overshadowed by its far more famous competitor. There is also the danger that celebrities with a non-science background might try to cash in on the current appeal of science and write poor-quality popularisations. If you consider this unlikely, bear in mind that there are already numerous examples of extremely dubious health, diet and nutrition books written by pop artists and movie stars. If scientists can be famous, perhaps the famous will play at being science writers.

Another result of this media hubbub is that in order to be heard, some scientists may be guilty of the very hype usually blamed on the journalists who publicise their discoveries. Whether to guarantee attention or to self-promote in the hope of further funding, an Australian research team recently came under fire for discussing a medical breakthrough as if a treatment were imminent, despite having so far only experimented on mice! This sort of hyperbole both damages the integrity of science in the public eye and can lead to such dangerous outcomes as the MMR scandal, which resulted in large numbers of children not being immunised.

2: Hero worship

The worship of movie stars and pop music artists is nothing new, and the adulation accorded them reminds me of the not dissimilar veneration shown to earlier generations of secular and religious leaders. The danger here, then, is that impressionable fans accept the words of celebrity scientists as if they were gospel and so refrain from any form of critical analysis. When I attended an evening with astrophysicist Neil deGrasse Tyson last month, I was astonished to hear some fundamental misunderstandings of science from members of the public. It seemed as if Dr Tyson had gained a personality cult that hung on his every utterance but frequently failed to understand the wider context or key issues regarding the practice of science. Transferring hero worship from one form of human activity to another risks undermining the very basis - and differentiation - that delineates the scientific enterprise.

3: Amplifying errors

Let's face it, scientists are human and make mistakes. The problem is that if the majority of a celebrity scientist's fan base are prepared to lap up every statement, the lack of critical analysis can generate further issues. There are some appalling gaffes in the television documentaries and popular books of such luminaries as Sir David Attenborough (as previously discussed), and even superstar Brian Cox is not immune: his 2014 book Human Universe described lunar temperatures dropping below -2000 degrees Celsius! Such basic errors imply that the material is ghost-written or edited by authors with little scientific knowledge and no time for fact-checking. Of course this may embarrass the science celebrity in front of their potentially jealous colleagues, but more importantly it can serve as ammunition for politicians, industrialists and pseudo-scientists in their battles to persuade the public of the validity of their own pet theories - post-truth will out, and all that nonsense.

4: Star attitude

With celebrity status come the trappings of success, most usually defined as a luxury lifestyle. A recent online discussion here in New Zealand concerned the high cost of tickets for events featuring Neil deGrasse Tyson, Brian Greene, David Attenborough, Jane Goodall and, later this year, Brian Cox. Those for Auckland-based events were more expensive than tickets to see Kiwi pop star Lorde, and similar in price to those for rugby matches between the All Blacks and the British and Irish Lions. By making the tickets this expensive there is little chance of attracting new fans; it seems to be more a case of preaching to the converted.

Surely it doesn't have to be this way: the evolutionary biologist Beth Shapiro, author of How to Clone a Mammoth, gave an excellent free illustrated talk at Auckland Museum a year ago. It seems odd that the evening with Dr Tyson, for example, consisting of just himself, interviewer Michelle Dickinson (a.k.a. Nanogirl) and a large screen, cost approximately double the price of the Walking with Dinosaurs arena event at the same venue two years earlier, which utilised US$20 million worth of animatronic and puppet life-sized dinosaurs.

Dr Tyson claims that by having celebrity interviewees on his StarTalk series he can reach a wider audience, but clearly that wider reach is not achieved when his tour prices are so high. At least Dr Goodall's profits went into her conservation charity, but considering that Dr Tyson had an audience of probably over 8,000 in Auckland alone, paying between NZ$95 and NZ$349 (except for the NZ$55 student tickets), you have to wonder where all this money goes: is he collecting 'billions and billions' of fancy waistcoats? Nor does this trend look likely to stop soon: Bill Nye (The Science Guy) has just announced that he will be touring Australia later this year, with tickets starting at around NZ$77.

5: Skewing the statistics

The high profiles of sci-comm royalty and their usually cheery demeanour imply that all is well in the field of scientific research, with adequate funding for important projects. However, even a quick perusal of less well-known STEM professionals on social media proves that this is not the case. An example that came to my attention back in May was that of the University of Auckland microbiologist Dr Siouxsie Wiles, who had to resort to crowdfunding for her research into fungi-based antibiotics after five consecutive funding submissions were rejected. Meanwhile, Brian Cox's connection to the Large Hadron Collider gives the impression that even such blue-sky research as the LHC can be guaranteed enormous budgets.

As much as I'd like to thank these science superstars for promoting science, technology and mathematics, I can't quite shake the feeling that their cult status is centred too much on them rather than on the scientific enterprise as a whole. Now more than ever science needs a sympathetic ear from the public, but this should be brought about by a massive programme to educate the public (they are the taxpayers, after all) as to the benefits of such costly schemes as designing nuclear fusion reactors and researching climate change. Simply treating celebrity scientists in the same way as movie stars and pop idols won't help an area of humanity under siege from so many influential political and industrial leaders with their own private agendas. We simply mustn't allow such people to misuse the discipline that has raised us from apemen to spacemen.

Friday, 11 August 2017

From steampunk to Star Trek: the interwoven strands between science, technology and consumer design

With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras for example now reach worldwide sales of over five million per year. That's hardly a niche market!

Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that whilst miniaturisation has reduced the energy consumption of many smaller pieces of technology, taking us from the Frankenstein-laboratory appearance of valve-based computing and room-sized mainframes to the smart watch and its kin, the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken that perennial favourite example, the space rocket, when it comes to power usage.

The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made - telescopes or crystal radios, for example. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravitational wave detectors or CRISPR-Cas9 genome editing tools any time soon.

The current trend in favour - or at least acknowledgement - of sustainable development is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) equates with progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to be five main phases since the late Victorian period:
  1. Imperial steam
  2. Streamlining and speed
  3. The Atomic Age
  4. Minimalism and information technology
  5. Virtual light

1) Imperial steam

In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that, seemingly without effort, carries an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.

I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of this phase's design, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that had failed to go beyond the aeolipile toy turbine and the Antikythera mechanism.

2) Streamlining and speed

From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, where the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific class steam locomotives.

3) The Atomic Age

By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. There is something ironic in the fact that, somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (semi-classical) model of the atom itself gained a key place in post-war design.

Combined with rockets and space, the imagery could readily be termed 'space cadet', but physics wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised X-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (models of the sort that famously gave Crick and Watson the edge in the race to understand the structure of DNA).

Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.

4) Minimalism and information technology

From the early 1970s the bright, optimistic designs of the previous quarter century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied if refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video games consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.

5) Virtual light

With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to holographic computer interfaces in many Hollywood blockbusters represent both the awesome power actually required by the likes of the Large Hadron Collider and the visually unspectacular reality of lasers and quantum teleportation. The ultimate fusion (sorry, couldn't resist that one) was the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek Into Darkness.

Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.

Friday, 28 July 2017

Navigating creation: A Cosmic Perspective with Neil deGrasse Tyson


I recently attended an interesting event at an Auckland venue usually reserved for pop music concerts. An audience in the thousands came to Neil deGrasse Tyson: A Cosmic Perspective, featuring the presenter of Cosmos: A Spacetime Odyssey and the radio/TV show StarTalk. The 'Sexiest Astrophysicist Alive' presented his brand of science communication to an enormous congregation (forgive the use of the word) of science fans, some as young as five years old. So was the evening a success? My fellow science buffs certainly seemed to have enjoyed it, so I decided it would be worthwhile to analyse the good doctor's method of large-scale sci-comm.

The evening was split into three sections, the first being the shortest: a primer as to our location in both physical and psychological space-time. After conveying the scale of the universe via a painless explanation of exponents, Dr Tyson used the homespun example of how the 'billions' (which of course he declared to be Carl Sagan's favourite word) of Big Macs so far sold could be stacked many times around the Earth's circumference and even then extend onwards to the Moon and back. Although using such a familiar object in such unusual terrain is a powerful way of taking people outside their comfort territory, there was nothing new about this particular insight, since Dr Tyson has been using it since at least 2009; I assume it was a case of sticking to a tried-and-trusted method, especially when the rest of the evening was (presumably) unscripted.

Billions of Big Macs around the Earth and moon
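
Out of curiosity I tried the arithmetic myself afterwards. A minimal sketch in Python - where the sales total and per-burger height are my own round-number guesses, not Dr Tyson's figures - bears the claim out:

    # Back-of-envelope check of the Big Mac illustration.
    BIG_MACS_SOLD = 300e9           # assumption: ~300 billion served
    BURGER_HEIGHT_M = 0.07          # assumption: ~7 cm per Big Mac
    EARTH_CIRCUMFERENCE_KM = 40_075
    EARTH_MOON_DISTANCE_KM = 384_400

    stack_km = BIG_MACS_SOLD * BURGER_HEIGHT_M / 1000
    laps = stack_km / EARTH_CIRCUMFERENCE_KM
    round_trips = stack_km / (2 * EARTH_MOON_DISTANCE_KM)
    print(f"{stack_km:,.0f} km: about {laps:,.0f} times around the "
          f"Earth, or {round_trips:.0f} return trips to the Moon")

With those assumptions the line of burgers runs to some 21 million kilometres - roughly 500 circuits of the Earth, or a couple of dozen return trips to the Moon.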

Having thus humbled our sense of our location in the universe, the remainder of the first segment appraised our species' smug sense of superiority, questioning whether extra-terrestrials would have any more interest in us than we show to most of the biota here on Earth. This was a clear attempt to ask the audience to question the assumptions that science fiction, particularly of the Hollywood variety, has been popularising since the dawn of the Space Age. After all, would another civilisation consider us worthy of communicating with, considering how much of our broadcasting displays obvious acts of aggression? In this respect, Neil deGrasse Tyson differs markedly from Carl Sagan, who argued that curiosity would likely be a mutual connection with alien civilisations, despite their vastly superior technology. Perhaps this difference of attitude isn't surprising, considering Sagan's optimism has been negated by both general circumstance and the failure of SETI in the intervening decades.

Dr Tyson also had a few gibes at the worrying trend of over-reliance on high technology in place of basic cognitive skills, describing how after once working out some fairly elementary arithmetic he was asked which mobile app he had used to gain the result! This was to become a central theme of the evening, repeated several times in different guises: that rather than just learning scientific facts, non-scientists can benefit from practising critical thinking in non-STEM situations in everyday life.

Far from concentrating solely on astrophysical matters, Dr Tyson also followed up on topics he had raised in Cosmos: A Spacetime Odyssey regarding environmental issues here on Earth. He used Apollo 8's famous 'Earthrise' photograph (taken on Christmas Eve 1968) as an example of how NASA's lunar landing programme inspired a cosmic perspective, adding that organisations such as the National Oceanic and Atmospheric Administration and the Environmental Protection Agency were founded during the programme. His thesis was clear: what began with political and strategic causes had fundamental benefits across sectors unrelated to space exploration; or as he put it, "We're thinking we're exploring the moon and we discovered the Earth for the first time."

The second and main part of the event was Tyson's discussion with New Zealand-based nanotechnologist and science educator Michelle Dickinson, A.K.A. Nanogirl. I can only assume that there aren't any New Zealand astronomers or astrophysicists as media-savvy as Dr Dickinson, or possibly it's a case of celebrity first and detailed knowledge second, with a scientifically-minded interviewer deemed to have an appropriate enough mindset even if not an expert in the same specialisation.

The discussion/interview was enlightening, especially for someone like myself who knows Neil deGrasse Tyson as a presenter but very little about him as a person. Dr Tyson reminisced how in 1989 he accidentally became a media expert solely on the basis of being an astrophysicist, without reference to his being an Afro-American - counter to the prevailing culture of only featuring Afro-Americans when the topic was their own point of view.

Neil deGrasse Tyson: A Cosmic Perspective

Dr Tyson revealed himself to be both a dreamer and a realist, the two facets achieving a focal point with his passion for a crewed mission to Mars. He has often spoken of his desire to increase NASA's (comparatively small) budget so as to reinvigorate the United States by taking humans out of the humdrum comfort zone of low Earth orbit. However, his understanding of how dangerous such a mission would be led him to state that he would only go to Mars once the pioneering phase was over!

His zeal for his home country was obvious, as was his frustration at the missed opportunities and the grass-roots rejection of scientific expertise prevalent in the United States, and it would be easy to see his passionate pleas for the world to embrace Apollo-scale STEM projects as naïve and out-of-touch. Yet there is something to be said for such epic schemes; if the USA is to rise out of its present lassitude, then the numerous if unpredictable long-term benefits of, for example, a Mars mission are a potential call-to-arms.

The final part of the evening was devoted to audience questions. As I was already aware of most of the STEM and sci-comm components previously discussed, this was for me perhaps the most illuminating section of the event. The first question was about quantum mechanics, and so, not unnaturally, Dr Tyson stated that he wasn't qualified to answer it. Wouldn't it be great if the scientific approach to expertise could be carried across to other areas where people claim expert knowledge that they don't have?

I discussed the negative effects that the cult of celebrity could have on the public attitude towards science back in 2009, so it was extremely interesting to hear questions from several millennials who had grown up with StarTalk and claimed Neil deGrasse Tyson as their idol. Despite having watched the programmes and presumably having read some popular science books, they fell into some common traps, from over-reliance on celebrities as arbiters of truth to assuming that most scientific theories - rather than just the cutting edge - would be overturned by new discoveries within their own lifetimes.

Dr Tyson went to some lengths to correct this latter notion, describing how Newton's law of universal gravitation, for example, has become a special case within Einstein's General Theory of Relativity. Again, this reiterated that science isn't just a body of facts but a series of approaches to understanding nature. The Q&A session also showed that authority figures can have a rather obvious dampening effect on people's initiative to attempt critical analysis for themselves. This suggests a no-win situation: either the public obediently believe everything experts tell them (which leads to such horrors as the MMR vaccine scandal) or they fail to believe anything from STEM professionals, leaving the way open for pseudoscience and other nonsense. Dr Tyson confirmed that he wants to teach the public to think critically, reducing gullibility and thus exploitation by snake-oil merchants. To this end he follows in the tradition of James 'The Amazing' Randi and Carl Sagan, which is no bad thing in itself.

In addition, by interviewing media celebrities on StarTalk Dr Tyson stated how he can reach a far wider audience than just dedicated science fans. For this alone Neil deGrasse Tyson is a worthy successor to the much-missed Sagan. Let's hope some of those happy fans will be inspired to not just dream, but actively promote the cosmic perspective our species sorely needs if we are to climb out of our current doldrums.

Monday, 10 July 2017

Genius: portraying Albert Einstein as a human being, not a Hollywood stereotype

I recently watched the National Geographic docudrama series Genius, presenting a warts-and-all look at the life and work of Albert Einstein. In these post-truth times in which even a modicum of intellectual thought is often regarded with disdain, it's interesting to see how a scientific icon is portrayed in a high-budget, high-profile series.

A few notable exceptions aside, Dr Frankenstein figures still inform much of Hollywood's depiction of STEM practitioners. Inventors are frequently compartmentalised as either patriotic or megalomaniac, often with a love of military hardware; Jurassic Park's misguided and naive John Hammond seems a rare exception. As for mathematicians, they are often depicted with more than a touch of insanity, as in Pi or Fermat's Room.

So does Genius break the mould or follow the public perception of scientists as freaky, geeky, nerdy or plain evil? The script is a fairly sophisticated adaptation of real-life events, although the science exposition suffers as a result. Despite some computer graphic sequences interwoven with the live action, the attempts to explore Einstein's thought experiments and theories are suggestive rather than comprehensive - the tip of the iceberg when it comes to his scientific legacy. Where the series succeeds is in describing the interaction of all four STEM disciplines - science, technology, engineering and mathematics - and the benefits when they overlap. The appalling attitudes prevalent in the academia of his younger years are also brought to vivid life, with such nonsense as the prohibition on questioning tutors piled onto the usual misogyny and xenophobia.

Albert Einstein

Contrary to the popular conception of the lone genius - and counter to the series' title - the role of Einstein's friends such as Marcel Grossmann and Michele Besso as his sounding boards and mathematical assistants is given a high profile. In addition, the creative aspect of science is brought to the fore in sequences that show how Einstein gained inspiration towards his special and general theories of relativity.

The moral dimension of scientific research is given prominence, from Fritz Haber's development of poison gas to Leo Szilard persuading Einstein first to encourage and later to discourage the development of atomic weapons. As much as the scientific enterprise might appear separate from the rest of human concern, it is deeply interwoven with society: the term 'laboratory conditions' describes certain experimental procedures; it does not wall science off from everything else. Scientists in Genius are shown to have the same human foibles as everyone else, from Einstein's serial adultery (admittedly veering towards Hollywood family drama at times, paternal guilt complex et al) to Philipp Lenard's dismissal of Einstein's theories on the grounds of anti-Semitism rather than any scientific evidence. So much for scientific impartiality!

The last few episodes offer a poignant description of how even the greatest scientific minds lose impetus, passing from creative originality as young rebels to conservative, stick-in-the-mud middle age, out of touch with the cutting edge. General-readership books on physics often claim that theoretical physicists do their best work before they are thirty, a common example being the quip that Einstein might as well have spent his last twenty years fishing. Although not as detailed as the portrayal of his early, formative years, Einstein's obsessive (but failed) quest to find fault with quantum mechanics is a good depiction of how even the finest minds can falter.

All in all, the first series of Genius is a noble attempt to describe the inspiration and background that led to some revolutionary scientific theories. The irony is that by concentrating on Einstein as a human being it might help the wider public gain a better appreciation, if not a comprehensive understanding, of the work of scientists and the role of STEM in society. That is surely no bad thing, especially if it makes Hollywood rethink the lazy stereotype of the crazy-haired scientist seeking world domination - or even encourages people to listen to trained experts rather than the rants of politicians and religious nutbars. Surely that's not a difficult choice?

Monday, 26 June 2017

The power of pond scum: are microalgae biofuels a realistic proposition?

I've previously discussed some very humble organisms, but they don't get much humbler than microalgae, photosynthetic organisms that generate about half our planet's atmospheric oxygen. Imagine, then, what potential there might be for their exploitation in a world of genetic manipulation and small-scale engineering. The total number of algal species is unknown, but estimates run to the hundreds of thousands. To this end, private companies and government projects around the world have spent the past few decades - and a not inconsiderable amount of funding - trying to generate a replacement for fossil fuels based on these tiny plants.

For anyone with even a microgram's worth of common sense, developing eco-friendly substitutes for oil, coal and gas is a consummation devoutly to be wished, but behind the hype surrounding microalgae-derived fuel lies a wealth of opposing opinions and potentially some shady goings-on. Whilst other projects, such as creating ethanol from food crops, are continuing, the great hope - and hype - that surrounded algae-based solutions appears to be grinding to a halt.

Various companies were forecasting that 2012 would be the year the technology achieved commercial viability, but this now appears to have been rather over-eager. It is therefore worth exploring what happens when hope, high-value commerce and cutting-edge technology meet. There are some big names involved in the research, too: ExxonMobil, Shell and BP each pumped tens to hundreds of millions of dollars into microalgae fuel projects, only to make substantial funding cuts or shut them down altogether since 2011.
Microalgae-derived biofuel
Manufacturing giants such as General Electric and Boeing have been involved in research for new marine and aircraft fuels, whilst the US Navy undertook tests in 2012 whereby algae-derived fuel was included in a 50:50 blend with conventional fossil fuel for ships and naval aircraft. Even shipping companies have become interested, with one boffin-worthy idea being for large cruise ships to grow and process their own fuel on-board. Carriers including United Airlines, Qantas, KLM and Air New Zealand have invested in these kerosene-replacement technologies, with the first two of these airlines having trialled fuel blends including 40% algae derivative. So what has gone wrong?

The issue appears to be one of scale: after initial success with laboratory-sized testing, the expansion to commercial production has encountered a range of obstacles that will most likely delay widespread implementation for at least another quarter century.

The main problems are these:
  1. The algae growing tanks need to be sited on millions of acres of flat land, and there are arguments that there simply isn't enough suitable land in convenient locations.
  2. The growing process requires lots of water, which means large transportation costs to get the water to the production sites. Although waste water is usable, some estimates suggest there is not enough of this - even in the USA - for optimal production.
  3. Nitrogen and phosphorus are required as fertiliser, further reducing commercial viability. Some estimates suggest half the USA's annual phosphorus supply would need to be requisitioned for this one sector!
  4. Contamination by protozoans and fungi can rapidly destroy a growing pond's entire culture.
In 2012 the US National Academy of Sciences appeared to confirm these unfortunate issues. Reporting on the Department of Energy's goal of replacing 5% of the nation's vehicle fossil fuel consumption with algae-derived biofuel, the Academy stated that production at this scale would make unfeasibly large demands on water and nutrients, as well as requiring heavy commitments from other energy sources.
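
To get a feel for the scale problem, here is a minimal back-of-the-envelope sketch in Python. The input figures are deliberately rough, round-number assumptions of my own (total US vehicle fuel use and per-hectare algal yields vary widely between studies); they are illustrative, not values from the Academy's report.

```python
# Rough scale estimate for the 5% algae-biofuel goal.
# All inputs are illustrative round numbers, not published figures.
US_VEHICLE_FUEL_LITRES_PER_YEAR = 500e9  # assumed annual US vehicle fuel use
TARGET_FRACTION = 0.05                   # DOE goal: replace 5% with algal fuel
YIELD_LITRES_PER_HECTARE_YEAR = 30_000   # optimistic open-pond yield estimate

litres_needed = US_VEHICLE_FUEL_LITRES_PER_YEAR * TARGET_FRACTION
hectares_needed = litres_needed / YIELD_LITRES_PER_HECTARE_YEAR

print(f"Fuel to replace: {litres_needed / 1e9:.0f} billion litres per year")
print(f"Pond area needed: {hectares_needed / 1e6:.2f} million hectares "
      f"(about {hectares_needed * 2.47 / 1e6:.1f} million acres)")
```

Even with an optimistic yield the answer comes out at around two million acres of flat, well-watered land; halve the yield assumption and it doubles, which is why the 'millions of acres' objection keeps recurring.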

In a bid to remain solvent, some independent research companies appear to have downplayed such issues for as long as possible, finally diversifying when it appeared their funding was about to be curtailed or cut off. As with nuclear fusion research, commercial production of microalgae fuel holds much promise, but those holding the purse strings aren't as patient as the researchers.

There may be a hint of a silver lining to all this, even if wide-scale operations are postponed for many decades. The microalgae genus Chlorella - subject of a Scottish biofuel study - is proving to be a practical source of dietary supplements, from vitamins and minerals to omega-3 fatty acids. It lacks only vitamin B12, yet is an astonishing 50-60% protein by weight. As well as for human consumption, both livestock and aquaculture feed supplements can be derived from microalgae, although as usual there is a wealth of pseudoscientific nonsense in the marketing, such as the notion that it has an almost magical detox capability. Incidentally, Spirulina - the tablets and powder sold in health food outlets for making green-gloop smoothies - is not a microalga but a B12-rich cyanobacterium, colloquially (and confusingly) known as blue-green algae. Glad we've cleared that one up!

If anything, the research into microalgae-derived biofuels is a good example of how uneasily new technology and commercial enterprise co-exist; each needs the other, but reaching a workable compromise is perhaps just as tricky as the research itself. As for government-funded projects towards a better future for all, I'll leave you to decide where the interests of our current leaders lie...

Saturday, 10 June 2017

Owning the aliens: who should support endangered species thriving outside their home territories?

On holiday in Fiji last year I was surprised to learn that the most commonly-seen animals - with the exception of flying foxes - were recent introductions from other countries, primarily India. Examples include the red-vented bulbul, mynah bird, house gecko, and mongoose, all of which have brought their own problems to either native wildlife or Fijian agriculture.

From Hawaii to New Zealand, the deliberate or accidental introduction of non-native animals, plants and fungi has had profoundly negative effects on these previously isolated ecosystems. So what happens if an introduced organism, especially one that has a deleterious effect on wildlife, thrives in its transplanted habitat whilst becoming endangered across its original range? Two questions spring to mind: should the adopted homeland be able to exterminate the alien invader with impunity; and/or should the country of origin fund work in the invaded nation during a 'lifeboat' phase, until the home turf is suitable for restocking?

Almost inevitably, the countries with the highest numbers of at-risk species tend to be the poorer ones, Australia and the United States excepted. Reports over the past four years list a variety of nations in this sorry state of affairs, but across different conservation groups those within the top ten for endangered animal species include Indonesia, Malaysia, Ecuador, Mexico, India and Brazil. In some of these there is little political willpower - or indeed funding - to support anything deemed non-critical, with biodiversity seen as merely a nice-to-have.

For small nations such as Fiji there is little in the way of an environmental lobby. NatureFiji-MareqetiViti is an organisation that attempts to safeguard such threatened animals as the Fijian crested iguana whilst enhancing regional biosecurity, but with grants - including from the European Union - rarely exceeding a few tens or hundreds of thousands of Fijian dollars, it is woefully underfunded.

Which brings us to New Zealand, with its collection of endangered birds, lizards, freshwater fish and the Maui dolphin. In addition to Department of Conservation (DoC) budget cuts over the past decade - claimed by some organisations to amount to a 21% decline in real terms - the nation is home to several Australian animals that are nationally vulnerable in their homeland across the Tasman Sea.

The green and golden bell frog (Litoria aurea) is a prime example: a rapidly shrinking Australian range has earned it 'globally vulnerable' status, yet it is common enough in the northern part of New Zealand's North Island. I found this specimen at Auckland's Botanic Gardens earlier this year.


Therefore should the Australian Government fund a captive breeding programme - or simply a round-up - of individuals in New Zealand? After all, the latter has its own four native frog species, all rare and/or endangered, for its herpetologists to concentrate on.

There is a precedent for this. In 2003, three Australian trappers captured rare brush-tailed rock-wallabies on New Zealand's Kawau Island, where the marsupial's 'noxious' pest status meant they were about to be targeted for eradication. The project included support from DoC but presumably - it's difficult to ascertain - the funding came from Australia.

Of course, Australia may be able to afford restocking programmes abroad, but few other nations are in the same position. Although the World Wide Fund for Nature (World Wildlife Fund in North America) is the largest conservation organisation in the world and has a comparatively large budget, even it cannot afford to support every repatriation or gene-pool nursery scheme. Meanwhile, local charities such as NatureFiji-MareqetiViti tend to rely on volunteers rather than trained professionals and lack the scope or capability for logistically complex international undertakings.

With the USA becoming increasingly insular and Europe consumed with its own woes, the potential funding sources for these interim lifeboats are rapidly drying up. There are a few eco-angels, such as Norway, with its US$1 billion donation to Brazil intended to curtail Amazonian rainforest destruction, but they are few and far between. It's one thing to support in-situ environmental work, quite another to raise funds to save selected endangered species thriving away from their native ecosystems.

It appears that there is no single solution, meaning that except for a few lucky 'poster' cases, many at-risk species may well fail to gain attention and be allowed to die out (or even be exterminated as foreign pests). The original home territory might no longer contain a suitable environment for them to thrive in, whilst the foster nation lacks the impetus or funding to look after those pesky alien invaders. It seems there are difficult times ahead!

Tuesday, 23 May 2017

Water, water, everywhere: the hidden holism of H2O

As in other northern regions of New Zealand, the summer of 2017 saw Auckland residents facing City Council requests to conserve water, as well as a hosepipe ban during March and April. It therefore seems ironic that the water shortage coincided with flooding in the west of the city: thanks to a tropical downpour - one of several so far this year - the equivalent of an entire month's rain fell in a day or two. Clearly, water shortages are going to become an ever-increasing issue, even in nations with well-developed infrastructure.

The British historian of science James Burke, writer-presenter of The Day the Universe Changed, also made three television series - Connections 1, 2 and 3 (1978, 1994 and 1997 respectively) - which examined the historical threads linking scientific and technological advances with changes in other areas of society. I'd like to take a similarly holistic approach to the wonderful world of water consumption and see how it ties into the wider world.

Although the statistics vary - such things are difficult to assess with any great precision - published figures suggest that the populace of richer nations use up to 5,000 litres of water each per day, most of it hidden in food production. Many websites now detail the amount of water used to grow particular crops and foodstuffs, so you can easily raise your guilt level simply by comparing your diet to the water involved in its production - and that's without considering the carbon mileage or packaging waste!

I've previously discussed the high environmental cost of cattle farming, with both dairy and beef herds being prominent culprits in water pollution as well as consumption. However, there are plenty of less obvious foodstuffs proven to be notorious water consumers, avocados and almonds for example. Although the latter might be deemed a luxury food, much of the global supply is now used to make almond milk; with consumption increasing by up to 40% year-on-year, it is one foodstuff much in demand.

Even though it is claimed to require far less water than the equivalent volume of dairy produce, almond farming is still relevant because of the massive increase in bulk production, especially in California (home to 80% of the global almond harvest). The popularity of almond milk probably has two causes: first, the public is becoming more health-conscious; and second, reducing or abstaining from dairy produce is presumed to lessen food allergies and intolerances. These link to prominent concerns in the West: high-calorie/low-exercise diets leading to mass obesity, and over-use of cleaning chemicals in the home preventing children from developing robust immune systems. Clearly, there is a complex web when it comes to water and the human race.

Even in regions chronically short of water, such as California, more than three-quarters of fresh water usage is agricultural. In order to conserve resources, might we soon face greater taxes on commercially grown water-hogging produce, and bans on the home growing of crops with a low nutrition-to-water-consumption ratio? I've recently read several books discussing probable issues over the next half century, with the humble lettuce appearing as a good example of the latter.

Talking of which, the wet and windy conditions in New Zealand over the past year - blamed at least partially on La Niña - have led to record prices for common vegetables: NZ$9 for a lettuce and NZ$10 for a cauliflower, even in major supermarket chains. British supermarkets were forced to ration some fruit and vegetables back in February, after their Mediterranean growers suffered storms and floods. This suggests that even for regions with sophisticated agricultural practices there is a fine line between too much and too little fresh water. Isn't it about time that the main food producers developed a more robust, not to mention future-proof, infrastructure, considering the increased impact that climate change is likely to have?

The world is also paying a heavy price for bottled water, a commercial enterprise that largely defies common sense. In the USA it costs several thousand times as much as the equivalent volume of tap water, and there are reports of chemical leaching from reused plastic bottles. As you might expect, there is also an extremely high environmental cost: the fossil fuels used by bottling plants and transportation; the lowering of the water table (whose level is so critical in areas using less sophisticated farming technologies); and the impact of plastic waste - the USA recycles only about 23% of its plastic water bottles, resulting in 38 billion bottles dumped each year at a cost of around US$1 billion. All in all, bottled water in nations with highly developed infrastructure seems like an insane use of critical resources.
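
As a quick sanity check, the quoted figures can be combined with a few lines of Python; everything below is simple arithmetic on the numbers in the paragraph above, so the outputs are only as reliable as those inputs.

```python
# What the quoted US bottled-water figures imply.
recycle_rate = 0.23    # quoted recycling rate
bottles_dumped = 38e9  # quoted bottles landfilled per year
dump_cost_usd = 1e9    # quoted annual disposal cost

bottles_used = bottles_dumped / (1 - recycle_rate)  # implied total consumption
cost_per_bottle = dump_cost_usd / bottles_dumped

print(f"Implied bottles consumed per year: {bottles_used / 1e9:.0f} billion")
print(f"Implied disposal cost per bottle: {cost_per_bottle * 100:.1f} US cents")
```

That works out at roughly 49 billion bottles consumed annually and around 2.6 cents of disposal cost per bottle, so the quoted figures do at least hang together.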

Although accelerated population growth has become a widespread fear, there are indications that later this century the global figure may peak at around nine billion and then level off. Increasing urbanisation is seen as a primary cause, and not just in developing nations; Auckland, for example (New Zealand's largest city by far), experienced 8% population growth in the seven years from 2006. A larger population obviously requires more food, but a more urban - and therefore generally better educated, higher-income - populace tends to demand access to processed, non-local and above all water-intensive foods. China is the touchstone here, having seen a massive increase in fish and meat consumption over the past half century; the latter has risen from 8 million tonnes per year in 1978 to over 70 million tonnes in recent years.
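
Those two meat-consumption figures imply a striking sustained growth rate. The sketch below works it out; note that the end year is my own assumption, since the text says only 'recent years'.

```python
# Implied compound annual growth of China's meat consumption.
start_tonnes, end_tonnes = 8e6, 70e6  # quoted figures: 1978 vs 'recent years'
years = 2017 - 1978                   # assumed end year

cagr = (end_tonnes / start_tonnes) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year, sustained for {years} years")
```

That comes to roughly 5-6% compound growth every year for four decades, which gives some idea of the water demand hidden behind changing diets.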

It has been claimed that 70% of the industrial waste generated in developing nations is dumped into watercourses, meaning there will be a massive environmental clean-up cost before local sources can be fully utilised. The mass outbreak of E. coli in Hawke's Bay, New Zealand, in February this year shows that even developed nations have difficulty maintaining water quality, whilst there has been a shocking admission of lead contamination above safe levels in 41 American states over the past three years. Does this mean that bottled water - heretofore the lifeline of Western tourists abroad - is becoming a necessity in the West after all?

Some might argue that thanks to global warming there will be more water available, due to the melting of the polar caps and glaciers, which after all contain almost two-thirds of the world's fresh water. However, these sources are mostly located far from high-density populations, and once contaminated by seawater they require energy-hungry desalination technology. It's small comfort that current estimates suggest that by 2025 about 14% of the global population will rely on desalination plants for their fresh water needs.

In the West we tend to take clean, safe water completely for granted, but thanks to the demands of a society run on rampant consumerism - coupled with poor science education - everyday decisions are being made that damage the environment, waste critical resources and harm our own health. Pundits predict that water will be the new oil: liquid gold, a precious commodity to be fought over if necessary. Surely this is one resource all of us can do something to conserve, whether by cutting down on water-intensive foodstuffs, using tap rather than bottled water, or simply turning off the tap sooner than usual!

Monday, 8 May 2017

Weather with you: meteorology and the public perception of climate change

If there's one thing that appears to unite New Zealanders with the British, it is the love of discussing the weather. This year has been no exception, with New Zealand's pre-summer forecasts - predicting average temperatures and rainfall - proving wildly inaccurate. La Niña has been blamed for what Wellingtonians have deemed a 'bummer summer', January having provided the capital with its fewest 'beach days' of any summer in the last thirty years. Sunshine hours, temperature, rainfall and wind-speed data from the MetService support this as a nationwide trend; even New Zealand's flora and fauna have been affected, with late blossoming and reduced breeding respectively.

However, people tend to have short memories and often recall childhood weather as somehow superior to that of later life. Our rose-tinted spectacles make us remember long, hot summer school holidays and epic snowball fights in winter, but is this a case of remembering the hits and forgetting the misses (meteorologically speaking)? After all, few things are more boring than a comment that the weather is the same as it was for the previous ten comments; and surely our memories of exciting outdoor ventures are more prominent than those of being forced to stay indoors by inclement conditions?

Could our fascination with weather, but dubious understanding - or even denial - of climate change, be due to our requiring personal or even emotional involvement in a meteorological event? Most of us have been lucky enough not to experience extreme weather (or 'weather bombs', as the media now term them), so unless you have been on the receiving end of hurricanes or flash floods, the weather is simply another aspect of our lives, discussed in everyday terms and rarely examined in detail.

Since we feel affected by weather events that directly impact us (down to the level of 'it rained nearly every day on holiday, but the locals said it had been dry for two months prior'), we have a far greater emotional response to weather than to climate. The latter appears amorphous, almost mythical, by comparison. Is this one of the reasons that climate change sceptics achieve such success when their arguments are so poorly supported?

Now that we are bombarded with countless pieces of trivia, distracting us from serious analysis in favour of short snippets of multimedia edutainment, how can we understand climate change and its relationship to weather? The standard explanation is that weather is short term (covering hours, days or at most weeks) whilst climate compares annual or seasonal variations over far longer timeframes. In Cosmos: A Spacetime Odyssey, Neil deGrasse Tyson made the great analogy that weather is like the zigzag path of a dog on a leash, whereas its owner walks in a straight line from A to B. So far so good, although even the widely used convention of averaging over thirty years is ultimately an arbitrary choice of duration for assessing climate variability.

This leads us to statistics. Everyone thinks they understand the word 'average', but an average can be a mean, a median or a mode. Since the start and end dates of the period can be varied, as can the scaling of infographics (a logarithmic axis, for example), a single set of statistics can be presented in a wide variety of ways.

The laws of probability rear their much-misinterpreted head too. The likelihood of variation can change wildly depending on the length of the timeframe: compare a five-year block with a century and you can see that climate statistics is a tricky business; what is highly improbable over the former period may be inevitable over the latter. As long as you are allowed to choose the timeframe, you can skew the data to support a favoured hypothesis. So much for objective data!
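
The point is easy to demonstrate. The Python sketch below (using numpy, with invented numbers purely for illustration) generates a century of noisy 'temperatures' around a slow warming trend, then fits straight lines over every five-year window and over the full hundred years. The five-year slopes swing wildly in both directions while the century-long fit recovers the true trend:

```python
import numpy as np

rng = np.random.default_rng(42)

years = np.arange(1900, 2000)
true_trend = 0.01                       # assumed warming: 0.01 deg C per year
noise = rng.normal(0, 0.3, len(years))  # year-to-year 'weather' variability
temps = true_trend * (years - years[0]) + noise

# Fit a straight line to every five-year window...
five_year_slopes = [
    np.polyfit(years[i:i + 5], temps[i:i + 5], 1)[0]
    for i in range(len(years) - 5)
]
# ...and to the full century.
century_slope = np.polyfit(years, temps, 1)[0]

print(f"True trend:           {true_trend:+.3f} deg C/yr")
print(f"Century fit:          {century_slope:+.3f} deg C/yr")
print(f"Five-year fits range: {min(five_year_slopes):+.3f} "
      f"to {max(five_year_slopes):+.3f} deg C/yr")
```

Pick the right five-year window and you can 'prove' dramatic cooling inside a genuinely warming century - exactly the cherry-picking trick described above.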

By comparison, if someone experiences a worse-than-expected summer, as New Zealand did in 2017, that personal experience may well be taken as more telling than all the charts of long-term climate trends. It might be just the blink of an eye in geological terms, but being there takes precedence over far less emotive science and mathematics.

Perhaps, then, we subconsciously define weather as something we experience directly, whilst climate is a more abstract notion - a series of weather events codified into some sort of order. How else can climate change deniers, when faced with photographs proving glacial or polar-cap shrinkage, offer alternative explanations to global warming?

This is where politics enters the mix. Whereas weather has little obvious political involvement, climate has become heavily politicised over the past thirty years, with party lines in some nations (mentioning no names) clearly divided. Although some naysayers have begun to admit that global warming appears to be happening - or at least that the polar caps and glaciers are melting - they stick to the notions that (a) it will be too slow to affect humans, since after all there have been far greater temperature swings in both directions in previous epochs, and (b) it has natural causes. The latter implies there is little we can do to mitigate it (solar output may be involved, not just Earth-bound causes), so we might as well stick our heads in the sand, ostrich-style.

As an aside, I've just finished reading a 1988 book called Prehistoric New Zealand. Its three authors are a palaeontologist (Graeme Stevens), an archaeologist (Beverley McCulloch) and an environmental researcher (Matt McGlone), so the content covers a wide range of topics, including the nation's geology, climate, wildlife and human impact. Interestingly, the book states that, if anything, the climate appears to be cooling and the Earth is probably heading for the next glaciation!

Unfortunately no data is supplied to support this, but Matt McGlone has since confirmed that there is a wealth of data supporting the opposite conclusion. In 2008 the conservative American Heartland Institute published a list of 500 scientists it claimed supported the notion that current climate change has solely natural causes. McGlone was one of many scientists who asked for their names to be removed from the list, stating that neither his work nor his opinions were in agreement with the idea.

So are there any solutions, or is it simply the case that we believe what we personally experience but have a hard time coming to terms with less direct, wider-scale events? Surely there are enough talented science communicators and teachers to convince the public of the basic facts - or are people so embedded in the now that even one unseasonal rainy day can convince them, as it did a random man I met on the street, that climate change is a myth?

Saturday, 22 April 2017

Which way's up? Mental mapping and conditioning by familiarity

I recently watched a television documentary on Irish prehistory which noted that if you turn a conventional map of the British Isles ninety degrees anti-clockwise, Ireland appears to be an integral part of Europe's maritime trade routes rather than stuck out on the edge of the known world. Be that as it may, it's interesting how easily we accept conventions without analysis. Just because something is a convention doesn't necessarily mean it is superior, only that it has become so commonplace it is usually taken for granted. It's not the logical approach, but then we're not Vulcans!

Take maps of the world. Map projections have usually arisen in response to practical needs or the contingencies of history. Most global maps today use the Mercator projection which, whilst useful for maritime navigation in the era before GPS, increasingly distorts areas as they approach the poles. This shouldn't be surprising: after all, we're taking a near-spherical object, transposing it onto the surface of a cylinder, and then unrolling that onto a two-dimensional plane.

In fact there are dozens of different map projections, but none is good for all regions and purposes. That doesn't make the Mercator projection ideal; far from it, since heavily populated regions such as Africa appear too small whilst barely populated areas such as Greenland and Antarctica appear far too large. However, it is popular because it is familiar because it is popular... and so on. Like the QWERTY keyboard, it may no longer be required for the purpose it originally served but is now far too entrenched to be replaced without a great deal of hassle.
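
The distortion is easy to quantify: on a Mercator map a point at latitude φ is plotted at y = ln(tan(π/4 + φ/2)), the local scale factor is sec(φ), and areas are therefore inflated by roughly sec²(φ). The short Python sketch below is my own illustration of this, not taken from any mapping library:

```python
import math

def mercator_y(lat_deg):
    """Vertical Mercator map coordinate for a given latitude."""
    lat = math.radians(lat_deg)
    return math.log(math.tan(math.pi / 4 + lat / 2))

def area_inflation(lat_deg):
    """Approximate factor by which Mercator inflates areas at this latitude."""
    return 1 / math.cos(math.radians(lat_deg)) ** 2

for place, lat in [("Equator", 0.0), ("Cairo", 30.0),
                   ("London", 51.5), ("Central Greenland", 72.0)]:
    print(f"{place:17s} lat {lat:4.1f}  y = {mercator_y(lat):+.2f}  "
          f"areas x{area_inflation(lat):4.1f}")
```

At Greenland's latitudes each square kilometre takes up roughly ten times the map area it would at the equator, which is why the island looks comparable to Africa despite having around one-fourteenth of its actual area.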

Aside from projection, there's also the little matter of direction. There are novelty maps with the South Pole at the top, most commonly created by Australians, but since 88% of the human race currently lives in the Northern hemisphere (which holds 68% of the total landmass), it's hardly surprising that the North Pole is conventionally top-most.

However, this hasn't always been the case: before there was worldwide communication, the ancient Egyptians deemed 'upper' to be towards the equator and 'lower' away from it. Early medieval Arab scholars followed suit, whilst the mappae mundi of medieval Christian Europe placed east at the top of a topography centred on Jerusalem.

Photographs of the Earth that show a recognisable landmass usually present north uppermost too; there is no such thing as a 'right' way up for our solar system, but the origin of the first great civilisations has set the geographic orientation for our global society.

None of this might seem particularly important, but the ready acceptance of familiar conventions can easily lead to a lack of critical thinking. For example, in the Nineteenth and early Twentieth centuries Great Britain exported pre-fabricated buildings to Australia and New Zealand, but because some architects failed to recognise that in the Southern hemisphere the midday sun is due north, there are examples with the main windows on the south-facing wall. Even the fact that most humans live in the Northern hemisphere has led to the incorrect assumption that - thanks to their summer - the Earth is closer to the Sun in June than in December. There is such a thing as hemisphere parochialism after all!

If we can learn anything from this, it is that by accepting popular conventions without considering their history or relevance, we switch off critical faculties that might otherwise generate replacement ideas more suited to the present. Unfortunately, we frequently prefer familiarity to efficiency, so even when tried and trusted conventions no longer suit changed circumstances, we stubbornly cling to them. Thus we stifle improvements as a trade-off for our comfort. I guess that's what they call human nature...

Saturday, 1 April 2017

The moons of Saturn and echoes of a synthetic universe

As fans of Star Wars might be aware, George Lucas is nothing if not visually astute. His thumbnail sketches for the X-wing, TIE fighter and Death Star created the essence behind these innovative designs. So isn't it strange that there is a real moon in our solar system that bears an astonishing resemblance to one of Lucas's creations?

At the last count Saturn had 53 confirmed moons, with another nine provisionally verified - and as such assigned numbers rather than names. One of the ringed planet's natural satellites is Mimas, discovered in 1789 and, at 396 kilometres in diameter, about as small as an object can be whilst still conforming to an approximate sphere. The distinguishing characteristic of Mimas is a giant impact crater some 130 kilometres in diameter, named Herschel after the moon's discoverer, William Herschel. For anyone who has seen Star Wars (surely most of the planet by now), the crater gives Mimas an uncanny resemblance to the Death Star. Yet Lucas's original sketch for the battle station was drawn in 1975, five years before Voyager 1 took the first photograph with high enough resolution to show the crater.


Okay, so one close resemblance between art and nature could be mere coincidence. But amongst Saturn's retinue of moons is another with an even more bizarre feature. At 1,469 kilometres in diameter, Iapetus is the eleventh-largest moon in the solar system. After its discovery by Giovanni Cassini in 1671, it quickly became apparent that there was something extremely odd about the moon: one hemisphere is much brighter than the other.

As such, it attracted the attention of Arthur C. Clarke, whose novel 2001: A Space Odyssey described Japetus (as he called it) as the home of the Star Gate, an artificial wormhole across interstellar space. He explained the brightness differentiation as an eye-shaped landscape created by the alien engineers of the Star Gate: an enormous pale oval with a black dot at its centre. Again, Voyager 1 was the first spacecraft to photograph Iapetus close up... revealing just such a feature! Bear in mind that this was 1980, whereas the novel was written between 1965 and 1968. Carl Sagan, who worked on the Voyager project, sent Clarke a photograph of Iapetus with the comment "Thinking of you..." Clearly, he had made the connection between reality and fiction.

As Sagan himself was apt to say, extraordinary claims require extraordinary evidence. Whilst a sample of two wouldn't make for a scientifically convincing result in most disciplines, there is definitely something strange about two Saturnian moons closely resembling elements of famous science fiction stories written before the diagnostic observations were made. Could there be something more fundamental going on here?

One hypothesis that has risen in popularity despite lacking any hard physical evidence is that of the simulated universe. Nick Bostrom, the director of the University of Oxford's Future of Humanity Institute, has spent over a decade promoting the idea. Instead of experimental proof, Bostrom uses probability theory to support his suppositions. At its simplest level, he notes that the astonishing increase in computing power over the past half century implies an ability in the near future to create detailed recreations of reality within a digital environment; basically, it's The Matrix for real (or should that be, for virtual?)

It might sound like the silliest science fiction - no-one is likely to be fooled by current computer game graphics or VR environments - but with quantum computing on the horizon we may soon have processing capabilities far beyond those of today's most powerful supercomputers. Since the ability to create just one simulated universe implies the ability to create limitless - even nested - versions of a base reality, each with potentially tweaked physical or biological laws for experimental purposes, the number of virtual realities must far outweigh the original model.

As for the probability of this being true of our universe, the key percentage varies widely from pundit to pundit. Astronomer and presenter Neil deGrasse Tyson has publicly admitted that he considers it an even-odds likelihood, whilst SpaceX and Tesla entrepreneur Elon Musk is prepared to go much further, having stated that there is only a one-in-a-billion chance that ours is the genuine physical universe!

Of course, anyone can assert a probability for a hypothesis without providing supporting evidence, but then what differentiates such an unsubstantiated claim from a religious belief? To this end, a team of researchers at the University of Bonn published a 2012 paper called 'Constraints on the Universe as a Numerical Simulation', defining possible methods to verify whether our universe is real or virtual. Technical terms such as 'unimproved Wilson fermion discretization' make it somewhat difficult for anyone who isn't a subatomic physicist to get to grips with the argument (you can insert a smiley here), but the essence of the work involves cosmic rays. The paper states that in a virtual universe these are more likely to travel along the axes of a fundamental, multi-dimensional grid rather than arrive in equal numbers from all directions; in addition, they should exhibit energy restrictions at something called the Greisen-Zatsepin-Kuzmin cut-off (probably time for another smiley). Anyhow, the technology apparently exists for the relevant tests to be undertaken, assuming the funding can be obtained.
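
The underlying statistical idea, stripped of all the lattice physics, is simple enough to sketch. The Python toy below is entirely my own illustration - nothing like the paper's actual analysis - and simulates arrival directions that are either isotropic or slightly biased towards the axes of a hypothetical grid, then uses a crude summary statistic to tell the two skies apart:

```python
import numpy as np

rng = np.random.default_rng(0)

def isotropic_directions(n):
    """Unit vectors drawn uniformly over the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def axis_biased_directions(n, bias=0.1):
    """Mostly isotropic, but a fraction arrive exactly along a grid axis."""
    v = isotropic_directions(n)
    aligned = rng.random(n) < bias
    axes = np.eye(3)[rng.integers(0, 3, n)] * rng.choice([-1.0, 1.0], n)[:, None]
    v[aligned] = axes[aligned]
    return v

def axis_alignment(dirs):
    """Mean of the largest |component|: approaches 1.0 if rays hug the axes."""
    return np.abs(dirs).max(axis=1).mean()

n = 100_000
print(f"Isotropic sky:   {axis_alignment(isotropic_directions(n)):.4f}")
print(f"10% axis-biased: {axis_alignment(axis_biased_directions(n)):.4f}")
```

A real analysis would involve far subtler signatures (and the GZK energy cut-off), but the principle is the same: compare the directional statistics you observe with those an isotropic universe should produce.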

So could our entire lives simply be part of a Twenty-Second Century schoolchild's experiment or a museum exhibit, where visitors can plug in, Matrix-style, to observe the stupidities of their ancestors? Perhaps historians of the future will be able to run such simulations as an aid to their papers on why the hell, for example, the United Kingdom opted out of the European Union and the USA elected Donald Trump?

Now there's food for thought.