Tuesday, 23 April 2019

Lift to the stars: sci-fi hype and the space elevator

As an avid science-fiction reader during my childhood, I found the space elevator to be one of the most striking extrapolations of future technology. As popularised in Arthur C. Clarke's 1979 novel The Fountains of Paradise, the elevator was described as a twenty-second-century project. I've previously written about near-future plans for private-sector spaceflight, but the elevator would be a paradigm shift in space transportation: a way of potentially reaching as far as geosynchronous orbit without the need for rocket engines.

Despite the novelty of the idea - a tower stretching from Earth's surface (or indeed any planet's) to geosynchronous orbit and beyond - the first description dates back to 1895 and the writings of the Russian theoretical astronautics pioneer Konstantin Tsiolkovsky. Since the dawn of the Space Age, engineers and designers in various nations have either reinvented the elevator from scratch or elaborated on Tsiolkovsky's idea.

There have of course been remarkable technological developments over the intervening period, with carbyne, carbon nanotubes, tubular carbon 60 and graphene seen as potential materials for the elevator, but we are still a long way from being able to build a full-size structure. Indeed, there are now known to be many more impediments to the space elevator than first thought, including a man-made issue that didn't exist at the end of the nineteenth century. Despite this, there seems to be a remarkable number of recent stories about elevator-related experiments and the near-future feasibility of such a project.

An objective look at practical - as opposed to theoretical - studies shows that results to date have been decidedly underwhelming. The Space Shuttle programme started tethered satellite tests in 1992. After an initial failure (the first test achieved a distance of a mere 256 metres), a follow-up four years later deployed a tether that was a rather more impressive twenty kilometres long. Then last year the Japanese STARS-Me experiment tested a miniature climber component in orbit, albeit over a minuscule distance of nine metres. Bearing in mind that a real tower would be over 35,000 kilometres long, it cannot be argued that the technology is almost available for a full-scale elevator.
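To put those experiments in context, here is a rough back-of-the-envelope calculation (my own working, not taken from any of the studies above) of why the structure has to be quite so long. The cable must reach at least to geostationary altitude, where the orbital period equals Earth's sidereal day of about 86,164 seconds:

r = (G M⊕ T² / 4π²)^(1/3) ≈ (3.99 × 10¹⁴ m³ s⁻² × (86,164 s)² / 4π²)^(1/3) ≈ 4.22 × 10⁷ m, i.e. about 42,200 km from Earth's centre

Subtracting Earth's equatorial radius of roughly 6,400 km leaves an altitude of about 35,800 km, and practical designs extend well beyond that to a counterweight. The twenty-kilometre and nine-metre experiments above therefore fall short of the goal by roughly three and six orders of magnitude respectively.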

This hasn't prevented continuous research by the International Space Elevator Consortium (ISEC), which was formed in 2008 to promote the concept and the technology behind it. It's only to be expected that fans of the space elevator would be enthusiastic, but to my mind their assessment that we are 'tech ready' for its development seems optimistic to the point of straining credulity.

A contrasting view is that of Google X's researchers, who mothballed their space elevator work in 2014 on the grounds that the requisite technology will not be available for decades to come. While the theoretical strength of carbon nanotubes meets the requirements, the total length of cable manufactured to date is seventy centimetres, which shows the difficulty of achieving mass production. A key sticking point apparently involves catalyst activity probability; until that problem is resolved, a cable less than one metre long isn't going to convince me, at least.

What is surprising then is that in 2014, the Japanese Obayashi Corporation published a detailed concept that specified a twenty-year construction period starting in 2030. Not to be outdone, the China Academy of Launch Vehicle Technology released news in 2017 of a plan to actually build an elevator by 2045, using a new carbon nanotube fibre. Just how realistic is this, when so little of the massive undertaking has been prototyped beyond the most basic of levels?

The overall budget is estimated to be around US$90 billion, which suggests an international collaboration in order to offset the many years before the completed structure turns a profit. In addition to the materials issue, there are various other problems yet to be resolved. Chief among these are finding a suitable equatorial location (an ocean-based anchor has been suggested), capturing an asteroid for use as a counterweight, damping vibrational harmonics, removing space junk, protecting against micrometeoroid impacts and shielding passengers from the Van Allen radiation belts. Clearly, developing the construction material is only one small element of the ultimate effort required.

Despite all these issues, general-audience journalism regarding the space elevator - and therefore the resulting public perception - appears as optimistic as the Chinese announcement. How much these two feed back on each other is difficult to ascertain, but there certainly seems to be a case of running before learning to walk. It's strange that China made the claim, bearing in mind how many other rather important things the nation's scientists should be concentrating on, such as environmental degradation and pollution.

Could it be that China's STEM community have fallen for the widespread hype rather than prosaic reality? It's difficult to say how this could be so, considering their sophisticated internet firewall that blocks much of the outside world's content. Clearly though, the world wide web is full of science and technology stories that consist of parrot-fashion copying, little or no analysis and clickbait-driven headlines.

A balanced, in-depth synthesis of the relevant research is often a secondary consideration. The evolutionary biologist Stephen Jay Gould once labelled the negative impact of such lazy journalism as "authorial passivity before secondary sources." In this particular case, the public impression of what is achievable in the next few decades seems closer to Hollywood science fiction than scientific fact.

Of course, the irony is that even the more STEM-minded section of the public is unlikely to read the original technical articles in a professional journal. Instead, we are reliant on general readership material and the danger inherent in its immensely variable quality. As far as the space elevator goes (currently, about seventy centimetres), there are far more pressing concerns requiring engineering expertise; US$90 billion could, for example, fund projects to improve quality of life in the developing world.

That's not to say that I believe China will construct a space elevator during this century, or that the budget could be found anywhere else, either. But there are times when there's just too much hype and nonsense surrounding science and not enough fact. It's easy enough to make real-world science appear dull next to the likes of Star Trek, but now more than ever we need the public to trust and support STEM if we are to mitigate climate change and all the other environmental concerns.

As for the space elevator itself, let's return to Arthur C. Clarke. Once asked when he thought humanity could build one, he replied: "Probably about fifty years after everybody quits laughing." Unfortunately, bad STEM journalism seems to have joined conservatism as a negative influence in the struggle to promote science to non-scientists. And that's no laughing matter.

Monday, 1 April 2019

The day of the dolphin: covert cetaceans, conspiracy theories and Hurricane Katrina

One of the late, great Terry Pratchett's Discworld novels mentions a failed attempt by King Gurnt the Stupid to conduct aerial warfare using armoured ravens. Since real life is always stranger than fiction, just how harebrained are schemes by armed forces to utilise animals in their activities?

Large mammals such as horses and elephants have long been involved in the darker aspects of human existence, but the twentieth century saw the beginnings of more sophisticated animals-as-weapons schemes, including for example, research into the use of insects as disease vectors.

Some of the fruitier research projects of the 1960s saw the recruitment of marine mammals, reaching an apotheosis - or nadir - in the work of John Lilly. A controversial neuroscientist concerned with animal (and extraterrestrial) communication, Lilly even gave psychedelic drugs to dolphins as part of attempts to teach them human language and logic: go figure!

Whether this work was the direct inspiration for military programmes is uncertain, but both the Soviet and United States navies sought to harness the intelligence and learning capabilities of marine mammals during the Cold War. Besides bottlenose dolphins, sea lions were also trained in activities such as mine detection, hardware retrieval and human rescue. Although the Russians are said to have discontinued their research some years ago, the US Navy's Marine Mammal Program is now in its sixth decade and has funding up until at least next year.

Various sources claim that there is a classified component to the program headquartered in San Diego under the moniker the Cetacean Intelligence Mission. Although little of any value is known for certain, researchers at the University of Texas at Austin have been named as one of the groups who have used naval funding to train dolphins - plus design a dolphin equipment harness - for underwater guard duty. A more controversial yet popular claim is for their use as weapon platforms involving remote-controlled knock-out drug dart guns. If this all sounds a bit like Dr. Evil's request for "sharks with lasers" then read on before you scoff.

In the aftermath of the devastation caused by Hurricane Katrina in August 2005, it was discovered that eight out of fourteen bottlenose dolphins that were housed at the Marine Life Oceanarium in Gulfport, Mississippi, had been swept out to sea. Although later recovered by the United States Navy, this apparently innocent operation has a bearing on a similar escape that was given much greater news coverage soon after the hurricane.

Even respected broadsheet newspapers around the world covered the story, generated by a US Government leak, that thirty-eight United States Navy dolphins had also broken free after their training ponds near Lake Pontchartrain, Louisiana, were inundated by Hurricane Katrina. Animal rights groups raised two concerns: that dolphins shouldn't be used as weapons platforms in the first place, and that the animals might struggle to cope in the open waters of the Gulf of Mexico, with its busy shipping lanes. A further issue was the notion that the dolphins might attack civilian divers or vessels.

It would be quite easy here to veer into the laughable fantasies that the Discovery Channel tries to pass off as genuine natural history, if it weren't for a string of disconcerting facts. The eight dolphins that escaped from the Marine Life Oceanarium were kept by the navy for a considerable period before being returned to Mississippi. This was explained at the time as a health check by navy biologists, but there is a more sinister explanation: what if the dolphins were being examined to ensure that they were not military escapees from Lake Pontchartrain?

The latter half of 2005 into early 2006 saw the resumption of fishing in the Gulf of Mexico, following the destruction of almost ninety per cent of the region's commercial fleet in the hurricane. However, many of the smaller boats that did make it back to sea returned to port with unusual damage, or in some cases, had to be towed after failing to make it home under their own power. Much of this was put down to hasty repairs in order to resume fishing - a key component of the local economy - as soon as possible.

Reports released by boatyards during this period show inexplicable damage to rudders and propellers, mainly on shrimp boats. Fragments of metal, plastic and PVC were recovered in a few cases, causing speculation as to where this material had come from. The National Marine Fisheries Service requested access to the flotsam, which was subsequently lost in the chain of bureaucracy; none of the fragments have been seen since. It may not be on the scale of Roswell, but someone in the US military seems to be hiding something here.

It's been over half a century since Dr. Lilly's experiments inspired such fictional cetacean-centred intrigue as The Day of the Dolphin. Therefore, there has been plenty of time for conspiracy theorists to cobble together outlandish schemes on the basis of threadbare rumours. What is certain is that the enormous reduction in the region's fishing that followed in the wake of Hurricane Katrina would have been a boon for the Gulf of Mexico's fish stocks. This would presumably have carried on up the food chain, allowing dolphin numbers to proliferate throughout 2006 and beyond.

Whether the US Navy was able to recover some or all of its underwater army is not known, but it doesn't take much imagination to think of the dolphins enjoying their freedom in the open ocean, breaking their harnesses upon the underside of anchored fishing vessels, determined to avoid being rounded up by their former keepers. The Gulf in the post-Katrina years would have been a relative paradise for the animals compared to their military careers.

Although the United States Navy is said to have spent less than US$20 million per annum on the Marine Mammal Program, a mere drop in the ocean (you know that one's irresistible) compared to the mega-budgets of many Department of Defense projects, the low cost alone suggests the value of attempting to train dolphins for military purposes. Perhaps the truth will emerge one day, once the relevant files are declassified. Or alternatively, a new John Lilly may come along and finally be able to translate dolphinese. In which case, what are the chances that descendants of the Lake Pontchartrain escapees will recall the transition from captivity to freedom with something along the lines of "So long, and thanks for all the fish!"

Wednesday, 20 March 2019

My family & other animals: what is it that makes Homo sapiens unique?

It's a curious thing, but I can't recall ever having come across a comprehensive assessment of what differentiates Homo sapiens from all other animals. Hence this post is a brief examination of what I have found out over the years. I originally thought of dividing it into three neat sections, but quickly discovered that this would be, as Richard Dawkins once put it, 'a gratuitously manufactured discontinuity in a continuous reality.' In fact, I found a reasonably smooth gradation between these segments:
  1. Long-held differences now found to be false
  2. Possible distinctions - but with caveats
  3. Uniquely human traits
Despite the carefully observed, animal-centred stories of early civilisations - Aesop's fable of The Crow and the Pitcher springs to mind - the conventional wisdom until recently was that animals are primarily automatons and as such readily exploitable by humanity. Other animals were deemed vastly inferior to us in kind, not just degree, with a complete lack of awareness of themselves as individuals.

The mirror test developed in 1970 has disproved that for a range of animals, from the great apes to elephants, dolphins to New Caledonian crows. Therefore, individuals of some species can differentiate themselves from their kin, leading to complex and fluid hierarchies within groups - and in the case of primates, some highly Machiavellian behaviour.

Man the tool-maker has been a stalwart example of humanity's uniqueness, but a wide range of animals in addition to the usual suspects (i.e. great apes, dolphins and Corvidae birds) are now known to make and use tools on a regular basis. Examples include sea otters, fish, elephants, and numerous bird species, the latter creating everything from fish bait to insect probes. Even octopuses are known to construct fences and shelters, such as stacking coconut shells - but then they do have eight ancillary brains in addition to the main one!

We recognise regional variations in human societies as the result of culture, but some animal species also have geographically-differentiated traits or tools that are the obvious equivalent. Chimpanzees are well known for their variety of techniques used in obtaining food or making tools. These skills are handed down through the generations, remaining different to those used in neighbouring groups.

Interestingly, farming has really only been adopted by the most humble of organisms, namely the social insects. Ants and termites farm aphids and fungi in their complex, air-conditioned cities that have more than a touch of Aldous Huxley's Brave New World about them; in a few species, the colonies may even largely consist of clones!

Although many animals construct nests, tunnels, dams, islets or mounds, these appear to serve purely functional purposes: there is no equivalent of the human architectural aesthetic. Octopus constructions aside, birds for example will always build a structure that resembles the same blueprint used by the rest of their kind.

Many species communicate by aural, gestural or pheromonal languages, but only humans can store information outside the body and pass it between generations separated in time. Bird song might sound pretty, but again, it appears to be a series of basic, hard-wired communications. Conversely, humpback whale song may contain artistic values, but we just don't know enough about it to judge it in this light.

Birds and monkeys are happy to hoard interesting objects, but there is little aesthetic sense in animals other than that required to identify a high-quality mate. In contrast, there is evidence to suggest that other species in the hominin line, such as Neanderthals and Homo erectus, created art in forms recognisable today, including geometric engravings and jewellery.

Some of our ancestors' earliest artworks are realistic representations, whereas when armed with a paint brush, captive chimps and elephants produce abstract work reminiscent of that of pre-school children. We should remember that only since the start of the twentieth century has abstract art become an acceptable form for professional artists.

Jane Goodall's research on the Gombe chimps shows that humans are not the only animal to fight and kill members of the same species for reasons other than predation or rivalry. Sustained group conflict may be on a smaller scale and have fewer rules than sanctioned warfare, but it still bears enough similarity to our own violence to say that humanity is not its sole perpetrator. One interesting point is that although chimps have been known to use sharpened sticks to spear prey, they haven't as yet turned these weapons on each other.

Chimpanzees again have been shown to empathise with other members of their group, for example after the death of a close relative. Altruism has also been observed in the wild, but research suggests there is frequently another motive involved as part of a long-term strategy. This is countered by the notion that humans are able to offer support without the expectation of profit or gain in the future; then again, what percentage of such interactions is due to a genuinely profitless motivation is open to question.

A trickier area is whether ritual is unique to Homo sapiens. While we may have usurped the alpha-male position in domesticated species such as dogs, their devotion and loyalty seem too far from deity worship to be a useful comparison; surely the idea of organised religion is alien to all other species? Archaeological evidence shows what appear to be Neanderthal rituals centred on cave bears, as well as funereal rites, but the DNA evidence for interbreeding with modern humans doesn't give enough separation to allow religion to be seen as anything other than a human invention. What is probably true, though, is that we are the only species aware of our own mortality.

One area in which humans used to be deemed sole practitioners is abstract thought, but even here there is evidence that the great apes have some capability, albeit no greater than that of a pre-schooler. Common chimps and bonobos raised in captivity have learnt - in some cases by observation, rather than being directly taught - how to use sign language or lexigrams to represent objects and basic grammar. It's one thing to see a button with a banana on it and to learn that pressing it produces a banana, but to receive the same reward for pressing an abstract symbol shows a deeper understanding of relationship and causality.

The ability to consider a potential future is also shared with birds of the Corvidae family, which can plan several steps ahead. Where humans are clearly far ahead is in degree rather than just kind: we can weigh up numerous future paths and act accordingly, cogitating about possibilities that might occur or may never be possible at all. This level of sophistication and branch analysis appears to be uniquely human. Both prose and poetic literature are also likely to be uniquely human - at least until we can decipher humpback whale song.

Finally, there is science, possibly the greatest of human inventions. The multifarious aspects of the scientific endeavour, from tentative hypothesis to experimentation, advanced mathematics to working theory, are unlikely to be understood let alone attempted by any other species. The combination of creative and critical thinking, rigour and repetition, and objectivity and analysis requires the most sophisticated object in the known universe, the human brain. That's not to say there aren't far more intelligent beings out there somewhere, but for now there is one clear activity that defines us as unique. And thank goodness it isn't war!

Sunday, 10 March 2019

Buzzing away: are insects on the verge of global extinction?

It's odd how some of these posts get initiated. For this particular one, there were two driving factors. One was passing a new house on my way to work where apart from the concrete driveway, the front garden consisted solely of a large square of artificial grass; the owners are clearly not nature lovers! The second inspiration was listening to a BBC Radio comedy quiz show, in which the panel discussed the recent report on global insect decline without being able to explain why this is important, apart from a vague mention of pollination.

Insect biologists have long sung the praises of these unrewarded miniature heroes, from JBS Haldane's supposed adage about God being "inordinately fond of stars and beetles" to EO Wilson's 1987 speech that described them as "the little things that run the world." In terms of numbers of species and individuals, invertebrates, especially insects, are the great success story of macroscopic life on our planet. So if they are in serious decline, does that spell trouble for Homo sapiens?

The new research claims that one-third of all insect species are currently endangered, extrapolating to wholesale extinction for the class Insecta over the next century. Although the popular press has started using evocative phrases such as "insect genocide" and even "insectageddon", just how accurate are these dramatic claims?

The IUCN Red List currently describes three hundred insect species as critically endangered and a further seven hundred as vulnerable, but this is a tiny proportion of the total of...well, a lot more, at any rate. One oft-quoted figure is around one million insect species, although entomologists have estimated anywhere from 750,000 up to 30 million, with many species still lacking formal scientific identification. The hyperbole could therefore easily sound like unnecessary scaremongering, until you consider the details.

The new report states that butterflies and caddis flies are suffering the greatest decline, while cockroaches - which, as anyone who has faced a household infestation will know, are likely to remain around until the end of the world - and flies are the least affected orders. So, to paraphrase Monty Python, what have the insects ever done for us?

Pollination is of course of key importance, both to horticulture and to un-managed 'wild' environments. Insects are near the base of many food webs; if their numbers were much reduced, never mind removed entirely, the impact on the rest of the ecosystem would be catastrophic. With the human population set to top ten billion in thirty years' time, we require ever larger regions of productive land for agriculture. Arthropods may be small at an individual level, but in total they amount to about seventeen times the mass of all of us H. sapiens. Insects also replenish the soil, as alongside bacteria they break down dead matter and faecal material. So important is this latter function that New Zealand has been trialling non-native dung beetles to aid cattle farmers.

One key way to save fresh water and lessen the generation of the potent greenhouse gas methane is to reduce meat consumption in favour of insect protein. If insects are no longer around, then that will be an additional challenge in reducing environmental degradation. This of course also ignores the fact that insects are already a component in the diet of many developing nations. Last year I wrote about how scientists have been creating advanced materials derived from animals. Again, we are shooting ourselves in the foot if we allow this ready-made molecular library to be destroyed.

What is responsible for this global decline? Perhaps unsurprisingly, it turns out to be the usual suspects. Agricultural chemicals including pesticides have been associated with honey-bee colony collapse disorder (not incidentally, some tests have found honey samples with neonicotinoids - the most widely used insecticides - exceeding the recommended human dosage), so it is likely the same culprit is affecting other insects. Fresh waterways, home to many aquatic insect species, are frequently as polluted as the soil, whether due to agricultural run-off or industrial contaminants. Wild landscapes are being converted with great haste into farmland and urban sprawl, with an obviously much-reduced biota.

Climate change is playing its part, with soil acidity increasing just as it is in the oceans. Even areas as remote as central Australia have seen marked decreases in insects as higher temperatures and lower rainfall outpace the animals' ability to adapt to the new conditions. I've often mentioned the role of invasive species in the decimation of indigenous vertebrates, but insects are equally prone to suffer from the arrival of newcomers. Although New Zealand has very strict biosecurity protocols, the likes of Queensland fruit flies and brown marmorated stink bugs are still occasionally found in or around ports of entry.

Many nations have no such procedures in place, resulting in local species being out-competed or killed by introduced species or pathogens to which they have no resistance. Until fairly recently, even New Zealand had a lax attitude to the issue, resulting in the decline of native species such as carabid beetles. When I conducted a brief survey of my garden in 2017 I found that one-third of the insect species were non-native, most of these being accidental imports since the arrival of European settlers.

If insects are so vital to our survival, why has there been so little interest in their well-being? There are some fairly obvious suggestions here. Firstly, at least in Western cultures, insects have been deemed dirty, ugly things that can be killed without a second thought. Wasps, ants and cockroaches in particular are seen as unwelcome pests, with typical insect-related phrases including "creepy crawlies" and "don't let the bed bugs bite".

It's fairly well known that malaria-carrying mosquitoes are the most dangerous animals for us humans in terms of fatalities, and the widespread outbreaks of the Zika virus haven't done insects' reputation any favours either. As Brian Cox's television series Wonders of Life showed, their small size has given them veritable super powers compared to us lumbering mammals, from climbing sheer surfaces (as a praying mantis was doing a few nights ago on my window) to having amazing strength-to-weight ratios. All in all, insects are a bit too alien for their own good!

Clearly, scale prejudice is also a key factor. On a recent trip to Auckland Central Library I only found one book on insects versus dozens on birds. Photographic technology has been a double-edged sword when it comes to giving us a clearer picture of insects: close-ups are often greeted with revulsion, yet until Sir David Attenborough's 2005 BBC series Life in the Undergrowth, there was little attempt to film their behaviour with the same level of detail as say, the lions and antelopes of the Serengeti. It should also be mentioned that when Rachel Carson's ground-breaking book about the dangers of pesticides, Silent Spring, was published in 1962, the resulting environmentalism was largely in support of birds rather than insects.

Among all this doom and gloom, are there any ways to prevent it? One thing is certain: it won't be easy. The agricultural sector would have to make drastic changes for a start, becoming much smarter in its use of chemicals and being held responsible for the local environment, including waterways. Vertical farming and other novel techniques could reduce the need for new agricultural land and water usage, but developing nations would be hard-pressed to fund these themselves.

Before any major undertaking, there's going to have to be either a fundamental crisis, such as food shortages, in a rich nation or a massive public relations exercise to convince people to consider insects in the same light as giant pandas or dolphins. This is not going to be easy, but as David Attenborough put it: "These small creatures are within a few inches of our feet, wherever we go on land - but often, they're disregarded. We would do very well to remember them."

Sunday, 24 February 2019

Core solidification and the Cambrian explosion: did one beget the other?

Let's face it, we all find it easier to live our lives with the help of patterns. Whether it's a daily routine or consultation of an astrology column (insert expletive of choice here) - or even us amateur astronomers guiding our telescopes via the constellations - our continued existence relies on patterns. After all, if we didn't innately recognise our mother's face or differentiate harmless creatures from the shape of a predator, we wouldn't last long. So it shouldn't be any surprise that scientists also rely on patterns to investigate the complexities of creation.

Richard Feynman once said that a scientific hypothesis starts with a guess, a claim that should perhaps be taken with a pinch of salt. Nonetheless, scientists like to use patterns when considering explanations for phenomena; at first glance, this technique matches the principle of parsimony, or Occam's Razor, i.e. that the simplest explanation is usually the correct one - excluding quantum mechanics, of course!

An example in which a potential pattern was widely publicised prior to confirmation via hard data was that of periodic mass extinction, the idea being that a single type of cause might be behind the five greatest extinction events. Four years after Luis Alvarez's team suggested that an impactor - later linked to the Chicxulub crater - caused the Cretaceous-Paleogene extinction 66 million years ago, the paleontologists David Raup and Jack Sepkoski published a 1984 paper hypothesising extinctions at regular intervals due to extraterrestrial impacts.

This necessitated the existence of an object that could cause a periodic gravitational perturbation, diverting asteroids and comets into the inner solar system. The new hypothesis was that we live in a binary star system, with a dwarf companion star in a highly elliptical, 26-million-year orbit. This would be responsible for the perturbation when it was at perihelion (i.e. closest approach to the sun).
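For a sense of the distances involved, a quick application of Kepler's third law is revealing (my own rough figures, not from the original papers). In solar units - period T in years, semi-major axis a in astronomical units, combined mass close to one solar mass - the law reads a³ ≈ T², so:

a ≈ (2.6 × 10⁷)^(2/3) AU ≈ 88,000 AU ≈ 1.4 light-years

At such distances a companion star would be only loosely bound to the Sun, with passing stars and the galactic tide liable to disturb its orbit - one of the objections raised against the hypothesis even before the observational searches came up empty.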

What's interesting is that despite the lack of evidence, the hypothesis was widely publicised in popular science media, with the death-dealing star being appropriately named Nemesis after the Greek goddess of retribution. After all, the diversification of mammals was a direct result of the K-T extinction and so of no small importance to our species.

Unfortunately, further research has shown that mass extinctions don't fall into a neat 26 million-year cycle. In addition, orbiting and ground-based telescopes now have the ability to detect Nemesis and yet have failed to do so. It appears that the hypothesis has reached a dead end; our local corner of the universe probably just isn't as tidy as we would like it to be.

Now another hypothesis has appeared that might seem to belong in a similar category, of neat pattern-matching taking precedence over solid evidence. Bearing in mind the importance of the subject under scrutiny - the origin of complex life - are researchers jumping the gun in order to gain kudos if proven correct? A report on 565-million-year-old minerals from Quebec, Canada, suggests that at that time the Earth's magnetic field was less than ten percent of its present strength. This is considerably lower than an earlier estimate of forty percent. Also, the magnetic poles appear to have reversed far more frequently during this period than they have since.

As this is directly related to the composition of the Earth's core, it has led to speculation that the inner core was then in the final stage of solidification. This would have caused increased movement in the liquid, iron-rich outer core, and thus the rapid generation of a much stronger magnetic field. In turn, the greater the field's dipole intensity, the fewer high-energy particles - both cosmic rays and particles from our own sun - reach the Earth's surface. What is particularly interesting about this time is that it is just (i.e. about twenty million years) prior to the so-called Cambrian explosion, following three billion years or so of only microbial life. So were these geophysical changes responsible for a paradigm shift in evolution? Before accepting that, we would need to check the accuracy of this apparently neat match.
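To illustrate that last relationship (my own sketch, using a standard approximation - the Störmer cutoff - rather than anything from the Quebec study): in a dipole field, the minimum rigidity a charged particle needs in order to penetrate to the surface scales linearly with the dipole moment M and falls off steeply with geomagnetic latitude λ:

R_cutoff ∝ M cos⁴λ (roughly 15 GV at the equator for the present-day field)

So a field at a tenth of its present strength implies an equatorial cutoff of only around 1.5 GV, letting a far larger flux of energetic particles through to the atmosphere and the uppermost layers of the ocean.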

It's well known that some forms of bacteria can survive in much higher radiation environments than us larger-scale life forms; extremophiles such as Deinococcus radiodurans have even been found thriving inside nuclear reactors. It would therefore seem obvious that more complex organisms couldn't evolve until the magnetic field was fairly strong. But until circa 430 million years ago there was no life on land (there is now evidence that fungi may have been the first organisms to survive in this harsh environment). If all life was in the sea, wouldn't the deep ocean have provided the necessary radiation protection for early plants and animals?

By 600 million years ago the atmospheric oxygen content was still only about ten percent of today's value; clearly, those conditions would have been of little use to almost any air-breathing animal we know to have existed. In addition, the Ediacaran assemblage, albeit somewhat different from most subsequent higher animals, arose no later than this time - with chemical evidence suggesting its development stretched back a further 100 million years. Therefore the Canadian magnetic mineral evidence seems to be too late for core solidification and the resulting stronger magnetic field to have given the kick-start to a more sophisticated biota.

In addition, we shouldn't forget that it is the ozone layer that acts as an ultraviolet shield; UVB is just as dangerous to many organisms, including near-surface marine life, as cosmic rays and high-energy solar particles. High-altitude ozone is thought to have reached current density by 600 million years ago, with blue-green algae as its primary source. O2 levels also increased at this time, perhaps driven by climate change at the end of a global glaciation.

Although the "Snowball Earth" hypothesis - that at least half of all ocean water was frozen solid during three or four periods of glaciation - is still controversial, there is something of a correlation in time between the geophysical evidence and the emergence of the Ediacaran fauna. As to the cause of this glacial period, it is thought to have been a concatenation of circumstances, with emergent plate tectonics as a primary factor.

How to conclude? Well, we would all like to find neat, obvious solutions, especially to key questions about our own origin. Unfortunately, the hypothesis based on the magnetic mineral evidence appears to selectively ignore the evolution of the Ediacaran life forms and the development of the ozone layer. The correlation between the end of "Snowball Earth" and the Ediacaran biota evolution is on slightly firmer ground, but the period is so long ago that even dating deposits cannot be accurate except to the nearest million years or so.

It's certainly a fascinating topic, so let's hope that one day the evidence will be solid enough for us to finally understand how and when life took on the complexity we take for granted. Meanwhile, I would take any speculation based on new evidence with a Feynman-esque pinch of salt; the universe frequently fails to match the nice, neat, parcels of explanations we would like it to. Isn't that one of the factors that makes science so interesting in the first place?

Monday, 11 February 2019

The Square Kilometre Array: is it the wrong big science for New Zealand?

I've previously written about the problems besetting some mega-budget science projects and the notion that perhaps they should lose precedence to smaller programmes with quicker returns to both science and society. Of course there are advantages to long-term international STEM collaboration, including social, economic and political benefits, but there is a good case for claiming that projects are sometimes initiated without a full appreciation of the details.

Take, for example, the Square Kilometre Array or SKA, the largest science project New Zealand has ever been involved with. Headquartered at the UK's Jodrell Bank Observatory (incidentally, I've been there a few times and it's well worth a visit if you're in the vicinity), the project involves twelve key nations collaborating to construct two main arrays, one in Australia and the other in South Africa and some of its neighbours. The combined arrays will have a sensitivity fifty times greater than previous radio telescopes, allowing them to survey the sky far faster than has been done before and to look back in time much earlier than current instruments can.

But such paradigm-shifting specifications come with a very high price tag – and the funding sources are yet to be finalised. The €1.8 billion project is scheduled to start Phase 1 construction in 2024 and aims to begin observations four years later. Research will include a wide range of fundamental astrophysical questions, from exploring the very early universe only 300,000 years after the Big Bang to testing general relativity, gaining information on dark energy and even some SETI research.

The New Zealand contribution is organised via the Australia-New Zealand SKA Coordination Committee (ANZSCC) and is geared towards data processing and storage. The Central Signal Processor and Science Data Processor are fundamental components of the project, since the radio telescopes are expected to generate more data than the world currently stores. As well as fostering closer collaboration between the scientists and engineers of various nations, one of the aims of SKA is to become a source of public science education, something I have repeatedly pointed out is in desperate need of improvement.

So if this all seems so promising, why has the New Zealand Government announced that it may pull back from committing the outstanding NZ$23 million (equal to less than 10% of Australia's funding)? To date, the country has paid less than NZ$3 million. In 2015 I discussed the danger of the country falling behind in cutting-edge STEM research and, Rocket Lab aside (which is, after all, an American-owned company despite its kiwi founder), the situation hasn't really changed. So why did Research, Science and Innovation Minister Megan Woods declare this potential about-turn, which may well relegate New Zealand to associate membership status?

The initial answer appears to be one of pure economics. Although the project is generating development of world-class computer technology, a report has questioned the long-term benefits of investing such comparatively large sums of public money. India is already an associate member, Germany has been considering a similar downgrade for some years and Canada may follow suit. The project is already a decade behind schedule, and New Zealand had hoped to be an array-hosting nation but lost out due to a lower bid from South Africa. SKA is run by a not-for-profit organisation of the same name, so presumably any financial rewards are of a secondary nature (perhaps along the lines of patents or new technologies that can be repurposed elsewhere).

Interestingly, New Zealand's science community has been divided on the issue. While Auckland University of Technology and Victoria University of Wellington have objected to the downgrade, the University of Auckland's head of physics, Richard Easther, has supported the Ministry of Business, Innovation and Employment (MBIE) decision, saying that far from providing financial and long-term science benefits (in both applied computing and astrophysical data), SKA is a white elephant, hinting that it might well be obsolete by the time it starts gathering data.

Another University of Auckland astrophysicist, Dr Nick Rattenbury, argues that the nation's public funding infrastructure is currently too primitive for it to become involved in such international mega-budget STEM projects. I simply don't know enough detail to judge whether adages such as 'you need to speculate in order to accumulate' apply here; it's clearly a well-thought-out programme, unlike, say, the politically motivated yet vague and probably unworkable Predator Free 2050 scheme.

If SKA was committed to solving an immediate practical problem in the fields of say, environmental degradation, food and water production, or medicine, I would probably have no hesitation in supporting it whole-heartedly, regardless of the cost to the public purse. But the universe has been around almost fourteen billion years, so I for one don't mind if it holds onto a few of its fundamental secrets for a little while longer.

Saturday, 26 January 2019

Concrete: a material of construction & destruction - and how to fix it

How often is it that we fail to consider what is under our noses? One of the most ubiquitous of man-made artifices - at least to the 55% of us who live in urban environments - is concrete. Our high-rise cities and power stations, farmyard silos and hydroelectric dams wouldn't exist without it. As it is, global concrete consumption has quadrupled over the past quarter century, making it second only to water among humanity's most-consumed substances. Unfortunately, it is also one of the most environmentally unfriendly materials on the planet.

Apart from what you might consider to be the aesthetic crimes of the bland, cookie-cutter approach of International Modernist architecture, there is a far greater issue: the environmental degradation caused by the concrete manufacturing process. Cement is a key ingredient of the material, but its production generates around 8% of all carbon dioxide emissions worldwide. As such, there needs to be a 20% reduction over the next ten years in order to fulfil the Paris Agreement - yet it is thought there may be a 25% increase in demand for concrete during this time span, particularly from the developing world. Although lower-carbon cements are being developed, concrete production causes other environmental issues as well. In particular, sand and gravel extraction is bad for the local ecology, including catastrophic damage to the sea bed.
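It's worth spelling out the squeeze implied by those two figures (my own arithmetic, using only the percentages quoted above): if total emissions have to fall by 20% while concrete output grows by 25%, then the emissions per tonne of concrete must drop to

0.80 / 1.25 = 0.64

of today's level - a cut of roughly 36% in carbon intensity within a decade, a much taller order than the headline 20% suggests.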

So are there any alternatives? Since the 1990s, television series such as Grand Designs have presented British, New Zealand and Australian-based projects for (at times) extremely sustainable houses made from materials such as shipping containers, driftwood, straw bales, even shredded newspaper. However, these are mostly the unique dream builds of entrepreneurs, visionaries and, let's face it, latter-day hippies. The techniques used might be suitable for domestic architecture, but they are impractical at a larger scale.

The US firm bioMASON studied coral in order to develop an alternative to conventional bricks, which generate large amounts of greenhouse gases during the firing process. It uses a biomineralisation process, which basically consists of injecting microbes into nutrient-rich water containing sand and watching the rod-shaped bacteria grow into bricks over three to five days. It's still comparatively early days for the technology, so meanwhile, what about applying the three environmental 'Rs' of Reduce, Reuse and Recycle to conventional concrete design and manufacturing?

1 Reduce

3D printers are starting to be used in the construction industry to fabricate building and structural components, even small footbridges. Concrete extrusion designs require less material than conventional casting in timber moulds - not to mention removing the need for the timber itself. One common technique is to build up shapes such as walls from thin, stacked layers. The technology is time-effective too: walls can be built up at a rate of several metres per hour, which may induce companies to make the initial outlay for the printing machinery.

As an example of the low cost, a 35-square-metre demonstration house was built in Austin, Texas, last year at a cost of US$10,000 - and it took only two days to build. This year may see an entire housing project built in the Netherlands using 3D-printed concrete. Another technique has been pioneered at Exeter University in the UK, using graphene as an additive. This greatly increases both the water resistance and the strength compared to conventional concrete, thus halving the amount of material required.

2 Reuse

Less than a third of the material from conventionally built brick and timber structures can be reused after demolition. The post-war construction industry has continually reduced the quality of the building materials it uses, especially in the residential sector; think of prefabricated roof trusses made of new-growth, comparatively unseasoned timber and held together by perforated connector plates. The intended lifespan of such structures could be as little as sixty years, with some integrated components such as roofing failing much sooner.

Compare this to Roman structures such as aqueducts and the Pantheon (the latter still being the world's largest unreinforced concrete dome) which are sound after two millennia, thanks to their volcanic ash-rich material and sophisticated engineering. Surely it makes sense to use concrete to construct long-lasting structures, rather than buildings that will not last as long as their architects? If the reuse of contemporary construction materials is minimal (about as far removed as you can get from the traditional approach of robbing out stone-based structures in their entirety) then longevity is the most logical alternative.

3 Recycle

It is becoming possible to both recycle other waste into concrete-based building materials and use concrete itself as a secure storage for greenhouse gases. A Canadian company called CarbonCure has developed a technique for permanently sequestering carbon dioxide in their concrete by converting it into a mineral during the manufacturing process, with the added benefits of increasing the strength of the material while reducing the amount of cement required.

As for recycling waste material as an ingredient, companies around the world have been developing lightweight concrete incorporating mixed plastic waste, the latter comprising anywhere from 10% to 60% of the volume, particularly with the addition of high-density polyethylene.

For example, the New Zealand company Enviroplaz can use unsorted, unwashed plastic packaging to produce Plazrok, a polymer aggregate for creating a concrete which is up to 40% lighter than standard material. In addition, the same company has an alternative to metal and fibreglass panels in the form of Plaztuff, a fully recyclable, non-corroding material which is one-seventh the weight of steel. It has even been used to build boats, as well as land-based items such as skips and playground furniture.

Therefore what might seem an intractable problem in fact has a variety of overlapping solutions that would allow sustainable development in the building and civil engineering sector. It is somewhat unfortunate, then, that the conservative nature of these industries has until recently stalled progress in replacing a massive pollutant with much more environmentally sound alternatives. Clearly, green architecture doesn't have to be the sole prerogative of the driftwood dreamers; young entrepreneurs around the world are seizing the opportunity to create alternatives to the destructive effects of construction.

Friday, 11 January 2019

Hot, cold or in between: thermoregulation and public misunderstanding of science

I recently spotted an intriguing paleontology article concerning the 180-million-year-old fossil remains of an ichthyosaur, a marine reptile from the Early Jurassic. The beastie, belonging to the genus Stenopterygius, is so well preserved that it shows colouration patterns (if not the colours themselves) on patches of scaleless skin, as well as a thick layer of insulating fat or blubber. What makes the latter so intriguing is that reptiles just aren't meant to have blubber. Then again, like some snakes and skinks today, ichthyosaurs must have given birth to live young. Thus the gap between reptiles and mammals surely grows ever smaller?

This conundrum touches on some interesting issues about the public's knowledge of science. Several times I've commented on what Richard Dawkins calls the "tyranny of the discontinuous mind", the way in which we use categorisation to make it easier to understand the world. It might seem that this is the very essence of some aspects of science, as in the New Zealand physicist Ernest Rutherford's famously ungenerous quip that "all science is either physics or stamp collecting." Indeed, examination of the life and work of many early botanists, for example, might appear to verify this statement. However, there needs to be an understanding that science requires a flexible mindset, a fundamental scientific process being the discarding of a pet theory in favour of a more accurate one.

I'm sure I've remarked countless times - again, echoing Professor Dawkins - that science is in this respect the antithesis of most religions, which set key ideas into stone and refuse to accept any challenges towards them. In the case of the blubber-filled Stenopterygius, it is still a reptile, albeit one that had many of the attributes of mammals. As for the latter, from our pre-school picture books onwards we tend to think of the main mammalian subclass, the placentals, but there are two smaller subclasses: the marsupials, such as the kangaroo; and the monotremes, for example the duck-billed platypus. It has been known since the 1880s that the platypus lays eggs rather than giving birth to live young, a characteristic it shares with the other four monotreme species alive today. In addition, their body temperature is five degrees Celsius lower than that of placental mammals, part of a suite of features presumably retained from their mammal-like reptile ancestors.

Even so, these traits do not justify the comment made by host Stephen Fry in a 2005 episode of the BBC TV quiz show QI, when he claimed that marsupials are not mammals! Richard Dawkins has frequently pointed out that it would be unacceptable to have a similar level of ignorance about the arts as there is on scientific matters, with this being a clear case in point as regards the cultured and erudite Mr Fry. Yet somehow, much of the general public has either a lack of, or a confusion concerning, basic science. Indeed, only last week I listened to a BBC Radio topical comedy show in which none of the panel members could work out why one face of the moon is always hidden from our view. Imagine the response if it had been a basic lack of knowledge in the arts and literature - for example, if an Oxbridge science graduate had claimed that Jane Austen had written Hamlet!

Coming back to the ichthyosaur, one thing we may have learnt as a child is that some animals are warm-blooded and others cold-blooded. This may be useful as a starting point, but it is an overly simplistic and largely outmoded view of the relevant biology; such binary categorisation is of little use beyond primary school age. In fact, there is a series of steps from endothermic homeotherms (encompassing most mammals and birds) to ectothermic poikilotherms (most species of fish, reptiles, amphibians and invertebrates), with the former metabolic strategy having evidently developed from the latter.

Ichthyosaurs are likely to have had one of the intermediate metabolisms, as may have been the case for some species of dinosaurs, possibly the smaller, feathered, carnivorous theropods. Likewise, some tuna and shark species are known to be able to produce heat internally, and in 2015 researchers at the US National Marine Fisheries Service announced that the opah had been found to be a whole-body endotherm. Clearly, the boundaries between us supposedly higher mammals and everything else are far less secure than we had previously believed.

At times, scientific terminology might appear too abstruse, too removed from the everyday and of little practical use outside of a pub quiz - but then, does being able to critique Shakespeare or Charles Dickens help to reduce climate change or create a cure for cancer? Of course we should strive to be fully rounded individuals, but for too long STEM has been side-lined or stereotyped as too difficult or irrelevant when compared with the humanities.

Lack of understanding of the subtleties and gradations (as opposed to clearly defined boundaries) in science makes it easy for anti-science critics to generate public support. Ironically, this criticism tends to take one of two opposing forms: firstly, that science is mostly useless - as epitomised by the Ig Nobel Prize; and alternatively, that it leads to dangerous inventions, as per the tabloid scare-mongering around genetically modified organisms (GMOs), or 'Frankenfoods' as they are caricatured.

Being able to discern nuanced arguments, such as the current understanding of animal thermoregulation, is a useful tool for all of us. Whether it is giving the public a chance to vote in scientifically related referendums or just arming them to avoid quack medicine, STEM journalism needs to improve beyond the lazy complacency that has allowed such phrases as 'warm-blooded', 'living fossil', 'ice age' and 'zero gravity' to be repeatedly misused. Only then will science be seen as useful, relevant and, above all, far more approachable than it is currently deemed to be.