
Tuesday 25 February 2020

Falling off the edge: in search of a flat Earth

It's just possible that future historians will label the 21st century as the Era of Extreme Stupidity. In addition to the 'Big Four' of climate change denial, disbelief in evolution by natural selection, young Earth creationism and the anti-vaxxers, there are groups whose oddball ideas have rather less impact on our ecosystem and ourselves. One segment of people that I place in the same camp as UFO abductees with their probing fixation is the believers in a flat Earth.

Although on the surface this - admittedly tiny - percentage of people appears to be more amusing than harmful, their media visibility makes them a microcosm of the appalling state of science education and critical thinking in general. In addition, their belief in an immense, long-running, global conspiracy adds ammunition to those with similar paranoid delusions, such as the moon landing deniers. As an example of how intense those beliefs can be (at times there's just a whiff of religious fanaticism), the American inventor and stuntman 'Mad' Mike Hughes was recently killed flying a self-built rocket intended to prove that the Earth is a disc.

I won't bother to describe exactly what the flat Earthers take to be true, except that their current beliefs resemble a description of the late, great Terry Pratchett's fantasy Discworld - albeit without the waterfall around the edge of the disc. For anyone who wants to test the hypothesis themselves rather than rely on authority (the mark of a true scientist) there are plenty of observational methods to try. These include:
  1. Viewing the Earth's shadow on the Moon during a lunar eclipse
  2. Noticing that a sailing ship's mast disappears/reappears on the horizon after/before the hull (see the sketch below)
  3. How certain stars are only visible at particular latitudes
For anyone with a sense of adventure, there is also the option of building a high-altitude balloon or undertaking a HAHO skydive to photograph the Earth's curvature - from any point on the planet!
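The second observation in the list above is simple enough to check with a little geometry. Here is a minimal sketch (Python, assuming a spherical Earth with a mean radius of 6,371 km and ignoring atmospheric refraction) of the approximate distance to the horizon at different eye heights - the reason a ship's hull drops out of sight while its masthead is still visible:

    import math

    EARTH_RADIUS_M = 6_371_000  # mean radius of the Earth in metres

    def horizon_distance_km(eye_height_m: float) -> float:
        """Approximate distance to the horizon for an observer at the given
        height, assuming a spherical Earth and ignoring refraction."""
        return math.sqrt(2 * EARTH_RADIUS_M * eye_height_m) / 1000

    # An observer on deck versus a lookout partway up the mast:
    print(f"Eye height  2 m: horizon at about {horizon_distance_km(2):.1f} km")   # ~5 km
    print(f"Eye height 30 m: horizon at about {horizon_distance_km(30):.1f} km")  # ~20 km

On a flat Earth, by contrast, there would be no geometric horizon at all: the hull would simply shrink with distance rather than disappear from the bottom up.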

It's easy to suggest that perhaps our brains just aren't up to the task of deciphering the intricacies of a 13.7-billion-year-old universe, but basic experiments and observations made over two thousand years ago were enough for Greek scientists to confirm both the shape and size of our planet. So what has changed in the past century or so to turn back the clock, geophysically speaking?

The modern take on a flat Earth seems to have begun in the late 19th century, with an attempt - similar to that of contemporary mid-Western creationists - to ignore scientific discoveries that disagree with a literal interpretation of the Old Testament. Indeed, the forerunners of today's flat Earthers were anti-science in many respects, also denying that prominent enemy of today's Biblical literalists, evolution by natural selection. However, many of the 21st century's leading adherents to a disc-shaped Earth have more sympathy for and interest in scientific discoveries, even accepting such politically contentious conclusions as rapid, human-induced climate change.

This topic is laden with ironies, few greater than the fact that a large proportion of the evidence for global warming is supplied by space agencies such as NASA. The latter has long been claimed by the Flat Earth Society to be a leading conspirator and purveyor of faked imagery in the promotion of a spherical Earth (yes, to all pedants: I know that strictly speaking our planet is an oblate spheroid, not purely spherical).

Today's flat Earth societies follow the typical pseudo-scientific/fringe approach, analysing the latest scientific theories for material they can cherry-pick and cannibalise to support their ideas. In recent years they've even tackled key new developments such as dark energy; in fact, about the only area in which they are lagging behind is the incorporation of elements of quantum mechanics.

But for anyone with an understanding of parsimony or Occam's Razor, the physics of a flat Earth has about as much likelihood as Aristotle's crystalline spheres. It isn't just the special pleading for localised astrophysics (since the other planets are deemed spherical); there is also the obvious absurdity of a global conspiracy involving rival nations and potentially hundreds of thousands of people - with no explanation of what the conspirators would gain from the deception.

Even the vast majority of the public, with little interest in or understanding of the physics, are presumably puzzled by this apparent lack of motivation when they consider the flat Earth hypothesis. In a nutshell, what's in it for the conspirators? Until recently, NASA (nicknamed 'Never A Straight Answer') was the main enemy, but with numerous other nations and private corporations building space vehicles, there is now a plethora of conspiracy partners. Going back half a century to the height of the Cold War, why, for example, would the USA and Soviet Union have agreed to conspire? As yet, there hasn't been anything approaching a satisfactory answer; but as Carl Sagan said: "Extraordinary claims require extraordinary evidence."

Unlike most fringe groups, flat Earthers don't appear to favour other popular conspiracy theories over scientific evidence. Yet somehow, their ability to support ludicrous ideas whilst denying fundamental observations and the laws of physics in the light of so much material evidence is astonishing. Of course our species doesn't have a mental architecture geared solely towards rational, methodical thought processes, but the STEM advances that Homo sapiens has made over the millennia prove we are capable of suppressing the chaotic, emotional states we usually associate with young children.

Whether we can transform science education into a cornerstone topic, as daily-relevant as reading, writing and arithmetic, remains to be seen. Meanwhile, the quest continues for funding a voyage to find the Antarctic ice wall that prevents the oceans falling over the edge of the world. Monty Python, anyone?

Wednesday 22 January 2020

Wildfires and woeful thinking: why have Australians ignored global warming?

In a curious example of serendipity, I was thinking about a quote from the end of Carl Sagan's novel Contact ("For small creatures such as we the vastness is bearable only through love") just a few minutes before discovering his daughter Sasha Sagan's book For Small Creatures Such as We. Okay, so I didn't buy the book - due to the usual post-Christmas funds shortage - and cannot provide a review, but this indication of our place in the scale of creation is something that resonates deep within me.

I've often discussed how biased we are due to our physical size, especially when compared to other species we share the planet with. However, I've never really considered that other fundamental dimension, time. Another Carl Sagan quote echoes many a poet's rumination on our comparatively brief lifespan: "We are like butterflies who flutter for a day and think it is forever."

There's more to this than just a fairly familiar poetic conceit. Earlier this month I was given a brief taste of what it might be like to live on Mars, thanks to high-altitude dust and ash transported across the Tasman Sea from the Australian bush fires. By three o'clock in the afternoon a New Zealand summer's day had turned into an eerie orange twilight, with birds and nocturnal insects starting their evening routine some five hours early. There was even a faint powdery, acrid taste in the air, adding to the sense of other-worldliness.

Apart from the obvious fact that this is an example of how climate change in one nation can affect another, there is a more disturbing element to all this. Why is it that, despite the reports and general consensus of the global climate science community, Australians have shown a woeful lack of interest in - or indeed outright negativity towards - climate change?

Could it be that our society is now centred upon such short increments of time (competing businesses trying to out-do each other, which comes down to working at the ever-increasing speed our technology dictates) that we have replaced analysis with unthinking acceptance of the simplest and most aggressive opinions? Research shows that compared to even twenty years ago, children read far less non-school literature and rely on the almost useless 'celebrity' shouters of social media for much of their information; there's not much chance of learning about informed, considered arguments via these sources!

After all, it's difficult for most of us to remember exact details of the weather a year ago, but understanding climate change relies on acceptance of directional trends over at least decades. How much easier is it to accept the opinions of those who preserve the status quo and claim we can maintain our current lifestyle with impunity? When combined with the Western capitalist notion of continuous growth and self-regulation, we see a not-so-subtle indoctrination that describes action to prevent climate change as disruptive to the fundamental aspects of the society that has arisen since the Industrial Revolution.

There is an old French saying that we get the government we deserve, which in Australia's case implies a widespread desire to ignore or even deny global warming. Yet the irony is that of all developed nations, Australia has been at the receiving end of some of its worst effects, thanks to an increase in average temperatures of more than a degree over the past century. It takes little effort to understand how this can lead to the drier conditions that have caused the horrific bush fires; even though some have been deliberately started, their scale has been exacerbated by the change in climate. So what until now has prevented Australians from tying the cause to the effects?

It's not as if there isn't plenty of real-world evidence. However, with computer technology able to generate 'deep fakes' at a level of sophistication that only experts can detect, is the public becoming mistrustful of the multitude of videos and photographs of melting polar caps and shrinking glaciers? When combined with the decreased trust in authority figures, scientists and their technical graphs and diagrams don't stand much of a chance of acceptance without a fair amount of suspicion. As mentioned, it's difficult to understand the subtleties inherent in much of science when you are running at breakneck speed just to stand still; slogans and comforting platitudes are much more palatable - unless of course people become caught up in the outcome themselves.

However, this doesn't explain why key phrases such as 'climate change' and 'global warming' generate such negative sentiment, even from those Australian farmers who admit to hotter, drier conditions than those experienced by their parents' and grandparents' generations. Somehow, these sober terms have become tainted as political slogans rather than scientifically-derived representations of reality. That this has been achieved by deniers seems incredible, when you consider that their message not only runs counter to the vast majority of the data but comes from many with vested interests in maintaining current industrial practices and levels of fossil fuel usage.

Could it simply be a question of semantics, with the much-used labels deemed unacceptable even while the causes of directly-experienced effects are accepted as valid? If so, it would suggest that our contemporary technological society differs little from the mindset of pre-industrial civilisation, in which leaders were believed to have at the very least a divine right to rule, or even a divine bloodline. In which case, is it appalling to suggest that the terrible bush fires have occurred not a minute too soon?

If it is only by becoming victims at the tip of the impending (melted) iceberg that global warming is deemed genuine, then so be it. When scientists are mistrusted and activists labelled as everything from misguided to corrupt, scheming manipulators, it may take a taste of what lies ahead to convince a majority who would otherwise keep doing as they always have done and trust politicians to do the thinking for them. I can think of nothing more apt to end on than another Carl Sagan quote: "For me, it is far better to grasp the Universe as it really is than to persist in delusion, however satisfying and reassuring."

Wednesday 27 November 2019

Ocean acidification: climate change at the sour end

A few weeks ago, I overheard a 58-year-old man telling a 12-year-old boy that the most dire of scientists' warnings concerning global warming over the past 30 years had failed to materialise - and that what the boy needed to learn was how to separate facts from propaganda.

Although it is no doubt next to impossible to change such entrenched mindsets as that of this particular baby boomer, there is still extremely limited public understanding of the insidious changes currently taking place in our oceans. In addition to the rise in both sea temperature and sea level (approaching a centimetre every two to three years), a rapid increase in ocean acidity is now on course to profoundly disrupt marine life.

With the USA pulling out of the Paris Agreement, will the rest of the world manage to pull together in order to prevent another tipping point? After all, increasing ocean acidification isn't something we non-marine scientists can directly observe. One key point that is immediately obvious is that it isn't a localised issue: as a third of atmospheric carbon dioxide is absorbed into the oceans, all the planet's seas will be affected. The decrease of 0.1 pH unit over the past few centuries equates to an astonishing 26-29% increase in acidity. What's more, this change is predicted to have doubled by the end of this century. Clearly, the effect on marine life is set to be substantial.
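Because pH is a base-10 logarithmic scale, that seemingly tiny 0.1 unit drop translates directly into the percentage above - a quick back-of-the-envelope check (a minimal Python sketch, nothing more):

    # pH = -log10(hydrogen ion concentration), so a fall of 0.1 pH unit
    # multiplies the hydrogen ion concentration by 10 ** 0.1.
    factor = 10 ** 0.1
    print(f"Acidity increase: about {(factor - 1) * 100:.0f}%")  # ~26%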

So what is being done to assess the probable issues? Various projects around the world are using mesocosms - transparent cylinders up to ten metres long - to understand the effects of current and predicted near-future acidity levels on marine life. Coral bleaching is possibly the one condition people will have heard of (although there appear to be an astonishing number of people who think that coral is a plant rather than an invertebrate animal), but sea temperature changes are as much a cause as increased acidity. Apart from causing stress to some marine organisms - leading to such conditions as weakened immune systems and so the spread of disease - acidification reduces the material available for shell and carapace formation, especially for juveniles and nauplii.

The problem isn't so much the change itself as the rate of change, which is far faster than normal geophysical processes. Indeed, one report states that over the past 20 million years, changes in oceanic acidification have been barely one percent of the current rate. Obviously, there is minimal chance of the non-directed mechanism of natural selection keeping pace with adaptations to the new conditions.

While many organisms will suffer, some, such as jellyfish and toxic algae, may benefit, with the latter leading to the poisoning of key fishing industry species. This in turn could lead to toxins entering the human food chain, on top of the economic issues from the decline in fish and shellfish stocks. Indeed, the US Pacific coast aquaculture industry is already experiencing a reduction in shellfish populations. This will be in addition to the pollution of fresh waterways already explored in a post last year.

Of the various experiments aiming to understand the impact of this rapid increase, the largest project is the pan-European Biological Impacts of Ocean Acidification (BIOACID) scheme. Giant mesocosms sunk in a Swedish fjord have been sealed with local ocean water (and its associated organisms), with half of them modified to the projected future pH level.

Similar but smaller projects are underway in New Zealand and the Canary Islands, with preservation of edible stocks a key priority. Another problem with a decline in shellfish species destined for human consumption would be the loss of the raw material for chitosan, which may prove to be an ecologically-friendly replacement for plastic packaging.

Clearly, there could be numerous - and some as yet unknown - knock-on effects from ocean acidification. Unlike the rise in atmospheric temperature, it is much more difficult to see the results of this fundamental change and for the public to understand the consequences. Yet again, the life forms affected are far from the cute poster species usually paraded to jump-start the public's environmental consciousness. Unfortunately, they may prove to be far more critical to the future of humanity and the wider world than, say, giant pandas or Amur leopards. It's time for some serious sci-comm to spread the warning message!

Sunday 23 June 2019

Spray and walk away? Why stratospheric aerosols could be saviours or destroyers

My first scientific encounters with aerosols weren't particularly good ones. In my early teens, I read that the CFC propellants used in aerosols were depleting the ozone layer. Therefore, tiny atmospheric particles had negative connotations for me from my formative years. This was further reinforced by Carl Sagan and Richard Turco's 1990 book A Path Where No Man Thought: Nuclear Winter and the End of the Arms Race, which discussed the potentially devastating effects of high-altitude aerosols around the world following a nuclear attack. Strike two against these pesky particles!

Of course aerosols aren't just man-made. The stratospheric dust particles generated by the Chicxulub impact event 66 million years ago are known to have been instrumental in the global climate disruption that wiped out the dinosaurs and many other life forms. This would have been in addition to the thousands of years of environmental changes caused by sulfur aerosols from the Deccan Traps flood basalt eruptions. Rather more recently, the Mount Tambora volcanic eruption in 1815 led to starvation and epidemics around the world for up to three years.

Now that our civilisation is generating a rapid increase in global temperatures, numerous solutions are being researched. One of the most recent areas involves reducing the amount of solar radiation reaching the Earth's surface. Several methods have been suggested for this, but this year sees a small-scale experiment to actually test a solution, namely seeding the atmosphere with highly reflective particles in an artificial recreation of a volcanic event. The Stratospheric Controlled Perturbation Experiment (SCoPEx) is a solar geoengineering project involving Harvard University that will use a balloon to release calcium carbonate in aerosol form at about twenty kilometres above the Earth's surface, analysing the local airspace the following day to assess the effects.

This experiment is controversial for several reasons. Firstly, it doesn't lead to any reduction in greenhouse gases or particulate pollutants; if anything, by sweeping the issue under a stratospheric rug, it could allow fossil fuel corporations to maintain production levels and reduce investment in alternatives. And if the recent reports by meteorologists are correct that natural and unintentional man-made aerosols are already mitigating global warming, then the gross effects of heat pollution must be higher than realised!

Next, this minute level of testing is unlikely to pinpoint issues that operational use might generate, given the chaotic nature of atmospheric weather patterns. To date, numerous computer simulations have been run, but bearing in mind how inaccurate weather forecasting is beyond ten days, nothing can be as accurate as the real thing. At what point, therefore, could a test prove that the process is effective and safe enough to be carried out on a global scale? It might require an experiment so large that it is both the research and the actual process itself!

The duration that the aerosols remain aloft is still not completely understood, hinting that regular replenishment would be essential. In addition, could the intentionally-polluted clouds capture greater amounts of water vapour, at first holding onto and then dropping their moisture so as to cause drought followed by deluge? Clouds cannot be contained within the boundaries of the testing nation, meaning other countries could suffer these unintended side-effects.

It may be that as a back-up plan, launching reflective aerosols into the stratosphere makes sense, but surely it makes much more sense to reduce greenhouse gas emissions and increase funding of non-polluting alternatives? The main emphasis from ecologists to date has been to remove human-generated substances from the environment, not add new ones in abundance. I'm all for thinking outside the box, but I worry that the only way to test this technique at a fully effective level involves an experiment so large as to be beyond the point of no return. Such chemical-based debacles as ozone depletion via chlorofluorocarbons (CFCs) prove that in just a matter of decades we can make profound changes to the atmosphere - and badly affect regions furthest removed from the source itself. So why not encourage more reducing, reusing and recycling instead?

Monday 10 June 2019

Defrosting dangers: global warming and the biohazards under the ice

Despite frequent news reports on the thawing of polar and glacial ice, there appears to be less concern shown towards this aspect of climate change than many others. Perhaps this is due to so few humans living in these regions; lack of familiarity with something helps us to ignore its true importance. The most obvious effects of melting ice are said to be the increase in atmospheric carbon, rising sea levels and unpredictable weather patterns, but there is another threat to our species that is only just beginning to be noticed - and as yet has failed to generate any mitigation plans.

A report last year confirmed a frightening cause behind the deaths back in 2015 of approximately half the world's remaining saiga antelope population: thanks to warmer and more humid weather, a type of bacteria usually confined to their noses had spread to the antelopes' bloodstream. Although not the sort of news to attract much attention even from nature-lovers, this ecological David and Goliath scenario looks set to be repeated in colder environments around the globe. Microscopic and fungal life forms that have been trapped or dormant for long periods, possibly millennia, may be on the verge of escaping their frozen confines.

The various film adaptations of John W. Campbell's 1938 novella Who Goes There? show the mayhem caused by an alien organism that has escaped its icy tomb. The real-life equivalents to this fictional invader are unlikely to be of extra-terrestrial origin, but they could prove at least as perilous, should climate change allow them to thaw out. The problem is easy to state: there is an enormous amount of dormant microbial life trapped in ice and permafrost that is in danger of escaping back into the wider ecosystem.

In the first quarter of the Twentieth Century over a million reindeer were killed by anthrax, with subsequent outbreaks occurring sporadically until as late as 1993. Recent years have seen the death of both farmers and their cattle from infection related to the thawing of a single infected reindeer carcass. In various incidents in 2016, dozens of Siberian herders and their families were admitted to hospital while Russian biohazard troops were flown in to run the clean-up operations. One issue is that until recently the infected animals - domesticated as well as wild - have rarely been disposed of to the recommended safety standards. Therefore, it doesn't take much for reactivated microbes to spread into environments where humans can encounter them.

Of course, the number of people and livestock living near glaciers and the polar caps is relatively low, but there are enormous regions of permafrost that are used by herders and hunters. Meltwater containing pathogens can get into local water supplies (conventional water treatment doesn't kill anthrax spores), or even reach further afield via the oceans - where some microbes can survive for almost two years. The record high temperatures in some of the Northern Hemisphere's permafrost zones are allowing the spread of dangerous biological material into regions that may not have seen it for centuries - or far longer.

Decades-old anthrax spores aren't the only worry. Potential hazards include the smallpox virus, which caused a Siberian epidemic in the 1890s and may be able to survive in a freeze-dried state in victims' corpses before - however unlikely - reviving due to warmer temperatures. In addition, it should be remembered that many of the diseases that infect Homo sapiens today only arose with the development of farming, being variants of bacteria and viruses that transferred across from our domestic livestock.

This would suggest that permafrost and ice sheets contain ancient microbes that our species hasn't interacted with for centuries - and to which we may therefore have minimal resistance. Although natural sources of radiation are thought to destroy about half of a bacterium's genome within a million years, there have been various - if disputed - claims of far older bacteria being revived, including some found in salt crystals said to be 250 million years old. In this particular case, their location deep underground is said to have minimised cosmic ray mutations and thus ensured their survival. Sounds like one for the Discovery Channel if you ask me, but never say never...

Even if this improbable longevity turns out to be inaccurate, it is known that dormant spore-forming bacteria such as those that cause tetanus and botulism could, like anthrax, be revived after decades of containment in permafrost. Fungal spores are likewise known to survive similar interment; with amphibian, bat and snake populations currently declining due to the rapid spread of fungal pathogens, the escape of such material shouldn't be taken lightly.

So can anything be done to prevent these dangers? Other than reversing the increase in global temperatures, I somehow doubt it. Even the locations of some of the mass burials from the twentieth-century reindeer epidemics have been lost, meaning those areas cannot be turned into no-go zones. Anthrax should perhaps be thought of as only one of a suite of biohazards that melting permafrost may be about to inflict on a largely uninformed world. The death of some remote animals and their herders may not earn much public sympathy, but if the revived pathogens spread to the wider ecosystem, there could be far more at stake. Clearly, ignorance is no protection from the microscopic, uncaring dangers now waking up in our warming world.

Sunday 10 March 2019

Buzzing away: are insects on the verge of global extinction?

It's odd how some of these posts get initiated. For this particular one, there were two driving factors. One was passing a new house on my way to work where apart from the concrete driveway, the front garden consisted solely of a large square of artificial grass; the owners are clearly not nature lovers! The second inspiration was listening to a BBC Radio comedy quiz show, in which the panel discussed the recent report on global insect decline without being able to explain why this is important, apart from a vague mention of pollination.

Insect biologists have long sung the praises of these unrewarded miniature heroes, from JBS Haldane's supposed adage about God being "inordinately fond of stars and beetles" to EO Wilson's 1987 speech that described them as "the little things that run the world." In terms of numbers of species and individuals, invertebrates, especially insects, are the great success story of macroscopic life on our planet. So if they are in serious decline, does that spell trouble for Homo sapiens?

The new research claims that one-third of all insect species are currently endangered, extrapolating to wholesale extinction for the class Insecta over the next century. Although the popular press has started using evocative phrases such as "insect genocide" and even "insectageddon", just how accurate are these dramatic claims?

The IUCN Red List currently describes three hundred insect species as critically endangered and a further seven hundred as vulnerable, but this is a tiny proportion of the total of... well, a lot more, at any rate. One oft-quoted figure is around one million insect species, although entomologists have estimated anywhere from 750,000 up to 30 million, with many species still lacking formal scientific identification. The hyperbole could therefore easily sound like unnecessary scaremongering, until you consider the details.

The new report states that butterflies and caddis flies are suffering the greatest decline, while cockroaches - which, as anyone who has faced a household infestation will know, are likely to remain around until the end of the world - and flies are the least affected orders. So, to paraphrase Monty Python, what have the insects ever done for us?

Pollination is of course of key importance, to both horticulture and unmanaged 'wild' environments. Insects are near the base of many food webs; if their numbers were much reduced, never mind removed, the impact on the rest of the ecosystem would be catastrophic. With the human population set to top ten billion in thirty years' time, we require ever larger regions of productive land for agriculture. They may be small at an individual level, but arthropods in general total about seventeen times the mass of all us H. sapiens. Insects also replenish the soil, as alongside bacteria they break down dead matter and faecal material. So important is this latter function that New Zealand has been trialling non-native dung beetles to aid cattle farmers.

One key way to save fresh water and lessen the generation of the potent greenhouse gas methane is to reduce meat consumption in favour of insect protein. If insects are no longer around, then that will be an additional challenge in reducing environmental degradation. This of course also ignores the fact that insects are already a component in the diet of many developing nations. Last year I wrote about how scientists have been creating advanced materials derived from animals. Again, we are shooting ourselves in the foot if we allow this ready-made molecular library to be destroyed.

What is responsible for this global decline? Perhaps unsurprisingly, it turns out to be the usual suspects. Agricultural chemicals including pesticides have been associated with honey-bee colony collapse disorder (not incidentally, some tests have found honey samples with neonicotinoids - the most widely-used insecticides - exceeding recommended safe levels for human consumption), so it seems likely the same culprits are affecting other insects. Fresh waterways, home to many aquatic insect species, are frequently as polluted as the soil, either due to agricultural run-off or industrial contaminants. Wild landscapes are being converted with great haste into farmland and urban sprawl, with an obviously much-reduced biota.

Climate change is playing its part too, with soil acidity increasing just as it is in the oceans. Even areas as remote as central Australia have seen marked decreases in insects, as higher temperatures and lower rainfall outpace their ability to adapt to the new conditions. I've often mentioned the role of invasive species in the decimation of indigenous vertebrates, but insects are equally prone to suffer from the arrival of newcomers. Although New Zealand has very strict biosecurity protocols, the likes of Queensland fruit flies and brown marmorated stink bugs are still occasionally found in or around ports of entry.

Many nations have no such procedures in place, resulting in local species being out-competed or killed by introduced species or pathogens to which they have no resistance. Until fairly recently, even New Zealand had a lax attitude to the issue, resulting in the decline of native species such as carabid beetles. When I conducted a brief survey of my garden in 2017 I found that one-third of the insect species were non-native, most of these being accidental imports since the arrival of European settlers.

If insects are so vital to our survival, why has there been so little interest in their well-being? There are some fairly obvious suggestions here. Firstly, at least in Western cultures, insects have been deemed dirty, ugly things that can be killed without a second thought. Wasps, ants and cockroaches in particular are seen in this light as unwelcome pests, with typical insect-related phrases including "creepy crawlies" and "don't let the bed bugs bite".

It's fairly well known that malaria-carrying mosquitoes are the most dangerous animals for us humans in terms of fatalities, and the widespread outbreaks of the Zika virus haven't done the insects' reputation any favours either. As Brian Cox's television series Wonders of Life showed, their small size has given them veritable super powers compared to us lumbering mammals, from climbing up sheer surfaces (as a praying mantis was doing a few nights ago on my window) to having amazing strength-to-weight ratios. All in all, insects are a bit too alien for their own good!

Clearly, scale prejudice is also a key factor. On a recent trip to Auckland Central Library I only found one book on insects versus dozens on birds. Photographic technology has been a double-edged sword when it comes to giving us a clearer picture of insects: close-ups are often greeted with revulsion, yet until Sir David Attenborough's 2005 BBC series Life in the Undergrowth, there was little attempt to film their behaviour with the same level of detail as say, the lions and antelopes of the Serengeti. It should also be mentioned that when Rachel Carson's ground-breaking book about the dangers of pesticides, Silent Spring, was published in 1962, the resulting environmentalism was largely in support of birds rather than insects.

Among all this doom and gloom, are there any ways to prevent it? One thing is for certain: it won't be easy. The agricultural sector would have to make drastic changes for a start, becoming much smarter in the use of chemicals and being held responsible for the local environment, including waterways. Vertical farming and other novel techniques could reduce the need for new agricultural land and water usage, but developing nations would be hard-pressed to fund these themselves.

Before any major undertaking, there's going to have to be either a fundamental crisis, such as food shortages, in a rich nation or a massive public relations exercise to convince people to consider insects in the same light as giant pandas or dolphins. This is not going to be easy, but as David Attenborough put it: "These small creatures are within a few inches of our feet, wherever we go on land - but often, they're disregarded. We would do very well to remember them."

Saturday 26 January 2019

Concrete: a material of construction & destruction - and how to fix it

How often is it that we fail to consider what is under our noses? One of the most ubiquitous of man-made materials - at least to the 55% of us who live in urban environments - is concrete. Our high-rise cities and power stations, farmyard silos and hydroelectric dams wouldn't exist without it. As it is, global concrete consumption has quadrupled over the past quarter century, making it second only to water as humanity's most-consumed substance. Unfortunately, it is also one of the most environmentally unfriendly materials on the planet.

Apart from what you might consider to be the aesthetic crimes of the bland, cookie-cutter approach to International Modernist architecture, there is a far greater issue: the environmental degradation caused by the concrete manufacturing process. Cement is a key component of the material, but generates around 8% of all carbon dioxide emissions worldwide. As such, there needs to be a 20% reduction over the next ten years in order to fulfil the Paris Agreement - yet it is thought there may be a 25% increase in demand for concrete during this time span, particularly from the developing world. Although lower-carbon cements are being developed, concrete production causes other environmental issues as well. In particular, sand and gravel extraction is bad for the local ecology, including catastrophic damage to the sea bed.
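Taken at face value, those two figures make the scale of the challenge clear: if total emissions must fall by 20% while output rises by 25%, the emissions per unit of concrete have to drop by roughly a third. A rough sketch of the arithmetic (the percentages are from above; the per-unit framing is my own):

    emissions_target = 0.80   # total cement-related CO2 must fall to 80% of today's level
    demand_growth = 1.25      # while concrete demand rises by 25%

    per_unit_emissions = emissions_target / demand_growth
    print(f"Each unit of concrete must emit about "
          f"{(1 - per_unit_emissions) * 100:.0f}% less CO2 than today")  # ~36%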

So are there any alternatives? Since the 1990s, television series such as Grand Designs have presented British, New Zealand and Australian-based projects for (at times) extremely sustainable houses made from materials such as shipping containers, driftwood, straw bales, even shredded newspaper. However, these are mostly the unique dream builds of entrepreneurs, visionaries and, let's face it, latter-day hippies. The techniques used might be suitable for domestic architecture, but they are impractical at a larger scale.

The US firm bioMASON studied coral in order to develop an alternative to conventional bricks, which generate large amounts of greenhouse gases during the firing process. The company uses a biomineralisation process, which basically consists of injecting microbes into nutrient-rich water containing sand and watching the rod-shaped bacteria grow into bricks over three to five days. It's still comparatively early days for the technology, so meanwhile, what about applying the three environmental 'Rs' of Reduce, Reuse and Recycle to conventional concrete design and manufacturing?

1 Reduce

3D printers are starting to be used in the construction industry to fabricate building and structural components, even small footbridges. Concrete extrusion designs require less material than conventional casting in timber moulds - not to mention removing the need for the timber itself. One common technique is to build up shapes such as walls from thin, stacked layers. The technology is time-effective too: walls can be built up at a rate of several metres per hour, which may induce companies to make the initial outlay for the printing machinery.

As an example of the low cost, a 35 square metre demonstration house was built in Austin, Texas, last year at a cost of US$10,000 - and it took only two days to build. This year may see an entire housing project built in the Netherlands using 3D-printed concrete. Another technique has been pioneered at Exeter University in the UK, using graphene as an additive to reduce the amount of concrete required. This greatly increases both the water resistance and strength compared to the conventional material, thus halving the material requirement.

2 Reuse

Less than a third of the material from conventionally-built brick and timber structures can be reused after demolition. The post-war construction industry has continually reduced the quality of the building materials it uses, especially in the residential sector; think of prefabricated roof trusses made of new-growth, comparatively unseasoned timber and held together by perforated connector plates. The intended lifespan of such structures could be as little as sixty years, with some integrated components such as roofing failing much sooner.

Compare this to Roman structures such as aqueducts and the Pantheon (the latter still being the world's largest unreinforced concrete dome) which are sound after two millennia, thanks to their volcanic ash-rich material and sophisticated engineering. Surely it makes sense to use concrete to construct long-lasting structures, rather than buildings that will not last as long as their architects? If the reuse of contemporary construction materials is minimal (about as far removed as you can get from the traditional approach of robbing out stone-based structures in their entirety) then longevity is the most logical alternative.

3 Recycle

It is becoming possible to both recycle other waste into concrete-based building materials and use concrete itself as a secure storage for greenhouse gases. A Canadian company called CarbonCure has developed a technique for permanently sequestering carbon dioxide in their concrete by converting it into a mineral during the manufacturing process, with the added benefits of increasing the strength of the material while reducing the amount of cement required.

As for recycling waste material as an ingredient, companies around the world have been developing light-weight concrete incorporating mixed plastic waste, the latter comprising anywhere from 10% to 60% of the volume, particularly with the addition of high density polyethylene.

For example, the New Zealand company Enviroplaz can use unsorted, unwashed plastic packaging to produce Plazrok, a polymer aggregate for creating a concrete which is up to 40% lighter than the standard material. In addition, the same company has an alternative to metal and fibreglass panels in the form of Plaztuff, a fully recyclable, non-corroding material which is one-seventh the weight of steel. It has even been used to build boats, as well as land-based items such as skips and playground furniture.

What might appear to be an intractable problem therefore has a variety of overlapping solutions that would allow sustainable development in the building and civil engineering sector. It is somewhat unfortunate, then, that the conservative nature of these industries has until recently stalled progress in replacing a massive pollutant with much more environmentally sound alternatives. Clearly, green architecture doesn't have to be the sole prerogative of the driftwood dreamers; young entrepreneurs around the world are seizing the opportunity to create alternatives to the destructive effects of construction.

Thursday 11 October 2018

Sonic booms and algal blooms: a smart approach to detoxifying waterways

A recent report here in New Zealand has raised some interesting issues around data interpretation and the need for independent analysis to minimise bias. The study has examined the state of our fresh water environment over the past decade, leading to the conclusion that our lakes and rivers are improving in water quality.

However, some of the data fails to support this: populations of freshwater macroinvertebrates remain low, following a steady decline over many decades. Therefore, while the overall tone of the report is one of optimism, some researchers have claimed that the data has been deliberately cherry-picked in order to present as positive a result as possible.

Of course, there are countless examples of interested parties skewing scientific data for their own ends, with government organisations and private corporations among the most common culprits. In this case, the recorded drop in nitrate levels has been promoted at the expense of the continued low population of small-scale fauna. You might well ask what use these worms, snails and insects are, but even a basic understanding of food webs shows that numerous native bird and freshwater fish species rely on these invertebrates for food. As I've mentioned so often, the apparently insignificant may play a fundamental role in sustaining human agriculture (yes, some other species practise farming too!)

So what is it that is preventing the invertebrates' recovery? The answer seems to be an increase in photosynthetic cyanobacteria, more commonly - and incorrectly - known as blue-green algae. If they are identified at all, it's as the health food supplement spirulina, available in smoothies and tablet form. However, most cyanobacteria species are not nearly as useful - or pleasant. To start with, their presence in water lowers the oxygen content, and thanks to fertiliser runoff - nitrogen and phosphorus in particular - they bloom exponentially wherever intensive farming occurs close to fresh water courses. Another agriculture-related issue is due to clearing the land for grazing: without trees to provide shade, rivers and streams grow warmer, encouraging algal growth. Therefore as global temperatures rise, climate change is having yet another negative effect on the environment.

Most species of cyanobacteria contain toxins that can severely affect animals much larger than fresh water snails. Dogs have been reported as dying in as little as a quarter of an hour from eating it, with New Zealand alone losing over one hundred and fifty pet canines in the past fifteen years; it's difficult to prevent consumption, since dogs seem to love the smell! Kiwis are no stranger to the phylum for other reasons, as over one hundred New Zealand rivers and lakes have been closed to swimmers since 2011 due to cyanobacterial contamination.

Exposure to contaminated water or eating fish from such an environment is enough to cause external irritation in humans and may even damage our internal organs and nervous system. Drinking water containing blue-green algae is even worse; considering that young children are comparable in size to some of the affected dogs, it is supposed that exposure could prove fatal to them too. Research conducted over the past few years also suggests that high-level contamination can lead to Lou Gehrig's disease, a.k.a. amyotrophic lateral sclerosis, the same condition that Stephen Hawking suffered from.

What research, you might ask, is being done to discover a solution to this unpleasant organism? Chemical additives including copper sulphate and calcium hypochlorite have been tried, but many are highly expensive, while the toxicity of others is such that fish and crustacean populations also suffer, so these are hardly suitable answers.

A more elegant solution has been under trial for the past two years, namely the use of ultrasound to sink the blue-green algae too deep to photosynthesise effectively, thus slowly killing them. A joint programme between New Zealand and the Netherlands uses a high-tech approach to identifying and destroying ninety per cent of each bloom. Whereas previous ultrasonic methods tended to be too powerful, thereby releasing algal toxins into the water, the new technique directly targets the individual algal species. Six tests per hour are used to assess water quality and detect the species to be eradicated. Once identified, the sonic blasts are calibrated for the target species and water conditions, leading to a slower death for the blue-green algae that avoids cell wall rupture and so prevents the toxins from escaping.
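The programme's implementation details aren't public as far as I know, but the workflow described above - frequent sampling, species identification, then a calibrated low-power dose - amounts to a simple monitoring loop. Purely as an illustration (every function, species name and setting below is a hypothetical stand-in, not taken from the actual system):

    import time

    SAMPLES_PER_HOUR = 6  # the sampling rate quoted above

    def sample_water():
        """Hypothetical sensor package: returns water-quality readings and the
        cyanobacteria species detected, if any."""
        return {"temperature_c": 18.5, "turbidity_ntu": 4.2}, "Microcystis"

    def calibrate_pulse(species, water_quality):
        """Hypothetical calibration: a low-power setting tuned to the target
        species and conditions, so cells sink without their walls rupturing."""
        return {"species": species, "power": "low", "frequency_khz": 40}

    def run_monitoring_cycle():
        water_quality, species = sample_water()
        if species is not None:  # a harmful bloom has been detected
            settings = calibrate_pulse(species, water_quality)
            print(f"Emitting calibrated ultrasound: {settings}")

    for _ in range(3):  # a few demonstration cycles
        run_monitoring_cycle()
        time.sleep(1)   # a real deployment would wait 3600 / SAMPLES_PER_HOUR seconds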

To return to the earlier comment about why the report's conclusions appear to have been given an unwarranted positive spin: the current and previous New Zealand governments have announced initiatives to clean up our environment and so live up to the tourist slogan of '100% Pure'. The latest scheme requires making ninety percent of the nation's fresh water environments swimmable by 2040, which seems something of a tall order without radical changes to agriculture, and the heavily polluting dairy sector in particular. The use of finely targeted sonic blasting therefore couldn't come a moment too soon.

Our greed and short-sightedness have allowed cyanobacteria to greatly increase at the expense of the freshwater ecosystem, not to mention domesticated animals. Advanced but small-scale technology has now been developed to reduce them to non-toxic levels, but it is yet to be implemented beyond the trial stage. Hopefully this eradication method will become widespread in the near future, a small victory in our enormous fight to right the wrongs of over-exploitation of the environment. But as with DDT, CFCs and numerous others, it does make me wonder how many more man-made time bombs could be ticking out there...

Wednesday 12 September 2018

Seasons of the mind: how can we escape subjective thinking?

According to some people I've met, the first day of spring in the Southern Hemisphere has been and gone with the first day of September. Not incidentally, there are also some, myself included, who think that it has suddenly started to feel a bit warmer. Apparently, though, the official start date is the spring equinox, during the third week of September. So on the one hand, the weather has been warming since the start of the month; but on the other, why should a planet follow neat calendrical conventions, i.e. the first of anything? Just how accurate is the official definition?

There are many who like to reminisce about how much better the summer weather was back in their school holidays. The rose-tinted memories of childhood can seem idyllic, although I also recall summer days of non-stop rain (I did grow up in the UK, after all). Our personal experiences, particularly during our formative years, can therefore promote an emotion-based response so deeply ingrained that we fail to consider it may be inaccurate. Subjectivity and wishful thinking are key to the human experience: how often do we remember the few hits and not the far more numerous misses? As science is practised by humans it is subject to the same lack of objectivity as anything else; only its built-in error-checking can steer practitioners onto a more rational course than in other disciplines.

What got me to ponder the above was that, on meeting someone a few months ago for the first time, almost his opening sentence was a claim that global warming isn't occurring and that instead we are on the verge of an ice age. I didn't have time for a discussion on the subject, so I filed that one for reply at a later date. Now seems like a good time to ponder what it is that leads people to make such assertions, seemingly contrary to the evidence.

I admit to being biased on this particular issue, having last year undertaken research for a post on whether agriculture has postponed the next glaciation (note that this woolly - but not mammoth, ho-ho - terminology is one of my bugbears: we are already in an ice age, but currently in an interglacial stage). Satellite imagery taken over the past few decades shows clear evidence of large-scale reductions in global ice cover. For example, the northern polar ice cap has been reduced by a third since 1980, with what remains only half its previous thickness. Even so, are three decades a long enough period to make accurate predictions? Isn't using a timescale comparable to a human lifespan just as bad as relying on personal experience?
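As for those satellite figures, taking them at face value (and treating them, purely for illustration, as changes in area and average thickness), the remaining volume works out at roughly a third of the 1980 value:

    area_fraction = 2 / 3        # 'reduced by a third' since 1980
    thickness_fraction = 1 / 2   # 'only half its previous thickness'
    volume_fraction = area_fraction * thickness_fraction
    print(f"Roughly {volume_fraction:.0%} of the 1980 ice volume remains")  # ~33%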

The UK's Met Office has confirmed that 2018 was that nation's hottest summer since records began - which in this instance only goes back as far as 1910. In contrast, climate change sceptics use a slight growth in Antarctic sea ice (contrary to the continent's steadily decreasing ice sheet) as evidence of climate equilibrium. Now I would argue that this growth is just a local drop in the global ocean, but I wonder if my ice age enthusiast cherry-picked this data to formulate his ideas? Even so, does he believe that all the photographs and videos of retreating glaciers, etc. have been faked by the twenty or so nations who have undertaken Earth observation space missions? I will find out at some point!

If we try to be as objective as possible, how can we confirm with complete certainty the difference between long term climate change and local, short term variability? In particular, where do you draw the line between the two? If we look at temporary but drastic variations over large areas during the past thousand years, there is a range of time scales to explore. The 15th to 18th centuries, predominantly the periods 1460-1550 and 1645-1715, contained climate variations now known as mini ice ages, although these may have been fairly restricted in geographic extent. Some briefer but severe, wide-scale swings can be traced to single events, such as the four years of cold summers following the Tambora eruption of 1815.

Given such variability over the past millennium, in itself a tiny fragment of geological time, how much certainty surrounds the current changes? The public have come to expect yes or no answers delivered with aplomb, yet some areas of science, such as climate studies, involve chaos mathematics and thus generate results based on levels of probability. What the public might consider vacillation, researchers consider the ultimate expression of scientific good practice. Could this lack of black-and-white certainty be why some media channels insist on providing a 'counterbalancing' viewpoint from non-expert sources, as ludicrous as this seems?

In-depth thinking about a subject relies upon compartmentalisation and reductionism. Otherwise, we would forever be bogged down in the details and never be able to form an overall picture. But this quantising of reality is not necessarily a good thing if it generates a false impression regarding cause and effect. By suffering from what Richard Dawkins calls the "tyranny of the discontinuous mind" we are prone to creating boundaries that just don't exist. In which case, could a line ever be drawn between short-term local variation and global climate change? Having said that, I doubt many climate scientists would use this as an excuse to switch to weather forecasting instead. Oh dear: this is beginning to look like a 'does not compute' error!

In a sense, of course, we are exceptionally lucky to have developed science at all. We rely on language to define our ideas, so we need a certain level of linguistic sophistication to achieve this focus; tribal cultures whose number words become imprecise beyond two are unlikely to make much headway in, for example, classifying the periodic table.

Unfortunately, our current obsession with generating information of every imaginable quality and then uploading it to all available channels for the widest possible audience inevitably leads to a tooth-and-claw form of meme selection. The upshot of this bombardment of noise and trivia is that an enormous amount of time is required just to filter it, with the knock-on effect that minimal time is left for identifying the most useful or accurate content rather than simply the most disseminated.

Extremist politicians have long been adept at exploiting this weakness to expound polarising phraseology that initially sounds good but lacks depth; they achieve cut-through with the simplest and loudest of arguments, fulfilling the desire most people have to fit into a rigid social hierarchy - as seen in many other primate species. The problem is that, in a similar vein to centrist politicians who can see both sides of an argument but whose rational approach negates emotive rhetoric, scientists are often stuck with the unappealing options of either taking a stand when the outcome is not totally clear, or facing accusations of evasion. There is a current trend, particularly espoused by politicians, to disparage experts, but discovering how the universe works doesn't guarantee hard-and-fast answers that arrive exactly when required and provide comfort blankets in a harsh world.

Where then does this leave critical thinking, let alone science? Another quote from Richard Dawkins is that "rigorous common sense is by no means obvious to much of the world". This pessimistic view of the human race is supported by many a news article but somewhat negated by the immense popularity of star science communicators, at least in a number of countries.

Both the methods and results of science need to find a space amongst the humorous kitten videos, conspiracy theorists and those who yearn for humanity to be the pinnacle and purpose of creation. If we can comprehend that our primary mode of thinking includes a subconscious baggage train of hopes, fears and distorted memories, we stand a better chance of seeing the world for how it really is and not how we wish it to be. Whether enough of us can dissipate that fog remains to be seen. Meanwhile, the ice keeps melting and the temperature rising, regardless of what you might hear...

Wednesday 27 September 2017

Cow farts and climate fiddling: has agriculture prevented a new glaciation?

Call me an old grouch, but I have to say that one of my bugbears is the use of the term 'ice age' when what is usually meant is a glacial period. We currently live in an interglacial (i.e. warmer) era, the last glaciation having ended about 11,700 years ago. These periods are part of the Quaternary glaciation, which has existed for almost 2.6 million years and is deserving of the name 'Ice Age', with alternating but irregular cycles of warm and cold. There, that wasn't too difficult now, was it?

What is rather more interesting is that certain geology textbooks published from the 1940s to 1970s hypothesised that the Earth is overdue for the next glaciation. Since the evidence suggests the last glacial era ended in a matter of decades, the proposed future growth of the ice sheets could be equally rapid. Subsequent research has shown this notion to be flawed, with reliance on extremely limited data leading to over-confident conclusions. In fact, current estimates put interglacial periods as lasting anywhere from ten thousand to fifty thousand years, so even without human intervention in global climate, there would presumably be little to panic about just yet.

Over the past three decades or so this cooling hypothesis has given way to the opposing notion of a rapid increase in global temperatures. You only have to read such recent news items as the breakaway of a six thousand square kilometre piece of the Antarctic ice shelf to realise something is going on, regardless of whether you believe it is manmade, natural or a combination of both. But there is a minority of scientists who claim there is evidence for global warming - and an associated postponement of the next glaciation - having begun thousands of years prior to the Industrial Revolution. This then generates two key questions:

  1. Has there been a genuine steady increase in global temperature or is the data flawed?
  2. Assuming the increase to be accurate, is it due to natural changes (e.g. orbital variations or fluctuations in solar output) or is it anthropogenic, that is caused by human activity?

As anyone with even a vague interest in or knowledge of climate understands, the study of temperature variation over long timescales is fraught with issues, with computer modelling often seen as the only way to fill in the gaps. Therefore, like weather forecasting, it is far from being an exact science (insert as many smileys here as deemed appropriate). Although there are climate-recording techniques involving dendrochronology (tree rings) and coral growth that cover the past few thousand years, and ice cores that go back hundreds of thousands, there are still gaps and assumptions that mean the reconstructions involve variable margins of error. One cross-discipline assumption is that species found in the fossil record thrived in environments - and crucially at temperatures - similar to their descendants today. All in all this indicates that none of the numerous charts and diagrams displaying global temperatures over the past twelve thousand years are completely accurate, being more along the lines of a reconstruction via extrapolation.

Having looked at some of these charts I have to say that, to my untrained eye, there is extremely limited correlation between them for the majority of the post-glacial epoch. There have been several short-term fluctuations in both directions in the past two thousand years alone, from the so-called Mediaeval Warm Period to the Little Ice Age of the Thirteenth to Nineteenth centuries. One issue of great importance is just how wide a region these two anomalous periods covered outside of Europe and western Asia. Assuming, however, that the gradual warming hypothesis is correct, what are the pertinent details?

Developed in the 1920s, the Milankovitch cycles provide a reasonable fit for the evidence of regular, long-term variations in the global climate. The theory states that changes in the Earth's orbit and axial tilt are the primary causes of these variations, although the timelines do not provide indisputable correlation. This margin of error has helped to lead other researchers towards an anthropogenic cause for a gradual increase in planet-wide warming since the last glaciation.
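
As a purely illustrative aside (and emphatically not the real insolation calculation used by palaeoclimatologists), the short Python sketch below sums three sine waves with the commonly quoted approximate Milankovitch periods - roughly 100,000 years for eccentricity, 41,000 for axial tilt and 23,000 for precession - to show how irregular-looking, quasi-periodic swings emerge when the cycles are combined. The amplitudes are arbitrary placeholders.

```python
# Toy illustration of combined Milankovitch-style cycles (not a real
# insolation model): three sinusoids with the commonly quoted approximate
# periods are summed. Amplitudes are arbitrary, chosen purely for illustration.
import math

ECCENTRICITY_KYR = 100.0   # ~100,000-year eccentricity cycle
OBLIQUITY_KYR = 41.0       # ~41,000-year axial-tilt cycle
PRECESSION_KYR = 23.0      # ~23,000-year precession cycle

def toy_forcing(kyr_before_present: float) -> float:
    """Sum of three sine waves as a stand-in for combined orbital forcing."""
    t = kyr_before_present
    return (1.0 * math.sin(2 * math.pi * t / ECCENTRICITY_KYR)
            + 0.6 * math.sin(2 * math.pi * t / OBLIQUITY_KYR)
            + 0.4 * math.sin(2 * math.pi * t / PRECESSION_KYR))

# Print the toy forcing every 10,000 years over the last 400,000 years.
for t in range(0, 401, 10):
    print(f"{t:>3} kyr BP: {toy_forcing(t):+.2f}")
```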

The first I heard of this was via Professor Iain Stewart's 2010 BBC series How Earth Made Us, in which he summarised the ideas of American palaeoclimatologist Professor William Ruddiman, author of Plows, Plagues and Petroleum: How Humans Took Control of Climate. Although many authors, Jared Diamond amongst them, have noted the effects of regional climate on local agriculture and indeed the society engaged in farming, Professor Ruddiman is a key exponent of the reverse: that pre-industrial global warming has resulted from human activities. Specifically, he argues that the development of agriculture has led to increases in atmospheric methane and carbon dioxide, creating an artificial greenhouse effect long before burning fossil fuels became ubiquitous. It is this form of climate change that has been cited as postponing the next glaciation, assuming that the current interglacial is at the shorter end of such timescales. Ruddiman's research defines two major causes for an increase in these greenhouse gases:

  1. Increased carbon dioxide emissions from burning vegetation, especially trees, as a form of land clearance, i.e. slash and burn agriculture.
  2. Increased methane from certain crops, especially rice, and from ruminant species, mostly cattle, sheep and goats.

There are of course issues surrounding many of the details, even down to accurately pinpointing the start dates of human agriculture around the world. The earliest evidence of farming in the Near East is usually dated to a few millennia after the end of the last glaciation, with animal husbandry preceding the cultivation of crops. One key issue concerns the lack of sophistication in estimating the area of cultivated land and ruminant population size until comparatively recent times, especially outside of Western Europe. Therefore the generally unsatisfactory data concerning global climate is accompanied by even less knowledge of the scale of agriculture across the planet for most of farming's existence.

The archaeological evidence in New Zealand proves without a doubt that the ancestors of today's Maori, who probably first settled the islands in the Thirteenth Century, undertook enormous land clearance schemes. Therefore even cultures remote from the primary agricultural civilisations have used similar techniques on a wide scale. The magnitude of these works challenges the assumption that until chemical fertilisers and pesticides were developed in the Twentieth Century, the area of land required per person had altered little since the first farmers. In a 2013 report Professor Ruddiman claims that the level of agriculture practiced by New Zealand Maori is just one example of wider-scale agricultural land use in pre-industrial societies.

As for the role played by domesticated livestock, Ruddiman goes on to argue that ice core data shows an anomalous increase in atmospheric methane from circa 3000 BCE onwards. He hypothesises that a rising human population led to a corresponding increase in the scale of agriculture, with rice paddies and ruminants the prime suspects. As mentioned above, the number of animals and size of cultivated areas remain largely conjectural for much of the period in question. Estimates suggest that contemporary livestock are responsible for 37% of anthropogenic methane and 9% of anthropogenic carbon dioxide, whilst cultivated rice may be generating up to 20% of anthropogenic methane. Extrapolating back in time allows the hypothesis to gain credence, despite the lack of exact data.
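
To show the kind of back-of-envelope extrapolation involved - and emphatically not Ruddiman's actual method - the sketch below scales a present-day livestock methane figure by a guessed ratio of ancient to modern herd sizes. Every number in it is a hypothetical placeholder, which is rather the point: the conclusion is only as good as the conjectured inputs.

```python
# Purely illustrative back-extrapolation of livestock methane; all values
# below are hypothetical placeholders, not sourced estimates.
PRESENT_LIVESTOCK_CH4_MT = 100.0    # assumed present-day emissions, Mt CH4/yr
PAST_TO_PRESENT_HERD_RATIO = 0.02   # assumed herd size c. 3000 BCE vs today

def extrapolated_emissions(present_mt: float, herd_ratio: float) -> float:
    """Assumes per-animal emissions are constant over time - a big assumption."""
    return present_mt * herd_ratio

estimate = extrapolated_emissions(PRESENT_LIVESTOCK_CH4_MT, PAST_TO_PRESENT_HERD_RATIO)
print(f"Toy estimate for c. 3000 BCE: {estimate:.1f} Mt CH4/yr")
```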

In addition, researchers both in support of and in opposition to pre-industrial anthropogenic global warming admit that the complexity of feedback loops, particularly with respect to the role of temperature variation in the oceans, further complicates matters. Indeed, such intricacy, including the potential latency between cause and effect, means that proponents of Professor Ruddiman's ideas could be using selective data for support whilst suppressing its antithesis. Needless to say, cherry-picking results is hardly model science.

There are certainly some intriguing aspects to this idea of pre-industrial anthropogenic climate change, but personally I think the jury is still out (as I believe it is for the majority of professionals in this area).  There just isn't the level of data to guarantee its validity and what data is available doesn't provide enough correlation to rule out other causes. I still think such research is useful, since it could well prove essential in the fight to mitigate industrial-era global warming. The more we know about longer term variations in climate change, the better the chance we have of understanding the causes - and potentially the solutions - to our current predicament. And who knows, the research might even persuade a few of the naysayers to move in the right direction. That can't be bad!

Tuesday 23 May 2017

Water, water, everywhere: the hidden holism of H2O

As in other northern regions of New Zealand, the summer of 2017 saw Auckland residents facing City Council requests to conserve water, as well as a hosepipe ban during March and April. It therefore seems ironic that the water shortage occurred at the same time as flooding in the west of the city; thanks to a tropical downpour - one of several so far this year - the equivalent of an entire month's rain fell over a day or two. Clearly, water shortages are going to become an ever-increasing issue, even in nations with well-developed infrastructure.

The British historian of science James Burke, writer-presenter of The Day the Universe Changed, also made three television series called Connections 1, 2 and 3 (in 1978, 1994 and 1997 respectively) which examined the historical threads linking scientific and technological advances with changes in other areas of society. Therefore I'd like to take a similarly holistic approach to the wonderful world of water consumption and see how it ties into the world in general.

Although the statistics vary - it's difficult to assess with any great precision - there are published figures suggesting that the populace of richer nations use up to 5000 litres of water each per day, mostly hidden in food production. Many websites now supply details of the amount of water used to grow certain crops and foodstuffs, so you can easily raise your guilt level simply by comparing your diet to the water involved in its generation; and that's without considering the carbon mileage or packaging waste, either!
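
As a minimal sketch of the sort of tally those websites perform - with made-up litres-per-kilogram figures standing in for a published water-footprint database - the following Python snippet adds up the 'hidden' water in a hypothetical day's eating.

```python
# Tally the 'hidden' water in one day's food. The litres-per-kilogram figures
# are illustrative placeholders; swap in values from a real water-footprint
# database for anything more than a guilt-inducing estimate.
WATER_FOOTPRINT_L_PER_KG = {
    "beef": 15000,       # placeholder figure
    "rice": 2500,        # placeholder figure
    "bread": 1600,       # placeholder figure
    "milk": 1000,        # per litre, treated as per kg for simplicity
    "vegetables": 300,   # placeholder figure
}

daily_diet_kg = {"beef": 0.15, "rice": 0.2, "bread": 0.2, "milk": 0.4, "vegetables": 0.5}

hidden_water = sum(WATER_FOOTPRINT_L_PER_KG[food] * kg
                   for food, kg in daily_diet_kg.items())
print(f"Hidden water in this example diet: {hidden_water:,.0f} litres per day")
```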

I've previously discussed the high environmental cost of cattle farming, with both dairy and beef herds being prominent culprits in water pollution as well as consumption. However, there are plenty of less-obvious foodstuffs proven to be notorious water consumers, for example avocados and almonds. Although the latter might be deemed a luxury food, much of the global supply is now used to make almond milk; with consumption increasing by up to 40% year-on-year, this is one foodstuff much in demand.

Even though it is claimed to require much less water than the equivalent volume of dairy produce, almond farming is still relevant due to the massive increase in bulk production, especially in California (home to 80% of the global almond harvest). The reasons for the popularity of almond milk are probably two-fold: firstly, the public is getting more health-conscious; and secondly, a reduction in or abstention from dairy produce is presumed to lessen food allergies and intolerances. These obviously link to prominent concerns in the West, in the form of high-calorie/low-exercise diets leading to mass obesity, and the over-use of cleaning chemicals in the home preventing children from developing robust immune systems. Clearly, there is a complex web when it comes to water and the human race.

Even in regions chronically short of water, such as California, more than three-quarters of fresh water usage is by agriculture. In order to conserve resources, is it likely that we may soon face greater taxes on commercially grown, water-hogging produce and bans on the home growing of crops that have a low nutrition-to-water-consumption ratio? I've recently read several books discussing probable issues over the next half century, with the humble lettuce appearing as a good example of the latter.

Talking of which, the wet and windy conditions in New Zealand over the past year - blamed at least partially on La Niña - have led to record prices for common vegetables: NZ$9 for a lettuce and NZ$10 for a cauliflower, even in major supermarket chains. British supermarkets were forced to ration some fruit and vegetables back in February, due to their Mediterranean growers suffering from storms and floods. This suggests that even for regions with sophisticated agricultural practices there is a fine line between too much and too little fresh water. Isn't it about time that the main food producers developed a more robust, not to mention future-proof, infrastructure, considering the increased impact that climate change is likely to have?

The world is also paying a heavy price for bottled water, a commercial enterprise that largely breaks all boundaries of common sense. In the USA alone it costs several thousand times as much as the equivalent volume of tap water, and there are reports of chemical leaching from reusing plastic bottles. As you might expect, there is also an extremely high environmental cost. This includes the fossil fuels used by bottling plants and transportation, the lowering of the water table (whose level is so critical in areas utilising less sophisticated farming technologies) and the impact of plastic waste: the USA only recycles about 23% of its plastic water bottles, resulting in 38 billion bottles dumped each year at a cost of around US$1 billion. All in all, bottled water for nations with highly developed infrastructure seems like an insane use of critical resources.
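
Taking the figures quoted above at face value (they are approximations), a quick sanity check shows what they imply about total consumption and the cost per discarded bottle.

```python
# Quick sanity check using the approximate figures quoted in the text.
recycling_rate = 0.23        # ~23% of US plastic water bottles recycled
bottles_dumped = 38e9        # ~38 billion bottles dumped per year
dumping_cost_usd = 1e9       # ~US$1 billion per year

total_bottles = bottles_dumped / (1 - recycling_rate)    # implied total consumed
cost_per_bottle = dumping_cost_usd / bottles_dumped      # implied cost per dumped bottle

print(f"Implied bottles consumed per year: {total_bottles / 1e9:.0f} billion")
print(f"Implied cost per dumped bottle: {cost_per_bottle * 100:.1f} US cents")
```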

Although accelerated population growth has become a widespread fear, there are indicators that later this century the global figure may peak at around nine billion and then level off. Increasing urbanisation is seen as a primary cause of this, and not just in developing nations; Auckland, for example (New Zealand's largest city by far), experienced 8% population growth in the seven years from 2006. A larger population obviously requires more food, but a more urban - and therefore generally better educated, higher income - populace tends to demand access to processed, non-local and above all water-intensive foods. China is the touchstone here, having seen a massive increase in fish and meat consumption over the past half century; the latter has risen from 8 million tons per year in 1978 to over 70 million tons in recent years.

It has been claimed that 70% of industrial waste generated in developing nations is dumped into water courses, meaning that there will be a massive cost for environmental clean-up before local sources can be fully utilised. The mass outbreak of E. coli in Hawke's Bay, New Zealand, in February this year shows that even developed nations are having difficulty maintaining water quality, whilst there has been a shocking admission of lead contamination above safe levels in 41 American states over the past three years. Does this mean bottled water - heretofore the lifeline of Western tourists abroad - is becoming a necessity in the West after all?

Some might argue that thanks to global warming there will be more water available due to the melting of polar caps and glaciers, which after all contain almost two-thirds of the world's fresh water resources. However, these sources are mostly located far from high-density populations, and once contaminated by seawater they require energy-intensive desalination technology. It's small comfort that current estimates suggest that by 2025 about 14% of the global population will rely on desalination plants for their fresh water needs.

In the West we tend to take clean, safe water completely for granted, but thanks to the demands of living in a society run on rampant consumerism - coupled with poor science education - everyday decisions are being made that affect the environment, waste critical resources and damage our own health. Pundits are predicting that water will be the new oil: liquid gold, a precious commodity to be fought over, if necessary. Surely this is one resource that all of us can do something to conserve, whether it is cutting down on water-intensive foodstuffs, using tap rather than bottled water, or simply turning off a tap sooner than usual!

Monday 8 May 2017

Weather with you: meteorology and the public perception of climate change

If there's one thing that appears to unite New Zealanders with the British it is the love of discussing the weather. This year has been no exception, with New Zealand's pre-summer forecasts - predicting average temperatures and rainfall - proving wildly inaccurate. La Niña has been blamed for what Wellingtonians have deemed a 'bummer summer', January having provided the capital with its fewest 'beach days' of any summer in the last thirty years. Sunshine hours, temperature, rainfall and wind speed data from the MetService support this as a nationwide trend; even New Zealand flora and fauna have been affected with late blossoming and reduced breeding respectively.

However, people tend to have short memories and often recall childhood weather as somehow superior to that of later life. Our rose-tinted spectacles make us remember long, hot summer school holidays and epic snowball fights in winter, but is this a case of remembering the hits and forgetting the misses (meteorologically speaking)? After all, there are few things more boring than a comment that the weather is the same as the previous ten comments and surely our memories of exciting outdoor ventures are more prominent than being forced to stay indoors due to inclement conditions?

Therefore could our fascination with weather but dubious understanding - or even denial - of climate change be due to us requiring personal or even emotional involvement in a meteorological event? Most of us have had the luck not to experience extreme weather (or 'weather bombs' as the media now term them), so unless you have been at the receiving end of hurricanes or flash floods the weather is simply another aspect of our lives, discussed in everyday terms and rarely examined in detail.

Since we feel affected by weather events that directly impact us (down to the level of 'it rained nearly every day on holiday but the locals said it had been dry for two months prior') we have a far greater emotional response to weather than we do to climate. The latter appears amorphous and almost mythical by comparison. Is this one of the reasons that climate change sceptics achieve such success when their arguments are so unsupported?

Now that we are bombarded with countless pieces of trivia, distracting us from serious analysis in favour of short snippets of multimedia edutainment, how can we understand climate change and its relationship to weather? The standard explanation is that weather is short term (covering hours, days or at most weeks) whilst climate compares annual or seasonal variations over far longer timeframes. Neil deGrasse Tyson in Cosmos: A Spacetime Odyssey made the great analogy that weather is like the zigzag path of a dog on a leash whereas its owner walks in a straight line from A to B. So far so good, but although a thirty-year averaging period is commonly used as a baseline, there is no single agreed duration for assessing climate variability.
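
To make the leash analogy concrete, here's a minimal sketch using entirely synthetic numbers (the warming rate and year-to-year noise are arbitrary assumptions, not measurements): individual years zigzag like the dog, while thirty-year averages trace the owner's steady path.

```python
# Dog-on-a-leash sketch with synthetic data: a slow linear 'climate' trend
# (the owner) plus random year-to-year 'weather' noise (the dog).
import random

random.seed(42)
TREND_PER_YEAR = 0.02   # assumed warming trend, degrees C per year
NOISE_SIGMA = 0.3       # assumed year-to-year variability, degrees C

years = range(1900, 2021)
temps = [TREND_PER_YEAR * (y - 1900) + random.gauss(0, NOISE_SIGMA) for y in years]

# Individual years wander, but 30-year means reveal the underlying trend.
for start in (1900, 1930, 1960, 1990):
    window = temps[start - 1900 : start - 1900 + 30]
    print(f"{start}-{start + 29} mean anomaly: {sum(window) / len(window):+.2f} C")
```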

This leads us to statistics. Everyone thinks they understand the word 'average', but an average can be the mean, the median or the mode. Since the start and end dates of the period can be varied, as can the scaling on infographics (a logarithmic axis, for example), a single set of statistics can be presented in a wide variety of ways.

The laws of probability rear their much-misinterpreted head too. The likelihood of variation may change wildly depending on the length of the timeframe: compare a five-year block to a century and you can see that climate statistics is a tricky business; what is highly improbable in the former period may be inevitable over the latter. As long as you are allowed to choose the timeframe, you can skew the data to support a favoured hypothesis. So much then for objective data!
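
Continuing with a synthetic series of the same kind, the sketch below compares a straight-line fit over the whole record with the most flattering five-year window it can find; with a modest trend buried in noise, a cherry-picked short window can even appear to show cooling. Again, all the numbers are arbitrary assumptions.

```python
# Compare a least-squares trend over a full synthetic century with the
# steepest-looking downward five-year window - an illustration of how the
# choice of timeframe can skew the story. All values are synthetic.
import random

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(1)
years = list(range(1920, 2020))
temps = [0.01 * (y - 1920) + random.gauss(0, 0.25) for y in years]  # trend + noise

print(f"Century-long trend: {slope(years, temps):+.4f} C/yr")

# Scan every five-year window and report the most downward-looking one.
best = min(range(len(years) - 5),
           key=lambda i: slope(years[i:i + 5], temps[i:i + 5]))
w_years, w_temps = years[best:best + 5], temps[best:best + 5]
print(f"Cherry-picked {w_years[0]}-{w_years[-1]} trend: {slope(w_years, w_temps):+.4f} C/yr")
```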

By comparison, if someone experiences a worse than expected summer, as per New Zealand in 2017, then that personal experience may well be taken as more important than all the charts of long-term climate trends. It might just be the blink of an eye in geological terms, but being there takes precedence over far less emotive science and mathematics.

Perhaps then we subconsciously define weather as something we feel we experience, whilst climate is a more abstract notion, perhaps a series of weather events codified in some sort of order? How else can climate change deniers, when faced with photographs proving glacial or polar cap shrinkage, offer explanations other than global warming?

This is where politics comes into the mix. Whereas weather has little obvious involvement with politics, climate has become heavily politicised in the past thirty years, with party lines in some nations (mentioning no names) clearly divided. Although some of the naysayers have begun to admit global warming appears to be happening - or at least that the polar caps and glaciers are melting - they stick to such notions that (a) it will be too slow to affect humans - after all, there have been far greater swings in temperature in both directions in previous epochs - and (b) it has natural causes. The latter implies there is little we can do to mitigate it (solar output may be involved, not just Earth-related causes) and so let's stick our head in the sand and do some ostrich impressions.

As an aside, I've just finished reading a 1988 book called Prehistoric New Zealand. Its three authors are a palaeontologist (Graeme Stevens), an archaeologist (Beverley McCulloch) and an environmental researcher (Matt McGlone), so the content covers a wide range of topics, including the nation's geology, climate, wildlife and human impact. Interestingly, the book states that, if anything, the climate appears to be cooling and the Earth is probably heading for the next glaciation!

Unfortunately no data is supplied to support this, but Matt McGlone has since confirmed that there is a wealth of data supporting the opposite conclusion. In 2008 the conservative American Heartland Institute published a list of 500 scientists it claimed supported the notion that current climate change has solely natural causes. McGlone was one of many scientists who asked for his name to be removed from this list, stating both his work and opinions were not in agreement with this idea.

So are there any solutions or is it simply the case that we believe what we personally experience but have a hard time coming to terms with less direct, wider-scale events? Surely there are enough talented science communicators and teachers to convince the public of the basic facts, or are people so embedded in the now that even one unseasonal rain day can convince them - as it did some random man I met on the street - that climate change is a myth?

Friday 23 December 2016

O Come, All ye Fearful: 12 woes for Christmas future

This month I thought I would try and adopt something of the Yuletide spirit by offering something short and sharp (if not sweet) that bears a passing resemblance to the carol The Twelve Days of Christmas. However, instead of gifts I'll be attempting to analyse twelve key concerns that humanity may face in the near future, some being more immediate - not to mention inevitable - than others.

I'll start off with the least probable issues then gradually work down to those most likely to have widespread effects during the next few decades. As it is meant to be a season of good cheer I'll even suggest a few solutions or mitigation strategies where these are applicable. The ultimate in low-carb gifts: what more could you ask for?

12: ET phones Earth. With the SETI Institute and Breakthrough Listen project leading efforts to pick up signals from alien civilisations, what are the chances that we might receive an extra-terrestrial broadcast in the near future? Although many people might deem this just so much science fiction, the contents of a translated message (or autonomous probe) could prove catastrophic. Whether it would spark faith-based wars or aid the development of advanced technology we couldn't control - or be morally fit enough to utilise - there may be as many negative issues as positive ones.

Solution: Keeping such information secret, especially the raw signal data, would be incredibly difficult. Whether an international translation project could be conducted in secret is another matter, with censorship allowing a regular trickle of the less controversial information into the public domain. Whilst this is the antithesis of good scientific practice, it could prove to be the best solution in the long term. Not that most politicians are ever able to see anything that way, however!

11. Acts of God. There is a multitude of naturally-occurring events that are outside of human control, both terrestrial (e.g. supervolcanoes, tsunamis) and extra-terrestrial, such as asteroid impacts. Again, until recently few people took much interest in the latter, although Hollywood generated some awareness via several rather poor movies in the late 1990s. The Chelyabinsk meteor of February 2013 (rather than meteorite, as most of the material exploded at altitude) led to around 1,500 injuries, showing that even a small object that doesn't reach the ground intact can cause havoc. Since 2000, there have been over twenty asteroid impacts or atmospheric break-ups with yields ranging from a kiloton up to half a megaton.

Solution: Although there are various projects to assess the orbits of near-Earth objects (NEOs), the development of technologies to deflect or destroy impactors requires much greater funding than is currently in place. Options range from devices that use just their velocity to knock NEOs off-course to the brute force approach of high-powered lasers and hydrogen bombs. However, with the cancellation of NASA's Ares V heavy launch vehicle it's difficult to see how such solutions could be delivered in time. Hopefully in the event something would be cobbled together pretty quickly!

10. Grey goo scenario. As defined by Eric Drexler in his 1986 book Engines of Creation, what if self-replicating nanobots (developed, for example, for medical purposes) break their programming and escape into the world, eating everything in their path? Similar to locust swarms, they would only be limited by the availability of raw materials.

Solution: The Royal Society's 2004 report on nanoscience declared that the possibility of von Neumann machines is some decades away and therefore of little concern to regulators. Since then, other research has suggested there should be limited need to develop such machines anyway. So that's good to know!

9. Silicon-destroying lifeforms. What if natural mutations lead to biological organisms that can seriously damage integrated circuitry? A motherboard-eating microbe would be devastating, especially in the transport and medical sectors, never mind the resulting communication network outages and financial chaos. This might sound as ridiculous as any low-grade science fiction plot, but in 1975 nylon-eating bacteria were discovered. Since then, research into the most efficient methods to recover metals from waste electronics has led to experiments in bioleaching. As well as bacteria, the fungus Aspergillus niger has been shown to break down the metals used in circuits.

Solution: As bioleaching is potentially cheaper and less environmentally damaging it could become widespread. Therefore it will be up to the process developers to control their creations. Fingers crossed, then!

8. NCB. Conventional weapons may be more commonplace, but the development of nuclear, chemical and biological weapons by rogue states and terrorist organisations is definitely something to be worried about. The International Atomic Energy Agency has a difficult time keeping track of all the radioactive material that is stolen or goes missing each year. As the 1995 fatal release of the nerve agent sarin on the Tokyo subway shows, terrorists are not unwilling to use weapons of mass destruction on the general public.

Solution: There's not much I can suggest here. Let's hope that the intelligence services can keep all the Dr Evils at bay.

7. Jurassic Park for real. At Harvard last year a chicken embryo's genes were tweaked in such a way as to create a distinctly dinosaurian snout rather than a beak. Although it may be some time before pseudo-velociraptors are prowling (high-fenced) reserves, what if genome engineering were used to develop Homo superior? A 2014 paper from Michigan State University suggests both intellectual and physical improvements via CRISPR-Cas9 technology are just around the corner.

Solution: If the tabloids are to be believed (as if), China will soon be editing human genomes, to fix genetic diseases as well as to generate enhanced humans. Short of war, what's to stop them?

[Image: Planet Earth wrapped as a Christmas present]

6. DIY weaponry. The explosion in 3D printers for the domestic market means that you can now make your own handguns. Although current designs wear out after a few firings, bullets are also being developed that won't shorten the weapon's lifespan. Since many nations have far more stringent gun laws than the USA, an increase in weaponry among the general public is just what we don't need.

Solution: How about smart locking systems on printers so they cannot produce components that could be used to build a weapon? Alternatively, there are now 3D printer models that can manufacture prototype bulletproof clothing. Not that I'd deem that a perfect solution!

5. Chemical catastrophe. There are plenty of chemicals no longer in production that might affect humanity or our agriculture. These range from the legacy effects of polychlorinated biphenyls (PCBs), known carcinogens, to the ozone depletion caused by CFCs, which could be hanging around the stratosphere for another century; this doesn't just result in increased human skin cancer - crops are also affected by the increased UVB.

Solution: We can only hope that current chemical development now has more rigorous testing and government regulation than that accorded to PCBs, CFCs, DDT, et al. Let's hope all that health and safety legislation pays off.

4. The energy crisis. Apart from the obvious environmental issues around fossil fuels, the use of fracking generates a whole host of problems on its own, such as the release of methane and contamination of groundwater by toxic chemicals, including radioactive materials.

Solution: More funding is required for alternatives, especially nuclear fusion (a notoriously expensive area to research). Iceland already generates virtually all of its electricity from renewables, whilst Portugal managed four consecutive days in May this year via wind, hydro, biomass and solar energy sources. Greater recycling and more incentives for buying electric and hybrid vehicles wouldn't hurt either!

3. Forced migration. The rise in sea levels due to melt water means that it won't just be Venice and small Pacific nations that are likely to become submerged by the end of the century. Predictions vary widely, but all in the same direction: even an increase of 150mm would be likely to affect over ten million people in the USA alone, with probably five times that number in China facing similar issues.

Solution: A reduction in greenhouse gas emissions would seem to be the thing. This requires more electric vehicles and less methane-generating livestock. Arnold Schwarzenegger's non-fossil fuel Hummers and ‘Less meat, less heat, more life' campaign would appear to be good promotion for the shape of things to come - if he can be that progressive, there's hope for everyone. Then of course there's the potential for far more insect-based foodstuffs.

2. Food and water. A regional change in temperature of only a few degrees can seriously affect crop production and the amount of water used by agriculture. Over 700 million people are already without clean water, with shortages affecting agriculture even in developed regions - Australia and California spring to mind. Apparently, it takes a thousand litres of water to generate a single litre of milk!

Solution: A few far-sighted Australian farmers are among those developing methods to minimise water usage, including some low-tech schemes that could be implemented anywhere. However, the really obvious solutions would be to reduce the human population and eat food that requires less water. Again, bug farming seems a sensible idea.

1. Preventing vegegeddon. A former professor at Oxford University told me that some of his undergraduates have problems relating directly to others, having grown up in an environment with commonplace communication via electronic interfaces. If that's the problem facing the intellectual elite, what hope for the rest of our species? Physical problems such as poor eyesight are just the tip of the iceberg: the human race is in severe danger of degenerating into low-attention ‘sheeple' (as they say on Twitter). Children are losing touch with the real world, being enticed into virtual environments that on the surface are so much more appealing. Without knowledge or experience of reality, even stable democracies are in danger of being ruled by opportunistic megalomaniacs, possibly in orange wigs.

Solution: Richard Louv, author of Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder, suggests children require unstructured time outdoors in order to gain an (occasionally painful) understanding of the real world: tree-climbing, fossicking, etc. Restricting time on electronic devices would seem to go hand in hand with this.

Well, that about wraps it up from me. And if the above seems somewhat scary, then why not do something about it: wouldn't working for a better future be the best Christmas present anyone could ever give?

Saturday 26 December 2015

Beetlemania: can eating insects help save the environment?

Christmas - along with Thanksgiving for Americans - is probably the most obvious time of the year when Westerners over-indulge in animal protein. However, this meatfest comes at a severe cost to the planet, as anyone who is environmentally aware is likely to know. Although many people have started making changes to mitigate climate change and pollution, compared to, say, recycling and reducing your carbon footprint, cutting down on meat seems to be far more challenging.

Actor and former California Governor Arnold Schwarzenegger has suggested Americans should have one or two meat-free days each week, but that's easier said than done in a continent raised on heaped platefuls of red meat. It isn't as if switching from cattle, sheep and goat to more unusual species would help either, as recent research confirms the likes of kangaroo and reindeer as sources of high methane emissions too. As a side note, it isn't just meat consumption that needs to be reduced; there's also dairy farming to consider. Does anyone really like soya milk? Mind you, I haven't tried almond milk yet...

United Nations reports suggest that greenhouse gas emissions from farming, primarily due to livestock and artificial fertilisers, have almost doubled in the past half century. As you might expect, these are likely to continue increasing at a similar rate over the next fifty years. In addition, vast tracts of Amazonian rainforest - amongst other unspoilt natural habitats - are being destroyed to make way for cattle grazing. At around three million acres lost each year, there's obviously not much in the way of sustainability about this particular development!

So is there any good news in all this culinary doom and gloom? Both Europe and especially North America have recently seen a profusion of companies marketing manufactured foods intended as meat replacements that are derived from, of all things, insects. These products range from burgers to crackers and usually offer little in the way of appearance or taste to indicate their source material. Is it possible that the future for developed nations could include the delights of grasshopper goulash and wormicelli pasta?

It isn't as strange as it sounds. Over a quarter of mankind routinely eats insects from several thousand species as part of their traditional diet, usually with the source animal obvious in the presentation. This makes sense for developing nations, since wild insects can be caught en masse, farmed bugs can be fed on cheap waste material that can't be converted into conventional animal feed - and of course they require comparatively little water. Although the material isn't being converted to highly processed foodstuffs, Thailand - with over 20,000 insect farms - is an example of a nation currently increasing its insect consumption.

The species used in the new 'hidden' insect foods vary widely, with crickets prominent on the menu. It isn't as straightforward as just killing the wee beasties and grinding them into powder, but many of the new American and European companies are conducting extensive research, developing mechanised processes that bode well for industrial-scale production.

The nutritional analysis shows promise to say the least, with some Hymenoptera species containing up to three times the protein yield of domestic cattle. The vitamin and mineral statistics are pretty good too, sometimes exceeding both farmed mammals and birds as well as plant staples such as soya beans. Not bad, considering that bug farming should prove to be at least four times as efficient as cattle husbandry.

Whether a trendy novelty can become mainstream remains to be seen, since the fledgling industry faces more than just the 'yuck' factor. As with much cutting-edge technology, legislation has yet to catch up: there could be safety concerns, with short shelf life, uncaught impurities or pollutants, and allergic reactions all potential factors that could inhibit wide-scale production.

Bug protein isn't the only dish on the table (see what I did there?) as there are even more sophisticated approaches to reducing the environmental degradation caused by meat production. One well-publicised technique has been the cultivation of animal flesh in vitro. However, it's only been a couple of years since the (nurturing? propagation?) of the first petri-dish burger, and so the process is still prohibitively expensive. By comparison, insects (bees and butterflies excepted) are not currently in short supply.

As someone who hasn't eaten any land-based flesh for over a quarter of a century - and yes, I try to be careful with which aquatic species I consume - I suppose I have a fairly objective opinion on this matter. It does seem to make environmental sense to pursue processed insect protein as a replacement for domesticated mammal and bird species, but how often has logic taken a backseat to prejudice and the irrational? I look forward to near-future developments, not least the massive brand campaigns that will no doubt be required to convert the Western public to the likes of Cricket crackers and Wormer schnitzel. Look out turkeys, your Christmases could be numbered...