Showing posts with label global warming. Show all posts

Tuesday 25 February 2020

Falling off the edge: in search of a flat Earth

It's just possible that future historians will label the 21st century as the Era of Extreme Stupidity. In addition to the 'Big Four' of climate change denial, disbelief in evolution by natural selection, young Earth creationism and the anti-vaxxers, there are groups whose oddball ideas have rather less impact on our ecosystem and ourselves. One group that I place in the same camp as UFO abductees and their probing fixation is the believers in a flat Earth.

Although on the surface this - admittedly tiny - percentage of people appears more amusing than harmful, their media visibility makes them a microcosm of the appalling state of science education and critical thinking in general. In addition, their belief in an immense, long-running, global conspiracy adds ammunition to those with similar paranoid delusions, such as the moon landing deniers. As an example of how intense those beliefs can be (at times there's just a whiff of religious fanaticism), the American inventor and stuntman 'Mad' Mike Hughes was recently killed flying a self-built rocket intended to prove that the Earth is a disc.

I won't bother to describe exactly what the flat Earthers take to be true, except that their current beliefs resemble a description of the late, great Terry Pratchett's fantasy Discworld - albeit without the waterfall around the edge of the disc. For anyone who wants to test the hypothesis themselves rather than rely on authority (the mark of a true scientist) there are plenty of observational methods to try. These include:
  1. Viewing the Earth's shadow on the Moon during a lunar eclipse
  2. Noticing that a sailing ship's mast disappears/reappears on the horizon after/before the hull
  3. How certain stars are only visible at particular latitudes
For anyone with a sense of adventure, you can also build a high-altitude balloon or undertake a HAHO skydive to photograph the Earth's curvature - from any point on the planet!
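
For the second method, the geometry is easy to quantify: on a sphere of radius R, an eye h metres above the water sees the horizon at a distance of roughly √(2Rh). A minimal sketch of the arithmetic (using the mean Earth radius and ignoring atmospheric refraction, which stretches these distances slightly):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean radius; oblate-spheroid pedants may adjust

def horizon_distance_km(eye_height_m: float) -> float:
    """Approximate distance to the horizon for an eye at the given height."""
    return math.sqrt(2 * EARTH_RADIUS_M * eye_height_m) / 1000

# An observer 2 m above the waterline sees the horizon at about 5 km, but a
# 30 m masthead stays in view out to the sum of the two horizon distances.
print(f"Horizon from 2 m up:  {horizon_distance_km(2):.1f} km")
print(f"30 m mast visible to: {horizon_distance_km(2) + horizon_distance_km(30):.1f} km")
```

Beyond that combined distance the mast too slips below the horizon - exactly the hull-first disappearance that a flat surface cannot produce.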

It's easy to suggest that perhaps our brains just aren't up to the task of deciphering the intricacies of a 13.7-billion-year-old universe, but basic experiments and observations made over two thousand years ago were enough for Greek scientists to confirm both the shape and size of our planet. So what has changed in the past century or so to turn back the clock, geophysically speaking?
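
That Greek size measurement is usually credited to Eratosthenes, and the arithmetic fits in a few lines (the figures below are rounded textbook values, not his original stadia):

```python
# Eratosthenes' method: at midsummer noon the Sun was overhead at Syene but
# cast a shadow at an angle of 7.2 degrees at Alexandria, roughly 800 km
# further north along the same meridian.
shadow_angle_deg = 7.2
syene_to_alexandria_km = 800

# 7.2 degrees is 1/50 of a full circle, so the circumference is 50x the distance.
circumference_km = (360 / shadow_angle_deg) * syene_to_alexandria_km
print(f"Estimated circumference: {circumference_km:,.0f} km")
```

The result, around 40,000 km, is remarkably close to the modern figure - no satellites required.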

The modern take on a flat Earth seems to have begun in the late 19th century, with an attempt - similar to that of contemporary mid-Western creationists - to ignore scientific discoveries that disagree with a literal interpretation of the Old Testament. Indeed, the forerunners of today's flat Earthers were anti-science in many respects, also denying that prominent enemy of today's Biblical literalists, evolution by natural selection. However, many of the 21st century's leading adherents to a disc-shaped Earth have more sympathy for, and interest in, scientific discoveries, even accepting such politically contentious findings as rapid, human-induced climate change.

This topic is laden with ironies, few greater than the fact that a large proportion of the evidence for global warming is supplied by space agencies such as NASA. The latter has long been claimed by the Flat Earth Society as a leading conspirator and purveyor of faked imagery in the promotion of a spherical Earth (yes, to all pedants: I know that strictly speaking our planet is an oblate spheroid, not perfectly spherical).

Today's flat Earth societies follow the typical pseudo-scientific / fringe approach, analysing the latest science theories for material they can cherry pick and cannibalise to support their ideas. In recent years they've even tackled key new developments such as dark energy; in fact, about the only area they are lagging behind in is the incorporation of elements involving quantum mechanics.

But for anyone with an understanding of parsimony or Occam's Razor, the physics of a flat Earth has about as much likelihood as Aristotle's crystalline spheres. It isn't just the special pleading for localised astrophysics (since the other planets are deemed spherical); isn't it obviously absurd that there could be a global conspiracy involving rival nations and potentially hundreds of thousands of people - with no obvious explanation of what the conspirators gain from the deception?

Even for the vast majority of the public with little interest in or understanding of the physics, most people considering the flat Earth hypothesis are presumably puzzled by this apparent lack of motivation. In a nutshell, what's in it for the conspirators? Until recently, NASA (nicknamed 'Never A Straight Answer') was the main enemy, but with numerous other nations and private corporations building space vehicles, there is now a plethora of conspiracy partners. Going back half a century to the height of the Cold War: why, for example, would the USA and Soviet Union have agreed to conspire? As yet, there hasn't been anything approaching a satisfactory answer; but as Carl Sagan said: "Extraordinary claims require extraordinary evidence."

Unlike most fringe groups, flat Earthers don't appear to favour other popular conspiracy theories over scientific evidence. Yet somehow, their ability to support ludicrous ideas whilst denying fundamental observations and the laws of physics in the light of so much material evidence is astonishing. Of course our species doesn't have a mental architecture geared solely towards rational, methodical thought processes, but the STEM advances that Homo sapiens has made over the millennia prove we are capable of suppressing the chaotic, emotional states we usually associate with young children.

Whether we can transform science education into a cornerstone topic, as daily-relevant as reading, writing and arithmetic, remains to be seen. Meanwhile, the quest continues for funding a voyage to find the Antarctic ice wall that prevents the oceans falling over the edge of the world. Monty Python, anyone?

Wednesday 22 January 2020

Wildfires and woeful thinking: why have Australians ignored global warming?

In a curious example of serendipity, I was thinking about a quote from the end of Carl Sagan's novel Contact ("For small creatures such as we the vastness is bearable only through love") just a few minutes before discovering his daughter Sasha Sagan's book For Small Creatures Such as We. Okay, so I didn't buy the book - due to the usual post-Christmas funds shortage - and cannot provide a review, but this indication of our place in the scale of creation is something that resonates deep within me.

I've often discussed how biased we are due to our physical size, especially when compared to other species we share the planet with. However, I've never really considered that other fundamental dimension, time. Another Carl Sagan quote echoes many a poet's rumination on our comparatively brief lifespan: "We are like butterflies who flutter for a day and think it is forever."

There's more to this than just a fairly familiar poetic conceit. Earlier this month I was given a brief taste of what it might be like to live on Mars, thanks to high-altitude dust and ash transported across the Tasman Sea from the Australian bush fires. By three o'clock in the afternoon a New Zealand summer's day was turned into an eerie orange twilight, with birds and nocturnal insects starting their evening routine some five hours early. There was even a faint powdery, acrid taste in the air, adding to the sense of other-worldliness.

Apart from the obvious fact that this is an example of how climate change in one nation can affect another, there is a more disturbing element to all this. Why is it that, despite the reports and general consensus of the global climate science community, Australians have shown a woeful lack of interest in, or indeed outright negativity towards, climate change?

Could it be that our society is now centred upon such short increments of time (competing businesses trying to out-do each other, which comes down to working at the ever-increasing speed our technology dictates) that we have replaced analysis with unthinking acceptance of the simplest and most aggressive opinions? Research shows that compared to even twenty years ago, children read far less non-school literature and rely on the almost useless 'celebrity' shouters of social media for much of their information; there's not much chance of learning about informed, considered arguments via these sources!

After all, it's difficult for most of us to remember exact details of the weather a year ago, but understanding climate change relies on acceptance of directional trends over at least decades. How much easier is it to accept the opinions of those who preserve the status quo and claim we can maintain our current lifestyle with impunity? When combined with the Western capitalist notion of continuous growth and self-regulation, we see a not-so-subtle indoctrination that describes action to prevent climate change as disruptive to the fundamental aspects of the society that has arisen since the Industrial Revolution.

There is an old French saying that we get the government we deserve, which in Australia's case implies a widespread desire to ignore or even deny global warming. Yet the irony is that of all developed nations, Australia has been at the receiving end of some of its worst effects, thanks to an average temperature increase of around one and a half degrees Celsius over the past century. It takes little cognition to understand how this can lead to the drier conditions that have caused the horrific bush fires; even though some have been deliberately started, their scale has been exacerbated by the change of climate. So what, until now, has prevented Australians from tying the cause to the effects?

It's not as if there isn't plenty of real-world evidence. However, with computer technology now able to generate 'deep fakes' of a sophistication that only experts can detect, is the public becoming mistrustful of the multitude of videos and photographs of melting polar caps and shrinking glaciers? When combined with the decreased trust in authority figures, scientists and their technical graphs and diagrams don't stand much of a chance of acceptance without a fair amount of suspicion. As mentioned, it's difficult to understand the subtleties inherent in much of science when you are running at breakneck speed just to stand still; slogans and comforting platitudes are much more palatable - unless of course people become caught up in the outcome themselves.

However, this doesn't explain why key phrases such as 'climate change' and 'global warming' generate such negative sentiment, even from those Australian farmers who admit to hotter, drier conditions than those experienced by their parents' and grandparents' generations. Somehow, these sober terms have become tainted as political slogans rather than scientifically-derived representations of reality. That this negativity has been achieved by deniers seems incredible, when you consider that not only does it run counter to the vast majority of report data but that it comes from many with vested interests in maintaining current industrial practices and levels of fossil fuel usage.

Could it simply be a question of semantics, with much-used labels deemed unacceptable even while the causes of directly-experienced effects are accepted as valid? If so, it would suggest that our contemporary technological society differs little from the mindset of pre-industrial civilisation, in which leaders were believed to have at the very least a divine right to rule, or even a divine bloodline. In which case, is it appalling to suggest that the terrible bush fires have occurred not a minute too soon?

If it is only by becoming victims at the tip of the impending (melted) iceberg that global warming is deemed genuine, then so be it. When scientists are mistrusted and activists labelled as everything from misguided to corrupt, scheming manipulators, perhaps only a taste of what lies ahead will convince a majority who would otherwise rather keep doing as they always have done and trust politicians to do the thinking for them. I can think of nothing more apt to end on than another Carl Sagan quote: "For me, it is far better to grasp the Universe as it really is than to persist in delusion, however satisfying and reassuring."

Wednesday 27 November 2019

Ocean acidification: climate change at the sour end

A few weeks ago, I overheard a 58-year-old man telling a 12-year-old boy that the most dire of scientists' warnings concerning global warming over the past 30 years had failed to materialise - and that what the boy needed to learn was to be able to separate facts from propaganda.

Although it is no doubt next to impossible to be able to change such entrenched mindsets as those of this particular baby boomer, there is still extremely limited public understanding of the insidious changes currently taking place in our oceans. In addition to the rise in both sea temperature and sea level (approaching a centimetre every two-to-three years) a rapid increase in ocean acidity is now on course to profoundly disrupt marine life.

With the USA pulling out of the Paris Agreement, will the rest of the world manage to pull together in order to prevent another tipping point? After all, increasing ocean acidification isn't something we non-marine scientists can directly observe. One key point that is immediately obvious is that it isn't a localised issue: as a third of atmospheric carbon dioxide is absorbed into the oceans, all the planet's seas will be affected. The decrease of 0.1 pH units over the past few centuries equates to an astonishing 26-29% increase in acidity. What's more, this change is predicted to double by the end of this century. Clearly, the effect on marine life is set to be substantial.
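
That percentage figure follows directly from the definition of pH as the negative base-10 logarithm of hydrogen ion concentration: a drop of 0.1 units means the concentration rises by a factor of 10^0.1. A quick sketch of the arithmetic (the 0.11-unit figure is included here only to show how the quoted range arises):

```python
def acidity_increase_percent(ph_drop: float) -> float:
    """Percentage rise in hydrogen ion concentration for a given pH drop."""
    return (10 ** ph_drop - 1) * 100

print(f"{acidity_increase_percent(0.1):.0f}%")   # 0.1 unit drop -> roughly 26%
print(f"{acidity_increase_percent(0.11):.0f}%")  # 0.11 unit drop -> roughly 29%
```

Small-looking moves on a logarithmic scale hide large changes in the underlying chemistry - which is partly why the issue is so easy for the public to underestimate.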

So what is being done to assess the probable issues? Various projects around the world are using mesocosms - transparent cylinders up to ten metres long - to understand the effects of current and predicted near-future acidity levels on marine life. Coral bleaching is possibly the one condition people will have heard of (although there appear to be an astonishing number of people who think that coral is a plant rather than an invertebrate animal), but sea temperature changes are as much a cause as increased acidity. Apart from causing stress to some marine organisms, leading to such conditions as lowered immune systems and so the spread of disease, acidification reduces the material available for shell and carapace formation, especially for juveniles and nauplii.

The problem isn't so much the change itself as the rate of change, which is far faster than normal geophysical processes. Indeed, one report states that over the past 20 million years, changes in oceanic acidification have been barely one percent of the current rate. Obviously, there is minimal chance of the non-directed mechanism of natural selection keeping pace with adaptations to the new conditions.

While many organisms will suffer, some, such as jellyfish and toxic algae, may benefit - with the latter leading to the poisoning of key fishing industry species. This in turn could lead to toxins entering the human food chain, on top of the economic issues from the decline in fish and shellfish stocks. Indeed, the US Pacific coast aquaculture industry is already experiencing a reduction in its shellfish populations. This will be in addition to the pollution of fresh waterways already explored in a post last year.

Of the various experiments aiming to understand the impact of the rapid increase, the largest project is the pan-European Biological Impacts of Ocean Acidification (BIOACID) scheme. Giant mesocosms sunk in a Swedish fjord have been sealed with local ocean water (and associated organisms) and half of them modified with the projected pH level.

Similar but smaller projects are underway in New Zealand and the Canary Islands, with preservation of edible stocks a key priority. Another problem with a decline in shellfish species destined for human consumption would be the loss of the raw material for chitosan, which may prove to be an ecologically-friendly replacement for plastic packaging.

Clearly, there could be numerous - and some as yet unknown - knock-on effects from the ocean acidification. Unlike the rise in atmospheric temperature, it is much more difficult to see the results of this fundamental change and for the public to understand the consequences. Yet again, the life forms affected are far from the cute poster species usually paraded to jump-start the public's environmental consciousness. Unfortunately, these may prove to be far more critical to the future of humanity and the wider world than say, giant pandas or Amur leopards. It's time for some serious sci-comm to spread the warning message!

Wednesday 30 October 2019

Our feline friends - not so miaowvellous after all?


I've published a few posts concerning citizen science, from the active participation in conservation-orientated projects here in New Zealand to the more passive involvement in distributed computing projects that I briefly mentioned back in 2012.

A type of public involvement in scientific research halfway between these examples has been developed to utilise the human ability to match up patterns, a skill which artificial intelligence is only just beginning to replicate. One early implementation of this was the Galaxy Zoo crowdsourced project, in which volunteers examined photographs taken by robotic, Earth-based telescopes in order to classify galaxies. Since 2009, the Zooniverse online portal has utilised more than one million volunteers to examine data on behalf of over fifty projects, many of which are within STEM disciplines.

Although initially often used for astronomy or astrophysics programmes, crowdsourcing platforms have latterly found an important role in conservation and biodiversity research. An example is the Smithsonian Institution-sponsored eMammal, which specialises in the examination of camera trap footage to identify the locations of animal species on a scale that could not be obtained by other means.

In line with the outcome of the perhaps too ambitious Predator Free 2050 programme, one project that may require the assistance of Zooniverse volunteers is the analysis of feral cat DNA from New Zealand's Auckland Island. The DNA, derived partially from fecal matter (nice), is being analysed to discover what the cats on the island are eating. Although this research aims to discover the best way to remove invasive species from Auckland Island (cats are known to predate on native seabird species), there now appears to be another issue caused by cats living near coastlines.

Over the past fifteen years a body of evidence from around the world has shown that cats are indirectly responsible for the deaths of marine mammals. This might sound rather unlikely, but the microbial culprit, Toxoplasma gondii, can only reproduce in the digestive system of cats. Both feral and domestic cats that catch and eat infected rodents or birds can acquire the parasite and pass it on via their fecal matter into the wider environment, through fresh water run-off or sewage outfalls. Eventually, it enters the marine food chain, reaching the apex in the form of cetaceans and pinnipeds, among others.

Species such as sea otters, seals, and dolphins have been killed by toxoplasmosis, according to autopsies of specimens washed up on seashores as far apart as New Zealand and the USA. Increasing temperatures (thanks again, man-made climate change) and greater rainfall can spread toxoplasmosis even further. In addition to direct contamination from fecal matter, cat owners who flush cat litter down the toilet can also start the highly resilient microbes on a journey via sewer networks to the ocean. Among the New Zealand species proven to have been killed by infection are the critically endangered Maui dolphin and locally vulnerable Hector’s dolphin, so there is definitely a need for some prompt action.

It isn't just a case of the top marine predators eating infected fish or squid: sea mammals could swallow oocysts (basically, the protozoan equivalent of a fertilised egg) directly from water. Only now that Maui dolphins are falling victim to the parasite is the story of this deadly microbe becoming better known. Not incidentally, our species can also become ill with toxoplasmosis due to exposure to cat feces, with serious consequences for babies born to infected mothers and to people with compromised immune systems. In addition to the other potential dangers from the likes of Salmonella, Listeria and E. coli, the recent fad for 'raw' (i.e. unpasteurised) milk could lead to a far higher rate of toxoplasmosis in humans.

What can be done? Well, cat owners could stop flushing kitty litter down their toilets for a start. Is it a case that there are just too many cats in the world? Some recent reports claim that Homo sapiens and their domesticated species constitute 96% of the global mammal biomass. As for cat numbers, an estimate last year suggested that there are six hundred million pet cats and the same number of feral individuals worldwide.

Is this just too many? I admit that I'm fairly biased as it is: a few cat owners I know here in Auckland have pets that regularly kill skinks and it's only luck that these are invasive rainbow skinks rather than rare native species. When it comes to the likes of the last 55 Maui dolphins falling prey to a disease spread by an extremely common domesticated species, I'd rather be over-zealous than over-cautious in developing a solution. As far as I can see, the best control methods would be a vast reduction in cat numbers or the development of an inoculation for our feline friends that can kill the parasite. Somehow I doubt either course of action is likely, which means a far from purrfect method would be to educate cat owners as to how to minimise the spread of Toxoplasma gondii. So if you are a cat owner, or know of one, I guess this could be your time to shine...

Sunday 23 June 2019

Spray and walk away? Why stratospheric aerosols could be saviours or destroyers

My first scientific encounters with aerosols weren't particularly good ones. In my early teens, I read that the CFC propellants used in aerosol cans were depleting the ozone layer. Therefore, tiny atmospheric particles had negative connotations for me from my formative years. This was further reinforced by Carl Sagan and Richard Turco's 1990 book A Path Where No Man Thought: Nuclear Winter and the End of the Arms Race, which discussed the potentially devastating effects of high-altitude aerosols around the world following a nuclear attack. Strike two against these pesky particles!

Of course aerosols aren't just man-made. The stratospheric dust particles generated following the Chicxulub impact event 66 million years ago are known to have been instrumental in the global climate disruption that wiped out the dinosaurs and many other life forms. This would have been in addition to the thousands of years of environmental changes caused by sulfur aerosols from the Deccan Traps flood basalt eruptions. Rather more recently, the Mount Tambora volcanic eruption in 1815 led to starvation and epidemics around the world for up to three years.

Now that our civilisation is generating a rapid increase in global temperatures, numerous solutions are being researched. One of the most recent areas involves reducing the amount of solar radiation reaching the Earth's surface. Several methods have been suggested for this, but this year sees a small-scale experiment to actually test a solution, namely seeding the atmosphere with highly reflective particles in an artificial recreation of a volcanic event. The Stratospheric Controlled Perturbation Experiment (SCoPEx) is a solar geoengineering project involving Harvard University that will use a balloon to release calcium carbonate in aerosol form at about twenty kilometres above the Earth's surface, analysing the local airspace the following day to assess the effects.

This experiment is controversial for several reasons. Firstly, it doesn't lead to any reduction in greenhouse gases and particulate pollutants; if anything, by sweeping the issue under a stratospheric rug, it could allow fossil fuel corporations to maintain production levels and reduce investment in alternatives. And if the recent reports by meteorologists are correct that natural and unintentional man-made aerosols are already mitigating global warming, then the underlying warming must be greater than realised!

Next, this sort of minute level of testing is unlikely to pinpoint issues that operational use might generate, given the chaotic nature of atmospheric weather patterns. To date, numerous computer simulations have been run, but bearing in mind how inaccurate weather forecasting is beyond ten days, nothing can be as accurate as the real thing. Therefore at what point could a test prove that the process is effective and safe enough to be carried out on a global scale? It might require an experiment so large that it constitutes both the research and the actual process itself!

The duration that the aerosols remain aloft is still not completely understood, hinting that regular replenishment would be essential. In addition, could the intentionally-polluted clouds capture greater amounts of water vapour, at first holding onto and then dropping their moisture so as to cause drought followed by deluge? Clouds cannot be contained within the boundaries of the testing nation, meaning other countries could suffer these unintended side-effects.

It may be that as a back-up plan, launching reflective aerosols into the stratosphere makes sense, but surely it makes much more sense to reduce greenhouse gas emissions and increase funding of non-polluting alternatives? The main emphasis from ecologists to date has been to remove human-generated substances from the environment, not add new ones in abundance. I'm all for thinking outside the box, but I worry that the only way to test this technique at a fully effective level involves such a large-scale experiment as to be beyond the point of no return. Such chemical-based debacles as ozone depletion via chlorofluorocarbons (CFCs) prove that in just a matter of decades we can make profound changes to the atmosphere - and badly affect regions furthest removed from the source itself. So why not encourage more reducing, reusing and recycling instead?

Monday 10 June 2019

Defrosting dangers: global warming and the biohazards under the ice

Despite frequent news reports on the thawing of polar and glacial ice, there appears to be less concern shown towards this aspect of climate change than many others. Perhaps this is due to so few humans living in these regions; lack of familiarity with something helps us to ignore its true importance. The most obvious effects of melting ice are said to be the increase in atmospheric carbon, rising sea levels and unpredictable weather patterns, but there is another threat to our species that is only just beginning to be noticed - and as yet has failed to generate any mitigation plans.

A report last year confirmed a frightening cause behind the deaths back in 2015 of approximately half the world's remaining saiga antelope population: thanks to warmer and more humid weather, a type of bacteria usually confined to their noses had spread to the antelopes' bloodstream. Although not the sort of news to attract much attention even from nature-lovers, this ecological David and Goliath scenario looks set to be repeated in colder environments around the globe. Microscopic and fungal life forms that have been trapped or dormant for long periods, possibly millennia, may be on the verge of escaping their frozen confines.

The various film adaptions of John W. Campbell's 1938 novella Who Goes There? show the mayhem caused by an alien organism that has escaped its icy tomb. The real-life equivalents to this fictional invader are unlikely to be of extra-terrestrial origin, but they could prove at least as perilous, should climate change allow them to thaw out. The problem is easy to state: there is an enormous amount of dormant microbial life trapped in ice and permafrost that is in danger of escaping back into the wider ecosystem.

In the first quarter of the Twentieth Century over a million reindeer were killed by anthrax, with subsequent outbreaks occurring sporadically until as late as 1993. Recent years have seen the death of both farmers and their cattle from infection related to the thawing of a single infected reindeer carcass. In various incidents in 2016, dozens of Siberian herders and their families were admitted to hospital while Russian biohazard troops were flown in to run the clean-up operations. One issue is that until recently the infected animals - domesticated as well as wild - have rarely been disposed of to the recommended safety standards. Therefore, it doesn't take much for reactivated microbes to spread into environments where humans can encounter them.

Of course, the numbers of people and livestock living near glaciers and the polar caps is relatively low, but there are enormous regions of permafrost that are used by herders and hunters. Meltwater containing pathogens can get into local water supplies (conventional water treatment doesn't kill anthrax spores), or even reach further afield via oceans - where some microbes can survive for almost two years. The record high temperatures in some of the Northern Hemisphere's permafrost zones are allowing the spread of dangerous biological material into regions that may not have seen them for centuries - or far longer.

Decades-old anthrax spores aren't the only worry. Potential hazards include the smallpox virus, which caused a Siberian epidemic in the 1890s and may be able to survive in a freeze-dried state in victims' corpses before - however unlikely - reviving due to warmer temperatures. In addition, it should be remembered that many of the diseases that infect Homo sapiens today only arose with the development of farming, being variants of bacteria and viruses that transferred across from our domestic livestock.

This would suggest that permafrost and ice sheets include ancient microbes that our species hasn't interacted with for centuries - and which we may therefore have minimal resistance to. Although natural sources of radiation are thought to destroy about half of a bacterium's genome within a million years, there have been various - if disputed - claims of far older bacteria being revived, including those found in salt crystals said to be 250 million years old. In this particular case, their location deep underground is said to have minimised cosmic ray mutations and thus ensured their survival. Sounds like one for the Discovery Channel if you ask me, but never say never...
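
That half-the-genome-per-million-years figure gives a sense of why the 250-million-year claim attracts so much dispute. A minimal sketch, assuming simple exponential decay and ignoring the reduced radiation argued for deep salt deposits:

```python
def genome_fraction_intact(age_years: float, half_life_years: float = 1e6) -> float:
    """Fraction of a dormant genome left undamaged, modelled as exponential
    decay with the stated ~1 million year half-life."""
    return 0.5 ** (age_years / half_life_years)

print(genome_fraction_intact(1e6))    # 0.5 after one million years
print(genome_fraction_intact(250e6))  # vanishingly small after 250 million years
```

After 250 half-lives, the surviving fraction is on the order of 10^-76 - effectively nothing - which is why any genuine revival would demand an extraordinary shielding mechanism.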

Even if this improbable longevity turns out to be inaccurate, it is known that dormant spore-forming bacteria such as those leading to tetanus and botulism could, like anthrax, be revived after decades of containment in permafrost. Fungal spores are likewise known to survive similar interments; with amphibian, bat and snake populations currently declining due to the rapid spread of fungal pathogens, the escape of such material shouldn't be taken lightly.

So can anything be done to prevent these dangers? Other than reversing the increase in global temperatures, I somehow doubt it. Even the locations of some of the mass burials during the twentieth-century reindeer epidemics have been lost, meaning those areas cannot be turned into no-go zones. Anthrax should perhaps be thought of as only one of a suite of biohazards that melting permafrost may be about to inflict on a largely uninformed world. The death of some remote animals and their herders may not earn much public sympathy, but if the revived pathogens spread to the wider ecosystem, there could be far more at stake. Clearly, ignorance is no protection from the microscopic, uncaring dangers now waking up in our warming world.

Sunday 10 March 2019

Buzzing away: are insects on the verge of global extinction?

It's odd how some of these posts get initiated. For this particular one, there were two driving factors. One was passing a new house on my way to work where, apart from the concrete driveway, the front garden consisted solely of a large square of artificial grass; the owners are clearly not nature lovers! The second inspiration was listening to a BBC Radio comedy quiz show, in which the panel discussed the recent report on global insect decline without being able to explain why this is important, apart from a vague mention of pollination.

Insect biologists have long sung the praises of these unrewarded miniature heroes, from JBS Haldane's supposed adage about God being "inordinately fond of stars and beetles" to EO Wilson's 1987 speech that described them as "the little things that run the world." In terms of numbers of species and individuals, invertebrates, especially insects, are the great success story of macroscopic life on our planet. So if they are in serious decline, does that spell trouble for Homo sapiens?

The new research claims that one-third of all insect species are currently endangered, extrapolating to wholesale extinction for the class Insecta over the next century. Although the popular press has started using evocative phrases such as "insect genocide" and even "insectageddon", just how accurate are these dramatic claims?

The IUCN Red List currently describes three hundred insect species as critically endangered and a further seven hundred as vulnerable, but this is a tiny proportion of the total of... well, a lot more, at any rate. One oft-quoted figure is around one million insect species, although entomologists have estimated anywhere from 750,000 up to 30 million, with many species still lacking formal scientific identification. The hyperbole could therefore easily sound like unnecessary scaremongering, until you consider the details.
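Those Red List numbers can be put in perspective with a quick back-of-the-envelope calculation, using the species-total estimates quoted above (all of them rough figures, not precise counts):

```python
# Rough share of insect species formally listed as at risk,
# using the figures quoted in the text (all values are estimates).
critically_endangered = 300
vulnerable = 700
at_risk = critically_endangered + vulnerable

# Entomologists' estimates of the total number of insect species
# vary enormously, so try the low, middle and high figures.
for total in (750_000, 1_000_000, 30_000_000):
    share = at_risk / total * 100
    print(f"{at_risk} of ~{total:,} species = {share:.4f}%")
```

Even at the lowest species estimate, the formally assessed at-risk fraction is barely a tenth of one percent - which says more about how few insect species have ever been assessed than about how many are actually safe.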

The new report states that butterflies and caddis flies are suffering the greatest decline, while the least affected orders are flies and cockroaches - which, as anyone who has faced a household infestation will know, are likely to be around until the end of the world. So, to paraphrase Monty Python, what have the insects ever done for us?

Pollination is of course of key importance, to both horticulture and un-managed 'wild' environments. Insects are near the base of many food webs; if their numbers were much reduced, never mind removed entirely, the impact on the rest of the ecosystem would be catastrophic. With the human population set to top ten billion in thirty years' time, we require ever larger regions of productive land for agriculture. They may be small at an individual level, but arthropods in general total about seventeen times the mass of all of us H. sapiens. Insects also replenish the soil, as alongside bacteria they break down dead matter and fecal material. So important is this latter function that New Zealand has been trialling non-native dung beetles to aid cattle farmers.

One key way to save fresh water and lessen the generation of the potent greenhouse gas methane is to reduce meat consumption in favour of insect protein. If insects are no longer around, then that will be an additional challenge in reducing environmental degradation. This of course also ignores the fact that insects are already a component in the diet of many developing nations. Last year I wrote about how scientists have been creating advanced materials derived from animals. Again, we are shooting ourselves in the foot if we allow this ready-made molecular library to be destroyed.

What is responsible for this global decline? Perhaps unsurprisingly, it turns out to be the usual suspects. Agricultural chemicals including pesticides have been associated with honey-bee colony collapse disorder (not incidentally, some tests have found honey samples with neonicotinoids - the most widely used insecticides - exceeding the recommended human dosage), so it seems likely the same culprits are affecting other insects. Fresh waterways, home to many aquatic insect species, are frequently as polluted as the soil, either due to agricultural run-off or industrial contaminants. Wild landscapes are being converted with great haste into farm land and urban sprawl, with an obviously much-reduced biota.

Climate change is playing its part, with soil acidity increasing just as it is in the oceans. Even areas as remote as central Australia have seen marked decreases in insects as higher temperatures and lower rainfall outpace the ability to adapt to the new conditions. I've often mentioned the role of invasive species in the decimation of indigenous vertebrates, but insects are equally prone to suffer from the arrival of newcomers. Although New Zealand has very strict biosecurity protocols, the likes of Queensland fruit flies and brown marmorated stink bugs are still occasionally found in or around ports of entry.

Many nations have no such procedures in place, resulting in local species being out-competed or killed by introduced species or pathogens to which they have no resistance. Until fairly recently, even New Zealand had a lax attitude to the issue, resulting in the decline of native species such as carabid beetles. When I conducted a brief survey of my garden in 2017 I found that one-third of the insect species were non-native, most of these being accidental imports since the arrival of European settlers.

If insects are so vital to our survival, why has there been so little interest in their well-being? There are some fairly obvious suggestions here. Firstly, at least in Western cultures, insects have been deemed dirty, ugly things that can be killed without a second thought. Wasps, ants and cockroaches in particular are seen as unwelcome pests, with typical insect-related phrases including "creepy crawlies" and "don't let the bed bugs bite".

It's fairly well-known that malaria-carrying mosquitoes are the most dangerous animals for us humans in terms of fatalities. The widespread outbreaks of the Zika virus haven't done them any favours either. As Brian Cox's television series Wonders of Life showed, their small size has given them veritable super powers compared to us lumbering mammals, from climbing up sheer surfaces (as a praying mantis was doing a few nights ago on my window) to having amazing strength-to-weight ratios. All in all, insects are a bit too alien for their own good!

Clearly, scale prejudice is also a key factor. On a recent trip to Auckland Central Library I only found one book on insects versus dozens on birds. Photographic technology has been a double-edged sword when it comes to giving us a clearer picture of insects: close-ups are often greeted with revulsion, yet until Sir David Attenborough's 2005 BBC series Life in the Undergrowth, there was little attempt to film their behaviour with the same level of detail as say, the lions and antelopes of the Serengeti. It should also be mentioned that when Rachel Carson's ground-breaking book about the dangers of pesticides, Silent Spring, was published in 1962, the resulting environmentalism was largely in support of birds rather than insects.

Among all this doom and gloom, are there any ways to prevent it? One thing is for certain: it won't be easy. The agricultural sector would have to make drastic changes for a start, becoming much smarter in the use of chemicals and being held responsible for the local environment, including waterways. Vertical farming and other novel techniques could reduce the need for new agricultural land and water usage, but developing nations would be hard-pressed to fund these themselves.

Before any major undertaking, there's going to have to be either a fundamental crisis, such as food shortages, in a rich nation or a massive public relations exercise to convince people to consider insects in the same light as giant pandas or dolphins. This is not going to be easy, but as David Attenborough put it: "These small creatures are within a few inches of our feet, wherever we go on land - but often, they're disregarded. We would do very well to remember them."

Wednesday 12 September 2018

Seasons of the mind: how can we escape subjective thinking?

According to some people I've met, the first day of spring in the Southern Hemisphere has been and gone with the first day of September. Not incidentally, there are also some, myself included, who think that it has suddenly started to feel a bit warmer. Apparently, the official start date is at the spring equinox during the third week of September. So on the one hand, the weather has been warming since the start of the month but on the other, why should a planet follow neat calendrical conventions, i.e. the first of a month? Just how accurate is the official definition?

There are many who like to reminisce about how much better the summer weather was back in their school holidays. The rose-tinted memories of childhood can seem idyllic, although I also recall summer days of non-stop rain (I did grow up in the UK, after all). Our personal experiences, particularly during our formative years, can therefore promote an emotion-based response so deeply ingrained that we fail to consider it may be inaccurate. Subjectivity and wishful thinking are key to the human experience: how often do we remember the few hits and not the far more numerous misses? As science is practiced by humans it is subject to the same lack of objectivity as anything else; only its built-in error-checking can steer practitioners onto a more rational course than in other disciplines.

What got me to ponder the above was that on meeting someone a few months ago for the first time, almost his opening sentence was a claim that global warming isn't occurring and that instead we are on the verge of an ice age. I didn't have time for a discussion on the subject, so I filed that one for reply at a later date. Now seems like a good time to ponder what it is that leads people to make such assertions that are seemingly contrary to the evidence.

I admit to being biased on this particular issue, having last year undertaken research for a post on whether agriculture has postponed the next glaciation (note that this woolly - but not mammoth, ho-ho - terminology is one of my bugbears: we are already in an ice age, but currently in an interglacial stage). Satellite imagery taken over the past few decades shows clear evidence of large-scale reductions in global ice sheets. For example, the northern polar ice cap has been reduced by a third since 1980, with what remains only half its previous thickness. Even so, are three decades a long enough period to make accurate predictions? Isn't using a timescale comparable to a human lifespan just as bad as relying on personal experience?
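Taking those satellite-derived round numbers at face value (two-thirds of the area remaining since 1980, at half the thickness), the implied volume loss is easy to sketch:

```python
# Back-of-the-envelope Arctic ice volume estimate from the round
# figures quoted in the text: area down by a third, thickness halved.
area_fraction = 2 / 3       # area remaining relative to 1980
thickness_fraction = 1 / 2  # thickness remaining relative to 1980

volume_fraction = area_fraction * thickness_fraction
print(f"Remaining volume: {volume_fraction:.0%} of the 1980 figure")
# i.e. roughly two-thirds of the ice volume has gone
```

A one-third reduction in area sounds bad enough; fold in the thinning and only about a third of the ice volume remains, which is why area-only comparisons understate the change.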

The UK's Met Office has confirmed that 2018 was that nation's hottest summer since records began - which in this instance only goes back as far as 1910. In contrast, climate change sceptics use a slight growth in Antarctic sea ice (contrary to its steadily decreasing continental icesheet) as evidence of climate equilibrium. Now I would argue that this growth is just a local drop in the global ocean, but I wonder if my ice age enthusiast cherry-picked this data to formulate his ideas? Even so, does he believe that all the photographs and videos of glaciers, etc. have been faked by the twenty or so nations who have undertaken Earth observation space missions? I will find out at some point!

If we try to be as objective as possible, how can we confirm with complete certainty the difference between long term climate change and local, short term variability? In particular, where do you draw the line between the two? If we look at temporary but drastic variations over large areas during the past thousand years, there is a range of time scales to explore. The 15th to 18th centuries, predominantly the periods 1460-1550 and 1645-1715, contained climate variations now known as mini ice ages, although these may have been fairly restricted in geographic extent. Some briefer but severe, wide-scale swings can be traced to single events, such as the four years of cold summers following the Tambora eruption of 1815.

Given such variability over the past millennium, in itself a tiny fragment of geological time, how much certainty surrounds the current changes? The public have come to expect yes or no answers delivered with aplomb, yet some areas of science such as climate studies involve chaos mathematics, thus generating results based on levels of probability. What the public might consider vacillation, researchers consider the ultimate expression of scientific good practice. Could this lack of black-and-white certainty be why some media channels insist on providing a 'counterbalancing' viewpoint from non-expert sources, as ludicrous as this seems?

In-depth thinking about a subject relies upon compartmentalisation and reductionism. Otherwise, we would forever be bogged down in the details and never be able to form an overall picture. But this quantising of reality is not necessarily a good thing if it generates a false impression regarding cause and effect. By suffering from what Richard Dawkins calls the “tyranny of the discontinuous mind” we are prone to creating boundaries that just don't exist. In which case, could a line ever be found between short term local variation and global climate change? Having said that, I doubt many climate scientists would use this as an excuse to switch to weather forecasting instead. Oh dear: this is beginning to look like a 'does not compute' error!

In a sense of course we are exceptionally lucky to have developed science at all. We rely on language to define our ideas, so need a certain level of linguistic sophistication to achieve this focus; tribal cultures whose numbers consist of imprecise values beyond two are unlikely to achieve much headway in, for example, classifying the periodic table.

Unfortunately, our current obsession with generating information of every quality imaginable and then loading it to all available channels for the widest possible audience inevitably leads to a tooth-and-claw form of meme selection. The upshot of this bombardment of noise and trivia is to require an enormous amount of time just filtering it. The knock-on effect being that minimal time is left for identifying the most useful or accurate content rather than simply the most disseminated.

Extremist politicians have long been adept at exploiting this weakness to expound polarising phraseology that initially sounds good but lacks depth; they achieve cut-through with the simplest and loudest of arguments, fulfilling the desire most people have to fit into a rigid social hierarchy - as seen in many other primate species. The problem is that in a similar vein to centrist politicians who can see both sides of an argument but whose rational approach negates emotive rhetoric, scientists are often stuck with the unappealing options of either taking a stand when the outcome is not totally clear, or facing accusations of evasion. There is a current trend, particularly espoused by politicians, to disparage experts, but discovering how the universe works doesn't guarantee hard-and-fast answers supplied exactly when required and which provide comfort blankets in a harsh world.

Where then does this leave critical thinking, let alone science? Another quote from Richard Dawkins is that "rigorous common sense is by no means obvious to much of the world". This pessimistic view of the human race is supported by many a news article but somewhat negated by the immense popularity of star science communicators, at least in a number of countries.

Both the methods and results of science need to find a space amongst the humorous kitten videos, conspiracy theorists and those who yearn for humanity to be the pinnacle and purpose of creation. If we can comprehend that our primary mode of thinking includes a subconscious baggage train of hopes, fears and distorted memories, we stand a better chance of seeing the world for how it really is and not how we wish it to be. Whether enough of us can dissipate that fog remains to be seen. Meanwhile, the ice keeps melting and the temperature rising, regardless of what you might hear...

Tuesday 9 January 2018

Amphibian Armageddon and killed-off kauri: the worldwide battle against fighting fungi

I recently wanted to visit the Ark in the Park, an open sanctuary in the Waitakere Ranges west of Auckland that uses constant predator control to protect native plants and animals. However, I was stopped by a sign stating that Te Kawerau a Maki, the Maori of the district, have placed a rāhui or prohibition on entering the forest. Although not legally binding, the rāhui is intended to stop people walking through the area and spreading infection, serving in place of any notice by the New Zealand Government or Auckland City Council, since the latter two bodies have failed to take action. Perhaps this inactivity is because the infection does not directly affect humans or farming. Instead a fungus-like pathogen is killing the native kauri Agathis australis, one of the largest tree species on Earth.

Known to live for over a thousand years and grow to over fifty metres tall, the largest kauri are seen by Maori as the lords of New Zealand's northern forests. Yet since 2009 the microscopic water mould Phytophthora agathidicida has been causing kauri dieback at an ever-increasing rate. Surveys in the Waitakeres show that most of the infected areas are within ten metres of walking paths, suggesting the mould is being spread by visitors to the lowland forests who fail to thoroughly clean their shoes with the supplied disinfectant spray. In a truly David versus Goliath battle between the minuscule mould and giant trees, introduced species such as possums and pigs are aiding the former by accidentally spreading the minute spores.

Auckland Council reported last winter that the proportion of affected kauri has reached 19 percent, meaning a doubling in scale in only five years. Since there is no cure for infected kauri, some scientists are now predicting the extinction of this magnificent tree in the near future. The combination of the pathogen's microscopic size with its rain-based activation after dormancy means there are currently no methods that can prevent the infection from spreading. In a way, the rāhui may just slow down the inevitable. Considering the immense kauri are home to a unique ecosystem of epiphytes, orchids and associated symbiotic organisms, the future flora and fauna of kauri-free forests may well be markedly different from the Waitakeres as they are today.
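The "doubling in five years" figure can be turned into a crude exponential projection. This is purely illustrative - real epidemics rarely stay exponential, and the actual saturation dynamics are unknown - but it shows why researchers are alarmed:

```python
import math

# Crude exponential model of kauri dieback using the figures above:
# 19% of trees affected, after doubling over the previous five years.
affected = 0.19
doubling_time_years = 5.0
growth_rate = math.log(2) / doubling_time_years  # per year

# Years until, naively, half of all kauri are affected,
# assuming unchecked exponential growth from the 19% level.
years_to_half = math.log(0.5 / affected) / growth_rate
print(f"~{years_to_half:.0f} years to reach 50% affected")
```

On that naive model, half the surveyed kauri would be infected within about seven years of the report - consistent with the "near future" extinction warnings, even allowing for the model's obvious crudeness.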

I've previously discussed the ubiquity of the unsung fungi and how prominent they are even within totally man-made environments. It seems surprising that New Zealand's authorities, so keen to preserve native birds and reptiles, are failing to take any action to at least buy time for the kauri; perhaps they have already deemed extinction as unavoidable and not worth spending public funds on.

The kauri are far from being the only organisms currently threatened by fungi or their kin. Over the past decade more than thirty snake species in the eastern and mid-western United States have started succumbing to what has been termed Snake Fungal Disease. The culprit is thought to be a soil-based fungus called Ophidiomyces ophiodiicola, with a similar organism now also thought to be affecting snakes in the United Kingdom and mainland Europe. Research suggests that up to ninety percent of infected snakes die from the condition, so clearly if humans and their vehicles play unwitting hosts to the microscopic fungal spores, the future for the world's snake population looks depressing. Although many people might not like snakes, ecosystems without them may see an explosion in the numbers of their prey animals, including rodents; to say the least, this would not bode well for crop farmers!

Perhaps the best-known of the global fungal-caused epidemics is the amphibian-decimating Chytridiomycosis, whose effects were initially recognised twenty years ago but may have started much earlier. As its spores can live in water, the responsible Batrachochytrium fungi are ideally situated to infect about one-third of all frog, toad, newt and salamander species. Again, it is thought that man has inadvertently caused the problem, as the African clawed frog Xenopus laevis is an immune carrier of the fungus and has been exported worldwide since the 1930s.

Another contributor may be climate change, as amphibian-rich forests experience temperature variations that are ideal for the chytrid fungi to proliferate in. As a final nail in the coffin - and as with bees and Colony Collapse Disorder - pesticides may play a key role in the epidemic. Agrochemicals are shown to lower the amphibian immune response and so increase their susceptibility to infection. However, the situation isn't completely hopeless: here in New Zealand, researchers at the University of Otago have used chloramphenicol, an antibiotic eye ointment, to cure infected Archey's frogs (Leiopelma archeyi). This species is already critically endangered even without the chytrid epidemic; hopefully, the cure will prove to be the saviour of other amphibian species too. This would be just as well, considering the dangerous side effects found in other treatments such as antifungal drugs and heat therapy (the latter involving temperature-controlled environments that are lethal to the pathogen).

During the past decade, over five million North American bats have been killed by white-nose syndrome, which is caused by the fungus Pseudogymnoascus destructans. Again, humans have inadvertently spread the pathogen, in this case from Eurasia, where the bat species are immune to it, to North America, where they are most definitely susceptible. The bats are only affected during hibernation, which makes treating them difficult, although brief exposure to ultraviolet light has been shown to kill the fungus. This may prove to be a cure to infected colonies, although how the UV could be administered without disturbing the cave-roosting populations will take some figuring out.

It appears then that a combination of manmade causes (international travel, climate change and chemical pollution) is creating a field day for various tiny fungi or fungus-like organisms, at the expense of numerous species of fauna and flora. The culprits are so small and pervasive that there is little hope of preventing their spread. Therefore if conventional cures cannot be found, the only hope for the likes of the kauri might be the use of genetic engineering, either to give the victim resistance or to kill off the pathogen. This science fiction-sounding technology wouldn't be cheap, and its knock-on effects are unknown - and potentially disastrous. The former technique would presumably not be any use to the existing populations, only to the germ line cells of the next generation. Whatever happens, our short-sighted approach to the environment is certainly starting to have major repercussions. A world without the magnificent kauri, not to mention many amphibian, reptile and mammal species, would be a much poorer one.

Wednesday 27 September 2017

Cow farts and climate fiddling: has agriculture prevented a new glaciation?

Call me an old grouch, but I have to say that one of my bugbears is the use of the term 'ice age' when what is usually meant is a glacial period. We currently live in an interglacial (i.e. warmer) era, the last glaciation having ended about 11,700 years ago. These periods are part of the Quaternary glaciation that has existed for almost 2.6 million years and deserving of the name 'Ice Age', with alternating but irregular cycles of warm and cold. There, that wasn't too difficult now, was it?

What is rather more interesting is that certain geology textbooks published from the 1940s to 1970s hypothesised that the Earth is overdue for the next glaciation. Since the evidence suggests the last glacial era ended in a matter of decades, the proposed future growth of the ice sheets could be equally rapid. Subsequent research has shown this notion to be flawed, with reliance on extremely limited data leading to over-confident conclusions. In fact, current estimates put interglacial periods as lasting anywhere from ten thousand to fifty thousand years, so even without human intervention in global climate, there would presumably be little to panic about just yet.

Over the past three decades or so this cooling hypothesis has given way to the opposing notion of a rapid increase in global temperatures. You only have to read such recent news items as the breakaway of a six thousand square kilometre piece of the Antarctic ice shelf to realise something is going on, regardless of whether you believe it is manmade, natural or a combination of both. But there is a minority of scientists who claim there is evidence for global warming - and an associated postponement of the next glaciation - having begun thousands of years prior to the Industrial Revolution. This then generates two key questions:

  1. Has there been a genuine steady increase in global temperature or is the data flawed?
  2. Assuming the increase to be accurate, is it due to natural changes (e.g. orbital variations or fluctuations in solar output) or is it anthropogenic, that is caused by human activity?

As anyone with even a vague interest in or knowledge of climate understands, the study of temperature variation over long timescales is fraught with issues, with computer modelling often seen as the only way to fill in the gaps. Therefore, like weather forecasting, it is far from being an exact science (insert as many smileys here as deemed appropriate). Although there are climate-recording techniques involving dendrochronology (tree rings) and coral growth that cover the past few thousand years, and ice cores that go back hundreds of thousands, there are still gaps and assumptions that mean the reconstructions involve variable margins of error. One cross-discipline assumption is that species found in the fossil record thrived in environments - and crucially at temperatures - similar to their descendants today. All in all this indicates that none of the numerous charts and diagrams displaying global temperatures over the past twelve thousand years are completely accurate, being more along the lines of a reconstruction via extrapolation.

Having looked at some of these charts I have to say that to my untrained eye there is extremely limited correlation for the majority of the post-glacial epoch. There have been several short-term fluctuations in both directions in the past two thousand years alone, from the so-called Mediaeval Warm Period to the Little Ice Age of the Thirteenth to Nineteenth centuries. One issue of great importance is just how wide a region did these two anomalous periods cover outside of Europe and western Asia? Assuming however that the gradual warming hypothesis is correct, what are the pertinent details?

Developed in the 1920s, the Milankovitch cycles provide a reasonable fit for the evidence of regular, long-term variations in the global climate. The theory states that changes in the Earth's orbit and axial tilt are the primary causes of these variations, although the timelines do not provide indisputable correlation. This margin of error has helped to lead other researchers towards an anthropogenic cause for a gradual increase in planet-wide warming since the last glaciation.

The first I heard of this was via Professor Iain Stewart's 2010 BBC series How Earth Made Us, in which he summarised the ideas of American palaeoclimatologist Professor William Ruddiman, author of Plows, Plagues and Petroleum: How Humans Took Control of Climate. Although many authors, Jared Diamond amongst them, have noted the effects of regional climate on local agriculture and indeed the society engaged in farming, Professor Ruddiman is a key exponent of the reverse: that pre-industrial global warming has resulted from human activities. Specifically, he argues that the development of agriculture has led to increases in atmospheric methane and carbon dioxide, creating an artificial greenhouse effect long before burning fossil fuels became ubiquitous. It is this form of climate change that has been cited as postponing the next glaciation, assuming that the current interglacial is at the shorter end of such timescales. Ruddiman's research defines two major causes for an increase in these greenhouse gases:

  1. Increased carbon dioxide emissions from burning vegetation, especially trees, as a form of land clearance, i.e. slash and burn agriculture.
  2. Increased methane from certain crops, especially rice, and from ruminant species, mostly cattle and sheep/goat.

There are of course issues surrounding many of the details, even down to accurately pinpointing the start dates of human agriculture around the world. The earliest evidence of farming in the Near East is usually dated to a few millennia after the end of the last glaciation, with animal husbandry preceding the cultivation of crops. One key issue concerns the lack of sophistication in estimating the area of cultivated land and ruminant population size until comparatively recent times, especially outside of Western Europe. Therefore generally unsatisfactory data concerning global climate is accompanied by even less knowledge concerning the scale of agriculture across the planet for most of its existence.

The archaeological evidence in New Zealand proves without a doubt that the ancestors of today's Maori, who probably first settled the islands in the Thirteenth Century, undertook enormous land clearance schemes. Therefore even cultures remote from the primary agricultural civilisations have used similar techniques on a wide scale. The magnitude of these works challenges the assumption that until chemical fertilisers and pesticides were developed in the Twentieth Century, the area of land required per person had altered little since the first farmers. In a 2013 report Professor Ruddiman claims that the level of agriculture practiced by New Zealand Maori is just one example of wider-scale agricultural land use in pre-industrial societies.

As for the role played by domesticated livestock, Ruddiman goes on to argue that ice core data shows an anomalous increase in atmospheric methane from circa 3000BCE onwards. He hypothesises that a rising human population led to a corresponding increase in the scale of agriculture, with rice paddies and ruminants the prime suspects. As mentioned above, the number of animals and size of cultivated areas remain largely conjectural for much of the period in question.  Estimates suggest that contemporary livestock are responsible for 37% of anthropogenic methane and 9% of anthropogenic carbon dioxide whilst cultivated rice may be generating up to 20% of anthropogenic methane. Extrapolating back in time allows the hypothesis to gain credence, despite lack of access to exact data.

In addition, researchers both in support and opposition to pre-industrial anthropogenic global warming admit that the complexity of feedback loops, particularly with respect to the role of temperature variation in the oceans, further complicates matters. Indeed, such intricacy, including the potential latency between cause and effect, means that proponents of Professor Ruddiman's ideas could be using selective data for support whilst suppressing its antithesis. Needless to say, cherry-picking results is hardly model science.

There are certainly some intriguing aspects to this idea of pre-industrial anthropogenic climate change, but personally I think the jury is still out (as I believe it is for the majority of professionals in this area).  There just isn't the level of data to guarantee its validity and what data is available doesn't provide enough correlation to rule out other causes. I still think such research is useful, since it could well prove essential in the fight to mitigate industrial-era global warming. The more we know about longer term variations in climate change, the better the chance we have of understanding the causes - and potentially the solutions - to our current predicament. And who knows, the research might even persuade a few of the naysayers to move in the right direction. That can't be bad!

Friday 23 December 2016

O Come, All ye Fearful: 12 woes for Christmas future

This month I thought I would try to adopt something of the Yuletide spirit by offering something short and sharp (if not sweet) that bears a passing resemblance to the carol The Twelve Days of Christmas. However, instead of gifts I'll be attempting to analyse twelve key concerns that humanity may face in the near future, some more immediate - not to mention inevitable - than others.

I'll start off with the least probable issues then gradually work down to those most likely to have widespread effects during the next few decades. As it is meant to be a season of good cheer I'll even suggest a few solutions or mitigation strategies where these are applicable. The ultimate in low-carb gifts: what more could you ask for?

12. ET phones Earth. With the SETI Institute and Breakthrough Listen project leading efforts to pick up signals from alien civilisations, what are the chances that we might receive an extra-terrestrial broadcast in the near future? Although many people might deem this just so much science fiction, the contents of a translated message (or autonomous probe) could prove catastrophic. Whether it would spark faith-based wars or aid the development of advanced technology we couldn't control - or be morally fit enough to utilise - there may be as many negative issues as positive ones.

Solution: Keeping such information secret, especially the raw signal data, would be incredibly difficult. Whether an international translation project could be conducted in secret is another matter, with censorship allowing a regular trickle of the less controversial information into the public domain. Whilst this is the antithesis of good scientific practice, it could prove to be the best solution in the long term. Not that most politicians are ever able to see anything that way, however!

11. Acts of God. There is a multitude of naturally-occurring events that are outside human control, both terrestrial (e.g. super volcanoes, tsunamis) and extra-terrestrial, such as asteroid impacts. Again, until recently few people took much interest in the latter, although Hollywood generated some awareness via several rather poor movies in the late 1990s. The Chelyabinsk meteor of February 2013 (rather than meteorite, as most of the material exploded at altitude) led to around 1,500 injuries, showing that even a small object that doesn't reach the ground intact can cause havoc. Since 2000, there have been over twenty asteroid impacts or atmospheric break-ups with yields ranging from a kiloton up to half a megaton.

Solution: Although there are various projects to assess the orbits of near-Earth objects (NEOs), the development of technologies to deflect or destroy impactors requires much greater funding than is currently in place. Options range from devices that use just their velocity to knock NEOs off-course to the brute force approach of high-powered lasers and hydrogen bombs. However, with the cancellation of NASA's Ares V heavy launch vehicle it's difficult to see how such solutions could be delivered in time. Hopefully in the event something would be cobbled together pretty quickly!

10. Grey goo scenario. As defined by Eric Drexler in his 1986 book Engines of Creation: what if self-replicating nanobots (developed, for example, for medical purposes) break their programming and escape into the world, eating everything in their path? Like locust swarms, they would be limited only by the availability of raw materials.

Solution: The Royal Society's 2004 report on nanoscience declared that the possibility of von Neumann machines is some decades away and therefore of little concern to regulators. Since then, other research has suggested there should be limited need to develop such machines anyway. So that's good to know!

9. Silicon-destroying lifeforms. What if natural mutations lead to biological organisms that can seriously damage integrated circuitry? A motherboard-eating microbe would be devastating, especially in the transport and medical sectors, never mind the resulting communication network outages and financial chaos. This might sound as ridiculous as any low-grade science fiction plot, but in 1975 nylon-eating bacteria were discovered. Since then, research into the most efficient methods to recover metals from waste electronics has led to experiments in bioleaching. As well as bacteria, the fungus Aspergillus niger has been shown to break down the metals used in circuits.

Solution: As bioleaching is potentially cheaper and less environmentally damaging it could become widespread. Therefore it will be up to the process developers to control their creations. Fingers crossed, then!

8. NCB. Conventional weapons may be more commonplace, but the development of nuclear, chemical and biological weapons by rogue states and terrorist organisations is definitely something to be worried about. The International Atomic Energy Agency has a difficult time keeping track of all the radioactive material that is stolen or goes missing each year. As the fatal release of the nerve agent sarin on the Tokyo subway in 1995 shows, terrorists are not unwilling to use weapons of mass destruction on the general public.

Solution: There's not much I can suggest here. Let's hope that the intelligence services can keep all the Dr Evils at bay.

7. Jurassic Park for real. At Harvard last year, a chicken embryo's genes were tweaked in such a way as to create a distinctly dinosaurian snout rather than a beak. Although it may be some time before pseudo-velociraptors are prowling (high-fenced) reserves, what if genome engineering were used to develop Homo superior? A 2014 paper from Michigan State University suggests both intellectual and physical improvements via CRISPR-Cas9 technology are just around the corner.

Solution: If the tabloids are to be believed (as if) China will soon be editing human genomes, to fix genetic diseases as well as generating enhanced humans. Short of war, what's to stop them?

Planet Earth wrapped as a Christmas present

6. DIY weaponry. The explosion in 3D printers for the domestic market means that you can now make your own handguns. Although current designs wear out after a few firings, bullets are also being developed that won't wear the weapon out so quickly. Since many nations have far more stringent gun laws than the USA, an increase in weaponry among the general public is just what we don't need.

Solution: How about smart locking systems on printers, so that they cannot produce components that could be used to build a weapon? Alternatively, there are now 3D printer models that can manufacture prototype bulletproof clothing. Not that I'd deem that a perfect solution!

5. Chemical catastrophe. There are plenty of chemicals no longer in production that might affect humanity or our agriculture. These range from the legacy effects of polychlorinated biphenyl (PCB), a known carcinogen, to the ozone depletion caused by CFCs, which could linger in the stratosphere for another century; this doesn't just mean increased rates of human skin cancer - crops are also affected by the increased UV-B.

Solution: we can only hope that current chemical development now receives more rigorous testing and government regulation than that accorded to PCBs, CFCs, DDT, et al. Let's hope all that health and safety legislation pays off.

4. The energy crisis. Apart from the obvious environmental issues around fossil fuels, the use of fracking generates a whole host of problems on its own, such as the release of methane and contamination of groundwater by toxic chemicals, including radioactive materials.

Solution: more funding is required for alternatives, especially nuclear fusion (a notoriously expensive area to research). Iceland generates 100% of its electricity from renewables, whilst Portugal managed four consecutive days in May this year via wind, hydro, biomass and solar energy sources. Greater recycling and more incentives for buying electric and hybrid vehicles wouldn't hurt either!

3. Forced migration. The rise in sea levels due to meltwater means that it won't just be Venice and small Pacific nations that are likely to be submerged by the end of the century. Predictions vary widely, but all point in the same direction: even an increase of 150mm would be likely to affect over ten million people in the USA alone, with probably five times that number in China facing similar issues.

Solution: a reduction in greenhouse gas emissions would seem to be the thing. This requires more electric vehicles and less methane-generating livestock. Arnold Schwarzenegger's non-fossil fuel Hummers and ‘Less meat, less heat, more life' campaign would appear to be good promotion for the shape of things to come - if he can be that progressive, there's hope for everyone. Then of course there's the potential for far more insect-based foodstuffs.

2. Food and water. A regional change in temperature of only a few degrees can seriously affect crop production and the amount of water used by agriculture. Over 700 million people are already without clean water, with shortages affecting agriculture even in developed regions - Australia and California spring to mind. Apparently, it takes a thousand litres of water to generate a single litre of milk!

Solution: A few far-sighted Australian farmers are among those developing methods to minimise water usage, including a few low-tech schemes that could be implemented anywhere. However, really obvious solutions would be to reduce the human population and eat food that requires less water. Again, bug farming seems a sensible idea.

1. Preventing vegegeddon. A former professor at Oxford University told me that some of his undergraduates have problems relating directly to others, having grown up in an environment where communication via electronic interfaces is commonplace. If that's the problem facing the intellectual elite, what hope for the rest of our species? Physical problems such as poor eyesight are just the tip of the iceberg: the human race is in severe danger of degenerating into low-attention 'sheeple' (as they say on Twitter). Children are losing touch with the real world, being enticed into virtual environments that on the surface are so much more appealing. Without knowledge or experience of reality, even stable democracies are in danger of being ruled by opportunistic megalomaniacs, possibly in orange wigs.

Solution: Richard Louv, author of Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder, suggests children require unstructured time out of doors in order to gain an (occasionally painful) understanding of the real world; tree-climbing, fossicking, etc. Restricting time on electronic devices would seem to go hand in hand with this.

Well, that about wraps it up from me. And if the above seems somewhat scary, then why not do something about it: wouldn't working for a better future be the best Christmas present anyone could ever give?

Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned: how rapidly Sandra Bullock's character acclimatised to the various space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the former have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and of course dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that there are some huge fundamental errors propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope: technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes its mass so that the polar diameter is less than the equatorial diameter. Earth is only about 0.3% flatter along its polar axis, but if you look at a photograph of Saturn you can see a very obvious squashing.

  2. Continental drift is the same thing as plate-tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time, whilst the latter is the mechanism now accepted to account for this, with the Earth's crust divided into large segments, or plates, that glide over the slowly deforming (but not liquid) mantle.

    The meteorologist Alfred Wegener suggested the former in 1912, but it was largely pooh-poohed until the discovery of seafloor spreading half a century later supplied the mechanism. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately, chaos theory shows that even the minutest of initial changes can cause major differences in outcome, hence weather forecasting being far from an exact science.

    However, there is other evidence for the validity of the theory, fossil fuel lobbyists and religious fundamentalists aside. I haven't read anything to verify this, but off the top of my head: the warm water that currently travels north-east across the Atlantic from the Gulf of Mexico prevents north-western Europe from having winters as cold as those of Canada's eastern seaboard, and an influx of fresh glacial meltwater could disrupt that flow. In that case the Isles of Scilly off the Cornish coast might face as frosty a winter as the UK mainland!

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a hypothesis for the mechanism.

    This isn't to say that there weren't attempts to provide one, it's just that none of them fit the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of genetics research that led to the modern evolutionary synthesis that has natural selection at its core.

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The issue is due to the equivalence of gravity and acceleration, an idea which was worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts are - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further from the Earth's surface than the International Space Station (ISS), yet it is kept in orbit by the Earth's pull (okay, so there is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would cause it to fly off into deep space.

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft A.K.A. 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there's also microgravity and weightlessness, but as it is almost Christmas we'll save that for another day.
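That last point is easy to check with first-year physics: the speed for a circular orbit is v = √(GM/r), and the local gravitational acceleration at ISS altitude is nowhere near zero. A quick Python sketch, using standard textbook constants and assuming an ISS altitude of about 400 km:

```python
import math

# Circular orbital speed v = sqrt(GM/r): the ISS is falling around the
# Earth, not floating free of its gravity.
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24      # mass of the Earth, kg
R_earth = 6.371e6       # mean radius of the Earth, m
h_iss = 4.0e5           # assumed ISS altitude, ~400 km

r = R_earth + h_iss
v = math.sqrt(G * M_earth / r)
g_at_iss = G * M_earth / r**2   # local gravitational acceleration

print(f"orbital speed: {v / 1000:.1f} km/s")
print(f"gravity at ISS altitude: {g_at_iss:.1f} m/s^2")
```

The second figure comes out at nearly 90% of the familiar surface value of 9.8 m/s², which is why 'zero gravity' is such a misnomer for low Earth orbit.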
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off appreciating that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Monday 1 August 2011

Weather with you: thundersnow, hosepipe bans and climate punditry

I must confess to not having watched any of the current BBC series The Great British Weather, since (a) it looks rubbish; and (b) I spend enough time as it is comparing the short-range forecast with the view outside my window, in order to judge whether it will be a suitable night for astronomy. Since buying a telescope at the start of the year (see an earlier astronomy-related post for more details) I've become just a little bit obsessed, but then as an Englishman it's my inalienable right to fixate on the ever-changeable meteorology of these isles. If I think there is a chance of a cloud-free night I tend to check the forecast every few hours, which for the past two months or so has proved almost uniformly disappointing; as a matter of fact, the telescope has remained boxed up since early May.

There appears to be a grim pleasure for UK-based weather watchers in noting that when a meteorology source states it is currently sunny and dry in your location, it may in fact be raining torrentially. We all realise forecasting relies on some understanding of a complex series of variables, but if they can't even get the 'nowcast' correct, what chance do the rest of us have?

So just how have the UK's mercurial weather patterns affected the science of meteorology and our attitude towards weather and climate? As far back as 1553 the English mathematician and inventor Leonard Digges included weather lore and descriptions of phenomena in his A General Prognostication. Since then, British scientists have been in the vanguard of meteorology. Isaac Newton's contemporary and rival Robert Hooke may have been the earliest scientist to keep meteorological records, as well as inventing several associated instruments. Vice-Admiral Robert FitzRoy, formerly captain of HMS Beagle (i.e. Darwin's ship), was appointed the first Meteorological Statist to the Board of Trade in 1854, which in today's terms would make him the head of the Met Office; he is even reputed to have coined the term 'forecast'.

Modern science aside, as children we pick up a few snippets of the ancient folk learning once used to inculcate elementary weather knowledge. We all know a variation of "Red sky at night, shepherd's delight; red sky in the morning, shepherd's warning", the mere tip of the iceberg when it comes to pre-scientific observation and forecasting. But to me it looks as if all of us in ever-changeable Britain have enough vested interest in the weather (once it was for crop-growing, now just for whether it is a sunglasses or umbrella day - or both) to maintain our own personal weather database in our heads. Yet aren't our memories and lifespans in general just too short to allow us a genuine understanding of meteorological patterns?

One trend that I consider accurate is that those 'little April showers' I recall from childhood (if you remember the song from 'Bambi') are now a thing of the past, with April receiving less rainfall than June. This is an innate feeling: I have not researched it enough to find out if there has been a genuine change over the past three decades. Unfortunately, a combination of poor memory and spurious pattern recognition means we tend to over-emphasise 'freak' events - from thundersnow to the day it poured down at so-and-so's June wedding - at the expense of genuine trends.

For example, my rose-tinted childhood memories of six largely rain-free weeks each summer school break centre around the 1976 drought, when my brother had to be rescued from the evil-smelling mud of a much-reduced reservoir and lost his shoes in the process. I also recall the August 1990 heat wave as I was at the time living less than 20 km from Nailstone in Leicestershire, home of the then record UK temperature of 37.1°C. In contrast, I slept through the Great Storm of 1987 with its 200+km/h winds and don’t recall the event at all! As for 2011, if I kept a diary it would probably go down as the 'Year I Didn't Stop Sneezing'. City pollution and strong continental winds have combined to fill the London air with pollen since late March, no doubt much to the delight of antihistamine manufacturers.

A Norfolk beach in a 21st-century summer (August 2008)


Our popular media frequently run stories about the latest report on climate change, either supporting or opposing certain hypotheses, but rarely compare it to earlier reports or long-term records. Yet even a modicum of research shows that in the Nineteenth Century Britain experienced a large variation in weather patterns. For example, the painter J.M.W. Turner's glorious palette was not all artistic licence, but almost certainly influenced by the volcanic dust-augmented sunsets following the 1815 Tambora eruption. It wasn't just painting that was affected either, as the UK suffered poor harvests the following year whilst in the eastern United States 1816 was known as 'Eighteen Hundred and Froze to Death'.

The influence of the subjective on the objective doesn't sound any different from most other human endeavours, except that weather professionals - meteorologists, climatologists and the like - deliberately harness uncertainty in their work. Ensemble forecasting, which runs a model many times from slightly different initial conditions and combines the resulting data into an average outcome, has been shown to be a more accurate method of prediction. In other words, it sounds like a form of scientific bet-hedging!
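The idea behind ensemble forecasting can be sketched in a few lines of Python, with the chaotic logistic map standing in for a weather model. Everything here - the map itself, the perturbation size, the ensemble size - is an arbitrary choice for illustration, not how operational forecasting is actually configured:

```python
import random

# Toy ensemble: run the same chaotic 'model' from many slightly
# perturbed initial conditions, then look at the mean and spread.
def forecast(x0, steps=20):
    """Iterate the chaotic logistic map x -> 4x(1-x) from x0."""
    x = x0
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

random.seed(42)
best_guess = 0.3   # our single 'measured' initial condition
members = [best_guess + random.uniform(-1e-4, 1e-4) for _ in range(100)]

single_run = forecast(best_guess)
ensemble_mean = sum(forecast(m) for m in members) / len(members)
spread = max(forecast(m) for m in members) - min(forecast(m) for m in members)

print(f"single run: {single_run:.3f}, ensemble mean: {ensemble_mean:.3f}")
print(f"ensemble spread: {spread:.3f}")  # large spread = low confidence
```

The single run is just one possible outcome among many; the ensemble spread, by contrast, gives an honest measure of how much (or how little) the forecast deserves to be trusted.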

Recent reports have shown that once-promising hypotheses involving singular factors such as sunspot cycles cannot account for the primary causes of climate change, either now or in earlier epochs. It seems the simple answers we yearn for are the prerogative of Hollywood narrative, not geophysical reality.

One bias that can seriously skew data is the period covered by a report. It sounds elementary, but we are rarely told that a difference of even a single year in the start date can significantly affect whether, for example, temperature appears to increase over time. Of course, scientists may deliberately publish results only for periods that support their hypotheses (hardly a unique trait, if you read Ben Goldacre). When this is combined with sometimes counter-intuitive predictions - such as that a gradual increase in global mean temperature could lead to cooler European winters - is it any wonder we non-professionals are left to build our level of belief in climate change from a muddle of personal experience, confusion and folk tales? The use of glib phrases such as 'we're due another glaciation right about now' doesn't really help either. I'm deeply interested in the subject of climate change and I think there is serious cause for concern, but the data is open to numerous interpretations.
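That start-date sensitivity is easy to demonstrate with a made-up 'temperature' series; the numbers below are invented purely to show how dropping a single year from the front of a record can flip the sign of a fitted trend:

```python
# Synthetic annual mean 'temperatures': one anomalously warm first year,
# then a gentle upward drift. Values are entirely fabricated.
temps = [15.2, 14.0, 14.1, 14.2, 14.3, 14.4]

def slope(ys):
    """Least-squares slope of ys against year indices 0..n-1."""
    n = len(ys)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print(f"trend from year 0: {slope(temps):+.3f} deg/yr")   # negative: 'cooling'
print(f"trend from year 1: {slope(temps[1:]):+.3f} deg/yr")  # positive: 'warming'
```

One warm outlier at the start of the record is enough to turn an apparent warming trend into an apparent cooling one - which is precisely why the choice of baseline period matters so much.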

So what are we left with? (Help: I think I'm turning into Jerry Springer!) For one thing, the term 'since records began' can be about as much use as a chocolate teapot. Each year we get more data (obviously) and so each year the baseline changes. Meteorology and climatology are innately complex anyway, but so far both scientists and our media have comprehensively failed to explain to the public just how little is known and how even very short-term trends are open to abrupt change (as with the notorious 'don't worry' forecast the night of the 1987 Great Storm). But then you have only to look out of the window and compare it to the Met Office website to see we have a very long way to go indeed…

Thursday 1 April 2010

Blown away: some weird and wonderful animal defence mechanisms

At a time when environmentalists are calling for farmers to swap cattle for non-ruminant species such as kangaroos in an effort to stem bovine methane emission, a recent report by a leading Argentinean palaeontologist reminds me of Karl Marx's popular axiom "History repeats itself first as tragedy, second as farce".

The report's theme concerns animal defensive mechanisms, a classic example of truth being infinitely stranger than fiction. Consider for instance the bombardier beetle, an innocuous-enough-looking insect that when endangered can squirt a boiling liquid from its rear abdomen. Okay, that's only mildly weird. Well, what about the several species of frogs and newts that, when threatened, extrude internal claws or spines by puncturing their own skin? Or the Asian carpenter ants whose soldiers literally self-destruct in defence of their colony, in the process spraying a sticky poison over their attackers? Surely if anyone needed a good argument against Creationism, this panoply of the bizarre would suit admirably, since it implies an equally bizarre, not to say warped, sense of humour on the part of a Creator.

But the news from Argentina may well outshine (if that is the right word) all of the above, not least for the sheer scale of the animals involved. The main players are those undisputed giants of the dinosaur world, the South American titanosaurian sauropods of the mid- to late Cretaceous. Partial remains found over the past twenty years imply species such as Argentinosaurus may have reached lengths of 40 metres, thereby exceeding their better-known Jurassic relatives such as Diplodocus by around 20 per cent.

In 2002 Fernando Calvo, Professor of Natural Sciences at La Salta University in Argentina, became intrigued by sauropod growth patterns and nutrition. Although coprolites (fossilised poo) have not been found for any species of Argentinean titanosaur, the study of microscopic phytoliths (silicified plant fragments) suggests these animals enjoyed a broad plant diet. The notion that Mesozoic vegetation consisted primarily of conifers, cycads, horsetails and ferns has been overturned by recent discoveries of palms and even tall, primitive grasses. Since modern grazers such as cattle can survive solely on such unpromising material, why not titanosaurs?

Calvo and his team began a study to go where no scientists had gone before and assess the potential digestive systems of Argentinosaurus and its relatives. One of the luxuries of an enormous bulk is being able to subsist on nutritionally-poor foodstuffs, a case of sheer quantity over quality. The La Salta group hypothesised that their native sauropods were amongst the most efficient of digesters just because of their size: by the time plant material had worked its way through such a large digestive tract most of the nutrients would be absorbed, no doubt aided by gastroliths, literally stomach stones deliberately swallowed to help churn the material.

The preliminary report was published in March last year and quickly became notorious in palaeontological circles. For there was no delicate way of describing the findings: the titanosaurs would easily top the Guinness Book of Records' list of "World's Greatest Farters". Whilst sauropods did not have the multiple-stomach arrangement of modern ruminants, the hypothesis was clear: titanosaur herds would have been surrounded by an omnipresent cloud of methane.

For Calvo, the next step came several months later when a tip-off from a farmer in Chubut led to an astonishing series of finds. The site, whose exact location remains secret, revealed the semi-articulated fragments from a tight-knit group of three predatory Giganotosaurus and approximately 15 per cent of the skeleton of a single, adult Argentinosaurus. Team member Jose Chiappe led the extraction work on the latter colossus and postulated that it had died slowly, perhaps due to blood loss following an attack.

Far more intriguing were the positions of the attackers: all three had a slumped, head-down attitude, implying sudden collapse and virtually instantaneous death. Calvo found himself asking the obvious question: how could they have died? Whereas a Diplodocus tail was well-suited for use as a whip, it belonged to a much more gracile animal than its Cretaceous counterparts. The larger bulk of Argentinosaurus didn't bode well for a fast reaction: by the time a titanosaur had noticed the approach of a Giganotosaurus, it would have had precious few seconds to position its tail for a whiplash response. Then Chiappe remembered an Early Cretaceous site in Liaoning Province, China, where animals had died of suffocation due to volcanic gases.

The resemblance in the post-mortem postures of the Giganotosaurus led to an incredible but as yet unpublished hypothesis: if correctly positioned, a frightened titanosaur could have defended itself by the simple expedient of raising its tail and expelling gaseous waste directly into the conveniently-placed head of an oncoming predator. An initial calculation based on scaling up from modern animals suggested an adult titanosaur could have produced about one tonne of methane per week. Computer simulations suggest a sustained five-second burst at close range would have K-O'd an eight-tonne Giganotosaurus, and with a brain barely half the size of that of Tyrannosaurus, it's unlikely the predators had the wherewithal to avoid their fate. If only the late Michael Crichton had known this, perhaps he would have written a scene involving an ignominious demise at the rear end of a sauropod for some of the characters in Jurassic Park (Jurassic Fart, anyone?). Or, since this occurred in the Cretaceous, in the name of scientific accuracy perhaps that should be Gone with the Wind?
