Sunday 23 June 2019

Spray and walk away? Why stratospheric aerosols could be saviours or destroyers

My first scientific encounters with aerosols weren't particularly good ones. In my early teens, I read that the CFC propellants used in aerosol sprays were depleting the ozone layer, so tiny atmospheric particles had negative connotations for me from my formative years. This was further reinforced by Carl Sagan and Richard Turco's 1990 book A Path Where No Man Thought: Nuclear Winter and the End of the Arms Race, which discussed the potentially devastating effects of high-altitude aerosols around the world following a nuclear attack. Strike two against these pesky particles!

Of course, aerosols aren't just man-made. The stratospheric dust particles generated by the Chicxulub impact event 66 million years ago are known to have been instrumental in the global climate disruption that wiped out the dinosaurs and many other life forms. This would have been in addition to the thousands of years of environmental changes caused by sulfur aerosols from the Deccan Traps flood basalt eruptions. Rather more recently, the Mount Tambora volcanic eruption of 1815 led to starvation and epidemics around the world for up to three years.

Now that our civilisation is generating a rapid increase in global temperatures, numerous solutions are being researched. One of the most recent areas of research involves reducing the amount of solar radiation reaching the Earth's surface. Several methods have been suggested, but this year sees a small-scale experiment to actually test one, namely seeding the atmosphere with highly reflective particles in an artificial recreation of a volcanic event. The Stratospheric Controlled Perturbation Experiment (SCoPEx) is a solar geoengineering project involving Harvard University that will use a balloon to release calcium carbonate in aerosol form at about twenty kilometres above the Earth's surface, then analyse the local airspace the following day to assess the effects.

This experiment is controversial for several reasons. Firstly, it doesn't lead to any reduction in greenhouse gases or particulate pollutants; if anything, by sweeping the issue under a stratospheric rug, it could allow fossil fuel corporations to maintain production levels and reduce investment in alternatives. And if the recent reports by meteorologists are correct that natural and unintentional man-made aerosols are already mitigating global warming, then the gross effects of our heat-trapping pollution must be higher than realised!

Next, testing at such a minute scale is unlikely to pinpoint issues that operational use might generate, given the chaotic nature of atmospheric weather patterns. To date, numerous computer simulations have been run, but bearing in mind how inaccurate weather forecasting is beyond about ten days, nothing can be as accurate as the real thing. Therefore, at what point could a test prove that the process is effective and safe enough to be carried out on a global scale? Possibly it might require such a large-scale experiment that it would be both the research and the actual deployment itself!
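To make the chaos point concrete, here's a toy illustration - a minimal Python sketch of the classic Lorenz system, the textbook example of sensitive dependence on initial conditions, and emphatically not a model of the stratosphere. Two runs differing by one part in a billion end up in completely different states, which is essentially why no amount of simulation or small-scale trialling can fully certify a planet-wide intervention.

    # Toy demonstration of chaotic sensitivity using the Lorenz system
    # (a textbook example, not a stratospheric model). Two trajectories
    # that start almost identically end up completely different.
    import numpy as np

    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
        x, y, z = state
        return state + dt * np.array([sigma * (y - x),
                                      x * (rho - z) - y,
                                      x * y - beta * z])

    a = np.array([1.0, 1.0, 1.0])
    b = a + np.array([1e-9, 0.0, 0.0])   # perturbed by one part in a billion

    for step in range(1, 5001):
        a, b = lorenz_step(a), lorenz_step(b)
        if step % 1000 == 0:
            print(f"step {step:4d}: separation = {np.linalg.norm(a - b):.3e}")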

The duration that the aerosols would remain aloft is still not completely understood, suggesting that regular replenishment would be essential. In addition, could the intentionally polluted clouds capture greater amounts of water vapour, at first holding onto and then dropping their moisture so as to cause drought followed by deluge? Clouds cannot be contained within the boundaries of the testing nation, meaning other countries could suffer these unintended side-effects.
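As a purely back-of-the-envelope illustration of the replenishment issue (the numbers below are assumptions for the sake of argument, not measured values), suppose the injected particles settled out exponentially with an e-folding lifetime of around a year: keeping a fixed aerosol loading aloft would then mean injecting that loading's worth of material every single year, indefinitely.

    # Back-of-the-envelope replenishment sketch. The one-year lifetime and
    # the target loading are illustrative assumptions only - the real
    # residence time is exactly the uncertainty discussed above.
    import math

    lifetime_years = 1.0        # assumed e-folding residence time
    target_loading_mt = 5.0     # assumed loading (megatonnes) to maintain

    for t in (0.5, 1.0, 2.0, 3.0):
        remaining = math.exp(-t / lifetime_years)
        print(f"after {t:.1f} yr: {remaining:.0%} of a single injection still aloft")

    # At steady state, the injection rate must balance the loss rate M / tau:
    print(f"holding {target_loading_mt} Mt aloft needs "
          f"~{target_loading_mt / lifetime_years:.1f} Mt injected per year, every year")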

It may be that as a back-up plan, launching reflective aerosols into the stratosphere makes sense, but surely it makes much more sense to reduce greenhouse gas emissions and increase funding of non-polluting alternatives? The main emphasis from ecologists to date has been to remove human-generated substances from the environment, not add new ones in abundance. I'm all for thinking outside the box, but I worry that the only way to test this technique at a fully effective level involves an experiment so large as to be beyond the point of no return. Such chemical-based debacles as ozone depletion via chlorofluorocarbons (CFCs) prove that in just a matter of decades we can make profound changes to the atmosphere - and badly affect regions furthest removed from the source itself. So why not encourage more reducing, reusing and recycling instead?

Monday 10 June 2019

Defrosting dangers: global warming and the biohazards under the ice

Despite frequent news reports on the thawing of polar and glacial ice, there appears to be less concern shown towards this aspect of climate change than many others. Perhaps this is due to so few humans living in these regions; lack of familiarity with something helps us to ignore its true importance. The most obvious effects of melting ice are said to be the increase in atmospheric carbon, rising sea levels and unpredictable weather patterns, but there is another threat to our species that is only just beginning to be noticed - and as yet has failed to generate any mitigation plans.

A report last year confirmed a frightening cause behind the deaths back in 2015 of approximately half the world's remaining saiga antelope population: thanks to warmer and more humid weather, a type of bacteria usually confined to the animals' noses had spread into their bloodstream. Although not the sort of news to attract much attention even from nature-lovers, this ecological David and Goliath scenario looks set to be repeated in colder environments around the globe. Microscopic and fungal life forms that have been trapped or dormant for long periods, possibly millennia, may be on the verge of escaping their frozen confines.

The various film adaptations of John W. Campbell's 1938 novella Who Goes There? show the mayhem caused by an alien organism that has escaped its icy tomb. The real-life equivalents of this fictional invader are unlikely to be of extra-terrestrial origin, but they could prove at least as perilous, should climate change allow them to thaw out. The problem is easy to state: there is an enormous amount of dormant microbial life trapped in ice and permafrost that is in danger of escaping back into the wider ecosystem.

In the first quarter of the twentieth century, over a million reindeer were killed by anthrax, with subsequent outbreaks occurring sporadically until as late as 1993. Recent years have seen the deaths of both herders and their animals from infection traced to the thawing of a single infected reindeer carcass. In various incidents in 2016, dozens of Siberian herders and their families were admitted to hospital while Russian biohazard troops were flown in to run the clean-up operations. One issue is that until recently the infected animals - domesticated as well as wild - were rarely disposed of to the recommended safety standards. Therefore, it doesn't take much for reactivated microbes to spread into environments where humans can encounter them.

Of course, the number of people and livestock living near glaciers and the polar ice caps is relatively low, but there are enormous regions of permafrost that are used by herders and hunters. Meltwater containing pathogens can get into local water supplies (conventional water treatment doesn't kill anthrax spores), or even reach further afield via the oceans - where some microbes can survive for almost two years. The record high temperatures in some of the Northern Hemisphere's permafrost zones are allowing the spread of dangerous biological material into regions that may not have seen it for centuries - or far longer.

Decades-old anthrax spores aren't the only worry. Potential hazards include the smallpox virus, which caused a Siberian epidemic in the 1890s and may be able to survive in a freeze-dried state in victims' corpses before - however unlikely - reviving due to warmer temperatures. In addition, it should be remembered that many of the diseases that infect Homo sapiens today only arose with the development of farming, being variants of bacteria and viruses that transferred across from our domestic livestock.

This would suggest that permafrost and ice sheets harbour ancient microbes that our species hasn't interacted with for centuries - and to which we may therefore have minimal resistance. Although natural sources of radiation are thought to destroy about half of a bacterium's genome within a million years, there have been various - if disputed - claims of far older bacteria being revived, including some found in salt crystals said to be 250 million years old. In this particular case, their location deep underground is said to have minimised cosmic ray mutations and thus ensured their survival. Sounds like one for the Discovery Channel if you ask me, but never say never...
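For what it's worth, taking that half-a-genome-per-million-years figure at face value, the arithmetic is brutal for the 250-million-year claim - which is precisely why its proponents lean so heavily on the deep-burial shielding argument. A rough sketch, nothing more:

    # Rough half-life arithmetic using the figure quoted above: if radiation
    # destroys ~half of a genome every million years, the intact fraction
    # after t million years is 0.5 ** t.
    for t_myr in (1, 10, 100, 250):
        print(f"after {t_myr:>3} million years: {0.5 ** t_myr:.2e} of the genome intact")
    # After 250 million years that's about 5e-76 - effectively nothing -
    # unless deep burial really did shield the salt-crystal bacteria.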

Even if this improbable longevity turns out to be inaccurate, it is known that dormant spore-forming bacteria such as those responsible for tetanus and botulism could, like anthrax, be revived after decades of containment in permafrost. Fungal spores are likewise known to survive similar interments; with amphibian, bat and snake populations currently declining due to the rapid spread of fungal pathogens, the escape of such material shouldn't be taken lightly.

So can anything be done to prevent these dangers? Other than reversing the increase in global temperatures, I somehow doubt it. Even the locations of some of the mass burials from the twentieth-century reindeer epidemics have been lost, meaning those areas cannot be turned into no-go zones. Anthrax should perhaps be thought of as only one of a suite of biohazards that melting permafrost may be about to inflict on a largely uninformed world. The death of some remote animals and their herders may not earn much public sympathy, but if the revived pathogens spread to the wider ecosystem, there could be far more at stake. Clearly, ignorance is no protection from the microscopic, uncaring dangers now waking up in our warming world.