Wednesday 18 August 2021

Mushrooms to Mars: how fungi research could help long-duration space travel

I've often noted that fungi are the forgotten heroes of the ecosystem, beavering away largely out of sight and therefore out of mind. Whether it's their ability to break down plastic waste or their use as meat substitutes and pharmaceuticals, this uncharismatic but vital life form no doubt holds many more surprises in store for future research to discover. It's estimated that less than ten percent of all fungi species have so far been scientifically described; it's small wonder, then, that a recent study suggests an entirely new use for several types of these under-researched organisms.

Investigation of the Chernobyl nuclear power station in 1991 found that Cladosporium sphaerospermum, a fungus first described in the late nineteenth century, was thriving in the reactor cooling tanks. In other words, despite the high levels of radiation, the species was able not only to repair its cells but also to maintain a good rate of growth in this extreme environment. This led to research aboard the International Space Station at the end of 2018, when samples of the fungus were exposed to a month of cosmic radiation. The results were promising: a two-millimetre-thick layer of the fungus absorbed nearly two percent of the radiation compared to a fungus-free control.

This suggests that long-duration crewed space missions, including to Mars, might be able to take advantage of this material to create a self-repairing radiation shield, both for spacecraft and within the walls of surface habitats. A twenty-one-centimetre-thick layer was deemed effective against cosmic rays, although this could potentially be reduced to just nine centimetres if the fungal mycelia were mixed with similar amounts of Martian soil. There is even the possibility of extracting the fungus's radiation-absorbing melanin pigment for use in items that require much thinner layers, such as spacesuit fabric.
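For a sense of why thickness matters so much, here's a back-of-the-envelope sketch. It assumes the simplest possible model - uniform exponential attenuation - and takes the roughly two-percent-per-two-millimetres figure above at face value; real cosmic-ray shielding involves particle types and secondary radiation, so treat the numbers as purely illustrative rather than anything from the study itself.

    import math

    # Toy extrapolation only: assumes uniform exponential attenuation and
    # ignores secondary radiation, particle type and mycelium density.
    absorbed_at_2mm = 0.02                        # ~2% absorbed by a 2 mm layer
    mu = -math.log(1 - absorbed_at_2mm) / 2.0     # attenuation coefficient per mm

    for thickness_mm in (2, 90, 210):             # ISS sample, 9 cm and 21 cm shields
        absorbed = 1 - math.exp(-mu * thickness_mm)
        print(f"{thickness_mm:>3} mm: ~{absorbed:.0%} of radiation absorbed")

Even this crude model shows how a film that stops only a couple of percent scales up, at tens of centimetres, to a layer stopping the large majority of incoming radiation - though the published thickness estimates presumably rest on far more sophisticated modelling.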

If this sounds too good to be true, there are still plenty of technological hurdles to be overcome. Science fiction has frequently described the incorporation of biological elements into man-made technology, but it's early days as far as practical astronautics is concerned. After all, there is the potential for unique dangers, such as synthetic biology growing unstoppably (akin to scenarios of runaway nanobot replication). However, NASA's Innovative Advanced Concepts (NIAC) program shows that the agency is taking the idea of fungi-based shielding seriously: current research considers how to take dormant fungal spores to Mars and then add water to grow what can only be described as myco-architecture - structural elements, even interior fittings and furniture. Beyond the radiation shielding, growing organic material on site has the obvious advantage of not having to haul everything with you across such vast distances.

Even more ideas are being suggested for the use of similarly hardy species of fungi on a Mars base, from bioluminescent lighting to water filtration. Of course, this doesn't take into account any existing Martian biology: the seasonal methane fluctuations that have been reported are thought by some to be too large to have a geochemical cause, which would imply that somewhere in the sinkholes or canyon walls of Mars there are colonies of methane-producing microbes, cosily shielded from the worst of the ultraviolet. If this proves to be the case, you would hope that any fungus taken to the red planet would be genetically modified to guarantee that it couldn't survive outside the explorers' habitats and so damage Martian biota. Humanity's track record when it comes to preserving the ecosystems of previously isolated environments is obviously not something we can be proud of!

What fungi can do alone, they also do in symbiosis with algae, i.e. as lichens. Various experiments, including the LIchens and Fungi Experiment (LIFE) on the International Space Station (incidentally, doesn't NASA love its project acronyms?), have tested extremophile lichens such as Xanthoria elegans and Rhizocarpon geographicum in simulated Martian environments for up to eighteen months. The researchers found that the organisms could remain active as long as they were partially protected, as if they were growing in sinkholes beneath the Martian surface. Of course, this success also enhances the possibility of similar lifeforms already existing on the red planet, where they would have had eons in which to adapt to the gradually degraded conditions that succeeded Mars' early, clement phase.

The CRISPR-Cas9 system and its successors may well be used to develop synthetic fungi and lichens for deployment both on and especially off the Earth, but we shouldn't forget that Mother Nature got there first. Spacecraft shielding and myco-architecture based on natural or genetically modified organisms may prove to be an extremely efficient way to safeguard explorers beyond our world: the days of transporting metal, plastic and ceramic objects into space may be numbered; the era of the interplanetary mushroom may be on the horizon. Now there's a phrase you don't hear every day!


Sunday 18 July 2021

The uncertainty principle: does popular sci-comm imply more than is really known?

Over the years I've examined how ignorance in science can be seen as a positive thing and how it can be used to define the discipline, a key contrast with most religions. We're still a long way from understanding many fundamental aspects of the universe, but the religious fundamentalist (see what I did there?) mindset is seemingly unable to come to terms with this position and so incorporates lack of knowledge into arguments disparaging science. After all, the hackneyed train of thought goes, scientific theories are really only that: ideas, not something proven beyond all possible doubt. Of course this isn't the case, but thanks to the dire state of most school science education, with its emphasis on exams and fact-stuffing rather than analysis of what science really is (a group of methods, not a collection of facts) - let alone anything that tries to teach critical thinking - you can see why some people fall prey to such disinformation, i.e. the claim that most science isn't proven to any degree of certainty.

With this in mind, you have to wonder what percentage of general-audience science communication describes theories with much more certainty than is warranted, when in reality a dearth of data creates a partial reliance on inferred reasoning. Interestingly, the complete opposite used to be a common claim; in the nineteenth century, for example, the composition of stars was thought to be forever unknowable, yet thanks to spectroscopy that particular measurement became possible from the 1860s onwards. It is presumably the speed of technological change today that has reduced such pessimism, but it can play into the anti-rationalist hands of religious hardliners if scientists claim absolute certainty for any particular theory (the Second Law of Thermodynamics excepted).

As it is, many theories are based on a limited amount of knowledge (both evidential and mathematical) and rely on experts filling in the gaps. As an aside, the central tenet of evolution by natural selection really isn't one of these: the various sources of evidence, from fossils to DNA, provide comprehensive support for the theory. However, there are numerous other areas that rely on a smattering of physical evidence and a lot of inference. This isn't to say the latter is wrong - Nobel Prize-winning physicist Richard Feynman once said that a scientific idea starts with a guess - but to a non-specialist this approach can appear somewhat slapdash.

Geophysics appears to rely on what a layman might consider vague correlations rather than exact matches. For example, indirect observation techniques such as measuring seismic waves have allowed the mapping of the interior composition of the Earth; unless you are an expert in the field, the connection between the experimental results and clear-cut zones seems more like guesswork. Similarly, geologists have been able to create maps of the continental plates dating back around 600 million years, before which the position of land masses isn't so much vague as completely unknown.

The time back to the Cambrian is less than fifteen percent of the age of our 4.5-billion-year-old planet. This (hopefully) doesn't keep the experts up at night, as well-understood geophysical forces mean that rock is constantly being subducted underground, to be transformed and so no longer available for recording. In addition, for its first 1.3 billion years the planet's surface would have been too hot to allow plates to form. Even so, the position of the continental crust from the Cambrian period until today is mapped to a high level of detail at frequent time intervals; this is because enough is known of the mechanisms involved that if a region at the start of a period is in position A and is later found at position Z, it must have passed through intermediate positions B through Y en route.
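As a purely illustrative sketch of that interpolation logic (real reconstructions use rotations about poles on a sphere, not straight lines, and the positions and dates below are invented for the example), the idea looks something like this:

    # Toy sketch: estimate where a landmass sat at an intermediate time.
    # Real plate reconstructions use Euler rotations on a sphere;
    # the coordinates and dates here are hypothetical.
    def interpolate_position(pos_start, pos_end, t_start, t_end, t):
        """Estimate (longitude, latitude) at time t, in millions of years ago."""
        frac = (t_start - t) / (t_start - t_end)   # fraction of the interval elapsed
        return tuple(a + frac * (b - a) for a, b in zip(pos_start, pos_end))

    # Hypothetical block: at (10 E, 30 S) at 540 Ma, drifted to (35 E, 10 S) by 440 Ma.
    print(interpolate_position((10, -30), (35, -10), 540, 440, 490))   # -> (22.5, -20.0)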

One key geological puzzle related to the building and movement of continental rock strata is known as the Great Unconformity, essentially a 100-million-year gap in the record that occurs in numerous locations worldwide for the period when complex multicellular life arose. In some locales the gap expands both forwards and backwards to as much as a billion years of missing rock; that's a lot of vanished material! Most of the popular science I've read tends to downplay the absent strata, presumably because in the 150 years since the Great Unconformity was first noticed there hasn't been a comprehensive resolution to its cause. The sheer scale of the issue might suggest a profound level of ignorance within geology. Yes, it is a challenge, but it doesn't negate the science in its entirety; on the other hand, it's exactly the sort of problem that fundamentalists can use as ammunition to promote their own versions of history, such as young Earth creationism.

In recent decades, the usually conservative science of geology has been examining the evidence for an almost global glaciation nicknamed 'Snowball Earth' (or 'Slushball Earth', depending on how widespread you interpret the evidence for glaciation to be). It appears to have occurred several times in the planet's history, with the strongest evidence pointing to the period between 720 and 635 million years ago. What is so important about this era is that it is precisely the time (at least in geological terms) when, after several billion years of microbial life, large and sophisticated multicellular organisms rapidly evolved during the inaccurately titled Cambrian explosion.

All in all then, the epoch in question is extremely important. But just how are the Great Unconformity, global glaciation and the evolution of complex biota connected? Since 2017, research, including work from three Australian universities, has led to the publication of the first tectonic plate map centred on this critical period. Using various techniques, including measuring the oxygen isotopes within zircon crystals, the movements of the continents have been reconstructed further back in time than ever before. The resulting hypothesis is a neat one (perhaps overly so, although it appears to be tenable): the top 3 km to 5 km of surface rock was first eroded by glacial activity, then washed into the oceans - where the minerals kick-started the Ediacaran and early Cambrian biota - before being subducted by tectonic activity.
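To get a rough feel for the quantities involved - my own back-of-the-envelope estimate, using today's continental area as a stand-in for the Neoproterozoic figure, which I don't have - the proposed erosion amounts to a staggering volume of rock:

    # Rough, illustrative scale of the proposed global erosion (not a figure from the study).
    continental_area_km2 = 1.5e8        # approximate present-day continental area (assumption)
    eroded_depth_km = 4                 # mid-range of the 3-5 km quoted above

    eroded_volume_km3 = continental_area_km2 * eroded_depth_km
    print(f"~{eroded_volume_km3:.1e} cubic kilometres of rock")   # ~6.0e+08 km^3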

The conclusion doesn't please some skeptics, but the combined evidence, including the erosion of impact craters and a huge increase in sedimentation during the period, lends further support, with the additional inference that a vastly expanded area of shallow marine environments (thanks to the eroded material raising the seafloor) became available as new ecological niches. In addition, the glacial scouring of the primary biominerals calcium carbonate, calcium phosphate and silicon dioxide into the oceans altered the water chemistry and could have paved the way for the first exoskeletons and hard shells, both by providing their source material and by creating a need for protection from that altered chemistry in the first place.

Deep-time thermochronology isn't a term most of us are familiar with, but the use of new dating techniques is beginning to suggest solutions to some big questions. Not that there aren't plenty of other fundamental questions (the nature of non-baryonic matter and dark energy, anyone?) still to be answered. The scale of the unknown should not be used to denigrate science; not knowing something doesn't mean science isn't the tool for the job. One of its more comforting aspects (at least to its practitioners) is that good science always generates more questions than it answers. Expecting simple, easy, straightforward solutions should be left to those human endeavours that relish just-so stories. While working theories are often elegant and simpler than the alternatives, we should treat filling in the gaps as a necessity, not as a weapon with which to invalidate the scientific method or its discoveries.