Wednesday, 18 August 2021

Mushrooms to Mars: how fungi research could help long-duration space travel

I've often noted that fungi are the forgotten heroes of the ecosystem, beavering away largely out of sight and therefore out of mind. Whether it's their ability to break down plastic waste or their use as meat substitutes and pharmaceuticals, this uncharismatic but vital life form no doubt holds many more surprises in store for future research to discover. It's estimated that less than ten percent of all fungi species have so far been scientifically described; small wonder, then, that a recent study suggests an entirely new use for several types of these under-researched organisms.

Investigation of the Chernobyl nuclear power station in 1991 found that Cladosporium sphaerospermum, a fungus first described in the late nineteenth century, was thriving in the reactor cooling tanks. In other words, despite the high levels of radiation, the species was able not only to repair its cells but also to maintain a good rate of growth in this extreme environment. This led to research on board the International Space Station at the end of 2018, when samples of the fungus were exposed to a month of cosmic radiation. The results were promising: a two-millimetre-thick layer of the fungus absorbed nearly two percent of the radiation compared to a fungus-free control.

This suggests that long-duration crewed space missions, including to Mars, might be able to take advantage of this material to create a self-repairing radiation shield, both for spacecraft and within the walls of surface habitats. A twenty-one-centimetre-thick layer was deemed effective against cosmic rays, although this could potentially be reduced to just nine centimetres if the fungal mycelia were mixed with similar amounts of Martian soil. In addition, there is even the possibility of extracting the fungus's radiation-shielding melanin pigment for use in items that require much thinner layers, such as spacesuit fabric.
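Cosmic-ray shielding is far more complicated than any simple model (secondary particle showers, particle type and energy all matter), but as a very rough illustration of how a couple of percent absorbed over two millimetres scales up to a shield tens of centimetres thick, here is a minimal back-of-the-envelope sketch in Python. The exponential attenuation assumption and the target fractions are my own, not figures from the study.

```python
import math

# Assumption: treat the fungal layer as a simple exponential attenuator,
# I = I0 * exp(-mu * x), and estimate the thickness needed to block a
# given fraction of incoming radiation.
absorbed_fraction = 0.02   # ~2% absorbed by the 2 mm ISS sample
layer_mm = 2.0             # thickness of that sample in millimetres

# Effective attenuation coefficient per millimetre of fungus
mu = -math.log(1.0 - absorbed_fraction) / layer_mm

for target in (0.5, 0.9, 0.99):
    thickness_cm = -math.log(1.0 - target) / mu / 10.0
    print(f"To block {target:.0%}: roughly {thickness_cm:.0f} cm of fungal layer")
```

For what it's worth, the ninety percent target comes out at a little over twenty centimetres, which is at least in the same ballpark as the figure quoted above.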

If this sounds too good to be true, there are still plenty of technological hurdles to be overcome. Science fiction has frequently described the incorporation of biological elements into man-made technology, but it's early days as far as practical astronautics is concerned. After all, there is the potential for unique dangers, such as synthetic biology growing unstoppably (akin to scenarios of runaway nanobot replication). However, NASA's Innovative Advanced Concepts (NIAC) program shows that the agency is taking the idea of fungi-based shielding seriously, with current research considering how to take dormant fungal spores to Mars and then add water to grow what can only be described as myco-architecture elements - even interior fittings and furniture. In addition to the radiation shielding, using organic material has the further advantage that not everything has to be hauled across such vast distances.

Even more ideas are being suggested for the use of similarly hardy species of fungi on a Mars base, from bioluminescent lighting to water filtration. Of course, this doesn't take into account any existing Martian biology: the seasonal methane fluctuations that have been reported are thought by some to be too large to have a geochemical cause, which suggests that somewhere in the sinkholes or canyon walls of Mars there are colonies of methane-producing microbes, cosily shielded from the worst of the ultraviolet. If this proves to be the case, you would hope that any fungi taken to the red planet would be genetically modified to guarantee that they couldn't survive outside the explorers' habitats and so damage Martian biota. Humanity's track record when it comes to preserving the ecosystems of previously isolated environments is obviously not something we can be proud of!

What fungi can do alone, they also do in symbiosis with algae, i.e. as lichens. Various experiments, including the LIchens and Fungi Experiment (LIFE) on the International Space Station (incidentally, don't space agencies love their project acronyms?), have tested extremophile lichens such as Xanthoria elegans and Rhizocarpon geographicum in simulated Martian environments for up to eighteen months. The researchers found that the organisms could remain active as long as they were partially protected, as if they were growing in sinkholes beneath the Martian surface. Of course, this success also enhances the possibility of similar lifeforms already existing on the red planet, where they would have had eons in which to adapt to the gradually degraded conditions that succeeded Mars' early, clement phase.

The CRISPR-Cas9 system and its successors may well be used to develop synthetic fungi and lichens for deployment both on and especially off the Earth, but we shouldn't forget that Mother Nature got there first. Spacecraft shielding and myco-architecture based on natural or genetically modified organisms may prove to be an extremely efficient way to safeguard explorers beyond our world: the days of transporting metal, plastic and ceramic objects into space may be numbered; the era of the interplanetary mushroom may be on the horizon. Now there's a phrase you don't hear every day!


Sunday, 18 July 2021

The uncertainty principle: does popular sci-comm imply more than is really known?

Over the years I've examined how ignorance in science can be seen as a positive thing and how it can be used to define the discipline, a key contrast to most religions. We're still a long way from understanding many fundamental aspects of the universe, but the religious fundamentalist (see what I did there?) mindset is seemingly unable to come to terms with this position and so incorporates lack of knowledge into arguments disparaging science. After all, the hackneyed train of thought goes, scientific theories are really only that: ideas, not something proven beyond all possible doubt. Of course this isn't the case, but thanks to the dire state of most school science education, with its emphasis on exams and fact-stuffing rather than analysis of what science really is (a group of methods, not a collection of facts) - let alone anything that tries to teach critical thinking - you can see why some people fall prey to the disinformation that most science isn't proven to any degree of certainty.

With this in mind, you have to wonder what percentage of general-audience science communication describes theories with much more certainty than is warranted, when in reality a dearth of data forces a partial reliance on inferred reasoning. Interestingly, the complete opposite attitude was once common; for example, in the nineteenth century the composition of stars was thought to be forever unknowable, but thanks to spectroscopy that particular 'unknowable' yielded to measurement from the 1860s onwards. It is presumably the speed of technological change today that has reduced that negativity, yet it can play into the anti-rationalist hands of religious hardliners if scientists claim absolute certainty for any particular theory (the Second Law of Thermodynamics excepted).

As it is, many theories are based on a limited amount of knowledge (both evidential and mathematical) and rely on experts filling in the gaps. As an aside, the central tenet of evolution by natural selection really isn't one of these: the various sources of evidence, from fossils to DNA, provide comprehensive support for the theory. However, there are numerous other areas which rely on a smattering of physical evidence and a lot of inference. This isn't to say the latter is wrong - Nobel-winning physicist Richard Feynman once said that a scientific idea starts with a guess - but to a non-specialist this approach can appear somewhat slapdash.

Geophysics appears to rely on what a layman might consider vague correlations rather than exact matches. For example, non-direct observation techniques such as measuring seismic waves have allowed the mapping of the interior composition of the Earth; unless you are an expert in the field, the connection between the experimental results and clear-cut zones seems more like guesswork. Similarly, geologists have been able to create maps of the continental plates dating back around 600 million years, before which the position of land masses isn't so much vague as completely unknown.

The time back to the Cambrian is less than fifteen percent of the age of our 4.5-billion-year-old planet. This (hopefully) doesn't keep the experts up at night, as well-understood geophysical forces mean that rock is constantly being subducted underground, to be transformed and so no longer available for the record. In addition, for its first 1.3 billion years the planet's surface would have been too hot to allow plates to form. Even so, the position of the continental crust from the Cambrian period until today is mapped to a high level of detail at frequent time intervals; this is because enough is known of the mechanisms involved that if a region at the start of a period is in position A and is later found at position Z, it must have passed through intermediate positions B through Y en route.
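As a toy illustration of that A-to-Z reasoning, here is a minimal Python sketch that interpolates a crustal block's position between two reconstructed snapshots. Real reconstructions rotate whole plates about Euler poles using dated rotation models (this is what software such as GPlates does); the great-circle interpolation and the coordinates below are purely my own illustrative assumptions.

```python
import numpy as np

def latlon_to_vec(lat, lon):
    """Convert latitude/longitude in degrees to a unit vector."""
    lat, lon = np.radians([lat, lon])
    return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

def vec_to_latlon(v):
    """Convert a unit vector back to latitude/longitude in degrees."""
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def interpolate(pos_a, pos_z, fraction):
    """Spherical linear interpolation between two (lat, lon) positions."""
    a, z = latlon_to_vec(*pos_a), latlon_to_vec(*pos_z)
    omega = np.arccos(np.clip(np.dot(a, z), -1.0, 1.0))   # angular separation
    v = (np.sin((1 - fraction) * omega) * a + np.sin(fraction * omega) * z) / np.sin(omega)
    return vec_to_latlon(v)

# Hypothetical block: reconstructed position A at 540 Ma and position Z today
pos_540ma, pos_now = (-30.0, 20.0), (45.0, -60.0)
for age in (400, 250, 100):
    frac = (540 - age) / 540
    print(f"{age} Ma: approx. position {interpolate(pos_540ma, pos_now, frac)}")
```

The point is simply that once the endpoints and the rules of motion are pinned down, the intermediate positions follow.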

One key geological puzzle related to the building and movement of continental rock strata is known as the Great Unconformity, essentially a 100-million-year gap in the record that occurs in numerous locations worldwide for the period when complex multicellular life arose. In some locales the gap expands both forwards and backwards to as much as a billion years of missing rock; that's a lot of vanished material! Most of the popular science I've read tends to downplay the absent strata, presumably because in the 150 years since the Great Unconformity was first noticed there hasn't been a comprehensive resolution of its cause. The sheer scale of the issue might suggest a profound level of ignorance within geology. Yes, it is a challenge, but it doesn't negate the science in its entirety; on the other hand, it's exactly the sort of problem that fundamentalists can use as ammunition to promote their own versions of history, such as young Earth creationism.

In recent decades, the usually conservative science of geology has been examining the evidence for an almost global glaciation nicknamed 'Snowball Earth' (or 'Slushball Earth', depending on how widespread you interpret the evidence for glaciation to be). It appears to have occurred several times in the planet's history, with the strongest evidence for the episodes between 720 and 635 million years ago. What is so important about this era is that it is precisely the time (at least in geological terms) when, after several billion years of microbial life, large and sophisticated multicellular organisms rapidly evolved during the inaccurately titled Cambrian explosion.

All in all then, the epoch in question is extremely important. But just how are the Great Unconformity, global glaciation and the evolution of complex biota connected? Since 2017, research, including from three Australian universities, has led to the publication of the first tectonic plate map centred on this critical period. Using various techniques, including measuring the oxygen isotopes within zircon crystals, the movements of the continents have been reconstructed further back in time than ever before. The resulting hypothesis is a neat one (perhaps overly so, although it appears to be tenable): the top three to five kilometres of surface rock was first eroded by glacial activity, then washed into the oceans - where the minerals kick-started the Ediacaran and early Cambrian biota - before being subducted by tectonic activity.

The conclusion doesn't please some skeptics, but the combined evidence, including the erosion of impact craters and a huge increase in sedimentation during the period, gives further support, with the additional inference that an immense expanse of shallow marine environments (thanks to the eroded material raising the seafloor) became available as new ecological niches. In addition, the glacial scouring of the primary biominerals calcium carbonate, calcium phosphate and silicon dioxide into the oceans altered the water chemistry and could have paved the way for the first exoskeletons and hard shells, both by providing their source material and by creating a need for protection from that altered chemistry in the first place.

Deep-time thermochronology isn't a term most of us are familiar with, but the use of new dating techniques is beginning to suggest solutions to some big questions. Not that there aren't plenty of other fundamental questions (the nature of non-baryonic matter and dark energy, anyone?) still to be answered. The scale of the unknown should not be used to denigrate science; not knowing something doesn't mean science isn't the tool for the job. One of its more comforting aspects (at least to its practitioners) is that good science always generates more questions than it answers. Expecting simple, easy, straightforward solutions should be left to other human endeavours that relish just-so stories. While working theories are often elegant and simpler than the alternatives, we should treat the filling-in of gaps as a necessity, not as a weapon to invalidate the scientific method or its discoveries.

Tuesday, 15 June 2021

Meat-free marvels: does a vegetarian diet reduce your risk of disease?

Is it me, or are there times when contemporary diet trends appear to verge on pseudoscientific crankery? While I briefly mentioned potentially dangerous items such as raw water and unpasteurised milk a few years ago, it's surprising how many fad diets in developed nations bear a suspicious resemblance to the traditional ingredients of non-Western societies.

Superfoods are a particularly overhyped element of this faddish arena; the marketing suggests they can help achieve perfect 'balance' and 'wellness' in the body. Some assertions go much further, with consumption of the likes of kombucha claimed as something of a miracle cure. While the pseudocereal quinoa is sold in the West as the 'grain of the gods', it is unlikely to give the partaker any superpowers. It certainly didn't save the Inca and Anasazi - who cultivated it in pre-Columbian America - from the rapid collapse of their civilisations; they apparently suffered from disease and famine as much as any other society.

There is a scientific basis for recommending certain non-meat items, from the antioxidants in tea and coffee to the vitamin D in mushrooms, while various plants and vegetable oils contain Omega-3 fatty acids. But a recent report has concluded that a vegetarian diet may have a marked positive effect on overall health compared to one with regular meat consumption. The research was conducted by the University of Glasgow, with the data showing substantial reductions in disease biomarkers for non-meat eaters: cholesterol and products linked to increased risk of cancers, cardiovascular disease, and liver and kidney problems were all lower in vegetarians. However, the study was unable to provide an underlying reason for the positive results once risk factors such as age, alcohol and nicotine intake had been accounted for.

Apart from suggesting that vegetarians eat more fibre, fruit, vegetables and nuts - some of which have known health benefits - the report's conclusion also noted that, in addition to the positive effect of these items, the avoidance of processed meat products and red meat may have contributed to the results. As someone who hasn't eaten meat in over thirty years, I find the research extremely interesting, although I think there are many other factors that should be considered, with the report forming just part of the debate.

For example, the data was drawn from around 420,000 people living in the UK alone, rather than from a variety of nations and environments. In the past century, the diet and lifestyle of most people in the West has changed enormously, with quick-to-prepare meat dishes such as burgers and sausages remaining at the forefront despite physically demanding lives being replaced by predominantly sedentary ones. In other words, the diet hasn't changed to match the alteration in lifestyle. It's little wonder that obesity has outranked malnourishment in some nations.

In addition, it is thought that several billion people, predominantly in less developed regions, consume insect protein on a regular if not daily basis. This is a profoundly different diet from that of Western meat eaters, with the latter concentrating on domesticated species such as cattle and horses, sheep and goats, poultry, and so on. Although game, bush meat and exotic species such as crocodile are eaten in many regions, these are a much smaller element of the human diet.

In contrast, vegetarians in many regions can eat an enormous variety of plants and fungi. The geographic and seasonal availability of many fruits and vegetables is expanding too: until a few years ago I hadn't heard of jackfruit, but it is now available as the tinned unripe variety from many stores here in New Zealand. So in both time and space, there's no such thing as a typical vegetarian diet! This also doesn't take into account the differences between lacto-vegetarians and vegans; it would definitely be rather more time-consuming to plan a diet with an adequate mix of proteins in the absence of eggs and dairy products. It would therefore be interesting to conduct research into the health differences between these two groups.

Although some of the blame for poor health and obesity has been placed on processed and refined foods, there is an ever-increasing array of prepared vegetarian products, often marketed as meat substitutes for meatatarians wanting to cut down on their consumption of animal flesh. My daughters (regular meat eaters) and I have a penchant for fake bacon made of wheat, pea and soy, and I also eat a variety of meat-free sausages and burgers as well as Quorn products.

Many companies are now jumping on the bandwagon, with products that aim to replicate the taste and texture of the real thing. Some brands such as Beyond Meat and Impossible Foods have seen a rapid rise to international success, while the UK bakery chain Greggs has benefitted from its tasty (if high-fat) Quorn-based vegan sausage roll becoming one of its top five selling products. The range of processed foods suitable for vegetarians has therefore grown out of all proportion to that available several decades ago. Could it be that these may have detrimental health effects compared to the less refined ingredients traditionally eaten by Western vegetarians (and still eaten in developing nations)?

Just as there are shedloads of books claiming that epigenetics will allow you to self-improve your DNA through your lifestyle, diet gurus play upon similar fears (and gullibility) to encourage people to eat all sorts of weird stuff that at best maintains equilibrium and at worst can lead to serious health issues. I personally think that a wider range of research, undertaken across all sorts of regions and societies, needs to be done before a vegetarian diet can be claimed to be distinctly superior to a meat-based one. Of course, a reduction in ruminant farming is good for the planet in general - both for saving water and reducing methane - but as far as diet equates to health, I still think that moderation and a sensible attitude are the key factors. Nevertheless, the Glasgow study certainly is...wait for it...food for thought!

Friday, 14 May 2021

Weedbusting for a better world: the unpleasant truth about invasive plants

There's been a lot written about New Zealand's Predator Free 2050 programme, including my own post from 2016, but while the primary focus has been on fauna, what about the invasive species of flora? Until recently it was easy to think of plants as the poor man's animals, with little in the way of the complex behaviour that characterises the life of vertebrates and many invertebrates. However, that's been changing thanks to studies that show the life of plants is actually rather complex - and includes the likes of chemical signalling. Although plants might not have the emotional impact of animals, land vegetation alone has about one thousand times the mass of terrestrial fauna. So they're important - and then some!

A few months ago I was volunteering on the sanctuary island of Motuihe, less than an hour's boat ride from downtown Auckland. Our group was charged with cutting down woolly nightshade, a soil-poisoning plant native to South America. Destroying these evil-smelling shrubs made me wonder how and why they were introduced to New Zealand in the first place, considering they don't look particularly attractive and their berries are poisonous to humans. Like so many exotic plant species, they were apparently deliberately introduced as decorative garden plants, though frankly I can't see why.

As in many similar stories from around the world, New Zealand has been inundated with large numbers of non-native floral species. Unlike woolly nightshade, some were introduced for practical purposes, such as radiata pine for timber and gorse for hedging, while others were accidentally brought in as seeds in soil. In many cases these are stories of greed and incompetence, for which later generations have paid a heavy price.

Although there were pioneering lone voices who from as early as the late nineteenth century could see the deleterious effects of exotic plant species on native vegetation, it wasn't until the last half century that any serious effort was made to promote their removal. British botanist and presenter David Bellamy was one of the first scientists to popularise this message, starring in a 1989 television advert to explain why Clematis vitalba (AKA Old man's beard) needed eradicating. Bellamy then went on to present the tv series Moa's Ark, which drew attention to the country's unique biota and the dangers it faced from poorly managed development. 

Given his botanical background, it's perhaps not surprising that rather than see plants as the backdrop to dramas of the animal kingdom, Bellamy made them central to the ecosystem, claiming that we should put nature before culture. Again, although lacking the dynamic aspects of fauna, invasive weeds (by definition, aren't weeds just plants in the wrong place?) such as Old man's beard can grow up to ten metres in a single growing season. You only have to look around a suburban garden - mine included - to see that constant vigilance is required to remove the likes of self-seeded wattle and climbing asparagus before they take hold and smother native species.

It isn't just on land that we face this issue: freshwater systems can easily be choked by the likes of Elodea canadensis, a North American pondweed that has escaped from its ornamental aquarium environment (thanks to highly irresponsible people, of course) and been spread by boats and fishing equipment, clogging and stagnating streams and lakes. What is worrying is that it is far from being the worst of the fifty or so non-native aquatic plants that threaten New Zealand's waterways. Considering that around three-quarters of all invasive species in this environment have a detrimental effect, the point is clear: introduced flora can do serious harm.

So what can be - and is being - done? Thanks to numerous volunteer groups, sanctuaries for rare native species (principally fauna, but occasionally flora too) are keeping invasive weeds at bay. Outside these protected environments, annual weeding programmes aim to reduce wilding pine, but the issue here is that commercial interests still maintain the upper hand. Whether for timber plantations or carbon sequestration, species such as Douglas fir continue to be planted, allowing the seed to spread far and wide on the wind. Luckily, there are numerous websites and other online resources to help New Zealanders identify and destroy pest plants.

Clearly, this isn't an issue that will ever go away. With most Government-led efforts focusing on pest animal species, eradicating invasive plants has been given far less support, and so the problem remains comparatively unknown. Perhaps it would be good if schools undertook a compulsory programme, including practical work, in the identification and removal of non-native pest flora? Trapping and poisoning invasive animals can be a complex business, but weeding is child's play by comparison. Everyone can help out: in effect, this is a form of citizen science that has a positive practical effect on the environment. Why not start with your garden today?


Thursday, 1 April 2021

Zapping zombies: how the US military uses the entertainment industries as a recruitment tool

We hear a lot about gamification these days. As video games edge closer to simulating the real world and Hollywood blockbusters increasingly resemble video games, it's little wonder that businesses are using the gaming concept as a learning tool. If anyone has noticed an eerie similarity between the plethora of military sci-fi movies, combat video games and the technology used by the United States' armed forces, then you might be interested to learn that this is no coincidence.

Developed at MIT in 1962, Spacewar! is frequently cited as the earliest combat video game. Of course, it was written for mainframe computers, so it took a long time before sufficiently high-quality visuals - complete with sound effects - could be installed in gaming arcades, followed in the early 1980s by games written for the first generation of ready-assembled home microcomputers.

Hollywood capitalised on the rapidly burgeoning video game market - both at home and in arcades - via movies such as 1984's The Last Starfighter, in which an expert arcade player finds himself recruited into an alien war. In other words, the game he excels at is really a simulator designed to discover and hone players who can then use their gaming skills in genuine space combat.

So how does this fiction compare to the real world? Specialist aviation publications have been full of articles with titles such as 'Do Gamers Make Better Drone Operators Than Pilots?' - the answer being that in addition to obvious skills such as good hand-eye coordination, gamers are used to being at no personal risk while playing (except possibly RSI) and so remain calm under pressure. The conclusion is that these qualities may give gamers an edge when controlling drones, although not, it has to be said, larger, manually piloted aircraft.

The big question is just how deep the military's involvement goes in the development, promotion and assessment of video games that teach combat skills. The relationship certainly appears to go back many decades, considering that the MIT graduate students who developed Spacewar! were funded by the Pentagon. With the development of much more lifelike virtual worlds, the US military has taken a front seat in both producing games that hone useful skills and creating realistic simulators for training its warfighters.

There is a complex feedback loop between these two spheres, and in 1999 the Department of Defense set up the Institute for Creative Technologies to work across them. Games such as Full Spectrum Warrior (2003) and its non-commercial officer-training stablemate Full Spectrum Command attempted to portray realistic combat scenarios, facing enemies who frequently resemble their real-life counterparts.

America's Army (2002) was the first of a series of (initially free) video games that began as propaganda and recruitment tools and then became a widespread commercial franchise. Marines and Special Forces soldiers were amongst the combat veterans involved in the development of these games. In addition, the developers were allowed to scan weapons (in order to build realistic digital simulations) and even shoot them on a firing range so as to experience their physical attributes at first hand. Needless to say, the potential for glorification of violence led to opposition from various quarters.

It isn't just the software that has crossed over between the military and civilian life: weaponry and control systems also feed back between the real world and combat simulations, easing the move from game playing to the genuine article. Of course, skills such as leadership and team cooperation are also honed by these games. The idea is that they reduce the cost of recruitment and training; the free version of America's Army, with 1.5 million downloads in its first month (and a whopping 40 million over the following six years), proved just how effective they could be.

Going in the other direction, US armed forces personnel have taken part in campaigns such as Operation Phantom Fury, which, let's face it, has more than a touch of the Xbox or PlayStation about it. I assume this is also part of the process to ensure a smooth transition between young combat-game players and activities in the real-world military. The pipeline is unlikely to diminish any time soon, seeing as China is now following America's lead; its Glorious Mission online video game, aimed at potential recruits as well as enlisted service personnel, already has over 300 million players.

The US military gaming sector has also started to diversify. To minimise complaints - already prevalent in the gaming sector, due to the implacable enemy often being a group of Muslim fundamentalists - there needed to be a new target that wouldn't raise the ire of any particular nation or ethnic group. To this end, the Call of Duty series of games has introduced reanimated dead soldiers, AKA zombies, as opponents. Bearing in mind that in the past ten years there have been over fifty video games featuring zombie antagonists, it's clear that this theme is just as popular as invading aliens and terrorist zealots. Perhaps it's not surprising that doomsday preppers and survivalist groups are often said to be getting ready for the zombie apocalypse!

Recently released - although heavily redacted - files suggest that as well as developing and promoting video games centred on combat simulation, the Department of Defense has also secretly collected players' data in order to understand their demographics, presumably so as to tailor recruitment and training programmes for recruits with a gaming background. The same information also hints that Hollywood too is being used by the military-industrial complex to promote its own agenda. It sounds a bit far-fetched, but the facts speak for themselves.

The US military have long taken an interest in how Hollywood portrays them. Ronald Reagan's White House had screenings of Red Dawn (1984) and WarGames (1983), with the former gaining the Pentagon's approval while the latter was not well received (hardly surprising, if you know the plot). Gung-ho space marine movies started back in the mid-1980s with the likes of Predator and Aliens, but really took off in the mid-1990s with blockbusters such as Independence Day, Stargate and Starship Troopers.

Hollywood hasn't looked back since, and as well as the US military fighting off hordes of alien invaders, there are plenty of zombie movies - over 170 worldwide over the past decade - along with numerous zombie-themed tv series. Of course, this genre usually features civilians fighting against the living dead, but nonetheless the firearm-laden format resembles its military counterparts. Critics have been keen to note that just as the alien invasion films of the 1950s and 1960s were thinly disguised Cold War allegories, so zombie movies carry a subtext about the unpredictable nature of global terrorism - and imply that readiness to engage the perceived enemy is a patriotic duty.

So what is the underlying connection between these genres and the Pentagon? Even a minimum of research will reveal that a fair number of the Department of Defense's advanced weaponry projects, from the F-22 Raptor tactical fighter to the Global Hawk surveillance UAV (that's an Unmanned Aerial Vehicle to you and me), have been truncated, in both these cases with only about half the number of units being built compared to the original proposals. The funding for those cancelled vehicles is being redirected elsewhere, and Hollywood is the most likely recipient, the money being used for both movies and tv shows that follow the DoD agenda.

And how does the Pentagon know it's getting value for money? As more people book cinema tickets online and via their smartphones, the DoD is able to build frighteningly detailed profiles of those adolescents with the aptitude and skills they are looking for. Thanks to tv subscription services, it is also much easier to see exactly who is watching how much of what.

By immersing America's youth in popular entertainment across a variety of channels - giving the military a homely familiarity while allowing niche targeting of potential recruits - the Pentagon is saving money on blanket advertising while promoting its own values as a mainstream cultural element. Thanks to a business culture that embeds military-derived phrases ('locked and loaded', 'SWAT team', 'strategic planning', etc.), the distance between the armed forces and civilian life has been much reduced since the anti-war ethos of the 1970s. So if you're a teenager who plays certain types of video games and/or watches these sorts of movies and tv shows, don't be surprised if you start receiving recruitment adverts tailored closely to your personality profile. To paraphrase the Village People: they want you as a new recruit!

Monday, 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how disagreements with religious texts can lead the devout to reject key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, that is used to promote Creationist ideas rather than scientifically gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species, and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or simplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - albeit only a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. However, even this is not completely accurate. Darwin's hypothesis was more complete than Wallace's, in the sense of being entirely scientific and therefore testable; Wallace, on the other hand, maintained that there must have been divine intervention in the creation of our species, making us different from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection, but who were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life', which he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense, he is a minor figure compared with the thorough research Darwin undertook to back up his hypothesis.

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said for Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which although highly speculative contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century contained other combinations of ideas and real-world inspiration that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine that inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider it to be superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on the way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to state that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidna. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals 187 million years ago; they have therefore spent approximately three times as long on their own evolutionary journey as they did as part of the wider mammalian lineage. As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. But we still include monotremes amongst the mammals rather than as a hybrid or separate class; both platypus and echidna have fur, are warm-blooded and produce milk (although with a unique delivery system!), which allows their inclusion in Mammalia. Does this mean we arbitrarily allow certain traits and discard others?
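For anyone wanting to check that 'three times as long' claim, the arithmetic is simple enough; note that the 250-million-year figure for the origin of the wider mammalian lineage below is my own assumed round number, not one taken from the report.

```python
# Rough arithmetic behind the "three times as long" claim.
mammal_lineage_origin_ma = 250   # assumed origin of the wider mammalian lineage (Ma)
monotreme_split_ma = 187         # reported monotreme divergence (Ma)

shared_my = mammal_lineage_origin_ma - monotreme_split_ma  # time within the common stem
alone_my = monotreme_split_ma                              # time on their own branch
print(f"Shared: {shared_my} My, alone: {alone_my} My, ratio ~ {alone_my / shared_my:.1f}")
```

Push the assumed origin earlier or later and the ratio shifts accordingly, but the broad point - that monotremes have been doing their own thing for most of mammalian history - still stands.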

Would it be fair to say that the boundaries we draw between organisms are more for our convenience than a reflection of the underlying reality? Are you happy to label birds as 'avian dinosaurs' and if not, why not? With feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was teeth, a bony tail and a crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which, along with the possible discovery of bird-like air sacs, means that they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including leatherback turtles, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others to be endothermic (i.e., fully warm-blooded). In this respect then, some meat-eating dinosaurs especially may have had more in common with us mammals than they did with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate the history of life. Science communication to the public is fundamental to our society but it needs to present the awkward complexities of evolution via all the tortured pathways of natural selection if it is not to fall victim to those who prefer myths of the last few thousand years to the history of countless millennia, as revealed in the genes and rocks waiting for us to explore.


Friday, 19 February 2021

Science, society & stereotypes: examining the lives of trailblazing women in STEM

I was recently flicking through a glossily illustrated Australian book on the history of STEM when I found the name of a pioneer I didn't recognise: Marjory Warren, a British surgeon who is best known today as the 'mother of modern geriatric medicine'. Looking in the index I could find only two other women scientists - compared to over one hundred and twenty men - in a book five hundred pages long! The other two examples were Marie Curie (of course) and the American astronomer Vera Rubin. Considering that the book was published in 2008, I was astounded by how skewed this seemed to be. Granted, prior to the twentieth century few women had the option of becoming involved in science and mathematics; but for any history of STEM, wouldn't the last century contain the largest proportion of subject material?

I therefore thought it would be interesting to choose case studies from the twentieth century to see what sort of obstacles - unique or otherwise - women scientists faced until recently. If you ask most people to name a female scientist then Marie Curie would probably top the list, although a few countries might have national favourites: perhaps Rosalind Franklin in the UK or Rachel Carson in the USA, for example. Rather than choose the more obvious candidates such as these, I have selected four women I knew only a little about, ordered by their date of birth.

Barbara McClintock (1902-1992) was an American cytogeneticist who was ahead of her time in terms of both research and social attitudes. Although her mother didn't want her to train as a scientist, she was lucky to have a father who thought differently from the accepted wisdom - which was that female scientists would be unable to find a husband! McClintock's abilities showed early in her training, leading to post-graduate fellowships which in turn generated cutting-edge research.

At the age of forty-two, Barbara McClintock was only the third woman to be elected to the US National Academy of Sciences. However, her rapid rise within the scientific establishment didn't necessarily assist her: such was the conservative nature of universities that women were not allowed to attend faculty meetings. 

After publishing her research to broad acceptance, McClintock's work then moved into what today would broadly come under the term epigenetics. Several decades ahead of its time, it was seen as too radical by most of her peers, and so after facing intense opposition she temporarily stopped publishing her results. It is unlikely that being a woman was entirely responsible for the hostility to her work; similar resistance has frequently been experienced by the STEM avant-garde. It seems that only when other researchers found similar results to McClintock's did the more hidebound sections of the discipline re-examine their negative attitude towards her work.

There has been a fair amount of discussion as to whether it was the fact that McClintock was female, her secretive personality (at home as well as at work, for she never married), or a combination of both that delayed her receipt of the Nobel Prize in Physiology or Medicine. Even by the slow standards of that particular awards committee, 1983 was rather late in the day. However, by then she had already been the recipient of numerous other awards and prizes.

Regardless of the recognition it gave her, Barbara McClintock relished scientific research for the sake of uncovering nature's secrets. In that regard, she said: "I just have been so interested in what I was doing and it's been such a pleasure, such a deep pleasure, that I never thought of stopping...I've had a very, very, satisfying and interesting life."

Tikvah Alper (1909-1995) was a South African radiobiologist who worked on prions - otherwise known as 'misfolded' or 'rogue' proteins - and their relationship to certain diseases. Her outstanding abilities were recognised early, allowing her to study physics at the University of Cape Town. She then undertook post-graduate work in Berlin with the nuclear fission pioneer Lise Meitner, only to be forced to leave before completing her doctorate due to the rise in anti-Semitism in Germany.

Having had her research curtailed by her ethnicity, Alper was initially also stymied on her return to South Africa thanks to her private life: due to the misogynist rules of that nation's universities, married women were not allowed to remain on the faculty. Therefore, along with her husband the veterinary medicine researcher Max Sterne, she continued her work from home. However, eventually her talents were acknowledged and she was made head of the Biophysics section at the South African National Physics Laboratory in 1948. Then only three years later, Alper's personal life intervened once again; this time, she and her husband were forced to leave South Africa due to their opposition to apartheid.

After a period of unpaid research in London, Alper turned to studying the effects of radiation on different types of cells, rising to become head of the Medical Research Council Radiopathology Unit at Hammersmith Hospital. Alper's theories regarding prions were eventually accepted into the mainstream, and even after retirement she continued working, writing a renowned textbook, Cellular Radiobiology, in 1979.

Alper's life suggests she was very much a problem solver, tackling anything that she felt needed progressing. As a result of this ethos she worked on a wide range of issues, from the standing of women in science and society, to the injustice of apartheid, even to learning and teaching sign language after one of her sons was born profoundly deaf. Despite being forced to leave several nations for different reasons - not because she was a woman - Alper was someone who refused to concede defeat. In that respect she deserves much wider recognition today.

Dorothy Crowfoot Hodgkin (1910-1994) was interested in chemistry, in particular crystals, from a young age. Although women of her generation were encouraged in this area as a hobby, it was highly unusual for them to seek paid employment in the field. Luckily, her mother encouraged her interest and gave Hodgkin a book on x-ray crystallography for her sixteenth birthday, a gift which determined her career path. 

After gaining a first-class honours chemistry degree at Oxford, she moved to Cambridge for doctoral work under the x-ray crystallography pioneer J.D. Bernal. Not only did Hodgkin then manage to find a research post in her chosen field, working at both Cambridge and Oxford, but she was also able to pursue cutting-edge work labelled as too difficult by her contemporaries. Hodgkin and her colleagues achieved ground-breaking results in critical areas, resolving the structures of penicillin, vitamin B12 and insulin.

Hodgkin gained international renown, appearing to have faced few of the difficulties experienced by her female contemporaries. In addition to having a well-equipped laboratory at Oxford, she was elected to the Royal Society in 1947 and became its Wolfson Research Professor in 1960. She was also awarded the Nobel Prize in Chemistry in 1964 - the only British woman to have been a recipient to date. Other prestigious awards followed, including the Royal Society's Copley Medal in 1976; again, no other woman has yet received that award.

Presumably in response to the loss of four maternal uncles in the First World War, Hodgkin was an active promoter of international peace. During the 1950s her views were deemed too left wing by the American government and she had to obtain special permission to enter the United States to attend science conferences. Ironically, the Soviet Union honoured her on several occasions, admitting her as a foreign member of its Academy of Sciences and later awarding her the Lenin Peace Prize. She also communicated with her Chinese counterparts and became committed to nuclear disarmament, both through CND and the Pugwash movement.

Her work on insulin, itself of enormous importance, is just one facet of her life. Ironically, as someone associated with left-wing politics, she is often remembered today as being one of Margaret Thatcher's lecturers; despite their different socio-political leanings, they maintained a friendship into later life. All this was despite the increasing disability Hodgkin suffered from her mid-twenties due to chronic rheumatoid arthritis, which left her with seemingly minimal dexterity. Clearly, Dorothy Hodgkin was a dauntless fighter in her professional and personal life.

Marie Tharp (1920-2006) was an American geologist best known for her oceanographic cartography of the Atlantic Ocean floor. Despite following the advice of her father (a surveyor) and taking an undergraduate degree in humanities and music, Tharp also took a geology class; perhaps helping her father as a child boosted her interest in the subject. It enabled her to complete a master's degree in geology, thanks to the dearth of male students during the Second World War. Certainly, it was an unusual avenue for women to be interested in; at the time fewer than four percent of all earth sciences doctorates in the USA were awarded to women.

From a modern perspective, geology during the first half of the twentieth century appears to have been exceedingly hidebound and conservative. Tharp found she could not undertake field trips to uncover fossil fuel deposits, as women were only allowed to do office-based geological work - one explanation for this sexism being that having women on board ship brought bad luck! In fact, it wasn't until 1968 that Tharp eventually joined an expedition. 

However, thanks to painstaking study of her colleague Bruce Heezen's data, Tharp was able to delineate geophysical features such as the mid-Atlantic ridge and consider the processes that generated them. Her map of the Atlantic Ocean floor was far more sophisticated than anything that had previously been created, giving her insights denied to both her contemporaries and her predecessors. As such, Tharp suspected that the long-denigrated continental drift hypothesis, as envisaged by Alfred Wegener decades previously, was correct. It was here that she initially came unstuck, with Heezen labelling her enthusiasm for continental drift as 'girl talk'. Let's hope that phrase wouldn't be used today!

In time though, yet more data (including the mirrored magnetic striping either side of the mid-Atlantic ridge) proved Tharp correct. Heezen's incredulity was replaced by acceptance, as continental drift was reformulated via seafloor spreading to become the theory of plate tectonics. Mainstream geology finally approved what Wegener had proposed, and Marie Tharp was a fundamental part of that paradigm shift. 

What is interesting is that despite receiving many awards in her later years, including the National Geographic Society's Hubbard Medal in 1978, her name is mentioned far less often than those of other pioneers of plate tectonics such as Harry Hess, Frederick Vine, Drummond Matthews, even Heezen. It's unclear if Tharp's comparative lack of recognition is due to her being female or because she was only one of many researchers working along similar lines. Her own comment from the era suggests that just being a woman scientist was reason enough to dismiss her work: she noted that other professionals viewed her ideas with attitudes ranging "from amazement to skepticism to scorn."

There are countless other examples that would serve as case studies, including women from non-Western nations, but these four show the variety of experiences women scientists underwent during the twentieth century, ranging from a level of misogyny that would be unthinkable today to early acceptance of the value of their work and treatment seemingly no different from that of their male colleagues. I was surprised to find such a range of circumstances and attitudes, proving that few things are as straightforward as they are frequently portrayed. These examples also show that whatever culture people grow up in, the majority consider its values to be perfectly normal; a little bit of thought - or hindsight - shows that just because something is the norm doesn't necessarily mean it's any good. When it comes to attitudes today, you only have to read the news to realise there's still some way to go before women in STEM are treated the same as their male counterparts.