Monday 18 October 2021

Volunteering for victory: can people power make New Zealand pest-free?

I've often discussed citizen science and how it varies from pie-in-the-sky research to projects with practical goals that may be achievable within a lifetime. When it comes to conserving native species, New Zealand has a plethora of public engagement projects, including the Garden Bird Survey and the Great Kererū Count (the latter being the country's largest citizen science project). In a nation that is slowly waking to the realisation that it is far from '100% Pure', concerted efforts are finally being made to secure a future for beleaguered native fauna (and, to a lesser extent, flora, although few are seemingly aware of the interdependencies).

In late 2016 I wrote a post about the Predator Free 2050 scheme, focusing on how impractical it seemed. There was good reason for this: the University of Auckland estimated that it would require an astonishing NZ$9 billion to implement, a figure approaching 4% of the national GDP. Five years on, it appears this pessimism was well founded, as the project looks woefully underfunded; according to its website, only NZ$178 million has so far been spent on the scheme. By comparison, the annual budget for controlling possums, rats and stoats is NZ$114m, so it could hardly be deemed a flying start. There are an estimated 30 million possums in New Zealand, never mind the mustelids, rodents and other lesser pest species; the obvious implication is that numbers this large will require equally immense resources to eradicate.
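
To put those numbers in context, here's a rough back-of-envelope comparison (a sketch only: it assumes the NZ$178 million covers roughly the five years since the scheme's 2016 launch and that the original NZ$9 billion estimate still stands):

```python
# Back-of-envelope comparison of Predator Free 2050 funding (illustrative assumptions only)
total_estimate_nzd = 9_000_000_000   # University of Auckland estimate for full eradication
spent_so_far_nzd = 178_000_000       # reported spend on the scheme to date
years_elapsed = 5                    # assumed: 2016 launch to 2021

annual_spend = spent_so_far_nzd / years_elapsed      # roughly NZ$36m per year
baseline_budget = 114_000_000                        # existing annual possum/rat/stoat control

print(f"Average PF2050 spend: NZ${annual_spend / 1e6:.0f}m per year "
      f"(baseline pest control: NZ${baseline_budget / 1e6:.0f}m per year)")
print(f"Years to reach NZ$9 billion at that rate: {total_estimate_nzd / annual_spend:.0f}")
```

Even allowing for the crudeness of these figures, the gap between the ambition and the money actually being spent is stark.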

So what's to prevent this from being just another example of 'doomscrolling', of which we have plentiful examples at the moment? After all, with billionaires now spending precious resources on racing to become astronauts - presumably superyachts are so last decade - it could be argued that those with the funds for the task just aren't interested in anything as mundane as conservation. It's often said that it is the people who make a place, and in the case of New Zealand it could just be the citizens - both with and without science - that make the difference. Kiwi ingenuity (that's the people, not the bird) and the 'number eight wire' mentality have enabled a young nation to punch well above its weight in so many fields. Can they do likewise in conservation?

Let's start with the science. New Zealand's rugged landscape requires a smart approach to predator control; there are so many reasons why flying thousands of bait-dropping helicopter missions would not be a good idea, not least due to the impossibility of funding them. Various projects are therefore now looking to lower the cost of poisoning and trapping, seeking robust maintenance-free solutions that can survive in the wilderness with minimal human intervention. From new thermal imaging cameras to auto-reset bait stations containing long-lasting toxins, research projects are showing that small-scale developments can make enormous differences to pest eradication. Hopefully, some of these devices will be out in the field in useful numbers within the next few years.

Often unsung heroes, thousands of New Zealand citizens are also doing unpaid conservation work. I've met various volunteers for the Department of Conservation who spend their weekends climbing up and down knotted ropes and wading through icy streams in order to replenish bait boxes and reset traps. Many are retirees and some are ex-military - it's physically demanding and not at all glamorous, but it can be very satisfying work. The nation has a long history of such volunteering, something which has escalated in the past forty years with the setting up of predator-free fenced mainland sanctuaries and small sanctuary islands. To date, there are approximately 120 such refuges for native wildlife, many having been initiated by their local community and now being sustained by volunteers.

Even businesses are belatedly getting in on the act, giving their staff paid workdays to undertake volunteering such as planting and weeding within sanctuaries and coastal rubbish clean-ups. Earlier this year (between lockdowns of course) I was lucky enough to spend a day on Motuihe Island in the Hauraki Gulf, one of a group of thirty or so volunteers removing the noxious invasive plant woolly nightshade. What was amazing was seeing small flocks of native birds such as saddleback/tīeke, New Zealand parakeet/kākāriki and whitehead/pōpokotea, as opposed to the usual one or two you might see elsewhere (such as in zoo enclosures). Clearly, the planting of native species and foreign pest eradication - including abseiling to reach some of the weeds - has paid off beautifully.

Small islands are one thing, but what about the mainland? The nation's capital, Wellington, might be claimed by its inhabitants to be leading the way. Predator Free Wellington is the umbrella organisation for a range of projects that are aiming to eradicate pests from 30,000 hectares in and around the city. Already possum free, the Miramar peninsula has been the starting point for rodent eradication, with almost 10,000 bait stations and traps placed at regular intervals, mostly in residential gardens. The project is labour-intensive but still costs millions, so the hope is that by setting an example of what can be achieved, other regions in the country will follow suit. Whether their local councils will prove as farsighted as the capital's remains to be seen.

As with climate change mitigation, it seems that engaging and motivating the general public will be the only way to achieve a predator-free New Zealand, whether in 2050 or, more likely, at some point later. If this seems a bit naive - and overly optimistic, especially when compared to my initial assessment in 2016 - then consider last year's incredible work by the population to contain COVID-19, which made New Zealand a frequent feature in international headlines, something that was previously a rare event. The 'team of five million' showed the naysayers (most of these, in my experience, being middle-aged white men) that even a relatively small group of people, globally speaking, could provide inspiration and be a role model to kick-start action elsewhere. If a lot of people take a little action, surely it can combine into an enormous amount of change? Much depends on the success - and cost - of Predator Free Wellington; if the nation's capital can achieve it, the snowball effect might just take off, with local groups of volunteers making up for the lack of support from government and big business.

What's in it for the volunteers, you might ask? The health benefits, from physical exercise to reducing stress and anxiety, are now well established. In addition, those who dedicate their spare time to unpaid conservation work can learn new practical skills, meet like-minded people, engage in teamwork and gain enjoyment from the sheer empowerment - knowing that you are actually achieving something useful. According to the Department of Conservation, it is estimated there are currently 200,000 active volunteers in this sector, which might not sound like a large number until you realise that it accounts for almost four percent of the New Zealand population!

Considering the history of the fenced reserves and sanctuary islands, it seems clear that motivating local communities can achieve wonders. If the Predator Free project is to succeed, we need widespread engagement of the general population. New Zealand is far from alone, but having lost over fifty bird species (more if you include the Chatham Islands), three lizards, three frogs, a bat, a freshwater fish, four plant species and numerous invertebrates, now is the time to act. Quite apart from the negative effects of pollution and habitat loss due to development, it is a sobering thought that invasive fauna are equally capable of inflicting immense damage on a previously isolated ecosystem. As this plaque shows, many species were lost prior to the landing of the first Europeans: the original human inhabitants of New Zealand arrived less than a thousand years ago, but a combination of the introduced Polynesian rat and Polynesian dog, and their own hunting prowess, rapidly kick-started the extinctions.

Well, this is my last post, at least for a while. After twelve years I've learnt an enormous amount, but my sustainability champion voluntary work - engaging with over 5,000 work colleagues on climate change mitigation and wider environmental issues - is taking up my spare time. If there is a moral to this story, it's a simple one: let's act - now!


Wednesday 15 September 2021

Life in a rut: if microbes are commonplace, where does that leave intelligent aliens?

A few years ago I wrote about how Mars' seasonal methane fluctuations suggested - although far from confirmed - that microbial life might be present just under the Martian surface. Now another world in our solar system, the Saturnian moon Enceladus, has ignited discussion along similar lines.

The Cassini probe conducted flybys of Enceladus over a decade, revealing that Saturn's sixth largest moon was venting geyser-like jets of material, including water vapour, from its southern polar region. The material being emitted from these vents also included organic compounds and methane, hinting that this distant moon's watery oceans may also contain alien methane-producing microbes. Whereas Titan and Europa were originally deemed the moons most suitable for life, Enceladus's status has now been boosted to second only to Mars, with conditions not dissimilar to those in the oceans of the early Earth.

Of course, unknown geochemical processes cannot be ruled out, but nonetheless the quality of the evidence is such as to invite further exploration of Enceladus. There have been at least seven potential mission designs proposed by various bodies, including NASA and ESA, to gain more information about the moon and its geysers. Several of these include landers, while others would fly through a plume in order to examine the vented material for biosignatures. However, to date none have received official funding confirmation. As it stands the first probe to arrive might be billionaire Yuri Milner's privately-funded Breakthrough Enceladus, rather than one from a major organisation. However, don't hold your breath: the earliest any of these missions is likely to reach Enceladus is at some point in the 2030s.

What happens if future probes find evidence of microbial life on both Mars and Enceladus? Or even, whenever a method is found to reach it, in the ice-covered oceans of Jupiter's moon Europa? The first key question will be whether such organisms are genetically independent of Earth biota or whether the panspermia hypothesis - the delivery of microbes via cometary and meteorite impacts - has been proven. If panspermia turns out not to be the case and multiple instances of life arose separately within a single solar system, this has some profoundly mixed implications for the search for extraterrestrial intelligence (SETI). After all, if simple life can arise and be sustained on three or even four very different worlds - including bodies far outside their solar system's 'Goldilocks zone' - then shouldn't this also imply a much higher chance of complex alien life evolving on exoplanets?

Yet despite various SETI programmes over the past few decades, we have failed to pick up any signs of extraterrestrial intelligence - or at least any from technological civilisations prepared to communicate with radio waves, whether in our galactic neighbourhood or via super high-powered transmitters further away. This doesn't mean they don't exist: advanced civilisations might use laser pulses at frequencies our SETI projects currently don't have the ability to detect. But nonetheless, it is a little disheartening that we've so far drawn a blank. If there is microbial life on either Mars or Enceladus - or even more so, on both worlds, never mind Europa - then a continued lack of success for SETI suggests the chances of intelligent life evolving are far lower than the probability of life itself arising.

In effect, this means that life we can only view through a microscope - and therefore somewhat lacking in cognitive ability - may turn out to be common, while intelligence is a much rarer commodity. It might be easy to argue that life on Enceladus or Mars would stand little chance of gaining complexity, given the unpleasant environmental conditions that have no doubt existed for much of their history; but even here, Earth's biota has evolved via a complex series of unique events - the tortuous pathways of history have shaped the evolution of life on our own planet.

Whereas the discovery of so many exoplanets in the past decade might imply an optimistic result for the Drake equation, the following factors, being largely unpredictable, infrequent or unique occurrences, might suggest that the evolution of complex (and especially sapiens-level intelligent) life is highly improbable (the standard form of the equation is sketched after the list):

  • The Earth orbits inside the solar system's Goldilocks zone (bear in mind that some of the planets have moved from the region of space they were created in) and so water was able to exist in liquid form after the atmospheric pressure became high enough.
  • The size and composition of the planet is such that radioactivity keeps the core molten and so generates a magnetic field to block most solar and cosmic radiation.
  • It is hypothesised that the Earth was hit by another body, nicknamed Theia, that both tilted the planet's axis and caused the formation of the Moon rather than having a catastrophic effect such as tearing our world apart, knocking it on its side (like Uranus) or removing its outer crust (like Mercury).
  • The Moon is comparatively large and close to the Earth and as such their combined gravitational fields help to keep Earth in a very stable, only slightly eccentric orbit. This in turn has helped to maintain a relatively life-friendly environment over the aeons.
  • The Earth's axial tilt causes seasons and as such generates a simultaneous variety of climates at different latitudes, providing impetus for natural selection.
  • The Great Unconformity and the hypothesised near-global glaciation (AKA Snowball Earth) that might have caused it suggest that this dramatic period of climate change led to the development of the earliest multicellular life around 580 million years ago.
  • Mass extinctions caused rapid changes in global biota without destroying all life. Without the Chicxulub impactor for example, it is unlikely mammals would have radiated due to the dominance of reptiles on the land.
  • Ice ages over the past few million years have caused rapid climate fluctuations that may have contributed to hominin evolution as East African forests gave way to grasslands.
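
For reference, the standard form of the Drake equation simply multiplies a star-formation rate by a chain of fractions and a lifetime (the values of most of these terms remain essentially unknown, so any numbers plugged in are illustrative at best):

\[ N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L \]

Here N is the number of detectable civilisations in the galaxy, R* the rate of suitable star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable worlds per system, f_l the fraction of those on which life appears, f_i the fraction of those developing intelligence, f_c the fraction producing detectable technology and L the lifetime of such a civilisation. Exoplanet surveys are nudging f_p and n_e upwards; the contingencies listed above bear mainly on f_l and f_i.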

The evolutionary biologist Stephen Jay Gould often discussed 'contingency', claiming that innumerable historical events had led to the evolution of Homo sapiens and that if history could be re-run, most possible paths would not lead to a self-aware ape. So despite the 4,800 or so exoplanets discovered so far, some within their system's Goldilocks zone, what is the likelihood of a similar concatenation of improbable events occurring on any of them?

Most people are understandably not interested in talking to microbes. For a start, they are unlikely to gain a meaningful reply. Yet paradoxically, the more worlds microbial life is confirmed on - when combined with the distinct failure of our SETI searches to date - the easier it is to be pessimistic; while life might be widespread in the universe, organisms large enough to view without a microscope, let alone communicate with across the vast reaches of interstellar space, may be exceedingly rare indeed. The origin of life might be a far more common occurrence than we used to think, but the evolution of technological species far less so. Having said that, we are lucky to live in this time: perhaps research projects in both fields will resolve this fundamental issue within the next half century. Now wouldn't that be amazing?

Wednesday 18 August 2021

Mushrooms to Mars: how fungi research could help long-duration space travel

I've often noted that fungi are the forgotten heroes of the ecosystem, beavering away largely out of sight and therefore out of mind. Whether it's their ability to break down plastic waste or their use as meat substitutes and pharmaceuticals, this uncharismatic but vital group of organisms no doubt holds many more surprises in store for future research to discover. It's estimated that less than ten percent of all fungi species have so far been scientifically described; it's small wonder then that a recent study suggests an entirely new use for several types of these under-researched organisms.

Investigation of the Chernobyl nuclear power station in 1991 found that Cladosporium sphaerospermum, a fungus first described in the late nineteenth century, was thriving in the reactor cooling tanks. In other words, despite the high levels of radiation, the species was able to not only repair its cells but maintain a good rate of growth in this extreme environment. This led to research onboard the International Space Station at the end of 2018, when samples of the fungus were exposed to a month of cosmic radiation. The results were promising: a two millimetre thick layer of the fungus absorbed nearly two percent of the radiation compared to a fungus-free control.

This then suggests that long-duration crewed space missions, including to Mars, might be able to take advantage of this material to create a self-repairing radiation shield, both for spacecraft and within the walls of surface habitats. A twenty-one centimetre thick layer was deemed effective against cosmic rays, although this could potentially be reduced to just nine centimetres if the fungal mycelia were mixed with similar amounts of Martian soil. In addition, there is even the possibility of extracting the fungus's radiation-absorbing melanin pigment for use in items that require much thinner layers, such as spacesuit fabric.
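
As a crude illustration of how those thicknesses relate to the ISS measurement, here is a back-of-envelope sketch. It assumes simple exponential (Beer-Lambert-style) attenuation, which galactic cosmic rays - with their showers of secondary particles - don't strictly obey, and it ignores the regolith mixed into the nine-centimetre case, so treat the output as indicative only:

```python
import math

# Reported ISS result: a ~2 mm layer of C. sphaerospermum absorbed ~2% of incident radiation
measured_thickness_mm = 2.0
measured_transmission = 0.98                  # i.e. ~2% absorbed

# Assume simple exponential attenuation: T(x) = exp(-mu * x)
mu = -math.log(measured_transmission) / measured_thickness_mm     # attenuation per mm

for thickness_cm in (9, 21):
    transmission = math.exp(-mu * thickness_cm * 10)              # convert cm to mm
    print(f"{thickness_cm} cm of fungus: ~{(1 - transmission) * 100:.0f}% absorbed")
```

Even this toy model shows why the shielding being discussed is on the scale of tens of centimetres (or a thinner layer bulked out with Martian regolith) rather than a millimetre-scale coating.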

If this sounds too good to be true, there are still plenty of technological hurdles to be overcome. Science fiction has frequently described the incorporation of biological elements into man-made technology, but it's early days as far as practical astronautics is concerned. After all, there is the potential for unique dangers, such as synthetic biology growing unstoppably (akin to scenarios of runaway nanobot replication). However, NASA's Innovative Advanced Concepts program (NIAC) shows that they are taking the idea of fungi-based shielding seriously, the current research considering how to take dormant fungal spores to Mars and then add water to grow what can only be described as myco-architecture elements - even interior fittings and furniture. In addition to the radiation shielding, using organic material also has the advantage of not having to haul everything with you across such vast distances.

Even more ideas are being suggested for the use of similarly hardy species of fungi on a Mars base, from bioluminescent lighting to water filtration. Of course, this doesn't take into account any existing Martian biology: the seasonal methane fluctuations that have been reported are thought by some to be too large to have a geochemical cause, suggesting that somewhere in the sink holes or canyon walls of Mars there may be colonies of methane-producing microbes, cosily shielded from the worst of the ultraviolet. If this proves to be the case, you would hope that any fungi taken to the red planet would be genetically modified to guarantee that they couldn't survive outside the explorers' habitats and so damage Martian biota. Humanity's track record when it comes to preserving the ecosystems of previously isolated environments is obviously not something we can be proud of!

What fungi can do alone, they also do in symbiosis with algae, i.e. as lichens. Various experiments, including the LIchens and Fungi Experiment (LIFE) on the International Space Station (incidentally, doesn't NASA love its project acronyms?) have tested extremophile lichens such as Xanthoria elegans and Rhizocarpon geographicum in simulated Martian environments for up to eighteen months. The researchers found that the organisms could remain active as long as they were partially protected, as if they were growing in sink holes beneath the Martian surface. Of course, this success also enhances the possibility of similar lifeforms already existing on the red planet, where they would have had aeons in which to adapt to the gradually degraded conditions that succeeded Mars' early, clement, phase.

The CRISPR-Cas9 system and its successors may well be used to develop synthetic fungi and lichens for use both on and especially off the Earth, but we shouldn't forget that Mother Nature got there first. Spacecraft shielding and myco-architecture based on natural or genetically modified organisms may prove to be an extremely efficient way to safeguard explorers beyond our world: the days of transporting metal, plastic and ceramic objects into space may be numbered; the era of the interplanetary mushroom may be on the horizon. Now there's a phrase you don't hear every day!


Sunday 18 July 2021

The uncertainty principle: does popular sci-comm imply more than is really known?

Over the years I've examined how ignorance in science can be seen as a positive thing and how it can be used to define the discipline, a key contrast to most religions. We're still a long way from understanding many fundamental aspects of the universe, but the religious fundamentalist (see what I did there?) mindset is seemingly unable to come to terms with this position and so incorporates lack of knowledge into arguments disparaging science. After all, the hackneyed train of thought goes, scientific theories are really only that - ideas, not something proven beyond all possible doubt. Of course this isn't the case, but thanks to the dire state of most school science education, with the emphasis on exams and fact-stuffing rather than analysis of what science really is (a group of methods, not a collection of facts) - let alone anything that tries to teach critical thinking - you can see why some people fall prey to such disinformation, i.e. that most science isn't proven to any degree of certainty.

With this in mind, you have to wonder what percentage of general audience science communication describes theories with much more certainty than is warranted, when instead there is really a dearth of data that creates a partial reliance on inferred reasoning. Interestingly, the complete opposite used to be a common statement; for example, in the nineteenth century the composition of stars was thought to be forever unknowable, but thanks to spectroscopy that particular wonder came to fruition from the 1860s onwards. It is presumably the speed of technological change today that has reduced that negativity, yet it can play into the anti-rationalist hands of religious hardliners if scientists claim absolute certainty for any particular theory (the Second Law of Thermodynamics excepted). 

As it is, many theories are based on a limited amount of knowledge (both evidential and mathematical) and rely on experts filling in the gaps. As an aside, the central tenet of evolution by natural selection really isn't one of these: the various sources of evidence, from fossils to DNA, provide comprehensive support for the theory. However, there are numerous other areas which rely on a smattering of physical evidence and a lot of inference. This isn't to say the latter is wrong - Nobel-winning physicist Richard Feynman once said that a scientific idea starts with a guess - but to a non-specialist this approach can appear somewhat slapdash.

Geophysics appears to rely on what a layman might consider vague correlations rather than exact matches. For example, non-direct observation techniques such as measuring seismic waves have allowed the mapping of the interior composition of the Earth; unless you are an expert in the field, the connection between the experimental results and clear-cut zones seems more like guesswork. Similarly, geologists have been able to create maps of the continental plates dating back around 600 million years, before which the position of land masses is not so much vague as completely unknown.

The time back to the Cambrian is less than fifteen percent of the age of our 4.5 billion year old planet. This (hopefully) doesn't keep the experts up at night, as well-understood geophysical forces mean that rock is constantly being subducted, to be transformed and so lost to the record. In addition, for its first 1.3 billion years the planet's surface would have been too hot to allow plates to form. Even so, the position of the continental crust from the Cambrian period until today is mapped to a high level of detail at frequent time intervals; this is because enough is known of the mechanisms involved that if a region at the start of a period is in position A and is later found at position Z, it must have passed through intermediate positions B through Y en route.

One key geological puzzle related to the building and movement of continental rock strata is known as the Great Unconformity, essentially a 100 million year gap in the record that occurs in numerous locations worldwide for the period when complex multicellular life arose. In some locales the period expands both forwards and backwards to as much as a billion years of missing rock; that's a lot of vanished material! Most of the popular science I've read tends to downplay the absent strata, presumably because in the 150 years since the Great Unconformity was first noticed there hasn't been a comprehensive resolution to its cause. The sheer scale of the issue suggests a profound level of ignorance within geology. Yes, it is a challenge, but it doesn't negate the science in its entirety; on the other hand, it's exactly the sort of problem that fundamentalists can use as ammunition to promote their own versions of history, such as young Earth creationism.

In recent decades, the usually conservative science of geology has been examining the evidence for an almost global glaciation nicknamed 'Snowball Earth' (or 'Slushball Earth', depending on how widespread you interpret the evidence for glaciation). It appears to have occurred several times in the planet's history, with the strongest evidence for it occurring between 720 and 635 million years ago. What is so important about this era is that it is precisely the time (at least in geological terms) when, after several billion years of microbial life, large and sophisticated multicellular organisms rapidly evolved during the inaccurately-titled Cambrian explosion.

All in all then, the epoch in question is extremely important. But just how are the Great Unconformity, global glaciation and the evolution of complex biota connected? Since 2017, research - including work from three Australian universities - has led to the publication of the first tectonic plate map centred on this critical period. Using various techniques, including measuring the oxygen isotopes within zircon crystals, the movements of the continents have been reconstructed further back in time than ever before. The resulting hypothesis is a neat one (perhaps overly so, although it appears to be tenable): the top 3km to 5km of surface rock was first eroded by glacial activity, then washed into the oceans - where the minerals kick-started the Ediacaran and early Cambrian biota - before being subducted by tectonic activity.

The conclusion doesn't please some skeptics, but the combined evidence - including the erosion of impact craters and a huge increase in sedimentation during the period - gives further support, with the additional inference that a vast expanse of new shallow marine environments (thanks to the eroded material raising the seafloor) became available as ecological niches. In addition, the glacial scouring of the primary biominerals calcium carbonate, calcium phosphate and silicon dioxide into the oceans altered the water chemistry and could have paved the way for the first exoskeletons and hard shells, both by providing their source material and by generating a need for them in the first place, as protection from that altered chemistry.

Deep-time thermochronology isn't a term most of us are familiar with, but the use of new dating techniques is beginning to suggest solutions to some big questions. Not that there aren't plenty of other fundamental questions (the nature of non-baryonic matter and dark energy, anyone?) still to be answered. The scale of the unknown should not be used to denigrate science; not knowing something doesn't mean science isn't the tool for the job. One of its more comforting aspects (at least to its practitioners) is that good science always generates more questions than it answers. To expect simple, easy, straightforward solutions should be left to other human endeavours that relish just-so stories. While working theories are often elegant and simpler than the alternatives, we should see filling in the gaps as a necessity, not a weakness to be used to invalidate the scientific method or its discoveries.

Tuesday 15 June 2021

Meat-free marvels: does a vegetarian diet reduce your risk of disease?

Is it me, or are there times when contemporary diet trends appear to verge on pseudoscientific crankery? While I briefly mentioned potentially dangerous items such as raw water and unpasteurised milk a few years ago, it's surprising how many fad diets in developed nations bear a suspicious resemblance to the traditional ingredients of non-Western societies.

Superfoods are a particularly overhyped element of this faddish arena; the marketing suggests they can help achieve perfect 'balance' and 'wellness' in the body. Some assertions go much further, with consumption of the likes of kombucha claimed as something of a miracle cure. While the pseudocereal quinoa is sold in the West as the 'grain of the gods', it is unlikely to give the partaker any super powers. It certainly didn't save the Inca and Anasazi - who cultivated it in pre-Columbian America and apparently suffered from disease and famine as much as any other society - from the rapid collapse of their civilisations.

There is a scientific basis for recommending certain non-meat items, from the antioxidants in tea and coffee to the vitamin D in mushrooms, while various plants and vegetable oils contain Omega-3 fatty acids. But a recent report has concluded that a vegetarian diet may have a marked positive effect on overall health compared to one with regular meat consumption. The research was conducted by the University of Glasgow, with the data showing substantial reductions in disease biomarkers for non-meat eaters. However, it was unable to provide an underlying reason for the positive results, once risk factors such as age, alcohol and nicotine intake had been accounted for. Cholesterol and products linked to increased risk of cancers, cardiovascular disease, and liver and kidney problems were all lower in vegetarians.

Apart from suggesting that vegetarians eat more fibre, fruit, vegetables and nuts - some of which have known health benefits - the report's conclusion also noted that, in addition to the positive effect of these items, avoiding processed meat products and red meat may have contributed to the results. As someone who hasn't eaten meat in over thirty years, I find the research extremely interesting, although I think there are many other factors that should be considered, with the report forming just part of the debate.

For example, the data was drawn from c.420,000 people living in the UK alone, rather than from a variety of nations and environments. In the past century the diet and lifestyle of most people in the West has changed enormously, with the emphasis on quick-to-prepare meat dishes such as burgers and sausages remaining at the forefront despite the replacement of physically demanding lives with predominantly sedentary ones. In other words, the diet hasn't changed to match the alteration in lifestyle. It's little wonder that obesity has outranked malnourishment in some nations.

In addition, it is thought that several billion people, predominantly in less developed regions, consume insect protein on a regular if not daily basis. This is a profoundly different diet to those of Western meat eaters with the latter's concentration on domesticated species such as cattle and horse, sheep/goat, poultry, etc. Although game, bush meat and exotic species such as crocodile are eaten in many regions, these are a much smaller element of the human diet. 

In contrast, vegetarians in many regions can eat an enormous variety of plants and fungi. The geographic and seasonal availability of many fruit and vegetables is expanding too: until a few years ago I hadn't heard of jackfruit, but it is now available as the tinned unripe variety from many stores here in New Zealand. So in both time and in space, there's no such thing as a typical vegetarian diet! This also doesn't include the differences between lacto-vegetarians and vegans; it would definitely be rather more time-consuming to plan a diet with an adequate mix of proteins in the absence of eggs and dairy products. It would therefore be interesting to conduct research to find out the health differences between these two groups.

Although some of the blame for poor health and obesity has been placed on processed and refined foods, there is an ever-increasing array of prepared vegetarian products, often marketed as meat substitutes for meatatarians wanting to cut down on their consumption of animal flesh. My daughters (regular meat eaters) and I have a penchant for fake bacon made of wheat, pea and soy and I also eat a variety of meat-free sausages and burgers as well as Quorn products. 

Many companies are now getting on the bandwagon, with products that aim to replicate the taste and texture of the real thing. Some brands such as Beyond Meat and Impossible Foods have seen a rapid rise to international success, while the UK bakery chain Greggs has benefitted from its tasty (if high-fat) Quorn-based vegan sausage roll becoming one of its top five selling products. The range of processed foods suitable for vegetarians has therefore grown out of all proportion to what was available several decades ago. Could it be that these may have detrimental health effects compared to the less refined ingredients traditionally eaten by Western vegetarians (and still eaten in developing nations)?

Just as there are shedloads of books claiming that epigenetics will allow you to self-improve your DNA through your lifestyle, diet gurus play upon similar fears (and gullibility) to encourage people to eat all sorts of weird stuff that at best maintains equilibrium and at worst can lead to serious health issues. I personally think that far wider research, undertaken in all sorts of regions and societies, needs to be done before a vegetarian diet can be claimed to be distinctly superior to a meat-based one. Of course, a reduction in ruminant farming is good for the planet in general - both for saving water and reducing methane - but as far as diet equates to health, I still think that moderation and a sensible attitude are key. Nevertheless the Glasgow study certainly is...wait for it...food for thought!

Friday 14 May 2021

Weedbusting for a better world: the unpleasant truth about invasive plants

There's been a lot written about New Zealand's Predator Free 2050 programme, including my own post from 2016, but while the primary focus has been on fauna, what about the invasive species of flora? Until recently it was easy to think of plants as the poor man's animals, with little in the way of the complex behaviour that characterises the life of vertebrates and many invertebrates. However, that's been changing thanks to studies that show the life of plants is actually rather complex - and includes the likes of chemical signalling. Although plants might not have the emotional impact of animals, land vegetation alone has about one thousand times the mass of terrestrial fauna. So they're important - and then some!

A few months ago I was volunteering on the sanctuary island of Motuihe, less than an hour's boat ride from downtown Auckland. Our group was charged with cutting down woolly nightshade, a soil-poisoning plant native to South America. Destroying these evil-smelling shrubs made me wonder how and why they were introduced to New Zealand in the first place, considering they don't look particularly attractive and their berries are poisonous to humans. Like so many exotic plant species, it was apparently deliberately introduced as a decorative garden plant, though frankly I can't see why.

As with many similar stories from around the world, New Zealand has been inundated with large numbers of non-native floral species. Unlike woolly nightshade, some were introduced for practical purposes, such as radiata pine for timber and gorse for hedging, while others were accidentally brought in as seeds in soil. In many cases these are stories of greed and incompetence, for which later generations have paid a heavy price.

Although there were pioneering lone voices who from as early as the late nineteenth century could see the deleterious effects of exotic plant species on native vegetation, it wasn't until the last half century that any serious effort was made to promote their removal. British botanist and presenter David Bellamy was one of the first scientists to popularise this message, starring in a 1989 television advert to explain why Clematis vitalba (AKA Old man's beard) needed eradicating. Bellamy then went on to present the tv series Moa's Ark, which drew attention to the country's unique biota and the dangers it faced from poorly managed development. 

Given his botanical background, it's perhaps not surprising that, rather than seeing plants as the background to dramas of the animal kingdom, Bellamy made them central to the ecosystem, claiming that we should put nature before culture. Again, although lacking the dynamic aspects of fauna, invasive weeds (by definition, aren't weeds just plants in the wrong place?) such as Old man's beard can gain up to ten metres in a single growing season. You only have to look around a suburban garden - mine included - to see that constant vigilance is required to remove the likes of self-seeded wattle and climbing asparagus before they take hold and smother native species.

It isn't just on land that we face this issue: freshwater systems can easily be choked by the likes of Elodea canadensis, a North American pondweed that has escaped from its ornamental aquarium environment (thanks to highly irresponsible people, of course) and been spread by boats and fishing equipment, clogging and stagnating streams and lakes. What is worrying is that it is far from being the worst of the fifty or so non-native aquatic plants that threaten New Zealand's waterways. Considering that around three-quarters of all invasive species in this environment have a detrimental effect, the point is clear: introduced flora can do serious harm.

So what can be - and is being - done? Thanks to numerous volunteer groups, sanctuaries for rare native species (principally fauna, but occasionally flora too) are keeping invasive weeds at bay. Outside these protected environments, annual weeding programmes aim to reduce wilding pine, but the issue here is that commercial interest still maintains the upper hand. Whether for timber plantations or carbon sequestration, species such as Douglas fir continue to be planted, allowing the seed to spread to new areas far and wide on the wind. Luckily, there are numerous websites and other online resources to help New Zealanders identify and destroy pest plants.

Clearly, this isn't an issue that will ever go away. With most Government-led efforts focusing on pest animal species, eradicating invasive plants has been given far less support and so they remain comparatively unknown. Perhaps it would be good if schools undertook a compulsory programme, including practical work, in the identification and removal of non-native pest flora? Trapping and poisoning invasive animals can be a complex business, but weeding is child's play by comparison. Everyone can help out: in effect, this is a form of citizen science that has a positive practical effect on the environment. Why not start with your garden today?


Thursday 1 April 2021

Zapping zombies: how the US military uses the entertainment industries as a recruitment tool

We hear a lot about gamification these days. As video games edge closer to simulating the real world, while Hollywood blockbusters seem to resemble video games more and more, it's little wonder that businesses are using the gaming concept as a learning tool. If anyone has noticed an eerie similarity between the plethora of military sci-fi movies, combat video games and the technology used by the United States' armed forces, then you might be interested to learn that this is no coincidence.

Developed at MIT in 1962, Spacewar! is frequently cited as the earliest combat video game. Of course, it was developed for mainframe computers and so it took a long time before high enough quality visuals - with sound effects - could be installed in gaming arcades, followed in the early 1980s by games written for the first generation of ready-assembled home microcomputers.

Hollywood capitalised on the rapidly burgeoning video game market - both at home and in arcades - via movies such as 1984's The Last Starfighter, in which an expert arcade player finds himself recruited into an alien war. In other words, the game he excels at is really a simulator designed to discover and hone players who can then use their gaming skills in genuine space combat.

So how does this fiction compare to the real world? Specialist aviation publications have been full of articles with titles such as 'Do Gamers Make Better Drone Operators Than Pilots?' - the answer being that, in addition to obvious skills such as good hand-eye coordination, gamers are used to not being at personal risk while playing (except possibly RSI) and so remain calm under pressure. The conclusion is that these traits may give them an edge in controlling drones, although not, it has to be said, larger, manually piloted aircraft.

The big question is: how deep is the military involvement in the development, promotion and assessment of video games that hone combat skills? The relationship certainly appears to go back many decades, considering that the MIT graduate students who developed Spacewar! were funded by the Pentagon. With the development of much more lifelike virtual worlds, the US military has taken a leading role in both producing games that hone useful skills and creating realistic simulators for training its warfighters.

There is a complex feedback loop between these two spheres, and in 1999 the Department of Defense set up the Institute for Creative Technologies to work across them. Games such as Full Spectrum Warrior (2003) and its non-commercial officer-training stablemate Full Spectrum Command attempted to portray realistic combat scenarios, facing enemies who frequently resemble their real-life counterparts.

America's Army (2002) was the first of a series of (initially free) video games that began as propaganda and recruitment tools and then became a widespread commercial franchise. Marines and Special Forces soldiers were amongst those combat veterans involved in the development of these games. In addition, the developers were allowed to scan weapons (in order to build realistic digital simulations) and even shoot them on a firing range so as to experience the physical attributes at first hand. Needless to say, the potential for glorification of violence led to opposition from various quarters.

It isn't just the software that has crossed over between the military and civilian life: weaponry and control systems also feed back between the real world and combat simulations, easing the move from game playing to the genuine article. Of course, skills such as leadership and team cooperation are also being honed by these games. The idea is that they reduce the cost of recruitment and training; the free version of America's Army, with 1.5 million downloads in its first month (and a whopping 40 million over the following six years), proved just how effective they could be.

Going in the other direction, US armed forces personnel have taken part in campaigns such as Operation Phantom Fury, which, let's face it, has more than a touch of the Xbox or PlayStation about it. I assume this is also part of the process to ensure a smooth transition between young combat game players and activities in the real-world military. The channel is unlikely to diminish any time soon, seeing as China is now following America's lead; its Glorious Mission online video game, aimed at potential recruits as well as enlisted service personnel, already has over 300 million players.

The US military gaming sector has also started to diversify. To minimise complaints - already prevalent in the gaming sector, due to the implacable enemy often being a group of Muslim fundamentalists - there needed to be a new target that wouldn't raise the ire of any particular nation or ethnic group. To this end, the Call of Duty series of games has introduced reanimated dead soldiers, AKA zombies, as opponents. Bearing in mind that in the past ten years there have been over fifty video games featuring zombie antagonists, it's clear that this theme is just as popular as invading aliens and terrorist zealots. Perhaps it's not surprising that doomsday preppers and survivalist groups are often said to be getting ready for the zombie apocalypse!

Recently released - although heavily redacted - files suggest that as well as developing and promoting video games centred on combat simulation, the Department of Defense has also secretly collected players' data in order to understand their demographics. This is presumably in order to tailor recruitment and training programmes for those with a gaming background. The same information also hints that Hollywood too is being used by the military-industrial complex to promote its own agenda. It sounds a bit far-fetched, but the facts speak for themselves.

The US military have long taken an interest in how Hollywood portrays them. Ronald Reagan's White House had screenings of Red Dawn (1984) and WarGames (1983), with the former gaining the Pentagon's approval while the latter was not well received (hardly surprising, if you know the plot). Gung-ho space marine movies started back in the mid-1980s with the likes of Predator and Aliens, but really took off in the mid-1990s with blockbusters such as Independence Day, Stargate and Starship Troopers.

Hollywood hasn't looked back since, and as well as the US military fighting off hordes of alien invaders, there are plenty of zombie movies - over 170 worldwide over the past decade - along with numerous zombie-themed tv series. Of course, this genre usually features civilians fighting against the living dead, but nonetheless the firearm-laden format resembles its military counterparts. Critics have been keen to note that just as the alien invasion films of the 1950s and 1960s were thinly-disguised Cold War allegories, so zombie movies carry a subtext about the unpredictable nature of global terrorism - and imply that readiness to engage the perceived enemy is a patriotic duty.

So what is the underlying connection between these genres and the Pentagon? Even a minimum of research will reveal that a fair number of the Department of Defense's advanced weaponry projects, from the F-22 Raptor tactical fighter to the Global Hawk surveillance UAV (that's an Unmanned Aerial Vehicle to you and me), have been truncated, in both these cases with only about half the number of units being built compared to the original proposals. The funding for those cancelled vehicles is being redirected elsewhere and Hollywood is the most likely recipient, the money being used for both movies and tv shows that follow the DoD agenda.

And how does the Pentagon know it's getting value for money? As more people book cinema tickets online and via their smartphones, the DoD is able to build frighteningly detailed profiles of those adolescents with the aptitude and skills they are looking for. Thanks to tv subscription services, it is also much easier to see exactly who is watching how much of what.

By immersing America's youth in popular entertainment across a variety of channels that both gives a homely familiarity to the military and allows niche targeting for potential recruits, the Pentagon is saving money on blanket advertising while promoting its own values as a mainstream cultural element. Thanks to a business culture that embeds military-derived phrases ('locked and loaded', 'SWAT team', 'strategic planning', etc) the distance between the armed forces and civilian life has been much reduced since the anti-war ethos of the 1970s. So if you're a teenager who plays certain types of video games and/or watches these sorts of movies and tv shows, don't be surprised if you start receiving recruitment adverts tailored closely to your personality profile. To paraphrase the Village People: they want you as a new recruit!

Monday 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how disagreements with religious texts can lead the devout to disagree with key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, that is used to promote Creationist ideas rather than scientifically-gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species, and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or overly simplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - but only a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. However, even this is not completely accurate. Darwin's hypothesis was more complete than Wallace's, in the sense of being entirely scientific and therefore testable; Wallace, on the other hand, maintained there must have been divine intervention in the creation of our species, making us different from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection, but who were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life', which he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense, he is a minor figure compared to the thorough research Darwin undertook to back up his hypothesis.

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said for Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which, although highly speculative, contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century contained other combinations of ideas and real-world inspiration that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine the inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider it to be superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on the way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to state that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidnas. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals 187 million years ago; therefore, they have spent approximately three times as long on their own evolutionary journey as they did as part of the other mammalian lineages. As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. But we still include them amongst the mammals rather than as a hybrid or separate class; both platypus and echidna have fur, are warm-blooded and produce milk (although with a unique delivery system!) This allows their inclusion in Mammalia - but does this mean we arbitrarily allow certain traits and discard others?

Would it be fair to say that the boundaries we make between organisms are more for our convenience than the underlying reality? Are you happy to label birds as 'avian dinosaurs' and if not, why not? With feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was teeth, a bony tail and a crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which along with the possible discovery of bird-like air sacs, means that they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including leatherback turtles, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others to be endothermic (i.e., fully warm-blooded). In this respect then, some meat-eating dinosaurs especially may have had more in common with us mammals than they did with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate the history of life. Science communication to the public is fundamental to our society, but it needs to present the awkward complexities of evolution, with all the tortured pathways of natural selection, if it is not to fall victim to those who prefer myths of the last few thousand years to the history of countless millennia, as revealed in the genes and rocks waiting for us to explore.


Friday 19 February 2021

Science, society & stereotypes: examining the lives of trailblazing women in STEM

I was recently flicking through a glossily illustrated Australian book on the history of STEM when I found the name of a pioneer I didn't recognise: Marjory Warren, a British surgeon who is best known today as the 'mother of modern geriatric medicine'. Looking in the index I could find only two other women scientists - compared to over one hundred and twenty men - in a book five hundred pages long! The other two examples were Marie Curie (of course) and the American astronomer Vera Rubin. Considering that the book was published in 2008, I was astounded by how skewed this seemed to be. Granted, prior to the twentieth century few women had the option of becoming involved in science and mathematics; but for any history of STEM, wouldn't the last century supply the largest proportion of subject material?

I therefore thought it would be interesting to choose case studies from the twentieth century to see what sort of obstacles - unique or otherwise - women scientists faced until recently. If you ask most people to name a female scientist then Marie Curie would probably top the list, although a few countries might have national favourites: perhaps Rosalind Franklin in the UK or Rachel Carson in the USA, for example. Rather than choosing the more obvious candidates such as these, I have selected four women I knew only a little about, ordered by their date of birth.

Barbara McClintock (1902-1992) was an American cytogeneticist who was ahead of her time in terms of both research and social attitudes. Although her mother didn't want her to train as a scientist, she was lucky to have a father who thought differently to the accepted wisdom - which was that female scientists would be unable to find a husband! McClintock's abilities showed early in her training, leading to post-graduate fellowships which in turn generated cutting-edge research.

At the age of forty-two, Barbara McClintock was only the third woman to be elected to the US National Academy of Sciences. However, her rapid rise within the scientific establishment didn't necessarily assist her: such was the conservative nature of universities that women were not allowed to attend faculty meetings. 

After publishing her research to broad acceptance, McClintock moved into what today would come under the umbrella term of epigenetics. Several decades ahead of its time, this work was seen as too radical by most of her peers, and after facing intense opposition she temporarily stopped publishing her results. It is unlikely that being a woman was entirely responsible for the hostility to her work; similar resistance has frequently been experienced across the STEM avant-garde. It seems that only when other researchers found results similar to McClintock's did the more hidebound sections of the discipline re-examine their negative attitude towards her work.

There has been a fair amount of discussion as to whether it was McClintock being female, her secretive personality (at home as well as at work, for she never married), or a combination of both that delayed her receipt of the Nobel Prize in Physiology or Medicine. Even by the slow standards of that particular awards committee, 1983 was rather late in the day. However, by then she had already been the recipient of numerous other awards and prizes.

Regardless of the recognition it gave her, Barbara McClintock relished scientific research for the sake of uncovering nature's secrets. In that regard, she said: "I just have been so interested in what I was doing and it's been such a pleasure, such a deep pleasure, that I never thought of stopping...I've had a very, very, satisfying and interesting life."

Tikvah Alper (1909-1995) was a South African radiobiologist who worked on prions - otherwise known as 'misfolded' or 'rogue' proteins - and their relationship to certain diseases. Her outstanding abilities were recognised early, allowing her to study physics at the University of Cape Town. She then undertook post-graduate work in Berlin with the nuclear fission pioneer Lise Meitner, only to be forced to leave before completing her doctorate due to the rise in anti-Semitism in Germany.

Having had her research curtailed by her ethnicity, Alper was initially also stymied on her return to South Africa thanks to her private life: due to the misogynist rules of that nation's universities, married women were not allowed to remain on the faculty. Therefore, along with her husband the veterinary medicine researcher Max Sterne, she continued her work from home. However, eventually her talents were acknowledged and she was made head of the Biophysics section at the South African National Physics Laboratory in 1948. Then only three years later, Alper's personal life intervened once again; this time, she and her husband were forced to leave South Africa due to their opposition to apartheid.

After a period of unpaid research in London, Alper turned to studying the effects of radiation on different types of cells, rising to become head of the Medical Research Council Radiopathology Unit at Hammersmith Hospital. Alper's theories regarding prions were eventually accepted into the mainstream and even after retirement she continued working, writing a renowned textbook, Cellular Radiobiology, in 1979. 

Alper's life suggests she was very much a problem solver, tackling anything that she felt needed progressing. As a result of this ethos she worked on a wide range of issues, from the standing of women in science and society, to the injustice of apartheid, to learning and teaching sign language after one of her sons was born profoundly deaf. Despite being forced to leave several nations for different reasons - none of them because she was a woman - Alper was someone who refused to concede defeat. In that respect she deserves much wider recognition today.

Dorothy Crowfoot Hodgkin (1910-1994) was interested in chemistry, in particular crystals, from a young age. Although women of her generation were encouraged in this area as a hobby, it was highly unusual for them to seek paid employment in the field. Luckily, her mother encouraged her interest and gave Hodgkin a book on x-ray crystallography for her sixteenth birthday, a gift which determined her career path. 

After gaining a first-class honours chemistry degree at Oxford, she moved to Cambridge for doctoral work under the x-ray crystallography pioneer J.D. Bernal. Not only did Hodgkin then manage to find a research post in her chosen field, working at both Cambridge and Oxford, but she was also able to pursue cutting-edge work labelled as too difficult by her contemporaries. She and her colleagues achieved ground-breaking results in critical areas, resolving the structures of penicillin, vitamin B12 and insulin. 

Hodgkin gained international renown, appearing to have faced few of the difficulties experienced by her female contemporaries. In addition to having a well-equipped laboratory at Oxford, she was elected to the Royal Society in 1947 and became its Wolfson Research Professor in 1960. She was also awarded the Nobel Prize in Chemistry in 1964 - the only British woman to have been a recipient to date. Other prestigious awards followed, including the Royal Society's Copley Medal in 1976; again, no other woman has yet received that award.

Presumably in response to the loss of four maternal uncles in the First World War, Hodgkin was an active promoter of international peace. During the 1950s her views were deemed too left wing by the American government and she had to obtain special permission to enter the United States to attend science conferences. Ironically, the Soviet Union honoured her on several occasions, admitting her as a foreign member of the Academy of Sciences and later awarding her the Lenin Peace Prize. She also communicated with her Chinese counterparts and became committed to nuclear disarmament, both through CND and the Pugwash Conferences.

Her work on insulin, itself of enormous importance, is just one facet of her life. Ironically, for someone associated with left-wing politics, she is often remembered today as one of Margaret Thatcher's lecturers; despite their different socio-political leanings, they maintained a friendship into later life. All of this was achieved despite the increasing disability Hodgkin suffered from her mid-twenties due to chronic rheumatoid arthritis, which left her with very limited dexterity. Clearly, Dorothy Hodgkin was a dauntless fighter in her professional and personal life.

Marie Tharp (1920-2006) was an American geologist best known for her oceanographic cartography of the Atlantic Ocean floor. Despite following the advice of her father (a surveyor) and taking an undergraduate degree in humanities and music, Tharp also took a geology class; perhaps helping her father as a child boosted her interest in the subject. This enabled her to complete a master's degree in geology, thanks to the dearth of male students during the Second World War. Certainly, it was an unusual avenue for women to be interested in; at the time fewer than four percent of all earth sciences doctorates in the USA were awarded to women.

From a modern perspective, geology during the first half of the twentieth century appears to have been exceedingly hidebound and conservative. Tharp found she could not undertake field trips to uncover fossil fuel deposits, as women were only allowed to do office-based geological work - one explanation for this sexism being that having women on board ship brought bad luck! In fact, it wasn't until 1968 that Tharp eventually joined an expedition. 

However, thanks to painstaking study of her colleague Bruce Heezen's data, Tharp was able to delineate geophysical features such as the mid-Atlantic ridge and consider the processes that generated them. Her map of the Atlantic Ocean floor was far more sophisticated than anything that had previously been created, giving her insights denied to both her contemporaries and her predecessors. As such, Tharp suspected that the long-denigrated continental drift hypothesis, as envisaged by Alfred Wegener three decades previously, was correct. It was here that she initially came unstuck, with Heezen labelling her enthusiasm for continental drift as 'girl talk'. Let's hope that phrase wouldn't be used today!

In time though, yet more data (including the mirrored magnetic striping either side of the mid-Atlantic ridge) proved Tharp correct. Heezen's incredulity was replaced by acceptance, as continental drift was reformulated via seafloor spreading to become the theory of plate tectonics. Mainstream geology finally approved what Wegener had proposed, and Marie Tharp was a fundamental part of that paradigm shift. 

What is interesting is that despite receiving many awards in her later years, including the National Geographic Society's Hubbard Medal in 1978, her name is mentioned far less often than those of other pioneers of plate tectonics such as Harry Hess, Frederick Vine, Drummond Matthews, or even Heezen. It's unclear if Tharp's comparative lack of recognition is due to her being female or because she was only one of many researchers working along similar lines. Her own comment from the era suggests that just being a woman scientist was reason enough to dismiss her work: she noted that other professionals viewed her ideas with attitudes ranging "from amazement to skepticism to scorn."

There are countless other examples that would serve as case studies, including women from non-Western nations, but these four show the variety of experiences women scientists underwent during the twentieth century, ranging from a level of misogyny that would be unthinkable today to an early acceptance of the value of their work and a treatment seemingly no different from that of their male colleagues. I was surprised to find such a range of circumstances and attitudes, proving that few things are as straightforward as they are frequently portrayed. These examples also show that whatever culture people grow up in, the majority consider its values to be perfectly normal; a little bit of thought - or hindsight - shows that just because something is the norm doesn't necessarily mean it's any good. When it comes to attitudes today, you only have to read the news to realise there's still some way to go before women in STEM are treated the same as their male counterparts.

Monday 25 January 2021

Ignorance is bliss: why admitting lack of knowledge could be good for science

"We just don't know" might be one of the best phrases in support of the scientific method ever written. But unfortunately it carries an inherent danger: if a STEM professional - or indeed an amateur scientist/citizen scientist - uses the term, it can be used by those wishing to disavow the subject under discussion. Even adding "- yet" to the end of it won't necessarily improve matters; we humans have an unfortunate tendency to rely on gut instinct rather than rational analysis for our world model, hence - well, just about any man-made problem you care to name, now or throughout history.

Even though trust in scientists and the real-world application of their work may have taken an upswing thanks to rapid vaccine development during the current pandemic, there are many areas of scientifically-gleaned knowledge that are still as unpopular as ever. Incidentally, I wonder whether, were it not for much stricter laws in most countries today, we would have seen far more of the quackery that arose during the 1918 Spanish flu epidemic. During that period low-tech 'cures' included gas inhalation, enemas and blood-letting, the first of these about as safe as last year's suggestion to drink bleach. I've seen very little about alternative cures, no doubt involving crystals, holy water or good old-fashioned prayer, but then I probably don't mix in those sorts of circles (and certainly don't have that type of online cookie profile). But while legislation might have prevented alternative pandemic treatments from being advertised as legitimate and effective, it hasn't helped other areas of science that suffer from widespread hostility. 

Partly this is due to the concept - at least in liberal democracies - of free speech and the idea that every thesis must surely have an antithesis worthy of discussion. Spherical planets not your bag, baby? Why not join the Flat Earth Society. It's easy to be glib about this sort of thing, but there are plenty of more serious examples of anti-scientific thinking that show no sign of abating. The key element that disparate groups opposing science seem to have in common is simple: it all comes down to where science disagrees with the world picture they learnt as a child. In most cases this can be reduced even further to just two words: religious doctrine.

This is where a humble approach to cutting-edge research comes in. Humility has rarely been a key characteristic of fictional scientists; Hollywood, for example, has often depicted (usually male) scientists as somewhere on a crude line between power-crazed megalomaniacs and naive, misguided innocents. The more sensational printed volumes and TV documentaries communicating scientific research to a popular audience likewise frequently eschew ambiguities and dead-ends in favour of a this-is-how-it-is approach. Only, quite often, that isn't how it works at all. Doubts and negative results are not merely a key element of science, they are a fundamental component; only by discarding failures can the candidate answers to an hypothesis (or, if you prefer the description of the brilliant-yet-humble physicist Richard Feynman, a guess) be narrowed down. 

There are plenty of examples where even the most accomplished of scientists have admitted they don't know the answer to something in their area of expertise, such as Sir Isaac Newton being unable to resolve the ultimate cause of gravity. As it was, it took over two centuries for another genius - Albert Einstein - to figure it out. Despite all the research undertaken over the past century or so, the old adage remains as true as ever: good science creates as many new questions as it answers. Key issues today that are unlikely to gain resolution in the next few years - although never say never - include the nature of dark energy (and possibly likewise of dark, i.e. non-baryonic, matter) and the ultimate theory behind quantum mechanics. 

Of course, these questions, fascinating though they are, hold little appeal to most people; they are just too esoteric and far removed from everyday existence to be bothered about. So what areas of scientific knowledge or research do non-scientists worry about? As mentioned above, usually it is something that involves faith. This can be broken down into several factors:

  1. Disagreement with a key religious text
  2. Implication that humans lack a non-corporeal element, such as an immortal soul
  3. Removal of mankind as a central component or focal point for the universe 

These obviously relate to some areas of science - from a layman's viewpoint - far more than others. Most non-specialists, even religious fundamentalists, don't appear to have an issue with atomic theory and the periodic table. Instead, cosmology and evolutionary biology are the disciplines likely to raise their ire. Neither is in any sense complete; the number of questions still being asked is far greater than the answers so far gleaned from research. The former has yet to understand what 96% of the universe is composed of, while the latter is still piecing together the details of the origin and development of life on our planet, from primordial slime up to Donald Trump (so possibly more of a sideways move, then). 
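
For anyone wondering where that oft-quoted figure comes from, the rough breakdown in the current consensus cosmological model (approximate values only, not taken from any single survey) runs something like:

\[
\underbrace{\sim\!68\%}_{\text{dark energy}} \;+\; \underbrace{\sim\!27\%}_{\text{dark matter}} \;\approx\; 95\%, \qquad \text{leaving only} \sim\!5\% \text{ as ordinary (baryonic) matter}
\]

Give or take a percentage point depending on the survey - hence the 96% often quoted - the familiar stuff of atoms and the periodic table is very much the minority constituent.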

Herein lies the issue: if scientists claim they are 'certain' about the cause of a particular phenomenon or feature of reality, but further research confirms a different theory, then non-scientists can legitimately ask why the new idea should be any more final than the previous one. The word 'theory' is also prone to misinterpretation, implying it is only an idea rather than an hypothesis (guess, if you like) that hasn't yet failed any of the tests thrown at it, be they practical experiments, digital simulations or mathematical constructions. Bill Bryson's best-selling A Short History of Nearly Everything is an example of how science can be done a disservice by material meant to promote it, in that the book treats science as an ever-expanding body of knowledge rather than as a collection of methods used to explore answerable questions about life, the universe and, of course, everything.

Perhaps one answer to all this would be for popular science journalism, from books written by professional scientists to short news items, to include elements related to what is not yet known. The simplistic approach that avoids the failures only serves to strengthen the opinion that experts are arrogant believers in their own personal doctrines, as inflexible and uncompromising as holy writ. 

Unfortunately, in efforts to be both concise and easy to comprehend, much science communication appears to render the discipline in this manner, avoiding dissension and doubt. In addition, the often wonderful - and yet to be resolved - subtleties of research are neglected. For example, the majority of specialists agree that birds are descended from theropod (i.e. carnivorous) dinosaurs, and yet the primary growth axis on the forelimbs of the two groups differs. This issue has not been satisfactorily answered, but the vast collection of evidence, from both fossils and experimentation, still supports theropod ancestry as the most plausible reading of this part of the phylogenetic tree. Further research, especially in embryology, may one day provide a more complete solution.

Ultimately then, science education would probably benefit from acknowledging the boundaries of uncertainty where they exist. This may help allay fears that the discipline wants to impose absolutes about everything; in most areas (the second law of thermodynamics excepted) we are still in the early stages of understanding. This doesn't mean that the Earth may be flat or only six thousand years old, but it does mean that science usually works in small steps, not giant paradigm shifts that offer the final say on an aspect of reality. After all, if scientists already knew everything about a subject, there wouldn't be any need for further research. What a boring world that would be!