Monday 14 December 2020

Biomaterial bonanza: putting plastics out of a job

With the midwinter festival rapidly approaching (at least for the Northern Hemisphere) - traditionally a time of gift-giving - wouldn't it be great to say that humanity can offer a present to the entire planet? The amount of plastic-based products manufactured every year is somewhere between three hundred and four hundred million tons, about fifty percent of which is single-use or disposable.

Presumably if you've got any sort of interest whatsoever in the world around you (and how your children will get on) then you have been replacing disposable plastic items with reusable non-plastic, or at least biodegradable, alternatives. But are the companies producing the latter guilty of subtle greenwashing?

A friend recently told me that he had put some allegedly biodegradable plastic bags into his compost heap, only to retrieve them - albeit with some holes in - a year or so later. Bearing in mind there is no internationally recognised definition of what qualifies as biodegradable, is it surprising that the wool (sorry, polyester) is being pulled over consumers' eyes?

A report last year summarised a three-year research programme at the UK's University of Plymouth, offering clear evidence that many types of allegedly biodegradable bags do not break down when buried in soil or underwater. Although the material did decay in open air, it merely fragmented into smaller pieces of plastic rather than degrading into simpler molecules.

Recent studies by Tel Aviv University and the Goethe Universität in Frankfurt go even further in putting allegedly eco-friendly materials in a bad light. Both claim that not just biodegradable plastics but even those based on starch and cellulose contain numerous toxic chemicals. Such materials are used in food and drink packaging. So where do we go from here?

Last year I wrote a post about the potential of chitosan, a genuinely biodegradable material made from marine arthropod carapaces (i.e. shellfish discards) that can be produced in an eco-friendly process. 

There now appear to be several other materials that also have the potential to replace traditional plastics. A group at the University of Science and Technology of China has developed a lightweight but durable material using mica and cellulose-derived nanofibre that has more than double the strength of high-performance petroleum-based plastics.

Another alternative to plastic that utilises surprising source material has been developed by a student at the University of Sussex in the UK. Lucy Hughes has used red algae to augment discarded fish scales and fish skin to produce a single-use translucent substance called MarinaTex. In addition to its use of material otherwise destined for landfill, MarinaTex - which biodegrades within six weeks - is the antithesis of conventional plastics in that the red algae component makes its production carbon positive! 

The bad news is that both materials are still at the research and development stage and there is no indication of when they would be ready for commercial mass-production. Crucially, there doesn't appear to be any news of large corporations buying the research for implementation; why is it that so many paradigm-shifting projects are having to be developed by crowd-funded start-ups rather than established multi-nationals? 

Surely there are enough ethical executives out there to pick up such research once it has shown this much potential? But as I've no doubt mentioned before, we are living in a world where the largest national economy - the United States of course - spends more each year on pet grooming products than on nuclear fusion research. Will future historians dub our era the Decades of Dubious Sanity?

Meanwhile, immense amounts of plastics are dumped in landfill and the oceans, polluting everything - microplastics are even turning up in the food we eat. Isn't it time these researchers were given the backing they need to convert smart ideas into ecosystem saviours? After all, no-one, no matter how wealthy, can opt out of the planetary biosphere!

Monday 23 November 2020

Self-destructive STEM: how scientists can devalue science

Following on from last month's exploration of external factors inhibiting the scientific enterprise, I thought it would be equally interesting to examine issues within the sector that can negatively influence STEM research. There is a range of factors that vary from the sublime to the ridiculous, showing that science and its practitioners are as prey to the whims of humanity as any other discipline. 

1) Conservatism

The German physicist Max Planck once said that a "new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." With peer review of submitted articles, it's theoretically possible that a new hypothesis could be prevented from seeing the light of day due to being in the wrong place at the wrong time; or more precisely, because the reviewers personally object to the ideas presented.

Another description of this view is that there are three stages before the old guard accept the theories of the Young Turks, with an avant-garde idea eventually being taken as orthodoxy. One key challenge is the dislike shown by established researchers towards outsiders who promote a new hypothesis in a specialisation they have no formal training in.

A prominent example of this is the short shrift given to meteorologist Alfred Wegener when he described continental drift to the geological establishment; it took over thirty years and a plethora of evidence before plate tectonics was found to correlate with Wegener's seemingly madcap ideas. More recently, some prominent palaeontologists wrote vitriolic reviews of the geologist-led account of the Chicxulub impact as the main cause of the K-T extinction event. 

This also shows the effect impatience may have; if progress in a field is slow or seemingly negative, it may be prematurely abandoned by most if not all researchers as a dead end.

2) Putting personal preferences before evidence 

Although science is frequently sold to the public as having a purely objective attitude towards natural phenomena, disagreements at the cutting edge are common enough to become cheap ammunition for opponents of STEM research. When senior figures within a field disagree with younger colleagues, it's easy to see why there might be a catch-22 situation in which public funding is only available when there is consensus and yet consensus can only be reached when sufficient research has placed a hypothesis on a fairly firm footing.

It is well known that Einstein wasted the last thirty or so years of his life trying to find a unified field theory without including quantum mechanics. To his tidy mind, the uncertainty principle and entanglement didn't seem to be suitable as foundation-level elements of creation, hence his famous quote usually truncated as "God doesn't play dice". In other words, just about the most important scientific theory ever didn't fit into his world picture - and yet the public's perception of Einstein during this period was that he was the world's greatest physicist.

Well-known scientists in other fields have negatively impacted their reputation late in their career. Two prominent examples are the astronomer Fred Hoyle and microbiologist Lynn Margulis. Hoyle championed increasingly fruity ideas as he got older, including the claim that the archaeopteryx fossil at London's Natural History Museum was a fake. Margulis, for her part, extrapolated from her area of expertise - endosymbiotic theory for eukaryotic cells - to claim her discoveries could account for an extremely wide range of biological phenomena, including the cause of AIDS. It doesn't take much to realise that if two such highly esteemed scientists can publish nonsense, then uninformed sections of the public might want to question the validity of a much wider variety of established scientific truths.

3) Cronyism and the academic establishment

While nepotism might not appear often in the annals of science history, there have still been plenty of instances in which favoured individuals gained a position at the expense of others. This is of course a phenomenon as old as natural philosophy, although thankfully the rigid social hierarchy that affected the careers of nineteenth century luminaries such as physicist Michael Faraday and dinosaur pioneer Gideon Mantell is no longer much of an issue.

Today, competition for a limited number of places in university research faculties can lead to results as unfair as in any humanities department. A congenial personality and an ability to self-publicise may tip the balance on gaining tenure as a faculty junior; scientists with poor interpersonal skills can fare badly. Their reputations can even be denigrated after their death, as happened to DNA pioneer Rosalind Franklin in James Watson's memoirs.

As opponents of string theory are keen to point out, graduates are often forced to get on bandwagons in order to gain vital grants or academic tenure. This suggests that playing safe by studying contemporary 'hot' areas of research is preferred to investigating a wider range of new ones. Nobel Laureate and former Stephen Hawking collaborator Roger Penrose describes this as being particularly common in theoretical physics, whereby the new kids on the block have to join the entourage of an establishment figure rather than strike out with their own ideas.

Even once a graduate student has gained a research grant, it doesn't mean that their work will be fairly recognised. Perhaps the most infamous example of this occurred with the 1974 Nobel Prize in Physics. One of the two recipients was Antony Hewish, who gained the prize for his "decisive role in the discovery of pulsars". Yet it was his student Jocelyn Bell who spotted the signal and argued for its celestial origin while Hewish was still claiming it to be man-made interference.

4) Jealousy and competitiveness

Although being personable and a team player can be important, anyone deemed to be too keen on self-aggrandising may attract the contempt of the scientific establishment. Carl Sagan was perhaps the most prominent science communicator of his generation but was blackballed from the US National Academy of Sciences due to being seen as too popular! This is despite some serious planetary astronomy in his earlier career, including work on various Jet Propulsion Laboratory probes. 

Thankfully, attitudes towards sci-comm have started to improve. The Royal Society has advocated the notion that prominent scientists should become involved in promoting their field, since public engagement has commonly been judged by STEM practitioners as the remit of those at the lower end of scientific ability. Even so, there remains the perception that those engaged in communicating science to the general public are not proficient enough for a career in research. Conversely, research scientists should be able to concentrate on their work rather than having to spend large amounts of their time seeking grants or undertaking administration - but such ideals are not likely to come to pass in the near future!

5) Frauds, hoaxes and general misdemeanours 

Scientists are as human as everyone else and, given the temptation, have been known to resort to underhand behaviour in order to obtain positions, grants and renown. Such behaviour has been occurring since the Enlightenment and varies from deliberate use of selective evidence through to full-blown fraud that has major repercussions for a field of research.

One well-known example is the Piltdown Man hoax, which wasn't uncovered for forty years. This was due more to the material fitting in with contemporary social attitudes than to the quality - or lack thereof - of the finds. However, other than generating public attention to how scientists can be fooled, it didn't damage science in the long run.

A far more insidious instance is that of Cyril Burt's research into the heritability of intelligence. After his death, others tried to track down Burt's assistants, only to find they didn't exist. This of course cast serious doubt on the reliability of both his data and conclusions; even worse, his work was used by several governments in the late twentieth century as the basis for social engineering.

Scandals are not unknown in recent years, providing ammunition for those wanting to deny recognition of fundamental scientific theories (rarely the practical applications). In this age of social media, it can take only one person's mistake - deliberate or otherwise - to set in motion a global campaign that rejects the findings of science, regardless of the evidence in its favour. As the anti-vaccination lobby have proven, science communication still has a long way to go if we are to combine the best of both worlds: a healthy scepticism with an acceptance of how the weird and wonderful universe really works, and not how we would like it to.

Tuesday 27 October 2020

Bursting the bubble: how outside influences affect scientific research

In these dark times, when some moron (sorry, non-believer in scientific evidence) can easily reach large numbers of people on social media with their conspiracy theories and pseudoscientific nonsense, I thought it would be an apt moment to look at the sort of issues that block the initiation, development and acceptance of new scientific ideas. We are all aware of the long-term feud between some religions and science but aside from that, what else can influence or inhibit both theoretical and applied scientific research?

There are plenty of other factors, from simple national pride to the ideologies of the far left and right, that have prohibited theories considered inappropriate. Even some of the greatest twentieth century scientists faced persecution; Einstein was one of the many whose papers were destroyed by the Nazis simply for falling under the banner of 'Jewish science'. At least this particular form of state-selective science was relatively short-lived: in the Soviet Union, theories deemed counter to dialectical materialism were banned for many decades. A classic example of this was Stalin's promotion of the crackpot biologist Trofim Lysenko, who denied the modern evolutionary synthesis and whose scientific opponents were ruthlessly persecuted.

Even in countries with freedom of speech, if there is a general perception that a particular area of research has negative connotations then, no matter how unfounded that perception, public funding is likely to suffer accordingly. From the seemingly high-profile adulation of STEM in the 1950s and 1960s (ironic, considering the threat of nuclear war), subsequent decades have seen a decreasing trust in both science and its practitioners. For example, the Ig Nobel awards have for almost thirty years been a high-profile way of publicising scientific projects deemed frivolous or a waste of resources. A similar attitude is frequently heard in arts graduate-led mainstream media; earlier this month, a BBC radio topical news comedy complimented a science venture that was seen as "doing something useful for once."

Of course, this attitude is commonly related to how research is funded, the primary question being: why should large amounts of resources go to keeping STEM professionals employed if their work fails to generate anything of immediate use? I've previously discussed this contentious issue, and despite the successes of the Large Hadron Collider and Laser Interferometer Gravitational-Wave Observatory, there are valid arguments in favour of them being postponed until our species has dealt with fundamental issues such as climate change mitigation.

There are plenty of far less grandiose projects that could benefit from even a few percent of the resources given to the international, mega-budget collaborations that gain the majority of headlines. Counter to the 'good science but wrong time' argument is the serendipitous nature of research; many unforeseen inventions and discoveries have been made by chance, with few predictions hitting the mark.

The celebrity-fixated media tends to skew the public's perception of scientists, representing them more often as solitary geniuses than as team players. This has led to oversimplified distortions, such as that inflicted on Stephen Hawking for the last few decades of his life. Hawking was treated as a wise oracle on all sorts of science- and future-related questions, some far from his field of expertise. This does neither the individuals involved nor the scientific enterprise any favours. It makes it appear as if a mastermind can pull rabbits out of a hat, rather than hardworking groups spending years on slow, methodical and - let's face it - from an outsider's viewpoint somewhat dull research.

The old-school caricature of the wild-haired, lab-coated boffin is thankfully no longer in evidence, but there are still plenty of popular misconceptions that even dedicated STEM media channels don't appear to have removed. For example, almost everyone I meet fails to differentiate between the science of palaeontology and the non-science of archaeology, the former of course usually being solely associated with dinosaurs. If I had to condense the popular media approach to science, it might be something along these lines:

  • Physics (including astronomy). Big budget and difficult to understand, but sometimes exciting and inspiring
  • Chemistry. Dull but necessary, focusing on improving products from food to pharmaceuticals
  • Biology (usually excluding conventional medicine). Possibly dangerous, both to human ego and our ethical and moral compass (involve religion at this point if you want to) due to both working theories (e.g. natural selection) and practical applications, such as stem cell research. 

Talking of applied science, a more insidious form of pressure has sometimes been used by industry, either to keep consumers purchasing their products or to prevent them moving to rival brands. Various patents, such as for longer-lasting products, have been snapped up and hidden by companies protecting their interests, while the treatment meted out to scientific whistle-blowers has been legendary. Prominent examples range from Rachel Carson's exposé of DDT, which led to attacks on her credibility, to industry lobbying of governments to prevent the banning of CFCs after they were found to be destroying the ozone layer.

When the might of commerce is combined with wishful thinking by the scientist involved, it can lead to dreadful consequences. Despite a gathering body of evidence for smoking-related illnesses, the geneticist and tobacco industry spokesman Ronald Fisher - himself a keen pipe smoker - argued for a more complex relationship between smoking and lung disease. The sector used his prominence to obscure the truth, no doubt shortening the lives of immense numbers of smokers.

If there's a moral to all this, it is that even at a purely theoretical level science cannot be isolated from all manner of activities and concerns. Next month I'll investigate negative factors within science itself that have had deleterious effects on this uniquely human sphere of accomplishment.

Thursday 24 September 2020

Dangerous cargo: the accidental spread of alien organisms via commercial shipping

It's often said that whichever culture and environment we grow up in is the one we consider as the norm. Whilst my great-grandparents were born before the invention of heavier-than-air flying machines, I've booked numerous long-haul flights without considering much beyond their monetary and environmental cost. Yet this familiarity with our fast and efficient global transportation network masks an unpleasant side effect: it is second only to habitat loss when it comes to endangering biodiversity.

Although many environmental campaigns focus on fossil fuels, deforestation and unsustainable agricultural practices, the (mostly inadvertent) transportation of alien plants, animals and fungi from one region to another has quietly but catastrophically reduced biodiversity in many areas of the planet.

The earliest example I recall learning about was Stephen Jay Gould's heart-felt description of the extinction of French Polynesia's partulid tree snails at the hands of introduced carnivorous snails intended to control edible snail species (which were also deliberately introduced). While the nineteenth and early twentieth centuries saw large numbers of species intentionally established in areas far from their natural territories, the past half century has seen an acceleration in equally disastrous accidental introductions as a by-product of international trade.

A potential starting point for invasion ecology as a discipline in its own right was Oxford professor Charles Elton's 1958 publication The Ecology of Invasions by Animals and Plants. The International Union for Conservation of Nature's Red List of Threatened Species followed six years later. Clearly, the negative effects of our activities were starting to become known. But has enough been done to publicise it in the intervening decades?

The Red List is the most accurate data source for regional biodiversity and the population health of all organisms known to science; yet few non-specialists seem even aware of its existence. Indeed, several decades passed after the list's creation before invasion biology became an important subject in professional ecology. Over the past thirty years the topic has seen a ten-fold increase in publications and citations - a sign of recognition if ever there was one - although the mainstream media appears barely aware of it.

The IUCN's Invasive Species Specialist Group aids governments and organisations in planning the monitoring, containment and, where possible, eradication of invasive species. It runs the publicly-available Global Invasive Species Database, but its online presence appears to be poorly funded, or at least poorly coordinated. Rather than a central hub there is a plethora of websites featuring varying degrees of professionalism and some distinctly out-of-date content. Perhaps clients are given direct instructions, but as a member of the public I found the ISSG sites bewildering in their variety.

Needless to say, when it does come to taking action, it can be assumed that economic imperatives such as agricultural pests take precedence over preservation of other endangered species. The only country I know of that is attempting a nation-wide eradication of most invasive animals (note: not plants and fungi) is New Zealand, with our Predator Free 2050 project. However, I'm uncertain how realistic it is. Even pre-Covid it appears to have lacked a solid funding source and now - with thirty years and counting until the deadline - there's even less chance of a comprehensive removal of numerous pest species.

What the Predator Free 2050 plan doesn't include is the multitude of plants and animals that slip through the net, so to speak: the legion of species currently invading our offshore environment. It's one thing to actually see land-based plants and animals, but the ocean is largely unknown territory to most people. With over forty thousand cargo vessels moving around the globe every year there is plenty of opportunity for organisms, especially their larval forms, to be inadvertently spread to new territories via both hulls and ballast water. Whilst Killer Algae (a slight hint there in the common name for Caulerpa taxifolia) and the Chinese mitten crab aren't as well known as Japanese knotweed and the common myna, they are just two of the many dangerous invaders spreading ever further from their original territories.

It isn't just marine vessels that can carry such dangerous cargo: the immense amount of plastic waste in our oceans can serve as life rafts for the propagation of alien species, albeit at the whim of currents moving rather slower than diesel power. The problem of course is that the oceans are enormous, so the issue usually only comes to light when an invasive organism is spotted encroaching on coastal waters. Unfortunately, marine lifeforms can't be easily dealt with using the traps and poison that work on land-based pests; indeed, international regulations seem as much concerned with the dangers of anti-fouling systems as with the issues they prevent.

In 2011 the International Maritime Organization implemented guidelines to minimise vessel biofouling as it relates to the accidental incursion of invasive marine organisms. New Zealand was the first of several nations to implement its own national strategy, turning these guidelines into mandatory practice - and taking them further. In addition, New Zealand's National Institute of Water and Atmospheric Research (NIWA) runs annual surveys, particularly around ports, but otherwise their funding appears inadequate to the immensity of the task.

It's all very well keeping track of the ever-increasing list of resident invasive species around the nation's coastline, but little has been done to remove them. With about 150 types of alien organism now in residence around New Zealand's coast and roughly the same number again as occasional visitors, NIWA has been a partner in international competitions aimed at finding pest management solutions, at least for coastal ecosystems if not the deep ocean. Obvious solutions such as scrubbing hulls would just lead to direct contamination of ports, so some new thinking is clearly required.

Of course, the use of cargo ships is unlikely to diminish any time soon. Our global marine transport network is far from in decline and many nations lack the stringent precautions that New Zealand and Australia are now implementing. It has been estimated that cleaning hulls to prevent biofouling could reduce global marine fuel consumption by 10%, so perhaps this commercial benefit may win over those reluctant to spend heavily on prevention measures. But just as fishing vessels are still getting away with immense amounts of by-kill, merchant shipping in many areas of the world appears to be a law unto itself.

Preserving regional marine biota is just as critical as land-based environmental protection. Allowing species to proliferate outside their normal range can only lead to deleterious changes - and when combined with our warming, increasingly acidic oceans, this does not bode well for all life on Earth, especially a hungry Homo sapiens. Just because we humans spend most of our time on land, we cannot afford to ignore the far larger ecosystems of the seas.

Monday 24 August 2020

Fundamental fungi: the forgotten kingdom vital to our future

At the end of 1993 the Convention on Biological Diversity came into force. A key piece of global legislation in the promotion of sustainable development, it marked a change in focus for environmental concerns. Whereas previous high-profile conservation efforts such as those of the World Wide Fund for Nature or Greenpeace were frequently aimed at individual species or regional ecosystems, the legislation initiated by the 1992 Earth Summit in Rio de Janeiro was aimed at the biota of the entire planet. However, there are still segments of enormous ecological importance that lack sufficient research.

I've previously discussed how little attention general-readership natural history pays to the kingdom of fungi, which may have somewhere between 1.5 million and 3.8 million species. Of these, less than 150,000 have been scientifically described. Clearly, this is one life form where our knowledge barely covers the tip of the iceberg. It's hardly as if this attitude is a new one, either. While Linnaeus produced comprehensive editions on plant and animal taxonomy in the 1750s, it took over seventy years for anyone to bother with fungi: it wasn't until 1821 that another Swedish naturalist, Elias Magnus Fries, produced an equivalent work called Systema Mycologicum.

Thanks to the majority of fungal material living either underground or in dark, damp environments such as leaf litter, the kingdom fails to get the attention it deserves. Even the forms we see more regularly, such as mushrooms and symbiotic lichen, engender little interest. Many people no doubt still mistake the former as plants - and are scared off any interest in the wild forms due to the dangers of poisonous species - while the latter are rarer in polluted, i.e. urban, environments and fail to compete in sight and scent with the glories of the flowering plants.

In the eight years since I wrote about the lack of interest in fungi, I've found reason to mention the long-forgotten kingdom in various important contexts. For a start, numerous animals and plants are becoming critically endangered due to fungal pathogens accidentally being spread by global travel. In addition, research over the past three years has shown that Aspergillus tubingensis and several other types of fungi show promise as a bio-friendly solution to plastic waste. Finally, last month I looked at non-animal protein substitutes, including the mycoprotein-derived Quorn.

Despite the potential of these various forms of fungi, the kingdom's losses due to rapid environmental change don't appear to be garnering much attention. The IUCN Red List, which tabulates the differing levels of threat faced by all life on Earth, shows only 343 fungi as currently endangered; this contrasts with over 43,000 plants and 76,000 animals on the list. Undoubtedly, the Kingdom Fungi is being given an underwhelming amount of attention just as we are discovering how important it is to maintaining ecosystem stability and to the future of our species.

Recently published reports of studies conducted in the Amazon region show that deforestation has a long-term impact on soil biota, which in turn affects the entire local ecology. Studies of a range of habitats, such as primary forest, agricultural land (including monoculture), pasture/grazing, forestry plantations and secondary/regenerated forest showed that although overall fungal mass might remain consistent, species diversity is far lower outside of the original rainforest. The lack of fungal variety was linked directly to the lack of plant diversity in those biomes, with recovery a slow or unlikely prospect due to the newly-fragmented nature of the landscape preventing efficient dispersal of fungal spores.

There are some obvious points that agribusiness seems to ignore, such as the effects of pesticides and fertilisers on local fungi and the loss of microhabitats vital to maintaining a healthy variety of fungal species. If only the more generalist fungi can survive the change in land use from the wonderful diversity of the rainforest (with up to 400 fungal species per teaspoonful) then this may have repercussions for future farming. As an example, the fungus Fusarium oxysporum has a phytopathogenic effect on agricultural crops including the oil palm, but without competition from a wider cross-section of fungi (for example, Paraconiothyrium variabile) it could spread rapidly within a dismal monoculture environment.

As a predominantly visual species, we humans are unthinkingly biased about the natural world based upon what we see: think cute giant panda versus the unappealing aesthetics of the blobfish. It really is a case of out of sight, out of mind, but unfortunately no amount of spin doctoring will make fungi as much loved as furry mammals. Yet our attitudes need to change if we are to maintain the delicate ecological balance; fungi are highly important for recycling nutrients, regulating carbon dioxide levels, and as a source of food and pharmaceuticals. Yet they remain the soil equivalents of the ubiquitous underwater copepods, unsung heroes of the global ecosystem. It's about time we took a lot more notice of this forgotten kingdom.

Wednesday 22 July 2020

Eco-friendly eats: the potential of new foods to save the planet

Back in 2015 I wrote a post about the potential for insect-based protein to become a much more common environmentally-friendly food in the near future. Although it may be rather better for the environment than traditional ingredients, Western cultures have a deep-seated bias against eating anything with more than four limbs. So what are the alternatives to conventional farming and fishing that can be eco-friendly but don't rely on insects as their source material?

As someone who hasn't eaten meat in over three decades, I was interested to read that Helsinki-based Solar Foods have worked out how to create a protein flour from almost nothing. Hydrogen-eating soil bacteria are being used to generate a taste-free product called Solein, intended as an additive in place of other protein sources and also serving as a medium for growing lab-cultured meat. There's even the potential for it to become a livestock feed; could it replace the environmentally appalling Palm Kernel Expeller?

It might sound fantastic, but the issue of course comes down to economics. Current estimates suggest it will be five to ten years before Solein can compete with soya on a commercial scale. It has even been predicted that this sort of precision fermentation may cost as little as ten percent of conventional animal-derived protein by the mid-2030s, indicating a potentially key role in food technology over the next few decades.

This is in marked contrast to growing real meat via laboratory cultures, such as from stem cells. Synthetic meat may be far more ecologically sound than livestock farming (did you know that it takes 15,400 litres of water to produce a kilogram of beef?) but cultured animal flesh is still some years from viability; after all, it's only seven years since the first bio-reactor burger, produced for an eye-watering $300,000!

The United Nations is promoting a reduction in meat consumption to fight climate change, so what are the current options for those wanting to change their diet to reduce agricultural land usage, greenhouse gas emissions and water consumption/pollution? The oldest alternatives to animal protein such as soy or gluten are known allergens, so although these are increasingly widespread - think of the vast number of tofu-based products that have become available over the past twenty years - they are not a completely satisfactory solution. In addition, soya agriculture in developing nations has been linked to critical deforestation, some of which has been committed illegally.

Mycoprotein-based foods (i.e. those derived from fungi) are one possibility. Since the early 1990s, Quorn products have become available in nineteen countries. Quorn has a very small environmental footprint, can be fermented rapidly and is a high-quality form of nutrition. There is some evidence for it being mildly allergenic, but the main sticking point to it spreading to international markets appears to have been failure to comply with food standards authorities' requirements. My main issue with the product is that for something consisting primarily of fungal filaments grown on glucose syrup, it is very expensive!

Algae is another potential source of meat replacement. Fake shrimp made primarily from red algae (itself a food for real shrimp) are said to be close to the texture and flavour of the real thing. Considering the carbon mileage of commercial shrimp fishing, this product alone could be of tremendous benefit to the environment - including of course to the sustainability and preservation of shrimp species themselves.

An unlikely substitute for meat, at least in terms of texture if not nutrition or taste, is the unripe jackfruit. In the last two years here in New Zealand it has risen from zero to hero and can now be found in supermarkets as a basic canned product as well as being served in vegetarian fast food options; before 2018 I had never seen jackfruit, despite it having been cultivated in Asia for at least six thousand years.

All this isn't to say it will be easy to make a global transition to non-meat diets. Quite apart from the powerful cattle and fishing lobbies, some alternative products use genetically-modified ingredients, which is still a key political issue in many nations. However, with even fast food companies falling over themselves to offer lacto-vegetarian and vegan dishes, the public is slowly but steadily increasing its green eating credentials. Whereas there used to be a clearly defined boundary - at least in more affluent nations - between most people and the vegetarian minority, the likes of the Impossible Sausage and Beyond Burger are now appealing to both groups, the intention obviously being to minimise disruption to the meat-lovers' diet.

With the global human population forecast to peak at over nine billion later this century, responsible eating cannot come a moment too soon. It's slowly beginning to dawn on people in the West and elsewhere that the right of the individual to consume a high-fat, highly processed, red meat-heavy diet has led to a situation that is bad both for them and for the quality of life of future generations.

Over-exploitation of seafood stocks is already having a profound effect on local ecosystems such as the Sea of Cortez, so a reduction in both types of protein is essential to the long-term health of the oceans as well as the land. Luckily, new start-ups and established companies are beginning to find alternatives that can appeal to even the most ardent of meat eaters. The trick is to find a satisfying diet (that's just what you eat, not something to slim by) that can aid your personal health as well as reducing your carbon footprint, water usage, and other environmental factors. The good news is that the number of options is only going to increase. Why not check one out today?

Tuesday 23 June 2020

Grey matter blues: why has the human brain been shrinking?

There is a disturbing fact about our species that the public don't appear to know, and few specialists seem to want to discuss: over recent millennia, the human brain has been shrinking. There have been plenty of non-scientific warnings about the alleged deleterious effects on IQ of first television and more recently smartphones and tablets, but palaeontological evidence proves that over some tens of thousands of years, the Homo sapiens brain has shrunk by somewhere between ten and seventeen percent.

There are usually two key indicators said to provide an accurate measure of smartness: encephalisation quotient and absolute brain size. Encephalisation quotient, or EQ, is the ratio of an animal's actual brain mass to the brain mass predicted for a typical animal of its body size. Overall size is seen as critical due to the number of neural connections required for complex thought processes; you can only squeeze so many neurons into any given volume. Having said that, there is some considerable flexibility around this, thanks to variation in neuron density. The reason that some birds, especially the crow and parrot families, are highly intelligent despite the small absolute size of their brains is their higher neuron density compared to mammals.
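For anyone who wants to see the arithmetic, a commonly used version of the calculation (Jerison's formula for mammals, with masses in grams - treat the constant and the exponent as illustrative rather than definitive) looks roughly like this:

EQ = \frac{m_{\text{brain}}}{0.12 \times m_{\text{body}}^{2/3}}

On those figures a 1,350 g human brain in a 65 kg body gives an EQ of about 7, while a typical domestic cat comes out at around 1 - the baseline expected for a mammal of its size.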

Analysis of data from the examination of thousands of human fossil remains suggests that our species reached a peak in EQ around 70,000 years ago, followed by a gradual decline. The reduction in brain size appears to be due to a loss of the archetypal grey matter itself, rather than the white matter that provides support to the neural architecture. However, one key issue is lack of agreement as to a definitive start date for this decline, with 20,000 to 40,000 years ago being the most commonly cited origin. With such basic points remaining unsettled, it's perhaps not surprising that there is a plethora of opinions as to the cause. Here are some of the more popular hypotheses for the decline in human brain size:

1. Change to body size

The first and perhaps most obvious - but easily refuted - idea is that human body size has been steadily declining and so cranial capacity has kept in step with this. While it is true that archaic sapiens may have had a greater mass and even stature than modern humans, the reduction in brain size is greater than would be expected when compared to the overall shrinkage. The assumption is that the development of material culture, from clothing to weapons, has given humans a less calorie-demanding lifestyle.

This would allow - although not dictate - natural selection to trend towards a smaller body size. This doesn't appear to offer any help for the comparatively greater reduction in brain mass, although we should remember that an overall reduction in body size means a smaller birth canal. This in turn requires a smaller skull at birth; as is well known, the human gestation period is three months less than for similar-sized mammals, but our seemingly premature delivery is necessary for the pelvis to maintain efficient bipedalism.

2. Self-domestication

Another idea is that humanity has become domesticated via the impact of culture upon natural selection. Following the population bottleneck of 70,000 years ago - the cause of which is not yet confirmed, despite attempts to correlate it with the Toba super-volcano - there has been continual growth of the human population.

Just as all our domesticated animal species have brain sizes some 10-15% smaller than their wild cousins and ancestors, so the move to larger group sizes may have led to a more docile humanity, with associated traits such as a smaller cranial capacity being carried along with it.

There are several issues with this hypothesis, ranging from a lack of data on the size of gatherer-hunter bands to the biological mechanisms involved. As regards the latter, there has been some speculation concerning neoteny, in which a species no longer grows to the final stage of maturity. The idea is that if adults are more aggressive than juveniles but peaceful collaboration can lead to larger groups, mutual aid and longer lifespans, then unintentional selective breeding for the retention of juvenile characteristics, including smaller brains, may cause a shift away from the fully mature but more aggressive individuals.

Research in recent years has suggested our brains may continue to grow into our early thirties rather than cease growing in our teens, so it's possible there could be some truth to this; it would be interesting to seek evidence as to whether the brains of archaic sapiens continued growing for longer than ours do.

3. The impact of culture

Taking this a step further, increased population density allows a more rapid development and transmission of new ideas, including those that lead to better health, longer lifespans and so to an increased birth rate. Culture and sophisticated language may have reduced the need for most people to gain a wide range of skills - courtesy of a higher intellectual capability - as tasks could be shared and specialisation take hold. In effect, larger societies provide a safety net for those who would be less able to cope in smaller groups.

If ideas could be handed down, then individuals wouldn't have to continually 'reinvent the wheel' in each generation, allowing survival despite a smaller brain size and decreased level of intelligence. The problem with this scenario is that we have no proof the 10-17% reduction has led to an associated drop in intellect; it may well be that the size of certain lobes, used in specialist thought processes such as formulating complex speech, far outweighs any decline in less critical areas.

4. The expensive big brain

One possibility that has a clear cause-and-effect relationship concerns the energy demands of having a larger brain. Although it consumes a quarter of our daily calories, the human brain accounts for less than five per cent of our body weight. Therefore, there could be a case for arguing the existence of an evolutionary competition between smaller-brained individuals who can survive on less food and those who use their larger brains to improve food-collecting strategies. Unfortunately, there are so many variables that it's difficult to judge whether the former would continually trend against the latter and - considering it clearly occurred - why the larger brain managed to evolve in the first place.
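As a rough back-of-the-envelope illustration (the figures here are typical textbook values rather than numbers from any particular study): on a 2,000 kcal daily intake, a quarter is about 500 kcal powering an organ of roughly 1.4 kg in a 70 kg adult - around 2% of body mass. Per kilogram of tissue, that makes the brain more than ten times as expensive as the body-wide average:

\frac{500\ \text{kcal} / 1.4\ \text{kg}}{2{,}000\ \text{kcal} / 70\ \text{kg}} \approx \frac{357}{29} \approx 12

It's that disproportion which makes a trade-off between a cheaper brain and a cleverer one at least plausible.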

5. The more efficient brain

Although a smaller brain might have fewer neurons than a larger version with similar architecture, it has been suggested that its shorter pathways would lead to more rapid thought processing than in a larger counterpart. In addition, there might be fewer neural pathways, again increasing the efficiency. This 'nimble thinking' approach certainly seems logical, although again it doesn't explain the evolution of larger EQ in archaic sapiens.

This is certainly a subject ripe for much more research. I've often concluded with a statement along the lines that it wouldn't be surprising if some or all of these factors were involved, since nature rarely conforms to the nice, neat patterns we would like to lay upon it. There is even the possibility that brain size - like so many other aspects of all animal species - fluctuates around a mean value, so that what goes up may come down again, only to later go up again.

At least one anthropological study on both Afro-Americans and US citizens of European descent proposes that over the past few hundred years there may have been an upward drift towards larger brains. Assuming the research is accurate, one possibility is that the superior nutrition available since the Industrial Revolution is allowing such development, thanks to the comparative ease with which its energy demands can be fulfilled.

It would certainly be interesting to investigate this hypothesis on a global scale, considering the wide differences between the clinically obese nations and those still subject to frequent famine. Whatever the results, they are unlikely to be the simple 'just-so' stories often passed off to the public in lieu of accurate but understandable science communication. The answers may be out there somewhere...I'd certainly love to know what's been happening to the most sophisticated object in the known universe!


Tuesday 12 May 2020

Ancestral tales: why we prefer fables to fact for human evolution

It seems that barely a month goes by without there being a news article concerning human ancestry. In the eight years since I wrote a post on the apparent dearth of funding in hominin palaeontology there appears to have been some uptick in the amount of research in the field. This is all to the good of course, but what is surprising is that much of the non-specialist journalism - and therefore public opinion - is still riddled with fundamental flaws concerning both our origins and evolution in general.

It also seems that our traditional views of humanity's position in the cosmos are often the source of the errors. It's one thing to make such howlers as the BBC News website did some years back, when it claimed chimpanzees were direct human ancestors, but there are a number of more subtle errors that are repeated time and again. What's interesting is that in order to explain evolution by natural selection, words and phrases have become imbued with incorrect meaning or, in some cases, just a slight shift of emphasis. Either way, it seems that evolutionary ideas have been tacked onto existing cultural baggage and in the process failed to explain the intended theories; personal and socio-political truths have triumphed over objective truth, as Neil deGrasse Tyson might say.

1) As evolutionary biologist Stephen Jay Gould used to constantly point out, the tree of life is like the branches of a bush, not a ladder of linear progression. It's still fairly common to see the phrase 'missing link' applied to our ancestry, among others; I even saw David Attenborough mention it in a TV series about three years ago. A recent news article reported - as if in surprise - that there were at least three species of hominin living in Africa during the past few million years, at the same time and in overlapping regions too. Even college textbooks use it - albeit in quotation marks - among a plethora of other phrases that were once valid, so perhaps it isn't surprising that popular publications continue to use them without qualification.

Evolution isn't a simple, one-way journey through space and time from ancestors to descendants: separate but contemporaneous child species can arise via geographical isolation and then migrate to a common location, all while their parent species continues to exist. An example today would be the lesser black-backed and herring gulls of the Arctic circle, which are either a single, variable species or two clearly distinct species, depending on where you look within their range.

It might seem obvious, but species also migrate and then their descendants return to the ancestral homeland; the earliest apes evolved in Africa and then migrated to south-east Asia, some evolving into the ancestors of gibbons and orangutans while others returned to Africa to become the ancestors of gorillas and chimpanzees. One probable culprit of the linear progression model is that some of the examples chosen to teach evolution, such as the horse, have few branches in their ancestry, giving the false impression of a ladder in which a descendant species always replaces an earlier one.

2) What defines a species is also much misunderstood. The standard description doesn't do any favours in disentangling human evolution; this is where Richard Dawkins' oft-repeated phrase 'the tyranny of the discontinuous mind' comes into play. Examine a range of diagrams for our family tree and you'll find distinct variations, with certain species sometimes being shown as direct ancestors and sometimes as cousins on extinct branches.

If Homo heidelbergensis is the main root stock of modern humans but some of us have small amounts of Neanderthal and/or Denisovan DNA, then do all three qualify as direct ancestors of modern humans? Just where do you draw the line, bearing in mind every generation could breed with both the one before and after? Even with rapid speciation events between long periods of limited variability (A.K.A. punctuated equilibrium) there is no clear cut-off point separating us from them. Yet it's very rare to see Neanderthals labelled as Homo sapiens neanderthalensis and much more common to see them listed as Homo neanderthalensis, implying a wholly separate species.

Are religious beliefs and easy-to-digest just-so stories blinding us to the complex, muddled background of our origins? Obviously, the word 'race' has profoundly negative connotations these days, with old-school categorisations of human variation now known to be plain wrong. For example, there's greater genetic variation in the present-day sub-Saharan African population than in the rest of the world combined, thanks to it being the homeland of all hominin species and the out-of-Africa migrations of modern humans occurring relatively recently.

We should also consider that species can be separated by behaviour, not just obvious physical differences. Something as simple as the different pitches of mating calls separate some frog species, with scientific experiments proving that the animals can be fooled by artificially changing the pitch. Also, just because species appear physically similar doesn't necessarily mean an evolutionary close relationship: humans and all other vertebrates are far closer to spiny sea urchins and knobbly sea cucumbers than they are to any land invertebrates such as the insects.

3) Since the Industrial Revolution, societies - at least in the West - have become obsessed with growth, progress and advance. This bias has clearly affected the popular conception that evolution always leads to improvements, along the lines of faster cheetahs to catch more nimble gazelles and 'survival of the fittest'. Books speak of our epoch as the Age of Mammals, when by most important criteria we live in the era of microbes; just think of the oxygen-generating cyanobacteria. Many diagrams of evolutionary trees place humans on the central axis and/or at the pinnacle, as if we were destined to be the best thing that over three billion years of natural selection could achieve. Of course, this is no better than what many religions have said, whereby humans are the end goal of the creator and the planet is ours to exploit and despoil as we like (let's face it, for a large proportion of our existence, modern Homo sapiens was clearly less well adapted to glacial conditions than the Neanderthals).

Above all, these charts give the impression of a clear direction for evolution with mammals as the core animal branch. Popular accounts still describe our distant ancestors, the synapsids, as the 'mammal-like reptiles', even though they evolved from a common ancestor shared with reptiles, not from reptiles per se. Even if this is purely due to lazy copying from old sources rather than fact-checking, doesn't it belie the main point of the publication? Few general-audience articles admit that all of the earliest dinosaurs were bipedal, presumably because we would like to conflate standing on two legs with more intelligent or 'advanced' (a tricky word to use in a strict evolutionary sense) lineages.

The old ladder of fish-amphibian-reptile/bird-mammal still hangs over us and we seem unwilling to admit to extinct groups (technically called clades) that break our neat patterns. Incidentally, for the past 100 million years or so, about half of all vertebrate species have been teleost fish - so much for the Age of Mammals! No-one would describe the immensely successful but long-extinct trilobites as just being 'pill bug-like marine beetles' or similar, yet when it comes to humans, we have a definite sore spot. There is a deep psychological need to have an obvious series of ever-more sophisticated ancestors paving the way for us.

What many people don't realise is that organisms frequently evolve both physical and behavioural attributes that are subsequently lost and possibly later regained. Some have devolved into far simpler forms, frequently becoming parasites. Viruses are themselves a simplified life form, unable to reproduce without a hijacked cell doing the work for them; no-one could accuse them of not being highly successful - as we are currently finding out to our cost. We ourselves are highly adaptable generalists, but on a component-by-component level it would appear that only our brains make us as successful as we are. Let's face it, physically we're not up to much: even cephalopods such as squid and octopus have a form of camera eye that is superior to that of all vertebrates.

Even a cursory glance at the natural history of life, using scientific disciplines as disparate as palaeontology and comparative DNA analysis, shows that some lineages proved so successful that their outward physiology has changed very little. Today, there are over thirty species of lancelet that are placed at the base of the chordates and therefore closely related to the ancestors of all vertebrates. They are also extremely similar in appearance to 530-million-year-old fossils of the earliest chordates in the Cambrian period. If evolution were a one-way ticket to progress, why have they not long since been replaced by later, more sophisticated organisms?

4) We appear to conflate success simply with being in existence today, yet our species is a newcomer and barely out of the cradle compared to some old-timers. We recently learned that Neanderthals wove plant fibre to make string and ate a wide variety of seafood. This knowledge brings with it a dwindling uniqueness for modern Homo sapiens. The frequently given explanation of our superiority over our extinct cousins is simply that they aren't around anymore, except as minor components of our genome. But this is a tautology: they are inferior because they are extinct and therefore an evolutionary dead end; yet they became extinct because of their inferiority. Hmmm...there's not much science going on here!

The usual story until recently was that at some point (often centred around 40,000-50,000 years ago) archaic sapiens developed modern human behaviour, principally in the form of imaginative, symbolic thinking. This of course ignores the (admittedly tentative) archaeological evidence of Neanderthal cave-painting, jewellery and ritual - the very behaviours supposed to be evidence of our direct ancestors' unique Great Leap Forward (yes, it was named after Chairman Mao's plan). Not only did Neanderthals have this symbolic behaviour, they appear to have developed it independently of genetically-modern humans. This is a complete about-turn from the previous position of them being nothing more than poor copyists.

There are alternative hypotheses to the Great Leap Forward, including:
  1. Founder of the Comparative Cognition Project and primate researcher Sarah Boysen observed that chimpanzees can create new methods for problem solving and processing information. Therefore, a gradual accumulation of cognitive abilities and behavioural traits over many millennia - and partially inherited from earlier species - may have reached a tipping point. 
  2. Some geneticists consider there to have been a sudden paradigm shift caused by a mutation of the FOXP2 gene, leading to sophisticated language and all that it entails.
  3. Other researchers consider that once a certain population size and density was achieved, complex interactions between individuals led the way to modern behaviour. 
  4. A better diet, principally in the form of larger amounts of cooked meat, led to increased cognition. 
All of these remain partly speculative and, as is so often the case, we may eventually find that a combination of them - plus other factors - was involved. That shouldn't stop us from recognising how poor the communication of evolutionary theory still is and how many misconceptions persist, with the complex truth obscured by our need to feel special and to tell simple stories that rarely convey the amazing evolution of life on Earth.



Wednesday 1 April 2020

Herbaceous dialects and dialectical materialism: how plants communicate with their pollinators

The inspiration behind this post stems from reading two of the giants of science popularisation during my formative years. The first component is from Carl Sagan's book Broca's Brain: Reflections on the Romance of Science, which remarks that the emotional lives of plants are an example of pure pseudoscience. The second is Stephen Jay Gould's essay on Pyotr Kropotkin, a nineteenth century Russian anarchist who wrote the essay collection Mutual Aid: A Factor of Evolution. What joins them together is recent research that uncovers an astonishingly complex relationship between certain plants and animals.

Kropotkin's hypothesis was that cooperation between species was as fundamental to life on our planet as natural selection. Although his socialist-motivated ideas have been somewhat downscaled by the evidence of the succeeding century, there are still some truths to be learnt about the mutual aid - or symbiosis if you prefer - between fundamentally different life forms.

I recently read about some experiments in Israel and Germany, which involved such esoteric boffinry as placing laser microphones close to tobacco and tomato plants in order to pick up any ultrasonic noises they might emit. The plants were heavily pruned or moved into parched soil - in other words, subjected to physiological stress.

Analysis of the recordings revealed high-pitched sounds (or in the researchers' words, 'squeals') emanating from their herbaceous guinea pigs. Not only did the sounds vary depending on whether the plant was suffering from mutilation or lack of moisture, but each species (both members of the Solanaceae family) produced differing numbers of repetitions and time intervals between each sound. What's even more interesting is that the noises differed according to the local invertebrate life, specifically the potential pollinating insects.
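Picking such 'squeals' out of a recording is, at least conceptually, straightforward signal processing. What follows is only a toy sketch in Python with made-up numbers (synthetic data, an assumed sample rate and an arbitrary loudness threshold - not the researchers' actual pipeline), but it shows how the two quantities just mentioned - the number of bursts and the intervals between them - can be extracted:

    import numpy as np

    # Synthetic stand-in for a digitised laser-microphone recording: two seconds of
    # faint background noise with three short, loud ultrasonic bursts added in.
    sample_rate_hz = 250_000                       # assumed, purely for illustration
    rng = np.random.default_rng(0)
    recording = rng.normal(0.0, 0.01, sample_rate_hz * 2)
    for start_s in (0.3, 0.9, 1.6):                # three fake 'squeals'
        i = int(start_s * sample_rate_hz)
        recording[i:i + 2_000] += 0.5 * np.sin(np.linspace(0, 400 * np.pi, 2_000))

    def find_bursts(samples, rate_hz, threshold=0.1, min_gap_s=0.05):
        """Return burst start times in seconds: runs of loud samples separated by quiet gaps."""
        loud_idx = np.flatnonzero(np.abs(samples) > threshold)
        if loud_idx.size == 0:
            return np.array([])
        new_burst = np.insert(np.diff(loud_idx) > min_gap_s * rate_hz, 0, True)
        return loud_idx[new_burst] / rate_hz

    bursts = find_bursts(recording, sample_rate_hz)
    print(len(bursts), "bursts; intervals (s):", np.round(np.diff(bursts), 2))

Comparing those counts and intervals across species and across stress types (cutting versus drought) is, in grossly simplified form, the kind of analysis the researchers describe.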

In addition to the scientists' equipment, animals such as bats and rodents were placed in the vicinity of the subjects and reacted to the sounds as they were being produced, confirming that the shrieks really were coming from the plants. The physiological cause appears to be the formation and movement of air bubbles (cavitation) in liquids such as sap, but how are plants able to perceive the problems, let alone respond to them?

It's been known for some years that plants can communicate with other members of their species by emitting chemical compounds; just think of the odour of freshly cut grass. Forest trees even share nutrients via symbiotic fungal networks connecting their roots, allowing smaller members of their species to grow faster - so much for selfish genetics!

Communication between plants by all three methods - direct contact, sound and chemical odour - suggests purpose and awareness, only without a central nervous system to guide it. This might sound impossible, but then the soil bacterium Bacillus subtilis uses potassium ions to communicate across its colonies, and few would argue that bacteria are more advanced life forms than the kingdom Plantae. We should also remember that even in animals, brains aren't the be-all and end-all: there are neurons in vertebrate (including human) guts and in the arms of cephalopods.

The symbiotic relationship between angiosperms (flowering plants) and pollinating insects evolved during the Cretaceous, so natural selection has had well over sixty-five million years to work on the communication systems between these collaborators. Could it be that plants have evolved a specialist messaging service for their pollinating symbionts, despite having no equivalent of neurons to coordinate it?

Some of the recent Israeli research seems to verify this - and how! When endangered by being cut or deprived of water, the plants' specific noises were not only picked up by pollinating insects, they were acted upon. Insects such as hawk moths flew away from the plants suffering drought or mutilation to control specimens on the far side of the greenhouse laboratory and laid their eggs on those plants instead. Meanwhile, other insects that were known pollinators of the same plant species but not local to the region ignored the audio signals. Somehow, there is a level of fine-tuning going on that reveals the sensory world of plants to be far richer than it is usually given credit for.

Parallel experiments successfully tested for the opposite effect. Individual tobacco plants with mature flowers sent messages that attracted the attention of local pollinators such as stilt bugs. All in all, it appears that certain plant species – at least of the Solanaceae family - engage in a form of mutual aid that Kropotkin would be proud of. Not only do plants use ultrasonics to target useful insects, they have developed a messaging service that is regionalised towards those insect species, essentially a dialect rather than a universal language.

While tobacco and tomato plants might not be screaming in pain every time they are cut or short of water, it seems they cannot be so easily dismissed as the poor relations of us animals. The time may be ripe for a complete reappraisal of their perceptual capabilities, although amateur researchers would do well to remember that both tomato and tobacco belong to the same family as the mandrake - and as any Harry Potter fan knows, you wouldn't want to hear those scream!

Tuesday 17 March 2020

Printing ourselves into a corner? Mankind and additive manufacturing

One technology that has seemingly come out of nowhere in recent years is the 3D printer. More correctly called additive manufacturing, the technology took only a few years to go from early industrial machines to a thriving consumer market - unlike, say, the long gestation between the invention of the video cassette recorder and affordable domestic models.

Some years ago I mentioned the similarities between the iPad and Star Trek: The Next Generation's PADD, with only a few decades separating the real-world item from its science fiction equivalent. Today's 3D printers are not so much a primitive precursor of the USS Enterprise-D's replicator as a paradigm shift away in terms of their profound limitations. And yet they still have capabilities that would have seemed incredibly futuristic when I was a child. As an aside, devices such as 3D printers and tablets show just how flexible and adaptable we humans are. Although my generation would have considered them pure sci-fi, today's children regularly use them at school and even at home, and regard the pocket calculators and digital watches of my childhood much as I looked at steam engines.

But whilst the technology can't yet produce an instant cup of Earl Grey tea, additive manufacturing tools are now being tested to create organic, even biological, components. Bioprinting promises custom-made organs and replacement tissue within the next few decades, meaning that organ rejection and immunosuppression could become things of the past. Other naturally-occurring substances such as ice crystals are also being replicated, in this case for realistic testing of how aircraft wings can be designed to minimise problems caused by icing. All in all, the technology seems to find a home in practically every sector of our society and our lives.

Even our remotest of outposts such as the International Space Station are benefiting from the use of additive manufacturing in cutting-edge research as well as the more humdrum role of creating replacement parts - saving the great expense of having to ship components into space. I wouldn't be surprised if polar and underwater research bases are also planning to use 3D printers for these purposes, as well as for fabricating structures in hostile environments. The European Space Agency has even been looking into how to construct a lunar base using 3D printing, with tests involving Italian volcanic rock as a substitute for lunar regolith.

However, even such promising, paradigm-shifting technologies as additive manufacturing have their negative aspects. In this particular case there are some obvious examples, such as home-printed handguns (originally with very short working lives, but with the development of 3D-printed projectiles instead of conventional ammunition, that is changing). There are also subtler but more profound issues arising from the technology, including how reliance on these systems can lead to over-confidence and the loss of ingenuity. It's easy to see the role hubris played in such monumental disasters as the sinking of the Titanic, but the dangers of potentially ubiquitous 3D printing technology are more elusive.

During the Apollo 13 mission in 1970, astronauts and engineers on the ground improvised a way to connect the command module's square lithium hydroxide canisters to the LM's round air-scrubbing system - literally a case of fitting a square peg into a round hole. If today's equivalents had to rely solely on a 3D printer - with its power consumption making it a less than viable option in a crippled spacecraft - they could very well be stuck. Might reliance on a virtual catalogue of components that can be manufactured at the push of a button sap the creativity vital to the next generation of space explorers?

I know young people who lack some of the skills my generation deemed fairly essential, such as map reading and basic arithmetic. But deeper than this, creative thinking is as important to the STEM disciplines as analytical rigour and mathematics. Great physicists such as Einstein and Richard Feynman remarked on how many new ideas in science come from daydreaming and guesswork rather than from sticking to robot-like algorithmic processes. Could it be that by using unintelligent machines in so many aspects of our lives we are starting to think more like them, not vice versa?

I've previously touched on how consumerism may be decreasing our intelligence in general, but in this case might such wonder devices as 3D printers be turning us into drones, reducing our ability to problem-solve in a crisis? Yes, they are a brave new world - and bioprinting may prove to be a revolution in medicine - but we need to maintain good, old-fashioned ingenuity; what we in New Zealand call the 'Number 8 wire mentality'. Otherwise, our species risks falling into the trap that there is a wonder device for every occasion - when in actual fact the most sophisticated object in the known universe rests firmly inside our heads.

Tuesday 25 February 2020

Falling off the edge: in search of a flat Earth

It's just possible that future historians will label the 21st century as the Era of Extreme Stupidity. In addition to the 'Big Four' of climate change denial, disbelief in evolution by natural selection, young Earth creationism and the anti-vaxxers, there are groups whose oddball ideas have rather less impact on our ecosystem and ourselves. One group that I place in the same camp as UFO abductees (with their probing fixation) is the believers in a flat Earth.

Although on the surface this - admittedly tiny - percentage of people appears more amusing than harmful, their media visibility makes them a microcosm of the appalling state of science education and critical thinking in general. In addition, their belief in an immense, long-running, global conspiracy adds ammunition to those with similar paranoid delusions, such as the moon landing deniers. As an example of how intense those beliefs can be (at times there's just a whiff of religious fanaticism), the American inventor and stuntman 'Mad' Mike Hughes was recently killed flying a self-built rocket intended to help prove that the Earth is a disc.

I won't bother to describe exactly what the flat Earthers take to be true, except that their current beliefs resemble a description of the late, great Terry Pratchett's fantasy Discworld - albeit without the waterfall around the edge of the disc. For anyone who wants to test the hypothesis themselves rather than rely on authority (the mark of a true scientist) there are plenty of observational methods to try. These include:
  1. Viewing the Earth's shadow on the Moon during a lunar eclipse
  2. Noticing that a sailing ship's mast disappears/reappears on the horizon after/before the hull (a rough calculation follows this list)
  3. Noting that certain stars are only visible at particular latitudes
For anyone with a sense of adventure, you can also build a high-altitude balloon or undertake a HAHO skydive to photograph the Earth's curvature - from any point on the planet!
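As for the disappearing hull, the geometry behind it is simple enough to put numbers on. Here is a rough back-of-the-envelope sketch in Python, ignoring atmospheric refraction and assuming a mean Earth radius of 6,371 km:

    import math

    EARTH_RADIUS_M = 6_371_000   # mean radius; refraction and oblateness ignored

    def horizon_distance_m(eye_height_m):
        """Distance to the horizon for an observer with eyes at the given height."""
        return math.sqrt(2 * EARTH_RADIUS_M * eye_height_m)

    def hidden_height_m(target_distance_m, eye_height_m):
        """How much of a distant object (a ship's hull, say) lies below the horizon."""
        beyond_m = target_distance_m - horizon_distance_m(eye_height_m)
        return 0.0 if beyond_m <= 0 else beyond_m ** 2 / (2 * EARTH_RADIUS_M)

    # An observer on the beach (eye height ~2 m) watching a ship 15 km offshore:
    print(round(horizon_distance_m(2) / 1000, 1), "km to the horizon")   # ~5.0 km
    print(round(hidden_height_m(15_000, 2), 1), "m of hull hidden")      # ~7.8 m

On a flat Earth both figures would of course be zero, no matter how far away the ship sailed.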

It's easy to suggest that perhaps our brains just aren't up to the task of deciphering the intricacies of a 13.7-billion-year-old universe, but basic experiments and observations made well over two thousand years ago were enough for Greek scientists to confirm both the shape and size of our planet. So what has changed in the past century or so to turn back the clock, geophysically speaking?
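Before getting to that, it's worth spelling out just how little Eratosthenes needed around 240 BCE: a shadow angle, a distance and some proportional reasoning. A quick sketch of the arithmetic (the metric length of his stadion is still debated, so the 160 m used below is only an assumption):

    # At noon on the solstice the Sun was directly overhead at Syene but cast a
    # shadow of about 7.2 degrees at Alexandria, roughly 5,000 stadia to the north.
    shadow_angle_deg = 7.2
    alexandria_to_syene_stadia = 5_000
    metres_per_stadion = 160                  # assumed; estimates range from ~157 to ~185 m

    fraction_of_circle = shadow_angle_deg / 360.0          # one fiftieth of a full circle
    circumference_stadia = alexandria_to_syene_stadia / fraction_of_circle
    circumference_km = circumference_stadia * metres_per_stadion / 1000

    print(f"{circumference_km:,.0f} km")      # ~40,000 km against the modern ~40,075 km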

The modern take on a flat Earth seems to have begun in the late 19th century, with an attempt - much like that of today's mid-Western creationists - to ignore scientific discoveries that disagree with a literal interpretation of the Old Testament. Indeed, the forerunners of today's flat Earthers were anti-science in many respects, also denying that prominent enemy of today's Biblical literalists, evolution by natural selection. However, many of the 21st century's leading adherents of a disc-shaped Earth have more sympathy for, and interest in, scientific discoveries, even accepting such politically contentious findings as rapid, human-induced climate change.

This topic is laden with ironies, few greater than the fact that a large proportion of the evidence for global warming is supplied by space agencies such as NASA. The latter has long been cast by the Flat Earth Society as a leading conspirator and purveyor of faked imagery in the promotion of a spherical Earth (yes, pedants, I know that strictly speaking our planet is an oblate spheroid rather than a perfect sphere).

Today's flat Earth societies follow the typical pseudo-scientific fringe approach, scouring the latest scientific theories for material they can cherry-pick and cannibalise to support their ideas. In recent years they've even tackled key new developments such as dark energy; about the only area where they lag behind is in co-opting quantum mechanics.

But for anyone with an understanding of parsimony or Occam's Razor, the physics required for a flat Earth has about as much likelihood as Aristotle's crystalline spheres. It isn't just the special pleading for localised astrophysics (since the other planets are deemed spherical); isn't it obviously absurd that there could be a global conspiracy involving rival nations and potentially hundreds of thousands of people - with no obvious explanation of what the conspirators gain from the deception?

Even the vast majority of the public, with little interest in or understanding of the physics, must presumably be puzzled by this apparent lack of motivation when considering the flat Earth hypothesis. In a nutshell, what's in it for the conspirators? Until recently, NASA (nicknamed 'Never A Straight Answer') was the main enemy, but with numerous other nations and private corporations building space vehicles, there is now a plethora of conspiracy partners. Going back half a century to the height of the Cold War, why, for example, would the USA and the Soviet Union have agreed to conspire? As yet there hasn't been anything approaching a satisfactory answer; but as Carl Sagan said: "Extraordinary claims require extraordinary evidence."

Unlike most fringe groups, flat Earthers don't appear to place other popular conspiracy theories above scientific evidence. Yet somehow their ability to support ludicrous ideas whilst denying fundamental observations and the laws of physics, in the face of so much material evidence, is astonishing. Of course our species doesn't have a mental architecture geared solely towards rational, methodical thought processes, but the STEM advances Homo sapiens has made over the millennia prove we are capable of suppressing the chaotic, emotional states we usually associate with young children.

Whether we can transform science education into a cornerstone topic, as daily-relevant as reading, writing and arithmetic, remains to be seen. Meanwhile, the quest continues for funding a voyage to find the Antarctic ice wall that prevents the oceans falling over the edge of the world. Monty Python, anyone?

Wednesday 22 January 2020

Wildfires and woeful thinking: why have Australians ignored global warming?

In a curious example of serendipity, I was thinking about a quote from the end of Carl Sagan's novel Contact ("For small creatures such as we the vastness is bearable only through love") just a few minutes before discovering his daughter Sasha Sagan's book For Small Creatures Such as We. Okay, so I didn't buy the book - due to the usual post-Christmas funds shortage - and cannot provide a review, but this indication of our place in the scale of creation is something that resonates deep within me.

I've often discussed how biased we are due to our physical size, especially when compared to other species we share the planet with. However, I've never really considered that other fundamental dimension, time. Another Carl Sagan quote echoes many a poet's rumination on our comparatively brief lifespan: "We are like butterflies who flutter for a day and think it is forever."

There's more to this than just a fairly familiar poetic conceit. Earlier this month I was given a brief taste of what it might be like to live on Mars, thanks to high-altitude dust and ash carried across the Tasman Sea from the Australian bush fires. By three o'clock in the afternoon a New Zealand summer's day had been turned into an eerie orange twilight, with birds and nocturnal insects starting their evening routine some five hours early. There was even a faint powdery, acrid taste in the air, adding to the sense of other-worldliness.

Apart from the obvious fact that this is an example of how climate change in one nation can affect another, there is a more disturbing element to all this. Why is it that, despite the reports and general consensus of the global climate science community, Australians have shown a woeful lack of interest in - or indeed outright negativity towards - climate change?

Could it be that our society is now centred upon such short increments of time (competing businesses trying to out-do each other, which comes down to working at the ever-increasing speed our technology dictates) that we have replaced analysis with unthinking acceptance of the simplest and most aggressive opinions? Research shows that compared to even twenty years ago, children read far less non-school literature and rely on the almost useless 'celebrity' shouters of social media for much of their information; there's not much chance of learning about informed, considered arguments via these sources!

After all, it's difficult for most of us to remember exact details of the weather a year ago, but understanding climate change relies on acceptance of directional trends over at least decades. How much easier is it to accept the opinions of those who preserve the status quo and claim we can maintain our current lifestyle with impunity? When combined with the Western capitalist notion of continuous growth and self-regulation, we see a not-so-subtle indoctrination that describes action to prevent climate change as disruptive to the fundamental aspects of the society that has arisen since the Industrial Revolution.

There is an old French saying that we get the government we deserve, which in Australia's case implies a widespread desire to ignore or even deny global warming. Yet the irony is that of all developed nations, Australia has been at the receiving end of some of its worst effects, thanks to an average temperature rise of well over a degree Celsius over the past century. It takes little cognition to understand how this can lead to the drier conditions that have caused the horrific bush fires; even though some were deliberately started, their scale has been exacerbated by the changing climate. So what until now has prevented Australians from tying the cause to the effects?

It's not as if there isn't plenty of real-world evidence. However, with computer technology able to generate 'deep fakes' of such sophistication that only experts can detect them, is the public becoming mistrustful of the multitude of videos and photographs of melting polar caps and shrinking glaciers? Combined with decreasing trust in authority figures, scientists and their technical graphs and diagrams don't stand much of a chance of acceptance without a fair amount of suspicion. As mentioned, it's difficult to grasp the subtleties inherent in much of science when you are running at breakneck speed just to stand still; slogans and comforting platitudes are much more palatable - unless of course people become caught up in the outcome themselves.

However, this doesn't explain why key phrases such as 'climate change' and 'global warming' generate such negative sentiment, even among those Australian farmers who admit to hotter, drier conditions than their parents' and grandparents' generations experienced. Somehow, these sober terms have become tainted as political slogans rather than scientifically-derived descriptions of reality. That the deniers have achieved this seems incredible when you consider that their position not only runs counter to the vast majority of the data but comes from many with vested interests in maintaining current industrial practices and levels of fossil fuel usage.

Could it simply be a question of semantics, with the much-used labels rejected even while the directly-experienced effects are accepted as real? If so, it would suggest that our contemporary technological society differs little from the mindset of pre-industrial civilisation, in which leaders were believed to have at the very least a divine right to rule, or even a divine bloodline. In which case, is it appalling to suggest that the terrible bush fires have occurred not a minute too soon?

If it is only by becoming victims at the tip of the impending (melted) iceberg that global warming is deemed genuine, then so be it. When scientists are mistrusted and activists labelled everything from misguided to corrupt, scheming manipulators, it may take a foretaste of what lies ahead to convince a majority who would otherwise rather carry on as they always have and trust politicians to do the thinking for them. I can think of nothing more apt to end on than another Carl Sagan quote: "For me, it is far better to grasp the Universe as it really is than to persist in delusion, however satisfying and reassuring."