Wednesday, 12 September 2018

Seasons of the mind: how can we escape subjective thinking?

According to some people I've met, the first day of spring in the Southern Hemisphere has been and gone with the first day of September. Not incidentally, there are also some, myself included, who think that it has suddenly started to feel a bit warmer. Yet apparently the official start date is the spring equinox, during the third week of September. So on the one hand the weather has been warming since the start of the month, but on the other, why should a planet follow neat calendrical conventions such as the first of the month? Just how accurate is the official definition?

There are many who like to reminisce about how much better the summer weather was back in their school holidays. The rose-tinted memories of childhood can seem idyllic, although I also recall summer days of non-stop rain (I did grow up in the UK, after all). Our personal experiences, particularly during our formative years, can promote emotion-based responses so deeply ingrained that we fail to consider they may be inaccurate. Subjectivity and wishful thinking are key to the human experience: how often do we remember the few hits and not the far more numerous misses? As science is practiced by humans, it is subject to the same lack of objectivity as anything else; only its built-in error-checking can steer practitioners onto a more rational course than in other disciplines.

What got me to ponder the above was that on meeting someone a few months ago for the first time, almost his opening sentence was a claim that global warming isn't occurring and that instead we are on the verge of an ice age. I didn't have time for a discussion on the subject, so I filed that one for reply at a later date. Now seems like a good time to ponder what it is that leads people to make such assertions, seemingly contrary to the evidence.

I admit to being biased on this particular issue, having last year undertaken research for a post on whether agriculture has postponed the next glaciation (note that this woolly - but not mammoth, ho-ho - terminology is one of my bugbears: we are already in an ice age, but currently in an interglacial stage). Satellite imagery taken over the past few decades shows clear evidence of large-scale reductions in global ice sheets. For example, the northern polar ice cap has been reduced by a third since 1980, with what remains only half its previous thickness. Even so, are three decades a long enough period to make accurate predictions? Isn't using a timescale comparable to a human lifespan just as bad as relying on personal experience?

The UK's Met Office has confirmed that 2018 was that nation's hottest summer since records began - which in this instance only goes back as far as 1910. In contrast, climate change sceptics use a slight growth in Antarctic sea ice (contrary to its steadily decreasing continental ice sheet) as evidence of climate equilibrium. Now I would argue that this growth is just a local drop in the global ocean, but I wonder if my ice age enthusiast cherry-picked this data to formulate his ideas? Even so, does he believe that all the photographs and videos of glaciers, etc. have been faked by the twenty or so nations who have undertaken Earth observation space missions? I will find out at some point!

If we try to be as objective as possible, how can we confirm with complete certainty the difference between long term climate change and local, short term variability? In particular, where do you draw the line between the two? If we look at temporary but drastic variations over large areas during the past thousand years, there is a range of time scales to explore. The 15th to 18th centuries, predominantly the periods 1460-1550 and 1645-1715, contained climate variations now known as mini ice ages, although these may have been fairly restricted in geographic extent. Some briefer but severe, wide-scale swings can be traced to single events, such as the four years of cold summers following the Tambora eruption of 1815.

Given such variability over the past millennium, in itself a tiny fragment of geological time, how much certainty surrounds the current changes? The public have come to expect yes or no answers delivered with aplomb, yet some areas of science such as climate studies involve chaos mathematics, thus generating results based on levels of probability. What the public might consider vacillation, researchers consider the ultimate expression of scientific good practice. Could this lack of black-and-white certainty be why some media channels insist on providing a 'counterbalancing' viewpoint from non-expert sources, as ludicrous as this seems?

In-depth thinking about a subject relies upon compartmentalisation and reductionism. Otherwise, we would forever be bogged down in the details and never be able to form an overall picture. But this quantising of reality is not necessarily a good thing if it generates a false impression regarding cause and effect. By suffering from what Richard Dawkins calls the “tyranny of the discontinuous mind” we are prone to creating boundaries that just don't exist. In which case, could a line ever be found between short term local variation and global climate change? Having said that, I doubt many climate scientists would use this as an excuse to switch to weather forecasting instead. Oh dear: this is beginning to look like a 'does not compute' error!

In a sense of course we are exceptionally lucky to have developed science at all. We rely on language to define our ideas, so need a certain level of linguistic sophistication to achieve this focus; tribal cultures whose numbers consist of imprecise values beyond two are unlikely to achieve much headway in, for example, classifying the periodic table.

Unfortunately, our current obsession with generating information of every quality imaginable and then uploading it to all available channels for the widest possible audience inevitably leads to a tooth-and-claw form of meme selection. The upshot of this bombardment of noise and trivia is that an enormous amount of time is required just to filter it; the knock-on effect is that minimal time is left for identifying the most useful or accurate content rather than simply the most disseminated.

Extremist politicians have long been adept at exploiting this weakness, expounding polarising phraseology that initially sounds good but lacks depth; they achieve cut-through with the simplest and loudest of arguments, fulfilling the desire most people have to fit into a rigid social hierarchy - as seen in many other primate species. The problem is that, much like centrist politicians who can see both sides of an argument but whose rational approach negates emotive rhetoric, scientists are often stuck with the unappealing options of either taking a stand when the outcome is not totally clear, or facing accusations of evasion. There is a current trend, particularly espoused by politicians, to disparage experts; but discovering how the universe works doesn't guarantee hard-and-fast answers supplied exactly when required, ready to provide comfort blankets in a harsh world.

Where then does this leave critical thinking, let alone science? Another quote from Richard Dawkins is that "rigorous common sense is by no means obvious to much of the world". This pessimistic view of the human race is supported by many a news article but somewhat negated by the immense popularity of star science communicators, at least in a number of countries.

Both the methods and results of science need to find a space amongst the humorous kitten videos, conspiracy theorists and those who yearn for humanity to be the pinnacle and purpose of creation. If we can comprehend that our primary mode of thinking includes a subconscious baggage train of hopes, fears and distorted memories, we stand a better chance of seeing the world for how it really is and not how we wish it to be. Whether enough of us can dissipate that fog remains to be seen. Meanwhile, the ice keeps melting and the temperature rising, regardless of what you might hear...

Monday, 27 August 2018

Hammer and chisel: the top ten reasons why fossil hunting is so important

At a time when the constantly evolving world of consumer digital technology seems to echo the mega-budget, cutting-edge experiments of the LHC and LIGO, is there still room for such an old-fashioned, low-tech science as paleontology?

The answer is of course yes, and while non-experts might see little difference between its practice today and that of its Eighteenth and Nineteenth Century pioneers, contemporary paleontology does on occasion utilise MRI scanners among other sophisticated equipment. I've previously discussed the delights of fossil hunting as an easy way to involve children in science, yet the apparent simplicity of its core techniques masks the key role that paleontology still plays in evolutionary biology.

Since the days of Watson and Crick molecular biology has progressed in leaps and bounds, yet the contemporary proliferation of cheap DNA-testing kits and television shows devoted to gene-derived genealogy obscures just how tentatively some of their results should be accepted. The levels of accuracy quoted in non-specialist media are often far greater than what can actually be attained. For example, the data on British populations has so far failed to separate those with Danish Viking ancestry from descendants of earlier Anglo-Saxon immigration, leading to population estimates at odds with the archaeological evidence.


Here then is a list of ten reasons why fossil hunting will always be a relevant branch of science, able to supply information that cannot be supplied by other scientific disciplines:
  1. Locations. Although genetic evidence can show the broad sweeps connecting extant (and occasionally, recently-extinct) species, the details of where animals, plants or fungi evolved, migrated to - and when - relies on fossil evidence.
  2. Absolute dating. While gene analysis can be used to estimate when the last common ancestor of contemporary species lived, the results are provisional at best for when certain key groups or features evolved. Thanks to radiometric dating at some fossiliferous locales, paleontologists can bracket fossil-filled strata that lack radioactive mineralogy between layers that can be dated directly.
  3. Initial versus canonical. Today we think of land-living tetrapods (i.e. amphibians, reptiles, mammals and birds) as having a maximum of five digits per limb. Although these are reduced in many species – as with horses' hooves – five is considered canonical. However, fossil evidence shows that early terrestrial vertebrates had up to eight digits on some or all of their limbs. We know genetic mutation can add extra digits, but this doesn't help reconstruct the polydactyly of ancestral species; only fossils provide confirmation.
  4. Extinct life. Without finding their fossils, we wouldn't know of even such long-lasting and multifarious groups as the dinosaurs: how could we guess about the existence of a parasaurolophus from looking at its closest extant cousins, such as penguins, pelicans or parrots? There are also many broken branches in the tree of life, with such large-scale dead-ends as the pre-Cambrian Ediacaran biota. These lost lifeforms teach us something about the nature of evolution yet leave no genetic evidence.
  5. Individual history. Genomes show the cellular state of an organism, but thanks to fossilised tooth wear, body wounds and stomach contents (including gastroliths) we have important insights into day-to-day events in the life of ancient animals. This has led to fairly detailed biographies of some creatures, prominent examples being Sue the T-Rex and Al the Allosaurus, their remains being comprehensive enough to identify various pathologies.
  6. Paleoecology. Coprolites (fossilised faeces), along with the casts of burrows, help build a detailed environmental picture that zoology and molecular biology cannot provide. Sometimes the best source of vegetation data comes from coprolites containing plant matter, due to the differing circumstances of decomposition and mineralisation.
  7. External appearance. Thanks to the likes of scanning electron microscopes, fossils of naturally mummified organisms or mineralised skin can offer details that are unlikely to be discovered by any other method. A good example that has emerged in the past two decades is the colour of feathered dinosaurs, obtained from the shape of their melanosomes.
  8. Climate analysis. Geological investigation can provide ancient climate data, but fossil evidence, such as the giant insects of the Carboniferous period, confirms the hypotheses. After all, dragonflies with seventy-centimetre wingspans couldn't survive at today's level of atmospheric oxygen.
  9. Stratigraphy. Paleontology can help geologists trying to sequence an isolated section of folded stratigraphy that doesn't have radioactive mineralogy. By assessing the relative order of known fossil species, the laying down of the strata can be placed in the correct sequence.
  10. Evidence of evolution. Unlike the theories and complex equipment used in molecular biology, anyone without expert knowledge can visit fossils in museums or in situ. They offer a prominent resource as a defence against religious fundamentalism, as their ubiquity makes them difficult to explain by alternative theories. The fact that species are never found in strata outside their era supports the scientific view of life's development rather than the accounts found in religious texts (the Old Testament, for example, erroneously states that birds were created prior to all other land animals).
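The bracketing approach in point 2 rests on simple exponential-decay arithmetic. Here is a minimal sketch; the isotope counts and the potassium-40 half-life are illustrative assumptions of mine, not figures from the post:

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age of a sample from its parent/daughter isotope ratio,
    assuming a closed system with no initial daughter isotope."""
    decay_const = math.log(2) / half_life
    return math.log(1 + daughter / parent) / decay_const

# Equal parts parent and daughter means exactly one half-life has elapsed;
# 1.25 billion years is the approximate half-life of potassium-40.
age = radiometric_age(parent=1.0, daughter=1.0, half_life=1.25e9)
```

A dated volcanic ash layer above a fossil bed and another below it then bracket the fossils' age, even when the fossil-bearing rock itself contains nothing datable.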
To date, no DNA older than about 800,000 years has been found, meaning that many of the details of the history of life rely primarily on fossil evidence. It's therefore good to note that even in an age of high-tech science, the painstaking techniques of paleontology can shed light on biology in a way unobtainable by more recent additions to the scientific toolkit. Of course, the discipline is far from fool-proof: it is thought that only about ten percent of all species have ever come to light in fossil form, with the found examples heavily skewed in favour of shallow marine environments.

Nevertheless, paleontology is a discipline that constantly proves its immense value in expanding our knowledge of the past in a way no religious text could ever do. It may be easy to understand what fossils are, but they are assuredly worth their weight in gold: precious windows onto an unrecoverable past.

Monday, 13 August 2018

Life on Mars? How accumulated evidence slowly leads to scientific advances

Although the history of science is often presented as a series of eureka moments, with a single scientist's brainstorm paving the way for a paradigm-shifting theory, the truth is usually rather less dramatic. A good example is the formulation of plate tectonics, with the meteorologist Alfred Wegener's continental drift being rejected by the geological orthodoxy for over thirty years. It was only with the accumulation of data from the late 1950s onward that the mobility of Earth's crust slowly gained acceptance, thanks to the multiple strands of new evidence that supported it.

One topic that looks likely to increase in popularity amongst both public and biologists is the search for life on Mars. Last month's announcement of a lake deep beneath the southern polar ice cap is the newest piece of observational data suggesting that Mars might still have environments suitable for microbial life, adding to an increasing body of evidence that conditions may still be capable of supporting it, long after the planet's biota-friendly heyday. However, the data hasn't always been so positive, having fluctuated in both directions over the past century or so. So how closely does the level of research into life on Mars track the positivity of the results?

The planet's polar ice caps were first discovered in the late Seventeenth Century, which, combined with the Earth-like duration of the Martian day, implied the planet might be fairly similar to our own. This was followed a century later by observation of what appeared to be seasonal changes to surface features, leading to the understandable conclusion of Mars as a temperate, hospitable world covered with vegetation. Then, another century on, an early use of spectroscopy erroneously described abundant water on Mars; although the mistake was later corrected, the near contemporary reporting of non-existent Martian canals led to soaring public interest and intense speculation. The French astronomer Camille Flammarion helped popularise Mars as a potentially inhabited world, paving the way for H.G. Wells' War of the Worlds and Edgar Rice Burroughs' John Carter series.

As astronomical technology improved and the planet's true environment became known (low temperatures, thin atmosphere and no canals), Mars' popularity waned. By the time of Mariner 4's 1965 fly-by, the arid, cratered and radiation-smothered surface it revealed only served to reinforce the notion of a lifeless desert; the geologically inactive world was long past its prime and any life still existing there probably wouldn't be visible without a microscope.

Despite this disappointing turnabout, NASA somehow managed to gain the funding to incorporate four biological experiments on the two Viking landers that arrived on Mars in 1976. Three of the experiments gave negative results while the fourth was inconclusive, most researchers hypothesising a geochemical rather than biological explanation for the outcome. After a decade and a half of continuous missions to Mars, this lack of positive results - accompanied by experimental cost overruns - probably contributed to a sixteen-year hiatus (excluding two Soviet attempts at missions to the Martian moons). Clearly, Mars' geology by itself was not enough to excite the interplanetary probe funding czars.

In the meantime, it was some distinctly Earth-bound research that reignited interest in Mars as a plausible source of life. The 1996 report that Martian meteorite ALH84001 contained features resembling fossilised (if extremely small) bacteria gained worldwide attention, even though the eventual conclusion repudiated this. Analysis of three other meteorites originating from Mars showed that complex organic chemistry, lava flows and moving water were common features of the planet's past, although they offered no more than tantalising hints that microbial life may have flourished, possibly billions of years ago.

Back on Mars, NASA's 1997 Pathfinder lander delivered the Sojourner rover. Although it appeared to be little more than a very expensive toy, managing a total distance in its operational lifetime of just one hundred metres, the proof of concept led to much larger and more sophisticated vehicles culminating in today’s Curiosity rover.

The plethora of Mars missions over the past two decades has delivered immense amounts of data, including that the planet used to have near-ideal conditions for microbial life - and still has a few types of environment that may be able to support minuscule extremophiles.

Together with research undertaken in Earth-bound simulators, the numerous Mars projects of the Twenty-first Century have to date swung the pendulum back in favour of a Martian biota. Here are a few prominent examples:

  • 2003 - atmospheric methane is discovered (the lack of active geology implying a biological rather than geochemical origin)
  • 2005 - atmospheric formaldehyde is detected (it could be a by-product of methane oxidation)
  • 2007 - silica-rich rocks, similar to hot springs, are found
  • 2010 - giant sinkholes are found (suitable as radiation-proof habitats)
  • 2011 - flowing brines and gypsum deposits discovered
  • 2012 - lichen survived for a month in the Mars Simulation Laboratory
  • 2013 - proof of ancient freshwater lakes and complex organic molecules, along with a long-lost magnetic field
  • 2014 - large-scale seasonal variation in methane, greater than expected if of geochemical origin
  • 2015 - Earth-based research successfully incubates methane-producing bacteria under Mars-like conditions
  • 2018 - a brine lake some 20 kilometres across is found under the southern polar ice sheet

Although these facts accumulate into an impressive package in favour of Martian microbes, they should probably be treated as independent points, not as one combined argument. For as well as finding factors supporting microbial life, other research has produced opposing ones. For example, last year NASA found that a solar storm had temporarily doubled surface radiation levels, meaning that even dormant microbes would have to live over seven metres down in order to survive. We should also bear in mind that for some of each orbit, Mars veers outside our solar system's Goldilocks Zone and as such any native life would have its work cut out for it at aphelion.
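That orbital caveat is easy to quantify with the inverse-square law. A minimal back-of-envelope sketch using Mars' standard orbital parameters (the comparison with Earth in the comments is my own addition, not from the text above):

```python
# Solar flux at Mars' orbital extremes, via the inverse-square law.
A = 1.524        # Mars' semi-major axis, in astronomical units (AU)
E = 0.0934       # Mars' orbital eccentricity
S_EARTH = 1361.0 # approximate solar constant at 1 AU, W/m^2

perihelion = A * (1 - E)  # ~1.38 AU
aphelion = A * (1 + E)    # ~1.67 AU

flux_peri = S_EARTH / perihelion ** 2
flux_aph = S_EARTH / aphelion ** 2

# Mars receives roughly 45% more sunlight at perihelion than at aphelion -
# a far bigger seasonal swing than Earth's roughly 7%.
ratio = flux_peri / flux_aph
```

So any native microbes would face not just low average insolation but a pronounced annual feast-and-famine cycle of incoming energy.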

A fleet of orbiters, landers, rovers and even a robotic helicopter are planned for further exploration in the next decade, so clearly the search for life on Mars is still deemed a worthwhile effort. Indeed, five more missions are scheduled for the next three years alone. Whether any will provide definitive proof is the big question, but conversely, how much of the surface - and sub-surface - would need to be thoroughly searched before concluding that Mars has either never had microscopic life or that it has long since become extinct?

What is apparent from all this is that the quantity of Mars-based missions has fluctuated according to confidence in the hypothesis. In other words, the more that data supports the existence of suitable habitats for microbes, the greater the amount of research to find them. In a world of limited resources, even such profoundly interesting questions as extra-terrestrial life appear to gain funding based on the probability of near-future success. If the next generation of missions fails to find traces of even extinct life, my bet would be a rapid and severe curtailing of probes to the red planet.

There is a caricature of the stages that scientific hypotheses go through, which can ironically best be described using religious terminology: they start as heresy; proceed to acceptance; and are then carved into stone as orthodoxy. Of course, unlike with religions, the vast majority of practitioners accept the new working theory once the data has passed a certain probability threshold, even if it totally negates an earlier one. During the first stage - and as the evidence starts to be favourable - more researchers may join the bandwagon, hoping to be the first to achieve success.

In this particular case, the expense and sophistication of the technology prohibits entries from all except a few key players such as NASA and ESA. It might seem obvious that in expensive, high-tech fields, there has to be a correlation between hypothesis-supporting facts and the amount of research. But this suggests a stumbling block for out-of-the-box thinking, as revolutionary hypotheses fail to gain funding without at least some supporting evidence.

Therefore does the cutting-edge, at least in areas that require expensive experimental confirmation, start life as a chicken-and-egg situation? Until data providentially appears, is it often the case that the powers-that-be have little enticement for funding left-field projects? That certainly seems to have been true for meteorologist Alfred Wegener and his continental drift hypothesis, since it took several research streams to codify plate tectonics as the revolutionary solution. 

Back to Martian microbes. Having now read in greater depth about seasonal methane, it appears that the periodicity could be due to temperature-related atmospheric changes. This only leaves the scale of variation as support for a biological rather than geochemical origin. Having said that, the joint ESA/Roscosmos ExoMars Trace Gas Orbiter may find a definitive answer as to its source in the next year or so, although even a negative result is unlikely to close the matter for some time to come. Surely this has got to be one of the great what-ifs of our time? Happy hunting, Mars mission teams!

Monday, 30 July 2018

Biophilic cities: why green is the new black

I've previously discussed the notion that children who spend more time outside in natural surroundings are more likely to have improved mental and physical health compared to their indoors, gadget-centred peers, but does the same hold true for adults as well? After all, there have been many claims that the likes of the fractal geometry of natural objects, the sensual stimulation, the random behaviour of animals, even feeling breezes or better air quality can have a positive or 'wellness' (horrific term though it is) effect.

It is pretty much a given that the larger the percentage of nature existing within conurbations, the greater the improvement to the local environment. This begins at the practical level, with vegetation mitigating extremes of heat while its roots help reduce flooding. In addition, fauna and flora gain more room to live in, with a greater number of species able to survive than just the usual urban adaptees such as rats and pigeons. But what about the less tangible benefits to humans, culminating in a better quality of life? Science isn't wishful thinking, so what is the evidence that more nature-filled urban environments improve life for all citizens, not just children?

Studies suggest that having window views of trees can increase concentration and wellbeing in the workplace, while for hospital patients there is a clear correlation between types of view and both the length of recovery periods and painkiller usage. Therefore it seems that even the appearance of close-at-hand nature can have an effect, without the necessity of immersion. Having said that, there are clear advantages to having a public green space, since it allows a wide range of activities such as flying kites, playing ball games, jogging and boot camps, or just having a picnic.

Our largely sedentary, over-caloried lives necessitate as much physical activity as we can get, but there is apparently something greater than just physical exercise behind nature as a promoter of wellbeing. Investigation appears to show that spaces with trees and the hint of wilderness are far more beneficial than the unnatural and restricted geometries of manicured lawns and neatly maintained flower beds. It seems that we are still very much beholden to the call of the wild. If this is a fundamental component of our highly civilised lives, are urban planners aware of this and do they incorporate such elements into our artificial environments?

The concept of integrating nature into our towns and cities certainly isn't a new one. As a child, I occasionally visited Letchworth Garden City, a town just north of London. As the name suggests, it was an early form of 'Green Belt' planning, created at the start of the Twentieth century and divided into sectors for residential, industrial and agricultural usage. In its first half century it tried to live up to its intention to be self-sufficient in food, water and power generation, but this later proved impractical. I don't recall it being anything special, but then its heyday as a mecca for the health conscious (at a time when the likes of exercise and vegetarianism were associated with far left-wing politics) has long since passed. As to whether the inhabitants have ever been mentally - or even physically - advantaged compared to the older conurbations elsewhere in the UK, I cannot find any evidence.

Across the Atlantic, the great American architect Frank Lloyd Wright conceived of something similar but on a far larger scale. His Broadacre City concept was first published in 1932, with the key idea that every family would live on an acre-sized plot. However, Wright's concept - apart from being economically prohibitive - relied on private cars (later updated to the aerotor, a form of personal helicopter) for most transportation; sidewalks were largely absent from his drawings and models. Incidentally, some US cities today have partially adopted the sidewalk-free model but without Wright's green-oriented features. For example, there are suburbs in oil-centric Houston that are only reachable by car; you have to drive even to reach shopping malls you can see from your own home, and high pedestrian mortality rates prove the dangers of attempting to walk anywhere. Back to Wright: like many of his schemes, his own predilections and aesthetic sensibilities seem to have influenced his design rather more than any evidence-based insight into social engineering.

In recent years the term 'biophilic cities' has been used to describe conurbations attempting to increase their ratio of nature to artifice, often due to a combination of public campaigning and far-sighted local governments. Although these schemes cover much wider ground than just human wellbeing (prominent issues being reduction in power usage and waste, greater recycling and ecological diversity, etc), one of the side effects of these improvements is a better quality of life. Thirteen cities joined the Biophilic Cities project in 2013, but others are just as committed in the long term to offsetting the downsides of urban living. Here are three cities I have visited that are dedicated to improving their environment:

  1. Singapore. Despite the abundance of tower blocks, especially in its southern half, this city that is also a nation has a half-century history of planting vegetation in order to live up to the motto ‘Singapore - City in a Garden’. Despite its large-scale adoption of high-tech, high-rise architecture, Singapore has preserved an equivalent area of green space and now ranks top of the Green View Index. Even the maximal artificiality of the main highways is tempered by continuous rows of tall, closely-packed trees, while building regulations dictate replacement of ground-level vegetation lost to development. A new 280-metre tall office, retail and residential building, due for completion in 2021, is set to incorporate overtly green elements such as a rainforest plaza. It could be argued that it's easy for Singapore to undertake such green initiatives considering that much of the city didn't exist before the late Twentieth century and what did has been subject to wide-scale demolition. However, it seems that Singapore's Government has a long-term strategy to incorporate nature into the city, with the resulting improvements in the mental and physical wellbeing of its inhabitants.
  2. Toronto. Although not as ecologically renowned as Vancouver, the local government and University of Toronto are engaged in a comprehensive series of plans to improve the quality of life for both humans and the rest of nature. From the green roof bylaw and eco-friendly building subsidies to the Live Green Toronto Program, there is a set of strategies to aid the local environment and the planet in general. It is already paying dividends in a large reduction of air pollution-related medical cases, while quality-of-life improvements are shown by the substantial bicycle-friendly infrastructure and an increase in safe swimming days. There's still plenty to do in order to achieve their long term goals, particularly around traffic-related issues, but the city and its inhabitants are clearly aiming high.
  3. Wellington. New Zealand's capital has wooded parks and tree-filled valleys that the council promotes as part of the city's quality of life. The recreated wetlands at Waitangi Park and the Zealandia (formerly Karori) predator-proof wildlife sanctuary are key components in the integration of large-scale nature into the urban environment. Indeed, the latter is proving so successful that rare native birds such as the kaka are being increasingly found in neighbourhood gardens. Both the city and regional councils are committed to improving both the quality of life for citizens as well as for the environment in general, from storm water filtering in Waitangi Park to the wind turbines on the hilltops of what may be the world's windiest city.

These cities are just the tip of the iceberg when it comes to conurbations around the world seeking to make amends for the appalling environmental and psychological consequences of cramming immense numbers of humans into a small region that cannot possibly supply all their needs. In some respects these biophilic cities appear too good to be true, as their schemes reduce pollution and greenhouse gas emissions, improve the local ecosystem, and at the same time appear to aid the physical and mental wellbeing of their inhabitants. Yet it shouldn't be surprising really; cities are a recent invention and before that a nomadic lifestyle embedded us in landscapes that were mostly devoid of human intervention. If we are to achieve any sort of comfortable equilibrium in these hectic times, then surely covering bare concrete with greenery is the key? You don't have to be a hippy tree hugger to appreciate what nature can bring to our lives.

Sunday, 15 July 2018

Minding the miniscule: the scale prejudice in everyday life

I was recently weeding a vegetable bed in the garden when out of the corner of my eye I noticed a centipede frantically heading for cover after I had inadvertently disturbed its hiding spot. In my experience, most gardeners are oblivious to the diminutive fauna and flora around them unless they are pests targeted for removal or obliteration. It's only when the likes of a biting or stinging organism - or even just a large and/or hairy yet harmless spider - comes into view that people consciously think about the miniature cornucopia of life around them.

Even then, we consider our needs rather greater than theirs: how many of us stop to consider the effect we are having when we dig up paving slabs and find a bustling ant colony underneath? In his 2004 essay Dolittle and Darwin, Richard Dawkins pondered what contemporary foible or -ism future generations will castigate us for. Something I consider worth looking at in this context is scale-ism, which might be defined as the failure to apply a suitable level of consideration to life outside of 'everyday' measurements.

I've previously discussed near-microscopic water-based life, but larger fauna visible without optical aids are easy to overlook when humans live in a primarily artificial environment - as over half our species now does. Several ideas spring to mind as to why breaking this scale-based prejudice could be important:
  1. Unthinking destruction or pollution of the natural environment doesn't just cause problems for 'poster' species, predominantly cuddly mammals. The invertebrates that live on or around larger life-forms may be critical to these ecosystems or even further afield. Removal of one, seemingly inconsequential, species could allow another to proliferate at potentially great cost to humans (for example, as disease vectors or agricultural pests). Food webs don't begin at the chicken and egg level we are used to from pre-school picture books onwards.
  2. The recognition that size doesn't necessarily equate to importance is critical to the preservation of the environment not just for nature's sake but for the future of humanity. Think of the power of the small water mould Phytophthora agathidicida which is responsible for killing the largest residents of New Zealand's podocarp forests, the ancient coniferous kauri Agathis australis. The conservation organisation Forest and Bird claims that kauri are the lynchpin for seventeen other plant species in these forests: losing them will have a severe domino effect.
  3. Early detection of small-scale pests may help to prevent their spread but this requires vigilance from the wider public, not just specialists; failure to recognise that tiny organisms may be far more than a slight nuisance can be immensely costly. In recent years there have been two cases in New Zealand where the accidental import of unwanted insects had severe if temporary repercussions for the economy. In late 2017 three car carriers were denied entry to Auckland when they were found to contain the brown marmorated stink bug Halyomorpha halys. If they had not been detected, it is thought this insect would have caused NZ$4 billion in crop damage over the next twenty years. Two years earlier, the Queensland fruit fly Bactrocera tryoni was found in central Auckland. As a consequence, NZ$15 million was spent eradicating it, a small price to pay for the NZ$5 billion per annum it would have cost the horticulture industry had it spread.
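A little arithmetic makes the scale of the figures in point 3 easier to grasp. This sketch uses only the sums quoted above, with Python standing in for a calculator:

```python
# Cost-benefit arithmetic for the two insect incursions, using only
# the figures quoted in the text above.

stink_bug_damage = 4e9            # NZ$ of crop damage over twenty years
stink_bug_per_year = stink_bug_damage / 20

fruit_fly_eradication = 15e6      # NZ$ spent on eradication
fruit_fly_avoided_pa = 5e9        # NZ$ per annum the industry stood to lose

payback_ratio = fruit_fly_avoided_pa / fruit_fly_eradication
print(f"Stink bug: NZ${stink_bug_per_year / 1e6:.0f} million per year at risk")
print(f"Fruit fly: each eradication dollar protected NZ${payback_ratio:.0f} per year")
```

In other words, the fruit fly response paid for itself several hundred times over in a single year - which is why biosecurity vigilance at this small scale is so cheap relative to inaction.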
Clearly, these critters are to be ignored at our peril! Although the previous New Zealand government introduced the Predator Free 2050 programme, conservation organisations claim that the lack of central funding and detailed planning makes the scheme wildly unrealistic (if anything, the official website suggests that local communities should organise volunteer groups and undertake most of the work themselves!) Even so, the scheme is intended to eradicate alien mammal species, presumably on the grounds that, however significant, pest invertebrates are simply too small to exclude permanently - the five introduced wasp species spring to mind at this point.

It isn't just smaller-scale animals that are important - how many people have you met who think that the word 'animal' means only a creature with a backbone, excluding insects and other invertebrates? Minute and inconspicuous plants and fungi also need considering. As Auckland Botanic Gardens curator Bec Stanley is keen to point out, most of the public appear to have plant blindness. Myrtle rust is a fungus that attacks native plants such as the iconic pōhutukawa or New Zealand Christmas tree, having most probably been carried on the wind to New Zealand from Australia. It isn't just New Zealand's Department of Conservation that is asking the public to watch out for it: the Ministry for Primary Industries also requests notification of its spread across the North Island, due to the potential damage to commercial species such as eucalyptus. This is yet another example of a botanical David versus Goliath situation going on right under our oblivious noses.

Even without the economic impact, paying attention to the smaller elements within our environment is undoubtedly beneficial. Thinking more holistically and less parochially is often a good thing when it comes to science and technology; paradigm shifts are rarely achieved by being comfortable and content with the status quo. Going beyond the daily centimetre-to-metre range that we are used to dealing with allows us to comprehend a bit more of the cosmic perspective that Neil deGrasse Tyson and other science communicators endeavour to promote - surely no bad thing when it comes to lowering boundaries between cultures in a time with increasingly sectarian states of mind?

Understanding anything a little out of the humdrum can be interesting in and of itself. As Brian Cox's BBC documentary series Wonders of Life showed, a slight change of scale can lead to apparent miracles, such as the insects that can walk up glass walls or support hundreds of times their own weight and shrug off equally outsized falls. Who knows, preservation or research into some of our small-scale friends might lead to considerable benefits too, as with the recent discovery of the immensely strong silk produced by Darwin's bark spider Caerostris darwini. Expanding our horizons isn't difficult, it just requires the ability to look down now and then and see what else is going on in the world around us.

Wednesday, 27 June 2018

A necessary evil? Is scientific whaling worthwhile - or even valid science?

There are some phrases - 'creation science' and 'military intelligence' spring readily to mind - that are worth rather more attention than a first or second glance. Another example is 'scientific whaling', which I believe deserves wider dissemination in the global public consciousness. I previously mentioned this predominantly Japanese phenomenon back in 2010 and it has subsequently had the habit of occasionally appearing in the news. It likewise has a tendency to aggravate emotions rather than promote rational discourse, making it difficult to discern exactly what is going on and whether it fulfils the first part of the phrase.

I remember being about ten years old when a classmate's older sister visited our school and gave a talk describing her work for Greenpeace. At the time this organisation was in the midst of the Save the Whale campaign, which from my memory appears to have been at the heart of environmental activism in the 1970s. As such, it gained a high level of international publicity and support, perhaps more so than any previous conservation campaign.

Although this finally led to a ban on whale hunting in 1986, several nations opted out. In addition to a small-scale continuation in some indigenous, traditional, whale-hunting communities, Iceland and Norway continue to hunt various species. As a result, various multi-national corporations have followed public opinion and removed their operations from these nations. Japan, on the other hand - with a much larger economy and population, yet home to a far greater whale-hunting operation - is a very different prospect.

There was an international outcry back in March when Norway announced that it was increasing its annual whaling quota by 28%. It's difficult to understand the motivation behind this rise, bearing in mind that Norway's shrinking whale fleet is already failing to meet government quotas. Thanks to warming oceans, the remaining whale populations are moving closer to the North Pole, depriving the Norwegians of an easy catch. What is caught is used for human consumption as well as for pet and livestock food, as it is in Iceland, where the same tourists who go on whale-watching trips are then encouraged to tuck in to cetacean steaks and whale burgers (along with the likes of puffin and other local delicacies).

Although we think of pre-1980s whaling as a voracious industry there have been periods of temporary bans dating back to at least the 1870s, admittedly driven by profit-led concern of declining stocks rather than animal welfare and environmentalism in general. It wasn't just the meat that was economically significant; it's easy to forget that before modern plastics were invented, baleen served a multitude of purposes while the bones and oil of cetaceans were also important materials.

But hasn't modern technology superseded the need for whale-based products? Thanks to a scientific research exemption, Japanese vessels in Antarctica and the North Pacific can work to catch quotas set by the Japanese government, independent of the International Whaling Commission. The relevant legislation also gives the Japanese Institute of Cetacean Research permission to sell whale meat for human consumption, even if it was obtained within the otherwise commercially off-limits Southern Ocean Whale Sanctuary. That's some loophole! So what research is being undertaken?

The various Japanese whaling programmes of the past thirty years have been conducted principally in the name of population management for Bryde's, fin, minke and sei whales. The role of these four species within their local ecosystem and the mapping of levels of toxic pollutants are among the research objectives. The overarching aim is simple: to evaluate whether the stocks are robust enough to allow the resumption of large-scale yet sustainable commercial whaling. In other words, Japan is killing a smaller number of whales to assess when it can start killing a greater number of whales!

Following examination of the Japanese whaling programmes, including the current JARPA II study, environmental groups such as the World Wildlife Fund, as well as the Australian Government, have declared Japan's scientific whaling not fit for purpose. The programmes have produced very few published research papers, especially when compared with the data released by other nations using non-lethal methods of assessment.

There is now an extremely wide range of non-fatal data collection techniques, such as biopsy sampling and GPS tagging. Small drones nicknamed 'snotbots' are being used to obtain samples from blowhole emissions, while even good old-fashioned sighting surveys, which rely on identification of individuals from diagnostics such as tail flukes, can be used for population statistics. Japanese scientists have repeatedly stated that they would stop whale hunting if other techniques proved as effective, yet the quality and quantity of the research they have published since the 1980s belie this claim.

After examining the results, even some Japanese researchers have admitted that killing whales has not proven to be an accurate way to gain data. Indeed, sessions in 2014 at the United Nations' International Court of Justice confirmed that, if anything, the Japanese whale quotas are far too small to provide definitive evidence for their objectives. To put it another way, Japan's Institute of Cetacean Research would have to kill far more whales to confirm whether the populations are healthy enough to bear the brunt of pre-1980s-scale commercial whaling! Anyone for a large dollop of irony?

Looking at the wider picture, does Japan really need increased volumes of cetacean flesh anyway? After the Second World War, food shortages led to whale meat becoming a primary protein source. Today, Japanese consumption has dropped to just one percent of what it was in the decade post-war. The domestic stockpile is no doubt becoming a burden, since whale meat is now even used in subsidised school lunches, despite the danger of heavy metal poisoning.

Due to the reduction in market size, Japan's scientific whaling programmes are no longer economically viable. So how is it that the long-term aim is to increase the catch to fully commercial levels - and who do they think will be eating it? Most countries abide by the International Whaling Commission legislation, so presumably it will be for the domestic market. Although approximately half the nation's population supports whale hunting, possibly due to its traditional roots (or as a reaction to perceived Western cultural imperialism?), most no longer eat whale meat. So why are the Japanese steadfast in pursuing research that generates poor science, is unprofitable and internationally divisive, and produces an unwanted surplus?

The answer is: no-one really knows, at least outside of the Institute of Cetacean Research; and they're not saying. If ever there was a case of running on automatic pilot, this seems to be it. The name of science is being misused in order to continue with the needless exploitation of marine resources in the Pacific and Southern oceans. Thousands of whales have been unnecessarily slaughtered (I realise that's an emotive word, but it's worth using under the circumstances) at a time when non-lethal techniques are proving their superior research value. Other countries are under pressure to preserve fish stocks and reduce by-catch - by comparison Japan's attitude appears anachronistic in the extreme. By allowing the loophole of scientific whaling, the International Whaling Commission has compromised both science and cetaceans for something of about as much value as fox hunting.

Wednesday, 13 June 2018

Debunking DNA: A new search for the Loch Ness monster

I was recently surprised to read that a New Zealand genomics scientist, Neil Gemmell of Otago University, is about to lead an international team in the search for the Loch Ness monster. Surely, I thought, that myth has long since been put to bed and is only something exploited for the purposes of tourism? I remember some years ago that a fleet of vessels using side-sweeping sonar had covered much of the loch without discovering anything conclusive. When combined with the fact that the most famous photograph is a known fake and the lack of evidence from the plethora of tourist cameras (never mind those of dedicated Nessie watchers) that have converged on the spot, the conclusion seems obvious.

I've put together a few points that don't bode well for the search, even assuming that Nessie is a 'living fossil' (à la coelacanth) rather than a supernatural creature; the usual explanation is a cold water-adapted descendant of a long-necked plesiosaur - last known to have lived in the Cretaceous Period:
  1. Loch Ness was formed by glacial action around 10,000 years ago, so where did Nessie come from? 
  2. Glacial action implies no underwater caves for hiding in
  3. How can a single creature maintain a long-term population (the earliest mentions date back thirteen hundred years)? 
  4. What does such a large creature eat without noticeably reducing the loch's fish population?
  5. Why have no remains ever been found, such as large bones, even on sonar?
All in all, I didn't think much of the expedition's chances and therefore I initially thought that the new research would be a distinct waste of money that could be much better used elsewhere in Scotland. After all, the Shetland seabird population is rapidly decreasing thanks to over-fishing, plastic pollution and loss of plankton due to increasing ocean temperatures. It would make more sense to protect the likes of puffins (which have suffered a 98% decline over the past 20 years), along with guillemots and kittiwakes amongst others.

However, I then read that separate from the headline-grabbing monster hunt, the expedition's underlying purpose concerns environmental DNA sampling, a type of test never before used at Loch Ness. Gemmell's team have proffered a range of scientifically valid reasons for their project:
  1. To survey the loch's ecosystem, from bacteria upwards 
  2. Demonstrate the scientific process to the public (presumably versus all the pseudoscientific nonsense surrounding cryptozoology)
  3. Test for trace DNA from potential but realistic causes of 'monster' sightings, such as large sturgeon or catfish 
  4. Understand local biodiversity with a view to conservation, especially as regards the effect caused by invasive species such as the Pacific pink salmon. 
Should the expedition find any trace of reptile DNA, this would of course prove the presence of something highly unusual in the loch. Gemmell has admitted he doubts they will find traces of any monster-sized creatures, plesiosaur or otherwise, noting that the largest unknown species likely to be found are bacteria. Doesn't it seem strange though that sometimes the best way to engage the public - and gain funding - for real science is to use what at best could be described as pseudoscience?

Imagine if NASA could only get funding for Earth observation missions by including the potential to prove whether our planet was flat or not? (Incidentally, you might think a flat Earth was just the territory of a few nutbars, but a poll conducted in February this year suggests that fully two percent of Americans are convinced the Earth is a disk, not spherical).

Back to reality. Despite the great work of scientists who write popular books and hold lectures on their area of expertise, it seems that the media - particularly Hollywood - are the primary source of science knowledge for the general public. Hollywood's version of de-extinction science, particularly for ancient species such as dinosaurs, seems to be far better known than the relatively unglamorous reality. Dr Beth Shapiro's book How to Clone a Mammoth, for example, is an excellent introduction to the subject, but would find it difficult to compete alongside the adventures of the Jurassic World/Park films.

The problem is that many if not most people want to believe in a world that is more exciting than their daily routine would suggest, with cryptozoology offering itself as an alternative to hard science thanks to its vast library of sightings over the centuries. Of course it's easy to scoff: one million tourists visit Loch Ness each year but consistently fail to find anything; surely in this case absence of evidence is enough to prove evidence of absence?

The Loch Ness monster is of course merely the tip of the mythological creature iceberg. The Wikipedia entry on cryptids lists over 170 species - can they all be just as suspect? The deep ocean is the best bet today for large creatures new to science. In a 2010 post I mentioned that the still largely unexplored depths could possibly contain unknown megafauna, such as a larger version of the oarfish that could prove to be the fabled sea serpent.

I've long had a fascination with large creatures, both real (dinosaurs, of course) and imaginary. When I was eight years old David Attenborough made a television series called Fabulous Animals and I had the tie-in book. In a similar fashion to the new Loch Ness research project, Attenborough used the programmes to bring natural history and evolutionary biology to a pre-teen audience via the lure of cryptozoology. For example, he discussed komodo dragons and giant squid, comparing extant megafauna to extinct species such as woolly mammoth and to mythical beasts, including the Loch Ness Monster.

A few years later, another television series that I avidly watched covered some of the same ground, namely Arthur C. Clarke's Mysterious World. No fewer than four episodes covered submarine cryptozoology, including the giant squid, sea serpents and of course Nessie him (or her) self. Unfortunately the quality of such programmes has plummeted since, although as the popularity of the (frankly ridiculous) seven-year running series Finding Bigfoot shows, the public have an inexhaustible appetite for this sort of stuff.

I've read that it is estimated only about ten percent of extinct species have been discovered in the fossil record, so there are no doubt some potential surprises out there (Homo floresiensis, anyone?) However, the evidence - or lack thereof - seems firmly stacked against the Loch Ness monster. What is unlikely though is that the latest expedition will dampen the spirits of the cryptid believers. A recent wolf-like corpse found in Montana, USA, may turn out to be a coyote-wolf hybrid, but this hasn't stopped the Bigfoot and werewolf fans from spreading X-Files style theories across the internet. I suppose it's mostly harmless fun, and if Professor Gemmell's team can spread some real science along the way, who am I to argue with that? Long live Nessie!

Wednesday, 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to one search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit to explore reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works is it little wonder that the plethora of online sources - of immensely variable quality - can cause confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model and so culture, gender, politics and even religion play their part as to what is funded and how the results are presented (or even which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expense associated with producing popular science books means that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy-to-find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers, largely on general news sites, and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results), but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes, the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So in the spirit of science (okay, so it's hardly comprehensive being just a single trial) I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it will only last another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that in these early days of the hypothesis, very little could be confirmed with any degree of certainty.
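The gulf between the two predictions is easy to reproduce with a back-of-envelope linear extrapolation. The gene counts and timescale below are illustrative assumptions of the kind used in such arguments, not figures taken from either Graves' paper or Sykes' book:

```python
# Back-of-envelope linear extrapolation of Y-chromosome gene loss.
# ILLUSTRATIVE ASSUMPTIONS: an ancestral proto-Y of ~1,400 genes,
# ~45 genes remaining today, decaying over ~300 million years.
# These numbers are not from Graves (2002) or Sykes (2003).

ancestral_genes = 1400
current_genes = 45
elapsed_my = 300                       # million years of decay so far

loss_rate = (ancestral_genes - current_genes) / elapsed_my   # genes per My
remaining_my = current_genes / loss_rate

print(f"Loss rate: {loss_rate:.1f} genes per million years")
print(f"Naive time until zero genes: {remaining_my:.1f} million years")

# The two published predictions nonetheless differ enormously:
graves_years = 10_000_000
sykes_years = 125_000
print(f"Graves vs Sykes timescales: {graves_years / sykes_years:.0f}-fold difference")
```

The point of the sketch is that a simple constant-rate model lands near the longer estimate; Sykes' far shorter figure rests on different assumptions about accelerating decay, which is exactly why the two predictions can differ eighty-fold from the same starting data.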

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases lead to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years, it has been stable for the past 25 million and so is no longer deteriorating. A third and far less common category concerns the informal polls taken of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either category to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find around cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with body text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach courts search traffic while conflating socio-political concerns with the science.

You might expect the results to show a trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far supporting only the rapid degeneration hypothesis. The reason is readily apparent: the publication of a Danish study that bolsters it. This new report is available online but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have to either rely upon second-hand summaries or, if there were enough time, wait for the next popular science book to discuss it in layman's terms.

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it examined the genes of only sixty-two Danish men, so I have no idea whether this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better sort of popular science books would answer these queries, such is my faith in the general superiority of print volumes!
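As a rough sanity check on the sample-size worry (a generic rule-of-thumb calculation, not the actual statistics used in the Danish study), the margin of error on any proportion estimated from sixty-two people is surprisingly wide:

```python
import math

# Rule-of-thumb margin of error (95% confidence, normal approximation)
# for a proportion estimated from a sample of n people. This is a
# generic illustration, not the Danish study's actual methodology.

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of the confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (62, 250, 1000):
    print(f"n = {n:4d}: margin of error roughly {margin_of_error(n):.1%}")
```

With n = 62 the margin is around twelve percentage points either way, which at least explains why a lay reader might hesitate before treating the study as the final word.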

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research was only applicable to other types of animals, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hopes of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly-resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?) why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Tuesday, 15 May 2018

Troublesome trawling: how New Zealand's fishing industry hid the truth about by-kill

I recently signed a petition to reduce by-kill in New Zealand waters by installing cameras on all commercial fishing vessels. Forest and Bird and World Wildlife Fund New Zealand are jointly campaigning for this monitoring, as only a small percentage of boats currently have cameras. The previous New Zealand government agreed to the wider introduction, but Fisheries Minister Stuart Nash is considering reversing this due to industry pressure. Considering that the current administration is a coalition involving the Green Party, this seems highly ironic. Is this yet another nail in the coffin of New Zealand's tourist brand as 100% Pure?

Despite requests from the fishing industry not to release it to the public, on-board footage shows the extent of the by-kill. High numbers of rare and endangered species have been drowned in nets, from seabirds such as wandering albatross and yellow-eyed penguins/hoiho, to cetaceans (there are thought to be only fifty or so Māui's dolphin/popoto left), seals and sea lions.

Many of the cameras already installed on New Zealand boats failed in the first three months due to inadequate waterproofing; when allied with the fact that the supplier of the technology was an integrated part of the seafood industry, there's more than a whiff of something fishy going on. Although official statistics are often considered to be of dubious quality, Occam's razor can be used to decipher the by-kill figures as they have been reported in the past decade. Only three percent of New Zealand's set net boats are officially monitored, yet they account for the vast majority of the recorded by-kill. Given a choice between sheer coincidence (i.e. only monitored vessels are catching large numbers of non-target species) and severe under-reporting from unmonitored boats, it is obvious that the latter hypothesis follows the law of parsimony.
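The parsimony argument can be made quantitative with a back-of-the-envelope sketch. The percentages below are illustrative assumptions of mine (only the three percent monitoring figure comes from the text above), but they show how lopsided the numbers are if by-kill were roughly uniform across the fleet:

```python
# Illustrative check of the under-reporting hypothesis. The 90% figure is an
# assumption standing in for "the vast majority of recorded by-kill";
# only the 3% monitored share is taken from the reported statistics.

monitored_share = 0.03        # fraction of set-net boats that are monitored
recorded_bykill_share = 0.90  # assumed fraction of *recorded* by-kill from those boats

# If every boat caught and reported by-kill at the same rate, monitored boats
# would account for only ~3% of the records, not ~90%.
expected_share = monitored_share

# Implied reporting rate of unmonitored boats relative to monitored ones:
# (unmonitored records per boat) divided by (monitored records per boat).
relative_reporting = ((1 - recorded_bykill_share) / (1 - monitored_share)) / (
    recorded_bykill_share / monitored_share
)

print(f"Share expected under honest uniform reporting: {expected_share:.0%}")
print(f"Unmonitored boats report at roughly {relative_reporting:.1%} "
      "of the monitored rate")
```

Under these assumptions, unmonitored boats would be reporting by-kill at well under one percent of the rate of their monitored counterparts, which is exactly the kind of gap that makes the coincidence hypothesis so implausible.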

Sadly, widespread deception by New Zealand's fishing industry isn't something new. A third-party report involving undercover operatives stated that between 1950 and 2010 up to 2.7 times the official tonnage of fish was actually being caught, peaking in 1990. All this comes from an industry that is laden with checks and measures, not to mention sustainability certificates. Killing marine mammals within the country's Exclusive Economic Zone isn't just a minor inconvenience: since 1978 it's been illegal, with severe fines and even prison sentences for those convicted. Small wonder then that the majority of by-kill has been undeclared.

What is equally sad is the lack of interest from the New Zealand public in resolving the problems. After all, over ninety percent of the population are not vegetarian, so we must assume the vast majority enjoy seafood in their diet. The rapid replacement of over-fished sharks with Humboldt squid in the Sea of Cortez off Mexico's Pacific coast shows how the removal of a key species can severely affect food webs. If New Zealanders are to continue to enjoy eating fish with their chips, the sea needs better protection.

Over the past decade, other nations have shown commitment to reducing by-kill and lessening waste. In 2010 the British celebrity chef Hugh Fearnley-Whittingstall launched the Fish Fight campaign to stop the discard of about half of the European Union's catch (due to it being either undersized or from non-quota species). Immense public support over the subsequent four years led to phased changes in the European Union's Common Fisheries Policy, proof that citizen action can make fundamental improvements.

Incidentally, it wasn't even a case of division along party lines; I was living in the UK at the time and wrote to my Labour Member of Parliament, who replied in a typically circumlocutory fashion that she would look into the matter! Even the then Conservative Prime Minister David Cameron agreed that EU policy needed a radical overhaul, a rare instance of cross-party sense and sensibility over pride and prejudice.

So what solutions are there to reducing by-kill? After all, installing cameras would only be the first step in assessing the scale of the problem, not removing it. Since Australia started monitoring its long-line tuna fleet, there has been a whopping seven hundred percent increase in the reporting of seabird and marine mammal by-kill. Some seaboard states in the USA have already banned set netting, which is still in widespread use in New Zealand. Several areas around the New Zealand coast such as between Kaipara Harbour and Mokau already prohibit this method of fishing - in this case to protect the few remaining Māui's dolphin - so there are precedents.

In addition, there are programmes currently testing new technology that may provide the answer. In 2002 the now charitable trust Southern Seabird Solutions was created to reduce by-kill of albatross and other endangered pelagic species. This alliance of fishing industry leaders, recreational fishers, researchers and government analysts is trialling wondrously-named devices such as the Brady Bird Baffler, Hook Pod, tori lines and warp scarers.

Elsewhere, nocturnal experiments have been conducted using acoustic pingers to deter dolphins, although the results to date aren't especially promising. Equally dubious is the amended trawl net design for squid fishing vessels that incorporates the Sea Lion Exclusion Device (SLED); only today, it was reported that a juvenile sea lion had been found dead in such a net. Clearly, STEM ingenuity is being brought into play, but it will require further development and widespread introduction of the best solutions without industry interference in order to minimise by-kill.

There are also some simple changes of practice that don't require equipment, only for the boat crews to be more aware of wildlife and act accordingly. Such procedures include moving away from areas with marine mammals present, not dumping offal, recovering lost gear, and changing the operating depth and retrieval speed of nets.

As usual, the financial considerations have taken precedence over the ecological ones. New Zealand has a comparatively small economy and as seafood is the nation's fifth largest export earner - over one billion dollars annually - it is hardly surprising that successive governments have tended to side with industry rather than environmentalists. However, could it be that there is now enough apprehension about the general state of the oceans to overhaul the sector's laissez faire practice? After all, in 2007 a fishing ban in New Zealand waters was placed on orange roughy, whose rapid decline caused huge concern.

There are of course plenty of other environmental issues affecting marine life: plastic pollution (including microbeads); increasing temperature and acidity, the latter especially drastic for shellfish; offshore algal blooms due to agricultural nutrient run-off; and numerous problems created by the oil and gas industry, from spillages to the far less reported exploratory air guns that impact cetacean behaviour.

The longer I've been writing this blog the more I'm convinced that science cannot be considered independent of the wider society in which it exists. Social, political and religious pressures and viewpoints can all adversely affect both what research is funded, what the time constraints are and how the results are presented or even skewed in favour of a particular outcome.

In the case outlined above, government ministries hid evidence to protect short-term industry profits at the expense of long-term environmental degradation - and of course the increase in public spending the latter will require for mitigation. New Zealand's precious dairy sector is already taking a pounding for the problems it has knowingly generated, so no doubt the fishing industry is keen to avoid a similar fate.

By allowing such sectors to regulate and police themselves and thus avoid public transparency, the entire nation suffers in the long run. We don't know what effect the decline or disappearance of populations of, for example, wandering albatross and Māui's dolphins might have on the (dis)appearance of snapper or blue cod at the dinner table, but as the alarming loss of Mediterranean and Californian anchovies and sardines suggests, negative cascades in the food chain can occur with extreme rapidity. Natural selection is a wonderful method of evolution but we are pushing it to the limit if we expect it to cope with the radical changes we are making to the environment at such a high speed. By-kill is something we can reduce, but only if industry and governments give science and the public a 'fair go'. Now isn't that something New Zealanders are supposed to be good at?

Saturday, 14 April 2018

Avian Einsteins: are some bird species as clever as primates?


One of the strangest examples of animal behaviour I've ever seen in real life took place in my neighbourhood last year, with what for all intents and purposes appeared to be a vigil or wake. Half a dozen Common myna birds (Acridotheres tristis) were gathered in a circle around a dead or dying member of their species, making the occasional muted noise. Unfortunately I was rushing to get to work and didn't stop to take a photograph, which was a shame as the birds ignored me even when I passed within a few metres of them.

Even though the street was a cul-de-sac, I couldn't help thinking that sitting in the road was not the safest place for the birds to congregate, considering that they could have stayed close by on the grass verge; instead their proximity to the central, non-moving individual seemed to override their concerns for personal safety. 

Some biologists have suggested that this behaviour, mostly known from corvids (that is, the crow family), is due to the birds' instinctive need to advertise the area to others as particularly dangerous. Although there are plenty of cats in my neighbourhood, this idea doesn't seem to make sense, at least in this particular instance. There were several trees that would have served as convenient perching locations for the myna birds, who weren't nearly as loud as they usually are.

Without getting too anthropomorphic about it, they were a lot less garrulous than normal, implying a sombre occasion. Far from providing warnings about the locality, the birds were extremely quiet for a gathering of this size; I should know, as myna birds are probably the third or fourth most common species in my garden and their routine screeches and squawks are far from subtle, to say the least. 

So is it possible that, despite one of their number no longer moving or making sounds, the other birds understood that this inanimate object was one of their kind? I've occasionally found dead birds of other species such as goldfinches, song thrushes and blackbirds in my garden and none have been the subject of similar behaviour. As an amateur scientist - or indeed as anyone with curiosity might do - I researched the subject and found that crows are well known for gathering around bodies of the same species, while magpies (another corvid) have even been reported as covering up dead fellows with twigs and the like. Are these reports all April Fool jokes or are some species of Aves unsung geniuses?

Further enquiry led me to discover that the corvid family, which includes ravens and jays, is the pinnacle of avian intelligence, closely followed by parrots. I initially thought that my observations of the myna bird, a member of the Sturnidae family, constituted something new, until I read a 2014 report from the School of Biological Sciences in Malaysia stating that laboratory testing proved them to be better at counting food items than House crows (Corvus splendens). In certain situations then, myna birds are up there with the brainiest of their kind.

In January this year I received another surprise on reading that the three most common avian raptors in northern Australia's tropical savannas - the Brown falcon (Falco berigora), the Black kite (Milvus migrans) and the Whistling kite (Haliastur sphenurus) - have been reported as deliberately spreading bush fires. It appears that after lightning has started a wild fire, the birds pick up burning twigs in their beaks or claws and drop them on untouched forest or grassland some tens of metres away. This then causes prey items such as lizards, snakes, rodents and amphibians to flee the new fire zone, only to be picked off by the waiting raptors.

Although birds of prey in North and South America, West Africa and Papua New Guinea are known to hunt on the edges of wild fires, the ingenuity of their Australian counterparts is without precedent. What's more, they appear to have been using this behaviour for thousands of years, since it is clearly recorded in local Aboriginal legends concerning 'fire hawks'; until now, however, white settlers ignored the stories due to their implausibility.

Intelligent corvids

A 2016 report by the U.S. National Academy of Sciences has uncovered biological evidence to support advanced avian intelligence. Although bird brains are a lot smaller than those of mammals, especially primates, this is obviously due to the overall diminutive size of the animals themselves. The brain mass to body mass ratio of some bird families is far larger than expected for animals of their size, approximating that of the most intelligent mammals.

Although bird brains have a somewhat different structure to mammalian brains, the corresponding higher-functioning regions are both comparatively large and have a neuron density double that of primate equivalents. Therefore it appears corvids and some other birds have undergone parallel evolution that has maximised their cognition.
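The brain-to-body comparison is easy to illustrate. The masses below are rough order-of-magnitude figures I've supplied for the sake of the arithmetic, not numbers from the report itself:

```python
# Rough brain-to-body mass ratios. The masses are approximate illustrative
# values, not data from the 2016 study.
animals = {
    "raven": {"brain_g": 15,    "body_g": 1_200},
    "human": {"brain_g": 1_350, "body_g": 65_000},
    "horse": {"brain_g": 600,   "body_g": 500_000},
}

for name, m in animals.items():
    ratio = m["brain_g"] / m["body_g"]
    print(f"{name:>5}: brain is {ratio:.2%} of body mass")
```

On these figures a raven's brain is over one percent of its body mass, in the same league as a human's roughly two percent, whereas a large mammal like a horse sits nearer a tenth of a percent; absolute brain size alone clearly tells us little.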

Birds are good at far more than just adapting to new conditions and environments, with the most social (as opposed to solitary) species leading the way in problem solving and abstract thinking. Here are a few more examples that prove their cognition can go far beyond basic instinct:
  1. Self-recognition: Eurasian magpies (Pica pica) pass the mirror test, meaning they can recognise their reflection as themselves rather than as another member of their species. 
  2. Tool usage: various birds use twigs and cactus spines to extract insects, much as chimpanzees insert sticks into termite mounds.  
  3. Deception: Woodhouse's scrub jays (Aphelocoma woodhouseii) have been observed moving food caches to deceive onlookers and keep the food to themselves.
  4. Planning: Crows have used multi-step planning in tests to retrieve progressively longer sticks with which to reach food. This ability isn't new either, since the 1st century Roman polymath Pliny the Elder observed corvids undertaking similar behaviour to that described in Aesop's fable The Crow and the Pitcher.
  5. Exploiting artificial environments: The kea (Nestor notabilis), New Zealand's alpine parrot, has learnt to unzip rucksack pockets to obtain food. Despite zips being unlike anything found in nature, some bird species can evidently make sense of man-made objects.
When I was a child, the term 'bird brain' was employed for derogatory purposes while 'talking' caged birds such as cockatiels, budgies and parakeets were thought of as just mimics without any understanding of what they were saying. This continuation of the Western tradition that humanity is the pinnacle of creation, far superior to all other lifeforms, is now under serious attack. Our prejudices have caused us to ignore the evidence right under our noses, but as per my post on animals that farm, we humans have very few unique traits left. Avian intelligence is undoubtedly different from ours, but perhaps less so than that of dolphins, whose watery environment means they are unlikely to ever be tool makers.

Another issue is that, compared to say mammals, there is a far smaller variety of bird forms; even such specialised species as penguins, ostriches and kiwi don't stray far from the generic Aves design, meaning we tend to associate bird intelligence with the most ubiquitous - and comparatively slow-witted - urban species such as house sparrows and feral pigeons.

Beginning in the 1970s, researchers have explored the sometimes controversial notion that if the dinosaurs hadn't died out at the end of the Cretaceous, a small- to medium-sized carnivore such as Troodon would have eventually evolved into a reptile with human-level intelligence. Crows and their kind may not have a primate-sized brain, but these dinosaur descendants are evidently far superior to the dim stereotype we usually assign to them. They may be small, but clearly in this case, size doesn't seem to matter: our feathered friends are capable of far greater mental activity than their songs, squawks and screeches imply.

Sunday, 1 April 2018

Engagement with Oumuamua: is our first interstellar visitor an alien spacecraft?

It's often said that fact follows fiction but there are times when some such instances appear to be uncanny beyond belief.  One relatively well-known example comes from the American writer Morgan Robertson, whose 1898 novella The Wreck of the Titan (originally entitled Futility) eerily prefigured the 1912 loss of the Titanic. The resemblances between the fictional precursor and the infamous passenger liner are remarkable, including the month of the sinking, the impact location, and similarities of size, speed and passenger capacity. I was first introduced to this series of quirky coincidences via Arthur C. Clarke's 1990 novel The Ghost from the Grand Banks, which not incidentally is about attempts to raise the Titanic. The reason for including the latter reference is that there may have just been an occurrence that involves another of Clarke's own works.

Clarke's 1973 award-winning novel Rendezvous with Rama tells of a 22nd century expedition to a giant interstellar object that is approaching the inner solar system. The fifty-four kilometre long cylinder, dubbed Rama, is discovered by an Earthbound asteroid detection system called Project Spaceguard, a name which since the 1990s has been adopted by real life surveys aiming to provide early warning for Earth-crossing asteroids. Rama is revealed to be a dormant alien spacecraft, whose trajectory confirms its origin outside of our solar system. After a journey of hundreds of thousands of years, Rama appears to be on a collision course with the Sun, only for it to scoop up solar material as a fuel source before heading back into interstellar space (sorry for the spoiler, but if you haven't yet read it, why not?)

In October last year astronomer Robert Weryk at the Haleakala Observatory in Hawaii found an unusual object forty days after its closest encounter with the Sun. Initially catalogued as C/2017 U1, the object was at first thought to be a comet, but when no sign of a tail or coma appeared it was reclassified as an asteroid (A/2017 U1). After another week's examination it was put into a brand new class all by itself - interstellar object, hence the final designation 1I/2017 U1 - and this is when observers began to get excited, as its trajectory appeared to proclaim an origin beyond our solar system.

As it was not spotted until about thirty-three million kilometres from the Earth, the object was far too small to be photographed in any detail; all that appears to telescope-mounted digital cameras is a single pixel. Therefore its shape was inferred from the light curve, which implied a longest-to-shortest axis ratio of 5:1 or even larger, with the longest dimension being between two hundred and four hundred metres. As this data became public, requests were made for a more familiar name than just 1I/2017 U1; perhaps unsurprisingly, Rama became a leading contender. However, the Hawaiian observatory's Pan-STARRS team finally opted for the common name Oumuamua, which in the local language means 'scout'.
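The inference from light curve to shape rests on a simple relation: for a featureless tumbling ellipsoid seen roughly side-on, brightness scales with the projected cross-section, so the peak-to-trough magnitude range gives a lower bound on the elongation. A minimal sketch follows; the 2.5-magnitude variation is the commonly quoted figure for Oumuamua, and since viewing geometry and albedo variations complicate matters, published estimates settle on the more conservative 5:1:

```python
def axis_ratio_from_magnitude_range(delta_m: float) -> float:
    """Lower bound on the long-to-short axis ratio of a rotating ellipsoid,
    assuming brightness is proportional to projected cross-sectional area.
    The astronomical magnitude scale is 2.5 magnitudes per factor of 10."""
    return 10 ** (0.4 * delta_m)

# Oumuamua's brightness varied by roughly 2.5 magnitudes per rotation.
delta_m = 2.5
print(f"Implied elongation of at least "
      f"{axis_ratio_from_magnitude_range(delta_m):.0f}:1")
# Geometry and surface effects reduce the firm estimate to about 5:1.
```

Even the conservative end of that range makes Oumuamua far more elongated than any asteroid or comet previously catalogued.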

Various hypotheses have been raised as to exactly what type of object Oumuamua is, from a planetary fragment to a Kuiper belt object similar to - although far smaller than - Pluto. However, the lack of off-gassing even at perihelion (closest approach to the Sun) implies that any icy material must lie below a thick crust, and the light curve suggests a denser material such as metal-rich rock. This sounds most unlike any known Kuiper belt object.

These unusual properties attracted the attention of senior figures in the search for extra-terrestrial intelligence. Project Breakthrough Listen, whose leadership includes SETI luminaries Frank Drake, Ann Druyan and Astronomer Royal Martin Rees, directed the world's largest manoeuvrable radio telescope towards Oumuamua. It failed to find any radio emissions, although the lack of a signal is tempered with the knowledge that SETI astronomers are now considering lasers as a potentially superior form of interstellar communication to radio.

The more that Oumuamua has been studied, the more surprising it appears. Travelling at over eighty kilometres per second relative to the Sun, its path shows that it has not originated from any of the twenty neighbouring solar systems. Yet it homed in on our star, getting seventeen percent nearer to the Sun than Mercury does at its closest. This seems to be almost impossible to have occurred simply by chance - space is just too vast for an interstellar object to have achieved such proximity. So how likely is it that Oumuamua is a real-life Rama? Let's consider the facts:
  1. Trajectory. The area of a solar system with potentially habitable planets is nicknamed the 'Goldilocks zone', which for our system includes the Earth. It's such a small percentage of the system, extremely close to the parent star, that for a fast-moving interstellar object to approach at random seems almost impossible. Instead, Oumuamua's trajectory was perfectly placed to obtain a gravity assist from the Sun, allowing it to both gain speed and change course, with it now heading in the direction of the constellation Pegasus.
  2. Motion. Dr Jason Wright, an associate professor of astronomy and astrophysics at Penn State University, likened the apparent tumbling motion to that of a derelict spacecraft, only to retract his ideas when criticised for sensationalism.
  3. Shape. All known asteroids and Kuiper belt objects are much less elongated than Oumuamua, even though most are far too small to settle into a spherical shape due to gravitational attraction (the minimum diameter for this being around six hundred kilometres for predominantly rocky objects). The exact appearance is unknown, with the ubiquitous crater-covered asteroid artwork being merely an artist's impression. Astronautical experts have agreed that Oumuamua's shape is eminently suitable for minimising damage from particle impacts.
  4. Composition. One definitive piece of data is that Oumuamua doesn't emit clouds of gas or dust that are usually associated with objects of a similar size. In addition, according to a report by the American Astronomical Society, it has an 'implausibly high density'. Somehow, it has survived a relatively close encounter with the Sun while remaining in one piece - at a maximum velocity of almost eighty-eight kilometres per second relative to our star!
  5. Colour. There appears to be a red region on the surface, rather than the uniform colour expected for an object that has been bombarded with radiation on all sides whilst in deep space for an extremely long period.
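The 'seventeen percent nearer than Mercury' claim is easy to sanity-check against the published perihelion distance. A quick sketch, taking Mercury's perihelion as about 46 million kilometres:

```python
AU_KM = 149.6e6                 # kilometres per astronomical unit
mercury_perihelion_km = 46.0e6  # Mercury's closest approach to the Sun

# Seventeen percent nearer to the Sun than Mercury at its closest:
oumuamua_perihelion_km = mercury_perihelion_km * (1 - 0.17)
print(f"~{oumuamua_perihelion_km / 1e6:.1f} million km "
      f"= {oumuamua_perihelion_km / AU_KM:.2f} AU")
```

That works out to roughly 38 million kilometres, or about a quarter of an astronomical unit, which matches the perihelion distance reported for the object.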
So where does this leave us? There is an enormous amount of nonsense written about alien encounters, conspiracy theories and the like, with various governments and the military seeking to hide their strategies in deliberate misinformation. For example, last year the hacker collective Anonymous stated that NASA would soon be releasing confirmation of contact with extraterrestrials; to date, in case you were wondering, there's been no such announcement. Besides which, wouldn't it be more likely to come from a SETI research organisation such as the Planetary Society or Project Breakthrough Listen?

Is there any evidence to imply a cover-up regarding Oumuamua? Here are some suggestions:
  1. The name Rama - already familiar to many from Arthur C. Clarke's novel and therefore evocative of an artificial object - was abandoned for a far less expressive and more obscure common name. Was this an attempt to distance Oumuamua from anything out of the ordinary?
  2. Dr Wright's proposals were luridly overstated in the tabloid media, forcing him to abandon further investigation. Was this a deliberate attempt by the authorities to make light of his ideas, so as to prevent too much analysis while the object was still observable?
  3. Limited attempts at listening for radio signals have been made, even though laser signalling is now thought to be a far superior method. So why have these efforts been so half-hearted for such a unique object?
  4. The only images available in the media are a few very samey artist's impressions of an elongated asteroid, some pock-marked with craters, others, especially animations, with striations (the latter reminding me more of fossilised wood). Not only are these pure speculation but none feature the red area reported from the light curve data. It's almost as if the intention was to show a totally standard asteroid, albeit of unusual proportions. But this appearance is complete guesswork: Oumuamua has been shoe-horned into a conventional natural object, despite its idiosyncrasies.
Thanks to Hollywood, most people's idea of aliens is as implacable invaders. If - and when - the public receive confirmation of intelligent alien life, will there be widespread panic and disorder? After all, Orson Welles' 1938 radio version of H.G. Wells' The War of the Worlds led some listeners to flee their homes, believing a Martian invasion had begun. Would people today be any different? The current following of dangerous fads such as paleo diets and raw water, never mind the paranoid conspiracy theories that fill the World Wide Web, leads me to expect little change from our credulous forebears.

The issue of course, comes down to one of security. Again, science fiction movies tend to overshadow real life space exploration, but the fact is that we have no spacecraft capable of matching orbits with the likes of Oumuamua. In Arthur C. Clarke's Rendezvous with Rama, colonists on 22nd century Mercury become paranoid with the giant spacecraft's approach and attempt to destroy it with a nuclear missile (oops, another spoiler there). There is no 21st century technology that could match this feat, so if Oumuamua did turn out to be an alien craft, we would have to hope for the best. Therefore if, for example, the U.S. Government gained some data that even implied the possibility of artifice about Oumuamua, wouldn't it be in their best interest to keep it quiet, at least until it is long gone?

In which case, promoting disinformation and encouraging wild speculation in the media would be the perfect way to disguise the truth. Far from being an advanced - if dead or dormant - starship, our leaders would rather we believed it to be a simple rocky asteroid, despite the evidence to the contrary. Less one entry for the Captain's log, and more a case of 'to boulderly go' - geddit?

Sunday, 18 March 2018

Smart phone, dumb people: is technology really reducing our intelligence?

IQ testing is one of those areas that always seems to polarise opinion, with many considering it useful for children as long as it is understood to be related to specific areas of intelligence rather than a person's entire intellectual capabilities. However, many organisations, including some employers, use IQ tests as a primary filter, so unfortunately it cannot be ignored as either irrelevant or outdated. Just as much of the education system is still geared towards passing exams, IQ tests are seen as a valid method to sort potential candidates. They may not be completely valid, but are used as a short-cut tool that serves a limited purpose.

James Flynn of the University of Otago in New Zealand has undertaken long-term research into intelligence, so much so that the 'Flynn Effect' is the name given to the worldwide increase in intelligence since IQ tests were developed over a century ago. The reasons behind this increase are not fully understood, but are probably due to the complex interaction of numerous environmental factors such as enriched audio-visual stimulation, better - and more interactive - education methods, even good artificial lighting for longer hours of reading and writing. It is interesting that as developing nations rapidly gain these improvements to society and infrastructure, their average IQ shows a correspondingly rapid increase when compared to the already developed West and its more staid advancement.

Research suggests that while young children's IQ continues to increase in developed nations, albeit at a reduced rate, the intelligence of teenagers in these countries has been in slow decline over the past thirty years. What is more, the higher the income decile, the larger the decrease. This hints that the causes are more predominant in middle-class lifestyles; basically, family wealth equates to loss of IQ! Data for the UK and Scandinavian countries indicates that a key factor may be the development of consumer electronics, starting with VCRs, games consoles and home computers and now complemented by smart phones, tablets and social media. This would align with the statistics, since the drop is highest among children likely to have greatest access to the devices. So could it be true that our digital distractions are dumbing us down?

1) Time

By spending more time on electronic devices, children live in a narrower world, where audio-visual stimulation aims for maximum enjoyment with minimal effort, the information and imagery flying by at dizzying speed. This isn't just the AV presentation of course: digital content itself closely aligns to pop cultural cornerstones, being glamorous, gimmicky, transient and expendable. As such, maintaining the infinitesimally small gradations of social status and friendship that exist amongst children and teenagers requires enormous effort on their part: a constant online presence, both pro-actively and reactively responding to their peers' (and role models') endless inanities.

The amount of effort it would take to filter this is mind-boggling and presumably takes away a lot of time that could be much better spent on other activities. This doesn't have to be something as constructive as reading or traditional studying: going outdoors has been shown to have all sorts of positive effects, as described in Richard Louv's 2005 best-seller Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder.

Studies around the world have shown that there are all sorts of positive effects, including on mood, by mere immersion in nature, not just strenuous physical activity. Whether humans have an innate need for observing the intricate fractal patterns of vegetation (grass lawns and playing fields have been found to be ineffective) or whether it's noticing the seemingly unorganised behaviour of non-human life forms, the Japanese government have promoted Shinrin-yoku or 'forest air bathing' as a counterbalance to the stresses of urbanised existence. It sounds a bit New Age, but there is enough research to back up the idea that time spent in the natural environment can profoundly affect us.

Meanwhile, other nations appear to have given in, as if admitting that their citizens have turned into digitally-preoccupied zombies. Last year, the Dutch town of Bodegraven decided to reduce accidents involving mobile-distracted pedestrians by installing red and green LED strips at a busy road junction, so that phone users could tell if it was safe to cross without having to look up!

2) Speed

One obvious change in the past four decades has been in the increased pace of life in developed nations. As we have communication and information retrieval tools that are relatively instantaneous, so employers expect their workforce to respond in tune with the speed of these machines. This act-now approach hardly encourages in-depth cogitation but relies upon seat-of-the-pants thinking, which no doubt requires a regular input of caffeine and adrenaline. The emphasis on rapid turnaround, when coupled with lack of patience, has led to an extremely heavy reliance on the first page of online search results: being smart at sifting through other people's data is fast becoming a replacement for original thought, as lazy students have discovered and no doubt as many school teachers and university lecturers could testify.

Having a convenient source of information means that it is easier for anyone to find a solution to almost anything than to work something out for themselves. This can lead to a decline in initiative, something which separates thought leaders from everyone else. There is a joy in figuring things out, which after all is a key motivation for many STEM professionals. Some scientists and engineers have explained that being able to understand the inner workings of common objects was a key component of their childhood, leading to an obvious career choice. For example, New Zealand-based scientist and science communicator Michelle Dickinson (a.k.a. Nanogirl) spent her childhood dismantling and repairing such disparate household items as home computers and toasters, echoing Ellie Arroway, the heroine of Carl Sagan's novel Contact, who as a child repaired a defective valve radio before going on to become a radio astronomer.

Of course, these days it would be far more difficult to repair contemporary versions of these items, since they are often built so that they cannot even be opened outside a machine shop. Laptops and tablets are prime examples, and I've known cases where the likes of Microsoft simply replace rather than repair a screen-damaged device. When I had a desktop computer I frequently installed video and memory cards myself, aided by the how-to videos that are ubiquitous on YouTube. The latest generation of technology doesn't allow for such do-it-yourself upgrades, to the manufacturer's advantage and the consumer's detriment. As an aside, it's worrying that so many core skills, such as basic repairs or map navigation, are being lost; in the event of a massive power and/or network outage due to the likes of a solar flare, there could be a lot of people stuck in headless chicken mode. Squawk!

3) Quality

While the World Wide Web covers every subject imaginable (albeit with immensely variable quality), that once fairly reliable source of information, television, has largely downgraded the sometimes staid but usually authoritative documentaries of yesteryear into music promo-style pieces of infotainment. Frequently unnecessary computer graphics, overly dramatic reconstructions and voiceovers are interwoven between minuscule sound bites from the experts, with the amount of actual information conveyed reduced to a bare minimum.

In many cases, the likes of the Discovery Channel are even disguising pure fiction as fact, meaning that children - and frequently adults - are hard-pressed to differentiate nonsense from reality. This blurring of demarcation does little to encourage critical or even sustained thinking; knowledge in the media and online has been reduced to a consumer-led circus with an emphasis on marketing and hype. Arguably, radio provides the last media format where the majority of content maintains a semblance of sustained, informative discussion on STEM issues.

4) Quantity

The brave new world of technology that surrounds us is primarily geared towards consumerism; after all, even social media is fundamentally a tool for targeted marketing. If there's one thing that manufacturers do not want it is inquisitive customers, since the buzzwords and hype often hide a lack of quality underneath. Unfortunately, the ubiquity of social media and online news in general means that ridiculous ideas rapidly become must-have fads.

Even such commodities as food and drink have become mired in trendy products such as charcoal-infused juice, unpasteurised milk and now raw water, attracting the same sort of uncritical punters who think that nutrition gurus know what human diets really consisted of in the Palaeolithic. The fact that some of Silicon Valley's smartest have failed to consider the numerous dangers of raw water shows that, again, analytical thinking is taking a back seat to whatever is the latest 'awesome' and 'cool' lifestyle choice.

Perhaps, then, certain types of thinking are becoming difficult to inculcate and sustain in our mentally vulnerable teenagers due to the constant demands of consumerism and its oh-so-seductive delivery channels. Whether today's youth will fall into the viewing habits of older generations, such as the myriad 'food porn' shows, remains to be seen; with so much on offer, is it any wonder people spend entire weekends binge-watching series, oblivious to the wider world?

The desire to fit into a peer group and not be left behind by lack of knowledge of some trivia or other, for example the latest series on Netflix, means that so much time is wasted on activities that require only a limited number of thought processes. Even a good memory isn't required anymore, with electronic calendars and calculators among the simplest of the tools available to replace brain power. Besides which, the transience of popular culture means there's little need to remember most of what happened last week!

Ultimately, western nations are falling prey to the insular decadence well known from history as great civilisations pass their prime. Technology and the pace of contemporary life dictated by it must certainly play a part in any decline in IQ, although the human brain being what it is - after all, the most complex object in the known universe - I wouldn't dare guess how much is due to them.

There are probably other causes that are so familiar as to be practically invisible. Take, for instance, background noise, both visual and aural, which permeates man-made environments. My commute yesterday offered a typical example of the latter sort: from schoolchildren on my train playing music on their phones loud enough to be heard some metres away, to the two building sites I walked past, plus a main road packed with vehicles up to the size of construction trucks. As a final bonus, I passed ten shops and cafes all playing loud if inane pop music that could be heard on the street through open doors. Gone are the days of tedious elevator muzak: even fairly expensive restaurants play material so fast and loud it barely merits the term 'background music'. If such sensory pollution is everywhere, when do we get to enjoy quality cogitation time?

If you think that consumerism isn't as all-encompassing as I claim, then consider that the USA spends more per year on pet grooming than it does on nuclear fusion research. I mean, do you honestly need a knee-high wall-mounted video phone to keep in touch with your dog or cat while you're at work? Talking of which, did you know that in 2015 the Exploding Kittens card game raised almost US$9 million on the Kickstarter crowdfunding platform in less than a month? Let's be frank: we've got some work to do if we are to save subsequent generations from declining into trivia-obsessed sheeple. Baa!