Sunday, 10 March 2019

Buzzing away: are insects on the verge of global extinction?

It's odd how some of these posts get initiated. For this particular one, there were two driving factors. One was passing a new house on my way to work where apart from the concrete driveway, the front garden consisted solely of a large square of artificial grass; the owners are clearly not nature lovers! The second inspiration was listening to a BBC Radio comedy quiz show, in which the panel discussed the recent report on global insect decline without being able to explain why this is important, apart from a vague mention of pollination.

Insect biologists have long sung the praises of these unrewarded miniature heroes, from JBS Haldane's supposed adage about God being "inordinately fond of stars and beetles" to EO Wilson's 1987 speech that described them as "the little things that run the world." In terms of numbers of species and individuals, invertebrates, especially insects, are the great success story of macroscopic life on our planet. So if they are in serious decline, does that spell trouble for Homo sapiens?

The new research claims that one-third of all insect species are currently endangered, extrapolating to wholesale extinction for the class Insecta over the next century. Although the popular press has started using evocative phrases such as "insect genocide" and even "insectageddon", just how accurate are these dramatic claims?

The IUCN Red List currently describes three hundred insect species as critically endangered and a further seven hundred as vulnerable, but this is a tiny proportion of the total of...well, a lot more, at any rate. One oft-quoted figure is around one million insect species, although entomologists have estimated anywhere from 750,000 up to 30 million, with many species still lacking formal scientific identification. The hyperbole could therefore easily sound like unnecessary scaremongering, until you consider the details.
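As a quick sanity check on the scale mismatch (my own back-of-envelope arithmetic, not a figure from the report), here is what the thousand or so formally assessed species amount to against each of the estimated totals:

```python
# Red List counts quoted above: ~300 critically endangered + ~700 vulnerable
flagged = 300 + 700

# Entomologists' estimates of the total number of insect species
for total in (750_000, 1_000_000, 30_000_000):
    share = 100 * flagged / total
    print(f"{total:>10,} species: {share:.4f}% formally listed as threatened")
```

Even against the lowest estimate, formal assessments cover little more than a tenth of one percent of insect species, which is why the new report's one-third figure has to come from extrapolation rather than from the Red List itself.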

The new report states that butterflies and caddis flies are suffering the greatest decline, while cockroaches - which, as anyone who has faced a household infestation will know, are likely to remain around until the end of the world - and flies are the least affected orders. So, to paraphrase Monty Python, what have the insects ever done for us?

Pollination is of course of key importance, both to horticulture and to un-managed 'wild' environments. Insects are near the base of many food webs; if their numbers were much reduced, never mind removed entirely, the impact on the rest of the ecosystem would be catastrophic. With the human population set to top ten billion in thirty years' time, we require ever larger regions of productive land for agriculture. They may be small at an individual level, but arthropods in total weigh about seventeen times as much as all of us H. sapiens combined. Insects also replenish the soil, as alongside bacteria they break down dead matter and faecal material. So important is this latter function that New Zealand has been trialling non-native dung beetles to aid cattle farmers.
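That seventeen-fold figure is consistent with the widely-quoted 2018 global biomass census; the gigatonnes-of-carbon values below are those published estimates (quoted from memory, so treat them as indicative):

```python
# Global biomass estimates in gigatonnes of carbon (Gt C)
arthropods_gt_c = 1.0   # all arthropods, terrestrial and marine
humans_gt_c = 0.06      # the entire human population

ratio = arthropods_gt_c / humans_gt_c
print(f"Arthropods outweigh H. sapiens by a factor of ~{ratio:.0f}")
```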

One key way to save fresh water and lessen the generation of the potent greenhouse gas methane is to reduce meat consumption in favour of insect protein. If insects are no longer around, that will be an additional challenge in reducing environmental degradation. This of course also ignores the fact that insects are already a component of people's diets in many developing nations. Last year I wrote about how scientists have been creating advanced materials derived from animals. Again, we are shooting ourselves in the foot if we allow this ready-made molecular library to be destroyed.

What is responsible for this global decline? Perhaps unsurprisingly, it turns out to be the usual suspects. Agricultural chemicals including pesticides have been associated with honey-bee colony collapse disorder (incidentally, some tests have found honey samples in which neonicotinoids - the most widely-used class of insecticides - exceed recommended safe levels for human consumption), so it seems likely the same culprits are affecting other insects. Fresh waterways, home to many aquatic insect species, are frequently as polluted as the soil, whether from agricultural run-off or industrial contaminants. Wild landscapes are being converted with great haste into farmland and urban sprawl, with an obviously much-reduced biota.

Climate change is playing its part, with soil acidity increasing just as it is in the oceans. Even areas as remote as central Australia have seen marked decreases in insects, as higher temperatures and lower rainfall outpace the insects' ability to adapt to the new conditions. I've often mentioned the role of invasive species in the decimation of indigenous vertebrates, but insects are equally prone to suffer from the arrival of newcomers. Although New Zealand has very strict biosecurity protocols, the likes of Queensland fruit flies and brown marmorated stink bugs are still occasionally found in or around ports of entry.

Many nations have no such procedures in place, resulting in local species being out-competed or killed by introduced species or pathogens to which they have no resistance. Until fairly recently, even New Zealand had a lax attitude to the issue, resulting in the decline of native species such as carabid beetles. When I conducted a brief survey of my garden in 2017 I found that one-third of the insect species were non-native, most of these being accidental imports since the arrival of European settlers.

If insects are so vital to our survival, why has there been so little interest in their well-being? There are some fairly obvious suggestions here. Firstly, at least in Western cultures, insects have been deemed dirty, ugly things that can be killed without a second thought. Wasps, ants and cockroaches in particular are seen as unwelcome pests, with typical insect-related phrases including "creepy crawlies" and "don't let the bed bugs bite".

It's fairly well-known that malaria-carrying mosquitoes are the most dangerous animals for us humans in terms of fatalities. The widespread outbreaks of the Zika virus haven't done them any favours either. As Brian Cox's television series Wonders of Life showed, their small size has given them veritable super powers compared to us lumbering mammals, from climbing up sheer surfaces (as a praying mantis was doing a few nights ago on my window) to having amazing strength-to-weight ratios. All in all, insects are a bit too alien for their own good!

Clearly, scale prejudice is also a key factor. On a recent trip to Auckland Central Library I found only one book on insects versus dozens on birds. Photographic technology has been a double-edged sword when it comes to giving us a clearer picture of insects: close-ups are often greeted with revulsion, yet until Sir David Attenborough's 2005 BBC series Life in the Undergrowth, there was little attempt to film their behaviour with the same level of detail as, say, the lions and antelopes of the Serengeti. It should also be mentioned that when Rachel Carson's ground-breaking book about the dangers of pesticides, Silent Spring, was published in 1962, the resulting environmentalism was largely in support of birds rather than insects.

Among all this doom and gloom, is there any way to halt the decline? One thing is certain: it won't be easy. The agricultural sector would have to make drastic changes for a start, becoming much smarter in its use of chemicals and being held responsible for the local environment, including waterways. Vertical farming and other novel techniques could reduce the need for new agricultural land and water usage, but developing nations would be hard-pressed to fund these themselves.

Before any major undertaking, there's going to have to be either a fundamental crisis, such as food shortages, in a rich nation or a massive public relations exercise to convince people to consider insects in the same light as giant pandas or dolphins. This is not going to be easy, but as David Attenborough put it: "These small creatures are within a few inches of our feet, wherever we go on land - but often, they're disregarded. We would do very well to remember them."

Sunday, 24 February 2019

Core solidification and the Cambrian explosion: did one beget the other?

Let's face it, we all find it easier to live our lives with the help of patterns. Whether it's a daily routine or consultation of an astrology column (insert expletive of choice here) - or even us amateur astronomers guiding our telescopes via the constellations - our continued existence relies on patterns. After all, if we didn't innately recognise our mother's face or differentiate harmless creatures from the shape of a predator, we wouldn't last long. So it shouldn't be any surprise that scientists also rely on patterns to investigate the complexities of creation.

Richard Feynman once said that a scientific hypothesis starts with a guess, which should perhaps be taken with a pinch of salt. But nonetheless scientists like to use patterns when considering explanations for phenomena; at a first glance, this technique matches the principle of parsimony, or Occam's Razor, i.e. the simplest explanation is usually the correct one - excluding quantum mechanics, of course!

An example in which a potential pattern was widely publicised prior to confirmation via hard data was that of periodic mass extinction, the idea being that a single type of cause might be behind the five greatest extinction events. Four years after Luis Alvarez's team suggested that an asteroid impact 66 million years ago - later tied to the Chicxulub crater - caused the Cretaceous-Paleogene extinction, paleontologists David Raup and Jack Sepkoski published a 1984 paper hypothesising extinctions at regular intervals due to extraterrestrial impacts.

This necessitated the existence of an object that could cause a periodic gravitational perturbation, diverting asteroids and comets into the inner solar system. The new hypothesis was that we live in a binary star system, with a dwarf companion star in a highly elliptical, 26 million-year orbit. This companion would be responsible for the perturbation when at perihelion (i.e. closest approach to the sun).

What's interesting is that despite the lack of evidence, the hypothesis was widely publicised in popular science media, with the death-dealing star being appropriately named Nemesis after the Greek goddess of retribution. After all, the diversification of mammals was a direct result of the K-T extinction and so of no small importance to our species.

Unfortunately, further research has shown that mass extinctions don't fall into a neat 26 million-year cycle. In addition, orbiting and ground-based telescopes now have the ability to detect Nemesis and yet have failed to do so. It appears that the hypothesis has reached a dead end; our local corner of the universe probably just isn't as tidy as we would like it to be.

Now another hypothesis has emerged that might seem to belong in a similar category of neat pattern matching taking precedence over solid evidence. Bearing in mind the importance of the subject under scrutiny - the origin of complex life - are researchers jumping the gun in order to gain kudos if proven correct? A report on 565 million year-old minerals from Quebec, Canada, suggests that at that time the Earth's magnetic field was less than ten percent of its present-day strength. This is considerably lower than earlier estimates of around forty percent. Also, the magnetic poles appear to have reversed far more frequently during this period than they have since.

As this is directly related to the composition of the Earth's core, it has led to speculation that the inner core was then in the final stage of solidification. This would have caused increased movement in the liquid, iron-rich outer core, and thus the rapid generation of a much stronger magnetic field. In turn, the greater the field's dipole intensity, the fewer high-energy particles - both cosmic rays and those from our own sun - reach the Earth's surface. What is particularly interesting about this time is that it is just (i.e. about twenty million years) prior to the so-called Cambrian explosion, following three billion years or so of only microbial life. So were these geophysical changes responsible for a paradigm shift in evolution? To answer that, we would need to confirm the accuracy of this apparently neat match.
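To see why dipole strength matters for surface radiation, here's an illustrative Python sketch (not from the report itself): in the classic Störmer approximation, the minimum rigidity a charged particle needs to reach a given latitude scales linearly with the dipole moment. The present-day equatorial cutoff of roughly 15 GV is an assumed round figure.

```python
import math

PRESENT_CUTOFF_GV = 15.0  # assumed modern equatorial vertical cutoff rigidity

def stormer_cutoff_gv(dipole_fraction: float, latitude_deg: float = 0.0) -> float:
    """Vertical cutoff rigidity: linear in dipole moment, falling off
    as cos^4 of geomagnetic latitude (Stormer approximation)."""
    return (PRESENT_CUTOFF_GV * dipole_fraction
            * math.cos(math.radians(latitude_deg)) ** 4)

# Today's field, the older 40% estimate, and the new sub-10% Quebec result
for fraction in (1.0, 0.4, 0.1):
    print(f"dipole at {fraction:4.0%}: equatorial cutoff ~ "
          f"{stormer_cutoff_gv(fraction):4.1f} GV")
```

A field at a tenth of today's strength therefore admits particles of a tenth of the rigidity, so the surface (and shallow seas) would have received a substantially higher flux of ionising radiation.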

It's well known that some forms of bacteria can survive in much higher radiation environments than larger-scale life forms like ourselves; extremophiles such as Deinococcus radiodurans have even been found thriving inside nuclear reactors. Therefore it would seem obvious that more complex organisms couldn't evolve until the magnetic field was fairly strong. But until circa 430 million years ago there was no life on land (there is now evidence that fungi may have been the first organisms to survive in this harsh environment). If all life was still in the sea, wouldn't the deep ocean have provided the necessary radiation protection for early plants and animals?

By 600 million years ago the atmospheric oxygen content was still only about ten percent of today's value; clearly, those conditions would have been of little use to any air-breathing animal we know to have existed. In addition, the Ediacaran assemblage, albeit somewhat different from most subsequent higher animals, arose no later than this time - with chemical evidence suggesting their development stretched back a further 100 million years. Therefore the Canadian magnetic mineral evidence seems to be too late for core solidification and the resulting stronger magnetic field to have kick-started a more sophisticated biota.

In addition, we shouldn't forget that it is the ozone layer that acts as an ultraviolet shield; UVB is just as dangerous to many organisms, including near-surface marine life, as cosmic rays and high-energy solar particles. High-altitude ozone is thought to have reached current density by 600 million years ago, with blue-green algae as its primary source. O2 levels also increased at this time, perhaps driven by climate change at the end of a global glaciation.

Although the "Snowball Earth" hypothesis - that at least half of all ocean water was frozen solid during three or four periods of glaciation - is still controversial, there is something of a correlation in time between the geophysical evidence and the emergence of the Ediacaran fauna. As to the cause of this glacial period, it is thought to have been a concatenation of circumstances, with emergent plate tectonics as a primary factor.

How to conclude? Well, we would all like to find neat, obvious solutions, especially to key questions about our own origin. Unfortunately, the hypothesis based on the magnetic mineral evidence appears to selectively ignore the evolution of the Ediacaran life forms and the development of the ozone layer. The correlation between the end of "Snowball Earth" and the Ediacaran biota evolution is on slightly firmer ground, but the period is so long ago that even dating deposits cannot be accurate except to the nearest million years or so.

It's certainly a fascinating topic, so let's hope that one day the evidence will be solid enough for us to finally understand how and when life took on the complexity we take for granted. Meanwhile, I would take any speculation based on new evidence with a Feynman-esque pinch of salt; the universe frequently fails to match the nice, neat parcels of explanation we would like it to. Isn't that one of the factors that makes science so interesting in the first place?

Monday, 11 February 2019

The Square Kilometre Array: is it the wrong big science for New Zealand?

I've previously written about the problems besetting some mega-budget science projects and the notion that perhaps they should lose precedence to smaller programmes with quicker returns to both science and society. Of course there are advantages to long-term international STEM collaboration, including social, economic and political benefits, but there is a good case for claiming that projects are sometimes initiated without a full appreciation of the details.

Take, for example, the Square Kilometre Array or SKA, the largest science project New Zealand has ever been involved with. Headquartered at the UK's Jodrell Bank Observatory (incidentally, I've been there a few times and it's well worth a visit if you're in the vicinity), twelve key nations are collaborating to construct two main arrays, one in Australia and the other in South Africa and some of its neighbours. The combined arrays will have a sensitivity fifty times greater than previous radio telescopes, allowing them to survey the sky far faster than has been done before and to look back in time much earlier than current instruments can.

But such paradigm-shifting specifications come with a very high price tag – and the funding sources are yet to be finalised. The €1.8 billion project is scheduled to start Phase 1 construction in 2024 and aims to begin observations four years later. Research will include a wide range of fundamental astrophysical questions, from exploring the very early universe only 300,000 years after the Big Bang to testing general relativity, gaining information on dark energy and even some SETI research.

The New Zealand contribution is organised via the Australia-New Zealand SKA Coordination Committee (ANZSCC) and is geared towards data processing and storage. The Central Signal Processor and Science Data Processor are fundamental components of the project, since the radio telescopes are expected to generate more data than the world currently stores. As well as fostering closer collaboration between the scientists and engineers of various nations, one of the aims of SKA is to become a source of public science education, something I have repeatedly pointed out is in desperate need of improvement.

So if this all seems so promising, why has the New Zealand Government announced that it may pull back from committing the outstanding NZ$23 million (equal to less than 10% of Australia's funding)? To date, the country has paid less than NZ$3 million. In 2015 I discussed the danger of the country falling behind in cutting-edge STEM research, and Rocket Lab aside (which is, after all, an American-owned company despite its kiwi founder) the situation hasn't really changed. So why did Research, Science and Innovation Minister Megan Woods declare this potential about-turn, which may well relegate New Zealand to associate membership status?

The initial answer appears to be one of pure economics. Although the project is generating development of world-class computer technology, a report has questioned the long-term benefits of investing such comparatively large sums of public money. India is already an associate member, while Germany has been considering a similar downgrade for some years and Canada may follow suit. The project is already a decade behind schedule, and New Zealand had hoped to be an array-hosting nation but lost out due to a lower bid from South Africa. SKA is run by a not-for-profit organisation of the same name, so presumably any financial rewards are of a secondary nature (perhaps along the lines of patents or new technologies that can be repurposed elsewhere).

Interestingly, New Zealand's science community has been divided on the issue. While Auckland University of Technology and Victoria University of Wellington have objected to the downgrade, the University of Auckland's head of physics, Richard Easther, has supported the Ministry of Business, Innovation and Employment (MBIE) decision, saying that far from providing financial and long-term science benefits (in both applied computing and astrophysical data), SKA is a white elephant, hinting that it might well be obsolete by the time it starts gathering data.

Another University of Auckland astrophysicist, Dr Nick Rattenbury, argues that the nation's public funding infrastructure is currently too primitive for it to become involved in such international mega-budget STEM projects. I simply don't know enough detail to judge whether such adages as "you need to speculate in order to accumulate" apply here; it's clearly a well-thought-out programme, unlike, say, the politically-motivated yet vague and probably unworkable Predator Free 2050 scheme.

If SKA was committed to solving an immediate practical problem in the fields of say, environmental degradation, food and water production, or medicine, I would probably have no hesitation in supporting it whole-heartedly, regardless of the cost to the public purse. But the universe has been around almost fourteen billion years, so I for one don't mind if it holds onto a few of its fundamental secrets for a little while longer.

Saturday, 26 January 2019

Concrete: a material of construction & destruction - and how to fix it

How often is it that we fail to consider what is under our noses? One of the most ubiquitous of man-made artifices - at least to the 55% of us who live in urban environments - is concrete. Our high-rise cities and power stations, farmyard siloes and hydroelectric dams wouldn't exist without it. As it is, global concrete consumption has quadrupled over the past quarter century, making it second only to water in terms of humanity's most-consumed substance. Unfortunately, it is also one of the most environmentally-unfriendly materials on the planet.

Apart from what you might consider to be the aesthetic crimes of the bland, cookie-cutter approach to International Modernist architecture, there is a far greater issue due to the environmental degradation caused by the concrete manufacturing process. Cement is a key component of the material, but generates around 8% of all carbon dioxide emissions worldwide. As such, there needs to be a 20% reduction over the next ten years in order to fulfil the Paris Agreement - yet it is thought there may be a 25% increase in demand for concrete during this time span, particularly from the developing world. Although lower-carbon cements are being developed, concrete production causes other environmental issues as well. In particular, sand and gravel extraction is bad for the local ecology, including catastrophic damage to the sea bed.
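It's worth making explicit how tough that squeeze is (my arithmetic, not a quoted figure): if total emissions must fall to 80% of today's while output rises to 125%, the emissions per tonne of cement have to drop by more than a third.

```python
# Paris-aligned squeeze on cement: total emissions down 20%,
# concrete demand up 25%, both over roughly a decade.
total_emissions_target = 0.80   # relative to today
demand_growth = 1.25            # relative to today

# Required emissions intensity (CO2 per tonne of cement), relative to today
intensity = total_emissions_target / demand_growth
print(f"Each tonne of cement must emit {100 * (1 - intensity):.0f}% less CO2")
```

So lower-carbon cements need to deliver cuts of around 36% per tonne within a decade, not just marginal improvements.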

So are there any alternatives? Since the 1990s, television series such as Grand Designs have presented British, New Zealand and Australian-based projects for (at times) extremely sustainable houses made from materials such as shipping containers, driftwood, straw bales and even shredded newspaper. However, these are mostly the unique dream builds of entrepreneurs, visionaries and, let's face it, latter-day hippies. The techniques used might be suitable for domestic architecture, but they are impractical at a larger scale.

The US firm bioMASON studied coral in order to develop an alternative to conventional bricks, which generate large amounts of greenhouse gases during the firing process. They use a biomineralisation process, which basically consists of injecting microbes into nutrient-rich water containing sand and watching the rod-shaped bacteria grow into bricks over three to five days. It's still comparatively early days for the technology, so meanwhile, what about applying the three environmental 'Rs' of Reduce, Reuse and Recycle to conventional concrete design and manufacturing?

1 Reduce

3D printers are starting to be used in the construction industry to fabricate building and structural components, even small footbridges. Concrete extrusion requires less material than conventional casting into timber moulds - not to mention removing the need for the timber itself. One common technique is to build up shapes such as walls from thin, stacked layers. The technology is time-effective too: walls can be built up at a rate of several metres per hour, which may induce companies to make the initial outlay for the printing machinery.

As an example of the low cost, a 35 square metre demonstration house was built in Austin, Texas, last year at a cost of US$10,000 - and it only took 2 days to build. This year may see an entire housing project built in the Netherlands using 3D-printed concrete. Another technique has been pioneered at Exeter University in the UK, using graphene as an additive to reduce the amount of concrete required. This greatly increases both the water resistance and strength compared to the conventional material, thus halving the material requirement.

2 Reuse

Less than a third of the material from conventionally-built brick and timber structures can be reused after demolition. The post-war construction industry has continually reduced the quality of the building material it uses, especially in the residential sector; think of pre-fabricated roof trusses, made of new growth, comparatively unseasoned timber and held together by perforated connector plates. The intended lifespan of such structures could be as little as sixty years, with some integrated components such as roofing failing much sooner.

Compare this to Roman structures such as aqueducts and the Pantheon (the latter still being the world's largest unreinforced concrete dome) which are sound after two millennia, thanks to their volcanic ash-rich material and sophisticated engineering. Surely it makes sense to use concrete to construct long-lasting structures, rather than buildings that will not last as long as their architects? If the reuse of contemporary construction materials is minimal (about as far removed as you can get from the traditional approach of robbing out stone-based structures in their entirety) then longevity is the most logical alternative.

3 Recycle

It is becoming possible to both recycle other waste into concrete-based building materials and use concrete itself as a secure storage for greenhouse gases. A Canadian company called CarbonCure has developed a technique for permanently sequestering carbon dioxide in their concrete by converting it into a mineral during the manufacturing process, with the added benefits of increasing the strength of the material while reducing the amount of cement required.

As for recycling waste material as an ingredient, companies around the world have been developing light-weight concrete incorporating mixed plastic waste, the latter comprising anywhere from 10% to 60% of the volume, particularly with the addition of high density polyethylene.

For example, New Zealand company Enviroplaz can use unsorted, unwashed plastic packaging to produce Plazrok, a polymer aggregate for creating a concrete which is up to 40% lighter than standard material. In addition, the same company has an alternative to metal and fibreglass panels in the form of Plaztuff, a fully recyclable, non-corroding material which is one-seventh the weight of steel. It has even been used to build boats as well as land-based items such as skips and playground furniture.
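To put those weight claims in concrete terms, here's a rough calculation; the densities for standard concrete and steel are generic textbook values, not figures from Enviroplaz:

```python
STANDARD_CONCRETE = 2400  # kg/m^3, typical cast concrete
STEEL = 7850              # kg/m^3, typical structural steel

plazrok_concrete = STANDARD_CONCRETE * (1 - 0.40)  # "up to 40% lighter"
plaztuff_panel = STEEL / 7                         # "one-seventh the weight of steel"

print(f"Plazrok-aggregate concrete: ~{plazrok_concrete:.0f} kg/m^3")
print(f"Plaztuff panels:            ~{plaztuff_panel:.0f} kg/m^3")
```

That puts Plaztuff at roughly 1,100 kg/m^3 - well under aluminium's ~2,700 kg/m^3 - which helps explain the boat-building application.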

What might seem an intractable problem therefore has a variety of overlapping solutions that allow sustainable development in the building and civil engineering sector. It is somewhat unfortunate, then, that the conservative nature of these industries has until recently stalled progress in replacing a massive pollutant with much more environmentally sound alternatives. Clearly, green architecture doesn't have to be the sole prerogative of the driftwood dreamers; young entrepreneurs around the world are seizing the opportunity to create alternatives to the destructive effects of construction.

Friday, 11 January 2019

Hot, cold or in between: thermoregulation and public misunderstanding of science

I recently spotted an intriguing paleontology article concerning the 180 million year-old fossil remains of an ichthyosaur, a marine reptile from the Early Jurassic. The beastie, belonging to the genus Stenopterygius, is so well preserved that it shows coloration patterns (if not the colours themselves) on patches of scaleless skin, as well as a thick layer of insulating fat or blubber. What makes the latter so intriguing is that reptiles just aren't meant to have blubber. Then again, like some snakes and skinks today, ichthyosaurs gave birth to live young. Thus the gap between reptiles and mammals surely grows ever smaller?

This conundrum touches on some interesting issues about the public's knowledge of science. Several times I've commented on what Richard Dawkins calls the "tyranny of the discontinuous mind", which is the way in which we use categorisation to make it easier to understand the world. It might seem that this is the very essence of some aspects of science, as in New Zealand physicist Ernest Rutherford's famously ungenerous quote that "Physics is the only real science. The rest are just stamp collecting." Indeed, examination of the life and work of many early botanists, for example, might appear to verify this statement. However, there needs to be an understanding that science requires a flexibility of mindset, a fundamental scientific process being the discarding of a pet theory in favour of a more accurate one.

I'm sure I've remarked countless times - again, echoing Professor Dawkins - that science is in this respect the antithesis of most religions, which set key ideas into stone and refuse to accept any challenges towards them. In the case of the blubber-filled Stenopterygius, it is still a reptile, albeit one that had many of the attributes of mammals. As for the latter, from our pre-school picture books onwards we tend to think of the main mammalian subclass, the placentals, but there are two smaller subclasses: the marsupials, such as the kangaroo; and the monotremes, for example the duck-billed platypus. It has been known since the 1880s that the platypus lays eggs rather than giving birth to live young, a characteristic it shares with the other four monotreme species alive today. In addition, their body temperature is five degrees Celsius lower than that of placental mammals, part of a suite of features presumably retained from their mammal-like reptile ancestors.

Even so, these traits do not justify the comment made by host Stephen Fry in a 2005 episode of the BBC TV quiz show QI, when he claimed that marsupials are not mammals! Richard Dawkins has frequently pointed out that it would be unacceptable to have a similar level of ignorance about the arts as there is on scientific matters, with this being a clear case in point as regards the cultured and erudite Mr Fry. Yet somehow, much of the general public has either a lack of, or confusion about, basic scientific knowledge. Indeed, only last week I listened to a BBC Radio topical comedy show in which none of the panel members could work out why one face of the moon is always hidden from our view. Imagine the response had the gap been in arts and literature - if, say, an Oxbridge science graduate had claimed that Jane Austen wrote Hamlet!

Coming back to the ichthyosaur, one thing we may have learnt as a child is that some animals are warm-blooded and others cold-blooded. This may be useful as a starting point, but it is an overly-simplistic and largely outmoded evaluation of the relevant biology; such binary categorisation is of little use after primary school age. In fact, there is a series of steps from endothermic homeotherms (encompassing most mammals and birds) to ectothermic poikilotherms (most species of fish, reptiles, amphibians and invertebrates), with the former metabolic feature having evidently developed from the latter.

Ichthyosaurs are likely to have had one of these intermediate metabolisms, as may have been the case for some dinosaurs, possibly the smaller, feathered, carnivorous theropods. Likewise, some tuna and shark species are known to be able to produce heat internally, but in 2015 researchers at the US National Marine Fisheries Service announced that the opah was the first fish found to be a whole-body endotherm. Clearly, the boundaries between us supposedly higher mammals and everything else are far less secure than we had previously believed.

At times, scientific terminology can appear too abstruse, too removed from the everyday and of little practical use outside of a pub quiz - but then, does being able to critique Shakespeare or Charles Dickens help to reduce climate change or cure cancer? Of course we should strive to be fully rounded individuals, but for too long STEM has been side-lined, or stereotyped as too difficult or irrelevant compared with the humanities.

A lack of understanding of the subtleties and gradations in science (as opposed to clearly defined boundaries) makes it easy for anti-science critics to generate public support. Ironically, this criticism tends to take one of two diametrically opposed forms: either that science is mostly useless, as epitomised by the Ig Nobel Prizes; or that it leads to dangerous inventions, as per the tabloid scaremongering around genetically modified organisms (GMOs), or 'Frankenfoods' as they are caricatured.

Being able to discern nuanced arguments, such as the current understanding of animal thermoregulation, is a useful tool for all of us. Whether it is giving the public a chance to vote in scientifically related referendums or simply arming them against quack medicine, STEM journalism needs to move beyond the lazy complacency that has allowed phrases such as 'warm-blooded', 'living fossil', 'ice age' and 'zero gravity' to be repeatedly misused. Only then will science be seen as useful, relevant and, above all, far more approachable than it is currently deemed to be.

Friday, 21 December 2018

The Twelve (Scientific) Days Of Christmas

As Christmas approaches and we get over-saturated in seasonal pop songs and the occasional carol, I thought it would be appropriate to look at a science-themed variation to this venerable lyric. So without further ado, here are the twelve days of Christmas, STEM-style.

12 Phanerozoic periods

Although there is evidence that life on Earth evolved pretty much as soon as conditions were in any way suitable, microbes had the planet to themselves for well over three billion years. Larger, complex organisms may have gained a kick-start thanks to a period of global glaciation - the controversial Snowball Earth hypothesis. Although we often hear of exoplanets being found in the Goldilocks zone, it may also take an awful lot of luck to produce a life-bearing environment. The twelve geological periods of the Phanerozoic (literally, 'visible life') cover the past 541 million years or so and include practically every species most of us have ever heard of. Hard to believe that anyone who knows this could ever consider our species to be the purpose of creation!

11 essential elements in humans

We often hear the phrase 'carbon-based life forms', but we humans actually contain over three times as much oxygen as carbon. In order of abundance by mass, the eleven vital elements are oxygen, carbon, hydrogen, nitrogen, calcium, phosphorus, potassium, sulfur, sodium, chlorine and magnesium. Iron, which you might expect to be present in larger quantities, is just a trace mineral; adults have a mere 3 or 4 grams. By comparison, we have about 25 grams of magnesium. In fact, iron and the other trace elements together amount to less than one percent of our total body mass. Somehow, 'oxygen-based bipeds' just doesn't have the same ring to it.
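A few lines of arithmetic make the point concrete. The percentages below are the commonly quoted textbook approximations for elemental composition by mass, not precise measurements, so treat this as a rough sketch:

```python
# Approximate elemental composition of the human body by mass (percent).
# These are commonly quoted textbook figures, not exact values.
major_elements = {
    "oxygen": 65.0, "carbon": 18.5, "hydrogen": 9.5, "nitrogen": 3.2,
    "calcium": 1.5, "phosphorus": 1.0, "potassium": 0.4, "sulfur": 0.3,
    "sodium": 0.2, "chlorine": 0.2, "magnesium": 0.1,
}

# Oxygen outweighs carbon by more than three to one...
ratio = major_elements["oxygen"] / major_elements["carbon"]
print(f"oxygen-to-carbon ratio: {ratio:.1f}")

# ...and everything not on the list (iron and the other trace
# elements) amounts to less than one percent of body mass.
trace_share = 100.0 - sum(major_elements.values())
print(f"trace elements: {trace_share:.1f}% of body mass")
```

Run it and the oxygen-to-carbon ratio comes out at roughly 3.5, with the trace elements squeezed into the remaining fraction of a percent.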

10 fingers and toes

The evolution of life via natural selection and genetic mutation consists of innumerable, one-off events. This is science as history, although comparative studies of fossils, DNA and anatomy are required instead of written texts and archaeology. It used to be thought that ten digits were canonical, tracing back to the earliest terrestrial vertebrates that evolved from lobe-finned fish. Then careful analysis of the earliest stegocephalians of the late Devonian period, such as Acanthostega, showed that their limbs terminated in six, seven or even eight digits. The five-digit limb seems to have evolved only once, in the subsequent Carboniferous period, yet of course we take it - and base-ten counting - as the most obvious of things. Just imagine what you could play on a piano if you had sixteen fingers!
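Base ten only feels inevitable because of our anatomy; an eight-digited descendant of Acanthostega might well have counted in octal. A minimal sketch (my own illustration, not from the post) showing the same quantity rendered for differently fingered counters:

```python
def to_base(n: int, base: int) -> str:
    """Render a non-negative integer in the given base (2-16)."""
    digits = "0123456789abcdef"
    if n == 0:
        return "0"
    out = []
    while n:
        n, remainder = divmod(n, base)
        out.append(digits[remainder])
    return "".join(reversed(out))

# One hundred items, counted three different ways:
print(to_base(100, 10))  # "100" for our ten digits
print(to_base(100, 8))   # "144" for an eight-digited counter
print(to_base(100, 16))  # "64" for the sixteen-fingered pianist
```

The quantity never changes; only the notation does - which is the sense in which base ten is an accident of evolutionary history rather than something fundamental.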

9 climate regions

From the poles to the equator, Earth can be broadly divided into the following climate regions: polar and tundra; boreal forest; temperate forest; Mediterranean; desert; dry grassland; tropical grassland; and tropical rainforest. Mountains are the odd region out, appearing at any latitude where the geophysical conditions suit their formation. Natural selection leads to the evolution of species suited to the local variations in daylight hours, weather and temperature, but the labels can be deceptive; the Antarctic, for example, contains a vast polar desert. We are only just beginning to understand the complex feedback systems between each region and its biota, at a time when species are becoming extinct almost faster than they can be catalogued. We upset the relative equilibrium at our peril.

8 major planets in our solar system

When I was a child, all astronomy books described nine known planets, along with dozens of moons and numerous asteroids. Today we know of almost four thousand planets in other solar systems, some of a similar size to Earth (and even some of these in the Goldilocks zone). However, since 2006 our own solar system has been reduced to eight planets, with Pluto demoted to the status of a dwarf planet. Technically, this is because it fails one of the three criteria for a major planet: it has not swept its orbital neighbourhood clear of other bodies (indeed, it periodically crosses Neptune's orbit). And since there is at least one Kuiper belt object, Eris, almost as large as Pluto, it makes sense to stick to a definition that won't see the number of planets continually rise with each new generation of space telescope. The downgrading appears to have upset a lot of people, so it's probably a good idea to mention that science is as much a series of methodologies as it is a body of knowledge, with the latter open to change when required - it's certainly not set-in-stone dogma! As the astronomer Neil deGrasse Tyson, author of the best-selling The Pluto Files: The Rise and Fall of America's Favorite Planet, put it: "Just get over it!"

7 colours of the rainbow

This is one of those everyday things that most of us never think about. Frankly, I don't know anyone who can distinguish indigo from violet in a rainbow, and yet we owe this colour breakdown not to an artist but to one of the greatest physicists ever, Sir Isaac Newton. As well as fulfilling most of the criteria of the modern-day scientist, Newton was also an alchemist, numerologist, eschatologist (one of his calculations suggested the world would not end before 2060) and all-round occultist. Following the mystical beliefs of the Pythagoreans, Newton linked the colours of the spectrum to the notes of the Western musical scale, hence indistinguishable indigo making up the number seven. This is a good example of how even the best of scientists are only human.

6 mass extinction events

Episode two of the remake of Carl Sagan's Cosmos television series, presented by Neil deGrasse Tyson, was called 'Some of the Things That Molecules Do'. It explored the five mass extinction events that have taken place over the past 450 million years, and also discussed what has come to be known as the Holocene extinction: the current, sixth period of mass dying. Although the loss of megafauna species around the world has been blamed on the arrival of Homo sapiens over the past 50,000 years, the rapid acceleration of species loss over the last ten millennia is shocking in the extreme. It is estimated that the current extinction rate is anywhere from a thousand to ten thousand times the background rate, resulting in the loss of up to two hundred plant or animal species every day. Considering that two-thirds of our pharmaceuticals are derived from or based on biological sources, we really are shooting ourselves in the foot. And that's without considering the advanced materials we could develop from nature.
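The 'two hundred species a day' claim can be sanity-checked with back-of-the-envelope arithmetic. Both inputs below are illustrative assumptions of my own - estimates of the total species count and of the background extinction rate vary widely - but they show the figure is at least internally plausible:

```python
# Back-of-the-envelope check on the extinction figures.
# Both inputs are illustrative assumptions, not settled values.
total_species = 8_700_000   # one commonly cited estimate of all species
background_rate = 1e-6      # ~1 extinction per million species per year

background_per_year = total_species * background_rate  # ~8.7 per year

# At ten thousand times the background rate:
elevated_per_day = background_per_year * 10_000 / 365
print(f"~{elevated_per_day:.0f} extinctions per day")
```

With these assumptions the upper end of the estimate works out at a little over two hundred extinctions per day, which is where the headline figure comes from.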

5 fundamental forces

Also known as interactions, in order from strongest to weakest, the four established forces are: the strong nuclear force; electromagnetism; the weak nuclear force; and gravity. One of the most surprising finds of late twentieth-century cosmology was that as the universe expands, it is being pushed apart at an ever-greater speed. The culprit has been named dark energy, but that is where our knowledge of this possible fifth force ends. Although it appears to account for about 68% of the total energy of the known universe, the label 'dark' refers to the complete lack of understanding as to how it is generated. Perhaps the most radical suggestion is that Einstein's General Theory of Relativity is incorrect, and that an overhaul of the mechanism behind gravity would remove the need for dark energy altogether. One thing is certain: we still have a lot to learn about the large-scale fabric of the universe.

4 DNA bases

Despite being one of the best-selling popular science books ever, Bill Bryson's A Short History of Nearly Everything manages to include a few howlers, including listing thiamine (AKA vitamin B1) as one of the four bases, instead of thymine. In addition to revealing how the bases (adenine, cytosine, guanine and thymine) pair up between the two sugar-phosphate backbones of the double helix, the 1953 discovery of DNA's structure also uncovered the replication mechanism, in turn leading to the development of the powerful gene-editing tools in use today. The discovery itself shows how creativity can be used in science: Watson and Crick's model-building technique proved a faster way of generating results than the more methodical X-ray crystallography of Rosalind Franklin and Maurice Wilkins - although it should be noted that one of Franklin's images gave her rivals a clue to the correct structure. It also shows that collaboration is often a vital component of scientific research, as opposed to the legend of the lonely genius.

3 branches of science

When most people think of science, they tend to focus on the stereotypical white-coated boffin beavering away in a laboratory filled with complex equipment. However, there are numerous branches or disciplines, covering the purely theoretical, the application of scientific theory, and everything in between. Broadly speaking, science can be divided into the formal sciences, the natural sciences and the social sciences, each covering a variety of categories in its own right. The formal sciences include mathematics and logic, and have an aspect of absolutism about them (2+2=4). The natural or 'hard' sciences are what we learn in school science classes and broadly divide into physics, chemistry and biology; these use observation and experiment to develop working theories, with maths often a fundamental component. The social or 'soft' sciences speak for themselves, with sub-disciplines such as anthropology sometimes crossing over into humanities such as archaeology. So when someone tells you that all science is impossibly difficult, you know they obviously haven't considered just what constitutes science!

2 types of fundamental particles

Named after Enrico Fermi and Satyendra Nath Bose respectively, fermions and bosons are the fundamental building blocks of the universe. The former, for example quarks and electrons, are the particles of matter and obey the Pauli exclusion principle, meaning no two identical fermions can occupy the same quantum state. The latter are the force carriers, with photons being the best-known example. One problem with these particles and their properties, such as angular momentum or spin, is that most analogies are only vaguely appropriate; after all, we aren't used to an object that has to rotate 720 degrees in order to get back to its original state! In addition, many aspects of the underlying reality are far from being understood. String theory was once mooted as the great hope for unifying all the fermions and bosons, but has yet to achieve definitive success, while the 2012 discovery of the Higgs boson is only one potential step in the search for a Grand Unified Theory.
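The 720-degree oddity can be made concrete. For a spin-half fermion, rotating by an angle theta about the z-axis multiplies its two spin amplitudes by exp(-i*theta/2) and exp(+i*theta/2); the sketch below (my own illustration, using the standard textbook formula) shows the amplitude flipping sign after one full turn:

```python
import cmath

def rotate_spin_half(state, theta):
    """Rotate a spin-1/2 state (a pair of complex amplitudes)
    by angle theta (radians) about the z-axis."""
    up, down = state
    return (cmath.exp(-1j * theta / 2) * up,
            cmath.exp(+1j * theta / 2) * down)

spin_up = (1 + 0j, 0j)

# A full 360-degree turn does NOT restore the original state:
# the amplitude picks up a factor of -1.
after_360 = rotate_spin_half(spin_up, 2 * cmath.pi)
print(after_360[0])   # approximately -1

# Only after 720 degrees is the state truly back where it started.
after_720 = rotate_spin_half(spin_up, 4 * cmath.pi)
print(after_720[0])   # approximately +1
```

For a single isolated particle that overall minus sign is unobservable, but it produces measurable effects in interference experiments - which is why physicists take the 720-degree property seriously rather than dismissing it as a mathematical quirk.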

1 planet Earth

There is a decorative plate on my dining room wall that says "Other planets cannot be as beautiful as this one." Despite the various Earth-sized exoplanets that have been found in the Goldilocks zones of their solar systems, we have little chance in the near future of finding out whether they are inhabited, as opposed to merely habitable. Although the seasonal methane on Mars hints at microbial life there, any human colonisation will be a physically and psychologically demanding ordeal. The idea that we can use Mars as a lifeboat to safeguard our species - never mind our biosphere - is little more than a pipedream. Yet we continue to exploit our home world with little consideration for the detrimental effects we are having on it. As the environmental movement says: there is no Planet B. Apart from the banning of plastic bags in some supermarkets, little appears to have been done since my 2010 post on reduce, reuse and recycle. So why not make a New Year's resolution to help future generations? Wouldn't that be the best present for your children and your planetary home?

Wednesday, 12 December 2018

New neurons: astrocytes, gene therapy and the public fear of brain modification

Ever since the first cyberpunk novels of the early 1980s - and the massive increase in public awareness of the genre thanks to Hollywood - the idea of artificially enhanced humans has been a topic of intense discussion. Whether via direct augmentation of the brain or the development of a brain-computer interface (BCI), the notion of Homo superior has been associated with a dystopian near future that owes much to Aldous Huxley's Brave New World. After reading about current research into repairing damaged areas of the brain and spinal cord, I thought it would be worth examining this darkly-tinged area.

Back in 2009 I posted about how science fiction has to some extent been confused with science fact, which, coupled with the fairly appalling quality of much mainstream media coverage of science stories, has led to public fear where none is necessary and a lack of concern where there should be heaps. When it comes to anything suggestive of enhancing the mind, many people immediately fall back on pessimistic fictional examples, from Frankenstein to the Borg of Star Trek. This use of anti-science material in the consideration of real-world STEM is not an optimal response, to say the least.

Rather than working to augment normal humans, real research projects on the brain are usually funded on the basis that they will generate improved medical techniques for individuals with brain or spinal cord injuries. However, a combination of the fictional tropes mentioned above and the plethora of internet-disseminated conspiracy theories, usually concerning alleged secret military projects, have caused the public to concentrate on entirely the wrong aspects.

The most recent material I have read concerning cutting-edge work on the brain covers three teams' use of astrocytes to repair damaged areas. This is an alternative to converting induced pluripotent stem cells (iPSCs) into nerve cells, an approach that has shown promise for many other cell types. Astrocytes are amazing things, each able to connect with several million synapses. Apparently Einstein's brain had far more of them than usual in the region associated with mathematical thinking, although the big question is whether this accumulation was due to nature or nurture - the latter being the intense exercise Einstein demanded of that region of his brain.

Astrocyte research for brain and spinal cord repair has been ongoing since the 1990s, in order to discover if they can be reprogrammed as functional replacements for lost neurons without any side effects. To this end, mice have been deliberately brain-damaged and then attempts made to repair that damage via converted astrocytes. The intention is to study if stroke victims could be cured via this method, although there are hopes that eventually it may also be a solution for Parkinson's disease, Alzheimer's and even ALS (motor neurone disease). The conversion from astrocyte to neuron is courtesy of a virus that introduces the relevant DNA, although none of the research has as yet proven that the converted cells are fully functional neurons.

Therefore, it would seem we are some decades away from claiming that genetic manipulation can cure brain-impairing diseases. Geneticists must share some of the blame for giving the public the wrong impression. The hyperbole surrounding the Human Genome Project gave both the public and medical workers a false sense of optimism about the outcome of the genome mapping. In the late 1990s, a pioneering gene therapist predicted that by 2020 virtually every disease would include gene therapy as part of its treatment. We are now little more than a year from that date, yet most research is still in phase one trials - and only concerns diseases that lack a conventional cure. It turned out that the mapping was just the simplest stage of a multi-part programme to understand the complexities of which genes code for which disorders.

Meanwhile, gene expression in the form of epigenetics has inspired a large and extremely lucrative wave of pseudo-scientific quackery that belongs in the same genre as homeopathy, crystal healing and all the other New Age flim-flam that uses real scientific terminology to part the gullible from their cash. The poor standard of science education outside of schools (and in many regions, probably within them too) has led to the belief that changing your lifestyle can fix genetic defects or effect cures for serious brain-based illnesses.

Alas, although gene expression can be affected by environmental influences, we are ultimately at the mercy of what we inherited from our parents. Until the astrocyte research has been verified, or a stem cell solution found, the terrible truth is that the victims of strokes and other brain-based maladies must rely upon established medical treatments.

This isn't to say that we may in some cases be able to reduce or postpone the risk with a better lifestyle; diet and exercise (of both the body and brain) are clearly important, but they won't work miracles. We need to wait for the outcome of the current research into astrocytes and iPSCs to find out if the human brain can be repaired after devastating attacks from within or without. Somehow I doubt that Homo superior is waiting round the corner, ready to take over the world from us unenhanced humans…