Wednesday 27 September 2017

Cow farts and climate fiddling: has agriculture prevented a new glaciation?

Call me an old grouch, but I have to say that one of my bugbears is the use of the term 'ice age' when what is usually meant is a glacial period. We currently live in an interglacial (i.e. warmer) era, the last glaciation having ended about 11,700 years ago. These periods are part of the Quaternary glaciation, which has existed for almost 2.6 million years with alternating but irregular cycles of warm and cold, and which does deserve the name 'Ice Age'. There, that wasn't too difficult now, was it?

What is rather more interesting is that certain geology textbooks published from the 1940s to 1970s hypothesised that the Earth was overdue for the next glaciation. Since the evidence suggests the last glacial era ended in a matter of decades, the proposed future growth of the ice sheets could be equally rapid. Subsequent research has shown this notion to be flawed, its reliance on extremely limited data having led to over-confident conclusions. In fact, current estimates put interglacial periods at anywhere from ten thousand to fifty thousand years, so even without human intervention in global climate there would presumably be little to panic about just yet.

Over the past three decades or so this cooling hypothesis has given way to the opposing notion of a rapid increase in global temperatures. You only have to read such recent news items as the breakaway of a six thousand square kilometre piece of the Antarctic ice shelf to realise something is going on, regardless of whether you believe it is manmade, natural or a combination of both. But there is a minority of scientists who claim there is evidence that global warming - and an associated postponement of the next glaciation - began thousands of years before the Industrial Revolution. This generates two key questions:

  1. Has there been a genuine steady increase in global temperature or is the data flawed?
  2. Assuming the increase to be accurate, is it due to natural changes (e.g. orbital variations or fluctuations in solar output) or is it anthropogenic, that is caused by human activity?

As anyone with even a vague interest in or knowledge of climate understands, the study of temperature variation over long timescales is fraught with issues, with computer modelling often seen as the only way to fill in the gaps. Therefore, like weather forecasting, it is far from being an exact science (insert as many smileys here as deemed appropriate). Although there are climate-recording techniques involving dendrochronology (tree rings) and coral growth that cover the past few thousand years, and ice cores that go back hundreds of thousands, there are still gaps and assumptions that mean the reconstructions involve variable margins of error. One cross-discipline assumption is that species found in the fossil record thrived in environments - and crucially at temperatures - similar to their descendants today. All in all this indicates that none of the numerous charts and diagrams displaying global temperatures over the past twelve thousand years are completely accurate, being more along the lines of a reconstruction via extrapolation.

Having looked at some of these charts, I have to say that to my untrained eye the various reconstructions show only limited agreement for the majority of the post-glacial epoch. There have been several short-term fluctuations in both directions in the past two thousand years alone, from the so-called Mediaeval Warm Period to the Little Ice Age of the Thirteenth to Nineteenth centuries. One important issue is just how wide a region these two anomalous periods covered outside of Europe and western Asia. Assuming however that the gradual warming hypothesis is correct, what are the pertinent details?

Developed in the 1920s, the Milankovitch cycles provide a reasonable fit for the evidence of regular, long-term variations in the global climate. The theory states that changes in the Earth's orbit and axial tilt are the primary causes of these variations, although the timelines do not provide indisputable correlation. This margin of error has helped lead other researchers towards an anthropogenic cause for a gradual increase in planet-wide warming since the last glaciation.
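
For readers unfamiliar with the theory, the three main cycles are the roughly 100,000-year variation in orbital eccentricity, the 41,000-year variation in axial tilt and the 23,000-year precessional wobble. The toy Python sketch below (my own illustration, with arbitrary amplitudes; genuine reconstructions use full orbital solutions rather than simple sinusoids) merely superimposes these periods to show why the combined forcing is quasi-periodic rather than a neat, regular beat:

    import numpy as np

    # Toy model only: real insolation calculations use full orbital
    # solutions, not simple sinusoids. Periods are approximate; the
    # amplitudes below are arbitrary and purely illustrative.
    ECCENTRICITY_KYR = 100.0  # orbital shape, ~100,000-year cycle
    OBLIQUITY_KYR = 41.0      # axial tilt, ~41,000-year cycle
    PRECESSION_KYR = 23.0     # axial wobble, ~23,000-year cycle

    def toy_orbital_forcing(kyr_before_present):
        """Sum three sinusoids as a stand-in for combined orbital forcing."""
        t = kyr_before_present
        return (0.5 * np.sin(2 * np.pi * t / ECCENTRICITY_KYR)
                + 1.0 * np.sin(2 * np.pi * t / OBLIQUITY_KYR)
                + 0.8 * np.sin(2 * np.pi * t / PRECESSION_KYR))

    t = np.linspace(0, 800, 1601)  # the past 800,000 years
    forcing = toy_orbital_forcing(t)
    print(f"Forcing proxy ranges from {forcing.min():.2f} to {forcing.max():.2f}")

Even in this crude form, the beat patterns between the three periods hint at why glacial onsets are irregular rather than metronomic - and why matching the cycles precisely to the climate record is so difficult.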

The first I heard of this was via Professor Iain Stewart's 2010 BBC series How Earth Made Us, in which he summarised the ideas of American palaeoclimatologist Professor William Ruddiman, author of Plows, Plagues and Petroleum: How Humans Took Control of Climate. Although many authors, Jared Diamond amongst them, have noted the effects of regional climate on local agriculture and indeed on the societies engaged in farming, Professor Ruddiman is a key exponent of the reverse: that pre-industrial global warming has resulted from human activities. Specifically, he argues that the development of agriculture has led to increases in atmospheric methane and carbon dioxide, creating an artificial greenhouse effect long before the burning of fossil fuels became ubiquitous. It is this form of climate change that has been cited as postponing the next glaciation, assuming that the current interglacial is at the shorter end of such timescales. Ruddiman's research identifies two major causes of the increase in these greenhouse gases:

  1. Increased carbon dioxide emissions from burning vegetation, especially trees, as a form of land clearance, i.e. slash and burn agriculture.
  2. Increased methane from certain crops, especially rice, and from ruminant species, mostly cattle, sheep and goats.

There are of course issues surrounding many of the details, even down to accurately pinpointing the start dates of human agriculture around the world. The earliest evidence of farming in the Near East is usually dated to a few millennia after the end of the last glaciation, with animal husbandry preceding the cultivation of crops. One key issue concerns the lack of sophistication in estimating the area of cultivated land and ruminant population size until comparatively recent times, especially outside of Western Europe. Therefore the generally unsatisfactory data concerning global climate is accompanied by even less knowledge of the scale of agriculture across the planet for most of farming's existence.

The archaeological evidence in New Zealand proves without a doubt that the ancestors of today's Maori, who probably first settled the islands in the Thirteenth Century, undertook enormous land clearance schemes. Therefore even cultures remote from the primary agricultural civilisations have used similar techniques on a wide scale. The magnitude of these works challenges the assumption that, until chemical fertilisers and pesticides were developed in the Twentieth Century, the area of land required per person had altered little since the first farmers. In a 2013 report Professor Ruddiman claims that the level of agriculture practised by New Zealand Maori is just one example of wider-scale agricultural land use in pre-industrial societies.

As for the role played by domesticated livestock, Ruddiman goes on to argue that ice core data shows an anomalous increase in atmospheric methane from circa 3000 BCE onwards. He hypothesises that a rising human population led to a corresponding increase in the scale of agriculture, with rice paddies and ruminants the prime suspects. As mentioned above, the number of animals and size of cultivated areas remain largely conjectural for much of the period in question. Estimates suggest that contemporary livestock are responsible for 37% of anthropogenic methane and 9% of anthropogenic carbon dioxide, whilst cultivated rice may be generating up to 20% of anthropogenic methane. Extrapolating back in time allows the hypothesis to gain credence, despite the lack of exact data.
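
To see both the appeal and the fragility of such extrapolation, here is a deliberately naive back-of-envelope sketch in Python. Every number in it is an illustrative placeholder rather than Ruddiman's actual data - which is precisely the problem, since historical herd sizes and paddy areas are largely conjectural:

    # A deliberately naive model: assume agricultural methane scales
    # directly with human population. All figures are illustrative
    # placeholders, not real measurements or Ruddiman's data.
    MODERN_AG_CH4_TG = 100.0   # hypothetical annual agricultural CH4, in teragrams
    MODERN_POPULATION_B = 7.5  # world population in billions, circa 2017

    def scaled_ch4(past_population_b):
        """Scale modern emissions by the ratio of past to present population,
        assuming constant per-capita agricultural activity."""
        return MODERN_AG_CH4_TG * past_population_b / MODERN_POPULATION_B

    # Hypothetical population estimates (billions) at selected dates
    for label, pop in [("3000 BCE", 0.025), ("1 CE", 0.2), ("1800 CE", 1.0)]:
        print(f"{label}: ~{scaled_ch4(pop):.1f} Tg CH4/yr under this naive model")

Even this crude model exposes the core difficulty: change the assumed per-capita figure or the population estimates and the reconstructed emissions shift wildly, which is why critics can reasonably question the hypothesis.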

In addition, researchers both for and against pre-industrial anthropogenic global warming admit that the complexity of feedback loops, particularly with respect to the role of temperature variation in the oceans, further complicates matters. Indeed, such intricacy, including the potential latency between cause and effect, means that proponents of Professor Ruddiman's ideas could be using selective data for support whilst ignoring its antithesis. Needless to say, cherry-picking results is hardly model science.

There are certainly some intriguing aspects to this idea of pre-industrial anthropogenic climate change, but personally I think the jury is still out (as I believe it is for the majority of professionals in this area). There just isn't the depth of data to guarantee its validity, and what data is available doesn't provide enough correlation to rule out other causes. I still think such research is useful, since it could well prove essential in the fight to mitigate industrial-era global warming. The more we know about longer-term variations in climate, the better the chance we have of understanding the causes of - and potentially the solutions to - our current predicament. And who knows, the research might even persuade a few of the naysayers to move in the right direction. That can't be bad!

Monday 11 September 2017

Valuing the velvet worm: noticing the most inconspicuous of species

Most of the recent television documentaries or books I've encountered that discuss extra-terrestrial life include some description of the weirder species we share our own planet with. Lumped together under the term 'extremophiles', these organisms appear to thrive in environments hostile to most other life forms, from the coolant ponds of nuclear reactors to the boiling volcanic vents of the deep ocean floor.

Although this has rightly gained attention for these often wonderfully-named species (from snottites to tardigrades), there are numerous other lifeforms scarcely noticed by anyone other than a few specialists, quietly going about their unassuming business. However, they may provide a few useful lessons for all of us, chief among them that rapid yet radical modifications to local environments may generate problems we fail to recognise until too late.

There is a small, unassuming type of creature alive today that differs little from a marine animal present in the Middle Cambrian period around five hundred million years ago. I first read about onychophorans in Stephen Jay Gould's 1989 exposition on the Burgess Shale, Wonderful Life, and although those fossil marine lobopodians are not definitively onychophorans they are presumed to be ancestral. More commonly known by one genus, peripatus, or even more colloquially as velvet worms, there are at least several hundred species around today, possibly many more. The velvet component of their name is due to their texture, but they bear more resemblance to caterpillars than to worms. They are often described as a 'missing link' between arthropods and worms but, as is usually the case with that phrase, it is wildly inappropriate in the context of biological classification. The key difference from the Burgess Shale specimens is that today's velvet worms are fully terrestrial: there are no known marine or freshwater species.

Primarily resident in the southern hemisphere, the largely nocturnal peripatus shun bright light and require humid conditions to survive. Although there are about thirty species here in New Zealand, a combination of their small size (under 60mm long) and loss of habitat means they are rarely seen. The introduction of predators such as hedgehogs - which of course never meet peripatus in their northern hemisphere home territory - means that New Zealand's species have even more to contend with. Although I frequently (very carefully) look under leaf litter and inside damp logs on bush walks in regions known to contain the genus Peripatoides - and indeed where others have told me they have seen them - I have yet to encounter a single specimen.

There appears to be quite limited research, with fewer than a third of New Zealand species fully described. However, enough is known about two species to identify their population status as 'vulnerable'. One forest in the South Island has been labelled an 'Area of Significant Conservation Value' thanks to its population of peripatus, with the Department of Conservation relocating specimens prior to road development. Clearly, they had better luck locating velvet worms than I have had! It isn't just New Zealand that lacks knowledge of its home-grown onychophorans either: in the past two decades Australian researchers have increased the number of their known species from just seven to about sixty.

Their uncanny resemblance to the Burgess Shale specimens, despite their transition from marine to terrestrial environments, has led velvet worms to be described by another well-worn phrase, 'living fossils'. However, is this short-hand in any way useful, or is it a lazy and largely inaccurate term? The recent growth in sophisticated DNA analysis suggests that even when outward anatomy changes little, the genome itself may vary widely. Obviously DNA doesn't survive fossilisation over such timescales, and so any genetic changes cannot be tracked from the Cambrian specimens, but the variation found in other types of organisms sharing a similar appearance shows that reliance on external anatomy alone can be deceptive.

Due to lack of funding, basic taxonomic research, the bedrock of cladistics, is sadly lacking. In the case of New Zealand, some of the shortfall has been made up for by dedicated amateurs, but there are few new taxonomists learning the skills to continue this work - which is often seen as dull and plodding compared to the excitement of, for example, genetics. Most people might ask what interest there could be in such tiny, insignificant creatures as peripatus. After all, how likely would you be to move an ant's nest in your garden before undertaking some re-landscaping? But as shown by the changing terminology from 'food chains' to 'food webs', in most cases we still don't understand how the removal of one species might generate a domino effect on a local ecosystem.

I've previously discussed the over-reliance on 'poster' species such as giant pandas for environmental campaigns, but mere aesthetics don't equate to importance, either for us or for ecology as a whole. It is becoming increasingly clear that by weight the majority of our planet's biomass is microbial. Then come the insects, with the beetles prominent both by number of species and of individuals. We large mammals are really just the icing on the cake, and certainly when it comes to Homo sapiens, the rest of the biosphere would probably be far better off without us, domesticated species aside.

It would be nice to value organisms for themselves, but unfortunately our market economies require the smell of profit before they will lift a finger. Therefore if their usefulness could be ascertained, it might help generate greater financial incentive to support the wider environment. Onychophorans may seem dull, but there are several aspects to them that are both interesting in themselves and might also provide something fruitful for us humans.

Firstly, they have an unusual weapon in the form of a mechanism that shoots adhesive slime at prey. Like spider silk, might this prove an interesting line of research for the materials or pharmaceutical industries? After all, it was the prickly burrs of certain plants that inspired the development of Velcro, whilst current studies of tardigrades (the tiny 'water bears' living amongst the mosses) are investigating their near indestructibility. If even a single, tiny species becomes extinct, that genome is generally lost forever: who knows what insights it might have led to? Although museum collections can be useful, DNA does decay, and contamination leads to immense complexities in unravelling the original organism's genome. All in all, it's much better to have a living population to work on than to rely on what can be pieced together post-extinction.

In addition, for such tiny creatures, velvet worms have developed complex social structures; is it possible that analysis of their brains might be useful in computing or artificial intelligence? Of course it is unlikely, but the chance disappears entirely if the species is lost - and although extinction is nothing if not natural, the current rate is far greater than it has been outside of mass extinctions. Losing a large and obvious species such as the Yangtze River dolphin (and that was despite it being labelled a 'national treasure') is one thing, but how many small, barely-known plants and animals are going the same way without anyone noticing? Could it be that right now some minute, unassuming critter is dying out and that we will only find out too late that it was a vital predator of crop-eating pests like snails or disease vectors such as cockroaches?

It has been said that ignorance is bliss, but with so many humans needing to be fed, watered and treated for illness, now more than ever we need as much help as we can get. Having access to the complex ready-made biochemistry of a unique genome is surely easier than attempting to synthesise one from scratch or recover it from a long-dead preserved specimen? By paying minimal attention to the smallest organisms that lie all around us, we could be losing so much more than just an unobtrusive plant, animal or fungus.

We can't save every species on the current endangered list, but more attention could be given to the myriad life forms that get side-lined by the cute and cuddly flagship species, usually large animals. Most of us would be upset by the disappearance of the eighteen hundred or so giant pandas still left in the wild, but somehow I doubt their loss would have as great an impact on the surrounding ecosystem as that of some far less well-known flora or fauna. If you think that's nonsense, then consider the vital roles that bees and dung beetles play in human agriculture.

Although the decimation of native New Zealand wildlife has led to protective legislation for all our vertebrates and a few famous invertebrates such as the giant weta, the vast majority of other species are still left to their own devices. Not that the ecosystems in most other countries fare much better, of course. But without funding for basic description and taxonomy, who knows what is even out there, never mind whether it might be important to humanity? Could this be a new field for citizen scientists to move into?

Needless to say, the drier climes brought on by rising temperatures will not do peripatus any favours, thanks to their need to remain in damp conditions. Whether via widespread use of the poison 1080 (in the bid to create a pest-free New Zealand by 2050) or the accidental importation of non-native pathogens such as those decimating amphibians worldwide and causing kauri dieback in New Zealand, there are plenty of ways that humans could unwittingly wipe out velvet worms and their kin. So next time you watch a documentary on the demise of large, familiar mammals, why not spare a thought for all those wee critters hiding in the bush, going about their business and trying to avoid all the pitfalls us humans have unthinkingly laid for them?

Tuesday 29 August 2017

Cerebral celebrities: do superstar scientists harm science?

One of my earliest blog posts concerned the media circus surrounding two of the most famous scientists alive today: British physicist Stephen Hawking and his compatriot the evolutionary biologist Richard Dawkins. In addition to their scientific output, they are known in public circles thanks to a combination of their general readership books, television documentaries and charismatic personalities. The question has to be asked, though: how much of their reputation is due to their being easily-caricatured and therefore media-friendly characters rather than to what they have contributed to human knowledge?

Social media has done much to democratise the publication of material from a far wider range of authors than previously possible, but the current generation of scientific superstars who have arisen in the intervening eight years appear to be party to a feedback loop that places personality as the primary reason for their media success. As a result, are science heroes such as Neil deGrasse Tyson and Brian Cox merely adding the epithet 'cool' to STEM disciplines as they sit alongside the latest crop of media and sports stars? With their ability to fill arenas usually reserved for pop concerts or sports events, these scientists are seemingly known far and wide for who they are as much as for what they have achieved. It might seem counterintuitive to think that famous scientists and mathematicians could be damaging STEM, but I'd like to put forward five ways by which this could be occurring:

1: Hype and gossip

A recent example that caught my eye was a tweet by British astrophysicist and presenter Brian Cox, containing a photograph of two swans he labelled 'Donald' and 'Boris'. I assume this was a reference to the current US president and British foreign secretary, but with over a thousand 'likes' by the time I saw it I wonder what other, more serious, STEM-related stories might have been missed in the rapid ebb and flow of social media. If fans of famous scientists spend their time reading, liking and commenting at similarly trivial levels, they may miss important material from other, less famous sources.

As you would expect with popular culture fandom, the science celebrities' material aimed at a general audience receives the lion's share of attention, leaving the vast majority of STEM popularisations under-recognised. Although social media has exacerbated this, the phenomenon pre-dates it. For example, Stephen Hawking's A Brief History of Time was first published in 1988, the same year as Timothy Ferris's Coming of Age in the Milky Way, a rather more detailed approach to similar material that was left overshadowed by its far more famous competitor. There is also the danger that celebrities with a non-science background might try to cash in on the current appeal of science and write poor-quality popularisations. If you consider this unlikely, bear in mind that there are already numerous examples of extremely dubious health, diet and nutrition books written by pop artists and movie stars. If scientists can be famous, perhaps the famous will play at being science writers.

Another result of this media hubbub is that in order to be heard, some scientists may be guilty of the very hype usually blamed on the journalists who publicise their discoveries. Whether to guarantee attention or to self-promote in order to gain further funding, an Australian research team recently came under fire for discussing a medical breakthrough as if a treatment were imminent, despite having so far only experimented on mice! This sort of hyperbole both damages the integrity of science in the public eye and can lead to such dangerous outcomes as the MMR scandal, which resulted in large numbers of children not being immunised.

2: Hero worship

The worship of movie stars and pop music artists is nothing new, and the adulation accorded them reminds me of the not dissimilar veneration shown to earlier generations of secular and religious leaders. The danger here, then, is for impressionable fans to accept the words of celebrity scientists as if they were gospel and so refrain from any form of critical analysis. When I attended an evening with astrophysicist Neil deGrasse Tyson last month I was astonished to hear some fundamental misunderstandings of science from members of the public. It seemed as if Dr Tyson had gained a personality cult whose members hung on his every utterance but frequently failed to understand the wider context or key issues regarding the practice of science. By transferring hero worship from one form of human activity to another, the very basis - and differentiation - that delineates the scientific enterprise may be undermined.

3: Amplifying errors

Let's face it, scientists are human and make mistakes. The problem is that if the majority of a celebrity scientist's fan base are prepared to lap up every statement, then the lack of critical analysis can generate further issues. There are some appalling gaffes in the television documentaries and popular books of such luminaries as Sir David Attenborough (as previously discussed), and even superstar Brian Cox is not immune: his 2014 book Human Universe described lunar temperatures dropping below -2000 degrees Celsius - an impossibility, given that absolute zero is only about -273 degrees Celsius! Such basic errors imply that the material is ghost-written or edited by authors with little scientific knowledge and no time for fact-checking. Of course this may embarrass the science celebrity in front of their potentially jealous colleagues, but more importantly it can serve as ammunition for politicians, industrialists and pseudo-scientists in their battles to persuade the public of the validity of their own pet theories - post-truth will out, and all that nonsense.

4: Star attitude

With celebrity status comes the trappings of success, most usually defined as a luxury lifestyle. A recent online discussion here in New Zealand concerned the high cost of tickets for events featuring Neil deGrasse Tyson, Brian Greene, David Attenborough, Jane Goodall and, later this year, Brian Cox. Those for Auckland-based events were more expensive than tickets to see Kiwi pop star Lorde and similar in price to those for rugby matches between the All Blacks and British Lions. By making the tickets this expensive there is little chance of attracting new fans; it seems to be more a case of preaching to the converted.

Surely it doesn't have to be this way: the evolutionary biologist Beth Shapiro, author of How to Clone a Mammoth, gave an excellent free illustrated talk at Auckland Museum a year ago. It seems odd that the evening with Dr Tyson, for example, consisting of just himself, interviewer Michelle Dickinson (A.K.A. Nanogirl) and a large screen, cost approximately double that of the Walking with Dinosaurs Arena event at the same venue two years earlier, which utilised US$20 million worth of animatronic and puppet life-sized dinosaurs.

Dr Tyson claims that by having celebrity interviewees on his StarTalk series he can reach a wider audience, but clearly this wider reach is not feasible when his tour prices are so high. At least Dr Goodall's profits went into her conservation charity, but if you consider that Dr Tyson had an audience of probably over 8000 in Auckland alone, paying between NZ$95 and NZ$349 (except for the NZ$55 student tickets), you have to wonder where all this money goes: is he collecting 'billions and billions' of fancy waistcoats? It doesn't look as if this trend will stop soon either, as Bill Nye (The Science Guy) has just announced that he will be touring Australia later this year, with tickets starting at around NZ$77.
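
As a rough sanity check on that question, a couple of lines of Python bound the gross takings using the figures quoted above; the split between seat classes is unknown, so only a lower and upper bracket are possible:

    # Bounding the gross takings for the Auckland event, using the audience
    # size and ticket price range quoted above. The seat-class breakdown is
    # unknown, so this is only a bracket, not an actual figure.
    AUDIENCE = 8000
    LOW_TICKET_NZD = 95
    HIGH_TICKET_NZD = 349

    print(f"Gross takings: NZ${AUDIENCE * LOW_TICKET_NZD:,} "
          f"to NZ${AUDIENCE * HIGH_TICKET_NZD:,}")
    # => NZ$760,000 to NZ$2,792,000, before venue and touring costs

Even the lower bound sits comfortably above the budgets of the crowdfunded research projects discussed in the next section, which is rather the point.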

5: Skewing the statistics

The high profiles of sci-comm royalty and their usually cheery demeanour imply that all is well in the field of scientific research, with adequate funding for important projects. However, even a quick perusal of less well-known STEM professionals on social media proves that this is not the case. An example that came to my attention back in May was that of the University of Auckland microbiologist Dr Siouxsie Wiles, who had to resort to crowdfunding for her research into fungi-based antibiotics after five consecutive funding submissions were rejected. Meanwhile, Brian Cox's connection to the Large Hadron Collider gives the impression that even such blue-sky research as the LHC can be guaranteed enormous budgets.

As much as I'd like to thank these science superstars for promoting science, technology and mathematics, I can't quite shake the feeling that their cult status is too centred on them rather than on the scientific enterprise as a whole. Now more than ever science needs a sympathetic ear from the public, but this should be brought about by a massive programme to educate the public (they are the taxpayers, after all) as to the benefits of such costly schemes as nuclear fusion reactors and climate change research. Simply treating celebrity scientists in the same way as movie stars and pop idols won't help an area of humanity under siege from so many influential political and industrial leaders with their own private agendas. We simply mustn't allow such people to misuse the discipline that has raised us from apemen to spacemen.

Friday 11 August 2017

From steampunk to Star Trek: the interwoven strands between science, technology and consumer design

With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras for example now reach worldwide sales of over five million per year. That's hardly a niche market!

Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that whilst miniaturisation has reduced energy consumption for many smaller pieces of technology - from the Frankenstein laboratory appearance of valve-based computing and room-sized mainframes down to the smart watch et al. - the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken the previous perennial favourite example of space rockets when it comes to power usage.

The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made, for example telescopes or crystal radios. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravitational-wave detectors or CRISPR-Cas9 genome editing tools any time soon.

The current trend in favour - or at least acknowledgement - of sustainable development is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) equates with progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to have been five main phases since the late Victorian period:
  1. Imperial steam
  2. Streamlining and speed
  3. The Atomic Age
  4. Minimalism and information technology
  5. Virtual light

1) Imperial steam

In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that, seemingly without effort, carries an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.

I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of this phase's design, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that never went beyond the aeolipile toy turbine and the Antikythera mechanism.

2) Streamlining and speed

From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, where the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific class steam locomotives.

3) The Atomic Age

By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. It seems ironic that, somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (classical physics) model of the atom itself gained a key place in post-war design.

Combined with rockets and space, the imagery could readily be termed 'space cadet', but it wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised x-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (which famously gave Crick and Watson the edge in winning the race to understand the structure of DNA).

Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.

4) Minimalism and information technology

From the early 1970s the bright, optimistic designs of the previous quarter-century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied, if refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video games consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.

5) Virtual light

With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to computer holographic interfaces in many Hollywood blockbusters represent both the actual awesome power required by the likes of the Large Hadron Collider and an analogy for visually unspectacular real-life lasers and quantum teleportation. The ultimate fusion (sorry, couldn't resist that one) is the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek Into Darkness.

Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.

Friday 28 July 2017

Navigating creation: A Cosmic Perspective with Neil deGrasse Tyson


I recently attended an interesting event at an Auckland venue usually reserved for pop music concerts. An audience in the thousands came to Neil deGrasse Tyson: A Cosmic Perspective, featuring the presenter of Cosmos: A Spacetime Odyssey and the radio/TV show StarTalk. The 'Sexiest Astrophysicist Alive' presented his brand of science communication to an enormous congregation (forgive the use of the word) of science fans, some as young as five years old. So was the evening a success? My fellow science buffs certainly seemed to have enjoyed it, so I decided it would be worthwhile to analyse the good doctor's method of large-scale sci-comm.

The evening was split into three sections, the first being the shortest: a primer as to our location in both physical and psychological space-time. After explaining the scale of the universe via a painless introduction to exponents, Dr Tyson used the homespun example of how the 'billions' (which of course he declared to be Carl Sagan's favourite word) of Big Macs so far sold could be stacked many times around the Earth's circumference and even then extend onwards to the Moon and back. Although using such a familiar object in such unusual terrain is a powerful way of taking people outside their comfort zone, there was nothing new about this particular insight, since Dr Tyson has been using it since at least 2009; I assume it was a case of sticking to a tried-and-trusted method, especially when the rest of the evening was (presumably) unscripted.

Billions of Big Macs around the Earth and moon
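
The arithmetic behind the illustration is easy to sketch in Python; note that the burger count and dimensions below are my own assumptions, not Dr Tyson's exact figures:

    # Rough numbers for the Big Mac illustration. The burger count and
    # height are assumptions, not Dr Tyson's exact figures.
    BURGERS_SOLD = 100e9            # McDonald's famous "over 99 billion served"
    BURGER_HEIGHT_M = 0.07          # assume roughly 7 cm per Big Mac
    EARTH_CIRCUMFERENCE_KM = 40_075
    EARTH_MOON_DISTANCE_KM = 384_400

    stack_km = BURGERS_SOLD * BURGER_HEIGHT_M / 1000
    print(f"Total stack: {stack_km:,.0f} km")                                  # ~7,000,000 km
    print(f"Laps of the Earth: {stack_km / EARTH_CIRCUMFERENCE_KM:,.0f}")      # ~175
    print(f"Moon round trips: {stack_km / (2 * EARTH_MOON_DISTANCE_KM):,.1f}") # ~9.1

Under these assumptions the stack manages around 175 laps of the Earth with enough left over for several return trips to the Moon; whatever exact figures are used, the point about the sheer scale of 'billions' stands.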

Having already belittled our location in the universe, the remainder of the first segment appraised our species' smug sense of superiority, questioning whether extra-terrestrials would have any more interest in us than we show to most of the biota here on Earth. This was a clear attempt to ask the audience to question the assumptions that science fiction, particularly of the Hollywood variety, has been popularising since the dawn of the Space Age. After all, would another civilisation consider us worthy of communicating with, considering how much of our broadcasting displays obvious acts of aggression? In this respect, Neil deGrasse Tyson differs markedly from Carl Sagan, who argued that curiosity would likely be a mutual connection with alien civilisations, despite their vastly superior technology. Perhaps this difference of attitude isn't surprising, considering how Sagan's optimism has been negated by both general circumstance and the failure of SETI in the intervening decades.

Dr Tyson also had a few gibes at the worrying trend of over-reliance on high technology in place of basic cognitive skills, describing how, after once working out some fairly elementary arithmetic, he was asked which mobile app he had used to gain the result! This was to become a central theme of the evening, repeated several times in different guises: that rather than just learning scientific facts, non-scientists can benefit from practising critical thinking in everyday, non-STEM situations.

Far from concentrating solely on astrophysical matters, Dr Tyson also followed up on topics he had raised in Cosmos: A Spacetime Odyssey regarding environmental issues here on Earth. He used Apollo 8's famous 'Earthrise' photograph (taken on Christmas Eve 1968) as an example of how NASA's lunar landing programme inspired a cosmic perspective, adding that organisations such as the National Oceanic and Atmospheric Administration and the Environmental Protection Agency were founded during the programme. His thesis was clear: what began for political and strategic reasons had fundamental benefits across sectors unrelated to space exploration; or as he put it, "We're thinking we're exploring the moon and we discovered the Earth for the first time."

The second and main part of the event was Tyson's discussion with New Zealand-based nanotechnologist and science educator Michelle Dickinson, A.K.A. Nanogirl. I can only assume that there aren't any New Zealand astronomers or astrophysicists as media-savvy as Dr Dickinson, or possibly it's a case of celebrity first and detailed knowledge second, with a scientifically-minded interviewer deemed to have an appropriate enough mindset even if not an expert in the same specialisation.

The discussion/interview was enlightening, especially for someone like myself who knows Neil deGrasse Tyson as a presenter but very little about him as a person. Dr Tyson reminisced about how in 1989 he accidentally became a media expert solely on the basis of being an astrophysicist, without reference to his being an Afro-American - counter to the prevailing culture of only featuring Afro-Americans for their point of view on Afro-American issues.

Neil deGrasse Tyson: A Cosmic Perspective

Dr Tyson revealed himself to be both a dreamer and a realist, the two facets achieving a focal point with his passion for a crewed mission to Mars. He has often spoken of this desire to increase NASA's (comparatively small) budget so as to reinvigorate the United States by taking humans out of the humdrum comfort zone of low Earth orbit. However, his understanding of how dangerous such a mission would be led him to state he would only go to Mars once the pioneering phase was over!

His zeal for his home country was obvious - as was his frustration at its missed opportunities and the grass-roots rejection of scientific expertise prevalent in the United States - and it would be easy to see his passionate pleas for the world to embrace Apollo-scale STEM projects as naïve and out-of-touch. Yet there is something to be said for such epic schemes; if the USA is to rise out of its present lassitude, then the numerous if unpredictable long-term benefits of, for example, a Mars mission are a potential call-to-arms.

The final part of the evening was devoted to audience questions. As I was aware of most of the STEM and sci-comm components previously discussed this was for me perhaps the most illuminating section of the event. The first question was about quantum mechanics, and so not unnaturally Dr Tyson stated that he wasn't qualified to answer it. Wouldn't it be great if the scientific approach to expertise could be carried across to other areas where people claim expert knowledge that they don't have?

I discussed the negative effects that the cult of celebrity could have on the public attitude towards science back in 2009, so it was extremely interesting to hear questions from several millennials who had grown up with StarTalk and claimed Neil deGrasse Tyson as their idol. Despite having watched the programmes and presumably having read some popular science books, they fell into some common traps, from over-reliance on celebrities as arbiters of truth to assuming that most scientific theories, rather than just those at the cutting edge, would be overturned by new discoveries within their own lifetimes.

Dr Tyson went to some lengths to correct this latter notion, describing how Newton's law of universal gravitation, for example, has become a subset of Einstein's General Theory of Relativity. Again, this reiterated that science isn't just a body of facts but a series of approaches to understanding nature. The Q&A session also showed that authority figures can have a rather obvious dampening effect on people's initiative to attempt critical analysis for themselves. This suggests a no-win situation: either the public obediently believes everything experts tell them (which leads to such horrors as the MMR vaccine scandal) or they fail to believe anything from STEM professionals, leaving the way open for pseudoscience and other nonsense. Dr Tyson confirmed he wants to teach the public to think critically, reducing gullibility and thus exploitation by snake-oil merchants. To this end he follows in the tradition of James 'The Amazing' Randi and Carl Sagan, which is no bad thing in itself.
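
The Newton-within-Einstein relationship can be made concrete in a couple of lines (my own sketch, not something presented at the event): in the weak-field, slow-motion limit, the metric and field equations of General Relativity reduce to the Poisson equation of Newtonian gravity.

    % Sketch: Newtonian gravity recovered as the weak-field limit of GR
    g_{00} \approx -\Bigl(1 + \frac{2\Phi}{c^{2}}\Bigr), \qquad
    R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
    \;\xrightarrow{\;\Phi/c^{2}\,\ll\,1,\;\;v\,\ll\,c\;}\;
    \nabla^{2}\Phi = 4\pi G\rho

The familiar inverse-square attraction then follows from F = -m∇Φ, with Φ = -GM/r for a point mass: Newton's law survives intact as a limiting case rather than being discarded.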

In addition, Dr Tyson stated that by interviewing media celebrities on StarTalk he can reach a far wider audience than just dedicated science fans. For this alone Neil deGrasse Tyson is a worthy successor to the much-missed Sagan. Let's hope some of those happy fans will be inspired not just to dream, but to actively promote the cosmic perspective our species sorely needs if we are to climb out of our current doldrums.

Monday 10 July 2017

Genius: portraying Albert Einstein as a human being, not a Hollywood stereotype

I recently watched the National Geographic docudrama series Genius, presenting a warts-and-all look at the life and work of Albert Einstein. In these post-truth times in which even a modicum of intellectual thought is often regarded with disdain, it's interesting to see how a scientific icon is portrayed in a high-budget, high-profile series.

A few notable exceptions aside, Dr Frankenstein figures still inform much of Hollywood's depiction of STEM practitioners. Inventors are frequently compartmentalised as either patriotic or megalomaniac, often with a love of military hardware; Jurassic Park's misguided and naive Dr John Hammond is seemingly a rare exception. As for mathematicians, they are often depicted with more than a touch of insanity, as in Pi or Fermat's Room.

So does Genius break the mould or follow the public perception of scientists as freaky, geeky, nerdy or plain evil? The script is a fairly sophisticated adaptation of real-life events, although the science exposition suffers as a result. Despite some computer graphic sequences interwoven with the live action, the attempts to explore Einstein's thought experiments and theories are suggestive rather than comprehensive: the tip of the iceberg when it comes to his scientific legacy. Where the series succeeds is in describing the interaction of all four STEM disciplines - science, technology, engineering and mathematics - and the benefits when they overlap. The appalling attitudes prevalent in the academia of his younger years are also brought to vivid life, with such nonsense as the prohibition on questioning tutors piled onto the usual misogyny and xenophobia.

Albert Einstein

Contrary to the popular conception of the lone genius - and counter to the series' title - the role of Einstein's friends such as Marcel Grossmann and Michele Besso as his sounding boards and mathematical assistants is given a high profile. In addition, the creative aspect of science is brought to the fore in sequences that show how Einstein gained inspiration towards his special and general theories of relativity.

The moral dimension of scientific research is given prominence, from Fritz Haber's development of poison gas to Leo Szilard's persuasion of Einstein to first encourage and later discourage the development of atomic weapons. As much as the scientific enterprise might appear to be separate from the rest of human concern, it is deeply interwoven with society; the term 'laboratory conditions' applies to certain processes, not to a wall isolating science from everything else. Scientists in Genius are shown to have the same human foibles as everyone else, from Einstein's serial adultery (admittedly veering towards Hollywood family drama at times, paternal guilt complex et al.) to Philipp Lenard's dismissal of Einstein's theories due to his anti-Semitism rather than any scientific evidence. So much for scientific impartiality!

The last few episodes offer a poignant description of how even the greatest of scientific minds lose impetus, passing from the creative originality of young rebels to conservative, middle-aged stick-in-the-muds out of touch with the cutting edge. General-readership books on physics often claim theoretical physicists do their best work before they are thirty, a common example being that Einstein might as well have spent his last twenty years fishing. Although not as detailed as the portrayal of his early, formative years, Einstein's obsessive (but failed) quest to find fault with quantum mechanics is a good description of how even the finest minds can falter.

All in all, the first series of Genius is a very noble attempt to describe the inspiration and background that led to some revolutionary scientific theories. The irony is that by concentrating on Einstein as a human being it might help the wider public gain a better appreciation, if not a comprehensive understanding, of the work of scientists and the role of STEM in society. Surely that's no bad thing, especially if it makes Hollywood rethink the lazy stereotype of the crazy-haired scientist seeking world domination. Or even encourages people to listen to trained experts rather than the rants of politicians and religious nutbars. Surely that's not a difficult choice?