Friday, 23 December 2016

O Come, All ye Fearful: 12 woes for Christmas future

This month I thought I would try to adopt something of the Yuletide spirit by offering something short and sharp (if not sweet) that bears a passing resemblance to the carol The Twelve Days of Christmas. However, instead of gifts I'll be attempting to analyse twelve key concerns that humanity may face in the near future, some being more immediate - not to mention inevitable - than others.

I'll start off with the least probable issues then gradually work down to those most likely to have widespread effects during the next few decades. As it is meant to be a season of good cheer I'll even suggest a few solutions or mitigation strategies where these are applicable. The ultimate in low-carb gifts: what more could you ask for?

12. ET phones Earth. With the SETI Institute and the Breakthrough Listen project leading efforts to pick up signals from alien civilisations, what are the chances that we might receive an extra-terrestrial broadcast in the near future? Although many people might deem this just so much science fiction, the contents of a translated message (or an autonomous probe) could prove catastrophic. Whether it sparked faith-based wars or aided the development of advanced technology we couldn't control - or wouldn't be morally fit enough to utilise - there may be as many negative outcomes as positive ones.

Solution: Keeping such information secret, especially the raw signal data, would be incredibly difficult. Whether an international translation project could be conducted in secret is another matter, with censorship allowing a regular trickle of the less controversial information into the public domain. Whilst this is the antithesis of good scientific practice, it could prove to be the best solution in the long term. Not that most politicians are ever able to see anything that way, however!

11. Acts of God. There is a multitude of naturally occurring events that are outside human control, both terrestrial (e.g. supervolcano eruptions, tsunamis) and extra-terrestrial, such as asteroid impacts. Again, until recently few people took much interest in the latter, although Hollywood generated some awareness via several rather poor movies in the late 1990s. The Chelyabinsk meteor of February 2013 (rather than meteorite, as most of the material exploded at altitude) led to around 1,500 injuries, showing that even a small object that doesn't reach the ground intact can cause havoc. Since 2000, there have been over twenty asteroid impacts or atmospheric break-ups with energies ranging from a kiloton up to half a megaton.

Solution: Although there are various projects to assess the orbits of near-Earth objects (NEOs), the development of technologies to deflect or destroy impactors requires much greater funding than is currently in place. Options range from devices that use just their velocity to knock NEOs off-course to the brute force approach of high-powered lasers and hydrogen bombs. However, with the cancellation of NASA's Ares V heavy launch vehicle it's difficult to see how such solutions could be delivered in time. Hopefully in the event something would be cobbled together pretty quickly!

10. Grey goo scenario. As outlined by Eric Drexler in his 1986 book Engines of Creation, what if self-replicating nanobots (developed, for example, for medical purposes) were to break their programming and escape into the world, consuming everything in their path? Like locust swarms, they would be limited only by the availability of raw materials.

Solution: The Royal Society's 2004 report on nanoscience declared that the possibility of such von Neumann machines is some decades away and therefore of little concern to regulators. Since then, other research has suggested there should be limited need to develop such machines anyway. So that's good to know!

9. Silicon-destroying lifeforms. What if natural mutations led to biological organisms that could seriously damage integrated circuitry? A motherboard-eating microbe would be devastating, especially in the transport and medical sectors, never mind the resulting communication network outages and financial chaos. This might sound as ridiculous as any low-grade science fiction plot, but in 1975 nylon-eating bacteria were discovered. Since then, research into the most efficient methods of recovering metals from waste electronics has led to experiments in bioleaching. As well as bacteria, the fungus Aspergillus niger has been shown to break down the metals used in circuits.

Solution: As bioleaching is potentially cheaper and less environmentally damaging it could become widespread. Therefore it will be up to the process developers to control their creations. Fingers crossed, then!

8. NCB. Conventional weapons may be more commonplace, but the development of nuclear, chemical and biological weapons by rogue states and terrorist organisations is definitely something to be worried about. The International Atomic Energy Agency has a difficult time keeping track of all the radioactive material that is stolen or goes missing each year. As the fatal 1995 release of the nerve agent sarin on the Tokyo subway shows, terrorists are not unwilling to use weapons of mass destruction on the general public.

Solution: There's not much I can suggest here. Let's hope that the intelligence services can keep all the Dr Evils at bay.

7. Jurassic Park for real. At Harvard last year a chicken embryo's genes were tweaked in such a way as to create a distinctly dinosaurian snout rather than a beak. Although it may be some time before pseudo-velociraptors are prowling (high-fenced) reserves, what if genome engineering were used to develop Homo superior? A 2014 paper from Michigan State University suggests that both intellectual and physical improvements via CRISPR-Cas9 technology are just around the corner.

Solution: If the tabloids are to be believed (as if) China will soon be editing human genomes, to fix genetic diseases as well as generating enhanced humans. Short of war, what's to stop them?

[Image: Planet Earth wrapped as a Christmas present]

6. DIY weaponry. The explosion in 3D printers for the domestic market means that you can now make your own handguns. Although current designs wear out after a few firings, ammunition is also being developed that won't wear out the weapon so quickly. Since many nations have far more stringent gun laws than the USA, an increase in weaponry among the general public is just what we don't need.

Solution: how about smart locking systems on printers, so that they cannot produce components that could be used to build a weapon? Alternatively, there are now 3D printer models that can manufacture prototype bulletproof clothing. Not that I'd deem that a perfect solution!

5. Chemical catastrophe. There are plenty of chemicals no longer in production that might affect humanity or our agriculture. These range from the legacy effects of polychlorinated biphenyls (PCBs), known carcinogens, to the ozone depletion caused by CFCs, which could linger in the stratosphere for another century; the latter doesn't just result in increased human skin cancer - crops are also affected by the increased UV-B.

Solution: we can only hope that current chemical development now involves more rigorous testing and government regulation than that accorded to PCBs, CFCs, DDT, et al. Let's hope all that health and safety legislation pays off.

4. The energy crisis. Apart from the obvious environmental issues around fossil fuels, the use of fracking generates a whole host of problems on its own, such as the release of methane and contamination of groundwater by toxic chemicals, including radioactive materials.

Solution: more funding is required for alternatives, especially nuclear fusion (a notoriously expensive area to research). Iceland generates 100% of its electricity from renewables, whilst Portugal managed four consecutive days in May this year via wind, hydro, biomass and solar energy sources. Greater recycling and more incentives for buying electric and hybrid vehicles wouldn't hurt either!

3. Forced migration. The rise in sea levels due to meltwater means that it won't just be Venice and small Pacific nations that are likely to become submerged by the end of the century. Predictions vary widely, but all point in the same direction: even an increase of 150 mm would be likely to affect over ten million people in the USA alone, with probably five times that number in China facing similar issues.

Solution: a reduction in greenhouse gas emissions would seem to be the thing. This requires more electric vehicles and fewer methane-generating livestock. Arnold Schwarzenegger's non-fossil fuel Hummers and 'Less meat, less heat, more life' campaign would appear to be good promotion for the shape of things to come - if he can be that progressive, there's hope for everyone. Then of course there's the potential for far more insect-based foodstuffs.

2. Food and water. A regional change in temperature of only a few degrees can seriously affect crop production and the amount of water used by agriculture. Over 700 million people are already without clean water, with shortages affecting agriculture even in developed regions - Australia and California spring to mind. Apparently, it takes a thousand litres of water to generate a single litre of milk!

Solution: A few far-sighted Australian farmers are among those developing methods to minimise water usage, including some low-tech schemes that could be implemented anywhere. However, the really obvious solutions would be to reduce the human population and eat food that requires less water. Again, bug farming seems a sensible idea.

1. Preventing vegegeddon. A former professor at Oxford University told me that some of his undergraduates had problems relating directly to others, having grown up in an environment where communication via electronic interfaces is commonplace. If that's the problem facing the intellectual elite, what hope for the rest of our species? Physical problems such as poor eyesight are just the tip of the iceberg: the human race is in severe danger of degenerating into low-attention 'sheeple' (as they say on Twitter). Children are losing touch with the real world, being enticed into virtual environments that on the surface are so much more appealing. Without knowledge or experience of reality, even stable democracies are in danger of being ruled by opportunistic megalomaniacs, possibly in orange wigs.

Solution: Richard Louv, author of Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder, suggests children require unstructured time out of doors in order to gain an (occasionally painful) understanding of the real world; tree-climbing, fossicking, etc. Restricting time on electronic devices would seem to go hand in hand with this.

Well, that about wraps it up from me. And if the above seems somewhat scary, then why not do something about it: wouldn't working for a better future be the best Christmas present anyone could ever give?

Thursday, 24 November 2016

Unwanted aliens: is a predator-free New Zealand realistic by 2050?

In a moment of half-baked madness worthy of Donald Trump, the New Zealand Government has announced a plan to make the nation predator-free by 2050. As can be imagined this statement has attracted a wide range of opinions, even from across various conservation groups. These vary from the extremely optimistic viewpoint of Forest and Bird advocacy manager Kevin Hackwell, who claims it is achievable even earlier, to the Green Party's conservation spokesman Kevin Hague, who publicised a University of Auckland study estimating the project's budget at an astonishing if not untenable NZ$9 billion.

With the government prepared to provide just one-third of the plan's funding, it's difficult to imagine which private sector companies would be willing to supply the lion's share over the next three decades. As expected, the response of New Zealand's political opposition has been to pour very cold water on the plan, including the claim that no nation has ever managed to wipe out its population of rats (Hamelin and its Pied Piper notwithstanding).

One of the most essential questions is: what is defined as a pest in the context of this proposal? The relevant Department of Conservation (DoC) page names three principal animal pests - possums, rats and stoats - with a further page expanding the list to other introduced animals and freshwater fish, including cats and dogs (both domestic and feral). Some of the species listed were deliberate introductions, mainly in the nineteenth century, whilst others came in accidentally under the radar - New Zealand's biosecurity protocols not always having been as draconian as they are now.

A few statistics offer a frightening idea of the scale required: as of 2001 it was estimated that there were seventy million possums in New Zealand, eating 21,000 tonnes of vegetation every night. Needless to say, much of this material consists of endemic species such as pohutukawa and southern rata trees. This then has a knock-on effect on the native fauna that feed or nest on these species - which is of course in addition to their being direct prey for the possum.
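A quick back-of-the-envelope check makes those figures easier to grasp (a rough sketch in Python; the two input numbers are the 2001 estimates quoted above, and the annual figure assumes the nightly rate holds year-round):

```python
# Back-of-the-envelope check on the 2001 possum estimates quoted above.
POSSUMS = 70_000_000          # estimated possum population, 2001
TONNES_PER_NIGHT = 21_000     # vegetation eaten nationwide each night

# Per-animal intake: convert tonnes to kilograms, divide across the population.
kg_per_possum = TONNES_PER_NIGHT * 1000 / POSSUMS
print(f"~{kg_per_possum:.2f} kg of vegetation per possum per night")  # ~0.30 kg

# Annual total, assuming the nightly figure holds year-round.
tonnes_per_year = TONNES_PER_NIGHT * 365
print(f"~{tonnes_per_year:,} tonnes per year")  # ~7,665,000 tonnes
```

Around 300 g per animal per night sounds modest, but multiplied across the whole population it comes to several million tonnes of native vegetation every year.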

Although cats and dogs might be thought of more as pets than pests, even in low numbers they can be devastating to native wildlife. A classic example is the extinction of the Stephens Island wren thanks to a number of feral cats, whilst it is thought that one stray dog managed to kill more than five hundred large brown kiwi in the Waitangi State Forest in less than a year.

DoC's Battle for our Birds scheme relies on aerial drops of poison and ground baits/traps to eradicate the key non-native pests. This year their target area was almost 900,000 hectares; to give an indication of the increase in scale necessary for a nationwide eradication, New Zealand is close to 27 million hectares in total. Perhaps the much-misused term 'paradigm shift' could be safely applied in this circumstance?
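To put that scale-up in numbers, a minimal sketch using the two hectare figures quoted above:

```python
# Comparing the current Battle for our Birds target area with the whole country.
TARGET_HA = 900_000        # approximate 2016 treatment area, in hectares
NZ_TOTAL_HA = 27_000_000   # approximate total land area of New Zealand

coverage = TARGET_HA / NZ_TOTAL_HA
scale_up = NZ_TOTAL_HA / TARGET_HA
print(f"Current programme covers ~{coverage:.1%} of the country")   # ~3.3%
print(f"Nationwide eradication implies a ~{scale_up:.0f}x scale-up")  # ~30x
```

A thirty-fold increase in treated area is the kind of jump that justifies the phrase, even before accounting for the harder-to-reach terrain that the remaining hectares include.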

At this point it should be mentioned that there are varied opinions as to what the government's planned outcome is. After all, there have been humans living in New Zealand for over seven centuries, so there is little chance of any but the most remote locales returning to a pristine 'natural' wilderness, even if we knew exactly what that meant. Having said that, the Pleistocene Park project in Russia is attempting something along similar lines: a small region of north Siberian tundra is being converted into glacial-period steppe, using musk ox and other large animals as surrogates for extinct megafauna such as mammoth and woolly rhinoceros. The resulting flora appears to be much more diverse and interesting than the unmanaged wilderness surrounding it - ironically the antithesis of what one would expect or hope for from untouched versus deliberately altered landscapes!

Then there's the scale issue: whilst possums, rats and mustelids are relatively easy to track and observe, small species such as wasps and Argentine ants are far more difficult to locate, never mind eradicate. Although they don't inflict as much obvious damage on the native flora and fauna, they can nonetheless cause fundamental changes to the ecosystem. Wasps, for example, eat honeydew, an important food source for lizards and native birds such as kaka.

It isn't just insects that would be tricky to wipe out. The rainbow or 'plague' skink was accidentally introduced from Australia about half a century ago and now seems ubiquitous in Auckland; I've seen it everywhere from volcanoes to paddocks, gardens to garages, even inside a bookshop. Thanks to much faster reproduction and maturation rates than its native equivalents, it appears to be rapidly out-competing them.

One issue that prevents a complete turning back of the clock is the extinction of dozens of species since the arrival of humans in the country. How can the ecosystem, especially its food webs, maintain a long-term balance with key species missing? No one is suggesting we bring in cassowaries to replace the nine species of moa. Of course, being large creatures they were probably none too numerous, yet there is a hypothesis that they may have been involved in an evolutionary arms race with lancewood, the juvenile trees being well protected against browsing moa.

Therefore any attempt to preserve a largely native ecosystem will need to ensure the food webs are fully functional, with plenty of indigenous pollinators such as short-tailed bats and kereru (the native pigeon). Key native species need to be identified and preserved just as much as introduced ones need to be removed. This in turn raises an obvious point: since evolution is an ongoing process, are we attempting to freeze the environment at a particular snapshot in time rather than allowing nature to take its course? Even accounting for punctuated equilibrium, natural selection hasn't suddenly stopped in New Zealand any more than it has elsewhere.

The pest-free project will presumably need to tackle species in a certain order: if mustelids and feral cats are eliminated first then rats will proliferate, whilst if rats are removed first, the predators that relied on them will be forced to look for alternative food sources; doubtless native birds would form the mainstay of these.

As I have discussed elsewhere, it shouldn't just be the enemies of the native poster species that are targeted. There are plenty of critters less famous than the kakapo parrot and that ancient reptile the tuatara that deserve some attention too, with the endemic weta an obvious example (over twenty percent of weta species are currently under threat). Invertebrates play a largely unappreciated role in nutrient recycling and waste disposal, as well as appearing on the menu of more conspicuous animals. Considering that the takahe, the largest species of swamp hen, was thought extinct for half a century, perhaps we shouldn't be surprised at how little is known about the size and condition of native creepy-crawly populations. However small and insignificant we might judge them, we ignore their loss at our peril.

Also often overlooked are the native freshwater creatures, which face a large number of invasive species that compete with or prey upon them. A key example is the aggressive gambusia, a Mexican fish introduced to eat mosquito larvae - which of course it fails to do. Interestingly enough, the DoC website excludes some introduced species from its list of pests: salmon and trout, for example, are categorised as 'sports fish'. Is economics therefore the government's primary motive for the pest eradication plan, rather than good old-fashioned conservation for its own sake? After all, the extremely rare takahe was once given second place to herds of elk that had been introduced to serve as big game animals.

There may be something in this. Mainstream politicians are renowned for their lip-service commitment to environmental issues. Could it be that, in the wake of the highly negative stories earlier this year concerning exceeded fishing quotas and river pollution, the government is fighting to redeem New Zealand's '100% Pure' brand image? In addition, agriculture might benefit from an increase in native species' populations: a Federated Farmers of NZ estimate puts the pollination services provided by native bees at NZ$4.5 billion per year!

Finally, we get to flora. As Bec Stanley, a curator at Auckland's Botanic Gardens, is keen to point out, most people suffer from plant blindness compared to their interest in animals. There are thought to be around three invasive plant species for every four natives, with old man's beard, gorse, ragwort and nightshade amongst the best-known culprits. These can smother and kill native plants, thus depriving indigenous animals of food. Despite vegetation being vital to the ecosystem, the war on introduced plants seems underdeveloped compared to that against non-native animals.

It doesn't take much to upset the balance of at least a local-scale environment. The surviving remnants of the mighty kauri forest currently face a disease thought to be caused by an introduced water-mould pathogen - a clear case of David conquering Goliath. Without careful consideration, the project to rid New Zealand of introduced pest species could end up doing more harm than good. The motives are potentially dubious and the research chronically under-funded. It remains to be seen whether there is the willpower to see it through, or whether it is just one more piece of political rhetoric that evaporates by the next election. Personally, I'm in favour of the idea, but uncertain of how realistic it is. Regardless, the citizens of New Zealand need to do their best, lest many more species join the ranks of the moa, huia, adzebill and many, many others. After all, who wants their children living in an environment dominated by feral pigeons, rats and possums?

Thursday, 27 October 2016

Murky waters: why is the aquatic ape hypothesis so popular?

Whilst not in the same class as the laughably abysmal Discovery Channel mockumentaries on the likes of mermaids and extant (rather than extinct) megalodon, the recent two-part David Attenborough BBC Radio 4 documentary The Waterside Ape has left me gritting my teeth...grrr.

The programme has confirmed something I suspected from his 2010 BBC television series and associated book, First Life: namely, that the style of his exposition takes priority over the substance of his material. I'll quickly recap the howler he made in an episode of First Life, ironically one that featured renowned trilobite expert Richard Fortey, albeit in a different sequence. When discussing trilobites, Sir David briefly mentions that they get their name from having three segments from front to rear - head, body and pygidium (tail) - which is totally wrong!

The name is the give-away: tri-lobe refers to the three segments across the width of the body - a central lobe and two lateral lobes. Many creatures have head, body and tail segmentation, so that arrangement would be far from unique to trilobites. I find this example of incorrect information rather discomforting, especially from someone like Sir David who has been a fan of trilobites since childhood. You have to wonder why experts aren't invited to give BBC science and nature documentaries the once-over before broadcast, just in case any gaffes have slipped through to the final cut.

The issue, then, is this: if we non-professionals believe the content espoused by such senior figures in the field of science communication - and if such material goes without basic error-checking from professionals - how is the public to receive a half-decent science education? Of course science isn't a body of knowledge but a toolkit of investigative techniques, yet few members of the general public have the ability to test hypotheses themselves, or access to the jargon-filled original scientific papers. So relying on books and media from distinguished communicators is the primary way of increasing our STEM (Science, Technology, Engineering and Mathematics) knowledge.

Back to The Waterside Ape. The hypothesis is an old one, dating back to marine biologist - and, let's face it, oddball theorist - Sir Alister Hardy's first, unpublished speculations in 1930. However, the idea didn't achieve widespread dissemination until Elaine Morgan began to publicise it in the early 1970s. Otherwise known as a fiction writer, Morgan produced work on the aquatic ape hypothesis that was originally considered a feminist critique rather than particularly serious science, bearing in mind that she lacked professional training or experience in evolutionary biology.

Perhaps thanks to dissemination via the World Wide Web, her pro-aquatic ape books have become ever more popular over the past twenty years. This is in spite of the ever-increasing number of hominin fossils and sophisticated analytical techniques that have offered little support for the idea. I'm not going to examine the evidence for and against the hypothesis, since that has been done by many others and I'm marginally less qualified to assess it than Elaine Morgan was. Instead, I'm more interested in how and why the idea has maintained popular appeal when the general consensus among specialists is that it is profoundly incorrect.

Could it be that the engaging quality of Morgan's writing obscures a lack of dry (geddit?) analysis of a subject that could at best be deemed controversial - and thus fools the general readership as to its validity? Or is there more to it than that? The BBC seems to have maintained an ongoing interest in supporting her work over the past two decades.

Indeed, The Waterside Ape is not David Attenborough's first foray into the idea. He made another two-part BBC Radio 4 series called Scars of Evolution back in 2005, which included some of the same interviews as the recent programmes. The BBC and Discovery Channel also collaborated in 1998 on a television documentary favouring the hypothesis called, surprisingly enough, The Aquatic Ape, albeit without Attenborough's involvement.

A key argument that I'm sure gains public support is that of a radical - and female - outsider being shunned by the conservative, male-dominated establishment, with Elaine Morgan pitted against the reactionary old guard of palaeontologists, biologists, etc. Her plight has been described in the same vein as meteorologist Alfred Wegener's battle with orthodox geology between the world wars, but in Wegener's case his hypothesis of continental drift lacked a mechanism until plate tectonics was formulated several decades later. As for the aquatic ape, there seems to be a suite of models describing a gamut of ideas, from the uncontroversial speculation of hominins wading for iodine- and omega-3-rich foodstuffs (promoting brain growth) to human ancestors being Olympic-class ocean swimmers who would feel at home in a Discovery Channel mermaid mockumentary.

We shouldn't ignore the emotive aspects of the hypothesis, which the various programmes have described as a "fascinating idea" that would be "lovely to confirm". Since most people still think of dolphins as innocent, life-saving and cute (when in fact they play brutal cat-and-mouse games with live porpoises) could this be a psychological attempt to salvage something of our own rapacious species?

Elaine Morgan admitted that her first book was a response to her annoyance with the 'killer ape' theories of the 1960s, as espoused in Robert Ardrey's seminal 1961 volume African Genesis. In these post-modern, politically correct times of gatherers first and hunters second, Raymond Dart and Robert Ardrey's once-influential machismo ape-man has fallen from favour. Unfortunately, the famous Ardrey-influenced Dawn of Man sequence in 2001: A Space Odyssey promotes just such a viewpoint, so perhaps it isn't any wonder that supporting a more tranquil aquatic ancestry might appear an easy way to bring 21st-century sensitivities to a world reeling from constant violence.

Another possible reason for the hypothesis' widespread support is that it relies on what appears to be an impressive accumulation of facts in the Darwinian mould, without recourse to difficult mathematics or sophisticated technical jargon. For those unable to get a clear understanding of major contemporary science (Higgs boson, anyone?) the idea of aquatic ape ancestors is both romantic and easy to digest, if the supporting evidence is taken en masse and the individual alternatives for each biological feature ignored or undeclared.

Clearly, whoever thinks that science is detached from emotion should think again when considering the aquatic/waterside/paddle-boarding ape. Although on the surface a seductive idea, the collection of proofs is selective, inadequate and in some cases just plain wrong. It might be good enough for the sloppy pseudo-scientific archaeology of Graham Hancock and Erich von Daniken, but good science needs rather more to go on. Yes, there are some intriguing nuggets, but as Dr Alice Roberts said in her critique of the recent Attenborough radio series, science is about evidence, not wishful thinking. Unfortunately, the plethora of material contains rather more subtleties than trilobite nomenclature, so I can only sigh at just how many equally poorly concocted ideas may be sloshing around the world of popular science communication. Come on, Sir David, please read past the romance and dig a bit deeper: the world needs people like you!

Monday, 26 September 2016

Mopping up spilt milk: pollution in the New Zealand dairy sector

It's been slow to dawn on New Zealanders, but for a country that prides itself on a '100% Pure' image our environmental pollution record is fairly appalling - and shows few signs of improving. Politicians who point to the large percentage of the nation's electricity generation coming from renewable sources, not to mention the slow but steady growth in hybrid vehicles, are completely missing the point: it has been claimed that over half of New Zealand's greenhouse gas emissions emanate from agribusiness.

Although the number of sheep in the country has plummeted from a 1982 peak of around 70 million to fewer than 30 million last year, cattle numbers continue to rise. There are about 3.6 million livestock on beef farms and circa 6.5 million dairy cattle. The latter sector generates twenty percent of New Zealand's exports and seven percent of its GDP, so it forms a substantial component of the kiwi economy. But with plans to double the country's dairy production by 2025, the term 'sustainable development' appears to be, well, unsustainable.

Since cattle create as much waste product as fourteen humans, it's not difficult to imagine some of the more obvious forms of dairy pollutant, smell and all. As New Zealand dung beetles are primarily forest dwellers there have been trials of introduced dung beetle species to help clean up the waste, with a reduction in nitrous oxide emissions from the soil and a lowering of cattle disease as side benefits. However, pastoral poo is only one element in the catalogue of pollutants caused by dairy farming.
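That human-equivalence figure gives a feel for the scale of the effluent problem. A rough sketch: the herd size comes from the statistics above, while the population figure is my own approximation for 2016 and is an assumption, not from the sources quoted here:

```python
# Rough human-equivalent of New Zealand's dairy effluent output.
DAIRY_CATTLE = 6_500_000     # approximate national dairy herd (see above)
HUMANS_PER_COW = 14          # waste output per cow, in human-equivalents
NZ_POPULATION = 4_700_000    # approximate NZ population, 2016 (my assumption)

waste_equiv = DAIRY_CATTLE * HUMANS_PER_COW
ratio = waste_equiv / NZ_POPULATION
print(f"Dairy herd waste ~= {waste_equiv:,} human-equivalents")  # 91,000,000
print(f"That's roughly {ratio:.0f} times the country's own population")
```

In other words, on these figures the dairy herd alone produces the sewage output of a country of around ninety million people - an order of magnitude more than New Zealand's actual population.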

Last summer I was taken to an outdoor swimming hole not far from Wanganui, consisting of a rectangular concrete-lined pool situated on the edge of a forest. I was informed that children had swum there until a decade or so ago, but no more: several signs warned that the water is contaminated and no longer safe for humans. This story has been repeated throughout New Zealand, with agriculture being by far the most common culprit. It isn't just artificial environments that have this problem; reports suggest that within the past twenty years about two-thirds of monitored swimming areas within rivers have become too polluted for bathing. And that's just for people; there's far less concern about the effects on river fauna and flora.

Although environmentalists have been issuing warnings for years, not enough has been done to alleviate the problem. Last month approximately five thousand inhabitants of Havelock North were taken ill due to tap water contaminated by campylobacter. The source was a series of bores whose contamination the director of the Infectious Diseases Research Centre at Massey University, Professor Nigel French, put down to pollution from sheep and cattle. Sources of contamination could include the carcases of dead livestock, as well as faecal matter getting into the waterways that feed unchlorinated - and therefore at-risk - tap water supplies.

In fact, the outbreak appears to be the tip of the iceberg. Despite some hundreds of cases of illegal effluent discharge brought against New Zealand farmers each year, many more escape prosecution. It has to be said this seems to be a regular occurrence for the Ministry for Primary Industries, judging by the recent reports of their waiving prosecutions for commercial fishing vessels caught flouting bycatch and dumping laws. Turning a blind eye seems to be the order of the day when it comes to protecting food production - or at least the food producers. This philosophy seems to be driven by those who clearly have little understanding of the complexity - and at times fragility - of food webs. Not so much short-term thinking as profound myopia!

In addition to the organic matter there are chemical pollutants that can find their way into water supplies situated close to farms. Since the 1990s, the New Zealand Ministry for the Environment has been monitoring ground water for nitrates and has found levels substantially above those recommended for drinking water. Although chemical fertiliser has been blamed in addition to livestock effluent, environmental mapping suggests the latter is the primary cause, since the polluted areas coincide closely with the regions of most intensive dairy production.

As well as polluting waterways dairy farmers have also been caught stealing billions of litres of water each year from rivers and aquifers, especially in the Canterbury region. Whilst not a form of pollution per se, this is obviously somewhat lacking in the environmentally-friendly stakes. The deforestation of low-lying plains for cattle grazing is also a source of pollution, as the lack of tree roots, besides allowing greater flooding, can generate increased run-off into rivers. This polluted water can lead to algal blooms, lowering oxygen levels and so endangering freshwater fish. That might not sound of any great concern except to diehard anglers, but for any whitebait fans, four of the five Galaxiidae species whose young form this delicacy are now said to be threatened.

The systematic destruction of forests to make way for pastoral land use has been repeatedly raised as a concern not just by environmental organisations but by the New Zealand Ministry of Agriculture and Forestry (MAF) itself. Their 2006 report claimed close to half a million hectares of the nation's forests were at risk of conversion to land for cattle grazing.

In addition, overseas forests are also affected: since 2008 the amount of palm kernels imported into New Zealand as a dairy cattle feed supplement has doubled to over 2 million tons per annum. This accounts for about twenty-five percent of global production and comes at the expense of destruction of rainforests in nations including Indonesia and Malaysia. Although the state-owned farm company Landcorp Farming Ltd is in the process of moving to a different supplement over the next year or so, the dairy giant Fonterra has not announced similar intentions. What's wrong with those guys: a surfeit of Milton Friedman in their formative years?

Having covered solids and liquids, it's time to move on to gas. As I've mentioned on various occasions, methane is a potent greenhouse gas. It was therefore shocking to discover that per capita, New Zealand has the greatest annual methane emission rate worldwide, with the gas accounting for over forty percent of the country's greenhouse gas emissions. The methane emission from dairy cattle alone has continually increased over the past quarter century, although the reported size of that increase varies from ten percent to a whopping fifty percent or so. Perhaps that's not surprising, considering each cow can generate up to 500 litres of methane per day!
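To get a feel for what that per-animal figure means at a national scale, here's a rough back-of-the-envelope calculation. Note that the herd size is purely my own illustrative assumption, not a figure from any official source, and I've used the upper per-cow estimate quoted above:

```python
# Rough estimate of national dairy-herd methane output, assuming
# (illustratively) a herd of 5 million dairy cattle and the upper
# figure of 500 litres of methane per animal per day quoted above.

LITRES_PER_COW_PER_DAY = 500        # upper per-animal estimate
HERD_SIZE = 5_000_000               # assumed herd size - illustrative only
CH4_DENSITY_KG_PER_M3 = 0.716       # methane at about 0 degrees C and 1 atm

litres_per_year = LITRES_PER_COW_PER_DAY * 365 * HERD_SIZE
cubic_metres_per_year = litres_per_year / 1000
tonnes_ch4_per_year = cubic_metres_per_year * CH4_DENSITY_KG_PER_M3 / 1000

print(f"~{tonnes_ch4_per_year / 1e6:.2f} million tonnes of methane per year")
```

Even with these ball-park numbers the result comes out at well over half a million tonnes of methane a year, which gives some idea of why dairy looms so large in the national greenhouse inventory.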

There is some recent cause for hope, with various trials under way to reduce bovine emissions. These range from vaccination to selective breeding to diets based on forage rape, with the latter showing that the change in feed affects fermentation - and therefore reduces methane production - in sheep. However, it wouldn't hurt to see the Government funding more research in this area: one widely-reported paper last year was Massey University's The New Zealand Dairy Farming: Milking Our Environment for All its Worth, which received much criticism from the dairy sector when it was revealed to consist primarily of a student thesis.

It's very easy to become depressed by such deleterious effects coming from just one sector. Of course no nation can afford to rest on its laurels: we cannot turn the clock back. The halcyon image of bucolic ruralism is a myth perpetuated by those who have never worked on the land, and farmers deserve the benefits of modern technology in their work as much as anyone. The development of sophisticated tools and software can aid the dairy sector in preserving the environment, as long as there is enough public money to support this eco-friendly research. But Government funding for this type of sustainable development appears to be sadly lacking. Doesn't it make sense that those who run God's Own Country should try a little harder to prove that the 100% Pure tagline isn't just marketing spin?

Friday, 26 August 2016

The benefit of hindsight: the truth behind several infamous science quotes

With utmost apologies to Jane Austen fans, it is a truth universally acknowledged that most people misinterpret science as an ever-expanding corpus of knowledge rather than as a collection of methods for investigating natural phenomena. A simplistic view for those who adhere to the former misapprehension might include questioning science as a whole when high-profile practitioners make an authoritative statement that is proven - in a scientific sense - to be incorrect.

Amongst the more obvious examples of this are the numerous citations from prominent STEM (Science, Technology, Engineering and Mathematics) professionals that are inaccurate to such an extreme as to appear farcical in light of later evidence. I have already discussed the rather vague art of scientific prognostication in several connected posts but now want to directly examine several quotations concerning applied science. Whereas many quotes probably deserve the contempt that popular opinion metes out to them, I believe the following require careful reading and knowledge of their context before any meaningful judgement can be attempted.

Unlike Hollywood, STEM subjects are frequently too complex for simple black versus white analysis. Of course there have been rather risible opinions espoused by senior scientists, many of which - luckily - remain largely unknown to the wider public. The British cosmologist and astronomer Sir Fred Hoyle has a large number of these just to himself, from continued support for the Steady State theory long after the detection of cosmic microwave background radiation, to the even less defensible claims that the Natural History Museum's Archaeopteryx fossil is a fake and that flu germs are really alien microbes!

Anyhow, here's the first quote:

1) Something is seriously wrong with space travel.

Richard van der Riet Woolley was the British Astronomer Royal at the dawn of the Space Age. His most infamous quote is the archetypal instance of Arthur C. Clarke's First Law:  "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."

Although a prominent astronomer, van der Riet Woolley had little knowledge of the practical mechanics that would be required for spaceflight. By the mid-1930s the British Interplanetary Society had developed detailed (although largely paper-only) studies into a crewed lunar landing mission. In 1936 van der Riet Woolley publicly criticised such work, stating that the development of even an unmanned rocket would present fundamental technical difficulties. Bear in mind that this was only six years before the first V2 rocket, which was capable of reaching an altitude of just over 200km!

In 1956, only one year before Sputnik 1 - and thirteen years prior to Apollo 11 - the astronomer went on to claim that near-future space travel was unlikely and a manned lunar landing "utter bilge, really". Of course this has been used as ammunition against him ever since, but the quote deserves some investigation. Van der Riet Woolley went on to reveal that his primary objection appeared to have changed (presumably post-V2 and its successors) from an engineering problem to an economic one, stating that it would cost as much as a "major war" to land on the moon.

This substantially changes the flavour of his quote, since it is after all reasonably accurate. In 2010 dollars, Project Apollo had an estimated budget of about US$109 billion - incidentally about 11% of the cost of the contemporary Vietnam War. In addition, we should bear in mind that a significant amount of the contractors' work on the project is said to have consisted of unpaid overtime. Is it perhaps time to reappraise the stargazer from a reactionary curmudgeon to an economic realist?

Indeed, had Apollo been initiated in a subsequent decade, there is reasonable evidence to suggest it would have failed to leave the ground, so to speak. The uncertainty of the post-Vietnam and Watergate period, followed by the collapse of the Soviet Union, suggest America's loss of faith in technocracy would have effectively cut Apollo off in its prime. After all, another colossal American science and engineering project, the $12 billion particle accelerator the Superconducting Super Collider, was cancelled in 1993 after being deemed unaffordable. Yet up to that point only about one-sixth of its estimated budget had been spent.

In addition, van der Riet Woolley was not alone among STEM professionals: for three decades from the mid-1920s the inventor of the vacuum tube, Lee De Forest, is said to have claimed that space travel was impractical. Clearly, the Astronomer Royal was not an isolated voice in the wilderness but part of a large consensus opposed to the dreamers in the British Interplanetary Society and their ilk. Perhaps we should allow him his pragmatism, even if it appears a polar opposite to one of Einstein's great aphorisms: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science..."

Talking of whom…

2) Letting the genie out of the bottle.

In late 1934 an American newspaper carried this quotation from Albert Einstein: "There is not the slightest indication that (nuclear energy) will ever be obtainable. It would mean that the atom would have to be shattered at will." This seems rather amusing, considering the development of the first self-sustaining nuclear chain reaction only eight years later. But Einstein was first and foremost a theorist, a master of the thought experiment, his father's work in electrical engineering having left little practical mark on the son. There is obviously a vast difference between imagining riding a beam of light and the practical difficulties of assembling brand new technologies with little in the way of precedent. So why did Einstein make such a definitive prediction?

It may also have been wishful thinking on Einstein's part; as a pacifist he would have dreaded the development of a new super weapon. As the formulator of the equivalence between mass and energy, he could have felt in some way responsible for initiating the avalanche that eventually led to Hiroshima and Nagasaki. Yet there is no clear path between E=mc2 and a man-made chain reaction; it took a team of brilliant experimental physicists and engineers in addition to theorists - plus an immense budget of $26 billion in 2016 dollars - to achieve a practical solution.
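To see why the equivalence was so tantalising - and so frightening - a quick back-of-the-envelope calculation helps. Converting just one gram of matter entirely into energy yields:

```latex
E = mc^2 = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m\,s^{-1}})^2
  \approx 9 \times 10^{13}\,\mathrm{J}
```

At roughly 4.2 x 10^12 joules per kiloton of TNT, that single gram is equivalent to around twenty kilotons - of the same order as the Hiroshima bomb, which is estimated to have converted well under a gram of its fissile material into energy.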

It is hardly as if the good professor was alone in his views either, as senior officials also doubted the ability to harness atomic fission for power or weaponry. In 1945 when the Manhattan Project was nearing culmination, the highest-ranking member of the American military, Fleet Admiral William Leahy, apparently informed President Truman that the atomic bomb wouldn't work. Perhaps this isn't as obtuse as it sounds, since due to the level of security only a very small percentage of the personnel working on the project knew any of the details.

Leahy clearly knew exactly what the intended outcome was, but even as "an expert in explosives" had no understanding of the complexity of engineering involved. An interesting associated fact is that despite being a military man, the Admiral considered the atomic bomb unethical for its obvious potential as an indiscriminate killer of civilians. Weapons of mass destruction lack any of the valour or bravado of traditional 'heroic' warfare. Is it possible that this martial leader wanted the bomb to fail for moral reasons, a case of heart over mind? In which case, is this a rare example in which the pacifism of the world's best-known scientist was in total agreement with a military figurehead?

Another potential cause is the paradigm shift that harnessing the power of the atom required. In the decade prior to the Manhattan Project, New Zealand physicist Ernest Rutherford had referred to the possibility of man-made atomic energy as "moonshine" whilst another Nobel laureate, American physicist Robert Millikan, had made similar sentiments in the 1920s. And this from men who were pioneers in understanding the structure of the atom!

As science communicator James Burke vividly described in his 1985 television series The Day the Universe Changed, major scientific developments often require substantial reappraisals in outlook, seeing beyond what is taken for granted. The cutting edge of physics is often described as being ruled by theorists in their twenties; eager young turks who are more prepared to ignore precedents. When he became a pillar of the establishment, Einstein ruefully commented: "To punish me for my contempt for authority, fate made me an authority myself."

Perhaps then, such fundamental shifts in technology as the development of space travel and nuclear fission require equally revolutionary changes in mindset, and we shouldn't judge the authors of our example quotes too harshly. Then again, if you are an optimist, Clarke's First Law might seem applicable here, in which case authority figures with some knowledge of the subject in hand should take note of the ingenuity of our species before pronouncing. If there is a moral to this story, it is that - other than for the speed of light in a vacuum and the Second Law of Thermodynamics - you should never say never...

Wednesday, 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses, not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo the usage of anti-bacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend my own method, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot-desking. Therefore something is clearly going wrong, regardless of approach!

The best-known of the super bugs has to be Methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks unconnected to healthcare environments. Popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - therefore only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter community-associated or CA-MRSA is therefore at least as great a risk as the hospital variant, often affecting younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to totally eradicate by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice, I might add, for anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase of resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be inoculated - frankly I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can be used to promote growth implies profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics, four in 1999 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive method would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough anymore; after all, if medical practice wants to prevent the spread of resistant bacteria it needs to minimise the use of antibiotics.

The effects agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining immunity. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its anti-bacterial properties - guess that proves I'm not a research chemist, then!). In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Tuesday, 21 June 2016

Military intelligence: how martial sci-tech does science few favours

I recently read an article about the USA's latest aircraft carrier the USS Gerald R. Ford that contained two bewildering facts: that at a combined research and construction cost of around US$18 billion it is the most expensive warship ever built; and that although only the first of three ships to be built in the class - and with an intended lifespan of half a century - it may already be obsolete.

So if potential aggressor nations now have the anti-ship missile technology to sink the carrier, is it little more than an enormous waste of taxpayer funds? There are reports of war games and simulations over the past three decades which fundamentally undermine the Victorian notion of technological progress - that bigger, stronger, faster equals better. This is particularly apt if your opponent uses 'unfair' and/or 'underhand' tactics such as stealth systems and guerrilla strategies. Then why are these colossal projects still being funded?

The USS Gerald R. Ford is merely the (admittedly very large) tip of an enormous iceberg concerning military expenditure of recent decades. Just to drive the point home, here's a few other recent examples:
  1. The US Navy's aircraft carrier-version of the Lightning II Joint Strike Fighter is the F-35C, with some estimates suggesting each combat-ready aircraft costs up to $337 million.
  2. The US Air Force's F-22 Raptor programme was shut down after only 187 operational aircraft were built, as the price per airframe was even higher, around $350 million.
  3. The apotheosis of combat aircraft has to be the B-2 Spirit stealth bomber. Only 21 were ever built, at a whopping $737 million each, excluding the research and development costs, which may double or even triple this number.
  4. So as to not seem unfairly biased against the USA, other nations also have their share of military expenditure. For example, South Korea's K2 Black Panther is the most expensive main battle tank ever built, with per-unit costs of US$8.5 million each.
So who's to blame for all this? The USS Gerald R. Ford for example was approved during George W. Bush's administration but is only nearing completion eight years after he left office. At least in democracies, politicians usually come and go in less than a decade whilst defence contractors last much longer. Could the armaments sector be duping administrations into giving them a lifeline? A large proportion of manufacturing has migrated to developing nations but due to the sensitive nature of the sector, advanced military technology is one of the few areas still concentrated within the developed West.

It's difficult to collate anything like exact figures, but the proportion of STEM (Science, Technology, Engineering and Mathematics) professionals worldwide who work on military projects is frequently given as 20% to 25%. Is it feasible that this high level of involvement in an area that is both secretive and horrendously expensive may be counter-productive to the public's attitude to science in general?

After all, no other sector has access to such enormous amounts of taxpayers' funds without being subject to some form of public scrutiny. Then again, since the early 1980s we have been sold a vision of military technology that is a mostly one-sided glorification of armaments and the requirements for ever-increasing expenditure in the name of freedom.

How many mainstream Hollywood movies since 1986's Top Gun - including plenty of sci-fi epics - can be seen as glossy advertisements for advanced weaponry? It may seem odd considering the conventional portrayal of movie scientists but homages to the military-industrial complex show little sign of abating. Praise be to the sophistication of the technology, whilst damning those who develop it as untrustworthy schemers outside of mainstream society. It's a curious phenomenon!

However, developing advanced technology for military purposes is hardly new. The ancient Greek Archimedes developed anti-ship devices whilst Leonardo da Vinci wrote effusive letters to prospective patrons about his land, sea and even aerial weapons, albeit some were of dubious practicality.

Today's society is supposedly more refined than those earlier times, yet whilst a concerted effort is being made to attract more women to STEM subjects, the macho nature of armaments presumably ensures the sector remains male-dominated. If proof were needed of the interest in all things explosive, the global success of the TV show Mythbusters should be a good indicator. If an example of the crazy nature of unrestrained masculinity needs delineating, then how about atomic bomb pioneer Edward Teller's promotion of nuclear devices for civil engineering projects? For every J. Robert Oppenheimer there were far more Tellers.

It isn't just the sheer cost of contemporary military projects that can lead to the ire of taxpayers. There have been some almost farcical instances of under-performance, such as the degradation of the B-2's anti-radar coating by high levels of humidity (never mind rain). It's easy to blame the scientists and engineers in such circumstances; after all, the politicians and generals leave the cutting-edge technology to the experts! But talk about over-promise and under-deliver...

One area that presumably didn't exist before the Twentieth Century's development of weapons of mass destruction - and one that cannot be blamed on STEM professionals - is the deliberate use of civilians as guinea pigs. From the US and British atomic bomb tests that affected local populations as well as military personnel to the cloud-seeding experiments over heavily-populated areas that may have led to fatal downpours, it seems no-one is safe from their own armed forces.

Of course, a large proportion of the degradation of the image of scientists as authority figures may have occurred during the Cold War, when it became apparent that military technocrats of the period earned their reputation as 'architects of the apocalypse'. There's obviously a lot of complexity around this issue. Arguments range back and forth on such topics as why, once the Apollo moon landings had proved America's technological superiority to the Soviet Union, the project was rapidly wound up; or how the more right-wing elements of society felt when that same know-how was stalemated by markedly inferior forces in Vietnam.

The space shuttle was another victim of military requirements, the orbiter's unprecedented size being needed for the then large spy satellites - and the intention to fly two of them from Vandenberg Air Force Base for 'shadow' missions. In a sense, the military could be seen to have had their fingers in many leading but nominally civilian pies.

This isn't to say that there haven't been productive examples of military technology modified for civilian usage, from early manned spacecraft launched on adapted ICBMs to the ARPANET providing a foundation for the Internet.

Even so, it is easy to look at the immense worldwide expenditure on weapon development and wonder what could be achieved if even a few percent of that funding was redirected elsewhere. There's no doubt about it: the sheer quantity, sophistication and expense of modern military hardware raises some legitimate public concerns as to the role of science and technology in the name of 'defence'. Especially if $18 billion worth of aircraft carrier is little more than a showy piece of machismo that belongs to the last half century, not the next.

Wednesday, 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al, there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods') it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Incidentally, isn't this bizarre in itself: it's not just that we consider ourselves so much more rational than all other animals, but that the human brain is the most complex object in the known universe. That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near to the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but much imitated still in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then the latter were a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering science was, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebuffed, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery show there is still plenty to be in awe of.

Mainstream cinema frequently paints a very black-and-white, A-versus-B picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity lives alongside its Newtonian predecessor. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems such a shame that such a ubiquitous form of entertainment consistently portrays such a lack of sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live in secret headquarters with their private armies, planning to take over the world...

Friday, 1 April 2016

Hollywood's natural history hobbit hoax: did Peter Jackson create Homo floresiensis for publicity purposes?

Judging by the limited ingredients of contemporary blockbusters, cinema audiences are fairly easy to please. Or are they? Peter Jackson's magnum opus The Lord of the Rings trilogy made an absolute mint at the box office and garnered seventeen Oscar wins besides critical acclaim. In contrast, The Hobbit trilogy received but a single Oscar accompanying some rather lukewarm reviews.

The reason for the critical indifference and lack of awards has been put down to franchise fatigue, although to be fair stretching a children's book over three long movies whilst partly improvising the script at a late stage couldn't have helped. So if you are a world-renowned film maker well aware that you are judged by many of your fans and much of your peer group on the success - and possibly the quality - of your latest film, it wouldn't be surprising if you went to great lengths to maximise that success. Just how far Peter Jackson went for The Hobbit trilogy is the subject of this post, so read on...

It's been some years since I visited Weta Cave in Wellington, where close-up views of various costumes and props from movies including the LOTR trilogy leave you in no doubt about the superb workmanship the effects house is capable of. Some of the exhibits and merchandise included non-human characters from Middle Earth and District 9, the quality of which got me thinking. Peter Jackson is known to have visited the Natural History Museum when in London recording the soundtrack for The Lord of the Rings. This in itself is not suspect, except that the museum was at the time hosting an exhibition about the infamous Piltdown Man.

For anyone who knows anything about science scandals, Piltdown Man has to be among the most notorious. The 1908 discovery in southern England of a hominin skull of unknown species was rapidly followed by numerous associated finds, all touted as genuine by professional scientists. In fact, by 1913 some palaeontologists had already suggested what was finally confirmed forty years later: the entire assemblage was a fraud, the skull itself incorporating an orangutan jawbone with filed-down teeth! The fact that so many specialists authenticated the remains is bizarre, although it may be that patriotic wishful thinking (to confirm prehistoric hominins had lived in Britain) overrode any semblance of impartiality.

Back to Peter Jackson and his hobbit conundrum. Although the LOTR trilogy did the bums-on-seats business (that's an industry term, in case you were wondering), Jackson's next film was the 2005 King Kong remake. Included in the record-breaking US$207 million production costs was a $32 million overspend for which the director himself was personally responsible. Having already been put into turnaround (that's cold feet in Hollywoodese) in the previous decade, Jackson was determined to complete the film to his own exacting standards, resulting in the financial woes surrounding the production.

So just how do you get the massive budget to make a prequel trilogy that's got a less involved storyline (sound vaguely familiar, Star Wars fans?) directly after you've made the most expensive film in history, which is not even a remake but a second remake? How about generating tie-in publicity to transfer from the real world to Middle Earth?

Around the time that Peter Jackson's production company Three Foot Six was being renamed (or if you prefer, upgraded) to Three Foot Seven, worldwide headlines announced the discovery of a small stature hominin of just this height. The first of the initial nine specimens found on the island of Flores, labelled LB1, would have been a mere 1.06 metres tall when alive, which is three feet six inches give or take a few millimetres.
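Sceptical readers can check the conversion for themselves; a quick back-of-the-envelope script (assuming the standard 2.54 centimetres to the inch) shows just how close 1.06 metres is to an exact three feet six:

```python
# Convert LB1's estimated height of 1.06 m into feet and inches
METRES_PER_INCH = 0.0254

height_m = 1.06
total_inches = height_m / METRES_PER_INCH   # about 41.7 inches
feet, inches = divmod(total_inches, 12)
print(int(feet), "ft", round(inches, 1), "in")   # → 3 ft 5.7 in

# ...and the shortfall from an exact 3 ft 6 in (42 inches):
gap_mm = abs(42 * METRES_PER_INCH - height_m) * 1000
print(round(gap_mm, 1), "mm")   # → 6.8 mm
```

A shortfall of under seven millimetres: 'give or take a few millimetres' indeed.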

Coincidence? When in doubt, adherents of the scientific method should follow the principle of parsimony, a.k.a. Occam's razor - which in this case has led me to put my conspiracy hat on.

Consider this: the new species rapidly became far better known by its nickname, the 'hobbit people', than as Homo floresiensis. Which was handy for anyone about to spend US$225 million on three films involving hobbits. In addition, it was discovered at the perfect time for Jackson to get maximum publicity (admittedly not for the release of the first hobbit film, but for the purpose of convincing his American backers of the audience anticipation).

The smoking gun evidence for me is the almost comical resemblance the remains bear to Tolkien's creations. For example, the feet are said to be far longer and flatter than those of any other known hominin species. Remind you of anything you've seen at the movies? It's just a shame that hair doesn't survive as long as the alleged age of the specimens, which, based on the stratigraphy, has been estimated at 94,000 to 13,000 years ago.

In addition, how could such creatures have built the bamboo rafts or dug-out boats necessary to reach the island in the first place? Even when sea levels dropped during glaciation periods, Flores remained isolated from the mainland. Braincase analysis shows that Homo floresiensis had an orange-sized brain. Since the tools found with the semi-petrified organic remains were simple stone implements, the idea of real-life hobbits sailing the high seas appears absurd in the extreme.

Several teams have attempted to extract DNA from the water-logged and delicate material but after a decade's effort none has been successful. This seems surprising, considering the quality of contemporary genetic replication techniques, but perhaps not if the material consists of skilfully crafted fakes courtesy of Weta Workshop. Some of the fragments appear similar to chimpanzee anatomy, but then Peter Jackson has always tried to make his creatures as realistic as possible. Indeed, he even hired a zoologist to ensure that his King Kong was anatomically correct (I recall hearing that his over-sized gorilla's behind needed reworking to gain accuracy. Now that's dedication!)

There has also been some rather unscientific behaviour concerning the Homo floresiensis remains which appears counter to the great care usually associated with such precious relics. At one point, the majority of the material was hidden for three months by one of the Indonesian palaeoanthropologists, only for it to be returned damaged and missing several pieces. All in all, there is much about the finds to fuel speculation as to their origin.

In summary, if you wanted to promote worldwide interest in anything hobbit-wise, what could be better yet not too obvious? Just how much the joint Australian-Indonesian archaeology and palaeontology team was in the know is perhaps the largest mystery still remaining. I've little doubt that one day the entire venture will be exposed, perhaps in a documentary made by Peter Jackson himself. Now that would definitely be worth watching!

Tuesday, 15 March 2016

Pre-teen coding electronic turtles: should children learn computer programming?

Way back in the mists of time when I was studying computer science at senior school, I was part of the first year at my (admittedly rural and far from innovative) school to use actual computers. Previous years had been stuck in the realm of punched tape and other such archaic wonders, so I was lucky to have access to the real thing. Now that we use smartphones with several hundred thousand times more memory than the first computer I owned - a Sinclair ZX Spectrum 48, if you're interested - I'd like to ask: is it worthwhile teaching primary school children programming skills, rather than just how to use front-end interfaces?

I remember being amazed to learn that about the same time as I was getting to grips with the Sinclair version of BASIC, infants in Japan were already being taught the rudiments of programming via turtle robots and Logo. These days of course, children learn to use digital devices pretty much from the egg, but back then it seemed progressive in the extreme. My elder (but still pre-teen) daughter has so far dabbled with programming, mostly using drag and drop interfaces in game coding sessions and at her school's robot club, which involves the ROBOTC language and Vex robots.
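For anyone who never met Logo, the turtle idea is simple enough to sketch in a few lines. The following is a minimal, text-only approximation in Python (a hypothetical stand-in for a real turtle robot or a graphics window) that just tracks the turtle's position and heading with a little trigonometry:

```python
import math

# A text-only sketch of the Logo turtle: no robot or graphics,
# just the turtle's position and heading tracked with trigonometry.
class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 90.0  # degrees; 90 = straight "up", as in Logo

    def forward(self, distance):
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)

    def right(self, angle):
        self.heading -= angle  # turning right decreases the heading

# The classic first exercise: walk a square and end up back home
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.right(90)

print(abs(t.x) < 1e-9 and abs(t.y) < 1e-9)  # → True: back at the origin
```

Python's built-in turtle module does much the same with actual on-screen graphics, which is rather more engaging for a young child than print statements.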

Ironically, if I still lived in Britain then my younger daughter would already be learning computer science at school too, as in 2014 the UK Government made the subject mandatory for all children from five years old. Not that this step came easily: apparently there was a struggle in the lead-up to the curriculum change to find enough qualified teachers. Clearly, the effort involved in establishing such a policy suggests the level of importance placed upon it.

In contrast to the UK, New Zealand has slipped behind the educational avant-garde. Digital technology is not a compulsory subject here and many lower-decile schools use old, unsupported software such as the Windows XP operating system. A combination of untrained teachers and parental attitudes is being blamed for a decline in the number of programmers in the country. I know of one Auckland-based technology centre where the hands-on coders are predominantly recruited from overseas and incidentally - unlike the less-technical roles - are mostly men. Of course, the shortage could be partly due to the enticement of kiwi developers to the far larger and better-paid job markets in Australia, the UK and USA, but even so it seems clear that there is a definite deficiency in New Zealand-born programmers.

Luckily, programming is a discipline where motivated children can learn coding for free, with online resources provided by numerous companies all the way up to Google and Microsoft. However, this presupposes both adequate internet access and parental support, or at least approval. If the current generation of parents don't understand the value of the subject, then it's highly unlikely many children will pick up the bug (ahem, that's a programming pun, of sorts.)

Compared to the BASIC and Logo languages available in my teenage years there is now a bewildering array of computer languages, interfaces and projects that teach the rudiments of programming, with colourful audio-visual interfaces such as Alice, Scratch (a bit like virtual Lego), CiMPLE, Kodu, etc., all intended for a pre-teen audience. Of course, they are far removed from the complexity of professional languages such as the C family or Java - I have to say that Object-Orientated Programming was certainly a bit of a shock for me - but these applications are more about whetting the appetite and generating quick results so as to maintain interest.

So what are the reasons why learning to code might be a good idea for young children, rather than just teaching them to use software such as the ubiquitous Microsoft Office? Might not the first three or four years at school be better spent learning the traditional basics of reading, writing and arithmetic? After all, this period is crucial to gaining the frameworks of grammar and mathematics, which in their own way provide a solid background for some of the key elements of coding such as syntax, operators and of course spelling!

Apart from the obvious notion that the demand for programmers is likely to increase in the next generation - not just for computers and touch devices, but for all sorts of consumer items from cars to watches (at least until computers become sophisticated enough, and fast enough, for programming in everyday language) - there are benefits and skills useful in the wider world. The following reasons are probably just the tip of the iceberg:
  • It exercises the mind, sharpening analytical thinking and trouble-shooting abilities
  • Coding can be thought of as akin to learning a foreign language or how to read music, so may hone those skills
  • Programming can generate a fruitful combination of creative and mathematical skills, which is difficult to obtain in most other subjects
  • This is the age of information economies, so programming is among the fastest-growing employment sectors in much of the developed world.
One worrying trend is the decline in the number of female programmers over the past quarter century. Perhaps this isn't surprising in the game coding field, considering that the vast majority of its themes are centered on the military and fantasy violence. But then doesn't this extremely popular, highly visible and decidedly lucrative sector of contemporary computing bolster the notion widespread among women that leisure-time computing is primarily the domain of socially-inadequate young men?

Research suggests that women consider computers as a tool to aid numerous disciplines whilst men look upon them more as an end in themselves. Surely learning to use them in-depth at an early age could help achieve a more liberal attitude from either extreme? Computers - and indeed the increasing number of programmable consumer devices - are not going away any time soon. If the near future of humanity will rely ever more closely on interfacing with these machines, then shouldn't as many of us as possible gain some understanding of what goes on 'under the hood'? After all, there has to be someone out there who can make a less buggy operating system than Windows 10!

Wednesday, 24 February 2016

Drowning by numbers: how to survive the information age

2002 was a big year. According to some statistics, it was the year that digital storage capacity overtook analogue: books gave way to online information; binary became king. Or hyperbole to that effect. Between email, social media, websites and the interminable selfie, we are all guilty to a greater or lesser extent of creating data archived in digital format. The human race now generates zettabytes of data every year (a zettabyte being a trillion gigabytes, in case you're still dealing in such minute amounts of data).
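For anyone wanting to sanity-check that parenthetical aside, the SI prefixes make the arithmetic trivial (zetta being 10^21 and giga 10^9):

```python
# SI prefixes: giga = 10**9 bytes, zetta = 10**21 bytes
gigabyte = 10**9
zettabyte = 10**21

ratio = zettabyte // gigabyte
print(f"{ratio:,} gigabytes per zettabyte")  # → 1,000,000,000,000 (a trillion)
```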

So what's so bad about that? More and more we rely on syntheses of information in order to keep up with the exponential growth of knowledge revealed to our species by the scientific and other methods. Counter to Plato's 2,400 year-old dialogue Phaedrus, we can no longer work out everything important for ourselves; instead, we must rely on analysis and results created by other, often long-dead, humans. Even those with superb memories cannot retain more than a minuscule fraction of the information known about even one discipline. In addition, we can now create data-rich types of content undreamed of in Plato's time. Some, MRSI medical scans being an ad-hoc example, may require long-term storage. If quantum computing becomes mainstream, then that will presumably generate an exponential growth in data.

What then, are the primary concerns of living in a society that has such high demands for the creation and safe storage of data? I've been thinking about this for a while now and the following is my analysis of the situation.

1. Storage. In recent years it has become widely known that CDs and, to a lesser extent, DVDs are subject to several forms of disk rot. I've heard horror stories of people putting their entire photo and/or video collection onto portable hard drives, only for these to fail within a year or two, the data being irrevocably lost. The advent of cloud storage lessens the issue, but not completely. Servers are still subject to all sorts of problems, with even enterprise-level solutions suffering due to insufficient disaster recovery and resilience (to borrow the terms we web developers use). I'm not saying audio tapes, vinyl records and VHS were any better, far from it, but there is a lot less data stored in those formats. There are times when good old-fashioned paper still rules, as it does in the legal and compliance sectors I've had contact with.

2. Security and privacy. As for safety, the arms race against hackers et al. is well and truly engaged. Incompetence also has its place. When living in the UK I once received a letter stating that my children's social services records, including their contact details, had become publicly available. This turned out to be due to the loss of a memory stick containing database passwords. As for identity theft, well, let's just say that Facebook is a rude word. I managed to track down an old friend after nearly twenty years incommunicado, finding details such as his address, wife's name and occupation, etc., mostly via Facebook, in less than half an hour. Lucky I'm not a stalker, really!

Even those who avoid social media may find themselves with some form of internet presence. I had a friend who signed a political petition on paper and then several years later found his name on a petition website. Let's hope it was the sort of campaign that didn't work against his career - these things can happen.

And then there's the fact that being a consumer means numerous manufacturers and retail outlets will have your personal details on file. I've heard that in some countries if you - and more particularly your smartphone - enter a shopping mall, you may get a message saying that as a loyal customer of a particular store there is a special sale on just for you, the crunch being that you only have a limited time, possibly minutes, to get to the outlet and make a purchase. Okay, that doesn't sound so bad, but the more storage locations that contain your personal details, the greater the chance they will be used against you. Paranoid? No, just careful. Considering how easy it was for me to become a victim of financial fraud about fifteen years ago, I have experience of these things.

As any Amazon customer knows, you are bombarded with offers tailored via your purchase record. How long will it be before smart advertising billboards recognise your presence, as per Steven Spielberg's Minority Report? Yes, this is the merchandiser's dream of ultimate granularity in customer targeting, but it is also a fundamental infringement of the customer's anonymity. Perhaps everyone will end up getting five seconds of public fame on a daily basis, thanks to such devices. Big Brother is truly watching you, even if most of the time it's for the purpose of flogging you consumer junk.

3. Efficiency. There are several million blog posts each day, several hundred billion emails and half a billion daily tweets. How can we possibly extract the wheat from the chaff (showing my age with that idiom) if we spend so much time ploughing through social media? I, for one, am not convinced there's much worth in a lot of this new-fangled stuff anyway (insert smiley here). I really don't want to know what friends, relatives or celebrities had for breakfast or which humorous cat videos they've just watched. Of course it's subjective, but I think there's a good case for claiming that the vast majority of digital content is a complete load of rubbish. So how can we live useful, worthwhile or even fulfilled lives when surrounded by it? In other words, how do we find the little gems of genuine worth among the flood of noise? It seems highly probable that many of the prominent nonsense theories, such as the Moon landing hoax, wouldn't be anywhere near as popular if it wasn't for the World Wide Web disseminating them.

4. Fatigue and overload. Research has shown that our contemporary news culture (short snippets repeated ad nauseam over the course of a day or so) leads to a weary attitude. Far from empowering us, bombarding everyone with the same information, frequently lacking context, can rapidly lead to antipathy. Besides which, if information is inaccurate in the first place it can quickly achieve canonical status as it gets spread across the digital world. As for the effect all this audio-visual over-stimulation is having on children's attention spans... where was I?

5. The future. So are there any solutions to these issues? I assume as we speak there are research projects aiming to develop heuristic programs that are the electronic equivalent of a personal assistant. If a user carefully builds their personality profile, then the program would be expected to extract nuggets of digital gold from all the sludge. Yet even personally-tailored smart filters that provide daily doses of information, entertainment, commerce and all points in between have their own issues. For example, unless the software is exceptional (i.e. rather more advanced than anything commercially available today) you would probably miss out on laterally- or tangentially-associated content. Even for scientists, this sort of serendipity is a great boon to creativity, but is rarely found in any form of machine intelligence. There's also the risk that corporate or governmental forces could bias the programming... or is that just the paranoia returning? All I can say is: knowledge is power.

All in all, this sounds a touch pessimistic. I think Arthur C. Clarke once raised his concern about the inevitable decay within societies that overproduced information. The digital age is centered on the dissemination of content that is both current and popular, but not necessarily optimal. We are assailed by numerous sources of data, often created for purely commercial purposes; rarely for anything of worth. Let's hope we don't end up drowning in videos of pesky kittens. Aw, aren't they cute, though?

Tuesday, 26 January 2016

Spreading the word: 10 reasons why science communication is so important

Although there have been science-promoting societies since the Renaissance, most of the dissemination of scientific ideas was played out at royal courts, religious foundations or for similarly elite audiences. Only since the Royal Institution lectures of the early 19th century and such leading lights as Michael Faraday and Sir Humphry Davy has there been any organised communication of the discipline to the general public.

Today, it would appear that there is a plethora - possibly even a glut - in the market: one major online retailer alone carries over 192,000 popular science books and over 4,000 science documentary DVD titles, so there's certainly plenty of choice! Things have dramatically improved since the middle of the last century, when according to the late evolutionary biologist Stephen Jay Gould, there was essentially no publicly-available material about dinosaurs.

From the ubiquity of the latter (especially since the appearance of Steven Spielberg's original 1993 Jurassic Park) it might appear that most science communication is aimed at children - and, dishearteningly, primarily at boys - but this really shouldn't be so. Just as anyone can take evening courses in everything from pottery to a foreign language, why shouldn't the public be encouraged to understand some of the most important current issues in the fields of science, technology, engineering and mathematics (STEM), at the same time hopefully picking up key methods of the discipline?

As Carl Sagan once said, the public are all too eager to accept the products of science, so why not the methods? It may not be important if most people don't know how to throw a clay pot on a wheel or understand why a Cubist painting looks as it does, but it certainly matters as to how massive amounts of public money are invested in a project and whether that research has far-reaching consequences.
Here then are the points I consider the most important as to why science should be popularised in the most accessible way - although without oversimplifying the material to the point of distortion:

1. Politicians and the associated bureaucracy need a basic understanding of some STEM research, often at the cutting edge, in order to generate new policies. Yet as I have previously examined, few current politicians have a scientific background. If our elected leaders are to make informed decisions, they need to understand the science involved; this seems obvious, but if the summary material they are supplied with is incorrect or deliberately biased, the outcome may not be the most appropriate one. STEM isn't just small fry: in 2010 the nations with the ten highest research and development budgets had a combined spend of over US$1.2 trillion.

2. If public money is being used for certain projects, then taxpayers can only voice valid disagreement over how their money is spent if they understand the research (military R&D excepted of course, since this is usually too hush-hush for the rest of us poor folk to know about). In 1993 the US Government cancelled the Superconducting Super Collider particle accelerator as it was deemed good science but not affordable science. Much as I love the results coming out of the Large Hadron Collider, I do worry that the immense amount of funding (over US$13 billion spent by 2012) might be better used elsewhere on other high-technology projects with more immediate benefits. I've previously discussed both the highs and lows of nuclear fusion research, which surely has to be one of the most important areas in mega-budget research and development today?

3. Criminal law serves to protect the populace from the unscrupulous, but since the speed of scientific advances and technological change run way ahead of legislation, public knowledge of the issues could help prevent miscarriages of justice or at least the wasting of money. The US public has spent over US$3 billion on homeopathy, despite a 1997 report by the President of the National Council Against Health Fraud that stated "Homeopathy is a fraud perpetrated on the public." Even a basic level of critical thinking might help in the good fight against baloney.

4. Understanding of current developments might lead to reliance as much on the head as the heart. For example, what are the practical versus moral implications of embryonic stem cell research (a question made especially pertinent by President Obama's State of the Union pledge to cure cancer)? Or what about the pioneering work in xenotransplantation: could the next few decades see the use of genetically-altered pig hearts to save humans, and if so would patients with strong religious convictions agree to such transplants?

5. The realisation that much popular journalism is sensationalist and has little connection to reality. The British tabloid press labelling of genetically-modified crops as 'Frankenstein foods' is typical of the nonsense that clouds complex and serious issues for the sake of high sales. Again, critical thinking might more easily differentiate biased rhetoric from 'neutral' facts.

6. Sometimes scientists can be paid to lie. Remember campaigns with scientific support from the last century that stated smoking tobacco is good for you or that lead in petrol is harmless? How about the DuPont Corporation refusing to stop CFC production, with the excuse that capitalist profit should outweigh environmental degradation and the resulting increase in skin cancer? Whistle-blowers have often been marginalised by industry-funded scientists (think of the initial reaction to Rachel Carson concerning DDT) so it's doubtful anything other than knowledge of the issues would penetrate the slick corporate smokescreen.

7. Knowing the boundaries of the scientific method - what science can and cannot tell us, and what should be left to other areas of human activity - is key to understanding where the discipline should fit into society. I've already mentioned the moral implications and whether research can be justified by its potential outcome, but conversely, are there habits and rituals, or just societal conditioning, that blind us to what could be achieved by public lobbying of governments?

8. Nations may be enriched as a whole by cutting out nonsense and focusing on solutions for critical issues, for example by not having to waste time and money explaining that global warming and evolution by natural selection are successful working theories due to the mass of evidence. Notice how uncontroversial most astronomical and dinosaur-related popularisations are. Now compare to the evolution of our own species. Enough said!

9. Improving the public perspective of scientists themselves. The popular consensus still seems to promote the notion of lone geniuses, emotionally removed from the rest of society and frequently promoting their own goals above the general good. Apart from the obvious ways in which this conflicts with other points already stated, much research is undertaken by large, frequently multi-national teams; think of the Large Hadron Collider, of course. Such knowledge may aid the removal of the juvenile Hollywood science hero (rarely a heroine) and increase support for the sustained efforts that require substantial public funding (nuclear fusion being a perfect example).

10. Reducing the parochialism, sectarianism and their associated conflict that if anything appears to be on the increase. It's a difficult issue and unlikely that it could be a key player but let's face it, any help here must be worth trying. Neil deGrasse Tyson's attitude is worth mentioning: our ideological differences seem untenable against a cosmic perspective. Naïve perhaps, but surely worth the effort?

Last year Bill Gates said: "In science, we're all kids. A good scientist is somebody who has redeveloped from scratch many times the chain of reasoning of how we know what we know, just to see where there are holes." The more the rest of us understand this, isn't there a chance we would notice the holes in other spheres of thought we currently consider unbending? This can only be a good thing, if we wish to survive our turbulent technological adolescence.