Monday 26 June 2017

The power of pond scum: are microalgae biofuels a realistic proposition?

I've previously discussed some very humble organisms, but they don't get much humbler than microalgae: photosynthetic organisms that generate about half our planet's atmospheric oxygen. Imagine, then, what potential there might be for their exploitation in a world of genetic manipulation and small-scale engineering. The total number of algal species is unknown, but estimates suggest some hundreds of thousands. With this potential in mind, private companies and government projects around the world have spent the past few decades - and a not inconsiderable amount of funding - trying to generate a replacement for fossil fuels based on these tiny plants.

For anyone with even a microgram's worth of common sense, developing eco-friendly substitutes for oil, coal and gas is a consummation devoutly to be wished, but behind the hype surrounding microalgae-derived fuel there is a wealth of opposing opinions and potentially some shady goings-on. Whilst other projects such as creating ethanol from food crops are continuing, the great hope - and hype - that surrounded algae-based solutions appears to be grinding to a halt.

Various companies were forecasting that 2012 would be the year the technology achieved commercial viability, but this now appears to have been rather over-eager. It's therefore worth exploring what happens when hope, high-value commerce and cutting-edge technology meet. There are some big names involved in the research too: ExxonMobil, Shell and BP each pumped tens to hundreds of millions of dollars into microalgae fuel projects, only to make substantial funding cuts - or shut them down altogether - since 2011.
Microalgae-derived biofuel
Manufacturing giants such as General Electric and Boeing have been involved in research for new marine and aircraft fuels, whilst the US Navy undertook tests in 2012 whereby algae-derived fuel was included in a 50:50 blend with conventional fossil fuel for ships and naval aircraft. Even shipping companies have become interested, with one boffin-worthy idea being for large cruise ships to grow and process their own fuel on-board. Carriers including United Airlines, Qantas, KLM and Air New Zealand have invested in these kerosene-replacement technologies, with the first two of these airlines having trialled fuel blends including 40% algae derivative. So what has gone wrong?

The issue appears to be one of scale: after initial success with laboratory-scale testing, the expansion to commercial production has encountered a range of obstacles that will most likely delay widespread implementation for at least another quarter century.

The main problems are these:
  1. The algae growing tanks need to cover millions of acres of flat land, and there are arguments that there simply isn't enough such land in convenient locations.
  2. The growing process requires lots of water, which means large transportation costs to get the water to the production sites. Although waste water is usable, some estimates suggest there is not enough of this - even in the USA - for optimal production.
  3. Nitrogen and phosphorus are required as fertiliser, further reducing commercial viability. Some estimates suggest half the USA's annual phosphorus supply would need to be requisitioned for use in this one sector!
  4. Contamination by protozoans and fungi can rapidly destroy a growing pond's entire culture.
In 2012 the US National Academy of Sciences appeared to confirm these unfortunate issues. Reporting on the Department of Energy's goal to replace 5% of the nation's vehicle fossil fuel consumption with algae-derived biofuel, the Academy stated that this scale of production would make unfeasibly large impacts on water and nutrient usage, as well as requiring heavy commitments from other energy sources.

In a bid to maintain solvency, some independent research companies appear to have minimised such issues for as long as possible, finally diversifying when it appeared their funding was about to be curtailed or cut off. As with nuclear fusion research, commercial production of microalgae fuels holds much promise, but those holding the purse strings aren't as patient as the researchers.

There may be a hint of a silver lining to all this, even if wide-scale operations are postponed for many decades. The microalgae genus Chlorella - subject of a Scottish biofuel study - is proving to be a practical source of dietary supplements, from vitamins and minerals to Omega-3. It lacks only vitamin B12, but is an astonishing 50-60% protein by weight. As well as human consumption, both livestock and aquaculture feed supplements can be derived from microalgae, although as usual there is a wealth of pseudoscientific nonsense in the marketing, such as the notion that it has an almost magical detox capability. Incidentally, Spirulina - the tablets and powder sold in health food outlets to make into green gloop smoothies - is not a microalga but a B12-rich cyanobacterium, colloquially and confusingly known as blue-green algae. Glad that's cleared that one up!

If anything, the research into microalgae-derived biofuels is a good example of how new technology and commercial enterprise uneasily co-exist; each needs the other, but reaching a workable compromise is perhaps just as tricky as the research itself. As for government-funded projects towards a better future for all, I'll leave you to decide where the interests of our current leaders lie...

Saturday 10 June 2017

Owning the aliens: who should support endangered species thriving outside their home territories?

On holiday in Fiji last year I was surprised to learn that the most commonly-seen animals - with the exception of flying foxes - were recent introductions from other countries, primarily India. Examples include the red-vented bulbul, mynah bird, house gecko, and mongoose, all of which have brought their own problems to either native wildlife or Fijian agriculture.

From Hawaii to New Zealand, the deliberate or accidental introduction of non-native animals, plants and fungi has had profoundly negative effects on these previously isolated ecosystems. So what happens if an introduced organism, especially one that has a deleterious effect on wildlife, thrives in its transplanted habitat whilst becoming endangered across its original range? Two questions spring to mind: should the adopted homeland be able to exterminate the alien invader with impunity; and/or should the country of origin fund work in the invaded nation during a 'lifeboat' phase, until the home turf is suitable for restocking?

Almost inevitably, the countries with the highest number of at-risk species tend to be the poorer ones, Australia and the United States excepted. Reports over the past four years list a variety of nations with this sorry state of affairs, but across different conservation groups' assessments the top ten for endangered animal species include Indonesia, Malaysia, Ecuador, Mexico, India and Brazil. In some of these there is little political willpower - or indeed funding - to support anything deemed non-critical, with biodiversity seen as a nice-to-have.

For small nations such as Fiji there is little in the way of an environmental lobby. NatureFiji-MareqetiViti is an organisation that attempts to safeguard such threatened animals as the Fijian Crested Iguana whilst enhancing regional biosecurity, but with grants - including from the European Union - rarely exceeding a few tens or hundreds of thousands of Fijian dollars, they are woefully underfunded.

Which brings us to New Zealand, with its collection of endangered birds, lizards, freshwater fish and the Maui dolphin. In addition to Department of Conservation (DoC) budget cuts over the past decade - claimed by some organisations to total a 21% decline in real terms - the nation is home to several Australian animals that are nationally vulnerable in their native homeland across the Tasman Sea.

The green and golden bell frog (Litoria aurea) is a prime example: a rapidly shrinking Australian range has earned it a status of 'globally vulnerable', yet it is common enough in the northern part of New Zealand's North Island. I found this specimen at Auckland's Botanic Gardens earlier this year.


Therefore should the Australian Government fund a captive breeding programme - or simply a round-up - of individuals in New Zealand? After all, the latter has its own four native frog species, all rare and/or endangered, for its herpetologists to concentrate on.

There is a precedent for this. In 2003, three Australian trappers captured rare brush-tailed rock-wallabies on New Zealand's Kawau Island, where the marsupial's 'noxious' pest status meant they were about to be targeted for eradication. The project included support from DoC but presumably - it's difficult to ascertain - the funding came from Australia.

Of course Australia may be able to afford to engage in restocking programmes abroad, but few other nations are in the same position. Although the World Wide Fund for Nature (World Wildlife Fund in North America), the largest conservation organisation in the world, has a comparatively large budget, even it cannot afford to support every repatriation or gene pool nursery scheme. Meanwhile, local charities such as NatureFiji-MareqetiViti tend to rely on volunteers rather than trained professionals and don't have the scope or capability for logistically-complex international undertakings.

With the USA becoming increasingly insular and Europe consumed with its own woes, the potential funding sources for these interim lifeboats are rapidly drying up. There are a few eco-angels, such as Norway's US$1 billion donation to Brazil - intended to curtail Amazonian rainforest destruction - but they are few and far between. It's one thing to support in-situ environmental issues, but another to raise funds to save selected endangered species thriving away from their native ecosystem.

It appears that there is no single solution to this, meaning that except for a few lucky 'poster' cases, many at-risk species may well fail to gain attention and be allowed to die out (or even be exterminated as foreign pests). The original home territory might no longer contain a suitable environment for them to thrive in whilst the foster nation lacks the impetus or funding to look after those pesky alien invaders. It seems that there are difficult times ahead!

Tuesday 23 May 2017

Water, water, everywhere: the hidden holism of H2O

As in other northern regions of New Zealand, Auckland residents spent the summer of 2017 facing City Council requests to conserve water, as well as a hosepipe ban during March and April. It therefore seems ironic that the water shortage occurred at the same time as flooding in the west of the city; thanks to a tropical downpour - one of several so far this year - the equivalent of an entire month's rain fell over a day or two. Clearly, water shortages are going to become an ever-increasing issue, even in nations with well-developed infrastructure.

The British historian of science James Burke, writer-presenter of The Day the Universe Changed, also made three television series called Connections 1, 2 and 3 (in 1978, 1994 and 1997 respectively) which examined the historical threads linking scientific and technological advances with changes in other areas of society. Therefore I'd like to take a similarly holistic approach to the wonderful world of water consumption and see how it ties into the world in general.

Although the statistics vary - it's difficult to assess with any great precision - there are published figures suggesting that the populace of richer nations use up to 5,000 litres of water each per day, mostly hidden in food production. Many websites now supply details of the amount of water used to grow certain crops and foodstuffs, so you can easily raise your guilt level simply by comparing your diet to the water involved in its generation; and that's without considering the carbon mileage or packaging waste, either!

I've previously discussed the high environmental cost of cattle farming, with both dairy and beef herds being prominent culprits in water pollution as well as consumption. However, there are plenty of less-obvious foodstuffs proven to be notorious water consumers, for example avocados and almonds. Although the latter might be deemed a luxury food, much of the global supply is now used to make almond milk; with consumption increasing by up to 40% year-on-year, this is one foodstuff much in demand.

Even though it is claimed to require much less water than the equivalent volume of dairy produce, almond farming is still a concern due to the massive increase in bulk production, especially in California (home to 80% of the global almond harvest). The reasons for the popularity of almond milk are probably two-fold: firstly, the public is becoming more health-conscious; and secondly, a reduction in - or abstention from - dairy produce is presumed to lessen food allergies and intolerance. These link to prominent concerns in the West, in the form of high-calorie/low-exercise diets leading to mass obesity, and over-use of cleaning chemicals in the home preventing children's immune systems from developing properly. Clearly, there is a complex web when it comes to water and the human race.

Even in regions chronically short of water, such as California, more than three-quarters of fresh water usage is by agriculture. In order to conserve resources, might we soon face greater taxes on commercially-grown, water-hogging produce and bans on the home growing of crops with a low nutrition-to-water-consumption ratio? I've recently read several books discussing probable issues over the next half century, with the humble lettuce appearing as a good example of the latter.

Talking of which, the wet and windy conditions in New Zealand over the past year - blamed at least partially on La Niña - have led to record prices for common vegetables: NZ$9 for a lettuce and NZ$10 for a cauliflower, even in major supermarket chains. British supermarkets were forced to ration some fruit and vegetables back in February, due to their Mediterranean growers suffering from storms and floods. This suggests that even for regions with sophisticated agricultural practices there is a fine line between too much and too little fresh water. Isn't it about time that the main food producers developed a more robust - not to mention future-proof - infrastructure, considering the increased impact that climate change is likely to have?

The world is also paying a heavy price for bottled water, a commercial enterprise that breaks all boundaries of common sense. In the USA alone it costs several thousand times as much as the equivalent volume of tap water, and there are some reports of chemical leaching from reused plastic bottles. As you might expect, there is also an extremely high environmental cost. This includes the fossil fuels used by bottling plants and transportation, the lowering of the water table (whose level is so critical in areas utilising less sophisticated farming technologies) and the impact of plastic waste: the USA recycles only about 23% of its plastic water bottles, resulting in 38 billion bottles dumped each year at a cost of around US$1 billion. All in all, bottled water for nations with highly developed infrastructure seems like an insane use of critical resources.
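As a quick sanity check on those figures - assuming, and this is my reading rather than anything the reports state outright, that the 38 billion dumped bottles represent the 77% that escape recycling - a few lines of arithmetic give the implied totals:

```python
# Back-of-envelope check on the quoted US bottled water figures.
# Assumption (mine, not the reports'): the 38 billion dumped bottles
# are the 77% that go unrecycled.
dumped = 38e9          # bottles landfilled or littered per year
recycling_rate = 0.23  # proportion recycled

total = dumped / (1 - recycling_rate)
print(f"Implied total bottles used per year: {total / 1e9:.0f} billion")
print(f"Implied bottles recycled per year: {total * recycling_rate / 1e9:.0f} billion")
```

That works out at roughly 49 billion bottles consumed annually in one country alone, which rather reinforces the point.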

Although accelerated population growth has become a widespread fear, there are indicators that later this century the global figure may peak at around nine billion and then level off. Increasing urbanisation is seen as a primary cause of this, and not just in developing nations; Auckland, for example (New Zealand's largest city by far), experienced 8% population growth in the seven years from 2006. A larger population obviously requires more food, but a more urban - and therefore generally better educated, higher income - populace tends to demand access to processed, non-local and above all water-intensive foods. China is the touchstone here, having seen a massive increase in fish and meat consumption over the past half century; the latter has risen from 8 million tons per year in 1978 to over 70 million tons in recent years.

It has been claimed that 70% of industrial waste generated in developing nations is dumped into water courses, meaning that there will be a massive cost for environmental clean-up before local sources can be fully utilised. The mass outbreak of E. coli in Hawke's Bay, New Zealand, in February this year shows that even developed nations are having difficulty maintaining water quality, whilst there has been a shocking admission of lead contamination above safe levels in 41 American states over the past three years. Does this mean bottled water - heretofore the lifeline of Western tourists abroad - is becoming a necessity in the West after all?

Some might argue that thanks to global warming there will be more water available due to the melting of polar caps and glaciers, which after all contain almost two-thirds of the world's fresh water resources. However, these sources are mostly located far from high-density populations, and once the meltwater mixes with the sea it requires energy-hungry desalination technology to recover. It's small comfort that current estimates suggest that by 2025 about 14% of the global population will rely on desalination plants for their fresh water needs.

In the West we tend to take clean, safe water completely for granted but thanks to the demands of living in a society run on rampant consumerism - coupled with poor science education - everyday decisions are being made that affect the environment, waste critical resources and damage our own health. Pundits are predicting that water will be the new oil: liquid gold, a precious commodity to be fought over, if necessary. Surely this is one resource that all of us can do something to support, whether it is cutting down on water-intensive foodstuffs, using tap rather than bottled water, or simply turning off a tap sooner than usual!

Monday 8 May 2017

Weather with you: meteorology and the public perception of climate change

If there's one thing that appears to unite New Zealanders with the British it is the love of discussing the weather. This year has been no exception, with New Zealand's pre-summer forecasts - predicting average temperatures and rainfall - proving wildly inaccurate. La Niña has been blamed for what Wellingtonians have deemed a 'bummer summer', January having provided the capital with its fewest 'beach days' of any summer in the last thirty years. Sunshine hours, temperature, rainfall and wind speed data from the MetService support this as a nationwide trend; even New Zealand's flora and fauna have been affected, with late blossoming and reduced breeding respectively.

However, people tend to have short memories and often recall childhood weather as somehow superior to that of later life. Our rose-tinted spectacles make us remember long, hot summer school holidays and epic snowball fights in winter, but is this a case of remembering the hits and forgetting the misses (meteorologically speaking)? After all, there are few things more boring than a comment that the weather is the same as the previous ten comments and surely our memories of exciting outdoor ventures are more prominent than being forced to stay indoors due to inclement conditions?

Therefore could our fascination with weather, yet dubious understanding - or even denial - of climate change, be down to our requiring personal or even emotional involvement in a meteorological event? Most of us have had the luck never to experience extreme weather (or 'weather bombs', as the media now term them), so unless you have been on the receiving end of hurricanes or flash floods, the weather is simply another aspect of our lives, discussed in everyday terms and rarely examined in detail.

Since we feel affected by weather events that directly impact us (down to the level of 'it rained nearly every day on holiday but the locals said it had been dry for two months prior') we have a far greater emotional response to weather than we do to climate. The latter appears amorphous and almost mythical by comparison. Is this one of the reasons that climate change sceptics achieve such success when their arguments are so unsupported?

Now that we are bombarded with countless pieces of trivia, distracting us from serious analysis in favour of short snippets of multimedia edutainment, how can we understand climate change and its relationship to weather? The standard explanation is that weather is short term (covering hours, days or at most weeks) whilst climate compares annual or seasonal variations over far longer timeframes. Neil deGrasse Tyson in Cosmos: A Spacetime Odyssey made the great analogy that weather is like the zigzag path of a dog on a leash, whereas its owner walks in a straight line from A to B. So far so good, but there's no single universally agreed duration for assessing climate variability, although thirty-year periods are often used as a baseline.
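Tyson's analogy is easy to demonstrate numerically. Here's a minimal sketch - with entirely invented numbers for the trend and the noise - showing a slow 'climate' signal (the owner's straight-line walk) against individual noisy 'weather' readings (the dog on its leash):

```python
import random

random.seed(42)  # reproducible 'weather'

baseline = 14.0        # invented starting mean temperature, degrees C
trend_per_year = 0.02  # invented slow climate trend, degrees C per year
noise_sd = 1.5         # invented day-to-day weather variability

for year in range(0, 51, 10):
    climate = baseline + trend_per_year * year     # the owner's path
    weather = climate + random.gauss(0, noise_sd)  # the dog's zigzag
    print(f"Year {year:2d}: climate {climate:5.2f} C, one day's weather {weather:5.2f} C")
```

Over fifty years the invented trend adds just one degree, yet any single day can sit well above or below it - which is exactly why one cold snap proves nothing either way.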

This leads us to statistics. Everyone thinks they understand the word 'average', but an average can represent the mean, median or mode. Since the period start and end dates can be varied, as can the scaling on infographics (a logarithmic axis, for example), these methods allow a single set of statistics to be presented in a wide variety of ways.
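A toy example makes the point. The same invented run of annual rainfall figures produces three quite different 'averages' depending on which measure you choose to quote:

```python
from statistics import mean, median, mode

# Invented annual rainfall figures (mm); one flood year skews the mean.
rainfall = [600, 620, 620, 650, 700, 720, 1900]

print(f"mean:   {mean(rainfall):.0f} mm")    # 830 - dragged upward by the flood year
print(f"median: {median(rainfall):.0f} mm")  # 650 - middle value, robust to the outlier
print(f"mode:   {mode(rainfall):.0f} mm")    # 620 - most frequent value
```

All three are legitimately 'the average', and a motivated pundit will quote whichever best suits the argument.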

The laws of probability rear their much-misinterpreted head too. The likelihood of variation may change wildly depending on the length of the timeframe: compare a five-year block with a century and you can see that climate statistics is a tricky business; what is highly improbable in the former period may be near-inevitable over the latter. As long as you are allowed to choose the timeframe, you can skew the data to support a favoured hypothesis. So much then for objective data!
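The timeframe effect is simple arithmetic. If an extreme event has, say, a 2% chance of striking in any given year (a figure invented purely for illustration), the chance of seeing at least one such event depends enormously on the window you pick:

```python
# Probability of at least one 'extreme' event within a window of n years,
# given an invented 2% chance per year.
p_per_year = 0.02

for years in (5, 30, 100):
    p_at_least_one = 1 - (1 - p_per_year) ** years
    print(f"{years:3d}-year window: {p_at_least_one:.0%} chance of at least one event")
```

The same underlying odds yield roughly a 10% chance over five years but an 87% chance over a century - choose your window and you choose your conclusion.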

By comparison, if someone is the recipient of a worse-than-expected summer, as per New Zealand in 2017, then that personal experience may well be taken as more important than all the charts of long-term climate trends. It might just be the blink of an eye in geological terms, but being there takes precedence over far less emotive science and mathematics.

Perhaps, then, we subconsciously define weather as something we feel we experience, whilst climate is a more abstract notion: a series of weather events codified into some sort of order. How else can climate change deniers, when faced with photographs proving glacial or polar cap shrinkage, offer explanations other than global warming?

This is where politics comes into the mix. Whereas weather has little obvious involvement with politics, climate has become heavily politicised in the past thirty years, with party lines in some nations (mentioning no names) clearly divided. Although some of the naysayers have begun to admit global warming appears to be happening - or at least that the polar caps and glaciers are melting - they stick to such notions that (a) it will be too slow to affect humans - after all, there have been far greater swings in temperature in both directions in previous epochs - and (b) it has natural causes. The latter implies there is little we can do to mitigate it (solar output may be involved, not just Earth-related causes) and so let's stick our head in the sand and do some ostrich impressions.

As an aside, I've just finished reading a 1988 book called Prehistoric New Zealand. Its three authors are a palaeontologist (Graeme Stevens), an archaeologist (Beverley McCulloch) and an environmental researcher (Matt McGlone), so the content covers a wide range of topics, including the nation's geology, climate, wildlife and human impact. Interestingly, the book states that, if anything, the climate appears to be cooling and the Earth is probably heading for the next glaciation!

Unfortunately no data is supplied to support this, but Matt McGlone has since confirmed that there is a wealth of data supporting the opposite conclusion. In 2008 the conservative American Heartland Institute published a list of 500 scientists it claimed supported the notion that current climate change has solely natural causes. McGlone was one of many scientists who asked for his name to be removed from this list, stating both his work and opinions were not in agreement with this idea.

So are there any solutions or is it simply the case that we believe what we personally experience but have a hard time coming to terms with less direct, wider-scale events? Surely there are enough talented science communicators and teachers to convince the public of the basic facts, or are people so embedded in the now that even one unseasonal rain day can convince them - as it did some random man I met on the street - that climate change is a myth?

Saturday 22 April 2017

Which way's up? Mental mapping and conditioning by familiarity

I recently watched a television documentary on Irish prehistory which noted that if you cunningly turned a conventional map of the British Isles ninety degrees anti-clockwise, Ireland would appear to be an integral part of Europe's maritime trade routes rather than stuck out on the edge of the known world. Be that as it may, it's interesting how easily we accept conventions without analysis. Just because something is a convention doesn't necessarily mean it is superior, only that it has achieved such commonplace status that it is usually taken for granted. It's not the logical approach, but then we're not Vulcans!

Take maps of the world. Map projections have usually arisen in response to practical needs or due to the contingencies of history. Most global maps today use the Mercator projection which, whilst useful for maritime navigation in a time before GPS, increasingly distorts areas as they approach the poles. This shouldn't be surprising: after all, we're taking a near-spherical object, transposing it onto the surface of a cylinder, and then unrolling that onto a two-dimensional plane.

In fact there are dozens of different map projections, but none is good for all regions and purposes, and the Mercator is far from ideal: heavily-populated regions such as Africa appear too small whilst barely-populated areas such as Greenland and Antarctica appear far too large. However, it is popular because it is familiar, because it is popular...and so on. Like the QWERTY keyboard, it may no longer be required for the purpose it originally served but is now far too entrenched to be replaced without a great deal of hassle.
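The distortion is easy to quantify. Under the Mercator projection the linear scale at a given latitude is stretched by a factor of 1/cos(latitude), so areas are inflated by roughly the square of that; the short sketch below (example latitudes chosen purely for illustration) shows why Greenland balloons so dramatically:

```python
import math

def mercator_y(lat_deg: float) -> float:
    """Vertical Mercator coordinate for a latitude, on a unit-radius sphere."""
    lat = math.radians(lat_deg)
    return math.log(math.tan(math.pi / 4 + lat / 2))

for place, lat in [("Equator", 0.0), ("London", 51.5), ("Central Greenland", 72.0)]:
    stretch = 1 / math.cos(math.radians(lat))  # linear scale factor, sec(lat)
    print(f"{place:17s} lat {lat:4.1f}: linear stretch x{stretch:4.2f}, "
          f"area inflation x{stretch ** 2:5.2f}, map y = {mercator_y(lat):.2f}")
```

At 72 degrees north the area inflation is already more than tenfold, which is how an island smaller than Algeria ends up looking larger than Africa.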

Aside from projection, there's also the little matter of direction. There are novelty maps with the South Pole at the top, most commonly created by Australians, but since 88% of the human race currently lives in the Northern hemisphere (which holds 68% of the total landmass) it's hardly surprising that the North Pole is conventionally top-most.

However, this hasn't always been the case: before there was worldwide communication, the ancient Egyptians deemed 'upper' as towards the equator and 'lower' away from it. Early medieval Arab scholars followed suit whilst the mappa mundi of medieval Christian Europe placed East at the top of a topography centred on Jerusalem.

Photographs of the Earth that show a recognisable landmass usually present north uppermost too; there is no such thing as 'right' way up for our solar system, but the origin of the first great civilisations has set the geographic orientation for our global society.

None of this might seem particularly important, but ready acceptance of familiar conventions can easily lead to a lack of critical thinking. For example, in the Nineteenth and early Twentieth Centuries, Great Britain exported pre-fabricated buildings to Australia and New Zealand, but as some architects failed to recognise that in the Southern hemisphere the sun is due north at midday, there are examples with the main windows on the south-facing wall. Even the fact that most humans live in the Northern hemisphere has led to the incorrect assumption that - thanks to their summer - the Earth is closer to the sun in June than it is in December. There is such a thing as hemisphere parochialism after all!

If we can learn anything from this, it is that by accepting popular conventions without considering their history or relevance, we are switching off critical faculties that might otherwise generate replacement ideas more suitable for the present. Unfortunately, we frequently prefer familiarity over efficiency, so even when tried and trusted conventions are no longer suitable for changed circumstances we stubbornly cling to them. Thus we stifle improvements as a trade-off for our comfort. I guess that's what they call human nature...

Saturday 1 April 2017

The moons of Saturn and echoes of a synthetic universe

As fans of Star Wars might be aware, George Lucas is nothing if not visually astute. His thumbnail sketches for the X-wing, TIE fighter and Death Star created the essence behind these innovative designs. So isn't it strange that there is a real moon in our solar system that bears an astonishing resemblance to one of Lucas's creations?

At the last count Saturn had 53 confirmed moons, with another nine provisionally verified - and as such assigned numbers rather than names. One of the ringed planet's natural satellites is Mimas, discovered in 1789 and, at 396 kilometres in diameter, about as small as an object can be whilst still conforming to an approximate sphere. The distinguishing characteristic of Mimas is a giant impact crater about 130 kilometres in diameter, named Herschel after the moon's discoverer, William Herschel. For anyone who has seen Star Wars (surely most of the planet by now), the crater gives Mimas an uncanny resemblance to the Death Star. Yet Lucas's original sketch for the battle station was drawn in 1975, five years before Voyager 1 took the first photograph with high enough resolution to show the crater.


Okay, so one close resemblance between art and nature could be mere coincidence. But amongst Saturn's retinue of moons is another with an even more bizarre feature. At 1,469 kilometres in diameter, Iapetus is the eleventh largest moon in the solar system. After its discovery by Giovanni Cassini in 1671, it quickly became apparent that there was something extremely odd about it, with one hemisphere much brighter than the other.

As such, it attracted the attention of Arthur C. Clarke, whose novel 2001: A Space Odyssey described Japetus (as he called it) as the home of the Star Gate, an artificial wormhole across intergalactic space. He explained the brightness differentiation as being due to an eye-shaped landscape created by the alien engineers of the Star Gate: an enormous pale oval with a black dot at its centre. Again, Voyager 1 was the first spacecraft to photograph Iapetus close up...revealing just such a feature! Bear in mind that this was 1980, whereas the novel was written between 1965 and 1968. Carl Sagan, who worked on the Voyager project, actually sent Clarke a photograph of Iapetus with the comment "Thinking of you..." Clearly, he had made the connection between reality and fiction.

As Sagan himself was apt to say, extraordinary claims require extraordinary evidence. Whilst a sample of two wouldn't make for a scientifically convincing result in most disciplines, there is definitely something strange about two Saturnian moons that are found to closely resemble elements in famous science fiction stories written prior to the diagnostic observations being made. Could there be something more fundamental going on here?

One hypothesis that has risen in popularity despite lacking any hard physical evidence is that of the simulated universe. Nick Bostrom, the director of the University of Oxford's Future of Humanity Institute, has spent over a decade promoting the idea. Instead of experimental proof, Bostrom uses probability theory to support his suppositions. At its simplest level, he notes that the astonishing increase in computing power over the past half century implies an ability in the near future to create detailed recreations of reality within a digital environment; basically, it's The Matrix for real (or should that be, for virtual?)

It might sound like the silliest science fiction, as no-one is likely to be fooled by current computer game graphics or VR environments, but with quantum computing on the horizon we may soon have processing capabilities far beyond those of today's most powerful mainframes. Since the ability to create just one simulated universe implies the ability to create limitless - even nested - versions of a base reality, each with potentially tweaked physical or biological laws for experimental reasons, the number of virtual realities must far outweigh the original model.
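The counting argument behind this can be sketched in a few lines. If every 'base' reality eventually runs a number of simulations, and those simulations can nest, the odds of any given observer being in the single base reality collapse quickly. The numbers below are arbitrary choices for illustration, not anything Bostrom has published:

```python
# Toy version of the simulation counting argument. Each reality at one
# level spawns 'sims_per_reality' simulations at the next level down.
def base_reality_odds(sims_per_reality: int, depth: int) -> float:
    total_realities = sum(sims_per_reality ** level for level in range(depth + 1))
    return 1 / total_realities  # one base reality among all of them

for sims, depth in [(10, 1), (10, 2), (1000, 1)]:
    odds = base_reality_odds(sims, depth)
    print(f"{sims} sims per reality, nesting depth {depth}: "
          f"1 in {round(1 / odds):,} chance of being in the base reality")
```

Even with only ten simulations per reality and two levels of nesting, the chance of being 'real' is already down to less than one percent - which is the intuition behind Musk's far more extreme figure below.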

As for the probability of it being true of our universe, this key percentage varies widely from pundit to pundit. Astronomer and presenter Neil deGrasse Tyson has publicly admitted he considers it a fifty-fifty likelihood, whilst SpaceX and Tesla entrepreneur Elon Musk is prepared to go much further, having stated that there is only a one in a billion chance that our universe is the genuine physical one!

Of course anyone can state a probability for a hypothesis without providing supporting evidence, but then what differentiates such an unsubstantiated claim from a religious belief? To this end, a team of researchers at the University of Bonn published a paper in 2012 called 'Constraints on the Universe as a Numerical Simulation', defining possible methods to verify whether our universe is real or virtual. Using technical terms such as 'unimproved Wilson fermion discretization' makes it somewhat difficult for anyone who isn't a subatomic physicist to get to grips with their argument (you can insert a smiley here), but the essence of their work involves cosmic rays. The paper states that in a virtual universe these are more likely to travel along the axes of a multi-dimensional fundamental grid, rather than appear in equal numbers in all directions. In addition, they would exhibit energy restrictions at something called the Greisen-Zatsepin-Kuzmin cut-off (probably time for another smiley). Anyhow, the technology apparently exists for the relevant tests to be undertaken, assuming the funding can be obtained.

So could our entire lives simply be part of a Twenty-Second Century schoolchild's experiment or museum exhibit, where visitors can plug in, Matrix-style, to observe the stupidities of their ancestors? Perhaps historians of the future will be able to run such simulations as an aid to their papers on why the hell, for example, the United Kingdom opted out of the European Union and the USA elected Donald Trump?

Now there's food for thought.