Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al., there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

The third part of Jonathan Swift's Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Although far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Isn't that bizarre in itself, considering we regard ourselves as so much more rational than all other animals, and that the human brain is often called the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel, whose title is evoked in criticism of GM foods, its author may have been inspired by the general feeling of doom then in the air - almost literally, in fact, due to the 1815 eruption of Mount Tambora, whose volcanic dust created 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but still much imitated in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the Nineteenth Century saw the rise of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then scientists were a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest anti-science sentiment of the period, but can we blame him, considering science was then, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? And it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebutted, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery shows there is still plenty to be in awe of.

Mainstream cinema frequently paints a starkly A-versus-B picture of the world (think classic westerns or war films). But science rarely fits into such neat parcels: consider how the more accurate general theory of relativity can live alongside its predecessor from Newton. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems a shame that such a ubiquitous form of entertainment consistently portrays so little sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?). Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters bases, planning to take over the world...

Friday 1 April 2016

Hollywood's natural history hobbit hoax: did Peter Jackson create Homo floresiensis for publicity purposes?

Judging by the limited ingredients of contemporary blockbusters, cinema audiences are fairly easy to please. Or are they? Peter Jackson's magnum opus The Lord of the Rings trilogy made an absolute mint at the box office and garnered seventeen Oscar wins as well as critical acclaim. In contrast, The Hobbit trilogy failed to win a single Oscar and attracted some rather lukewarm reviews.

The reason for the critical indifference and lack of awards has been put down to franchise fatigue, although to be fair, stretching a children's book over three long movies whilst partly improvising the script at a late stage couldn't have helped. So if you were a world-renowned film maker, well aware that you are judged by many of your fans and much of your peer group on the success - and possibly the quality - of your latest film, it wouldn't be surprising if you went to great lengths to maximise that success. Just how far Peter Jackson went for The Hobbit trilogy is astounding...so read on...

It's been some years since I visited the Weta Cave in Wellington, where close-up views of various costumes and props from movies including the LOTR trilogy leave you in no doubt about the superb workmanship the effects house is capable of. Some of the exhibits and merchandise included non-human characters from Middle Earth and District 9, the quality of which got me thinking. Peter Jackson is known to have visited the Natural History Museum when in London recording the soundtrack for The Lord of the Rings. This in itself is not suspect, except that the museum was at the time hosting an exhibition about the infamous Piltdown Man.

For anyone who knows anything about science scandals, Piltdown Man has to be among the most notorious. The 1908 discovery in southern England of a hominin skull of unknown species was rapidly followed by numerous associated finds, all touted as genuine by professional scientists. In fact, by 1913 some palaeontologists had already suggested what was finally confirmed forty years later: the entire assemblage was a fraud, the skull itself incorporating an orang-utan jawbone with filed-down teeth! The fact that so many specialists authenticated the remains is bizarre, although it may be that patriotic wishful thinking (to confirm prehistoric hominins had lived in Britain) overrode any semblance of impartiality.

Back to Peter Jackson and his hobbit conundrum. Although the LOTR trilogy did the bums-on-seats business (that's an industry term, in case you were wondering), Jackson's next film was the 2005 King Kong remake. Included in the record-breaking US$207 million production costs was a $32 million overspend for which the director himself was personally responsible. The project having already been put into turnaround (that's cold feet in Hollywoodese) in the previous decade, Jackson was determined to complete the film to his own exacting standards, thus resulting in the financial woes surrounding the production.

So just how do you get the massive budget to make a prequel trilogy with a less involved storyline (sound vaguely familiar, Star Wars fans?) directly after you've made the most expensive film in history - one which is not even a remake but a second remake? How about generating tie-in publicity that transfers from the real world to Middle Earth?

Around the time that Peter Jackson's production company Three Foot Six was being renamed (or if you prefer, upgraded) to Three Foot Seven, worldwide headlines announced the discovery of a small stature hominin of just this height. The first of the initial nine specimens found on the island of Flores, labelled LB1, would have been a mere 1.06 metres tall when alive, which is three feet six inches give or take a few millimetres.

Coincidence? When in doubt, adherents of scientific methods should follow the principle of parsimony, A.K.A. Occam's razor - which in this case has led to me putting my conspiracy hat on.

Consider this: the new species rapidly became far better known by its nickname, the 'hobbit people', than as Homo floresiensis. Which was handy for anyone about to spend US$225 million on three films involving hobbits. In addition, it was discovered at the perfect time for Jackson to get maximum publicity (admittedly not for the release of the first hobbit film, but for the purpose of convincing his American backers of the audience anticipation).

The smoking gun evidence for me is the almost comical resemblance the remains bear to Tolkien's creations. For example, the feet are said to be far longer and flatter than those of any other known hominin species. Remind you of anything you've seen at the movies? It's just a shame that hair doesn't survive as long as the alleged age of the specimens, which, based on the stratigraphy, has been estimated at 94,000 to 13,000 years ago.

In addition, how could such creatures have built the bamboo rafts or dug-out boats necessary to reach the island in the first place? Even when sea levels dropped during glacial periods, Flores remained convincingly isolated from the mainland. Braincase analysis shows that Homo floresiensis had an orange-sized brain. Since the tools found with the semi-petrified organic remains were simple stone implements, the idea of real-life hobbits sailing the high seas appears absurd in the extreme.

Several teams have attempted to extract DNA from the waterlogged and delicate material, but after a decade's effort none have been successful. This seems surprising, considering the quality of contemporary DNA-extraction and sequencing techniques, but perhaps not if the material consists of skilfully crafted fakes courtesy of Weta Workshop. Some of the fragments appear similar to chimpanzee anatomy, but then Peter Jackson has always tried to make his creatures as realistic as possible. Indeed, he even hired a zoologist to ensure that his King Kong was anatomically correct (I recall hearing that his over-sized gorilla's behind needed some reworking to gain accuracy. Now that's dedication!)

There has also been some rather unscientific behaviour concerning the Homo floresiensis remains, which appears counter to the great care usually associated with such precious relics. At one point the majority of the material was hidden away for three months by one of the Indonesian palaeoanthropologists, only to be returned damaged and missing several pieces. All in all, there is much about the finds to fuel speculation as to their origin.

In summary, if you wanted to promote worldwide interest in anything hobbit-related, what could be better - yet not too obvious? Just how much the joint Australian-Indonesian archaeology and palaeontology team were in the know is perhaps the largest mystery still remaining. I've little doubt that one day the entire venture will be exposed, perhaps in a documentary made by Peter Jackson himself. Now that would definitely be worth watching!

Tuesday 15 March 2016

Pre-teen coding electronic turtles: should children learn computer programming?

Way back in the mists of time when I was studying computer science at senior school, I was part of the first year at my (admittedly rural and far from innovative) school to use actual computers. Previous years had been stuck in the realm of punched tape and other such archaic wonders, so I was lucky to have access to the real thing. Now that we use smartphones with several hundred thousand times more memory than the first computer I owned - a Sinclair ZX Spectrum 48, if you're interested - I'd like to ask: is it worthwhile teaching primary school children programming skills, rather than just how to use front-end interfaces?

I remember being amazed to learn that at about the same time as I was getting to grips with the Sinclair version of BASIC, infants in Japan were already being taught the rudiments of programming via turtle robots and Logo. These days of course, children learn to use digital devices pretty much from the egg, but back then it seemed progressive in the extreme. My elder (but still pre-teen) daughter has so far dabbled with programming, mostly using drag-and-drop interfaces in game coding sessions and at her school's robot club, which involves the ROBOTC language and Vex robots.
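
For anyone wondering what those turtle exercises actually involve, here is a minimal sketch of the idea using Python's built-in turtle module (the language and the polygon example are my own illustrative choices - Logo, Scratch or ROBOTC teach much the same lesson):

    # A Logo-style beginner exercise: steer a 'turtle' around the screen
    # by repeating "move forward, then turn" to draw regular shapes.
    import turtle

    def polygon(pen, sides, length):
        """Draw a regular polygon with the given number of sides."""
        for _ in range(sides):
            pen.forward(length)
            pen.left(360 / sides)

    pen = turtle.Turtle()
    polygon(pen, 4, 100)   # a square
    polygon(pen, 6, 60)    # a hexagon, starting where the square finished
    turtle.done()          # keep the window open until it is closed

The appeal for young learners is the immediate visual feedback: change one number and the shape on screen changes with it.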

Ironically, if I still lived in Britain then my younger daughter would already be learning computer science at school too, as in 2014 the UK Government made the subject mandatory for all children from five years old. Not that this step came easily: apparently there was a struggle in the lead-up to the curriculum change to find enough qualified teachers. Clearly, the effort involved in establishing such a policy suggests the level of importance placed upon it.

In contrast to the UK, New Zealand has slipped behind the educational avant-garde. Digital technology is not a compulsory subject here and many lower-decile schools use old, unsupported software such as the Windows XP operating system. A combination of untrained teachers and parental attitudes is being blamed for a decline in the number of programmers in the country. I know of one Auckland-based technology centre where the hands-on coders are predominantly recruited from overseas and incidentally - unlike those in the less-technical roles - are mostly men. Of course, the shortage could be partly due to the enticement of kiwi developers to the far larger and better-paid job markets in Australia, the UK and the USA, but even so it seems clear that there is a definite deficiency in New Zealand-born programmers.

Luckily, programming is a discipline where motivated children can learn coding for free, with online resources provided by numerous companies all the way up to Google and Microsoft. However, this presupposes both adequate internet access and parental support, or at least approval. If the current generation of parents don't understand the value of the subject, then it's highly unlikely many children will pick up the bug (ahem, that's a programming pun, of sorts).

Compared to the BASIC and Logo languages available in my teenage years, there is now a bewildering array of computer languages, interfaces and projects that teach the rudiments of programming, with colourful audio-visual environments such as Alice, Scratch (a bit like virtual Lego), CiMPLE and Kodu, all intended for a pre-teen audience. Of course, they are far removed from the complexity of professional languages such as the C family or Java - I have to say that Object-Orientated Programming was certainly a bit of a shock for me - but these applications are more about whetting the appetite and generating quick results so as to maintain interest.
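
To give a flavour of that jump in complexity, here is a deliberately trivial example (sketched in Python purely for illustration) of the same task written first in the straight-line style a BASIC veteran would recognise, and then in the object-oriented style that caught me off guard:

    # Straight-line, BASIC-style: a simple list of instructions.
    scores = [7, 9, 4]
    total = 0
    for score in scores:
        total += score
    print("Total:", total)

    # Object-oriented style: the data and its behaviour are bundled into a class.
    class ScoreBoard:
        def __init__(self):
            self.scores = []

        def add(self, score):
            self.scores.append(score)

        def total(self):
            return sum(self.scores)

    board = ScoreBoard()
    for score in [7, 9, 4]:
        board.add(score)
    print("Total:", board.total())

Both versions print the same result; the object-oriented one simply asks the learner to think about structure before output, which is exactly the conceptual leap the child-friendly environments postpone.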

So what are the reasons why learning to code might be a good idea for young children, rather than just teaching them to use software such as the ubiquitous Microsoft Office? Might not the first three or four years at school be better spent learning the traditional basics of reading, writing and arithmetic? After all, this period is crucial to gaining the frameworks of grammar and mathematics, which in their own way provide a solid background for some of the key elements of coding such as syntax, operators and of course spelling!

Apart from the obvious notion that the demand for programmers is likely to increase in the next generation - not just for computers and touch devices, but for all sorts of consumer items from cars to watches (at least until computers become sophisticated enough - and fast enough - for programming in everyday language) - there are benefits and skills useful in the wider world. The following reasons are probably just the tip of the iceberg:
  • It exercises the mind, sharpening analytical thinking and trouble-shooting abilities
  • Coding can be thought of as akin to learning a foreign language or how to read music, so may hone those skills
  • Programming can generate a fruitful combination of creative and mathematical skills, which is difficult to obtain in most other subjects
  • This is the age of information economies, and programming is among the largest employment growth sectors in much of the developed world.

One worrying trend is the decline in the number of female programmers over the past quarter century. Perhaps this isn't surprising in the game coding field, considering that the vast majority of its themes are centered on military and fantasy violence. But then doesn't this extremely popular, highly visible and decidedly lucrative sector of contemporary computing bolster the notion, widespread among women, that leisure-time computing is primarily the domain of socially-inadequate young men?

Research suggests that women consider computers as a tool to aid numerous disciplines, whilst men look upon them more as an end in themselves. Surely learning to use them in depth at an early age could help temper both extremes? Computers - and indeed the increasing number of programmable consumer devices - are not going away any time soon. If the near future of humanity will rely ever more closely on interfacing with these machines, then shouldn't as many of us as possible gain some understanding of what goes on 'under the hood'? After all, there has to be someone out there who can make a less buggy operating system than Windows 10!

Wednesday 24 February 2016

Drowning by numbers: how to survive the information age

2002 was a big year. According to some statistics, it was the year that digital storage capacity overtook analogue: books gave way to online information; binary became king. Or hyperbole to that effect. Between email, social media, websites and the interminable selfie, we are all guilty to a greater or lesser extent of creating data archived in digital format. The human race now generates zettabytes of data every year (a zettabyte being a trillion gigabytes, in case you're still dealing in such minute amounts of data).

So what's so bad about that? More and more we rely on syntheses of information in order to keep up with the exponential growth of knowledge revealed to our species by the scientific and other methods. Counter to Plato's 2,400-year-old dialogue Phaedrus, we can no longer work out everything important for ourselves; instead, we must rely on analysis and results created by other, often long-dead, humans. Even those with superb memories cannot retain more than a minuscule fraction of the information known about even one discipline. In addition, we can now create data-rich types of content undreamed of in Plato's time. Some of it, MRSI medical scans being a case in point, may require long-term storage. If quantum computing becomes mainstream, then that will presumably generate yet another exponential growth in data.

What then, are the primary concerns of living in a society that has such high demands for the creation and safe storage of data? I've been thinking about this for a while now and the following is my analysis of the situation.

1. Storage. In recent years it has become widely known that CDs, and to a lesser extent DVDs, are subject to several forms of disc rot. I've heard horror stories of people putting their entire photo and/or video collection onto portable hard drives, only for these to fail within a year or two, the data being irrevocably lost. The advent of cloud storage lessens the issue, but not completely: servers are still subject to all sorts of problems, with even enterprise-level solutions suffering due to insufficient disaster recovery and resilience (to use the terms we web developers use). I'm not saying audio tapes, vinyl records and VHS were any better - far from it - but there is a lot less data stored in those formats. There are times when good old-fashioned paper still rules, as it does in the legal and compliance sectors I've had contact with.

2. Security and privacy. As for safety, the arms race against hackers et al. is well and truly engaged. Incompetence also has its place: when living in the UK I once received a letter stating that my children's social services records, including their contact details, had become publicly available. This turned out to be due to the loss of a memory stick containing database passwords. As for identity theft, well, let's just say that Facebook is a rude word. I managed to track down an old friend after nearly twenty years incommunicado, finding details such as his address, wife's name and occupation, etc., mostly via Facebook, in less than half an hour. Lucky I'm not a stalker, really!

Even those who avoid social media may find themselves with some form of internet presence. I had a friend who signed a political petition on paper and then, several years later, found his name on a petition website. Let's hope it was the sort of campaign that didn't work against his career - these things can happen.

And then there's the fact that being a consumer means numerous manufacturers and retail outlets will have your personal details on file. I've heard that in some countries if you - and more particularly your smartphone - enter a shopping mall, you may get a message saying that as a loyal customer of a particular store there is a special sale on just for you, the crunch being that you only have a limited time, possibly minutes, to get to the outlet and make a purchase. Okay, that doesn't sound so bad, but the more storage locations that contain your personal details, the greater the chance they will be used against you. Paranoid? No, just careful. Considering how easy it was for me to become a victim of financial fraud about fifteen years ago, I have experience of these things.

As any Amazon customer knows, you are bombarded with offers tailored via your purchase record. How long will it be before smart advertising billboards recognise your presence, as per Steven Spielberg's Minority Report? Yes, it's the merchandiser's dream of ultimate granularity in customer targeting, but it's also a fundamental infringement of the customer's anonymity. Perhaps everyone will end up getting five seconds of public fame on a daily basis, thanks to such devices. Big Brother is truly watching you, even if most of the time it's for the purpose of flogging you consumer junk.

3. Efficiency. There are several million blog posts, several hundred billion emails and half a billion tweets every day. How can we possibly extract the wheat from the chaff (showing my age with that idiom) if we spend so much time ploughing through social media? I, for one, am not convinced there's much worth in a lot of this new-fangled stuff anyway (insert smiley here). I really don't want to know what friends, relatives or celebrities had for breakfast or which humorous cat videos they've just watched. Of course it's subjective, but I think there's a good case for claiming that the vast majority of digital content is a complete load of rubbish. So how can we live useful, worthwhile or even fulfilled lives when surrounded by it? In other words, how do we find the little gems of genuine worth among the flood of noise? It seems highly probable that many prominent nonsense theories, such as the moon landing hoax, wouldn't be anywhere near as popular if it wasn't for the World Wide Web disseminating them.

4. Fatigue and overload. Research has shown that our contemporary news culture (short snippets repeated ad nauseam over the course of a day or so) leads to a weary attitude. Far from empowering us, bombarding everyone with the same information, frequently lacking context, can rapidly lead to antipathy. Besides which, if information is inaccurate in the first place, it can quickly achieve canonical status as it spreads across the digital world. As for the effect all this audio-visual over-stimulation is having on children's attention spans...now where was I?

5. The future. So are there any solutions to these issues? I assume that as we speak there are research projects aiming to develop heuristic programs that act as the electronic equivalent of a personal assistant. If a user carefully builds their personality profile, then the program would be expected to extract nuggets of digital gold from all the sludge. Yet even personally-tailored smart filters that provide daily doses of information, entertainment, commerce and all points in between have their own issues. For example, unless the software is exceptional (i.e. rather more advanced than anything commercially available today) you would probably miss out on laterally- or tangentially-associated content. This sort of serendipity is a great boon to creativity, even for scientists, but is rarely found in any form of machine intelligence. There's also the risk that corporate or governmental forces could bias the programming…or is that just the paranoia returning? All I can say is: knowledge is power.

All in all, this sounds a touch pessimistic. I think Arthur C. Clarke once raised a concern about the inevitable decay within societies that overproduce information. The digital age is centered on the dissemination of content that is both current and popular, but not necessarily optimal. We are assailed by numerous sources of data, often created for purely commercial purposes and rarely for anything of worth. Let's hope we don't end up drowning in videos of pesky kittens. Aw, aren't they cute, though?

Tuesday 26 January 2016

Spreading the word: 10 reasons why science communication is so important

Although there have been science-promoting societies since the Renaissance, most of the dissemination of scientific ideas was played out at royal courts, religious foundations or for similarly elite audiences. Only since the Royal Institution lectures of the early 19th century and such leading lights as Michael Faraday and Sir Humphry Davy has there been any organised communication of the discipline to the general public.

Today, it would appear that there is a plethora - possibly even a glut - of such material on the market. Amazon.com carries over 192,000 popular science books and over 4,000 science documentary DVD titles, so there's certainly plenty of choice! Things have dramatically improved since the middle of the last century when, according to the late evolutionary biologist Stephen Jay Gould, there was essentially no publicly-available material about dinosaurs.

From the ubiquity of the latter (especially since the appearance of Steven Spielberg's original 1993 Jurassic Park) it might appear that most science communication is aimed at children - and, dishearteningly, primarily at boys - but this really shouldn't be so. Just as anyone can take evening courses in everything from pottery to a foreign language, why shouldn't the public be encouraged to understand some of the most important current issues in the fields of science, technology, engineering and mathematics (STEM), at the same time hopefully picking up some of the key methods of the discipline?

As Carl Sagan once said, the public are all too eager to accept the products of science, so why not the methods? It may not be important if most people don't know how to throw a clay pot on a wheel or understand why a Cubist painting looks as it does, but it certainly matters how massive amounts of public money are invested in a project and whether that research has far-reaching consequences.

Here, then, are the points I consider the most important as to why science should be popularised in the most accessible way - although without oversimplifying the material to the point of distortion:

1. Politicians and the associated bureaucracy need a basic understanding of some STEM research, often at the cutting edge, in order to generate new policies. Yet as I have previously examined, few current politicians have a scientific background. If our elected leaders are to make informed decisions, they need to understand the science involved; it sounds obvious, but if the summary material they are supplied with is incorrect or deliberately biased, the outcome may not be the most appropriate one. STEM is hardly small fry: in 2010 the nations with the ten highest research and development budgets had a combined spend of over US$1.2 trillion.

2. If public money is being used for certain projects, then taxpayers are only able to make valid disagreements as to how their money is spent if they understand the research (military R&D excepted of course, since this is usually too hush-hush for the rest of us poor folk to know about). In 1993 the US Government cancelled the Superconducting Super Collider particle accelerator as it was deemed good science but not affordable science. Much as I love the results coming out of the Large Hadron Collider, I do worry that the immense amount of funding (over US$13 billion spent by 2012) might be better used elsewhere on other high-technology projects with more immediate benefits. I've previously discussed both the highs and lows of nuclear fusion research, which surely has to be one of the most important areas in mega-budget research and development today?

3. Criminal law serves to protect the populace from the unscrupulous, but since the speed of scientific advance and technological change runs way ahead of legislation, public knowledge of the issues could help prevent miscarriages of justice, or at least the waste of money. The US public has spent over US$3 billion on homeopathy, despite a 1997 report by the President of the National Council Against Health Fraud that stated "Homeopathy is a fraud perpetrated on the public." Even a basic level of critical thinking might help in the good fight against baloney.

4. Understanding of current developments might lead to reliance as much on the head as the heart. For example, what are the practical versus moral implications of embryonic stem cell research (a topic made especially potent by President Obama's State of the Union pledge to cure cancer)? Or what about the pioneering work in xenotransplantation: could the next few decades see the use of genetically-altered pig hearts to save humans, and if so, would patients with strong religious convictions agree to such transplants?

5. The realisation that much popular journalism is sensationalist and has little connection to reality. The British tabloid press's labelling of genetically-modified crops as 'Frankenstein foods' is typical of the nonsense that clouds complex and serious issues for the sake of high sales. Again, critical thinking might more easily differentiate biased rhetoric from 'neutral' facts.

6. Sometimes scientists can be paid to lie. Remember campaigns with scientific support from the last century that claimed smoking tobacco is good for you or that lead in petrol is harmless? How about the DuPont Corporation refusing to stop CFC production, with the excuse that capitalist profit should outweigh environmental degradation and the resulting increase in skin cancer? Whistle-blowers have often been marginalised by industry-funded scientists (think of the initial reaction to Rachel Carson concerning DDT), so it's doubtful anything other than knowledge of the issues would penetrate the slick corporate smokescreen.

7. Knowing the boundaries of the scientific method - what science can and cannot tell us, and what should be left to other areas of human activity - is key to understanding where the discipline should fit into society. I've already mentioned the moral implications and whether research can be justified by its potential outcome, but conversely, are there habits and rituals, or just societal conditioning, that blind us to what could be achieved through public lobbying of governments?

8. Nations may be enriched as a whole by cutting out nonsense and focusing on solutions to critical issues, for example by not having to waste time and money explaining that global warming and evolution by natural selection are successful working theories backed by a mass of evidence. Notice how uncontroversial most astronomical and dinosaur-related popularisations are; now compare them to those covering the evolution of our own species. Enough said!

9. Improving the public perspective of scientists themselves. The popular consensus still seems to promote the notion of lone geniuses, emotionally removed from the rest of society and frequently promoting their own goals above the general good. Apart from the obvious ways in which this conflicts with other points already stated, much research is undertaken by large, frequently multi-national teams; think Large Hadron Collider, of course. Such knowledge may help remove the juvenile Hollywood science hero (rarely a heroine) and increase support for the sustained efforts that require substantial public funding (nuclear fusion being a perfect example).

10. Reducing parochialism, sectarianism and the conflict associated with them, which if anything appear to be on the increase. It's a difficult issue, and science communication is unlikely to be a key player, but let's face it: any help here must be worth trying. Neil deGrasse Tyson's attitude is worth mentioning: our ideological differences seem untenable against a cosmic perspective. Naïve perhaps, but surely worth the effort?

Last year Bill Gates said: "In science, we're all kids. A good scientist is somebody who has redeveloped from scratch many times the chain of reasoning of how we know what we know, just to see where there are holes." If more of us understood this, isn't there a chance we would notice the holes in other spheres of thought we currently consider unbending? That can only be a good thing, if we wish to survive our turbulent technological adolescence.