Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo their usage of anti-bacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use at work, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some microbes are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot desking, so something is clearly going wrong, regardless of approach!

The best-known of the super bugs has to be methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks well away from healthcare environments. In other words, popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter community-associated form (CA-MRSA) is therefore at least as great a risk as the hospital variant, often affecting younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to eradicate totally by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice, I might add, to anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be dosed as a precaution - frankly I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can be used to promote growth implies profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics: four in 1999 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive approach would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough anymore; after all, if medical practice wants to prevent the spread of resistant bacteria it needs to minimise the use of antibiotics.

The effects agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its anti-bacterial properties - guess that proves I'm not a research chemist, then!) In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Tuesday 21 June 2016

Military intelligence: how martial sci-tech does science few favours

I recently read an article about the USA's latest aircraft carrier, the USS Gerald R. Ford, that contained two bewildering facts: that at a combined research and construction cost of around US$18 billion it is the most expensive warship ever built; and that although it is only the first of three ships to be built in the class - and has an intended lifespan of half a century - it may already be obsolete.

So if potential aggressor nations now have the anti-ship missile technology to sink the carrier, is it little more than an enormous waste of taxpayer funds? There are reports of war games and simulations over the past three decades which fundamentally undermine the Victorian notion of technological progress - that bigger, stronger, faster equals better. This is particularly apt if your opponent uses 'unfair' and/or 'underhand' tactics such as stealth systems and guerrilla strategies. Then why are these colossal projects still being funded?

The USS Gerald R. Ford is merely the (admittedly very large) tip of an enormous iceberg concerning military expenditure of recent decades. Just to drive the point home, here's a few other recent examples:
  1. The US Navy's aircraft carrier version of the Lightning II Joint Strike Fighter is the F-35C, with some estimates suggesting each combat-ready aircraft costs up to $337 million.
  2. The US Air Force's F-22 Raptor programme was shut down after only 187 operational aircraft were built, as the price per airframe was even higher, around $350 million.
  3. The apotheosis of combat aircraft has to be the B-2 Spirit stealth bomber. Only 21 were ever built, at a whopping $737 million each, excluding the research and development costs, which may double or even triple this number.
  4. So as not to seem unfairly biased against the USA, other nations also have their share of military expenditure. For example, South Korea's K2 Black Panther is the most expensive main battle tank ever built, with per-unit costs of US$8.5 million.
So who's to blame for all this? The USS Gerald R. Ford, for example, was approved during George W. Bush's administration but is only nearing completion eight years after he left office. At least in democracies, politicians usually come and go in less than a decade, whilst defence contractors last much longer. Could the armaments sector be duping administrations into giving them a lifeline? A large proportion of manufacturing has migrated to developing nations, but due to the sensitive nature of the sector, advanced military technology is one of the few areas still concentrated within the developed West.

It's difficult to collate anything like exact figures, but the proportion of STEM (Science, Technology, Engineering and Mathematics) professionals worldwide who work on military projects is frequently given as 20% to 25%. Is it feasible that this high level of involvement in an area that is both secretive and horrendously expensive may be counter-productive to the public's attitude to science in general?

After all, no other sector has access to such enormous amounts of taxpayers' funds without being subject to some form of public scrutiny. Then again, since the early 1980s we have been sold a vision of military technology that is a mostly one-sided glorification of armaments and of the requirement for ever-increasing expenditure in the name of freedom.

How many mainstream Hollywood movies since 1986's Top Gun - including plenty of sci-fi epics - can be seen as glossy advertisements for advanced weaponry? It may seem odd considering the conventional portrayal of movie scientists, but homages to the military-industrial complex show little sign of abating. The sophistication of the technology is praised, whilst those who develop it are damned as untrustworthy schemers outside mainstream society. It's a curious phenomenon!

However, developing advanced technology for military purposes is hardly new. The ancient Greek Archimedes developed anti-ship devices whilst Leonardo da Vinci wrote effusive letters to prospective patrons about his land, sea and even aerial weapons, albeit some were of dubious practicality.

Today's society is supposedly more refined than those earlier times, yet whilst a concerted effort is being made to attract more women to STEM subjects, the macho nature of armaments presumably ensures the sector remains male-dominated. If proof were needed of the interest in all things explosive, the global success of the TV show Mythbusters should be a good indicator. And if an example of the crazy nature of unrestrained masculinity needs delineating, then how about atomic bomb pioneer Edward Teller's promotion of nuclear devices for civil engineering projects? For every J. Robert Oppenheimer there were far more Tellers.

It isn't just the sheer cost of contemporary military projects that can draw the ire of taxpayers. There have been some almost farcical instances of under-performance, such as the degradation of the B-2's anti-radar coating by high levels of humidity (never mind rain). It's easy to blame the scientists and engineers in such circumstances; after all, the politicians and generals leave the cutting-edge technology to the experts! But talk about over-promise and under-deliver...

One area that presumably didn't exist before the Twentieth Century's development of weapons of mass destruction - and that cannot be blamed on STEM professionals - is the deliberate use of civilians as guinea pigs. From the US and British atomic bomb tests that affected local populations as well as military personnel, to the cloud-seeding experiments over heavily-populated areas that may have led to fatal downpours, it seems no-one is safe from their own armed forces.

Of course, a large proportion of the degradation of the image of scientists as authority figures may have occurred during the Cold War, when it became apparent that the military technocrats of the period had earned their reputation as 'architects of the apocalypse'. There's obviously a lot of complexity around this issue. Arguments range back and forth on such topics as why, once the Apollo moon landings had proved America's technological superiority over the Soviet Union, the project was rapidly wound up; or how the more right-wing elements of society felt when that same know-how was stalemated by markedly inferior forces in Vietnam.

The space shuttle was another victim of military requirements, the orbiter's unprecedented size being needed for the then-large spy satellites - and for the intention to fly two of them from Vandenberg Air Force Base on 'shadow' missions. In a sense, the military could be seen to have had their fingers in many leading but nominally civilian pies.

This isn't to say that there haven't been productive examples of military technology modified for civilian usage, from early manned spacecraft launched on adapted ICBMs to the ARPANET providing a foundation for the Internet.

Even so, it is easy to look at the immense worldwide expenditure on weapons development and wonder what could be achieved if even a few percent of that funding was redirected elsewhere. There's no doubt about it: the sheer quantity, sophistication and expense of modern military hardware raise some legitimate public concerns as to the role of science and technology in the name of 'defence'. Especially if $18 billion worth of aircraft carrier is little more than a showy piece of machismo that belongs to the last half century, not the next.

Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al., there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Incidentally, isn't this bizarre in itself: it's not just that we consider ourselves so much more rational than all other animals, but that the human brain is the most complex object in the known universe. That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near to the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practiced. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but much imitated still in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then scientists made a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering that science was then, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebutted, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery shows there is still plenty to be in awe of.

Mainstream cinema frequently paints a very 'A versus B' picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity can live alongside its Newtonian predecessor. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems a shame that such a ubiquitous form of entertainment consistently portrays so little sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters, planning to take over the world...

Friday 1 April 2016

Hollywood's natural history hobbit hoax: did Peter Jackson create Homo floresiensis for publicity purposes?

Judging by the limited ingredients of contemporary blockbusters, cinema audiences are fairly easy to please. Or are they? Peter Jackson's magnum opus The Lord of the Rings trilogy made an absolute mint at the box office and garnered seventeen Oscar wins besides critical acclaim. In contrast, The Hobbit trilogy received but a single Oscar accompanying some rather lukewarm reviews.

The critical indifference and lack of awards have been put down to franchise fatigue, although to be fair, stretching a children's book over three long movies whilst partly improvising the script at a late stage couldn't have helped. So if you are a world-renowned film maker well aware that you are judged by many of your fans and much of your peer group on the success - and possibly the quality - of your latest film, it wouldn't be surprising if you went to great lengths to maximise that success. Just how far Peter Jackson went for The Hobbit trilogy is astounding...so read on...

It's been some years since I visited Weta Cave in Wellington, where close-up views of various costumes and props from movies including the LOTR trilogy leave you in no doubt about the superb workmanship the effects house is capable of. Some of the exhibits and merchandise included non-human characters from Middle Earth and District 9, the quality of which got me thinking. Peter Jackson is known to have visited the Natural History Museum when in London recording the soundtrack for The Lord of the Rings. This in itself is not suspect, except that the museum was at the time hosting an exhibition about the infamous Piltdown Man.

For anyone who knows anything about science scandals, Piltdown Man has to be among the most notorious. The 1908 discovery in southern England of a hominin skull of unknown species was rapidly followed by numerous associated finds, all touted as genuine by professional scientists. In fact, by 1913 some palaeontologists had already suggested what was finally confirmed forty years later: the entire assemblage was a fraud, the skull itself incorporating an orang-utan jawbone with filed-down teeth! The fact that so many specialists authenticated the remains is bizarre, although it may be that patriotic wishful thinking (to confirm prehistoric hominins had lived in Britain) overrode any semblance of impartiality.

Back to Peter Jackson and his hobbit conundrum. Although the LOTR trilogy did the bums-on-seats business (that's an industry term, in case you were wondering), Jackson's next film was the 2005 King Kong remake. Included in the record-breaking US$207 million production costs was a $32 million overspend for which the director himself was personally responsible. With the project having already been put into turnaround (that's cold feet in Hollywoodese) in the previous decade, Jackson was determined to complete the film to his own exacting standards, hence the financial woes surrounding the production.

So just how do you get the massive budget to make a prequel trilogy that's got a less involved storyline (sound vaguely familiar, Star Wars fans?) directly after you've made the most expensive film in history, which is not even a remake but a second remake? How about generating tie-in publicity to transfer from the real world to Middle Earth?

Around the time that Peter Jackson's production company Three Foot Six was being renamed (or if you prefer, upgraded) to Three Foot Seven, worldwide headlines announced the discovery of a small-statured hominin of just this height. The first of the initial nine specimens found on the island of Flores, labelled LB1, would have been a mere 1.06 metres tall when alive, which is three feet six inches give or take a few millimetres.
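(For the pedants among us - and purely as an illustration - the comparison is easy enough to check with a few lines of Python; the only figures used below are the ones quoted above.)

    # Rough check: how close is the reported 1.06 m to three feet six inches?
    lb1_height_m = 1.06                        # reported height of the LB1 specimen
    three_foot_six_m = (3 * 12 + 6) * 0.0254   # 42 inches expressed in metres
    difference_mm = (three_foot_six_m - lb1_height_m) * 1000
    print(f"3'6\" = {three_foot_six_m:.4f} m; difference = {difference_mm:.0f} mm")
    # prints roughly: 3'6" = 1.0668 m; difference = 7 mm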

Coincidence? When in doubt, adherents of scientific methods should follow the principle of parsimony, A.K.A. Occam's razor. Which in this case has led to me putting my conspiracy hat on.

Consider this: the new species rapidly became far better known by its nickname, the 'hobbit people', than as Homo floresiensis. Which was handy for anyone about to spend US$225 million on three films involving hobbits. In addition, it was discovered at the perfect time for Jackson to get maximum publicity (admittedly not for the release of the first hobbit film, but for the purposes of convincing his American backers of the audience anticipation).

The smoking gun evidence for me is the almost comical resemblance the remains bear to Tolkien's creations. For example, the feet are said to be far longer and flatter than any other known hominin species. Remind you of anything you've seen at the movies? It's just a shame that hair doesn't survive as long as the alleged age of the specimens - which based on the stratigraphy has been estimated from 94,000 to 13,000 years ago.

In addition, how could such creatures have built the bamboo rafts or dug-out boats necessary to reach the island in the first place? When sea levels dropped during glaciation periods Flores was still convincingly isolated from the mainland. Braincase analysis shows that Homo floresiensis had an orange-sized brain. Since the tools found with the semi-petrified organic remains were simple stone implements, the idea of real-life hobbits sailing the high seas appears absurd in the extreme.

Several teams have attempted to extract DNA from the water-logged and delicate material, but after a decade's effort none have been successful. This seems surprising, considering the quality of contemporary genetic replication techniques, but perhaps not if the material consists of skilfully crafted fakes courtesy of Weta Workshop. Some of the fragments appear similar to chimpanzee anatomy, but then Peter Jackson has always tried to make his creatures as realistic as possible. Indeed, he even hired a zoologist to ensure that his King Kong was anatomically correct (I recall hearing that part of his over-sized gorilla's behind needed reworking to gain accuracy. Now that's dedication!)

There has also been some rather unscientific behaviour concerning the Homo floresiensis remains, which appears counter to the great care usually associated with such precious relics. At one point the majority of the material was kept hidden for three months by one of the Indonesian palaeoanthropologists, and what was eventually returned came back damaged and missing several pieces. All in all, there is much about the finds to fuel speculation as to their origin.

In summary, if you wanted to promote worldwide interest in anything hobbit-wise, what could be better yet not too obvious? Just how much the joint Australian-Indonesian archaeology and palaeontology team were in the know is perhaps the largest mystery still remaining. I've little doubt that one day the entire venture will be exposed, perhaps in a documentary made by Peter Jackson himself. Now that would definitely be worth watching!

Tuesday 15 March 2016

Pre-teen coding electronic turtles: should children learn computer programming?

Way back in the mists of time when I was studying computer science at senior school, I was part of the first year at my (admittedly rural and far from innovative) school to use actual computers. Previous years had been stuck in the realm of punched tape and other such archaic wonders, so I was lucky to have access to the real thing. Now that we use smartphones with several hundred thousand times more memory than the first computer I owned - a Sinclair ZX Spectrum 48, if you're interested - I'd like to ask: is it worthwhile teaching primary school children programming skills, rather than just how to use front-end interfaces?

I remember being amazed to learn that about the same time as I was getting to grips with the Sinclair version of BASIC, infants in Japan were already being taught the rudiments of programming via turtle robots and Logo. These days of course, children learn to use digital devices pretty much from the egg, but back then it seemed progressive in the extreme. My elder (but still pre-teen) daughter has so far dabbled with programming, mostly using drag and drop interfaces in game coding sessions and at her school's robot club, which involves the ROBOTC language and Vex robots.
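For anyone who never met Logo, the same turtle idea lives on in Python's standard-library turtle module. Purely as an illustration, the classic first exercise - drawing a square - might look something like this:

    import turtle                # Python's built-in homage to Logo's turtle graphics

    t = turtle.Turtle()          # the 'robot' becomes a cursor on screen
    for _ in range(4):           # a square: four sides, four turns
        t.forward(100)           # move forward 100 units, drawing a line
        t.left(90)               # turn 90 degrees anticlockwise
    turtle.done()                # keep the window open until it is closed

Even at this trivial level, the core ideas of sequence, repetition and unambiguous instructions are all present.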

Ironically, if I still lived in Britain then my younger daughter would already be learning computer science at school too, as in 2014 the UK Government made the subject mandatory for all children from the age of five. Not that this step came easily: apparently there was a struggle in the lead-up to the curriculum change to find enough qualified teachers. Clearly, the effort involved in establishing such a policy suggests the level of importance placed upon it.

In contrast to the UK, New Zealand has slipped behind the educational avant-garde. Digital technology is not a compulsory subject here and many lower-decile schools use old, unsupported software such as the Windows XP operating system. A combination of untrained teachers and parental attitudes is being blamed for a decline in the number of programmers in the country. I know of one Auckland-based technology centre where the hands-on coders are predominantly recruited from overseas and incidentally - unlike those in the less-technical roles - are mostly men. Of course, the shortage could be partly due to the enticement of kiwi developers to the far larger and better-paid job markets in Australia, the UK and the USA, but even so it seems clear that there is a definite deficiency in New Zealand-born programmers.

Luckily, programming is a discipline where motivated children can learn coding for free, with online resources provided by numerous companies all the way up to Google and Microsoft. However, this presupposes both adequate internet access and parental support, or at least approval. If the current generation of parents don't understand the value of the subject, then it's highly unlikely many children will pick up the bug (ahem, that's a programming pun, of sorts.)

Compared to the BASIC and Logo languages available in my teenage years, there is now a bewildering array of computer languages, interfaces and projects that teach the rudiments of programming, with colourful audio-visual interfaces such as Alice, Scratch (a bit like virtual Lego), CiMPLE and Kodu, all intended for a pre-teen audience. Of course, they are far removed from the complexity of professional languages such as the C family or Java - I have to say that Object-Oriented Programming was certainly a bit of a shock for me - but these applications are more about whetting the appetite and generating quick results so as to maintain interest.

So what are the reasons why learning to code might be a good idea for young children, rather than just teaching them to use software such as the ubiquitous Microsoft Office? Might not the first three or four years at school be better spent learning the traditional basics of reading, writing and arithmetic? After all, this period is crucial to gaining the frameworks of grammar and mathematics, which in their own way provide a solid background for some of the key elements of coding such as syntax, operators and of course spelling!
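To make that overlap concrete, here is a deliberately trivial, made-up example (the names and numbers are purely illustrative): a classroom arithmetic problem rewritten in Python, where the arithmetic becomes operators and the 'grammar' becomes syntax - misspell a single keyword and the program simply refuses to run.

    # A classroom-style word problem, written as code (all values invented for illustration)
    apples_per_child = 3
    children = 7
    total_apples = apples_per_child * children    # '*' is the multiplication operator
    print("Total apples:", total_apples)          # misspell 'print' and Python reports an error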

Apart from the obvious notion that the demand for programmers is likely to increase in the next generation - not just for computers and touch devices, but for all sorts of consumer items from cars to watches (at least until computers become sophisticated enough - and fast enough - for programming in everyday language) - there are benefits and skills useful in the wider world. The following reasons are probably just the tip of the iceberg:
  • It exercises the mind, sharpening analytical thinking and trouble-shooting abilities
  • Coding can be thought of as akin to learning a foreign language or how to read music, so may hone those skills
  • Programming can generate a fruitful combination of creative and mathematical skills, which is difficult to obtain in most other subjects
  • This is the age of information economies, and programming is among the largest employment growth sectors in much of the developed world.
One worrying trend is the decline in the number of female programmers over the past quarter century. Perhaps this isn't surprising in the game coding field, considering that the vast majority of its themes are centred on the military and fantasy violence. But then doesn't this extremely popular, highly visible and decidedly lucrative sector of contemporary computing bolster the notion, widespread among women, that leisure-time computing is primarily the domain of socially-inadequate young men?

Research suggests that women tend to consider computers as a tool to aid numerous disciplines, whilst men look upon them more as an end in themselves. Surely learning to use them in depth at an early age could help move attitudes away from either extreme? Computers - and indeed the increasing number of programmable consumer devices - are not going away any time soon. If the near future of humanity will rely ever more closely on interfacing with these machines, then shouldn't as many of us as possible gain some understanding of what goes on 'under the hood'? After all, there has to be someone out there who can make a less buggy operating system than Windows 10!