Friday 1 April 2016

Hollywood's natural history hobbit hoax: did Peter Jackson create Homo floresiensis for publicity purposes?

Judging by the limited ingredients of contemporary blockbusters, cinema audiences are fairly easy to please. Or are they? Peter Jackson's magnum opus The Lord of the Rings trilogy made an absolute mint at the box office and garnered seventeen Oscar wins besides critical acclaim. In contrast, The Hobbit trilogy failed to win a single Oscar and attracted some rather lukewarm reviews.

The critical indifference and lack of awards have been put down to franchise fatigue, although to be fair, stretching a children's book over three long movies whilst partly improvising the script at a late stage couldn't have helped. So if you are a world-renowned film-maker well aware that many of your fans and much of your peer group judge you on the success - and possibly the quality - of your latest film, it wouldn't be surprising if you went to great lengths to maximise that success. Just how far Peter Jackson went for The Hobbit trilogy is astounding...so read on...

It's been some years since I visited the Weta Cave in Wellington, where close-up views of various costumes and props from movies including the LOTR trilogy leave you in no doubt about the superb workmanship the effects house is capable of. Some of the exhibits and merchandise included non-human characters from Middle Earth and District 9, the quality of which got me thinking. Peter Jackson is known to have visited the Natural History Museum when in London recording the soundtrack for The Lord of the Rings. This in itself is not suspect, except that the museum was at the time hosting an exhibition about the infamous Piltdown Man.

For anyone who knows anything about science scandals, Piltdown Man has to be among the most notorious. The discovery in southern England of a hominin skull of unknown species - the first fragments supposedly unearthed in 1908 - was followed by numerous associated finds, all touted as genuine by professional scientists. In fact, by 1913 some palaeontologists had already suggested what was finally confirmed forty years later: the entire assemblage was a fraud, the skull having been paired with an orangutan jawbone complete with filed-down teeth! The fact that so many specialists authenticated the remains is bizarre, although it may be that patriotic wishful thinking (keen to confirm that prehistoric hominins had lived in Britain) overrode any semblance of impartiality.

Back to Peter Jackson and his hobbit conundrum. Although the LOTR trilogy did the bums-on-seats business (that's an industry term, in case you were wondering), Jackson's next film was the 2005 King Kong remake. Included in the record-breaking US$207 million production costs was a $32 million overspend for which the director himself was personally responsible. Having already had the project put into turnaround (that's cold feet in Hollywoodese) in the previous decade, Jackson was determined to complete the film to his own exacting standards, hence the financial woes surrounding the production.

So just how do you get the massive budget to make a prequel trilogy with a less involved storyline (sound vaguely familiar, Star Wars fans?) directly after you've made the most expensive film in history, which is not even a remake but a second remake? How about generating tie-in publicity that transfers from the real world to Middle Earth?

Around the time that Peter Jackson's production company Three Foot Six was being renamed (or if you prefer, upgraded) to Three Foot Seven, worldwide headlines announced the discovery of a small-statured hominin of almost exactly this height. The first of the initial nine specimens found on the island of Flores, labelled LB1, would have been a mere 1.06 metres tall when alive: three feet six inches, give or take a few millimetres.
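Don't take my word for the conversion; here's a quick back-of-the-envelope check in Python (the height is the published figure, the code merely illustrative):

    # Converting LB1's estimated height to feet and inches
    height_m = 1.06
    total_inches = height_m / 0.0254           # 1 inch is exactly 0.0254 m
    feet, inches = divmod(total_inches, 12)
    print(f"{height_m} m = {int(feet)} ft {inches:.1f} in")
    # prints: 1.06 m = 3 ft 5.7 in - near enough three foot six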

Coincidence? When in doubt, adherents of the scientific method should follow the principle of parsimony, a.k.a. Occam's razor - which in this case has led to me putting on my conspiracy hat.

Consider this: the new species rapidly became far better known by its nickname, the 'hobbit people', than as Homo floresiensis. Which was handy for anyone about to spend US$225 million on three films involving hobbits. In addition, it was discovered at the perfect time for Jackson to get maximum publicity (admittedly not at the release of the first hobbit film, but perfect for convincing his American backers of the audience anticipation).

The smoking gun evidence for me is the almost comical resemblance the remains bear to Tolkien's creations. For example, the feet are said to be far longer and flatter than those of any other known hominin species. Remind you of anything you've seen at the movies? It's just a shame that hair doesn't survive as long as the alleged age of the specimens, which, based on the stratigraphy, has been estimated at between 94,000 and 13,000 years ago.

In addition, how could such creatures have built the bamboo rafts or dug-out boats necessary to reach the island in the first place? Even when sea levels dropped during glacial periods, Flores remained convincingly isolated from the mainland. Braincase analysis shows that Homo floresiensis had an orange-sized brain, and since the tools found with the semi-petrified organic remains were simple stone implements, the idea of real-life hobbits sailing the high seas appears absurd in the extreme.

Several teams have attempted to extract DNA from the water-logged and delicate material, but after a decade's effort none has been successful. This seems surprising, considering the quality of contemporary genetic sequencing techniques, but perhaps not if the material consists of skilfully crafted fakes courtesy of Weta Workshop. Some of the fragments appear similar to chimpanzee anatomy, but then Peter Jackson has always tried to make his creatures as realistic as possible. Indeed, he even hired a zoologist to ensure that his King Kong was anatomically correct (I recall hearing that his over-sized gorilla's behind needed reworking for the sake of accuracy. Now that's dedication!)

There has also been some rather unscientific behaviour concerning the Homo floresiensis remains which appears counter to the great care usually associated with such precious relics. At one point, the majority of the material was hidden away for three months by one of the Indonesian palaeoanthropologists, only to be returned damaged and missing several pieces. All in all, there is much about the finds to fuel speculation as to their origin.

In summary, if you wanted to promote worldwide interest in anything hobbit-wise, what could be better yet not too obvious? Just how much the joint Australian-Indonesian archaeology and palaeontology team were in the know is perhaps the largest mystery still remaining. I've little doubt that one day the entire venture will be exposed, perhaps in a documentary made by Peter Jackson himself. Now that would definitely be worth watching!

Tuesday 15 March 2016

Pre-teen coding electronic turtles: should children learn computer programming?

Way back in the mists of time when I was studying computer science at senior school, I was part of the first year at my (admittedly rural and far from innovative) school to use actual computers. Previous years had been stuck in the realm of punched tape and other such archaic wonders, so I was lucky to have access to the real thing. Now that we use smartphones with several hundred thousand times more memory than the first computer I owned - a Sinclair ZX Spectrum 48K, if you're interested - I'd like to ask: is it worthwhile teaching primary school children programming skills, rather than just how to use front-end interfaces?
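That memory comparison sounds like hyperbole, so here's a rough sanity check in Python. The phone figure is an assumption on my part (16 GB being a typical handset capacity at the time of writing); the Spectrum's 48 KB is a matter of record:

    # Rough ratio of smartphone storage to a ZX Spectrum 48K's RAM
    spectrum_ram = 48 * 1024             # 48 KB in bytes
    phone_storage = 16 * 1024 ** 3       # assumed 16 GB handset, in bytes
    print(f"About {phone_storage // spectrum_ram:,} times more memory")
    # prints: About 349,525 times more memory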

I remember being amazed to learn that about the same time as I was getting to grips with the Sinclair version of BASIC, infants in Japan were already being taught the rudiments of programming via turtle robots and Logo. These days of course, children learn to use digital devices pretty much from the egg, but back then it seemed progressive in the extreme. My elder (but still pre-teen) daughter has so far dabbled with programming, mostly using drag and drop interfaces in game coding sessions and at her school's robot club, which involves the ROBOTC language and Vex robots.
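For anyone who hasn't met turtle graphics, the idea survives today in Python's built-in turtle module. A minimal sketch - not the Logo my Japanese contemporaries would have used, but exactly the same concept - looks like this:

    import turtle

    pen = turtle.Turtle()

    # Draw a square: move forward, turn 90 degrees, repeat four times
    for _ in range(4):
        pen.forward(100)    # move 100 units in the current direction
        pen.left(90)        # rotate 90 degrees anticlockwise

    turtle.done()           # keep the window open until it is closed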

Ironically, if I still lived in Britain then my younger daughter would already be learning computer science at school too, as in 2014 the UK Government made the subject mandatory for all children from five years old. Not that this step came easily: apparently there was a struggle in the lead-up to the curriculum change to find enough qualified teachers. Clearly, the effort involved in establishing such a policy suggests the level of importance placed upon it.

In contrast to the UK, New Zealand has slipped behind the educational avant-garde. Digital technology is not a compulsory subject here and many lower-decile schools use old, unsupported software such as the Windows XP operating system. A combination of untrained teachers and parental attitudes is being blamed for a decline in the number of programmers in the country. I know of one Auckland-based technology centre where the hands-on coders are predominantly recruited from overseas and incidentally - unlike those in the less technical roles - are mostly men. Of course, the shortage could be partly due to the enticement of kiwi developers to the far larger and better-paid job markets in Australia, the UK and the USA, but even so it seems clear that there is a definite deficiency in New Zealand-born programmers.

Luckily, programming is a discipline where motivated children can learn to code for free, with online resources provided by numerous companies, all the way up to Google and Microsoft. However, this presupposes both adequate internet access and parental support, or at least approval. If the current generation of parents don't understand the value of the subject, then it's highly unlikely many children will pick up the bug (ahem, that's a programming pun, of sorts).

Compared to the BASIC and Logo languages available in my teenage years, there is now a bewildering array of computer languages, interfaces and projects that teach the rudiments of programming, with colourful audio-visual interfaces such as Alice, Scratch (a bit like virtual Lego), CiMPLE and Kodu, all intended for a pre-teen audience. Of course, they are far removed from the complexity of professional languages such as the C family or Java - I have to say that object-oriented programming was certainly a bit of a shock for me - but these applications are more about whetting the appetite and generating quick results so as to maintain interest.
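To give a flavour of that jump, here is a deliberately simple sketch in Python (my own choice of language and example, purely for illustration) contrasting beginner-style instructions with the object-oriented approach that gave me such a shock:

    # Beginner style: one direct instruction after another
    print("Hello, Middle Earth!")

    # Object-oriented style: data and behaviour bundled into a class
    class Hobbit:
        def __init__(self, name, height_m):
            self.name = name              # data ('attributes') live inside the object
            self.height_m = height_m

        def greet(self):                  # behaviour ('methods') lives alongside the data
            print(f"{self.name} is {self.height_m} m tall and says hello!")

    bilbo = Hobbit("Bilbo", 1.07)
    bilbo.greet()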

So what are the reasons why learning to code might be a good idea for young children, rather than just teaching them to use software such as the ubiquitous Microsoft Office? Might not the first three or four years at school be better spent learning the traditional basics of reading, writing and arithmetic? After all, this period is crucial to gaining the frameworks of grammar and mathematics, which in their own way provide a solid background for some of the key elements of coding, such as syntax, operators and of course spelling!
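And spelling is no joke in this context. A hypothetical beginner's stumble in Python shows how unforgiving computers are about it, and how operators obey the very precedence rules taught in maths lessons:

    total = 2 + 3 * 4       # operator precedence: multiplication first, so 14 rather than 20
    print(total)

    # Uncommenting the next line halts the program with
    # "NameError: name 'totle' is not defined" - one misspelt letter is all it takes.
    # print(totle)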

Apart from the obvious notion that the demand for programmers is likely to increase in the next generation - not just for computers and touch devices, but for all sorts of consumer items from cars to watches (at least until computers become sophisticated enough, and fast enough, to be programmed in everyday language) - there are benefits and skills useful in the wider world. The following reasons are probably just the tip of the iceberg:
  • It exercises the mind, sharpening analytical thinking and trouble-shooting abilities
  • Coding can be thought of as akin to learning a foreign language or how to read music, so may hone those skills
  • Programming can generate a fruitful combination of creative and mathematical skills, which is difficult to obtain in most other subjects
  • This is the age of information economies, so programming is one of the largest employment growth sectors in much of the developed world

One worrying trend is the decline in the number of female programmers over the past quarter of a century. Perhaps this isn't surprising in the game coding field, considering that the vast majority of its themes are centred on military and fantasy violence. But then doesn't this extremely popular, highly visible and decidedly lucrative sector of contemporary computing bolster the notion, widespread among women, that leisure-time computing is primarily the domain of socially-inadequate young men?

Research suggests that women consider computers a tool to aid numerous disciplines, whilst men look upon them more as an end in themselves. Surely learning to use them in depth at an early age could help foster a more balanced attitude than either extreme? Computers - and indeed the increasing number of programmable consumer devices - are not going away any time soon. If the near future of humanity will rely ever more closely on interfacing with these machines, then shouldn't as many of us as possible gain some understanding of what goes on 'under the hood'? After all, there has to be someone out there who can make a less buggy operating system than Windows 10!