Tuesday 15 March 2016

Pre-teen coding electronic turtles: should children learn computer programming?

Way back in the mists of time when I was studying computer science at senior school, I was part of the first year at my (admittedly rural and far from innovative) school to use actual computers. Previous years had been stuck in the realm of punched tape and other such archaic wonders, so I was lucky to have access to the real thing. Now that we use smartphones with several hundred thousand times more memory than the first computer I owned - a Sinclair ZX Spectrum 48, if you're interested - I'd like to ask: is it worthwhile teaching primary school children programming skills, rather than just how to use front-end interfaces?

I remember being amazed to learn that at about the same time as I was getting to grips with the Sinclair version of BASIC, infants in Japan were already being taught the rudiments of programming via turtle robots and Logo. These days, of course, children learn to use digital devices pretty much from the egg, but back then it seemed progressive in the extreme. My elder (but still pre-teen) daughter has so far dabbled with programming, mostly using drag-and-drop interfaces in game coding sessions and at her school's robot club, which involves the ROBOTC language and VEX robots.
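
For anyone who never met Logo, its appeal was how little it took to make something visible happen on screen (or on the floor, in the robot turtle's case). As a rough illustration - written here with Python's built-in turtle module, a direct descendant of Logo, rather than in Logo itself - a complete square takes only a handful of lines:

    # A minimal, illustrative sketch: the classic "draw a square" exercise,
    # using Python's built-in turtle module. The Logo original would be
    # REPEAT 4 [FORWARD 100 RIGHT 90].
    import turtle

    pen = turtle.Turtle()
    for _ in range(4):        # four sides, four right-angle turns
        pen.forward(100)      # move 100 units forward, drawing a line
        pen.right(90)         # turn 90 degrees clockwise

    turtle.done()             # keep the window open until it is closed

Immediate, visible results from a few readable lines - which is exactly the hook these teaching tools rely on.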

Ironically, if I still lived in Britain then my younger daughter would already be learning computer science at school too, as in 2014 the UK Government made the subject mandatory for all children from five years old. Not that this step came easily: apparently there was a struggle in the lead-up to the curriculum change to find enough qualified teachers. Clearly, the effort involved in establishing such a policy suggests how much importance was placed upon it.

In contrast to the UK, New Zealand has slipped behind the educational avant-garde. Digital technology is not a compulsory subject here and many lower-decile schools use old, unsupported software such as the Windows XP operating system. A combination of untrained teachers and parental attitudes is being blamed for a decline in the number of programmers in the country. I know of one Auckland-based technology centre where the hands-on coders are predominantly recruited from overseas and, incidentally - unlike those in the less-technical roles - are mostly men. Of course, the shortage could be partly due to the enticement of Kiwi developers to the far larger and better-paid job markets in Australia, the UK and the USA, but even so it seems clear that there is a definite shortfall in New Zealand-born programmers.

Luckily, programming is a discipline that motivated children can learn for free, with online resources provided by numerous companies all the way up to Google and Microsoft. However, this presupposes both adequate internet access and parental support, or at least approval. If the current generation of parents don't understand the value of the subject, then it's highly unlikely many children will pick up the bug (ahem, that's a programming pun of sorts).

Compared to the BASIC and Logo languages available in my teenage years, there is now a bewildering array of computer languages, interfaces and projects that teach the rudiments of programming, with colourful audio-visual interfaces such as Alice, Scratch (a bit like virtual Lego), CiMPLE, Kodu, etc., all intended for a pre-teen audience. Of course, they are far removed from the complexity of professional languages such as the C family or Java - I have to say that object-oriented programming was certainly a bit of a shock for me - but these applications are more about whetting the appetite and generating quick results so as to maintain interest.

So why might learning to code be a good idea for young children, rather than just teaching them to use software such as the ubiquitous Microsoft Office? Might not the first three or four years at school be better spent learning the traditional basics of reading, writing and arithmetic? After all, this period is crucial to gaining the frameworks of grammar and mathematics, which in their own way provide a solid background for some of the key elements of coding, such as syntax, operators and of course spelling!

Apart from the obvious notion that the demand for programmers is likely to increase in the next generation - not just for computers and touch devices, but for all sorts of consumer items from cars to watches (at least until computers become sophisticated enough - and fast enough - for programming in everyday language) - there are benefits and skills useful in the wider world. The following reasons are probably just the tip of the iceberg:
  • It exercises the mind, sharpening analytical thinking and trouble-shooting abilities
  • Coding can be thought of as akin to learning a foreign language or how to read music, so it may hone those skills
  • Programming can generate a fruitful combination of creative and mathematical skills, which is difficult to obtain in most other subjects
  • This is the age of information economies, and programming is among the fastest-growing employment sectors in much of the developed world.

One worrying trend is the decline in the number of female programmers over the past quarter-century. Perhaps this isn't surprising in the game coding field, considering that the vast majority of its themes are centred on the military and fantasy violence. But then doesn't this extremely popular, highly visible and decidedly lucrative sector of contemporary computing bolster the notion, widespread among women, that leisure-time computing is primarily the domain of socially-inadequate young men?

Research suggests that women regard computers as a tool to aid numerous disciplines, whilst men look upon them more as an end in themselves. Surely learning to use them in depth at an early age could help temper both extremes? Computers - and indeed the increasing number of programmable consumer devices - are not going away any time soon. If the near future of humanity will rely ever more closely on interfacing with these machines, then shouldn't as many of us as possible gain some understanding of what goes on 'under the hood'? After all, there has to be someone out there who can make a less buggy operating system than Windows 10!

Wednesday 24 February 2016

Drowning by numbers: how to survive the information age

2002 was a big year. According to some statistics, it was the year that digital storage capacity overtook analogue: books gave way to online information; binary became king. Or hyperbole to that effect. Between email, social media, websites and the interminable selfie, we are all guilty to a greater or lesser extent of creating data archived in digital format. The human race now generates zettabytes of data every year (a zettabyte being a trillion gigabytes, in case you're still dealing in such minute amounts of data).

So what's so bad about that? More and more we rely on syntheses of information in order to keep up with the exponentially growing body of knowledge revealed to our species by scientific and other methods. Counter to Plato's 2,400-year-old dialogue Phaedrus, we can no longer work out everything important for ourselves; instead, we must rely on analysis and results created by other, often long-dead, humans. Even those with superb memories cannot retain more than a minuscule fraction of the information known about even one discipline. In addition, we can now create data-rich types of content undreamed of in Plato's time. Some, MRSI medical scans being an ad hoc example, may require long-term storage. If quantum computing becomes mainstream, then that will presumably generate an exponential growth in data.

What, then, are the primary concerns of living in a society that has such high demands for the creation and safe storage of data? I've been thinking about this for a while now and the following is my analysis of the situation.

1. Storage. In recent years it has become widely known that CDs and, to a lesser extent, DVDs are subject to several forms of disc rot. I've heard horror stories of people putting their entire photo and/or video collection onto portable hard drives, only for these to fail within a year or two, the data being irrevocably lost. The advent of cloud storage lessens the issue, but not completely. Servers are still subject to all sorts of problems, with even enterprise-level solutions suffering due to insufficient disaster recovery and resilience (to use the terms we web developers use). I'm not saying audio tapes, vinyl records and VHS were any better - far from it - but there is a lot less data stored in those formats. There are times when good old-fashioned paper still rules, as it does in the legal and compliance sectors I've had contact with.

2. Security and privacy. As for safety, the arms race against hackers et al. is well and truly engaged. Incompetence also has its place. When living in the UK I once received a letter stating that my children's social services records, including their contact details, had become publicly available. This turned out to be due to the loss of a memory stick containing database passwords. As for identity theft, well, let's just say that Facebook is a rude word. I managed to track down an old friend after nearly twenty years incommunicado, finding details such as his address, his wife's name and occupation, etc., mostly via Facebook, in less than half an hour. Lucky I'm not a stalker, really!

Even those who avoid social media may find themselves with some form of internet presence. I had a friend who signed a political petition on paper and then, several years later, found his name on a petition website. Let's hope it was the sort of campaign that won't count against his career - these things can happen.

And then there's the fact that being a consumer means numerous manufacturers and retail outlets will have your personal details on file. I've heard that in some countries, if you - and more particularly your smartphone - enter a shopping mall, you may get a message saying that, as a loyal customer of a particular store, there is a special sale on just for you; the crunch being that you only have a limited time, possibly minutes, to get to the outlet and make a purchase. Okay, that doesn't sound so bad, but the more storage locations that contain your personal details, the greater the chance they will be used against you. Paranoid? No, just careful. Considering how easy it was for me to become a victim of financial fraud about fifteen years ago, I have experience of these things.

As any Amazon customer knows, you are bombarded with offers tailored via your purchase record. How long will it be before smart advertising billboards recognise your presence, as per Steven Spielberg's Minority Report? Yes, it's the merchandiser's dream of ultimate granularity in customer targeting, but it's also a fundamental infringement of the customer's anonymity. Perhaps everyone will end up getting five seconds of public fame on a daily basis, thanks to such devices. Big Brother is truly watching you, even if most of the time it's for the purpose of flogging you consumer junk.

3. Efficiency. There are several million new blog posts, several hundred billion emails and half a billion tweets every day. How can we possibly separate the wheat from the chaff (showing my age with that idiom) if we spend so much time ploughing through social media? I, for one, am not convinced there's much worth in a lot of this new-fangled stuff anyway (insert smiley here). I really don't want to know what friends, relatives or celebrities had for breakfast or which humorous cat videos they've just watched. Of course it's subjective, but I think there's a good case for claiming that the vast majority of digital content is a complete load of rubbish. So how can we live useful, worthwhile or even fulfilled lives when surrounded by it? In other words, how do we find the little gems of genuine worth among the flood of noise? It seems highly probable that a lot of the prominent nonsense theories, such as the Moon landing hoax, wouldn't be anywhere near as popular if it weren't for the World Wide Web disseminating them.

4. Fatigue and overload. Research has shown that our contemporary news culture (short snippets repeated ad nauseam over the course of a day or so) leads to a weary attitude. Far from empowering us, bombarding everyone with the same information, frequently lacking context, can rapidly lead to antipathy. Besides which, if information is inaccurate in the first place it can quickly achieve canonical status as it spreads across the digital world. As for the effect all this audio-visual over-stimulation is having on children's attention spans... now where was I?

5. The future. So are there any solutions to these issues? I assume that, even as we speak, there are research projects aiming to develop heuristic programs that are the electronic equivalent of a personal assistant. If a user carefully builds their personality profile, then the program would be expected to extract nuggets of digital gold from all the sludge. Yet even personally-tailored smart filters that provide daily doses of information, entertainment, commerce and all points in between have their own issues. For example, unless the software is exceptional (i.e. rather more advanced than anything commercially available today) you would probably miss out on laterally- or tangentially-associated content. For scientists especially, this sort of serendipity is a great boon to creativity, but it is rarely found in any form of machine intelligence. There's also the risk that corporate or governmental forces could bias the programming... or is that just the paranoia returning? All I can say is: knowledge is power.
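
To make that limitation concrete, here is a deliberately naive sketch - entirely my own invention, not any real product - of the kind of profile-based filter I have in mind: score each item by how many of the user's declared interests it mentions and keep only the matches. Anything laterally related but worded differently gets silently discarded, which is precisely the serendipity problem.

    # A toy, keyword-based personal filter (illustrative only; the profile
    # and the sample headlines below are made up for this sketch).
    INTERESTS = {"astronomy", "programming", "education", "privacy"}

    headlines = [
        "Amateur astronomy group spots a new comet",
        "What celebrities had for breakfast this morning",
        "Teaching programming in primary education: ten years on",
        "Unusual radio signal puzzles physicists",  # relevant, but no keyword match
    ]

    def score(text):
        """Count how many profile keywords appear in the text."""
        words = {word.strip(".,:?!").lower() for word in text.split()}
        return len(words & INTERESTS)

    # Keep only items matching at least one declared interest, best first.
    for item in sorted(headlines, key=score, reverse=True):
        if score(item) > 0:
            print(score(item), item)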

All in all, this sounds a touch pessimistic. I think Arthur C. Clarke once raised concerns about the inevitable decay within societies that overproduce information. The digital age is centred on the dissemination of content that is both current and popular, but not necessarily optimal. We are assailed by numerous sources of data, often created for purely commercial purposes and rarely for anything of worth. Let's hope we don't end up drowning in videos of pesky kittens. Aw, aren't they cute, though?