Sunday 18 March 2018

Smart phone, dumb people: is technology really reducing our intelligence?

IQ testing is one of those areas that always seems to polarise opinion, with many considering it useful for children as long as it is understood to measure specific areas of intelligence rather than a person's entire intellectual capability. However, many organisations, including some employers, use IQ tests as a primary filter, so unfortunately they cannot be dismissed as irrelevant or outdated. Just as much of the education system is still geared towards passing exams, IQ tests are seen as a legitimate way to sort potential candidates; they may not be completely valid, but they serve as a short-cut tool with a limited purpose.

James Flynn of the University of Otago in New Zealand has undertaken long-term research into intelligence; indeed, the 'Flynn Effect' is the name given to the worldwide increase in IQ since tests were first developed over a century ago. The reasons behind this increase are not fully understood, but probably involve the complex interaction of numerous environmental factors, such as enriched audio-visual stimulation, better - and more interactive - education methods, even good artificial lighting allowing longer hours of reading and writing. It is interesting that as developing nations rapidly gain these improvements to society and infrastructure, their average IQ shows a correspondingly rapid increase compared with the already developed West and its more staid rate of advancement.
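To put the effect in rough numbers: the rate most often cited in the wider literature is a gain of around three IQ points per decade (that figure is my addition, not a claim from Flynn's research as described above). A quick sketch of what it adds up to over a century:

```python
# Rough illustration of the Flynn Effect, using the oft-cited average
# gain of ~3 IQ points per decade (an assumed rate, not a figure taken
# from the research described above).
points_per_decade = 3
decades = 10  # roughly a century since the first IQ tests

total_gain = points_per_decade * decades
print(f"Cumulative gain over {decades} decades: ~{total_gain} points")

# Because tests are periodically re-normed so the average stays at 100,
# someone scoring 100 today would score around 100 + total_gain against
# century-old norms.
print(f"Score of 100 today, against the original norms: ~{100 + total_gain}")
```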

Research suggests that while young children's IQ continues to increase in developed nations, albeit at a reduced rate, the intelligence of teenagers in these countries has been in slow decline over the past thirty years. What is more, the higher the income decile, the larger the decrease, hinting that the causes are most prevalent in middle-class lifestyles; basically, family wealth equates to loss of IQ! Data for the UK and Scandinavian countries indicates that a key factor may be the rise of consumer electronics, starting with VCRs, games consoles and home computers and now complemented by smart phones, tablets and social media. This would align with the statistics, since the drop is greatest among the children likely to have the greatest access to such devices. So could it be true that our digital distractions are dumbing us down?

1) Time

By spending more time on electronic devices, children live in a narrower world, where audio-visual stimulation aims for maximum enjoyment with minimal effort, the information and imagery flying by at dizzying speed. This isn't just about the AV presentation, of course: digital content itself closely aligns with pop-cultural cornerstones, being glamorous, gimmicky, transient and expendable. As such, the infinitesimally small gradations of social status and friendship that exist amongst children and teenagers require enormous effort on their part to maintain a constant online presence, responding both proactively and reactively to their peers' (and role models') endless inanities.

The amount of effort it would take to filter this is mind-boggling and presumably takes away a lot of time that could be much better spent on other activities. This doesn't have to be something as constructive as reading or traditional studying: going outdoors has been shown to have all sorts of positive effects, as described in Richard Louv's 2005 best-seller Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder.

Studies around the world have shown all sorts of positive effects, including on mood, from mere immersion in nature, not just strenuous physical activity. Whether this is down to an innate human need to observe the intricate fractal patterns of vegetation (grass lawns and playing fields have been found to be ineffective) or the seemingly unorganised behaviour of non-human life forms, the Japanese government has promoted Shinrin-yoku or 'forest air bathing' as a counterbalance to the stresses of urbanised existence. It sounds a bit New Age, but there is enough research to back up the idea that time spent in the natural environment can profoundly affect us.

Meanwhile, other nations appear to have given in, as if admitting that their citizens have turned into digitally-preoccupied zombies. Last year, the Dutch town of Bodegraven decided to reduce accidents involving mobile-distracted pedestrians by installing red and green LED strips at a busy road junction, so that phone users could tell whether it was safe to cross without having to look up!

2) Speed

One obvious change in the past four decades has been the increased pace of life in developed nations. Now that we have near-instantaneous communication and information retrieval tools, employers expect their workforce to respond in tune with the speed of these machines. This act-now approach hardly encourages in-depth cogitation but instead relies upon seat-of-the-pants thinking, which no doubt requires a regular input of caffeine and adrenaline. The emphasis on rapid turnaround, coupled with a lack of patience, has led to an extremely heavy reliance on the first page of online search results: being smart at sifting through other people's data is fast becoming a replacement for original thought, as lazy students have discovered and as many school teachers and university lecturers could no doubt testify.

Having a convenient source of information means that it is easier for anyone to find a solution to almost anything than to work it out for themselves. This can lead to a decline in initiative, something which separates thought leaders from everyone else. There is a joy to figuring something out, which after all is a key motivation for many STEM professionals. Some scientists and engineers have explained that being able to understand the inner workings of common objects was a key component of their childhood, leading to an obvious career choice. For example, New Zealand-based scientist and science communicator Michelle Dickinson (A.K.A. Nanogirl) spent her childhood dismantling and repairing such disparate household items as home computers and toasters, echoing Ellie Arroway, the heroine of Carl Sagan's novel Contact, who as a child repaired a defective valve radio before going on to become a radio astronomer.

Of course, these days it would be more difficult to repair contemporary versions of these items, since they are often built so that they cannot even be opened except in a machine shop. Laptops and tablets are prime examples, and I've known cases where the likes of Microsoft simply replace rather than repair a screen-damaged device. When I had a desktop computer I frequently installed video and memory cards myself, although admittedly how-to videos for such jobs are now ubiquitous on YouTube. The latest generation of technology doesn't allow for such do-it-yourself upgrades, to the manufacturer's advantage and the consumer's detriment. As an aside, it's worrying that so many core skills such as basic repairs or map navigation are being lost; in the event of a massive power and/or network outage due to the likes of a solar flare, there could be a lot of people stuck in headless chicken mode. Squawk!

3) Quality

While the World Wide Web covers every subject imaginable (albeit of immensely variable quality), that once fairly reliable source of information, television, has largely downgraded the sometimes staid but usually authoritative documentaries of yesteryear into music promo-style pieces of infotainment. Frequently unnecessary computer graphics, overly-dramatic reconstructions and voice-overs are interwoven between minuscule sound bites from the experts, the amount of actual information being conveyed reduced to a bare minimum.

In many cases, the likes of the Discovery Channel are even disguising pure fiction as fact, meaning that children - and frequently adults - are hard-pressed to differentiate nonsense from reality. This blurring of demarcation does little to encourage critical or even sustained thinking; knowledge in the media and online has been reduced to a consumer-led circus with an emphasis on marketing and hype. Arguably, radio provides the last media format where the majority of content maintains a semblance of sustained, informative discussion on STEM issues.

4) Quantity

The brave new world of technology that surrounds us is primarily geared towards consumerism; after all, even social media is fundamentally a tool for targeted marketing. If there's one thing that manufacturers do not want, it is inquisitive customers, since the buzzwords and hype often hide a lack of quality underneath. Unfortunately, the ubiquity of social media and online news in general means that ridiculous ideas rapidly become must-have fads.

Even such commodities as food and drink have become mired in fads, with trendy products like charcoal-infused juice, unpasteurised milk and now raw water attracting the same sort of uncritical punters who think that nutrition gurus know what really constituted human diets in the Palaeolithic. The fact that some of Silicon Valley's smartest have failed to consider the numerous dangers of raw water shows that, again, analytical thinking is taking a back seat to whatever is the latest 'awesome' and 'cool' lifestyle choice.

Perhaps, then, certain types of thinking are becoming difficult to inculcate and sustain in our mentally vulnerable teenagers due to the constant demands of consumerism and its oh-so-seductive delivery channels. Whether today's youth will fall into the viewing habits of older generations, such as the myriad 'food porn' shows, remains to be seen; with so much on offer, is it any wonder people spend entire weekends binge-watching series, oblivious to the wider world?

The desire to fit into a peer group and not be left behind by lack of knowledge about some trivia or other - the latest series on Netflix, for example - means that a great deal of time is wasted on activities requiring only a limited range of thought processes. Even a good memory isn't required anymore, with electronic calendars and calculators among the simplest of the tools available to replace brain power. Besides which, the transience of popular culture means there's little need to remember most of what happened last week!

Ultimately, western nations are falling prey to the insular decadence well known from history as great civilisations pass their prime. Technology and the pace of contemporary life dictated by it must certainly play a part in any decline in IQ, although the human brain being what it is - after all, the most complex object in the known universe - I wouldn't dare guess how much is due to them.

There are probably other causes that are so familiar as to be practically invisible. Take for instance background noise, both visual and aural, which permeates man-made environments. My commute yesterday offered a typical example of the latter sort: schoolchildren on my train playing loud music on their phones that could be heard some metres away; two building sites I walked past; plus a main road packed with vehicles up to the size of construction trucks. As a final bonus, I passed ten shops and cafes that were all playing loud if inane pop music, audible on the street through open doors. Gone are the days of tedious elevator muzak: even fairly expensive restaurants play material so fast and loud that it barely merits the term 'background music'. If such sensory pollution is everywhere, when do we get to enjoy quality cogitation time?

If you think that consumerism isn't as all-encompassing as I claim, then consider that the USA spends more per year on pet grooming than it does on nuclear fusion research. I mean, do you honestly need a knee-high, wall-mounted video phone to keep in touch with your dog or cat while you're at work? Talking of which, did you know that in 2015 the Kickstarter crowdfunding platform's Exploding Kittens card game raised almost US$9 million in less than a month? Let's be frank: we've got some work to do if we are to save subsequent generations from declining into trivia-obsessed sheeple. Baa!

Saturday 3 March 2018

Hi-tech roadblock: is some upcoming technology just too radical for society to handle?

Many people still consider science to be a discipline wholly separate from other facets of human existence. If there's one thing I've learnt during the eight years I've been writing this blog, it's that there are so many connections between STEM and society that much of the scientific enterprise cannot be considered in isolation.

Cutting-edge theories can take a long time to be assimilated into mainstream society; in some cases their complexity (for example, quantum mechanics) or their emotive value (most obviously, natural selection) means that they get misinterpreted or rejected respectively. New technologies emerge out of scientific principles and methodology, if not always from the archetypal laboratory. STEM practitioners are sometimes the driving force behind new devices aimed at the mass market; could it be that their enthusiasm and in-depth knowledge prevent them from realising that the world isn't yet ready for their brainchild? In some cases the "Hey, wow, cool, look what we can do!" excitement masks the elaborate web of socio-economic factors that mean the invention will never be suitable for a world outside the test environment.

There are plenty of examples of pioneering consumer-oriented technology that either could never fit into its intended niche (such as the UK's Sinclair C5 electric vehicle of the mid-1980s), or missed public demand, the Sony Betamax video recorder having been aimed at home movie makers rather than audiences just wanting to watch pre-recorded material (hence losing out to the inferior-quality VHS format).

At the opposite pole, mobile phone manufacturers in the early 1990s completely underestimated the public interest in their products, which were initially aimed at business users. Bearing in mind that there is considerable worldwide interest in certain new radical technologies that will presumably be aimed at the widest possible market, I thought I'd look at their pros and cons so as to ascertain whether non-STEM factors are likely to dictate their fortunes.

1) Driverless automobiles

There has been recent confirmation that in the next month or so vehicle manufacturers may be able to test their autonomous cars on California's state highways. With Nissan poised to test self-driving taxis in time for a 2020 launch, the era of human drivers could be entering its last few decades. Critics of the technology usually focus on the potential dangers, as shown by the first fatality involving such a system in May 2016.

But what of the reverse? Could the widespread introduction of driverless road vehicles - once the public is convinced of their superior safety attributes - be opposed by authorities or multinational corporations? After all, in 2016 almost 21% of drivers in the USA received a speeding ticket, generating enormous revenue. Exact figures for these fines are unknown, but estimates of the annual total usually centre around six billion dollars. In addition to the fines themselves adding to national or local government coffers (alongside all sorts of other traffic misdemeanours, including parking offences), insurance companies benefit from the increased premiums paid by drivers with convictions.
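As a back-of-the-envelope check on those figures - and assuming roughly 220 million licensed drivers in the USA, a round number of my own choosing rather than one from the sources above - the implied average fine comes out at around $130:

```python
# Back-of-envelope check on the speeding-fine figures quoted above.
# The licensed-driver count is an assumption for illustration only.
licensed_drivers = 220_000_000     # assumed number of US licensed drivers
ticketed_fraction = 0.21           # "almost 21% of drivers... received a speeding ticket"
annual_fine_total = 6_000_000_000  # "estimates... centre around six billion dollars"

tickets_per_year = licensed_drivers * ticketed_fraction
average_fine = annual_fine_total / tickets_per_year
print(f"Implied tickets per year: {tickets_per_year / 1e6:.0f} million")
print(f"Implied average fine: ${average_fine:.0f}")
# ~46 million tickets a year, at an average of roughly $130 each
```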

Whether vested interests would find the economic losses suitably offset by the prevention of thousands of deaths due to driver error remains to be seen. This stance might seem unjustly anti-corporate, but when the past half-century's history of private profit ahead of public interest is examined (for example, the millions paid by the fossil fuel and tobacco industries to support their products) there are obvious precedents.

One key scientific method is parsimony, A.K.A. Occam's razor. According to this principle, the simplest explanation is usually the correct one, at least in classical science; quantum mechanics plays by its own rules. An example counter to this line of thought can be seen in the work of the statistician, geneticist and tobacco industry spokesman R.A. Fisher, a keen pipe smoker who argued that rather than a cause-and-effect between smoking and lung cancer, there was a more complicated correlation between people who were both genetically susceptible to lung disease and hereditarily predisposed to nicotine addiction! Cigarette, anyone?

As for relinquishing the steering wheel to a machine, I think that a fair proportion of the public enjoy the 'freedom' of driving and that a larger contingent than just boy racers won't give up manual control without a fight, i.e. state intervention will be required to put safety ahead of individuality.

2) Extending human lifespan

It might seem odd that anyone would want to oppose technology that could increase longevity, but there would have to be some fairly fundamental changes to society to accommodate anything beyond the most moderate of extended lifespans. According to a 2009 report in The Lancet medical journal, about half of all children born since 2000 could reach their hundredth birthday.

Various reports state that from 2030-2050 - about as far into the future as anyone can offer realistic prognostication - the proportion of retirees, including far greater numbers of Alzheimer's and dementia sufferers, will require many times more geriatricians than are practising today. The ratio of working-age adults to retirees will also drop, from 5:1 to 3:1 in the case of the USA, implying a far greater pensions crisis than the one already looming. Numerous companies are using cutting-edge biotech to find cell renewal techniques, including the fifteen teams racing for the Palo Alto Longevity Prize, so the chances of a breakthrough are fairly high.
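To see why that falling ratio matters, here is a minimal sketch of a pay-as-you-go pension model - a deliberate simplification of my own, with an invented pension figure, not data from the reports above:

```python
# Toy pay-as-you-go pension model: each retiree's pension is split evenly
# across the working-age population. The pension amount is invented purely
# for illustration.
def per_worker_burden(pension_per_retiree: float, workers_per_retiree: float) -> float:
    """Annual pension cost falling on each worker under pay-as-you-go."""
    return pension_per_retiree / workers_per_retiree

pension = 20_000                          # assumed annual pension per retiree
today = per_worker_burden(pension, 5.0)   # current US ratio of 5:1
future = per_worker_burden(pension, 3.0)  # projected ratio of 3:1

print(f"Per-worker cost: ${today:,.0f} now vs ${future:,.0f} at 3:1")
print(f"An increase of {future / today - 1:.0%}")  # 5/3 - 1, i.e. ~67%
```

Even in this crude model, the per-worker cost rises by two-thirds before any extension of lifespan is factored in.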

Japan offers a hint of how developed nations will alter once extended lifespans are available on a widespread basis: one-third of its population are over sixty and one in eight over seventy-five. In 2016 its public debt was more than double its GDP, and Japan also faces low labour productivity compared with other OECD nations. Figures such as these show that governments will find it economically challenging to support the corresponding population demographics, even if many of the healthcare issues usually associated with the elderly are diminished.

However, unlike driverless cars it's difficult to conceive of arguments in favour of legislation to prevent extended lifespans. If all nations achieved equilibrium in economy, technology and demographics there would be far fewer issues, but the gap between developed and developing nations is wide enough to deem that unlikely for many decades.

Discussions around quality of life for the elderly will presumably become more prominent as the age group grows as a proportion of the electorate. There are already various types of companion robots for those living alone, shaped like anything from cats to bears to anthropomorphic designs such as the French Buddy and the German Care-O-bot, the latter to my mind resembling a giant, mobile chess piece.

3) Artificial intelligence

I've already looked at international attitudes to the expansion and development of AI, but if there's one thing most reports discuss, it is the loss of jobs to even semi-intelligent machines. Even if there is a lower proportion of younger people, there will still be a need to keep the populace engaged, constructively or otherwise.

Surveys suggest that far from reducing working hours, information technology has caused employees in developed nations to spend more of their time outside the workplace still working. For example, over half of all American and British employees now check their work email while on holiday. Will governments therefore be able to fund and organise replacement activities for an obsolete workforce, involving, for example, life-long learning and job sharing?

The old adage about idle hands rings true, and unlike during the Great Depression, the sophistication of modern technology doesn't allow for the commissioning of large-scale infrastructure projects utilising an unskilled labour pool. Granted, AI will generate new jobs in novel specialisms, but these will be a drop in the ocean compared to the lost roles. So far, the internet and home computing have created work, frequently in areas largely unpredicted by futurists, but it seems doubtful the trend will continue once heuristic machines and the 'internet of things' become commonplace.

So is it possible that governments will interfere with the implementation of cutting-edge technology in order to preserve the status quo, at least until the impending paradigm shift becomes manageable? I could include other examples, but many are developments more likely to incur the annoyance of certain industries than of governments or societies as a whole. One prominent example used for the upcoming Internet of Things is the smart fridge, which would presumably reduce grocery wastage - and therefore lower sales - via its cataloguing of use-by dates.

Also, if people can buy cheap (or dare I mention pirated?) plans for 3D printing at home, they won't have to repeatedly pay for physical goods, plus in some cases their delivery costs. Current designs that are available to print items for use around the home and garage range from soap dishes to measuring cups, flower vases to car windscreen ice scrapers. Therefore it's obvious that a lot of companies producing less sophisticated household goods are in for a sticky future as 3D printers become ubiquitous.

If these examples prove anything, it's that scientific advances cannot be treated in isolation when they have the potential for direct implementation in the real world. It's also difficult to predict how a technology developed for a single purpose can end up being co-opted into wholly different sectors, as happened with ferrofluids, designed to pump rocket fuel in the 1960s and now used in kinetic sculptures and toys. I've previously discussed the problems of attempting to predict upcoming technology and its implementation, and as such I suggest that even if an area of technological progress follows some sort of predictable development, the wider society that encapsulates it may not be ready for it.

It may not be future shock per se, but there are vested interests who like things just the way they are - certain technology may simply be too good for the public. Did someone mention how much the fossil fuel industries have spent denying man-made climate change? Or could it be time to consider Occam's razor again?