
Thursday 27 September 2018

The anaesthetic of familiarity: how our upbringing can blind us to the obvious

In the restored Edwardian school classroom at Auckland's Museum of Transport and Technology (MOTAT) there is a notice on the wall stating 'Do not ask your teacher questions.' Fortunately, education in many nations now goes some way towards emphasising the importance of individual curiosity rather than mere obedience to authority. Of course, there are a fair number of politicians and corporate executives who wish it weren't so, as an incurious mind is easier to sway than a questioning one. As my last post mentioned, the World Wide Web can be something of an ally for them, since the 'winner takes all' approach of review-based systems aids the slogans and rhetoric of those who wish to control who we vote for and what we buy.

Even the most liberal of nations and cultures face self-imposed hurdles when it comes to distinguishing the best solution from the one that is merely most familiar from our formative years. This post therefore looks at another side of the subjective thinking discussed earlier this month, namely a trap that Richard Dawkins has described as the "anaesthetic of familiarity". Basically, this is when conventions become so widely accepted that they are seen as the only option rather than merely one of a range of choices. Or, as the British philosopher Susan Stebbing wrote in her 1939 book Thinking to Some Purpose: "One of the gravest difficulties encountered at the outset of the attempt to think effectively consists in the difficulty of recognizing what we know as distinguished from what we do not know but merely take for granted."

Again, this mindset is much loved by the manufacturing sector; in addition to such well-known ploys as deliberate obsolescence and staggered release cycles, there are worse examples, especially in everyday consumerism. We often hear how little nutritional value many highly processed foods contain, but think what this has done for the vitamin and mineral supplement industry, whose annual worldwide sales now approach US$40 billion!

Citizens of developed nations today face very different key issues from those of our pre-industrial ancestors, not the least among them being a constant barrage of decision-making. Thanks to the enormous variety of choices available concerning almost every aspect of our daily lives, we have to consider everything from what we wear to what we eat. The deluge of predominantly useless information that we receive in the era of the hashtag makes it more difficult for us to concentrate on problem solving, meaning that the easiest way out is just to follow the crowd.

Richard Dawkins' solution to these issues is to imagine yourself as an alien visitor and then observe the world as a curious outsider. This seems to me to be beyond the reach of many, for whom daily routine appears to be their only way to cope. If this sounds harsh, it comes from personal experience; I've met plenty of people who actively seek an ostrich-like head-in-the-sand approach to life to avoid the trials and tribulations - as well as the wonders - of this rapidly changing world.

Instead, I would suggest an easier option when it comes to some areas of STEM research: ensure that a fair proportion of researchers and other thought leaders are adult migrants from other nations. Then they will be able to apply an outside perspective, hopefully identifying givens that are too obvious to be spotted by those who have grown up with them.

New Zealand is a good example of this, with arguably its two best-known science communicators having been born overseas: Siouxsie Wiles and Michelle Dickinson, a.k.a. Nanogirl. Dr Wiles is a UK-trained microbiologist at the University of Auckland. She frequently appears on Radio New Zealand and undertakes television and social media work to promote both science in general and her own specialism of fighting bacterial infection.

Dr Dickinson is a materials engineering lecturer and nanomaterials researcher at the University of Auckland who studied in both the UK and USA. Her public outreach work includes books, school tours and both broadcast and social media. She has enough sci-comm kudos that last year, despite not having a background in astronomy, she interviewed Professor Neil deGrasse Tyson during the Auckland leg of his A Cosmic Perspective tour.

The work of these two shows that newcomers can recognise a critical need that their home-grown counterparts have overlooked. What is interesting is that despite coming from English-speaking backgrounds - and therefore having limited cultural disparity with their adopted New Zealand - there must have been enough that was different to convince Doctors Wiles and Dickinson of the need for a hands-on, media-savvy approach to science communication.

This is still far from the norm: many STEM professionals believe there is little point in promoting their work to the public except via print-based publications. Indeed, some famous science communicators such as Carl Sagan and Stephen Jay Gould were widely criticised during their lifetimes by the scientific establishment for what were deemed undue efforts at self-promotion and the associated debasement of science by combining it with show business.

As an aside, I have to say that as brilliant as some volumes of popular science are, they do tend to preach to the converted; how many non-science fans are likely to pick up a book on, say, string theory, just for a bit of light reading or self-improvement (the latter being a Victorian convention that appears to have largely fallen from favour)? Instead, the outreach work of the expat examples above is aimed at the widest possible audience without over-simplification or distortion of the principles being communicated.

This approach may not solve all issues about how to think outside the box - scientists may be so embedded within their culture as to not realise that there is a box - but surely by stepping outside the comfort zone we grew up in we may find problems that the local population hasn't noticed?

Critical thinking is key to the scientific enterprise but, it would appear, to little else in human culture. If we can find methods to avoid the anaesthetic of familiarity and acknowledge that what we deem normal can be far from optimal, then these should be promoted with gusto. If the post-modern creed is that all world views are equally valid and science is just another form of culture-biased story-telling, then now more than ever we need cognitive tools to break through the subjective barriers. If more STEM professionals are able to cross borders and work in unfamiliar locations, isn't there a chance they can recognise issues that fall under the local radar and so supply the new perspective we need if we are to fulfil our potential?

Wednesday 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to one search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit for exploring reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works, is it any wonder that the plethora of online sources - of immensely variable quality - can cause confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model, and so culture, gender, politics and even religion play their part in what is funded and how the results are presented (or even in which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs, it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expense associated with producing popular science books means that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy-to-find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers (largely on general news sites) and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results), but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So, in the spirit of science (okay, it's hardly comprehensive, being just a single trial), I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it would last only another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that, in those early days of the hypothesis, very little could be confirmed with any degree of certainty.

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases led to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years, it has been stable for the past 25 million and so is no longer deteriorating. A third, far less common, category concerns the informal polls taken of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either camp to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find around cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with paragraph text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach is friendly to organic search whilst amalgamating socio-political concerns with the science.

You might expect that the results would show a change in trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far only offering support to the rapid-degeneration hypothesis. The reason for this difference is readily apparent: the publication of a Danish study that bolsters it. This new report is available online but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have either to rely upon second-hand summaries or, if there were enough time, wait for the next popular science book that discusses it in layman's terms.

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it only examined the genes of sixty-two Danish men, so I have no idea if this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better popular science books would answer these queries, such is my faith in the general superiority of print volumes!
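
As a rough aside on why the sample size bothers me - and this is purely my own back-of-the-envelope illustration, not anything taken from the Danish study, whose statistical methods I haven't examined - the margin of error on a simple estimated proportion only shrinks with the square root of the sample size. A quick sketch in Python, using entirely hypothetical figures:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Approximate 95% margin of error for a proportion p estimated
        from a simple random sample of size n (illustrative figures only)."""
        return z * math.sqrt(p * (1 - p) / n)

    # How precision scales with sample size: 62 as per the study's headcount,
    # plus two larger, invented comparison samples.
    for n in (62, 250, 1000):
        print(f"n = {n:4d}: margin of error ~ +/-{margin_of_error(n):.1%}")
    # n =   62: margin of error ~ +/-12.4%
    # n =  250: margin of error ~ +/-6.2%
    # n = 1000: margin of error ~ +/-3.1%

The point is simply that a sample of a few dozen leaves a wide band of uncertainty around this kind of estimate, which is why I would want to see the study's own error analysis before treating the question as settled.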

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research were applicable only to other types of animal, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hope of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?), why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Wednesday 24 February 2016

Drowning by numbers: how to survive the information age

2002 was a big year. According to some statistics, it was the year that digital storage capacity overtook analogue: books gave way to online information; binary became king. Or hyperbole to that effect. Between email, social media, websites and the interminable selfie, we are all guilty to a greater or lesser extent of creating data archived in digital format. The human race now generates zettabytes of data every year (a zettabyte being a trillion gigabytes, in case you're still dealing in such minute amounts of data).

So what's so bad about that? More and more we rely on syntheses of information in order to keep up with the ever-expanding body of knowledge revealed to our species by the scientific and other methods. Counter to Plato's 2,400-year-old dialogue Phaedrus, we can no longer work out everything important for ourselves; instead, we must rely on analysis and results created by other, often long-dead, humans. Even those with superb memories cannot retain more than a minuscule fraction of the information known about even one discipline. In addition, we can now create data-rich types of content undreamed of in Plato's time. Some, MRSI medical scans being one example, may require long-term storage. If quantum computing becomes mainstream, then that will presumably generate an exponential growth in data.

What then, are the primary concerns of living in a society that has such high demands for the creation and safe storage of data? I've been thinking about this for a while now and the following is my analysis of the situation.

1. Storage. In recent years it has become widely known that CDs and, to a lesser extent, DVDs are subject to several forms of disc rot. I've heard horror stories of people putting their entire photo and/or video collection onto portable hard drives, only for these to fail within a year or two, the data being irrevocably lost. The advent of cloud storage lessens the issue, but not completely. Servers are still subject to all sorts of problems, with even enterprise-level solutions suffering from insufficient disaster recovery and resilience (to use the terms we web developers favour). I'm not saying audio tapes, vinyl records and VHS were any better - far from it - but there is a lot less data stored in those formats. There are times when good old-fashioned paper still rules, as it does in the legal and compliance sectors I've had contact with.

2. Security and privacy. As for safety, the arms race against hackers and their ilk is well and truly engaged. Incompetence also has its place. When living in the UK I once received a letter stating that my children's social services records, including their contact details, had become publicly available. This turned out to be due to the loss of a memory stick containing database passwords. As for identity theft, well, let's just say that Facebook is a rude word. I managed to track down an old friend after nearly twenty years incommunicado, finding details such as his address, his wife's name and occupation, and so on, mostly via Facebook, in less than half an hour. Lucky I'm not a stalker, really!

Even those who avoid social media may find themselves with some form of internet presence. I had a friend who signed a political petition on paper and then, several years later, found his name on a petition website. Let's hope it was the sort of campaign that didn't work against his career - these things can happen.

And then there's the fact that being a consumer means numerous manufacturers and retail outlets will have your personal details on file. I've heard that in some countries if you - and more particularly your smartphone - enter a shopping mall, you may get a message saying that as a loyal customer of a particular store there is a special sale on just for you, the crunch being that you only have a limited time, possibly minutes, to get to the outlet and make a purchase. Okay, that doesn't sound so bad, but the more storage locations that contain your personal details, the greater the chance they will be used against you. Paranoid? No, just careful. Considering how easy it was for me to become a victim of financial fraud about fifteen years ago, I have experience of these things.

As any Amazon customer knows, you are bombarded with offers tailored via your purchase record. How long will it be before smart advertising billboards recognise your presence, as per Steven Spielberg's Minority Report? Yes, it is the merchandiser's dream of ultimate granularity in customer targeting, but also a fundamental infringement of the customer's anonymity. Perhaps everyone will end up getting five seconds of public fame on a daily basis, thanks to such devices. Big Brother is truly watching you, even if most of the time it's for the purpose of flogging you consumer junk.

3. Efficiency. There are several million blog posts, several hundred billion emails and half a billion tweets each day. How can we possibly separate the wheat from the chaff (showing my age with that idiom) if we spend so much time ploughing through social media? I, for one, am not convinced there's much worth in a lot of this new-fangled stuff anyway (insert smiley here). I really don't want to know what friends, relatives or celebrities had for breakfast or which humorous cat videos they've just watched. Of course it's subjective, but I think there's a good case for claiming that the vast majority of digital content is a complete load of rubbish. So how can we live useful, worthwhile or even fulfilled lives when surrounded by it? In other words, how do we find the little gems of genuine worth among the flood of noise? It seems highly probable that a lot of the prominent nonsense theories, such as the Moon landing hoax, wouldn't be anywhere near as popular if it weren't for the World Wide Web disseminating them.

4. Fatigue and overload. Research has shown that our contemporary news culture (short snippets repeated ad nauseam over the course of a day or so) leads to a weary attitude. Far from empowering us, bombarding everyone with the same information, frequently lacking context, can rapidly lead to antipathy. Besides which, if information is inaccurate in the first place it can quickly achieve canonical status as it spreads across the digital world. As for the effect all this audio-visual over-stimulation is having on children's attention spans... now where was I?

5. The future. So are there any solutions to these issues? I assume that, as we speak, there are research projects aiming to develop heuristic programs that are the electronic equivalent of a personal assistant. If a user carefully builds their personality profile, then the program would be expected to extract nuggets of digital gold from all the sludge. Yet even personally tailored smart filters that provide daily doses of information, entertainment, commerce and all points in between have their own issues. For example, unless the software is exceptional (i.e. rather more advanced than anything commercially available today), you would probably miss out on laterally or tangentially associated content. Even for scientists, this sort of serendipity is a great boon to creativity, but it is rarely found in any form of machine intelligence. There's also the risk that corporate or governmental forces could bias the programming… or is that just the paranoia returning? All I can say is: knowledge is power.
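
To make the idea a little more concrete, here is a minimal sketch in Python of the sort of profile-based filter I have in mind. Everything in it is hypothetical - the profile format, the keyword scoring and the cut-off threshold are all invented for illustration - and any genuine digital personal assistant would need to be vastly more sophisticated:

    # A toy, purely hypothetical profile-based content filter: score each item
    # against a user's declared interests and keep only the higher-scoring ones.
    from dataclasses import dataclass

    @dataclass
    class Item:
        title: str
        text: str

    # Invented example profile: interest keywords mapped to weights
    # (negative weights push content out of the daily digest).
    profile = {"astronomy": 3.0, "genetics": 2.5, "science": 1.5, "celebrity": -2.0}

    def score(item, profile):
        """Crude relevance score: sum the weights of any profile keywords
        appearing in the item's title or body text."""
        content = (item.title + " " + item.text).lower()
        return sum(weight for keyword, weight in profile.items() if keyword in content)

    def filter_feed(items, profile, threshold=2.0):
        """Keep only items scoring above the threshold, best first."""
        scored = [(score(item, profile), item) for item in items]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [item for s, item in scored if s >= threshold]

    feed = [
        Item("Dark energy and the fate of the universe", "New astronomy results suggest..."),
        Item("What the stars ate for breakfast", "Celebrity meal photos of the week..."),
        Item("Is the Y chromosome really doomed?", "A genetics study revisits the question..."),
    ]

    for item in filter_feed(feed, profile):
        print(item.title)
    # Dark energy and the fate of the universe
    # Is the Y chromosome really doomed?

The obvious weakness is the one mentioned above: a filter this literal-minded will never surface the laterally or tangentially related material you did not know to ask for, which is exactly where serendipity lives.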

All in all, this sounds a touch pessimistic. I think Arthur C. Clarke once raised a concern about the inevitable decay within societies that overproduce information. The digital age is centred on the dissemination of content that is current and popular, but not necessarily optimal. We are assailed by numerous sources of data, often created for purely commercial purposes and rarely for anything of worth. Let's hope we don't end up drowning in videos of pesky kittens. Aw, aren't they cute, though?