Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space marking an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and by historical accidents embedded in our language, and therefore in our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst of course Ocean would be more appropriate. This is just an obvious chauvinism for a land-based species, but there are other terms that owe everything to history. We count in base ten, run zero longitude through the Greenwich Meridian and usually show the Earth from one perspective, despite there being no arrow in our galaxy stating 'this way up' (had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with such constructs? Our calendar is an archaic, sub-optimal mish-mash: thanks to the two months added to the start of the old Roman year, September through December are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010, thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great. It appears that even the fundamentals that guide us through life are subjective at best, and far from accurate in many cases.
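The Old Style/New Style muddle is easy to see in code. As a minimal sketch (using the standard Julian day number formulas; the function names are my own), here is how Washington's Julian-calendar birth date of 11 February 1732 falls on the Gregorian calendar:

```python
def julian_to_jdn(year, month, day):
    """Julian day number of a date in the Julian calendar."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - 32083

def jdn_to_gregorian(jdn):
    """Gregorian calendar date for a Julian day number."""
    a = jdn + 32044
    b = (4 * a + 3) // 146097
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = 100 * b + d - 4800 + m // 10
    return year, month, day

def julian_to_gregorian(year, month, day):
    """Re-express a Julian calendar date in the Gregorian calendar."""
    return jdn_to_gregorian(julian_to_jdn(year, month, day))

# Washington's birth: 11 February 1732 in the Julian calendar
print(julian_to_gregorian(1732, 2, 11))   # (1732, 2, 22)
# The October Revolution: 25 October 1917 in the Julian calendar
print(julian_to_gregorian(1917, 10, 25))  # (1917, 11, 7)
```

The same event, the same day, and two different labels for it - eleven days apart in the 1700s, thirteen by 1917.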

The philosopher of science Thomas Kuhn argued that all scientific research is a product of the culture of the scientists engaged in it. So whilst we might argue that Galileo was the first scientist in a strictly modern use of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton) and those of modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else, and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even boost our egos, via unconscious assumptions that look ever more ridiculous as we delve deeper into the mysteries of creation. For example, the past sixty-five million years have frequently been named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial, and we macroscopic life forms are comparative newcomers, restricted to a far narrower range of environments than bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all languages from Latin to English have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience while our technology fails to keep pace in confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense differing from its vernacular meaning, problems frequently arise. A classic example is quantum leap, which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level. However, even personal computer pioneer Sir Clive Sinclair used the term in its former sense for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)

Speaking of which, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters stated: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.
