Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space identifying an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and historical accidents embedded into our language and therefore our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst of course Ocean would be more appropriate. Whilst this is just an obvious chauvinism for a land-based species, there are other terms that owe everything to history. We count in base ten, position zero longitude through the Greenwich Meridian and usually show the Earth from one perspective, despite there not being an arrow in our galaxy stating 'this way up' (but then had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with constructs? Our calendar is an archaic, sub-optimal mish-mash: because the Roman year originally began in March, the last four months of the year are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010, thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great. It appears that even the fundamentals that guide us through life are subjective at the very least, if not far from accurate in many cases.
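The size of that calendrical drift is easy to work out: the Julian calendar treats every century year as a leap year, whilst the Gregorian keeps only those divisible by 400, so the two diverge by roughly three days every four centuries. A minimal sketch of the arithmetic (in Python, with a function name of my own invention, purely for illustration):

```python
def julian_gregorian_gap(year):
    """Days by which the Julian calendar trails the Gregorian,
    valid for dates from 1 March of the given year's century."""
    century = year // 100
    # Julian adds a leap day every century year; Gregorian skips
    # those not divisible by 400. The constant 2 aligns the two
    # calendars at their shared baseline in the third century AD.
    return century - century // 4 - 2

# George Washington's birth: 11 February 1732 (Old Style)
# became 22 February (New Style) - an eleven-day gap.
print(julian_gregorian_gap(1732))  # 11

# The 'October' Revolution of 25 October 1917 (Old Style)
# fell on 7 November in the Gregorian calendar - thirteen days.
print(julian_gregorian_gap(1917))  # 13
```

Hence Washington's birthday and the Bolshevik Revolution each carry two 'official' dates, depending on which calendar the recording nation happened to be using at the time.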

The philosopher of science Thomas Kuhn argues that all scientific research is a product of the culture of the scientists engaged on those projects, so whilst we might argue that Galileo was the first scientist in a strictly modern use of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton), and those of the modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even boost our egos, via unconscious assumptions that are gradually looking more ridiculous as we delve ever deeper into the mysteries of creation. For example, the past sixty-five million years has been a period frequently named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial and we macroscopic life forms are comparative newcomers, restricted to a far reduced range of environments compared to bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all languages from Latin to English have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience but our technology fails to keep pace with confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense differing from its vernacular meaning, problems frequently arise. A classic example would be quantum leap, which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level. However, even personal computer pioneer Sir Clive Sinclair used the term in its former meaning for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)

Speaking of which, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters stated: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.

Saturday 19 December 2009

Warp engines offline, Captain: has science fiction become confused with science fact?

The current bickering in Copenhagen seemingly ignores a rather pertinent issue: our skills and experience in reversing climate change are almost exactly zero. Of course we can drastically cut back on fossil fuels, increase energy efficiency and possibly even slow down population growth, but there is little on the technological horizon that can profoundly alter the climate in favour of our species. Yet the implicit view seems to be that if a political solution is found then a practical solution will follow in due course.

So why is it assumed that given enough Government funding, the people in white lab coats can perform miracles of climate engineering? This attitude is symptomatic of an ever-widening gap between the scientific forefront and public perception. Many strands of contemporary science are so detached from everyday life that they inhibit straightforward public assimilation, whilst the ubiquity of electronic consumer goods may be lulling us into a false sense of security regarding our abilities. We are surrounded by 'space age' gadgets and technology from Wii to Wi-Fi that only a generation ago were strictly for James Bond. And with Virgin Galactic seemingly about to usher in a new age of space tourism, becoming an astronaut will be akin to a very expensive form of air travel, though a sub-orbital hop hardly counts as boldly going anywhere.

Another possible cause that doesn't seem to have gained much notice is the influence of science fiction films and television series. With their largely computer-generated visual effects, most Hollywood product effortlessly outshines any real-life counterpart. For example, the International Space Station (ISS) resembles nothing so much as a bunch of tin cans linked by Meccano struts. Yet the ISS is about as good as ultra-expensive high technology gets, being by far the largest man-made structure ever assembled in orbit. Given a choice between watching ISS crew videos (Thanksgiving dinner with dehydrated turkey, anyone?) and the likes of Bruce Willis saving mankind from doomsday asteroids, most people understandably opt for the latter.

Now that the majority of humans live in crowded conurbations far removed from our ancestral peripatetic existence, the desperation for new horizons is obvious. Yet our exploratory avatars such as the Mars rovers hardly qualify as charismatic heroes, hence the great appeal of fictional final frontiers. The complex interplay between reality and fiction is further confused by the new genre of "the science behind…" book. Frequently written by practising scientists for the likes of Star Trek, The X-Files, Dr Who, et al., the blurring of boundaries can be exemplified by one buyer of The Physics of Star Trek who compared it to A Brief History of Time (although admittedly Stephen Hawking did write the foreword to the former).

Furthermore, the designers of such disparate items as medical monitoring equipment, flip-top phones and military aircraft instrumentation have been inspired by Hollywood originals to such an extent that feedback loops now exist, with arcade simulators inspiring real hardware which in turn inspires new games. Articles discussing quantum entanglement experiments seem obliged to draw a comparison with the Star Trek matter transporter, though the transportees are as yet only photons. Theoretical physicist Miguel Alcubierre has even spent time exploring the fundamentals for a faster-than-light 'warp' drive, although it's unlikely to get beyond calculations for some little while. Blue-sky thinking is all very well, but there are plenty of more pressing issues that our finest minds could be working on...

Closer to home, it appears that a lot of the hype surrounding sustainable development is just that. Are we simply in thrall to companies hoping to make a fast buck out of fear, flogging us technologies about as useful as a chocolate teapot? A recent report suggested that the typical British home would gain only minute amounts of electricity from installing solar panels and wind turbines, although the development of spray-on solar cells may drastically improve efficiency in the next few years. But where does this leave us now? Although our species has endured sudden, severe climate changes such as the end of the last glaciation ten thousand years ago, current population density and infrastructure forbid anything as simple as packing our things and moving to higher ground. Cutting back on fossil fuel consumption is clearly necessary, but isn't it equally important to instigate long-term research programmes in case some of the triggers are due to natural causes such as the Milankovitch cycles? If global temperature increase is inevitable, never mind potential cooling in Western Europe due to a diverted Gulf Stream, then reducing greenhouse gas emissions is merely the tip of the iceberg (sorry, couldn't resist that one).

Anyone who looks back at the grandiose pipe dreams of the 1960s can see that our technological ambitions have profoundly reduced in scope since their idealistic heyday; what we have gained in the micro-scale technologies, we have lost in the giant engineering projects envisaged by the likes of Gerard O'Neill, Freeman Dyson, and Arthur C. Clarke. Yet Thunderbirds-style macho engineering is presumably the type we will need to develop if we are heading for a chain reaction of environmental change.

Restructuring an ailing climate will take more than a few decades of recycling and installation of low-energy light bulbs - we will have to mobilise people and funds on a unique scale if we are not to prove powerless against the mighty engine of Planet Earth. To this end we need to spread the message of our own insignificance, mitigated by research into alleviating the worst-case scenarios: there can be no Hollywood-style quick fixes to the immense forces ranged against us. No one could argue that even short-term weather forecasting is an exact science, so discovering whatever trouble the Quantum Weather Butterfly has in store for us will keep earth scientists engaged for many years to come (and there I go again, confusing fiction with reality, doh!)