Showing posts with label Galileo.

Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al., there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

The third book of Jonathan Swift's Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726, during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects, such as the Large Hadron Collider, could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Isn't that bizarre in itself, given that we consider ourselves so much more rational than all other animals, and that the human brain is supposedly the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but still much imitated in Shelley's time, including on human cadavers - these influences make the novel clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then scientists were a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering that science was then, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebuffed, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery show there is still plenty to be in awe of.

Mainstream cinema frequently paints a very A-versus-B picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity can live alongside its predecessor from Newton. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems such a shame that such a ubiquitous form of entertainment consistently portrays so little sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters, planning to take over the world...

Thursday 28 May 2015

Presenting the universe: 3 landmark science documentary series

They say you carry tastes from your formative years with you for the rest of your life, so perhaps this explains why there are three science documentary television series that still have the power to enchant some decades after first viewing. Whilst there has been no shortage of good television science programming since - Planet Earth and the Walking with... series amongst them - there are three that remain the standard by which I judge all others:
  1. The Ascent of Man (1972) - an account of how humanity has evolved culturally and technologically via biological and man-made tools. Presented by mathematician and Renaissance man Jacob Bronowski.
  2. Cosmos (1980) - the history of astronomy and planetary exploration, interwoven with the origins of life. Presented by Carl Sagan (as if you didn't know).
  3. The Day the Universe Changed (1985) - a study of how scientific and technological breakthroughs in Western society generate paradigm shifts. Presented by the historian of science James Burke.

All three series have been proclaimed 'landmark' shows so it is interesting to compare their themes, viewpoints and production techniques, discovering just how similar they are in many ways. For a start, their excellent production values allowed for a wide range of international locations and historical recreations. They each have a charismatic presenter who admits to espousing a personal viewpoint, although it's quite easy to note that they get progressively more casual: if Jacob Bronowski has the appearance of a warm elder statesman then Carl Sagan is the father figure for a subsequent generation of scientists; James Burke's on-screen persona is more akin to the cheeky uncle, with a regular supply of puns, some good, some less so.

To some extent it is easy to see that the earliest series begat the second, which in turn influenced the third. In fact, there is a direct link, in that Carl Sagan hired several of the producers from The Ascent of Man for his own series, clearly seeing the earlier show as a template for Cosmos. What all three have is something extremely rare in other science documentaries: a passion for the arts that promotes a holistic interpretation of humanity's development; science does not exist in isolation. As such, the programmes are supported by superbly illustrated tie-in books: those for the latter two series extend the broadcast material, whilst Bronowski's book is primarily a transcript of his semi-improvised monologue.

In addition to considering some of the standard examples of key developments in Western civilisation such as Ancient Greece and Galileo, the series include the occasional examination of Eastern cultures. The programmes also contain discussions of religions, both West and East. In fact, between them the series cover a vast amount of what has made the world the way it is. So not small potatoes, then!

The series themselves:

The Ascent of Man

To some extent, Jacob Bronowski was inspired by the earlier series Civilisation, which examined the history of Western arts. Both series were commissioned by David Attenborough, himself a natural sciences graduate who went on to present ground-breaking series in his own discipline as well as commissioning these landmark programmes. (As an aside, if there are any presenters around today who appear to embody the antithesis of C.P. Snow's 'two cultures' divide, then Sir David is surely in the top ten.)

Bronowski's presentation is an astonishingly erudite (for all its improvisation) analysis of the development of our species and its technological society. Although primarily focused on the West, there is some consideration of other regions, from the advanced steel-making technology of medieval Japan to Meso-American astronomy and the relatively static culture of Easter Island. Time and again the narrative, predating the encumbrance of political correctness, argues that it was the West that almost single-handedly generated our modern technological society - the 'rage for knowledge' for once outshining dogma and inertia.

Of course, it would be interesting to see how Bronowski might have written it today, in light of Jared Diamond's ground-breaking (in my humble opinion) Guns, Germs and Steel. Although he works hard to present science, the plastic arts, literature and myth as emerging from the same basic elements of our nature, it is clear that Bronowski considers the former to be the much rarer - and therefore more precious - discipline. Having said that, Bronowski makes a large number of Biblical references, primarily from the Old Testament. In light of the current issues with fundamentalism in the USA and elsewhere, it is doubtful that any science documentary today would so easily incorporate such a breadth of religious allusions.

If there is a thesis underlying the series, it is that since natural selection has provided humanity with a unique combination of mental gifts, we should use them to exploit the opportunities thus presented. By having foresight and imagination, our species is the only one capable of great heights - and, as he makes no pretence of hiding, terrible depths. Considering the latter, Bronowski admits that we should remain humble as to the state of contemporary knowledge and technology, which five hundred years hence will no doubt appear childlike. In addition, he states that belief in absolute knowledge can lead to arrogance; if we aspire to be gods, it can only end in the likes of Auschwitz. But his final speeches contain the wonderful notion that the path to annihilation can be avoided if science is communicated to all of society with the same vigour and zest as given to the humanities.

Cosmos

I was already an astronomy and astronautics fan when I saw this series. Its first UK broadcast slot was somewhat later than my usual bedtime, so it seemed a treat to be allowed to stay up after the rest of the family had gone to bed. Like Star Wars a few years before, it appeared to me to be an audio-visual tour-de-force; not surprisingly, both the tie-in hardback and soundtrack album arrived on my birthday that year.

Nostalgia aside, another key reason for the series' success was the charisma of the presenter himself. Much has been written of Sagan's abilities as a self-publicist, and the programmes do suffer from rather too many staring-beatifically-into-the-distance shots (as to some extent replicated more recently by Brian Cox in his various Wonders Of... series). Of course, it must have taken considerable effort to get the series made in the first place, especially in gaining a budget of over $6 million. After all, another great science populariser, the evolutionary biologist Stephen Jay Gould, never managed to gain anything beyond the occasional one-off documentary.

What is most apparent is Sagan's deep commitment to presenting science to the widest possible audience without distorting the material through over-simplification. However, in retrospect it is also obvious that he was using ideas from several scientific disciplines, such as the Miller-Urey experiment, to bolster his opinions on the likelihood of extra-terrestrial life. To some extent his co-writers reined him in, with the final episode given over not to SETI but to a plea for environmental stewardship.

Whilst the series is primarily concerned with a global history of astronomy and astrophysics, supplemented with first-hand accounts of planetary exploration, Sagan, like Bronowski, is equally at home with other scientific disciplines. He discusses the evolution of intelligence and incorporates elements of the humanities with equal aplomb. Another key element is the discussion of the role superstition and dead ends have played in hindering or even advancing scientific progress, from Pythagorean mysticism, via Kepler's conflation of planetary orbits with the five Platonic solids, to Percival Lowell's imaginary Martian canals. Although Sagan repeats his earlier debunking of astrology, UFO sightings and the like, he doesn't rule out the role of emotions in the advancement of science and technology, citing for example the rocket pioneer Robert Goddard's Mars-centred epiphany.

Perhaps the primary reason that the series - despite the obvious dating of some of the knowledge - is still so engaging and why Sagan's narration is so widely quoted, is that he was a prose poet par excellence. Even when discussing purely scientific issues, his tone was such that the information could be effortlessly absorbed whilst allowing the viewer to retain a sense of wonder. Of course, Sagan had ample assistance from his two co-writers Ann Druyan and Steven Soter, as clearly proven by their scripts for the Neil deGrasse Tyson-hosted remake Cosmos: A Spacetime Odyssey. Nonetheless, it is hard to think of another presenter who could have made the original series the success it was on so many levels.

The Day the Universe Changed

James Burke had already made a large-scale history of science and technology series, Connections, in 1978, but it took a rather different approach to some of the same material. By focussing on interactive webs of influence, the earlier series was somewhat glib, in that some of the connections could probably be replaced by equally valid alternatives.

In contrast, The Day the Universe Changed uses a more conventional approach that clearly shares some of the same perspectives as the earlier programmes. Like The Ascent of Man and the Cosmos remake, mediaeval Islamic science is praised for its inquisitiveness as well as the preservation of Classical knowledge. Burke was clearly influenced by his predecessors, even subtitling the series 'A Personal View by James Burke'. Perhaps inevitably he covers some of the same material too, although it would be difficult to create a brief history without reference to Newton or Ancient Greece.

As with Bronowski, Burke integrates scientific advances within wider society, a notable example being the rediscovery of perspective and its profound effect on contemporary art. He also supports the notion that paradigm shifts, rather than a gradual series of changes, are fundamental to major scientific breakthroughs. In effect, he claims that new versions of the truth - as understood by a scientific consensus - may rely on the abandonment of previous theories due to their irreconcilable differences. Having recently read Rachel Carson's 1951 The Sea Around Us I can offer some agreement: although Carson's geophysical analysis quietly screams in favour of plate tectonics, the contemporary lack of evidence led her to repeat what was no doubt the establishment mantra of the period concerning static land masses.

What Burke constantly emphasises, even more than his predecessors, is that time and place have a fundamental influence on the scientific enquiry of each period. Being immersed in the preconceived notions of their culture, scientists can find it as difficult as anyone else to gain an objective attitude. In actuality, it is all but impossible, leading to such farcical dead-ends as Piltdown Man, a hoax that lasted for decades because it fulfilled the jingoistic expectations of British scientists. Burke's definition of genius is someone who can escape the givens of their background and thus achieve mental insights that no amount of methodical plodding can equal. Well, perhaps, on occasion.

The series also goes further than its predecessors in defining religion as anti-scientific on two grounds: its demand for absolute obedience in the face of logic and evidence, with reference to Galileo; and its lack of interest in progress, as with the cyclical yet static Buddhist view, content for the universe to endlessly repeat itself. Burke also shows how scientific ideas can be perverted for political ends, as with social Darwinism. But then he goes on to note that as the world gets ever more complex, and changes at an ever faster rate, non-specialists are unable to test new theories to any degree and so have to rely on authority just as much as they did before the Enlightenment. How ironic!

All in all, these common threads are to my mind among the most important elements of the three series:
  1. Science and the humanities rely on the same basic processes of the human brain and so are not all that different;
  2. Scientific thinking can be as creative an endeavour as the arts;
  3. Scientists don't live in a cultural vacuum but are part and parcel of their world and time;
  4. Religion is the most change-resistant of human activities and therefore rarely appears sympathetic to science's aims and goals.

As Carl Sagan put it, "we make our world significant by the courage of our questions and the depth of our answers." For me, these three series are significant for their appraisal of some of those courageous explorers who have given us the knowledge and tools we call science.


Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned. This was how rapidly Sandra Bullock's character acclimatised to the several space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the former have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and naturally dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that there are some huge fundamental errors propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope, technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes its mass so that the polar diameter is less than the equatorial diameter. Earth's polar diameter (roughly 12,714 km) is only about 0.3% shorter than its equatorial one (roughly 12,756 km), but if you look at a photograph of Saturn you can see a very obvious squashing.

  2. Continental drift is the same thing as plate-tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time whilst the latter is the mechanism now accepted to account for this, with the Earth's rigid outer shell broken into large segments or plates that move over the slowly deforming (not actually liquid) mantle.

    The meteorologist Alfred Wegener suggested the former in 1912, but it was largely pooh-poohed until the discovery of ocean-floor spreading half a century later supplied a plausible mechanism. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately, chaos theory shows that even the minutest of initial changes can cause major differences in outcome, hence weather forecasting being far from an exact science (a short numerical sketch of this sensitivity appears after this list).

    However, there is plenty of other evidence for the validity of the theory, fossil-fuel lobbyists and religious fundamentalists notwithstanding. I haven't read anything to verify this, but off the top of my head I would suggest that glacial meltwater could disrupt the warm current that travels north-east across the Atlantic from the Gulf of Mexico (and prevents north-western Europe from having winters as cold as those of Canada's eastern seaboard). If so, the Isles of Scilly off the Cornish coast might one day face as frosty a winter as the UK mainland!

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a hypothesis for the mechanism.

    This isn't to say that there weren't attempts to provide one; it's just that none of them fitted the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of the genetics research that led to the modern evolutionary synthesis, which has natural selection at its core.

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The issue is due to the equivalence of gravity and acceleration, an idea which was worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts are - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further from the Earth's surface than the International Space Station (ISS), yet it is still kept in orbit by the Earth's pull (okay, so strictly it is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would cause it to fly off into deep space (a sketch of the orbital-speed arithmetic appears after this list).

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft A.K.A. 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there's also microgravity and weightlessness, but as it is almost Christmas we'll save that for another day.
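
As promised in point 3, here is a minimal Python sketch of that sensitivity to initial conditions, using the logistic map - a standard textbook toy model of chaos rather than anything drawn from real meteorology. The parameter value and the two starting points are arbitrary illustrative choices.

    # Logistic map x -> r*x*(1-x): a classic toy example of chaotic behaviour.
    # Two 'measurements' of the same starting state differ by one part in a million.
    r = 4.0                            # a parameter value in the map's chaotic regime
    x_a, x_b = 0.400000, 0.400001
    for step in range(1, 31):
        x_a = r * x_a * (1 - x_a)
        x_b = r * x_b * (1 - x_b)
        if step % 10 == 0:
            # the gap roughly doubles per step until it saturates at the scale of the values themselves
            print(step, abs(x_a - x_b))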
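
For point 5, the 'certain speed' has a simple textbook formula: a circular orbit requires v = sqrt(G*M/r). Below is a minimal Python sketch using rough published values for the Earth and a nominal 400 km ISS altitude - approximations for illustration, not mission data.

    import math

    G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
    M_earth = 5.972e24             # mass of the Earth, kg
    r = 6371e3 + 400e3             # Earth's mean radius plus ISS altitude, in metres

    v = math.sqrt(G * M_earth / r)           # circular-orbit speed: roughly 7.7 km/s
    period_min = 2 * math.pi * r / v / 60    # one orbit takes roughly 92 minutes
    print(round(v), "m/s,", round(period_min), "minutes per orbit")
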
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off appreciating that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read about anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism.  In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation is usually correct. 

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. His work is frequently mythologised (I side with the rolling-weights-down-inclined-planes brigade rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa one), but Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.
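
As a rough modern illustration of what Galileo was teasing apart, here is a minimal Python sketch - with entirely arbitrary toy numbers, not a reconstruction of his experiments - comparing fall times with and without a crude quadratic air-drag term. Without drag the time is identical for every mass; with drag it is not.

    # Toy simulation of a falling ball, with and without a simple quadratic air-drag term.
    def fall_time(mass_kg, drag_coeff, height_m=50.0, dt=0.001, g=9.81):
        v, h, t = 0.0, height_m, 0.0
        while h > 0.0:
            drag = drag_coeff * v * v / mass_kg   # deceleration due to air resistance
            v += (g - drag) * dt
            h -= v * dt
            t += dt
        return t

    for m in (0.1, 1.0, 10.0):                    # light, medium and heavy balls
        print(m, "kg: no drag", round(fall_time(m, 0.0), 2), "s,",
              "with drag", round(fall_time(m, 0.01), 2), "s")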

Of course, Galileo was primarily interested in physics but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass-bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely aimed at understanding how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever before seen miniature marvels such as bacteria. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who devised the cataloguing methodology still in use today. The New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of that statement, it seems obvious that without agreed standards of basic definitions there is no bedrock for more sophisticated research.

The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that essentially untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose number Charles Darwin very nearly joined) who worked on the quantity-over-quality principle in collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to utilise the results in practical applications. Although in the BBC television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew contains 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration.

In addition to the biota yet to be described in scientific records, the existing catalogues are in the process of major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, having so far led to the finding of 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of natural elements, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore why phenomena occur and events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be explained as holding a magnifying glass up to nature, reducing a phenomenon to an approximation, and explaining how that approximation works?

For example, Newton took Galileo's and Kepler's astronomical work and ran with it, producing his law of universal gravitation. The 'how' in this case is the inverse-square law, with its gravitational constant, which describes how bodies orbit their common centre of mass. However, Newton was unable to say what caused the force to act across infinite, empty space, an explanation that had to wait for stage three.
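
To make that 'how' concrete: the law states that the force between two masses is F = G*m1*m2/r^2, so the acceleration it produces falls off with the square of the distance. Here is a minimal Python sketch of the classic 'Moon test', using rough modern textbook values (approximations for illustration only): a body sixty Earth radii away should 'fall' towards the Earth about 60^2 = 3,600 times more gently than an apple at the surface.

    # Newton's inverse-square law: gravitational acceleration a = G*M/r^2.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_earth = 5.972e24     # mass of the Earth, kg
    R_earth = 6.371e6      # Earth's mean radius, m
    r_moon = 3.844e8       # mean Earth-Moon distance, m (about 60 Earth radii)

    g_surface = G * M_earth / R_earth**2   # ~9.8 m/s^2: the apple's acceleration
    a_moon = G * M_earth / r_moon**2       # ~0.0027 m/s^2: the Moon's 'fall' towards Earth
    print(round(g_surface / a_moon))       # ratio of roughly 3,600, i.e. about 60 squared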

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest that there is a feedback cycle in which knowing which questions to ask is at least as important as gaining answers, the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Therefore science is a long way from recognising all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.

Though microelectronics in general and computers in particular have allowed the execution of experiments in such fields as quantum teleportation, considered close to impossible by the finest minds only half a century ago, there are several reasons why computer processing power is getting close to a theoretical maximum using current manufacturing techniques and materials. Therefore the near future may see a slowing down in the sort of leading-edge experimental science that has been achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. The physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, then just where can you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result of this is that the boundaries between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek to find ever more accurate approximations for phenomena, phase three includes the search for why one theory is thought to be correct over another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's general relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality due to their incompatibility with quantum mechanics. Herein lies the rub!

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and, more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists, to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes all with different fundamental constants (and therefore including a lucky few capable of producing life). But the obvious issue here is that wouldn't Occam's Razor suggest the former is more likely than the latter? As the Astronomer Royal Sir Martin Rees states, this is veering into metaphysical territory, which, except for the scientists with religious convictions, is usually an area avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way to creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark, which in the first days of their formulation were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space identifying an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and historical accidents embedded into our language and therefore our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst of course Ocean would be more appropriate. Whilst this is just an obvious chauvinism for a land-based species, there are other terms that owe everything to history. We count in base ten, position zero longitude through the Greenwich Meridian and usually show the Earth from one perspective, despite there not being an arrow in our galaxy stating 'this way up' (but then had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with constructs? Our calendar is an archaic, sub-optimal mish-mash: the old Roman year originally began in March (July and August being renamed rather than inserted months), which is why the last four months of the year are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010, thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great. It appears that even the fundamentals that guide us through life are subjective at the very least, if not far from accurate in many cases.

The philosopher of science Thomas Kuhn argued that all scientific research is a product of the culture of the scientists engaged on those projects. So whilst we might argue that Galileo was the first scientist in a strictly modern use of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton) and those of the modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even boost our egos, via unconscious assumptions that are gradually looking more ridiculous as we delve ever deeper into the mysteries of creation. For example, the past sixty-five million years has been a period frequently named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial and we macroscopic life forms are comparative newcomers, restricted to a far reduced range of environments compared to bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all languages from Latin to English have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience while our technology fails to keep pace with confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense that differs from its vernacular meaning, problems frequently arise. A classic example would be 'quantum leap', which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level. However, even personal computer pioneer Sir Clive Sinclair used the term in its former meaning for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)

Speaking of which, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters stated: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.
