
Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned. This was how rapidly Sandra Bullock's character acclimatised to the various space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the former have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and admittedly dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that there are some huge fundamental errors propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope, technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes its mass so that the polar diameter is less than the equatorial diameter. Earth is only about 0.3% flatter along its polar axis, but if you look at a photograph of Saturn you can see a very obvious squashing.
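    For the numerically curious, here's a quick back-of-the-envelope check in Python; the radii are commonly quoted approximate values rather than anything from this post:

```python
# Rough check of the Earth's polar flattening using commonly quoted radii.
equatorial_radius_km = 6378.1  # approximate equatorial radius
polar_radius_km = 6356.8       # approximate polar radius

flattening = (equatorial_radius_km - polar_radius_km) / equatorial_radius_km
print(f"Flattening: {flattening:.4f} (about {flattening * 100:.2f}%)")
# Prints roughly 0.0033, i.e. about 0.3%, in line with the figure above.
# Saturn's flattening is nearer 10%, hence its visibly squashed disc.
```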

  2. Continental drift is the same thing as plate tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time whilst the latter is the mechanism now accepted to account for this, with the Earth's crust moving in large segments, or plates, over the solid but slowly flowing upper mantle.

    Meteorologist and geophysicist Alfred Wegener suggested the former in 1912, but it was largely pooh-poohed until the discovery of ocean-floor spreading half a century later supplied the mechanism. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately, chaos theory shows that even the minutest of initial changes can cause major differences in outcome, which is why weather forecasting is far from an exact science (a quick numerical illustration follows at the end of this item).

    However, fossil-fuel lobbyists and religious fundamentalists aside, there is other evidence for the validity of the theory. I haven't read anything to verify this, but off the top of my head I would suggest that glacial meltwater could disrupt the warm current that currently travels north-east across the Atlantic from the Gulf of Mexico and prevents north-western Europe from having winters as cold as those on Canada's eastern seaboard. If that happened, the Isles of Scilly off the Cornish coast might face as frosty a winter as the UK mainland!
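    The sensitivity to initial conditions is easy to demonstrate numerically. Below is a minimal Python sketch using the logistic map, a textbook chaotic system; the map and the starting values are my own illustration, not a weather model:

```python
# Two runs of the chaotic logistic map x -> r*x*(1-x), differing only in
# the sixth decimal place of the starting value.
def logistic_orbit(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

run_a = logistic_orbit(0.200000)
run_b = logistic_orbit(0.200001)  # differs by one part in a million

for n in (0, 10, 20, 30):
    print(f"step {n:2d}: {run_a[n]:.6f} vs {run_b[n]:.6f}")
# Within 20-30 steps the two trajectories bear no resemblance to one another,
# despite starting almost identically - essentially why long-range forecasting
# is so hard.
```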

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a hypothesis for the mechanism.

    This isn't to say that there weren't attempts to provide one, it's just that none of them fit the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of genetics research that led to the modern evolutionary synthesis that has natural selection at its core.

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The issue stems from the equivalence of gravity and acceleration, an idea worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts have been - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further from the Earth's surface than the International Space Station (ISS), yet it is kept in orbit by the Earth's pull (okay, so there is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would push it out into a higher orbit or, beyond escape velocity, away from the Earth altogether (a rough calculation of the speed involved follows at the end of this list).

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft, a.k.a. the 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there's also microgravity and weightlessness, but as it is almost Christmas we'll save those for another day.
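As promised above, here is a rough two-body calculation of the speed the ISS needs in order to keep 'falling around' the Earth; the constants are standard approximate values and the altitude a typical ISS figure:

```python
# Circular-orbit speed and period for the ISS, assuming a simple two-body model.
import math

G = 6.674e-11           # gravitational constant, N m^2 kg^-2
M_EARTH = 5.972e24      # mass of the Earth, kg
R_EARTH = 6_371_000     # mean radius of the Earth, m
ISS_ALTITUDE = 420_000  # typical ISS altitude, m

r = R_EARTH + ISS_ALTITUDE
v = math.sqrt(G * M_EARTH / r)              # speed for a circular orbit at radius r
period_minutes = 2 * math.pi * r / v / 60   # time for one lap of the Earth

print(f"Circular orbital speed: {v / 1000:.1f} km/s")
print(f"Orbital period: {period_minutes:.0f} minutes")
# Roughly 7.7 km/s and about 93 minutes. Noticeably slower and the orbit decays;
# noticeably faster and the station climbs to a higher (or, past escape velocity,
# unbound) trajectory.
```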
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off accepting that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Saturday 15 March 2014

Cutting remarks: investigating five famous science quotations

If hearing famous movie lines being misquoted seems annoying, then misquoted or misused science citations can be exasperating, silly or downright dangerous. To this end, I thought that I would examine five well-known science quotations to find the truth behind the soundbite. By setting out the accurate (as far as I'm aware) words in the wider context in which they were said/written down/overheard by someone down the hallway, I may be able to get at the intended meaning, rather than the autopilot definition so frequently used. Here goes:

1) God does not play dice (Albert Einstein)

Possibly Einstein's most famous line, it sounds like the sort of glib comment that could be used by religious fundamentalists to denigrate science in two opposing fashions: either Einstein is being facetious and therefore sacrilegious; or he supports an old-fashioned version of conventional Judeo-Christian belief in which God can be perceived in the everyday world. Talk about having your cake and eating it!

Einstein is actually supposed to have said: "It is hard to sneak a look at God's cards. But that he would choose to play dice with the world...is something that I cannot believe for a single moment." This gives us much more material to work with: it was actually a quote Einstein himself supplied to a biographer. Some years earlier he had communicated with physicist Max Born along similar lines: "Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."

So here is the context behind the quote: Einstein's well-known disbelief in the fundamental nature of quantum mechanics. As I've discussed in a previous post, Einstein's opinion of the most accurate scientific theory ever devised was completely out of step with the majority of his contemporaries - and physicists ever since. Of course we haven't yet got to the bottom of it; speaking as a non-scientist I find the Copenhagen Interpretation nonsense. But then, many physicists have said something along the lines of: if you think you understand quantum mechanics, you haven't understood it. Perhaps at heart Einstein was stuck in a Nineteenth Century mindset, unable to conceive of fundamental limits to our knowledge or of probability lying at the heart of reality. He spent decades looking for a deeper, more obviously comfortable, cause behind quantum mechanics. And as for his interest in the 'Old One', Einstein frequently denied belief in a Judeo-Christian deity and referred to himself as an agnostic, the question of the existence of any presence worthy of the name 'God' being "the most difficult in the world". Now there's a quote worth repeating!

2) Science is a way of thinking much more than it is a body of knowledge (Carl Sagan)

As I've mentioned before, Bill Bryson's A Short History of Nearly Everything is chock full of the results of scientific investigation but rarely stops to consider the unique aspects that drive the scientific method, or even define the limits of that methodology. Sagan's full quote is: "Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility. If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along."

It is interesting because it states some obvious aspects of science that are rarely discussed, such as its subjective rather than objective nature. As human beings, scientists bring emotions, selective memory and personal preferences into their work. In addition, the socio-cultural baggage we carry is hardly ever discussed until a paradigm shift occurs (or just plain, old-fashioned time has passed) and we recognise the idiosyncrasies and prejudices embedded in earlier research. Once recognised, though, these limitations become part of the strength of the discipline: they allow us, at least eventually, to discover the effect of our frailties and the zeitgeist on what was once considered the most dispassionate branch of learning.

Sagan's repeated use of the word sceptical is also of great significance. Behind the multitude of experimental, analytical and mathematical methods in the scientific toolkit, scepticism should be the universal constant. As well as aiding the recognition of the biases mentioned above, the sceptical approach allows parsimony to take precedence over authority. It may seem a touch idealistic, especially for graduate students having to kowtow to senior faculty when seeking research positions, but open-minded young turks are vital in overcoming the conservative old guard. Einstein's contempt for authority is well-known, as he made clear by delineating unthinking respect for it as the greatest enemy of truth. I haven't read Stephen Jay Gould's Rocks of Ages: Science and Religion in the Fullness of Life, but from what I understand of his ideas, the distinction concerning authority marks a clear boundary worthy of his Non-Overlapping Magisteria.

3) The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain an agnostic (Charles Darwin)

From the original publication of On the Origin of Species in 1859 to the present day, one of the most prominent attacks on natural selection by devoutly religious critics has been the improbability of life starting without divine intervention. If we eventually find microbial life on Mars - or larger organisms on Titan, Europa or Enceladus - this may turn the tide against so easy a target, but one thing is for certain: Darwin did not attempt to detail the origin of life itself. Although he stated in a letter to a fellow scientist: "But if (and Oh! What a big if!) we could conceive in some warm little pond, with all sorts of ammonia and phosphoric salts, light, heat, electricity etc., present that a protein compound was chemically formed ready to undergo still more complex changes", there are no such broad assumptions in his public writings.

As it turns out, Darwin may have got some of the details correct, although the 'warm little pond' is more likely to have been a deep-sea volcanic vent. But we are still far from understanding the process by which inert chemicals started to make copies of themselves. It has been more than sixty years since Harold Urey and Stanley Miller at the University of Chicago produced amino acids simply by recreating the conditions then thought to have prevailed on the early Earth. Despite numerous variations on this classic experiment in subsequent decades, we are little closer to comprehending the origin of life. So it was appropriate that Darwin, who was not known for flights of fancy (he once quipped "My mind seems to have become a kind of machine for grinding general laws out of large collections of facts"), kept speculation out of his strictly evidence-based publications.

Just as Darwin has been (at times deliberately) misquoted by religious fundamentalists determined to undermine modern biology, his most vociferous disciple today, Richard Dawkins, has also been selectively quoted to weaken the scientific arguments. For example, printing just "The essence of life is statistical improbability on a colossal scale", as opposed to the full passage from The Blind Watchmaker discussing cumulative natural selection, is a cheap literary device - and one whose effect is undone only if the reader is astute enough to investigate the original source material.

4) Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: 'Ye must have faith.' (Max Planck)

Thomas Henry Huxley (A.K.A. Darwin's Bulldog) once wrote that "Science is organized common sense where many a beautiful theory was killed by an ugly fact." But that was back in the Nineteenth Century, when classical physics ruled and scientists predicted a time in the near future when they would understand all the fundamentals of the universe. In these post-modern, quantum mechanical times, uncertainty (or rather, Uncertainty) is key, and common sense goes out of the window with the likes of entanglement, etc.

Back to Planck. It seems fairly obvious that his quote tallies closely with the physics of the past century, in which highly defined speculation and advanced mathematics join forces to develop hypotheses into theories long before hard evidence can be gleaned from the experimental method. Some of the key players in quantum physics have even furthered Copernicus' preference for beautiful mathematics over observation and experiment. Consider the one-time Lucasian Professor of Mathematics Paul Dirac's partiality for the beauty of equations over experimental results, even though he considered humanity's progress in maths to be 'feeble'. The strangeness of the sub-atomic world could be seen as a vindication of these views; another of Planck's quotes is "One must be careful, when using the word, real."

Leaving aside advanced physics, there are examples in the other scientific disciplines that confirm Planck's view. In the historical sciences, you can never know the full story. For example, fossils can provide some idea of how and when a species diverged into two daughter species, but not necessarily the where and why (vis-à-vis ecological 'islands' in the wider sense). Not that this lack of precision should be taken as doubt about validity. As evolutionary biologist Stephen Jay Gould once said, a scientific fact is something "confirmed to such a degree that it would be perverse to withhold provisional assent." So what might appear to apply primarily to one segment of the scientific endeavour can be applied across all of science.

5) Space travel is utter bilge (Richard van der Riet Woolley, Astronomer Royal)

In 1956 the then Astronomer Royal made a prediction that was thoroughly disproved five years later by Yuri Gagarin's historic Vostok 1 flight. The quote has been used ever since as an example of how blind obedience to authority is unwise. But Woolley's complete quote was considerably more ambiguous: "It's utter bilge. I don't think anybody will ever put up enough money to do such a thing...What good would it do us? If we spent the same amount of money on preparing first-class astronomical equipment we would learn much more about the universe...It is all rather rot." He went on to say: "It would cost as much as a major war just to put a man on the moon." In fact, the latter appears to be quite accurate, and despite the nostalgia now aimed at the Apollo era, the lack of any follow-up only reinforces the notion that the race to the moon was simply the ultimate example of Cold War competition. After all, only one trained geologist ever got there!

However, I'm not trying to defend the edited version of Woolley's inopportune statement, since he appears to have been an armchair naysayer for several decades prior to his most famous quote. Back in 1936, his review of Rockets Through Space: The Dawn of Interplanetary Travel by the first president of the British Interplanetary Society (BIS) was even more pessimistic: "The whole procedure [of shooting rockets into space]...presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." Again, it might appear in hindsight that Woolley deserves scorn, were it not for the fact that nearly everyone with some knowledge of space and aeronautics was of a similar opinion, the opposition consisting of a few 'cranks' and the like, such as BIS members.

The moral of this story is that it is far from difficult to take a partial quote, or a statement out of context, and turn a sensible, realistic attitude (for its time and place) into an easy piece of fun. A recent tweet I saw was a plaintive request to read what Richard Dawkins actually says, rather than what his opponents claim he has said. In a worst-case scenario, quote-mining makes it possible to imply the very opposite of an author's intentions. Science may not be one hundred percent provable, but it's by far the best approach we have to finding out that wonderful thing we humans call 'the truth'.

Monday 27 January 2014

An index of possibilities: defining science at a personal level

"If a little knowledge is dangerous, where is the man who has so much as to be out of danger?" - T.H. Huxley

With a sense of revitalisation following the start of a new year - and since misconceived notions of the scientific method are legion - I thought I should put my cards on the table and delineate my personal ideas of what I believe science to be.

I suppose you could say it's a self-learning exercise as much as anything. Most people consider science the least comprehensible of all disciplines, removed from everyday experience, accessible only to a select few (a.k.a. an intellectual elite), and lacking the creativity that drives so many other aspects of our lives. But hopefully the incredible popularity of British physicist Brian Cox and other photogenic scientist-cum-science-communicators is more than a passing fad and will help in the long term to break down this damaging myth. Science is part and parcel of our existence and will only increase in importance as we try to resolve such vital issues as environmental degradation whilst still providing enough food and water for an ever-increasing population (fingers very much crossed on that one, folks!)

So here goes: my interpretation of the scientific method in ten bite-size, easy-to-swallow, chunks.
  1. A large amount of science is not difficult to comprehend
    Granted, theoretical high-energy physics is one of several areas of science difficult to describe meaningfully in a few short sound bites. But amidst the more abstruse volumes aimed at a popular readership there are some gems that break the concepts down to a level that retains the essential details without resorting to advanced mathematics. Evolutionary biologist Stephen Jay Gould noted that the fear of incompetence put many intelligent enthusiasts off learning science as a leisure activity, but with the sheer size of the popular science sections in many bookstores - there are over 840,000 books in Amazon.com's science section - there is no longer an excuse for not dipping in a toe. Leaving physics aside, there are plenty of areas of science that are easy to understand too, especially the 'historical' disciplines such as palaeontology (more on that later).
  2. Science is not a collection of facts but a way of exploring reality
    This is still one of the most difficult things to convey. Bill Bryson's prize-winning best seller A Short History of Nearly Everything reminds me of the genre of boys' own bumper books of true facts that was still around when I was a child: Victorian-style progress with a capital 'P', and science just a compilation of theories and facts akin to, say, history. The reality is of course rather more complicated. The scientific method is a way of examining nature via testable questions that can be resolved to a high degree of certainty using simplified models, either through practical experiments (repeatable and under 'laboratory conditions', these days including computer simulations) or via mathematics.
  3. Science requires creativity, not just rigor
    The stereotype of scientists as rational, unemotional beings has been broken down over the past thirty years or so, but many non-scientists still have little idea of the creative thinking that can be involved in science, particularly in cutting-edge theorising. From Einstein's thought experiments such as what it would be like to ride alongside a beam of light to the development of string theory - which has little likelihood of experimental evidence in the near future - scientists need to utilise creative thought at least as much as data collation and hard mathematics.
  4. Scientists are only human
    Scientists are far from immune to conditioned paths of thought ingrained via their social and cultural background. Therefore, rather than all scientists being equally adept at developing particular hypotheses, they are subject to the same whims and sense of normality as everyone else. In addition, individual idiosyncrasies can hinder their career. I've discussed previously how Einstein (who famously said his contempt of authority was punished by him becoming an authority himself) refused to accept some of the aspects of quantum theory long after his contemporaries had.
    Scientists could be said then to follow the stereotype visible elsewhere, namely that young radicals frequently evolve into old conservatives.
  5. If there's no proof, is it still science?
    Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once said that the 'deepest sin against the human mind is to believe things without evidence'. Yet scientific hypotheses are sometimes formed prior to any support from nature or real-world experimentation. Although Charles Darwin had plenty of evidence from artificial selection when he wrote On the Origin of Species, the fossil record at the time was extremely patchy and he had no knowledge of Mendelian inheritance. In addition, the most prominent physicists of his day were unaware of nuclear fusion, and so their theories of how stars shone implied a solar system far too young for natural selection to be the primary mechanism of evolution. By sticking to his ideas in spite of these issues, was Darwin a poor scientist? Or is it feasible that many key advances require a leap of faith - a term unlikely to please Richard Dawkins - due to a lack of solid, physical evidence?
  6. Are there two schools of science?
    New Zealand physicist Ernest Rutherford once disparagingly remarked something along the lines of physics being the only real science, and that other so-called scientific disciplines are just stamp collecting. I prefer to think of science as being composed of historical and non-historical disciplines, only occasionally overlapping. For instance, cutting-edge technological applications of physics require repeatable and falsifiable experiments - hence the deemed failure of cold fusion - whilst the likes of meteorology, evolutionary biology and palaeontology deal with innumerable historical events and/or the complexities of chaos theory, and as such are unlikely to provide duplicate circumstances for testing, or even to be reducible to simplified models that can be accurately tested.
  7. An accepted theory is not necessarily final
    A theory doesn't have to be the absolute end of a quest. For example, Newton's law of universal gravitation had to wait over two centuries for Einstein's general theory of relativity to explain the mechanism behind the phenomenon. Although quantum mechanics is the most accurate theory ever developed (in terms of the match between theory and experimental results), the root cause is yet to be understood, with wildly varying interpretations offered instead. The obvious problem with this approach is that a hypothesis may fit the facts but without an explanatory mechanism, scientists may reject it as untenable. A well-known instance of this scientific conservatism (albeit for good reasons) involved Alfred Wegener's hypothesis of continental drift, which only achieved orthodoxy decades later once plate tectonics was discovered.
  8. Scientific advance rarely proceeds by eureka moments
    Science is a collaborative effort. Few scientists work in a vacuum (except astronauts, of course!) Even the greatest of 'solo' theories such as universal gravitation was on the cards during Newton's lifetime, with contemporaries such as Edmond Halley working along similar lines. Unfortunately, our predilection for simple stories with identifiable heroes means that team leaders and thesis supervisors often receive the credit when many researchers have worked towards a goal. In addition, the priority rule is based on first publication, not when a scientist formulated the idea. Therefore many theories are named after scientists who may not have been the earliest discoverer or formulator. The work of unsung researchers is frequently neglected in favour of this simplified approach that glorifies the work of one pioneer at the expense of many others.
  9. Science is restricted by the necessity of using language to describe it
    Richard Dawkins has often railed against Plato's idealism (a.k.a. essentialism), using the phrase 'the tyranny of the discontinuous mind'. I recall a primary example of this as a child, whilst contemplating a plastic model kit I had of a Neanderthal. I wondered how the human race had evolved: specifically, how could parents of a predecessor hominid species give birth to a modern human, i.e. a child of a different species? Of course, such discontinuity is nonsense, but it is surprising how frequently our minds interpret the world in this format of neat boundaries. A large part of the problem is how to treat transitional states as the norm when our language is bound up with discrete categories. In addition, we rely on metaphor and analogy to describe aspects of the universe that do not conform to everyday experience, the nature of quantum probability being an obvious example. As with the previous point on our innate need for heroes, we are always constructing narratives, thus restricting our ability to understand nature at a fundamental level.
  10. Science does not include a moral dimension
    Science, like nature, is neither moral nor immoral and cannot provide a framework for human behaviour. Of course, this doesn't prevent scientists from being greedy or stupid, or even just naïve: witness British evolutionary biologist J.B.S. Haldane, who recommended the use of poison gas as a war weapon due to it being more humane than conventional weapons (in terms of the ratio of deaths to temporary incapacitation). This suggests that non-scientists should be involved in the decision-making process for the funding of some science projects, especially those with clear applications in mind. But in order for this to be tenable, the public needs to be considerably more scientifically literate than at present. Otherwise the appalling scare-mongering engendered by the likes of the British tabloid press - think genetically modified crops labelled as 'Frankenstein foods' - will only make matters far worse. GM crops themselves are a perfect example of why the Hollywood approach of clear-cut heroes and villains fails with most of science. Reality is rarely black or white but requires careful analysis of the myriad shades of grey.
In conclusion, it might be said that there are as many variants of science as there are human beings. Unlike in many other disciplines, mistakes and ignorance are clear strengths: as Darwin stated in The Descent of Man, 'Ignorance more frequently begets confidence than does knowledge.' Above all, there are aspects of science that are part and parcel of our everyday experience, and as such we shouldn't consider it as something to save for special occasions.

Wednesday 20 November 2013

Newton and Einstein: fundamental problems at the heart of science

As previously discussed, Arthur C. Clarke's First Law is as follows: "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong." Now there have been many examples of prominent scientists who have been proved wrong but don't want to lose their pet idea - think astronomer Fred Hoyle and the Steady State Theory - or bizarrely negated their own hypothesis, such as natural selection's co-discoverer Alfred Russel Wallace and his supernatural explanation of the human mind.

But although with hindsight we can easily mock when pioneers have failed to capitalise on a theory that later proves canonical (assuming any theory except the second law of thermodynamics can ever be said to be the final word in the matter), there are some scientists who have followed profoundly unorthodox paths of thought. In fact, I would go so far as to say that certain famous figures would find it almost impossible to maintain positions in major research institutes today. This might not matter if these were run-of-the-mill scientists, but I'm talking about two of the key notables of the discipline: Sir Isaac Newton and Albert Einstein.

The public perception of scientists has changed markedly over the past half century, from rational authority figures, via power-mad destroyers, to the uncertainties of today, when the often farcical arguments surrounding climate change have further undermined faith in scientific 'truth'. But the recognition of Newton and Einstein's achievements has never wavered, making them unassailable figures in the history of science. Indeed, if there were ever to be two undisputed champions of physics, or even of all science - as chosen by contemporary scientists, let alone the public - this contrasting pair is likely to be among the most popular. Yet underneath their profound curiosity and dogged search for truth there are fundamental elements to their personal research that make the offbeat ideas of Wallace, Hoyle & co. appear only mildly idiosyncratic by comparison.

1) Sir Isaac Newton
While some historians have tried to pass off Newton's non-scientific work as typical of his age, his writings on alchemy, eschatology and the occult in general are at least as voluminous as those on physics. Some of the more recent examinations of his work have suggested that without these pseudo-scientific studies, Newton would not have gained the mindset required to generate the scientific corpus he is renowned for. Although he claimed to have no need for hypotheses or 'occult qualities', preferring to examine natural phenomena in order to gain understanding, much of Newton's surviving notes suggest the very opposite. Whether he was using numerology to research the date of the end of the world, or alchemy to search for the Philosopher's Stone, the real Newton was clearly a many-faceted man. This led economist (and owner of some of Newton's papers) John Maynard Keynes to label him "the last of the magicians". Indeed, key aspects of Newton's personality appear entirely in tune with pseudo-science.

It is well known that Newton was a secretive man, given to hiding his discoveries for decades and unwilling to share his theories. Part of this was due to his wish to avoid wasting time with the less intelligent (i.e. just about everybody else), and part to his fear of plagiarism, which led to frequent conflicts with contemporary natural philosophers. To some extent this unwillingness to publish only exacerbated the issue, such as when Leibniz published his version of calculus some years after Newton had completed his unpublicised 'fluxions'.

Today, establishing scientific priority relies upon prompt publication, but Newton's modus operandi was much closer to the technique of the alchemist. Far from being merely an unsystematic forerunner of chemistry, alchemy was a subjective discipline, couched in metaphor and the lost wisdom of 'ancient' sages (who, after Newton's time, were frequently exposed as early Medieval or Ptolemaic Egyptian frauds). The purity of the practitioner was deemed fundamental to success, and various pseudoscientific 'influences' could prevent repeatability of results.

In addition, such knowledge as could be discovered was only to be shared between a few chosen adepts, not disseminated to a wide audience for further examination and discussion. In personality then, Newton was far more like the pre-Enlightenment alchemist than many of his contemporaries. He believed in a sense of his own destiny: that he had been chosen by God to undertake the sacred duty of decoding now-hidden patterns in the universe and history. When Descartes postulated a 'clockwork universe', Newton opposed it on the grounds that it had no place for a constantly intervening deity. And surprising as it may seem, in that respect he had a lot in common with Einstein.

2) Albert Einstein
Einstein was in many ways a much more down-to-earth (and fully rounded) human being than Newton. Whereas the latter frequently neglected such basic human activities as food and sleep, Einstein indulged in pipe tobacco and playing the violin (shades of Sherlock Holmes, indeed!) However, he was just as determined a thinker when it came to solving fundamental riddles of nature. A good anecdote, possibly true, tells of how, whilst searching for a makeshift tool to straighten a bent paperclip, Einstein came across a box of new paperclips. Yet rather than simply use one of the new ones, he shaped one into the tool required to fix the original paperclip. When questioned, he replied that once he had started a task it was difficult for him to curtail it.

But one of the oft-quoted phrases surrounding him is that Einstein would have been better off spending his last two or three decades fishing, rather than pursuing a unified field theory. The reason for this is that despite being a pioneer in the quantum theory of light, he could not accept some of the concepts of quantum mechanics, in particular that it was a fundamental theory based on probability rather than simply a starting point for some underlying aspect of nature as yet unknown.

Even today there are only interpretations of quantum mechanics, not a completely known explanation of what is occurring. However, Einstein considered these more akin to philosophy than science, and believed that following, for example, the Copenhagen interpretation prevented deeper thought about the true reality. Unfortunately for him, the majority of physicists climbed aboard the quantum mechanics bandwagon, leaving Einstein and a few colleagues to try to find holes in such strange predictions as entanglement, which Einstein dismissed with the unflattering phrase "spooky action at a distance".

Although it was only some decades after his death that such phenomena were experimentally proven, Einstein insisted that the non-common-sense aspects of quantum mechanics only showed its incompleteness. So what lay at the heart of his fundamental objections to the theory? After all, his creative brilliance had shown itself in his discovery of the mechanism behind Newtonian gravitation, no mean feat for so bizarre a theory. But his glorious originality came at a price: as with many other scientists and natural philosophers, from Johannes Kepler via Newton to James Clerk Maxwell, Einstein sought answers that were aesthetically pleasing. In effect, the desire for truth was driven by a search for beautiful patterns. As with Newton, there is the notion of wanting to understand the mind of God, however different the two men's concepts of a deity were (in Einstein's case, looking for the secrets of the 'old one').

By believing that at the heart of reality there is a beautiful truth, did Einstein hamper his ability to come to terms with such ugly and unsatisfying concepts as the statistical nature of the sub-atomic world? In this respect he seems old-fashioned, even quaint, by the exacting standards required - at least in theory - in contemporary research institutes. Critical thinking unhampered by aesthetic considerations has long been shown to be a myth when it comes to scientific insights, but did Einstein take the latter too far in his inability to accept the most important physics developed during the second half of his life? In some respects, his work after the mid-1920s is seemingly as anachronistic as Newton's pseudo-scientific interests.

As a result of even these minimal sketches, it is difficult to believe that Newton would ever have gained an important academic post if he were alive today, whilst Einstein, certainly in the latter half of his life, would probably have been relegated to a minor research laboratory at best. So although they may be giants in the scientific pantheon, it is an irony that neither would have gained such acceptance by the establishment had they been alive today. If there's a moral to be drawn here, presumably it is that even great scientists are just as much a product of their time as any other human being, even if they occasionally see further than us intellectual dwarves.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism.  In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation is usually correct. 

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. Although his work is frequently mythologised (I side with the rolling-weights brigade rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa one), Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.

Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass-bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely aimed at understanding how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever seen such miniature marvels as bacteria before. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who started the cataloguing methodology in use today. New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definition there is no bedrock for more sophisticated research.

The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that untrained amateurs could make major contributions, such as the multitude of Victorian parsons (of whose number Charles Darwin was almost one) who worked on the quantity-over-quality principle in collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to turn the results into practical applications. Although in the television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew contains 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration. In addition to the biota yet to be described in scientific records, the existing catalogues are in the process of major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, work that has so far identified some 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of natural elements, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore why phenomena occur and events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be explained as holding a magnifying glass up to nature, reducing a phenomenon to an approximation, and explaining how that approximation works?

For example, Newton took Galileo and Kepler's astronomical work and ran with it, producing his law of universal gravitation. The 'how' in this case is the inverse-square law, with its gravitational constant, which describes how bodies orbit their common centre of mass. However, Newton was unable to delineate what caused the force to act across empty space - an explanation that had to wait for stage three.
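For reference, the standard modern statement of the law (textbook form rather than Newton's own notation); equating it to the centripetal force for a circular orbit gives the orbital speed:

```latex
F = G\,\frac{m_1 m_2}{r^{2}},
\qquad
\frac{G M m}{r^{2}} = \frac{m v^{2}}{r}
\;\Longrightarrow\;
v = \sqrt{\frac{G M}{r}}
```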

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest a feedback cycle in which knowing which questions to ask is at least as important as gaining answers - the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Therefore science is a long way from recognising all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.
Though microelectronics in general and computers in particular have allowed the execution of experiments in such fields as quantum teleportation, considered close to impossible by the finest minds only half a century ago, there are several reasons why computer processing power is getting closer to a theoretical maximum using current manufacturing techniques and materials. Therefore the near future may see a slowing down in the sorts of leading edge experimental science that has been achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. Physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, then just where can you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result of this is that the boundaries between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek to find ever more accurate approximations for phenomena, phase three includes the search for why one theory is thought to be correct over another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's general relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality, due to their incompatibility with quantum mechanics. Herein lies the rub!
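Purely as an illustration of what a 'phase three' statement looks like (standard textbook notation, not something from the original post), here are the field equations in their usual compact form: spacetime curvature on the left, matter and energy on the right.

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```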

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists, to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes all with different fundamental constants (and therefore including a lucky few capable of producing life). But the obvious issue here is: wouldn't Occam's Razor suggest the former is more likely than the latter? As Astronomer Royal Sir Martin Rees states, this is veering into metaphysical territory, which, except for scientists with religious convictions, is usually an area avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, bizarre as it seems, go some way to creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark, which in the first days of their formulation were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Tuesday 29 May 2012

How to be cyantific: connecting the laboratory to the artist's studio

Moving house - or more broadly speaking, hemispheres - last year was a good excuse for a spring clean on an epic scale. One of the items that didn't make the grade, even as far as a charity shop, was a framed painting I created several decades ago, a clumsy attempt to describe scientific imagery in acrylics. In front of a false-colour radar map of the surface of Venus were the head and neck of a raptor dinosaur above a bowler-hatted figure straight out of Rene Magritte. You can judge the work for yourself below; I seem to remember the bemusement of the framer, but as I said at the time, it wasn't meant to be to everyone's taste...

But if my daub was rather wide of the mark, just how successful have attempts been to represent the theory and practice of science in the plastic, non-linear arts such as painting and sculpture? Whereas musical and mathematical ability seem to connect readily, and there has been some admirable science-influenced poetry, by comparison the visual arts are somewhat lacking. Much has been written about the Surrealists' use of psychoanalysis, but as this discipline is frequently described as a pseudoscience I've decided to cut through the issue by ignoring it and concentrating on the 'hard' sciences instead.

Combining science and art - or failing to
One of the most difficult issues to resolve (especially for those who accept C.P. Snow's theory of 'two cultures') is that whilst most science books for a general readership describe a linear progression or definitive advancement to the history of science, art has no such obvious arrow of change. After all, a century has passed since the early non-realist movements (Cubism, les Fauves, etc.) but there are plenty of contemporary artists who avoid abstraction. Granted, they are unlikely to win any of the art world's top prizes, but the progression of science and its child technology over the past three or so centuries clearly differentiates the discipline from the arts, both the sequential schools of the West and the 'traditional' aesthetics of other cultures.

Of course, it's usual to place the characters of scientists and artists about as far apart as any human behaviours can get, but like most stereotypes it doesn't take much to prove this wildly inaccurate. Anyone aware of Einstein's views ("Imagination is more important than knowledge") or his last, unsuccessful decades spent on a unification theory that ignored quantum mechanics will understand that scientists can have as imaginative and colourful a personality as any artist. Indeed, the cutting edge of theoretical science, especially physics, may rely on insights and creativity as much as advanced mathematics, a far cry from the popular image of dull, plodding scientists who follow dry, repetitive processes.

Another aspect worth mentioning is that our species appears unique in the ability to create representations of the world that can be recognised as such by most if not all of our species. Despite Congo the chimpanzee gaining enough kudos in the 1950s for Picasso and Miro to buy his paintings, as well as more recent media interest in elephant art works, there is no evidence that under controlled experimental conditions non-human artists can produce obviously realistic images unaided. Then again, could it be that we are so biased in our recognition patterns that we do not identify what passes for realism in other species? Might it be possible that other animals interpret their work as representational when to us it resembles the energetic daubs of toddlers? (This suggests shades of Douglas Adams's dolphins, who considered themselves more intelligent than humans because rather than build cities and fight wars, all they do is muck about in the water having a good time...)

So where do we start? Firstly, what about unintentional, science-generated art? Over the past decade or so there has been a spate of large format, text-light, coffee table books consisting of images taken by space probes, telescopes and Earth resources satellites. A recent internet success consisted of time lapse photography of the Earth taken by crew aboard the International Space Station; clearly, no-one spent a hundred billion US dollars or so just to make a breath-taking video, but the by-products of the project are a clear example of how science can incidentally create aesthetic work. This isn't just a contemporary phenomenon either: the earliest examples I can think of are Leonardo da Vinci's dissection drawings; in addition to being possibly the most detailed such illustrations until today's non-invasive scanning techniques they are also beautiful works of art in themselves. But then Leonardo's intentions appear to have been to both investigate the natural world for the sheer sake of learning as well as improve his painting technique by knowledge of the underlying anatomy. I wonder if there are any contemporary artists who use MRI technology or similar as a technical aid for their draftsmanship?

At the other end of the spectrum (groan), mathematician Marcus du Sautoy's 2010 BBC TV series The Beauty of Diagrams was an interesting discourse on how certain images created for a scientific purpose have become mainstream visual symbols. From Vitruvian Man, da Vinci's analysis of ideal human proportions, to the double helix diagram of DNA (incidentally first drawn by Odile Crick, an artist married to a scientist), these works integrate the transmission of information with a beautiful aesthetic. The latter example is particularly interesting in that the attempt to illustrate complex, minuscule structures in an easily understandable format has since become a mainstay of science diagrams - shorthand that is frequently interpreted by the non-specialist as a much closer representation of reality than the schematic it really is.

Physicist and writer John Gribbin has often stated that the cutting edge science of the past century, especially physics, has had to resort to allegory to describe situations at scales far removed from human sensual experience. This implies that an essential method by which science can be conveyed is via the written metaphor and visual symbolism. As we delve further into new phenomena, science may increasingly rely on art to describe ideas that cannot for the foreseeable future be glimpsed at first hand. But ironically this could have a deleterious effect on public understanding if the model is too successful, for then it becomes difficult to supplant with a more accurate theory. An obvious example is the architecture of the atom, with the familiar if highly inaccurate classical model of electrons orbiting the nucleus like a miniature solar system prevalent long after the development of quantum electrodynamics.

You might ask how difficult it would be to describe probabilities and world paths in conventional art media, but Cubism was a style attempting to combine different viewpoints of a subject into one composition. If this appears too simplistic, then it may seem more convincing once you know that physicist Niels Bohr was inspired by Cubist theories during the development of the complementarity principle of wave-particle duality. Cubism is of course only one of the more obvious visual tricks, but even the most photo-realistic painting requires techniques to convert three-dimensional reality (well, four, if you want to include time) into two dimensions. How often do we consider this conversion process in itself, which relies on a series of visual formulae to produce the desired result? It may not be science, but the production of most art isn't a haphazard or random series of actions.

It's easy to suggest that a fundamental difference between science and the plastic arts is that the former is ideally built on a combination of method and results, whilst the latter is firmly biased towards the works alone. An exception can be seen in abstract expressionism, a.k.a. action painting: at art college we were taught that, to practitioners of this school, the moment of creation was at least as important as the final result. To this end, Jackson Pollock was filmed painting as early as 1950, with numerous other artists of various movements following suit soon after. In general though, the art world runs on the rich individuals and corporations who buy the works, not on the theories of critics.

And what of art theory? Most of it isn't relevant here, but one of the fundamentals of composition is the harmony and rhythm generated by the use of mathematical ratios and sequences. The golden section and the Fibonacci sequence are frequently found in organic structures (the two are intimately linked, as the sketch below shows), so in a sense their use is a confirmation of that old adage that the purpose of art is to hold a mirror up to nature. If that sounds trite, why not examine works by contemporary artists inspired by scientific theories or methodologies? That's coming in the next post...
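
A quick illustration of that link, added here as a hedged sketch rather than anything from the original argument: the ratio of successive Fibonacci numbers converges on the golden ratio, phi = (1 + √5)/2 ≈ 1.618, which is why the two so often appear together in discussions of composition.

```python
# A small sketch showing why the Fibonacci sequence and the golden section
# are usually mentioned in the same breath: the ratio of successive
# Fibonacci numbers converges on phi = (1 + sqrt(5)) / 2.

from math import sqrt

PHI = (1 + sqrt(5)) / 2  # the golden ratio, roughly 1.6180339887

def fibonacci_ratios(n):
    """Yield the ratio of each Fibonacci number to its predecessor."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
        yield b / a

for i, ratio in enumerate(fibonacci_ratios(12), start=2):
    print(f"F({i + 1})/F({i}) = {ratio:.7f}   (phi = {PHI:.7f})")
```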

Saturday 20 March 2010

Come all ye faithful: do faith schools threaten British science education?

With the announcement of a New Life Academy opening in Hull later this year, the debate over religious education in Britain has become more intense than ever before. Of course we need to take Richard Dawkins' rhetoric with a pinch of salt, but has the current administration allowed, or even provided financial support for, fundamentalist organisations to infiltrate the British education system at the expense of science and rational thought?

The Hull Academy will follow the Accelerated Christian Education curriculum, which amongst other tenets supports the literal truth of the Bible. So how likely is it that the UK will take on aspects of the American Bible Belt, with critical thinking and enquiry subservient to dogma and absolute belief? One of the main criticisms of the ACE system is its reliance on learning by rote, yet at least in their pre-teens children have been shown to benefit from such a system. It appears to do little to quench their thirst for exploration and discovery, which if anything is largely stamped out by an exam-obsessed education system. If all learning were delivered by rote there would be an obvious problem, but in the vast majority of British faith schools this does not seem to be the case.

Alongside the four Emmanuel Schools Foundation academies, the NLA Academy is an easy target for those fearing religious extremism. But outside of Hollywood, the real world is rarely so easy to divide into good and bad. Not only are the ESF schools open to all faiths, but an Ofsted inspection failed to support the allegations that creation science was being taught. Even if these academies were heading towards US-style fundamentalism, linking their techniques to all faith schools would be akin to arguing that the majority of British Jewish children attend the Yiddish-speaking private schools of North London's Stamford Hill orthodox community. Parents who are desperate to indoctrinate their children will take a do-it-yourself approach if they cannot find a school to deliver their requirements.

Many senior religious figures of various faiths, including the Archbishop of Canterbury Dr Rowan Williams, have stated that they do not want creationism taught in schools. If there is any stereotyping in this subject, it is here: most fundamentalists concentrate solely on evolutionary theory, meaning natural selection and its implicit linking of mankind to other animals, rather than on any other branch of science. Although the age of the Earth (and therefore of the universe in general), as well as the sun-centred solar system, is sometimes denied for its disagreement with the Bible and the Koran, there are few extremists prepared to oppose other cornerstones of modern science. Clearly, would-be chemists should feel safe, potential geo- and astrophysicists less so, and those considering a career in evolutionary biology should not move to the American Midwest (or even Hull!).

More seriously, what of more subtle approaches by the mainstream denominations? A 2004 New Statesman article maligned an Anglican school in Canterbury for its attempts to inculcate infants with religious sensibilities via techniques that sounded more like a New Age cult than the Jesuit approach, but since then there has been little in the way of comparable stories. Whether senior figures in the Church of England see faith schools as a way of replenishing their ever-diminishing flock is unknown, but there is no solid evidence for such a master plan. Britain has a long and, let's face it, fairly proud history of ordained ministers who have dabbled in the sciences, although few who could be compared with the Augustinian monk Gregor Mendel, the father of modern genetics. Although T.H. Huxley (a.k.a. Darwin's bulldog) railed against the ordained amateurs, his main bone of contention concerned Anglican privilege: comfortable sinecures allowing vicars to delve into the sciences whilst the lower social orders, including Huxley, had to fight tooth and claw to establish a paid profession.

There are many examples of religiously devout scientists who can be used to defuse the caricatured 'us and them' mentality, perhaps the best-known current British example being the particle physicist the Reverend John Polkinghorne. Organisations such as the International Society for Science and Religion and the Society of Ordained Scientists, both of which claim Polkinghorne as a member, oppose intelligent design from both a faith and a science perspective. Whilst hardline atheists might deem these groups to be trying to both have their wafer and eat it, there is clearly a wide range of attitudes that support current scientific theories at the expense of a literal belief in religious texts. But then don't most Christians today express a level of belief as varied as the rituals of the numerous denominations themselves, often falling far short of accepting literal Biblical truth? Believers find their own way, and so it is with scientists who follow conventional belief systems.

However, one potential danger of teaching science in faith schools may be a relic of Darwin's contemporaries (and of course Darwin himself initially aimed for a church career), namely the well-intentioned attempt to imbue the discipline with a moral structure. Yet as our current level of knowledge clearly shows, bearing in mind everything from natural selection to asteroid impacts, we cannot ally ethical principles to scientific methods or knowledge. Scientific theories can be used for good or evil, but it is about as tenable to link science to ethics or moral development as it is to blame a cat for torturing its prey. Of course children require moral guidance, but it must be nurtured via other routes. Einstein wrote in 1930 of a 'cosmic religious feeling' that has no need for a conventional anthropomorphic deity and that to my mind seems more akin to Buddhism. As such, he believed that a key role of science (along with art) is to awaken and preserve this numinous feeling. I for one consider this as far as science can go along the road to spirituality, but I equally agree with Huxley's term agnosticism: to go beyond this in either direction, given our current and obviously primitive state of understanding, is sheer arrogance. If we wish to inculcate an open mind in our children, we must first guarantee such a thought system in ourselves. All else is indoctrination, be it religious or secular.

One of the ironies of faith schools in a nation where two-thirds of secondary school children do not see themselves as religious practitioners is that they are generally considered to supply a high standard of education and as such are usually oversubscribed. All in all, though, there is little evidence to support this notion, since any oversubscribed institution is presumably able to select a higher calibre of student whilst claiming to the contrary. Current estimates suggest 15% of British children attend faith schools, with a higher proportion in some regions (such as over 20% of London's secondary school places) but as low as 5% in more rural areas. Clearly, parents who want a good education for their children are not being put off by the worry of potential indoctrination. As has become obvious over the past few years, there are large increases in attendance at school-affiliated churches just prior to the application period: a substantial number of parents are evidently faking faith in return for what they deem to be a superior education.

For the moment it seems science education in Britain has little to worry about from the fundamentalists, at least compared to the divisiveness and homophobia that the National Secular Society deems the most prominent results of increasing faith-based education. We must be careful to ensure that as taxpayers we do not end up funding creationist institutions, but we can do little to prevent private schools following this approach. On a positive note, the closest faith school to me has a higher level of science attainment than its non-religious rivals, and I admit that I attended an Anglican school for three years and appear to have emerged with as pluralist a stance as could be wished for. Indeed, I look back fondly on the days of dangerous chemistry experiments, before health-and-safety-compliant virtual demonstrations began to supplant this fun aspect of school science: if you haven't used a burning peanut to blow the lid off a cocoa tin, you haven't lived!


Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space identifying an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and by historical accidents embedded into our language and therefore our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst Ocean would of course be more appropriate. This is just an obvious chauvinism for a land-based species, but there are other terms that owe everything to history. We count in base ten, position zero longitude through the Greenwich Meridian and usually show the Earth from one perspective, despite there being no arrow in our galaxy stating 'this way up' (but then, had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with constructs? Our calendar is an archaic, sub-optimal mish-mash, with the interpolation of July and August meaning the last four months of the year are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010: thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus, our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great, who died in 4 BC. It appears that even the fundamentals that guide us through life are subjective at the very least, if not far from accurate in many cases.

The philosopher of science Thomas Kuhn argues that all scientific research is a product of the culture of the scientists engaged in it, so whilst we might argue that Galileo was the first scientist in a strictly modern sense of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton) and those of modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else, and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even to boost our egos, via unconscious assumptions that look increasingly ridiculous as we delve ever deeper into the mysteries of creation. For example, the past sixty-five million years are frequently named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial, and we macroscopic life forms are comparative newcomers, restricted to a far narrower range of environments than bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all its languages, from Latin to English, have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience while our technology fails to keep pace in confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense differing from its vernacular meaning, problems frequently arise. A classic example is 'quantum leap', which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level (a rough sense of the scale involved is sketched below). Even personal computer pioneer Sir Clive Sinclair used the term in its former sense for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)
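
To make 'minuscule' concrete, here is a back-of-the-envelope figure of my own, using the standard textbook value of 13.6 eV for hydrogen's ionisation energy; the function name and constants are simply illustrative.

```python
# A rough sketch of how small a physicist's 'quantum leap' really is:
# the energy released when a hydrogen electron drops from level n=2 to n=1.

RYDBERG_EV = 13.6          # ionisation energy of hydrogen, in electronvolts
EV_TO_JOULES = 1.602e-19   # one electronvolt expressed in joules

def transition_energy_ev(n_from, n_to):
    """Energy of a hydrogen electron transition, in electronvolts."""
    return RYDBERG_EV * abs(1 / n_to**2 - 1 / n_from**2)

delta_e = transition_energy_ev(2, 1)
print(f"{delta_e:.1f} eV")                     # about 10.2 eV
print(f"{delta_e * EV_TO_JOULES:.2e} joules")  # about 1.6e-18 J
```

About 1.6 × 10⁻¹⁸ joules, in other words: hardly the everyday sense of a giant step forward.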

Speaking of computers, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe that are not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters put it: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.


Sunday 6 December 2009

Hawking and Dawkins: the dynamic duo

There was a time not so long ago when the defining attributes of famous British scientists were little more than a white coat, wild hair, and possibly a monocle. Today, it seems the five-second sound bite mentality of the MTV generation requires any scientist who can top a man-in-the-street poll to have some atypical personality traits, to say the least. So are the current British science superstars good role models in the way they represent science to the public, or having achieved fame are they content to ride the media gravy train, with science taking a backseat (in the last carriage, if you want to continue the metaphor)?

If today's celebrities are frequently reduced to mere caricatures of their former selves (supposing they had anything more in the first place), how can the complex subtleties of modern science survive the media simplification process? If there is one thing that defines our current state of scientific understanding, it is surely that the universe is very subtle indeed. A recent episode of The Armstrong and Miller Show highlighted this beautifully via a sketch of Ben Miller (who in real life swapped a physics PhD for luvviedom) as a professor being interviewed about his latest theory. Each time he was asked if it was possible to provide a brief description of his theory in layman's terms, he succinctly replied, "no".

Arguably the two biggest names today, at least in Britain, are Stephen Hawking and Richard Dawkins. After appearances on everything from Star Trek to The Simpsons, Hawking has overtaken Einstein as the scientific genius everyone has heard of. But, as with Einstein in his last few decades, has Hawking reached the height of fame long after completing his best work, a genius revered without comprehension by a public unaware of the latest developments in astrophysics? If it's true that theoretical physicists' main period of productivity usually falls in their twenties, Hawking is no different from other physicists of his age (remembering that he retired from the Lucasian Chair several months ago).

Hawking himself implies that his fame is compounded of demand from a lazy and scientifically non-savvy media (as in "who's the current Einstein?") twinned with the tedious if understandable interest surrounding his condition. It's probably fair to say that a physically fit Professor Hawking wouldn't be considered to provide nearly such interesting copy. Of course, to have written the best-selling (nine million copies!) A Brief History of Time was a fantastic achievement, not least for its brevity. If it (and Hawking's later ventures) succeeds in promoting scientific knowledge and methodologies then all well and good, but it's not difficult to get the feeling that he is primarily viewed as a brand name. Very little of the blame can be laid at Hawking's own door, but the question that must be asked is whether the interest in him diverts the limited media attention span for science away from a younger generation of scientists.

Richard Dawkins, on the other hand, seems to have deliberately cultivated media attention, no doubt revelling in his description as Darwin's Rottweiler. As holder of the Charles Simonyi Professorship until late last year he had an official position from which to promote public understanding, but for me his single-minded crusade has become rather tiresome. His role model, Thomas Henry Huxley, promoted science as "nothing but trained and organized common sense" whilst also espousing, via his trademark agnosticism, the notion that one should neither believe nor disbelieve a proposition without justifiable evidence. Surely Huxley's agnosticism and the ideal of the scientific method are indistinguishable?

In contrast, Dawkins' approach is to browbeat all opposition, religious, scientific or otherwise, with techniques that ironically have rather more in common with "faith viruses" than with science. His documentary The Root of All Evil? allegedly omitted interviews with religious moderates in order to concentrate on the oddballs. It's understandable that documentary producers like a clear-cut argument, but skewing the evidence to fit the theory is inexcusable for a scientist. Dawkins' use of probability is his most objective method in support of atheism, but when the law of parsimony, otherwise known as Occam's razor, cannot obviously be applied to resolve many aspects of the sub-atomic world, how can a glib argument along the lines of "I believe there's a less than even chance of the existence of a deity, therefore there isn't a deity" be accepted any more than a literal interpretation of Genesis? Warning of the increasing dangers of fundamentalism to both science and society as a whole is admirable, but promoting a simplistic thesis regarding complex, largely non-scientific issues seems more an exercise in self-promotion than anything else. And Dawkins has the cheek to say that the word 'reductionism' makes him want to reach for a weapon...

It pains me to say it, but I'm not sure either of the dynamic duo, somewhat atypical scientists as they undoubtedly are, can be said to be ideal promoters of science. If such excellent communicators as Martin Rees, Richard Fortey or Brian Cox were as well known as Hawking and Dawkins, would we be more likely to see an increase in science exposition and fewer media shenanigans? At the end of the day fame is very fickle, if the example of Magnus Pyke is anything to go by. Ubiquitous in the 1970s and '80s, Pyke appeared in everything from a best-selling pop single (and its video) to a washing machine commercial. Voted third only to Einstein and Newton as the best-known scientist ever in a 1975 New Scientist poll, this charismatic and socially aware 'boffin' is unfortunately almost forgotten today. But then an American business magazine recently claimed that Hawking was an American, no doubt lulled by the speech synthesiser into a false sense of security...
