
Friday 21 December 2018

The Twelve (Scientific) Days Of Christmas

As Christmas approaches and we become over-saturated with seasonal pop songs and the occasional carol, I thought it would be appropriate to look at a science-themed variation on this venerable lyric. So without further ado, here are the twelve days of Christmas, STEM-style.

12 Phanerozoic periods

Although there is evidence that life on Earth evolved pretty much as soon as the conditions were in any way suitable, microbes had the planet to themselves for well over three billion years. Larger, complex organisms may have gained a kick-start thanks to a period of global glaciation - the controversial Snowball Earth hypothesis. Although we often hear of exoplanets being found in the Goldilocks zone, it may also take an awful lot of luck to produce a life-bearing environment. The twelve geological periods of the Phanerozoic (literally, 'visible life') cover the past 542 million years or so and include practically every species most of us have ever heard of. Hard to believe that anyone who knows this could ever consider our species to be the purpose of creation!

11 essential elements in humans

We often hear the phrase 'carbon-based life forms', but we humans actually contain over three times as much oxygen as carbon. In order of abundance by mass, the eleven vital elements are oxygen, carbon, hydrogen, nitrogen, calcium, phosphorus, potassium, sulfur, sodium, chlorine and magnesium. Iron, which you might think to be present in larger quantities, is just a trace mineral; adults have a mere 3 or 4 grams. By comparison, we have about 25 grams of magnesium. In fact, iron and the other trace elements amount to less than one percent of our total body mass. Somehow, 'oxygen-based bipeds' just doesn't have the same ring to it.
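
To put rough numbers on this, here is a minimal sketch in Python (the mass fractions are rounded, indicative textbook figures rather than precise values) showing how the eleven major elements stack up and how little is left over for iron and the other trace elements:

    # Approximate mass fractions (%) of the eleven major elements in an adult human.
    major_elements = {
        "oxygen": 65.0, "carbon": 18.5, "hydrogen": 9.5, "nitrogen": 3.2,
        "calcium": 1.5, "phosphorus": 1.0, "potassium": 0.4, "sulfur": 0.3,
        "sodium": 0.2, "chlorine": 0.2, "magnesium": 0.1,
    }

    total = sum(major_elements.values())
    print(f"Major elements: {total:.1f}% of body mass")          # ~99.9%
    print(f"Iron and other trace elements: {100 - total:.1f}%")  # well under 1%
    print(f"Oxygen vs carbon: {major_elements['oxygen'] / major_elements['carbon']:.1f}x")  # ~3.5x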

10 fingers and toes

The evolution of life via natural selection and genetic mutation consists of innumerable one-off events. This is science as history, although comparative studies of fossils, DNA and anatomy are required instead of written texts and archaeology. It used to be thought that ten digits were canonical, tracing back to the earliest terrestrial vertebrates that evolved from lobe-finned fish. Then careful analysis of the earliest stegocephalians of the late Devonian period, such as Acanthostega, showed that their limbs terminated in six, seven or even eight digits. The evolution of five-digit limbs seems to have occurred only once, in the subsequent Carboniferous period, yet of course we take it - and the use of base ten counting - as the most obvious of things. Just imagine what you could play on a piano if you had sixteen fingers!

9 climate regions

From the poles to the equator, Earth can be broadly divided into the following climate areas: polar and tundra; boreal forest; temperate forest; Mediterranean; desert; dry grassland; tropical grassland; tropical rainforest. Mountains are the odd region out, appearing at any latitude that provides the geophysical conditions suitable for their formation. Natural selection leads to the evolution of species suited to the local variations in daylight hours, weather and temperature, but the labels can be deceptive; the Antarctic, for example, contains a vast polar desert. We are only just beginning to understand the complex feedback systems between each region and its biota at a time when species are becoming extinct almost faster than they can be catalogued. We upset the relative equilibrium at our peril.

8 major planets in our solar system

When I was a child, all astronomy books described nine known planets, along with dozens of moons and numerous asteroids. Today we know of almost four thousand planets in other solar systems, some of a similar size to Earth (and even some of these in the Goldilocks zone). However, since 2006 our solar system has been reduced to eight planets, with Pluto demoted to the status of a dwarf planet. Technically, this is because it fails one of the three criteria of major planets: it has not swept its orbital neighbourhood clear of other bodies, and it even crosses Neptune's orbit. However, as there is at least one Kuiper belt object, Eris, almost as large as Pluto, it makes sense to stick to a definition that won't see the number of planets continually rise with each generation of space telescope. This downgrading appears to have upset a lot of people, so it's probably a good idea to mention that science is as much a series of methodologies as it is a body of knowledge, with the latter being open to change when required - it's certainly not set-in-stone dogma! So as astronomer Neil deGrasse Tyson, author of the best-selling The Pluto Files: The Rise and Fall of America's Favorite Planet, put it: "Just get over it!"

7 colours of the rainbow

This is one of those everyday things that most of us never think about. Frankly, I don't know anyone who has been able to distinguish indigo from violet in a rainbow, and yet we owe this colour breakdown not to an artist but to one of the greatest physicists ever, Sir Isaac Newton. As well as fulfilling most of the criteria of the modern-day scientist, Newton was also an alchemist, numerologist, eschatologist (one of his predictions was that the world would end in 2060) and all-round occultist. Following the mystical beliefs of the Pythagoreans, Newton linked the colours of the spectrum to the notes of the Western musical scale, hence near-indistinguishable indigo bringing the total to seven. This is a good example of how even the best of scientists are only human.

6 mass extinction events

Episode two of the remake of Carl Sagan's Cosmos television series, hosted by Neil deGrasse Tyson, was called 'Some of the Things That Molecules Do'. It explored the five mass extinction events that have taken place over the past 450 million years. Tyson also discusses what has come to be known as the Holocene extinction: the current, sixth period of mass dying. Although the loss of megafauna species around the world has been blamed on the arrival of Homo sapiens over the past 50,000 years, the rapid acceleration of species loss over the last ten millennia is shocking in the extreme. It is estimated that the current extinction rate is anywhere from a thousand to ten thousand times the background rate, resulting in the loss of up to two hundred plant or animal species every day. Considering that two-thirds of our pharmaceuticals are derived from or based on biological sources, we really are shooting ourselves in the foot. And that's without considering the advanced materials that we could develop from nature.
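
As a back-of-envelope check on those figures, here is a minimal sketch in Python (the species count and background rate are rough, commonly quoted assumptions rather than settled values):

    # Rough assumptions: ~8.7 million species, and a 'background' extinction
    # rate of about one extinction per million species per year.
    species_total = 8_700_000
    background_rate = 1e-6          # extinctions per species per year

    for multiplier in (1_000, 10_000):
        per_year = species_total * background_rate * multiplier
        print(f"{multiplier:>6}x background: ~{per_year:,.0f} species per year, "
              f"~{per_year / 365:.0f} per day")
    # 1,000x  -> ~8,700 per year, ~24 per day
    # 10,000x -> ~87,000 per year, ~238 per day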

5 fundamental forces

Also known as interactions, the four confirmed forces, in order from strongest to weakest, are: the strong nuclear force; electromagnetism; the weak nuclear force; and gravity. One of the most surprising finds in late Twentieth Century cosmology was that as the universe expands, it is being pushed apart at an ever-greater speed. The culprit has been named dark energy, but that's where our knowledge of this possible fifth force ends. Although it appears to account for about 68% of the total energy of the known universe, the label 'dark' refers to the complete lack of understanding as to how it is generated. Perhaps the most radical suggestion is that Einstein's General Theory of Relativity is incorrect and that an overhaul of the mechanism behind gravity would remove the need for dark energy altogether. One thing is for certain: we still have a lot to learn about the wide-scale fabric of the universe.

4 DNA bases

Despite being one of the best-selling popular science books ever, Bill Bryson's A Short History of Nearly Everything manages to include a few howlers, including listing thiamine (AKA vitamin B1) as one of the four bases, instead of thymine. In addition to showing how the bases (adenine, cytosine, guanine and thymine) pair up between the two strands of the double helix, the 1953 discovery of DNA's structure also suggested the replication mechanism, in turn leading to the development of the powerful genetic editing tools in use today. Also, the discovery itself shows how creativity can be used in science: Watson and Crick's model-building technique proved to be a faster way of generating results than the more methodical X-ray crystallography of Rosalind Franklin and Maurice Wilkins - although it should be noted that one of Franklin's images gave her rivals a clue as to the correct structure. The discovery also shows that collaboration is often a vital component of scientific research, as opposed to the legend of the lonely genius.
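
The base pairing at the heart of that replication mechanism is simple enough to express in a few lines. Here is a minimal sketch in Python (the sequence is an arbitrary example) that generates the complementary strand for a given stretch of DNA:

    # Watson-Crick pairing: adenine with thymine, cytosine with guanine.
    PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def complement(strand: str) -> str:
        """Return the complementary strand, read in the opposite direction."""
        return "".join(PAIRS[base] for base in reversed(strand))

    print(complement("ATGCGTAC"))   # GTACGCAT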

3 branches of science

When most people think of science, they tend to focus on the stereotypical white-coated boffin, beavering away in a laboratory filled with complex equipment. However, there are numerous branches or disciplines, covering the purely theoretical, the application of scientific theory, and everything in between. Broadly speaking, science can be divided into the formal sciences, natural sciences and social sciences, each covering a variety of categories themselves. The formal sciences include mathematics and logic and have an air of absolutism about them (2+2=4). The natural or 'hard' sciences are what we learn in school science classes and broadly divide into physics, chemistry and biology. These use observation and experiment to develop working theories, but maths is often a fundamental component of the disciplines. The social or 'soft' sciences speak for themselves, with sub-disciplines such as anthropology sometimes crossing over into humanities such as archaeology. So when someone tells you that all science is impossibly difficult, you know they obviously haven't considered just what constitutes science!

2 types of fundamental particles

Named after Enrico Fermi and Satyendra Nath Bose respectively, fermions and bosons are the fundamental building blocks of the universe. The former, for example quarks and electrons, are the particles of matter and obey the Pauli Exclusion Principle, meaning no two identical fermions can occupy the same quantum state. The latter are the carriers of force, with photons being the best-known example. One problem with these particles and their properties, such as angular momentum or spin, is that most analogies are only vaguely appropriate. After all, we aren't used to an object that has to rotate 720 degrees in order to get back to its original state! In addition, there are many aspects of underlying reality that are far from being understood. String theory was once mooted as the great hope for unifying all the fermions and bosons, but has yet to achieve absolute success, while the 2012 discovery of the Higgs boson is only one potential advance in the search for a Grand Unified Theory of creation.
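
That 720-degree quirk can be shown directly with the standard quantum description of a spin-half particle. Here is a minimal sketch in Python (it assumes NumPy is available) that rotates a 'spin up' state about one axis: a 360-degree turn flips the sign of the state, and only a second full turn restores it.

    import numpy as np

    # Pauli z matrix; rotating a spin-1/2 state by theta about z applies
    # exp(-i * theta * sigma_z / 2) = cos(theta/2) * I - i * sin(theta/2) * sigma_z.
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

    def rotate(theta):
        return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sigma_z

    spin_up = np.array([1, 0], dtype=complex)

    print(rotate(2 * np.pi) @ spin_up)   # ~[-1, 0]: 360 degrees reverses the sign
    print(rotate(4 * np.pi) @ spin_up)   # ~[ 1, 0]: 720 degrees restores the original state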

1 planet Earth

There is a decorative plate on my dining room wall that says "Other planets cannot be as beautiful as this one." Despite the various Earth-sized exoplanets that have been found in the Goldilocks zone of their solar systems, we have little chance in the near future of finding out if they are inhabited as opposed to merely habitable. Although the seasonal methane on Mars hints at microbial life there, any human colonisation will be a physically and psychologically demanding ordeal. The idea that we can use Mars as a lifeboat to safeguard our species - never mind our biosphere - is little more than a pipe dream. Yet we continue to exploit our home world with little consideration for the detrimental effects we are having on it. As the environmental movement says: there is no Planet B. Apart from the banning of plastic bags in some supermarkets, little else appears to have been done since my 2010 post on reduce, reuse and recycle. So why not make a New Year's resolution to help future generations? Wouldn't that be the best present for your children and your planetary home?

Wednesday 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to one search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit to explore reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works, is it any wonder that the plethora of online sources - of immensely variable quality - can cause confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model and so culture, gender, politics and even religion play their part as to what is funded and how the results are presented (or even which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expense associated with producing popular science books means that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy to find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers, largely on general news sites, and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results) but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes, the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So in the spirit of science (okay, so it's hardly comprehensive being just a single trial) I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it would only last another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that in those early days of the hypothesis, very little could be confirmed with any degree of certainty.

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases lead to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years, it has been stable for the past 25 million and so is no longer deteriorating. A third, far less common category concerns informal polls of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either category to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find with cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with paragraph text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach is friendly to organic search at the same time as amalgamating socio-political concerns with the science.

You might expect that the results would show a change in trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far only offering support to the rapid degeneration hypothesis. The reason for this difference is readily apparent: the publication of a Danish study that bolsters it. This new report is available online, but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have to either rely upon second-hand summaries or, if there was enough time, wait for the next popular science book that discusses it in layman's terms.

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it only examined the genes of sixty-two Danish men, so I have no idea if this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better sort of popular science books would answer these queries, such is my faith in the general superiority of print volumes!

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research was only applicable to other types of animals, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hopes of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly-resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?) why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al., there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European emigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Although it is far more concerned with social and political issues than with an anti-scientific stance as such, its material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Isn't that bizarre in itself, for a species that considers itself so much more rational than all other animals and whose brain is supposedly the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but much imitated still in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then science and its practitioners were a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering science was, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebuffed, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery show there is still plenty to be in awe of.

Mainstream cinema frequently paints a very A-versus-B picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity can live alongside its predecessor from Newton. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems such a shame that such a ubiquitous form of entertainment consistently portrays such a lack of sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters bases, planning to take over the world...

Thursday 28 May 2015

Presenting the universe: 3 landmark science documentary series

They say you carry tastes from your formative years with you for the rest of your life, so perhaps this explains why there are three science documentary television series that still have the power to enchant some decades after first viewing. Whilst there has been no shortage of good television science programming since - Planet Earth and the Walking with... series amongst them - there are three that remain the standard by which I judge all others:
  1. The Ascent of Man (1972) - an account of how humanity has evolved culturally and technologically via biological and man-made tools. Presented by mathematician and renaissance man Jacob Bronowski.
  2. Cosmos (1980) - the history of astronomy and planetary exploration, interwoven with the origins of life. Presented by Carl Sagan (as if you didn't know).
  3. The Day the Universe Changed (1985) - a study of how scientific and technological breakthroughs in Western society generate paradigm shifts. Presented by the historian of science James Burke.

All three series have been proclaimed 'landmark' shows so it is interesting to compare their themes, viewpoints and production techniques, discovering just how similar they are in many ways. For a start, their excellent production values allowed for a wide range of international locations and historical recreations. They each have a charismatic presenter who admits to espousing a personal viewpoint, although it's quite easy to note that they get progressively more casual: if Jacob Bronowski has the appearance of a warm elder statesman then Carl Sagan is the father figure for a subsequent generation of scientists; James Burke's on-screen persona is more akin to the cheeky uncle, with a regular supply of puns, some good, some less so.

To some extent it is easy to see that the earliest series begat the second, which in turn influenced the third. In fact, there is a direct link in that Carl Sagan hired several of the producers from The Ascent of Man for his own series, clearly seeing the earlier show as a template for Cosmos. What all three have is something extremely rare in other science documentaries: a passion for the arts that promotes a holistic interpretation of humanity's development; science does not exist in isolation. As such, the programmes are supported by superbly-illustrated tie-in books: those for the latter two series extend the broadcast material, whilst Bronowski's book is primarily a transcript of his semi-improvised monologue.

In addition to considering some of the standard examples of key developments in Western civilisation such as Ancient Greece and Galileo, the series include the occasional examination of Eastern cultures. The programmes also contain discussions of religions, both West and East. In fact, between them the series cover a vast amount of what has made the world the way it is. So not small potatoes, then!

The series themselves:

The Ascent of Man

To some extent, Jacob Bronowski was inspired by the earlier series Civilisation, which examined the history of Western arts. Both series were commissioned by David Attenborough, himself a natural sciences graduate who went on to present ground-breaking series in his own discipline as well as commissioning these landmark programmes. (As an aside, if there are any presenters around today who appear to embody the antithesis of C.P. Snow's 'the two cultures' then Sir David is surely in the top ten).

Bronowski's presentation is an astonishingly erudite (for all its improvisation) analysis of the development of our species and its technological society. Although primarily focused on the West, there is some consideration of other regions, from the advanced steel-making technology of medieval Japan to Meso-American astronomy or the relatively static culture of Easter Island. Time and again, the narrative predates the encumbrance of political correctness: that it was the West that almost solely generated our modern technological society - the 'rage for knowledge' for once outshining dogma and inertia.

Of course, it would be interesting to see how Bronowski might have written it today, in light of Jared Diamond's ground-breaking (in my humble opinion) Guns, Germs and Steel. Although he works hard to present science, the plastic arts, literature and myth as emerging from the same basic elements of our nature, it is clear that Bronowski considers the former to be the much rarer - and therefore more precious - discipline. Having said that, Bronowski makes a large number of Biblical references, primarily from the Old Testament. In light of the current issues with fundamentalism in the USA and elsewhere, it is doubtful that any science documentary today would so easily incorporate the breadth of religious allusions.

If there is a thesis underlying the series it is that, since natural selection has provided humanity with a unique combination of mental gifts, we should use them to exploit the opportunities thus presented. By having foresight and imagination, our species is the only one capable of great heights - and, as he makes no attempt to hide, terrible depths. Considering the latter, Bronowski admits that we should remain humble as to the state of contemporary knowledge and technology, which five hundred years hence will no doubt appear childlike. In addition, he states that belief in absolute knowledge can lead to arrogance; if we aspire to be gods, it can only end in the likes of Auschwitz. But his final speeches contain the wonderful notion that the path to annihilation can be avoided if science is communicated to all of society with the same vigour and zest as given to the humanities.

Cosmos

I was already an astronomy and astronautics fan when I saw this series. Its first UK broadcast slot was somewhat later than my usual bedtime, so it seemed a treat to be allowed to stay up after the rest of the family had gone to bed. Like Star Wars a few years before, it appeared to me to be an audio-visual tour-de-force; not surprisingly, both the tie-in hardback and soundtrack album arrived on my birthday that year.

Nostalgia aside, another key reason for the series' success was the charisma of the presenter himself. Much has been written of Sagan's abilities as a self-publicist, and the programmes do suffer from rather too many staring-beatifically-into-the-distance shots (as to some extent replicated more recently by Brian Cox in his various Wonders Of... series). Of course, it must have taken considerable effort to get the series made in the first place, especially in gaining a budget of over $6 million. After all, another great science populariser, the evolutionary biologist Stephen Jay Gould, never managed to gain anything beyond the occasional one-off documentary.

What is most apparent is Sagan's deep commitment to presenting science to the widest possible audience without distorting the material through over-simplification. However, in retrospect it is also obvious that he was using ideas from several scientific disciplines, such as the Miller-Urey experiment, to bolster his opinions on the likelihood of extra-terrestrial life. To some extent his co-writers reined him in, the final episode given over not to SETI but to a plea for environmental stewardship.

Whilst the series is primarily concerned with a global history of astronomy and astrophysics, supplemented with first-hand accounts of planetary exploration, Sagan like Bronowski is equally at home with other scientific disciplines. He discusses the evolution of intelligence and incorporates elements of the humanities with equal aplomb. Another key element is the discussion of the role superstition and dead ends have played in the hindrance or even advancement of scientific progress, from Pythagorean mysticism, via Kepler's conflation of planetary orbits with the five Platonic solids, to Percival Lowell's imaginary Martian canals. Although Sagan repeats his earlier debunking of astrology, UFO sightings and the like, he doesn't rule out the role of emotions in the advancement of science and technology, citing for example the rocket pioneer Robert Goddard's Mars-centred epiphany.

Perhaps the primary reason that the series - despite the obvious dating of some of the knowledge - is still so engaging and why Sagan's narration is so widely quoted, is that he was a prose poet par excellence. Even when discussing purely scientific issues, his tone was such that the information could be effortlessly absorbed whilst allowing the viewer to retain a sense of wonder. Of course, Sagan had ample assistance from his two co-writers Ann Druyan and Steven Soter, as clearly proven by their scripts for the Neil deGrasse Tyson-hosted remake Cosmos: A Spacetime Odyssey. Nonetheless, it is hard to think of another presenter who could have made the original series the success it was on so many levels.

The Day the Universe Changed

Although James Burke had already made a large-scale history of science and technology series called Connections in 1978, it contained a rather different take on some of the same material. By focussing on interactive webs, the earlier series was somewhat glib, in that some of the connections could probably be replaced by equally valid alternatives.

In contrast, The Day the Universe Changed uses a more conventional approach that clearly shares some of the same perspectives as the earlier programmes. Like The Ascent of Man and the Cosmos remake, mediaeval Islamic science is praised for its inquisitiveness as well as the preservation of Classical knowledge. Burke was clearly influenced by his predecessors, even subtitling the series 'A Personal View by James Burke'. Perhaps inevitably he covers some of the same material too, although it would be difficult to create a brief history without reference to Newton or Ancient Greece.

As with Bronowski, Burke integrates scientific advances within wider society, a notable example being the rediscovery of perspective and its profound effect on contemporary art. He also supports the notion that rather than a gradual series of changes, paradigm shifts are fundamental to major scientific breakthroughs. In effect, he claims that new versions of the truth - as understood by a scientific consensus - may rely on abandonment of previous theories due to their irreconcilable differences. Having recently read Rachel Carson's 1951 The Sea Around Us I can offer some agreement: although Carson's geophysical analysis quietly screams in favour of plate tectonics, the contemporary lack of evidence led her to state the no doubt establishment mantra of the period concerning static land masses.

What Burke constantly emphasises even more than his predecessors is that time and place have a fundamental influence on the scientific enquiry of each period. Being immersed in the preconceived notions of their culture, scientists can find it as difficult as anyone else to gain an objective attitude. In actuality, it is all but impossible, leading to such farcical dead-ends as Piltdown Man, a hoax that lasted for decades because it fulfilled the jingoistic expectations of British scientists. Burke's definition of genius is someone who can escape the givens of their background and thus achieve mental insights that no amount of methodical plodding can equal. Well, perhaps, on occasion.

The series also goes further than its predecessors in defining religion as anti-scientific on two grounds: its demand for absolute obedience in the face of logic and evidence, with reference to Galileo; and the lack of interest in progress, as with the cyclical yet static Buddhist view, content for the universe to endlessly repeat itself. Burke also shows how scientific ideas can be perverted for political ends, as with social Darwinism. But then he goes on to note that as the world gets ever more complex, and changes at an ever faster rate, non-specialists are unable to test new theories to any degree and so are having to rely on authority just as much as before the Enlightenment. How ironic!

All in all, these common threads are to my mind among the most important elements of the three series:
  1. Science and the humanities rely on the same basic processes of the human brain and so are not all that different;
  2. Scientific thinking can be as creative an endeavour as the arts;
  3. Scientists don't live in a cultural vacuum but are part and parcel of their world and time;
  4. Religion is the most change-resistant of human activities and therefore rarely appears sympathetic to science's aims and goals.

As Carl Sagan put it, "we make our world significant by the courage of our questions and the depth of our answers." For me, these three series are significant for their appraisal of some of those courageous explorers who have given us the knowledge and tools we call science.


Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned. This was how rapidly Sandra Bullock's character acclimatised to the several space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the former have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and admittedly dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that there are some huge fundamental errors propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope, technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes its mass so that the polar diameter is less than the equatorial diameter. Earth is only about 0.3% flatter along its polar axis, but if you look at a photograph of Saturn you can see a very obvious squashing.

  2. Continental drift is the same thing as plate tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time whilst the latter is the mechanism now accepted to account for this, with large segments or plates of the Earth's crust moving over the hot, slowly flowing (but not liquid) mantle.

    Alfred Wegener, a German meteorologist and polar researcher, suggested the former in 1912 but it was largely pooh-poohed until the discovery of ocean floor spreading half a century later provided a mechanism. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately, chaos theory shows that even the minutest of initial changes can cause major differences of outcome, hence weather forecasting being far from an exact science.

    However, there is other evidence for the validity of the theory, fossil fuel lobbyists and religious fundamentalists aside. I haven't read anything to verify this, but off the top of my head I would suggest that an influx of glacial meltwater could disrupt the warm water that currently travels north-east across the Atlantic from the Gulf of Mexico (and prevents north-western Europe from having winters as cold as those of the Canadian eastern seaboard). If so, the Isles of Scilly off the Cornish coast may one day face as frosty a winter as the UK mainland!

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a hypothesis for the mechanism.

    This isn't to say that there weren't attempts to provide one, it's just that none of them fit the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of genetics research that led to the modern evolutionary synthesis that has natural selection at its core.

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The issue is due to the equivalence of gravity and acceleration, an idea which was worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts are - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further away from the Earth's surface than the International Space Station (ISS), but it is kept in orbit by the Earth's pull (okay, so there is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would cause it to fly off into deep space (see the sketch after this list).

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft A.K.A. 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there's also microgravity and weightlessness, but as it is almost Christmas we'll save that for another day.
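
    Since both the ISS and the Moon are simply falling around the Earth, Newton's gravity is enough to estimate their speeds and periods. Here is a minimal sketch in Python (using rounded textbook values, so treat the outputs as approximations) that reproduces the familiar ninety-odd-minute ISS orbit and the Moon's month-long one:

        import math

        G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
        M_EARTH = 5.972e24     # mass of the Earth, kg
        R_EARTH = 6.371e6      # mean radius of the Earth, m

        def circular_orbit(radius_m):
            """Speed and period for a circular orbit of the given radius (from Earth's centre)."""
            v = math.sqrt(G * M_EARTH / radius_m)        # orbital speed, m/s
            period = 2 * math.pi * radius_m / v          # orbital period, s
            return v, period

        v_iss, t_iss = circular_orbit(R_EARTH + 400e3)   # ISS: roughly 400 km up
        v_moon, t_moon = circular_orbit(384_400e3)       # Moon: ~384,400 km from Earth's centre

        print(f"ISS:  {v_iss / 1000:.1f} km/s, period {t_iss / 3600:.1f} hours")    # ~7.7 km/s, ~1.5 hours
        print(f"Moon: {v_moon / 1000:.2f} km/s, period {t_moon / 86400:.1f} days")  # ~1.02 km/s, ~27 days
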
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off appreciating that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Wednesday 10 September 2014

Mythbusting: bringing science into the arena

My elder daughter is a big fan of the Discovery Channel show Mythbusters, whose hosts have spent eleven years testing myths (and not a few Hollywood set pieces) via science, technology, engineering and frequent resort to high explosives. Therefore, as a birthday treat I recently took her to the live Behind the Myths tour, fronted by Mythbusters hosts Adam Savage and Jamie Hyneman. Considering how macho the series frequently is - its only female presenter, who has now left, is a vegetarian who was made to eat live bugs - it was interesting to see what science was presented live, and how.

In some respects it lived up to its reputation, with the hosts apologising for the lack of on-stage explosions but claiming their intention was to 'blow the mind' instead of, say, a pick-up truck or hot water cylinder. That's not to say that there weren't some fiery moments, including several montages of explosions and the infamous paintball machine gun aimed at someone wearing a suit of replica armour. Considering a large percentage of the audience consisted of pre-teens with their parents, the big bang elements were very much appreciated. But since the presenters have a special effects rather than science background, was there anything worthwhile beyond the showmanship?


Apart from a brief introduction to Newton's Second Law of Motion (force equals mass times acceleration, in case you weren't sure) there wasn't much of the classroom about the show - except that for two hours Hyneman and Savage managed to painlessly convey a lot of scientific ideas. Examples included:
  • Archimedes' quote about using a lever to move the world was demonstrated via a fairground high striker and different-sized mallets (see the sketch after this list);
  • Perception, thanks to a point of view camera and some comedic cheating;
  • Tessellation and human mechanics, with four interlocked reclining men able to support their own weight when their chairs were taken away;
  • Friction via a circus-like stunt, in which Savage was lifted high above the stage thanks to the strength of interwoven telephone directories.
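
To make the lever demonstration concrete, here is a minimal sketch in Python of the law of the lever behind Archimedes' boast - effort times effort arm equals load times load arm - with the truck mass and arm lengths invented purely for illustration:

    def effort_needed(load_kg, load_arm_m, effort_arm_m, g=9.81):
        """Force (in newtons) needed at the end of the effort arm to balance the load."""
        return load_kg * g * load_arm_m / effort_arm_m

    # A one-tonne pick-up truck sitting 0.5 m from the fulcrum, with the effort
    # applied 10 m away on the other side: about 490 N, roughly the weight of a
    # 50 kg person leaning on the lever.
    print(effort_needed(1000, 0.5, 10))
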
Although it might be quite easy to lose sight of the science behind all the razzmatazz, perhaps that was the point. These demonstrations reminded me of the Royal Institution's Christmas lectures, aimed primarily at 'young people' and barely a decade shy of being two hundred years old. Unlike the television series, which has sometimes revisited experiments - occasionally reversing the original results in the process - the Behind the Myths tour was more a solid grounding in basic physics, with a little chemistry and biology thrown in. If anything, the most obvious outcomes would be to promote curiosity by recognising that science is deeply embedded in everyday life, and that exploring reality can be enormous fun.

The first section of the show had Adam Savage demonstrate juggling whilst explaining how he taught himself the techniques. Since his recollection discussed patience, perseverance and learning from your mistakes, you could say he was presenting in microcosm key elements of the scientific enterprise, 'eureka' moments excepted.

I'm uncertain how many in the audience would cotton on to the science-by-the-backdoor aspect of the show. If anything, the children present may be more likely to want a career in movie special effects than in science, but the sense of wonder it generated may have also rubbed off on the adults present. Hyneman and Savage have become well-known enough in their support of STEM subjects and dislike of woolly thinking (take note, Discovery Channel, home of Finding Bigfoot) to have spoken at the 2006 annual convention of the US National Science Teachers Association, as well as presenting a demonstration to President Obama. That's no mean feat for a couple of special effects technicians with no formal science training. Let's hope that some of the audience sees beyond the whizz-bangs into the wonderful world that scientific exploration offers!

Monday 27 January 2014

An index of possibilities: defining science at a personal level

"If a little knowledge is dangerous, where is the man who has so much as to be out of danger?" - T.H. Huxley

With a sense of revitalisation following the start of a new year - and since misconceived notions of the scientific method are legion - I thought I should put my cards on the table and delineate my personal ideas of what I believe science to be.

I suppose you could say it's a self-learning exercise as much as anything. Most people consider science the least comprehensible of all disciplines: removed from everyday experience, accessible only to a select few (a.k.a. an intellectual elite), and lacking the creativity that drives so many other aspects of our lives. But hopefully the incredible popularity of British physicist Brian Cox and other photogenic scientist-cum-science-communicators is more than a passing fad and will help in the long term to break down this damaging myth. Science is both part and parcel of our existence and will only increase in importance as we try to resolve such vital issues as environmental degradation whilst still providing enough food and water for an ever-increasing population (fingers very much crossed on that one, folks!)

So here goes: my interpretation of the scientific method in ten bite-size, easy-to-swallow, chunks.
  1. A large amount of science is not difficult to comprehend
    Granted, theoretical high-energy physics is one of several areas of science difficult to describe meaningfully in a few, short sound bites. But amidst the more abstruse volumes aimed at a popular readership there are some gems that break down the concepts to a level that retains the essential details without resorting to advanced mathematics. Evolutionary biologist Stephen Jay Gould noted that the fear of incompetence put many intelligent enthusiasts off learning science as a leisure activity, but with the sheer size of the popular science sections in many bookstores - there are over 840,000 books in Amazon.com's science section - there is no longer an excuse for not dipping a toe. Leaving physics aside, there are plenty of areas of science that are easy to understand too, especially in the 'historical' disciplines such as palaeontology (more on that later).
  2. Science is not a collection of facts but a way of exploring reality
    This is still one of the most difficult things to convey. Bill Bryson's prize-winning best seller A Short History of Nearly Everything reminds me of the genre of boys' own bumper books of true facts that was still around when I was a child: Victorian-style progress with a capital 'P' and science just a compilation of theories and facts akin to, say, history. The reality is of course rather more complicated. The scientific method is a way of examining nature via testable questions that can be resolved to a high degree of certainty by simplified models, either by practical experiments (both repeatable and under 'laboratory conditions') - including, these days, computer simulations - or via mathematics.
  3. Science requires creativity, not just rigor
    The stereotype of scientists as rational, unemotional beings has been broken down over the past thirty years or so, but many non-scientists still have little idea of the creative thinking that can be involved in science, particularly in cutting-edge theorising. From Einstein's thought experiments such as what it would be like to ride alongside a beam of light to the development of string theory - which has little likelihood of experimental evidence in the near future - scientists need to utilise creative thought at least as much as data collation and hard mathematics.
  4. Scientists are only human
    Scientists are far from immune to conditioned paths of thought ingrained via their social and cultural background. Therefore, rather than all scientists being equally adept at developing particular hypotheses, they are subject to the same whims and sense of normality as everyone else. In addition, individual idiosyncrasies can hinder their career. I've discussed previously how Einstein (who famously said his contempt of authority was punished by him becoming an authority himself) refused to accept some of the aspects of quantum theory long after his contemporaries had.
    Scientists could be said then to follow the stereotype visible elsewhere, namely that young radicals frequently evolve into old conservatives.
  5. If there's no proof, is it still science?
    Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once said that the 'deepest sin against the human mind is to believe things without evidence'. Yet scientific hypotheses are sometimes formed prior to any support from nature or real-world experimentation. Although Charles Darwin had plenty of evidence from artificial selection when he wrote On the Origin of Species, the fossil record at the time was extremely patchy and he had no knowledge of Mendelian inheritance. In addition, the most prominent physicists of his day were unaware of nuclear fusion and so their theories of how stars shone implied a solar system far too young for natural selection to be the primary mechanism of evolution. By sticking to his ideas in spite of these issues, did this make Darwin a poor scientist? Or is it feasible that many key advances require a leap of faith - a term unlikely to please Richard Dawkins - due to lack of solid, physical evidence?
  6. Are there two schools of science?
    New Zealand physicist Ernest Rutherford once disparagingly remarked something along the lines of physics being the only real science, and that other so-called scientific disciplines are just stamp collecting. I prefer to think of science as being composed of historical and non-historical disciplines, only occasionally overlapping. For instance, cutting-edge technological applications of physics require repeatable and falsifiable experiments, hence the deemed failure of cold fusion. The likes of meteorology, evolutionary biology and palaeontology, by contrast, are composed of innumerable historical events and/or subject to the complexities of chaos theory; as such, they are unlikely to provide duplicate circumstances for testing, or even to be reducible to simplified models that can be accurately tested.
  7. An accepted theory is not necessarily final
    A theory doesn't have to be the absolute end of a quest. For example, Newton's law of universal gravitation had to wait over two centuries for Einstein's general theory of relativity to explain the mechanism behind the phenomenon. Although quantum mechanics is the most accurate theory ever developed (in terms of the match between theory and experimental results), the underlying cause is yet to be understood, with wildly varying interpretations offered instead. The obvious problem is that a hypothesis may fit the facts, but without an explanatory mechanism scientists may reject it as untenable. A well-known instance of this scientific conservatism (albeit for good reasons) was Alfred Wegener's hypothesis of continental drift, which only achieved orthodoxy decades later once plate tectonics was discovered.
  8. Scientific advance rarely proceeds by eureka moments
    Science is a collaborative effort. Few scientists work in a vacuum (except astronauts, of course!). Even the greatest of 'solo' theories, such as universal gravitation, was on the cards during Newton's lifetime, with contemporaries such as Edmond Halley working along similar lines. Unfortunately, our predilection for simple stories with identifiable heroes means that team leaders and thesis supervisors often receive the credit when many researchers have worked towards a goal. In addition, the priority rule is based on first publication, not on when a scientist formulated the idea, so many theories are named after scientists who were not the earliest discoverer or formulator. The work of unsung researchers is frequently neglected in favour of this simplified approach, which glorifies one pioneer at the expense of many others.
  9. Science is restricted by the necessity of using language to describe it
    Richard Dawkins has often railed against Plato's idealism (a.k.a. essentialism), using the phrase 'the tyranny of the discontinuous mind'. I recall a prime example of this as a child, whilst contemplating a plastic model kit I had of a Neanderthal. I wondered how the human race had evolved: specifically, how could parents of a predecessor hominid species give birth to a modern human, i.e. a child of a different species? Of course, such discontinuity is nonsense, but it is surprising how frequently our minds interpret the world in terms of neat boundaries. A large part of the problem is how to treat transitional states as the norm when our language is bound up with discrete categories. In addition, we rely on metaphor and analogy to describe aspects of the universe that do not conform to everyday experience, the nature of quantum probability being an obvious example. As with the previous point on our innate need for heroes, we are always constructing narratives, thus restricting our ability to understand nature at a fundamental level.
  10. Science does not include a moral dimension
    Science, like nature, is neither moral nor immoral and cannot provide a framework for human behaviour. Of course, this doesn't prevent scientists from being greedy or stupid, or even just naïve: witness British evolutionary biologist J.B.S. Haldane, who recommended the use of poison gas as a war weapon on the grounds that it was more humane than conventional weapons (in terms of the ratio of deaths to temporary incapacitation). This suggests that non-scientists should be involved in the decision-making process for the funding of some science projects, especially those with clear applications in mind. But in order for this to be tenable, the public needs to be considerably more scientifically literate than at present. Otherwise the appalling scare-mongering engendered by the likes of the British tabloid press - think genetically modified crops labelled as 'Frankenstein foods' - will only make matters far worse. GM crops themselves are a perfect example of why the Hollywood approach of clear-cut heroes and villains fails with most of science. Reality is rarely black or white, but requires careful analysis of myriad shades of grey.
In conclusion, it might be said that there are as many variants of science as there are human beings. Unlike in many other disciplines, admitting mistakes and ignorance is a clear strength: as Darwin stated in The Descent of Man, 'Ignorance more frequently begets confidence than does knowledge.' Above all, there are aspects of science that are part and parcel of our everyday experience and as such, we shouldn't consider it as something to save only for special occasions.

Wednesday 20 November 2013

Newton and Einstein: fundamental problems at the heart of science

As previously discussed, Arthur C. Clarke's First Law is as follows: "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong." Now there have been many examples of prominent scientists who have been proved wrong but don't want to lose their pet idea - think astronomer Fred Hoyle and the Steady State Theory - or bizarrely negated their own hypothesis, such as natural selection's co-discoverer Alfred Russel Wallace and his supernatural explanation of the human mind.

But although with hindsight we can easily mock pioneers who failed to capitalise on a theory that later proved canonical (assuming any theory except the second law of thermodynamics can ever be said to be the final word on the matter), there are some scientists who followed profoundly unorthodox paths of thought. In fact, I would go so far as to say that certain famous figures would find it almost impossible to maintain positions in major research institutes today. This might not matter if they were run-of-the-mill scientists, but I'm talking about two of the key notables of the discipline: Sir Isaac Newton and Albert Einstein.

The public perception of scientists has changed markedly over the past half century, from rational authority figures, via power-mad destroyers, to the uncertainties of today, when the often farcical arguments surrounding climate change have further undermined faith in scientific 'truth'. But the recognition of Newton and Einstein's achievements has never wavered, making them unassailable figures in the history of science. Indeed, if there were ever to be two undisputed champions of physics, or even of all science - as chosen by contemporary scientists, let alone the public - this contrasting pair would likely be among the most popular. Yet underneath their profound curiosity and dogged search for truth there are fundamental elements to their personal research that make the offbeat ideas of Wallace, Hoyle & co. appear only mildly idiosyncratic.

1) Sir Isaac Newton
While some historians have tried to pass off Newton's non-scientific work as typical of his age, his writings on alchemy, eschatology and the occult in general are at least as voluminous as those on physics. Some of the more recent examinations of his work have suggested that without these pseudo-scientific studies, Newton would not have gained the mindset required to generate the scientific corpus for which he is renowned. Although he claimed to have no need for hypotheses or 'occult qualities', preferring to examine natural phenomena in order to gain understanding, many of Newton's surviving notes suggest the very opposite. Whether he was using numerology to research the date of the end of the world, or alchemy to search for the Philosopher's Stone, the real Newton was clearly a many-faceted man. This led economist (and owner of some of Newton's papers) John Maynard Keynes to label him "the last of the magicians". Indeed, key aspects of Newton's personality appear entirely in tune with pseudo-science.

It is well known that Newton was a secretive man, given to hiding his discoveries for decades and unwilling to share his theories. This was partly due to his wish to avoid wasting time on the less intelligent (i.e. just about everybody else) and partly due to his fear of plagiarism, having frequently experienced conflicts with contemporary natural philosophers. To some extent this unwillingness to publish only exacerbated the issue, such as when Leibniz published his version of calculus some years after Newton had completed his unpublicised 'fluxions'.

Today, establishing scientific priority relies upon prompt publication, but Newton's modus operandi was much closer to the technique of the alchemist. Far from being a systematic forerunner of chemistry, alchemy was a subjective discipline, couched in metaphor and the lost wisdom of 'ancient' sages (many of whom, after Newton's time, were discovered to be early medieval or Ptolemaic Egyptian frauds). The purity of the practitioner was deemed fundamental to success, and various pseudoscientific 'influences' could prevent the repeatability of results.

In addition, such knowledge as could be discovered was only to be shared between a few chosen adepts, not disseminated to a wide audience for further examination and discussion. In personality then, Newton was far more like the pre-Enlightenment alchemist than many of his contemporaries. He believed in a sense of his own destiny: that he had been chosen by God to undertake the sacred duty of decoding now-hidden patterns in the universe and history. When Descartes postulated a 'clockwork universe', Newton opposed it on the grounds that it had no place for a constantly intervening deity. And surprising as it may seem, in that respect he had a lot in common with Einstein.

2) Albert Einstein
Einstein was in many ways a much more down-to-earth (and fully rounded) human being than Newton. Whereas the latter frequently neglected such basic human activities as food and sleep, Einstein indulged in pipe tobacco and playing the violin (shades of Sherlock Holmes, indeed!). However, he was just as determined a thinker when it came to solving fundamental riddles of nature. A good anecdote, possibly true, tells of how, whilst searching for a makeshift tool to straighten a bent paperclip, Einstein came across a box of new paperclips. Rather than simply use one of the new ones, he shaped it into the tool required to fix the original paperclip. When questioned, he replied that once he had started a task it was difficult for him to curtail it.

But one of the oft-quoted phrases surrounding him is that Einstein would have been better off spending his last two or three decades fishing, rather than pursuing a unified field theory. The reason for this is that despite being a pioneer in the quantum theory of light, he could not accept some of the concepts of quantum mechanics, in particular that it was a fundamental theory based on probability rather than simply a starting point for some underlying aspect of nature as yet unknown.

Even today there are only interpretations of quantum mechanics, not a complete explanation of what is occurring. However, Einstein considered these interpretations more akin to philosophy than science, and believed that following, for example, the Copenhagen interpretation prevented deeper thought into the true reality. Unfortunately for Einstein, the majority of physicists climbed aboard the quantum mechanics bandwagon, leaving him and a few colleagues to try to find holes in such strange predictions as entanglement, dismissed by Einstein with the unflattering term "spooky action at a distance".

Although it was only some decades after his death that such phenomena were experimentally confirmed, Einstein insisted that the counter-intuitive aspects of quantum mechanics merely showed the theory's incompleteness. So what lay at the heart of his fundamental objections? After all, his creative brilliance had shown itself in his discovery of the mechanism behind Newtonian gravitation - no mean feat, given how bizarre the resulting theory is. But his glorious originality came at a price: as with many other scientists and natural philosophers, from Johannes Kepler via Newton to James Clerk Maxwell, Einstein sought answers that were aesthetically pleasing. In effect, the desire for truth was driven by a search for beautiful patterns. Like Newton, he wanted to understand the mind of God, however different the two men's concepts of a deity were (in Einstein's case, seeking the secrets of the 'old one').

By believing that at the heart of reality there is a beautiful truth, did Einstein hamper his ability to come to terms with such ugly and unsatisfying concepts as the statistical nature of the sub-atomic world? In this respect he seems old-fashioned, even quaint, by the exacting standards required - at least in theory - in contemporary research institutes. Critical thinking unhampered by aesthetic considerations has long been shown to be a myth when it comes to scientific insights, but did Einstein take aesthetics too far in his inability to accept the most important physics developed during the second half of his life? In some respects, his work after the mid-1920s seems as anachronistic as Newton's pseudo-scientific interests.

Even from these minimal sketches, it is difficult to believe that Newton would ever have gained an important academic post were he alive today, whilst Einstein, certainly in the latter half of his life, would probably have been relegated to a minor research laboratory at best. So although they may be giants in the scientific pantheon, it is an irony that neither would have gained such acceptance by the establishment had they been alive today. If there's a moral to be drawn here, presumably it is that even great scientists are just as much a product of their time as any other human being, even if they occasionally see further than us intellectual dwarves.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism.  In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation is usually correct. 

    Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. His work is frequently mythologised (I side with the rolling-weights-down-inclined-planes brigade rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa one), but Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.
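
As an aside, Eratosthenes' reasoning is simple enough to reconstruct in a few lines of Python; the figures below are the commonly quoted ones and the length of the stadion is uncertain, so treat the result as a rough historical sketch rather than a precise value:

```python
# Eratosthenes' method, reconstructed with commonly quoted figures.
shadow_angle_deg = 7.2     # Sun's angle from the vertical at Alexandria at noon,
                           # while it shone straight down a well at Syene
distance_stadia = 5000     # reported Alexandria-to-Syene distance
stadion_km = 0.185         # one assumed value; the stadion's true length is debated

# The shadow angle is the fraction of a full circle separating the two cities,
# so scaling the distance up by 360/7.2 gives the whole circumference.
circumference_stadia = distance_stadia * 360 / shadow_angle_deg
print(f"{circumference_stadia:,.0f} stadia")            # 250,000 stadia
print(f"~{circumference_stadia * stadion_km:,.0f} km")  # versus a modern ~40,000 km
```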

    Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely aimed at understanding how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever seen miniature marvels such as bacteria before. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

    Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who started the cataloguing methodology in use today. New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definitions there is no bedrock for more sophisticated research.

    The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that largely untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose ranks Charles Darwin very nearly joined) who worked on the quantity-over-quality principle, collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to utilise the results in practical applications. Although in the BBC television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew contains 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration.  In addition to the biota yet to be described in scientific records, the existing catalogues are in the process of major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, having so far led to the finding of 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of natural elements, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore why phenomena occur and events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be explained as holding a magnifying glass up to nature, reducing a phenomenon to an approximation, and explaining how that approximation works?

    For example, Newton took Galileo's and Kepler's astronomical work and ran with it, producing his Law of Universal Gravitation. The 'how' in this case is the inverse-square law, with its gravitational constant, which describes how bodies orbit their common centre of mass. However, Newton was unable to delineate what caused the force to act across infinite, empty space, an explanation that had to wait for stage three.
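
Since the law itself amounts to a single formula, a minimal Python sketch - using approximate textbook values for the constant, the two masses and the Earth-Moon distance - shows the 'how' that Newton supplied while saying nothing at all about the 'why':

```python
import math

# Approximate textbook values; good enough for an illustration.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
M_MOON = 7.35e22     # mass of the Moon, kg
R = 3.844e8          # mean Earth-Moon distance, m

# The inverse-square law: the 'how', with no hint of the 'why'.
force = G * M_EARTH * M_MOON / R**2

# For a roughly circular orbit, centripetal acceleration v^2/R equals G*M/R^2,
# so the orbital speed and period follow directly from the same law.
speed = math.sqrt(G * M_EARTH / R)
period_days = 2 * math.pi * R / speed / 86400

print(f"Earth-Moon force: {force:.2e} N")        # about 2e20 N
print(f"Orbital period: {period_days:.1f} days") # about 27 days, close to the sidereal month
```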

    In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest a feedback cycle in which knowing which questions to ask is at least as important as gaining answers; the adage in this case being that good experiments generate new questions. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

    As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Science is therefore a long way from recognising all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.

    Microelectronics in general and computers in particular have allowed experiments in such fields as quantum teleportation, considered close to impossible by the finest minds only half a century ago. However, there are several reasons why computer processing power is approaching a theoretical maximum using current manufacturing techniques and materials, so the near future may see a slowing down of the sort of leading-edge experimental science achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

    This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. Physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

    If that is the case, just where can you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result of this is that the boundaries between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

    Whereas scientists engaged in phase two investigations seek ever more accurate approximations for phenomena, phase three includes the search for why one theory is thought to be correct over another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's General Relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality due to their incompatibility with quantum mechanics. Herein lies the rub!

    One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life, and more than that, intelligent life, to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists, to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes each with different fundamental constants (and therefore including a lucky few capable of producing life). But wouldn't Occam's Razor suggest the former is more likely than the latter? As Astronomer Royal Sir Martin Rees states, this is veering into metaphysical territory, which, except for scientists with religious convictions, is usually an area avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way towards creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

    I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark - and in the first days of their formulation quarks were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Friday 15 March 2013

Preaching to the unconverted: or how to convey science to the devout

It's said that charity begins at home. Likewise, a recent conversation I had with a pious Mormon started me thinking: just how do you promote science, both the method and the uncomfortable facts, to someone who has been raised to mistrust the discipline? Of course, there is a (hopefully) very small segment of the human race that will continue to ignore the evidence even after it is presented right in front of them, but stopping to consider those on the front line - such as biology teachers and ‘outed' atheists in the U.S. Bible Belt - how do you present a well-reasoned set of arguments to promote the theory and practice of science? 

It's relatively easy for the likes of Richard Dawkins to argue his case when he has large audiences of professionals or sympathetic listeners, but what is the best approach when endorsing science to a Biblical literalist on a one-to-one basis? The example above involved explaining just how we know the age of the Earth. This not being the first time I've been asked, I was fully prepared to hold forth on the likes of uranium-series dating, and not having to mention the 'D' words (Darwin or Dawkins) made this a relatively easy task. To aid any fans of science who might find themselves in a similar position, I've put together a small toolkit of ideas, even if the conversation veers into that most controversial of subjects, the evolution of the human race:
  1. A possible starting point is to be diffident, explaining the limitations of science and dispelling the notion that it is the catalogue of sundry facts it is sometimes described as (for example, in Bill Bryson's A Short History of Nearly Everything). It is difficult but nonetheless profitable to explain the concept that once-accepted elements of scientific knowledge can ostensibly be surpassed by later theories, only to maintain usefulness on a special-case basis. A good illustration of this is Newton's Law of Universal Gravitation, which describes the force of gravity but not what creates it. Einstein's General Theory of Relativity provides a solution, but Newton's Law is much easier to use, being accurate enough even to guide spacecraft. And since General Relativity cannot be combined with quantum mechanics, there is probably another theory waiting to be discovered…somewhere. As British astrophysicist and populariser John Gribbin has often pointed out, elements at the cutting edge of physics are sometimes only describable via metaphor, there being nothing within human experience that can be used as a comparison. Indeed, no-one has ever observed a quark, and in the early days of the theory some deemed it just a convenient mathematical model. As for string theory, it's as bizarre as many a creation myth (although you might not want to admit that bit).
  2. Sometimes (as can be seen with Newton and gravity) the 'what' is known whilst the 'why' isn't. Even so, scientists can use the partial theories to extrapolate potential 'truths' or even exploit them via technology. Semi-conductors require quantum mechanics, a theory that no-one really understands. Indeed, no less a figure than Einstein refused to accept many of its implications.  There are many competing interpretations, some clearly more absurd than others, but that doesn't stop it being the most successful scientific theory ever, in terms of the correspondence between the equations and experimental data. So despite the uncertainty - or should that be Uncertainty (that's a pun, for the quantum mechanically-minded) - the theory is a cornerstone of modern physics.
  3. As far as I know, the stereotype of scientists as wild-haired, lab-coated, dispassionate and unemotional beings may stem from the Cold War, when the development of the first civilisation-destroying weapons led many to point their fingers at the inventors rather than their political paymasters. Yet scientists can be as creative as artists. Einstein conducted thought experiments, often aiming for a child-like simplicity, in order to obtain results. The idea that logic alone makes a good scientist is clearly bunkum. Hunches and aesthetics can prove as pivotal as experimental data or equations.
  4. Leading on from this, scientists are just as fallible as the rest of us. Famous examples range from Fred Hoyle's belief in the Steady State theory (and, strangely, that the original Archaeopteryx fossils are fakes) through to the British scientific establishment's forty-year failure to recognise that the Piltdown Man finds were crude fakes. However, it isn't always as straightforward as these examples: Einstein's self-described greatest blunder - the cosmological constant - was abandoned after the expansion of the universe was discovered, only to reappear in recent years in connection with dark energy. And of course mistakes can prove more useful than finding the correct answer the first time!
  5. There are numerous examples of deeply religious scientists, from Kepler and Newton via Gregor Mendel, the founder of genetics, to the contemporary British particle physicist the Reverend John Polkinghorne. Unlike the good versus evil dichotomy promoted by Hollywood movies, it's rarely a case of us versus them.
  6. Although there are searches for final theories such as the Grand Unified Theory of fundamental forces, one of the current aspects of science that differs profoundly from the attitudes of a century or so ago is that there is the possibility of never finding a final set of solutions. Indeed, a good experiment should generate as many new questions as it answers.
  7. If you feel that you're doing well, you could explain how easy it is to be fooled by non-existent patterns and that our brains aren't really geared up for pure logic. It's quite easy to apparently alter statistics using left- or right-skewed graphs, or by using a logarithmic scale on one axis. In addition, we recognise correlations that just aren't there but which we would like to think are true. In the case of my Mormon colleague, he was entrenched in the notion of UFOs as alien spacecraft! At this point you could even conduct an experiment: make two drawings, one of a constellation and one of evenly-spaced dots, and ask them to identify which one is random (a quick way to generate both pictures is sketched below the image). Chances are they will pick the latter. After all, every culture has seen pictures in the random placements of stars in the night sky (or the face of Jesus in a piece of toast).
[Image: constellation vs random dots - Ursa Major (see what you like) vs evenly-spaced dots]
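
For anyone wanting to try the dot experiment on a computer rather than on paper, here is a minimal Python sketch (it assumes matplotlib is available; the seed and jitter values are arbitrary). Genuinely random points clump and leave gaps, whereas the jittered grid is the one most people pick as 'random':

```python
import random
import matplotlib.pyplot as plt

random.seed(1)  # arbitrary seed so the picture is reproducible

# Genuinely random placements: uniform coordinates, which tend to clump.
random_pts = [(random.random(), random.random()) for _ in range(100)]

# 'Non-random' placements: a regular 10x10 grid with a little jitter,
# which most people intuitively judge to be the random picture.
grid_pts = [((i + 0.5) / 10 + random.uniform(-0.02, 0.02),
             (j + 0.5) / 10 + random.uniform(-0.02, 0.02))
            for i in range(10) for j in range(10)]

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, pts, title in zip(axes, [random_pts, grid_pts],
                          ["Uniform random", "Evenly spaced (jittered)"]):
    xs, ys = zip(*pts)
    ax.scatter(xs, ys, s=10)
    ax.set_title(title)
    ax.set_xticks([])
    ax.set_yticks([])
plt.show()
```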

So to sum up:
  1. There's a fuzzy line at the cutting edge of physics and no-one understands what most of it means;
  2. We've barely started answering fundamental questions, and there are probably countless more we don't even know to ask yet;
  3. Science doesn't seek to provide comforting truths, only gain objective knowledge, but...
  4. ...due to the way our brains function we can never remove all subjectivity from the method;
  5. No one theory is the last word on a subject;
  6. Prominent scientists easily make mistakes;
  7. And most of all, science is a method for finding out about reality, not a collection of carved-in-stone facts.
So go out there and proselytise. I mean evangelise. Err...spread the word. Pass on the message. You get the picture: good luck!

Monday 1 August 2011

Weather with you: thundersnow, hosepipe bans and climate punditry

I must confess to not having watched any of the current BBC series The Great British Weather, since (a) it looks rubbish; and (b) I spend enough time comparing the short-range forecast with the view outside my window as it is, in order to judge whether it will be a suitable night for astronomy. Since buying a telescope at the start of the year (see an earlier astronomy-related post for more details) I've become just a little bit obsessed, but then as an Englishman it's my inalienable right to fixate on the ever-changeable meteorology of these isles. If I think there is a chance of a cloud-free night I tend to check the forecast every few hours, which for the past two months or so has proved almost uniformly disappointing; as a matter of fact, the telescope has remained boxed up since early May.

There appears to be a grim pleasure for UK-based weather watchers in the fact that when a meteorology source states it is currently sunny and dry in your location, it may in fact be raining torrentially. We all realise forecasting relies on understanding a complex series of variables, but if they can't even get the 'nowcast' correct, what chance do the rest of us have?

So just how have the UK's mercurial weather patterns affected the science of meteorology and our attitude towards weather and climate? As far back as 1553 the English mathematician and inventor Leonard Digges included weather lore and descriptions of phenomena in his A General Prognostication. Since then, British scientists have been in the vanguard of meteorology. Isaac Newton's contemporary and rival Robert Hooke may have been the earliest scientist to keep meteorological records, as well as inventing several associated instruments. Vice-Admiral Robert FitzRoy, formerly captain of HMS Beagle (i.e. Darwin's ship), was appointed the first Meteorological Statist to the Board of Trade in 1854, which in today's terms would make him the head of the Met Office; he is even reputed to have coined the term 'forecast'.

Modern science aside, as children we pick up a few snippets of the ancient folk learning once used to inculcate elementary weather knowledge. We all know a variation of "Red sky at night, shepherd's delight; red sky in the morning, shepherd's warning", the mere tip of the iceberg when it comes to pre-scientific observation and forecasting. But to me it looks as if all of us in ever-changeable Britain have enough vested interest in the weather (once it was for crop-growing, now just for whether it is a sunglasses or umbrella day – or both) to maintain our own personal weather database in our heads. Yet aren't our memories and lifespans just too short to allow us a genuine understanding of meteorological patterns?

One trend that I suspect is real is that those 'little April showers' I recall from childhood (if you remember the song from 'Bambi') are now a thing of the past, with April receiving less rainfall than June. This is purely a gut feeling: I have not researched whether there has been a genuine change over the past three decades. Unfortunately, a combination of poor memory and spurious pattern recognition means we tend to over-emphasise 'freak' events - from thundersnow to the day it poured down at so-and-so's June wedding - at the expense of genuine trends.

For example, my rose-tinted childhood memories of six largely rain-free weeks each summer school break centre around the 1976 drought, when my brother had to be rescued from the evil-smelling mud of a much-reduced reservoir and lost his shoes in the process. I also recall the August 1990 heat wave as I was at the time living less than 20 km from Nailstone in Leicestershire, home of the then record UK temperature of 37.1°C. In contrast, I slept through the Great Storm of 1987 with its 200+km/h winds and don’t recall the event at all! As for 2011, if I kept a diary it would probably go down as the 'Year I Didn't Stop Sneezing'. City pollution and strong continental winds have combined to fill the London air with pollen since late March, no doubt much to the delight of antihistamine manufacturers.

[Image: a Norfolk beach in a 21st century summer - an East Anglian beach, August 2008]


Our popular media frequently run stories about the latest report on climate change, either supporting or opposing certain hypotheses, but rarely compare it to earlier reports or long-term records. Yet even a modicum of research shows that in the Nineteenth Century Britain experienced a large variation in weather patterns. For example, the painter J.M.W. Turner's glorious palette was not all artistic licence, but almost certainly influenced by the volcanic dust-augmented sunsets following the 1815 Tambora eruption. It wasn't just painting that was affected either, as the UK suffered poor harvests the following year whilst in the eastern United States 1816 was known as 'Eighteen Hundred and Froze to Death'.

The influence of the subjective on the objective doesn't sound any different from most other human endeavours, except that weather professionals - meteorologists, climatologists and the like - have to work around the biases in their own methods. Ensemble forecasting, in which the same model is run many times from slightly different initial conditions and the results are combined into an average outcome, has been shown to be a more accurate method of prediction. In other words, it sounds like a form of scientific bet hedging!
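
By way of illustration, here is a toy Python sketch of the idea; it is nothing like a real atmospheric model (the logistic map stands in for chaotic weather and the perturbation size is arbitrary), but it shows why averaging over many slightly perturbed runs is a sensible hedge:

```python
import random

def logistic_step(x, r=3.9):
    """One step of the logistic map - a toy stand-in for a chaotic forecast model."""
    return r * x * (1 - x)

def forecast(x0, steps=30):
    x = x0
    for _ in range(steps):
        x = logistic_step(x)
    return x

random.seed(42)       # arbitrary
best_guess = 0.500    # our 'measured' initial condition

# A single deterministic forecast from the best-guess starting point...
single = forecast(best_guess)

# ...versus an ensemble of runs from slightly perturbed starting points.
members = [forecast(best_guess + random.uniform(-0.001, 0.001)) for _ in range(50)]
ensemble_mean = sum(members) / len(members)
spread = max(members) - min(members)

print(f"single run:    {single:.3f}")
print(f"ensemble mean: {ensemble_mean:.3f} (spread {spread:.3f})")
# A large spread is a warning that the single 'confident' forecast means little.
```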

Recent reports have shown that once-promising hypotheses involving single factors such as sunspot cycles cannot account for the primary causes of climate change, either now or in earlier epochs. It seems the simple answers we yearn for are the prerogative of Hollywood narrative, not geophysical reality. One bias that can seriously skew data is the period being used in a report. It sounds elementary, but we are rarely told that even the difference of a single year in the start date can significantly affect the outcome as to whether, for example, temperature is increasing over time. Of course, scientists may deliberately only publish results for periods that support their hypotheses (hardly a unique trait, if you read Ben Goldacre). When this is combined with sometimes counter-intuitive predictions – such as a gradual increase in global mean temperature leading to cooler European winters – is it any wonder we non-professionals are left to build our level of belief in climate change from a muddle of personal experience, confusion and folk tales? The use of glib phrases such as 'we're due another glaciation right about now' doesn't really help either. I'm deeply interested in the subject of climate change and I think there is serious cause for concern, but the data is open to numerous interpretations.
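
To illustrate the start-date problem, the following Python sketch fits a straight-line trend to a purely synthetic 'temperature' series - a weak warming signal buried in much larger random noise, with every number invented for the demonstration. Simply moving the first year of the fit changes the apparent trend considerably:

```python
import random

random.seed(0)  # arbitrary; the data below are entirely synthetic

# A fake 'temperature anomaly' series: a weak 0.01 degC/year warming trend
# buried in much larger year-to-year noise.
years = list(range(1970, 2011))
temps = [0.01 * (y - 1970) + random.gauss(0, 0.2) for y in years]

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# The same data, fitted from different start years: watch the 'trend' wander.
for start in (1970, 1975, 1980, 1985, 1990):
    i = years.index(start)
    print(start, f"{slope(years[i:], temps[i:]) * 10:+.3f} degC per decade")
```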

So what are we left with? (Help: I think I'm turning into Jerry Springer!) For one thing, the term 'since records began' can be about as much use as a chocolate teapot. Each year we get more data (obviously) and so each year the baseline changes. Meteorology and climatology are innately complex anyway, but so far both scientists and our media have comprehensively failed to explain to the public just how little is known and how even very short-term trends are open to abrupt change (as with the notorious 'don't worry' forecast the night of the 1987 Great Storm). But then you have only to look out of the window and compare it to the Met Office website to see we have a very long way to go indeed…