Sunday 24 January 2010

The British boffin: an extinct species?

According to the Oxford English Dictionary, a 'boffin' is a person engaged in scientific research, frequently of a military nature. For the minority of Britons who still recognise the expression it often conjures up a time and a place: Britain during the third quarter of the 20th century. Despite the Cold War, that era seems to have possessed a profound interconnection between societal and technological progress, a far cry from the frequent mistrust of science apparent today. From the Second World War until the 1970s these 'back-room wizards' were a familiar element of British society, sporting slide rules, briar pipes (women don't get much of a look-in for this genre), and a fondness for acronyms. Although the period saw great improvements in many aspects of applied science, from medicine to agriculture, it is largely aeronautical and astronautical projects that seem synonymous with the age of the boffin. Another curiosity is that, despite the military leanings of many boffin-run projects, the breed does not seem to have been any more martial than other types of scientist or engineer.

One of the last gasps of boffinicity was Project Mustard, a prototypical example of scientific and technical genius combined with political and economic naivety. In the mid-1960s the Ministry of Aviation gave the British Aircraft Corporation financial support for the design of the Multi Unit Space Transport And Recovery Device (or MUSTARD), a reusable spaceplane that anticipated the Space Shuttle. Although the intention was to make manned spaceflight much cheaper than via expendable rockets, it seems incredible that Britain could seriously consider such a project without American support. As it was, Project Mustard got little further than the drawing board and several patents filed in 1967.

The project existed at the tail end of several decades in which many aspects of science and technology had become increasingly integrated into popular culture. British films of the 1940s and '50s fictionalised real-life boffins such as Spitfire designer R.J. Mitchell (in The First of the Few) and the bouncing bomb inventor Barnes Wallis (of Dam Busters fame), whilst furniture and fabrics utilised designs based on molecular biology and the atom. Due to American isolationism Britain managed to develop, almost independently, nuclear power stations and atomic bombs, along with the first commercial jet airliner (the de Havilland Comet), practical hovercraft and VTOL (Vertical Take-Off and Landing) technology, the latter being a rare post-war reversal whereby the USA bought from Britain. All this was achieved in spite of Britain being the world's largest debtor and the sudden termination of Lend-Lease in 1945; perhaps the threat of a Soviet invasion aided productivity, but the level of British 'firsts' from the period is truly astonishing.

Unfortunately, beneath the surface there was an awful lot of hype. As early as the 1951 Festival of Britain the British economy was jokingly compared to that festival's Skylon structure, in that neither possessed a visible means of support. Throughout the 1950s and '60s financial shortfalls meant that research and development projects (and recalling the OED definition, in the Cold War that frequently meant military projects) were often obsolete prior to completion. Amongst the victims of financial problems, the rapidity of technological progress, political prevarication, and even pressure from the USA (perish the thought), were the Blue Streak ballistic missile and its successors, mixed-powerplant interceptors, and TSR-2, a strike aircraft that was impressive even by today's standards. The most farcical moment of all came in 1957 when Defence Minister Duncan Sandys published a white paper declaring that the future of aerial warfare lay solely in guided missiles. The Dr Beeching-style cuts that followed led to the amalgamation or disappearance of most British aerospace companies and, you would have thought, the end of any pretension of Britain competing with the superpowers.

But the boffins weren't beaten yet. Whether it was too much Boy's Own science fiction (from radio's Journey into Space to comic hero Dan Dare) or even a desire to replace the rapidly disintegrating Empire with the conquest of outer space, private and public sector funding repeatedly initiated space-orientated projects that stood little chance of coming to fruition. Alongside joint ventures with the forerunners of ESA (the European Space Agency), the wholly British Black Arrow rocket was used in 1971 for the only all-British satellite launch, Prospero X-3. Unfortunately this occurred three months after the project was cancelled, the irony being that the British technology involved proved more reliable than its French and German counterparts. Since then, British funding of joint space ventures has been desultory to say the least, amounting to only about half of what France or Germany give to ESA.

All in all, it could be said that the day of the boffin is over. A turning point may be found in the mid-1970s, when environmental concerns over Concorde combined with its recognition as an economic catastrophe. The high-technology failures represented in the disaster movies of the time are the antithesis of the glorification of machinery displayed in Thunderbirds less than a decade earlier. The seemingly Victorian notion that bigger, faster (and louder) equates to progress had been replaced by an understated, almost apologetic air surrounding research and development, even for projects of a primarily civilian nature. Not that this change of attitude initially had much effect on the military: more than half of Government R&D expenditure in the 1980s went to the Ministry of Defence, including the infamous (and cancelled) spy satellite, Project Zircon.

Two more examples from the eighties prove that any space-orientated scheme would now have to undergo prompt and rigorous economic assessment. British Aerospace's Spacelab experiment pallets for ESA were extremely successful but, let's face it, this was a relatively dull project by any standard. The opposite extreme was another acronym-laden project: HOTOL, the Horizontal Take-Off and Landing pilotless spaceplane, which received Government funding in the mid-eighties. Unfortunately, a potential development schedule of two decades or more, combined with an estimated total cost of around £5 billion and a lack of MoD interest (the revolutionary engine design being classified), led to the withdrawal of official involvement after several years.

All of the above suggests that twentieth-century Britain had a tradition of wasting vast amounts of time, energy, and occasionally public money, on paper-only projects ranging from blue-sky thinking to the genuinely hare-brained. Yet some schemes show more than an element of genius. In the 1930s, members of the British Interplanetary Society designed a manned lunar lander mission that foreshadowed many elements of Project Apollo to an astonishing degree. Whereas teams in Germany and the USA were developing liquid-fuelled rockets at the time, British law prohibited rocket-building by private citizens. Perhaps this aided the notion that projects on the drawing board were as valuable as those involving nuts and bolts; thus the image of the boffin as slightly detached from politico-economic reality was born.

A recent project that could claim identification with the boffin model was Beagle 2, a shoestring-budgeted Mars lander jointly funded by the private and public sectors and combining the talents of academia and industry under the exceedingly boffin-like Colin Pillinger. The acronym-heavy craft showed where the project's sympathies lay, ranging from a robotic arm called the PAW (Position Adjustable Workbench) to its PLanetary Undersurface TOol, or PLUTO.

With follow-up Beagle 3 cancelled in 2004 after the disappearance and presumed destruction of its predecessor, you might think that would be the final nail in the boffin coffin (groan). But the HOTOL designers have been quietly beavering away for the last few decades and a new project has risen from the ashes of the original. Skylon, a spaceplane named after the 1951 Festival of Britain structure, received a boost last year from a £900,000 ESA contribution towards its £6 million SABRE (Synergetic Air-Breathing Rocket Engine) research project. Initially unmanned, the craft even has the potential to house a cabin for up to forty passengers. With an estimated first flight around 2020 the project offers hope of a cheaper reusable spacecraft, but the combination of the current economic downturn and the history of similar projects does not bode well; estimates suggest that even the British military will face budget cuts of eleven to twenty-five percent over the next six years.

So what next for boffindom? International collaboration on the aerospace and astronautics front is obviously the only way forward for Britain, but whether the tradition of idealistic, even eccentric, inventor/designer/engineers can prevail is anyone's guess. Recent news stories mention boffins at CERN (home to the Large Hadron Collider) and even in Japan (where they have successfully bred transparent animals, no less), but for me the archetypal boffin will always be British and skyward-looking, regardless of whether they smoke a briar pipe or not.


Sunday 17 January 2010

Shall I compare thee to a charming quark? When mitochondria meet metaphor

Many years ago, whilst holidaying in Cyprus, I experienced an event commonplace to our ancestors but increasingly rare for us light-polluted urbanites today. Sitting outside one evening, I watched a spectacular glow appear over a nearby hill, slowly gaining floodlight intensity until the full moon rose, casting shadows and obscuring the Milky Way. Small wonder previous centuries wrote so much about the beauty of the "starry realm"; but can poetry survive when, having discovered the secrets of the stars, we have ironically lost touch with them as a sensory experience? As the late Richard Feynman asked, "do I see less or more?" His answer, proving him a worthy successor to Emily Dickinson and Robert Frost, encapsulates the view that knowledge gained need not lessen the wonder: "stuck on this carousel my little eye can catch one million year old light..."

But then the night sky (and the natural world in general) is an easy poetic target compared to other aspects of science. Yet historical examples of British scientist-poets abound, from Charles Darwin's grandfather Erasmus, whose verse included copious footnotes explaining the ideas within, to chemist Humphry Davy, physicist James Clerk Maxwell, and more recently biologist Julian Huxley. You might ask who today's equivalents are - who writes paeans to messenger RNA or odes to nuclear fusion? There are poets who exchanged science for versifying (David Morley) and scientists who edit poetry (Jocelyn Bell Burnell), but few who simultaneously practise both sides of C.P. Snow's infamous The Two Cultures. Apart from several astronomy compilations (featuring verse largely by non-astronomers), there are hardly any recent science-orientated volumes aimed at adults except for James Muirden's The Cosmic Verses: A Rhyming History of the Universe. Informative as it is, Muirden's charming couplets hardly push the boundaries of poetry or science exposition.

One obvious (and therefore not necessarily correct) reason for the lack of contemporary science poetry is that the complexity of modern theories and terminology creates a prohibitive first hurdle: the likes of phagocytosis and inhomogeneous magnetic fields hardly trip off the tongue. However, ecologist and 'lapsed physicist' Mario Petrucci, a rare example of a contemporary scientist with an actively employed poetic gift, argues that science-inspired poetry shouldn't rely on technological name-dropping but should look to the defining methodologies. He provides an exquisite example via a (prose) description of the physiological response to listening to verse, which he defines as the "subliminal scent of aroused communication".

Then again, modes of writing have changed dramatically over the past century, with the florid, highfalutin prose of the Victorians replaced by a detached, matter-of-fact style developed to avoid ambiguity. Thomas Henry Huxley (Julian's grandfather) was, like many of his contemporaries, capable of prose that to the modern ear is to all intents and purposes poetry: "...intellectually we stand on an islet in the midst of an illimitable ocean of inexplicability. Our business in every generation is to reclaim a little more land..." In contrast, today's technical papers achieve universal comprehension by austerity of language. This is of course the complete antithesis of poetry, wherein each reader brings their own personal history to enhance imagery and meaning.

At a practical level, does the constant 21st-century babble of communications and background noise (not just aural) deprive would-be poets of time to reflect? This implies a somewhat rose-tinted view of earlier times, even though the virtual disappearance of a Classics-based education system has certainly divested us of the safety net of enduring metaphors. In addition, as scientists become ever more specialised in narrower fields (not to mention polymathy seemingly being frowned upon), is there a fear among practitioners and publishers alike that the profession has little worth versifying? Even the romantic image of the stargazer spending their nights in a chilly dome has seemingly been replaced by observation via computer screen.

Despite there probably being more books arguing over the relationship between the arts and sciences than there are volumes of science-themed poetry (from Mary Midgley versus Richard Dawkins to Stephen Jay Gould's attack on E.O. Wilson's definition of consilience), there is plenty for scientist-poets, or just writers with scientific knowledge, to write about. The late 19th-century arrogance that the quest for knowledge was nearing its end has been superseded by the view that there may not even be any final answers to life, the universe, and everything. Far from being a list of dry facts and equations, the methods of science demand creativity to achieve paradigm shifts, as anyone with an understanding of Einstein's thought experiments knows. Other natural philosophers have achieved major breakthroughs via aesthetic considerations: harmonic proportions for Johannes Kepler, symmetry for Clerk Maxwell, and patterns and linguistic analogies for Mendeleyev. As theoretical physicist Lee Smolin has stated, his discipline is based around an aesthetic mode of working, fashioning constructs that capture some essence of understanding about reality. Are theories such as loop quantum gravity that different from poetic metaphors? After all, even the subatomic particle we call a quark was named after the sound made by ducks, and only later linked to the rhyme in Finnegans Wake.

But then there is the difficulty of finding a universal definition for poetry anyway. The title of Michael Guillen's Five Equations that Changed the World: The Power and Poetry of Mathematics suggests an aesthetic form on a par with verse. If we can accept a wider meaning then perhaps there is a solution as to where science poetry is still to be found: hidden in the mellifluous prose of popularisers. The poetic style of Carl Sagan and his successors can clearly be traced to Loren Eiseley, thence to the pre-war British polymath James Jeans, who in turn was not so far removed from T.H. Huxley at his most rhapsodical. In addition to his writing, Sagan was also capable of poetic gestures that clearly represent our multi-media age's continuation of Erasmus Darwin's verses. When Voyager 1 had passed the orbits of Neptune and Pluto, Sagan persuaded NASA to turn the probe's cameras back towards the Sun and take a family portrait of the Solar System, including our very own pale blue dot. Surely this is a superlative example of the amalgamation of science and poetry? And as to the future, the English author Eden Phillpotts once wrote: "The universe is full of magical things, patiently waiting for our wits to grow sharper."


Saturday 9 January 2010

Quis custodiet ipsos custodes? (Or who validates popular science books?)

Gandhi once said "learn as if you were to live forever", but for the non-scientist interested in gaining accurate scientific knowledge this can prove rather tricky. Several options are available in the UK, most with drawbacks: there are few 'casual' part-time adult science courses (including the Open University); the World Wide Web is useful but inhibits organised, cohesive learning and there's always the danger of being taken in by some complete twaddle; whilst television documentaries and periodicals rarely delve into enough detail. This only leaves the ever-expanding genre of popular science books, with the best examples often including the false starts and failed hypotheses that make science so interesting.

However, there is a problem: if the book includes mistakes then the general reader is unlikely to know any better. I'm not talking about the usual spelling typos but more serious flaws concerning incorrect facts or worse still, errors of emphasis and misleading information. Admittedly the first category can be quite fun in a 'spot the mistake' sort of way: to have the particle physicists Brian Cox and Jeff Forshaw inform you that there were Muslims in the second century AD, as they do in Why does E=mc2? (and why should we care?) helps to make the authors a bit more human. After all, why should a physicist also have good historical knowledge? Then again, this is the sort of fact that is extremely easy to verify, so why wasn't this checked in the editing process? You expect Dan Brown's novels to be riddled with scientific errors, but are popular science book editors blind to non-science topics?

Since the above is an historical error many readers may be aware of the mistake, but the general public will often not be aware of inaccuracies relating to scientific facts and theories. Good examples of the latter can be found in Bill Bryson's A Short History of Nearly Everything, the bestselling popular science book in the UK in 2005. As a non-scientist Bryson admits that it's likely to be full of "inky embarrassments", and he's not wrong. For instance, he makes several references to the DNA base thymine but at one point calls it thiamine, which is actually vitamin B1. However, since Bryson is presenting themed chapters of facts (his vision of science rather than any explanation of methods), these are fairly minor issues and don't markedly detract from the substance of the book.

So far that might seem a bit nitpicky, but there are other works containing more fundamental flaws that give a wholly inaccurate description of a scientific technique. My favourite error of this sort can be found in the late Stephen Jay Gould's Questioning the Millennium and is a howler that continues to astonish me more than a decade after I first read it. Gould correctly states that raw radiocarbon dates are expressed as years BP (Before Present) but then posits that this 'present' relates directly to the year of publication of the work containing that date. In other words, if you read a book published in AD 2010 that refers to the date 1010 BP, the latter year is equivalent to AD 1000; whereas for a book published in AD 2000, 1010 BP would equate to AD 990. It's astounding that Gould, who as a palaeontologist presumably had some understanding of other radiometric dating methods, could believe such a system would be workable. The 'present' in the term BP was fixed at AD 1950 decades before Gould's book was published, so it doubly astonishes that no-one questioned his definition. You have to ask whether his editors were so in awe that they were afraid to query his text, or whether his prominence gave him copy-editing control of his own material. A mistake of this sort in a discipline so close to Gould's area of expertise can only engender doubt as to the veracity of his other information.
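
Because the reference year is fixed, converting a raw BP figure is simple arithmetic, which is exactly why Gould's sliding 'present' is so baffling. A minimal sketch (the function name is my own, purely for illustration):

```python
# Convert raw radiocarbon ages in years BP to calendar years.
# By convention 'Before Present' is fixed at AD 1950 -- it does not
# move with the publication date of whatever book quotes the figure.

BP_REFERENCE_YEAR = 1950  # fixed by convention, not the year of publication

def bp_to_calendar(years_bp):
    """Return (year, era) where era is 'AD' or 'BC' (there is no year zero)."""
    year = BP_REFERENCE_YEAR - years_bp
    if year > 0:
        return year, "AD"
    # The year before AD 1 is 1 BC, so shift by one when crossing zero.
    return 1 - year, "BC"

# 1010 BP is AD 940, whether the book quoting it appeared in 2000 or 2010.
print(bp_to_calendar(1010))  # (940, 'AD')
print(bp_to_calendar(2500))  # (551, 'BC')
```

Under Gould's scheme the same 1010 BP date would drift from AD 990 to AD 1000 between editions; with the fixed 1950 reference it is always AD 940.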

A more dangerous type of error is when the author misleads his readership through personal bias presented as fact. This is particularly important in books dealing with recent scientific developments, as there will be few alternative sources from which the public can glean the information. In turn, this highlights the difference between professionals with their peer-reviewed papers and the popularisations available to the rest of us. There is an ever-increasing library of popular books discussing superstrings and M-theory, but most make the same mistake of promoting this highly speculative branch of physics not just as the leading contender in the search for a unified field theory, but as the only option. Of course a hypothesis that cannot be experimentally verified is not exactly following a central tenet of science anyway. There has been discussion in recent years of a 'string theory mafia', so perhaps this is only a natural extension into print; nonetheless it is worrying to see a largely mathematical framework given so much premature attention. I suppose only time will tell...

It also appears that some publishers will accept material from senior but non-mainstream scientists on the basis of the scientist's stature, even if their hypotheses border on pseudoscience. The late Fred Hoyle was a good example of a prominent scientist with a penchant for quirky (some might say bizarre) ideas such as panspermia, who, although unfairly ignored by the Nobel Committee, seems to have had few problems getting his theories into print. Another example is Elaine Morgan, who over nearly four decades has written a string of volumes promoting the aquatic ape hypothesis despite the lack of supporting evidence in the ever-increasing fossil record.

But whereas Hoyle's and Morgan's ideas have long been viewed as off the beaten track, there are more conventional figures whose popular accounts can be extremely misleading, particularly if they promote the writer's pet ideas over the accepted norm. Stephen Jay Gould himself frequently came in for criticism for overemphasising various evolutionary mechanisms at the expense of natural selection, yet his peers' viewpoint is never discussed in his popular writings. Another problem can be seen in Bryan Sykes's The Seven Daughters of Eve, which received enormous publicity on publication as it gratifies our desire to understand human origins. However, the book includes a jumbled combination of extreme speculation and pure fiction, tailored in such a way as to maximise interest at the expense of clarification. Some critics have argued that the reason behind Sykes's approach is to promote his laboratory's mitochondrial DNA test, capable of revealing which 'daughter' the customer is descended from. Scientists have to make a living like everyone else, but this commercially-driven example perhaps sums up the old adage that you should never believe everything you read. The Catch-22, of course, is that unless you understand enough of the subject beforehand, how will you know if a popular science book contains errors?

A final example does indeed suggest that some science books aimed at a general audience prove to be just too complex for comprehensive editing by anyone other than the author. I am talking about Roger Penrose's The Road to Reality: A Complete Guide to the Laws of the Universe. At over one thousand pages this great tome is marketed with the sentence "No particular mathematical knowledge on the part of the reader is assumed", yet I wonder whether the cover blurb writer had their tongue firmly in their cheek? It is supposed to have taken Penrose eight years to write and from my occasional flick-throughs in bookshops I can see it might take me that long to read, never mind understand. I must confess all those equations haven't really tempted me yet, at least not until I have taken a couple of Maths degrees...

Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space identifying an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and historical accidents embedded into our language and therefore our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst of course Ocean would be more appropriate. Whilst this is just an obvious chauvinism for a land-based species, there are other terms that owe everything to history. We count in base ten, place zero longitude at the Greenwich Meridian, and usually show the Earth from one perspective, despite there not being an arrow in our galaxy stating 'this way up' (but then had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with constructs? Our calendar is an archaic, sub-optimal mish-mash: a relic of the Roman year that originally began in March means the last four months of the year are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010, thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great. It appears that even the fundamentals that guide us through life are subjective at the very least, if not far from accurate in many cases.

The philosopher of science Thomas Kuhn argued that all scientific research is a product of the culture of the scientists engaged in it. So whilst we might claim that Galileo was the first scientist in a strictly modern use of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton) and those of the modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else, and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even boost our egos, via unconscious assumptions that are gradually looking more ridiculous as we delve ever deeper into the mysteries of creation. For example, the past sixty-five million years has been a period frequently named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial and we macroscopic life forms are comparative newcomers, restricted to a far reduced range of environments compared to bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all its languages, from Latin to English, have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience while our technology fails to keep pace with confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense differing from its vernacular meaning, problems frequently arise. A classic example is quantum leap, which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level. However, even personal computer pioneer Sir Clive Sinclair used the term in its former sense for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)

Speaking of which, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters stated: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.
