Friday 26 August 2016

The benefit of hindsight: the truth behind several infamous science quotes

With utmost apologies to Jane Austen fans, it is a truth universally acknowledged that most people misinterpret science as an ever-expanding corpus of knowledge rather than as a collection of methods for investigating natural phenomena. Those who hold the former misapprehension may be tempted to question science as a whole whenever high-profile practitioners make an authoritative statement that is later proven - in a scientific sense - to be incorrect.

Amongst the more obvious examples of this are the numerous citations from prominent STEM (Science, Technology, Engineering and Mathematics) professionals that are inaccurate to such an extreme as to appear farcical in light of later evidence. I have already discussed the rather vague art of scientific prognostication in several connected posts but now want to examine directly several quotations concerning applied science. Whereas many such quotes probably deserve the contempt popularly directed at them, I believe the following require careful reading and knowledge of their context before any meaningful judgement can be attempted.

Unlike Hollywood, STEM subjects are frequently too complex for simple black versus white analysis. Of course there have been some rather risible opinions espoused by senior scientists, many of which - luckily - remain largely unknown to the wider public. The British cosmologist and astronomer Sir Fred Hoyle accounts for a fair number of these all by himself, from continued support for the Steady State theory long after the detection of the cosmic microwave background radiation, to the even less defensible claims that the Natural History Museum's archaeopteryx fossil is a fake and that flu germs are really alien microbes!

Anyhow, here's the first quote:

1) Something is seriously wrong with space travel.

Richard van der Riet Woolley was the British Astronomer Royal at the dawn of the Space Age. His most infamous quote is the archetypal instance of Arthur C. Clarke's First Law: "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."

Although a prominent astronomer, van der Riet Woolley had little knowledge of the practical mechanics that would be required for spaceflight. By the mid-1930s the British Interplanetary Society had developed detailed (although largely paper-only) studies into a crewed lunar landing mission. In 1936 van der Riet Woolley publicly criticised such work, stating that the development of even an unmanned rocket would present fundamental technical difficulties. Bear in mind that this was only six years before the first V2 rocket, which was capable of reaching an altitude of just over 200km!

In 1956, only one year before Sputnik 1 - and thirteen years prior to Apollo 11 - the astronomer went on to claim that near-future space travel was unlikely and a manned lunar landing "utter bilge, really". Of course this has been used as ammunition against him ever since, but the quote deserves some investigation. Van der Riet Woolley went on to reveal that his primary objection appeared to have changed (presumably post-V2 and its successors) from an engineering problem to an economic one, stating that it would cost as much as a "major war" to land on the moon.

This substantially changes the flavour of his quote, since it is after all reasonably accurate. In 2010 dollars, Project Apollo had an estimated budget of about US$109 billion - incidentally about 11% of the cost of the contemporary Vietnam War. In addition, we should bear in mind that a significant amount of the contractors' work on the project is said to have consisted of unpaid overtime. Is it perhaps time to reappraise the stargazer as an economic realist rather than a reactionary curmudgeon?
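
As a rough sense-check using the figures above (a back-of-the-envelope estimate on my part, not an official costing): if US$109 billion is about 11% of the Vietnam War's cost, then the war itself works out at roughly 109 / 0.11 ≈ US$1 trillion in 2010 dollars - in other words, a crewed lunar programme for around a tenth of the price of a major war.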

Indeed, had Apollo been initiated in a subsequent decade, there is reasonable evidence to suggest it would have failed to leave the ground, so to speak. The uncertainty of the post-Vietnam and Watergate period, followed by the collapse of the Soviet Union, suggests America's loss of faith in technocracy would have effectively cut Apollo off in its prime. After all, another colossal American science and engineering project, the $12 billion particle accelerator known as the Superconducting Super Collider, was cancelled in 1993 after being deemed unaffordable. Yet up to that point only about one-sixth of its estimated budget had been spent.

In addition, van der Riet Woolley was not alone among STEM professionals: for three decades from the mid-1920s, Lee De Forest, inventor of the triode vacuum tube, is said to have claimed that space travel was impractical. Clearly, the Astronomer Royal was not an isolated voice in the wilderness but part of a large consensus opposed to the dreamers in the British Interplanetary Society and their ilk. Perhaps we should allow him his pragmatism, even if it appears a polar opposite to one of Einstein's great aphorisms: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science."

Talking of whom…

2) Letting the genie out of the bottle.

In late 1934 an American newspaper carried this quotation from Albert Einstein: "There is not the slightest indication that (nuclear energy) will ever be obtainable. It would mean that the atom would have to be shattered at will." This seems rather amusing, considering the development of the first self-sustaining nuclear chain reaction only eight years later. But Einstein was first and foremost a theorist, a master of the thought experiment, his father's aptitude for electrical engineering not having been noticeably passed on to his son. There is obviously a vast world of difference between imagining riding a beam of light and the practical difficulties of assembling brand new technologies with little in the way of precedent. So why did Einstein make such a definitive prediction?

It may also have been wishful thinking on Einstein's part; as a pacifist he would have dreaded the development of a new super weapon. As the formulator of the equivalence between mass and energy, he could have felt in some way responsible for initiating the avalanche that eventually led to Hiroshima and Nagasaki. Yet there is no clear path between E=mc2 and a man-made chain reaction; it took a team of brilliant experimental physicists and engineers in addition to theorists to achieve a practical solution, backed by an immense budget of some $26 billion (in 2016 dollars).
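
A back-of-the-envelope illustration (my own, purely to show the scale the equation implies): converting just one gram of matter entirely into energy gives E = mc2 ≈ 0.001 kg x (3 x 10^8 m/s)^2 ≈ 9 x 10^13 joules, roughly the explosive energy of 20,000 tonnes of TNT. The formula makes the prize obvious; it says nothing whatsoever about how that energy might actually be released, controlled or otherwise.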

It is hardly as if the good professor was alone in his views either, as senior officials also doubted the ability to harness atomic fission for power or weaponry. In 1945 when the Manhattan Project was nearing culmination, the highest-ranking member of the American military, Fleet Admiral William Leahy, apparently informed President Truman that the atomic bomb wouldn't work. Perhaps this isn't as obtuse as it sounds, since due to the level of security only a very small percentage of the personnel working on the project knew any of the details.

Leahy clearly knew exactly what the intended outcome was, but even as "an expert in explosives" he had no understanding of the engineering complexity involved. An interesting associated fact is that despite being a military man, the Admiral considered the atomic bomb unethical for its obvious potential as an indiscriminate killer of civilians; weapons of mass destruction lack any of the valour or bravado of traditional 'heroic' warfare. Is it possible that this martial leader wanted the bomb to fail for moral reasons, a case of heart over mind? If so, is this a rare example of the pacifism of the world's most famous scientist being in total agreement with the instincts of a military figurehead?

Another potential cause is the paradigm shift that harnessing the power of the atom required. In the decade prior to the Manhattan Project, New Zealand-born physicist Ernest Rutherford had referred to the possibility of man-made atomic energy as "moonshine", whilst another Nobel laureate, the American physicist Robert Millikan, had expressed similar sentiments in the 1920s. And this from men who were pioneers in understanding the structure of the atom!

As science communicator James Burke vividly described in his 1985 television series The Day the Universe Changed, major scientific developments often require substantial reappraisals in outlook, seeing beyond what is taken for granted. The cutting edge of physics is often described as being ruled by theorists in their twenties: eager young Turks who are more prepared to ignore precedent. When he became a pillar of the establishment, Einstein ruefully commented: "To punish me for my contempt for authority, fate made me an authority myself."

Perhaps, then, such fundamental shifts in technology as the development of space travel and nuclear fission require equally revolutionary changes in mindset, and we shouldn't judge the authors of our example quotes too harshly. Then again, if you are an optimist, Clarke's First Law might seem applicable here, in which case authority figures pronouncing on a subject they know something about should take note of the ingenuity of our species. If there is a moral to this story, it is that - apart from the speed of light in a vacuum and the Second Law of Thermodynamics - you should never say never...

Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses, not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo their use of anti-bacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use at work, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some bacteria are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria - after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot desking, so something is clearly going wrong, whichever approach is taken!
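
To illustrate that selection argument, here is a toy simulation (entirely made-up kill rates and population sizes, purely to show the principle rather than model any real bacterial community): if a spray wipes out nearly all of the susceptible cells but only some of the resistant ones, and the survivors regrow between cleanings, the resistant fraction climbs very quickly.

# Toy model of resistance enrichment on a repeatedly-sprayed surface.
# All numbers are illustrative assumptions, not measured values.
def simulate(cycles=8, population=1_000_000, resistant_fraction=0.01,
             kill_sensitive=0.99, kill_resistant=0.2):
    sensitive = population * (1 - resistant_fraction)
    resistant = population * resistant_fraction
    for cycle in range(1, cycles + 1):
        # the spray kills most susceptible cells, far fewer resistant ones
        sensitive *= 1 - kill_sensitive
        resistant *= 1 - kill_resistant
        # survivors regrow to the surface's rough carrying capacity
        total = sensitive + resistant
        sensitive *= population / total
        resistant *= population / total
        print(f"cycle {cycle}: resistant fraction = {resistant / population:.1%}")

simulate()

Of course the real-world dynamics (regrowth rates, fitness costs of resistance, re-colonisation from hands and air) are far messier than this sketch, but the direction of travel is the point.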

The best known of the super bugs has to be Methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks unconnected with healthcare environments. In other words, popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter, community-associated form (CA-MRSA) is therefore at least as great a risk as the hospital variant, and often affects younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to eradicate totally by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice, I might add, for anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be dosed as a precaution against infection - frankly I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can also be used to promote growth implies that profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics, four in 1989 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive method would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just don't find antibiotic development profitable enough any more; after all, if medical practice wants to prevent the spread of resistant bacteria, it needs to minimise the use of antibiotics.

The effects that agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its anti-bacterial properties - which I suppose proves I'm not a research chemist!). In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Tuesday 21 June 2016

Military intelligence: how martial sci-tech does science few favours

I recently read an article about the USA's latest aircraft carrier, the USS Gerald R. Ford, that contained two bewildering facts: that at a combined research and construction cost of around US$18 billion it is the most expensive warship ever built; and that although it is only the first of three ships to be built in the class - and has an intended lifespan of half a century - it may already be obsolete.

So if potential aggressor nations now have the anti-ship missile technology to sink the carrier, is it little more than an enormous waste of taxpayer funds? There are reports of war games and simulations over the past three decades which fundamentally undermine the Victorian notion of technological progress - that bigger, stronger, faster equals better. This is particularly apt if your opponent uses 'unfair' and/or 'underhand' tactics such as stealth systems and guerrilla strategies. Then why are these colossal projects still being funded?

The USS Gerald R. Ford is merely the (admittedly very large) tip of an enormous iceberg concerning military expenditure of recent decades. Just to drive the point home, here's a few other recent examples:
  1. The US Navy's aircraft carrier version of the Lightning II Joint Strike Fighter is the F-35C, with some estimates suggesting each combat-ready aircraft costs up to $337 million.
  2. The US Air Force's F-22 Raptor programme was shut down after only 187 operational aircraft were built, as the price per airframe was even higher, around $350 million.
  3. The apotheosis of combat aircraft has to be the B-2 Spirit stealth bomber. Only 21 were ever built, at a whopping $737 million each, excluding the research and development costs, which may double or even triple this number.
  4. So as not to seem unfairly biased against the USA, other nations also have their share of costly military projects. For example, South Korea's K2 Black Panther is the most expensive main battle tank ever built, with a per-unit cost of US$8.5 million.
So who's to blame for all this? The USS Gerald R. Ford, for example, was approved during George W. Bush's administration but is only nearing completion eight years after he left office. At least in democracies, politicians usually come and go in less than a decade, whilst defence contractors last much longer. Could the armaments sector be duping administrations into giving them a lifeline? A large proportion of manufacturing has migrated to developing nations, but due to the sensitive nature of the sector, advanced military technology is one of the few industries still concentrated within the developed West.

It's difficult to collate anything like exact figures, but the proportion of STEM (Science, Technology, Engineering and Mathematics) professionals worldwide who work on military projects is frequently given as 20% to 25%. Is it feasible that this high level of involvement in an area that is both secretive and horrendously expensive is damaging the public's attitude to science in general?

After all, no other sector has access to such enormous amounts of taxpayers' funds without being subject to some form of public scrutiny. Then again, since the early 1980s we have been sold a vision of military technology that is a mostly one-sided glorification of armaments and of the requirement for ever-increasing expenditure in the name of freedom.

How many mainstream Hollywood movies since 1986's Top Gun - including plenty of sci-fi epics - can be seen as glossy advertisements for advanced weaponry? It may seem odd considering the conventional portrayal of movie scientists, but homages to the military-industrial complex show little sign of abating: the technology itself is praised for its sophistication, whilst those who develop it are damned as untrustworthy schemers outside mainstream society. It's a curious phenomenon!

However, developing advanced technology for military purposes is hardly new. The ancient Greek Archimedes developed anti-ship devices whilst Leonardo da Vinci wrote effusive letters to prospective patrons about his land, sea and even aerial weapons, albeit some were of dubious practicality.

Today's society is supposedly more refined than those earlier times, yet whilst a concerted effort is being made to attract more women to STEM subjects, the macho nature of armaments presumably ensures the sector remains male-dominated. If proof were needed of the widespread interest in all things explosive, the global success of the TV show Mythbusters should be a good indicator. And if an example of the crazy nature of unrestrained masculinity needs delineating, then how about atomic bomb pioneer Edward Teller's promotion of nuclear devices for civil engineering projects? For every J. Robert Oppenheimer there were far more Tellers.

It isn't just the sheer cost of contemporary military projects that can lead to the ire of taxpayers. There have been some almost farcical instances of under-performance, such as the degradation of the B-2's anti-radar coating by high levels of humidity (never mind rain). It's easy to blame the scientists and engineers in such circumstances; after all, the politicians and generals leave the cutting-edge technology to the experts! But talk about over-promise and under-deliver...

One practice that presumably didn't exist before the Twentieth Century's development of weapons of mass destruction - and that cannot be blamed on STEM professionals - is the deliberate use of civilians as guinea pigs. From the US and British atomic bomb tests that affected local populations as well as military personnel, to the cloud-seeding experiments over heavily-populated areas that may have led to fatal downpours, it seems no-one is safe from their own armed forces.

Of course, a large proportion of the degradation of the image of scientists as authority figures may have occurred during the Cold War, when it became apparent that military technocrats of the period had earned their reputation as 'architects of the apocalypse'. There's obviously a lot of complexity around this issue. Arguments range back and forth on such questions as why, once the Apollo moon landings had proved America's technological superiority over the Soviet Union, the project was rapidly wound up; or how the more right-wing elements of society felt when that same know-how was stalemated by markedly inferior forces in Vietnam.

The space shuttle was another victim of military requirements, the orbiter's unprecedented size being dictated by the large spy satellites of the day - and by the intention to fly two orbiters from Vandenberg Air Force Base on 'shadow' missions. In a sense, the military could be seen to have had their fingers in many leading but nominally civilian pies.

This isn't to say that there haven't been productive examples of military technology modified for civilian usage, from early manned spacecraft launched on adapted ICBMs to the ARPANET providing a foundation for the Internet.

Even so, it is easy to look at the immense worldwide expenditure on weapons development and wonder what could be achieved if even a few percent of that funding were redirected elsewhere. There's no doubt about it: the sheer quantity, sophistication and expense of modern military hardware provide legitimate public concerns as to the role of science and technology in the name of 'defence'. Especially if $18 billion worth of aircraft carrier is little more than a showy piece of machismo that belongs to the last half century, not the next.

Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan and Stephen Jay Gould, there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects, such as the Large Hadron Collider, could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Incidentally, isn't that bizarre in itself, given that we consider ourselves so much more rational than all other animals, and that the human brain is supposedly the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel, whose title is invoked in criticism of GM foods, she may have been inspired by the general feeling of doom then in the air - almost literally in fact, due to the 1815 eruption of Mount Tambora, whose volcanic dust created 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but still much imitated in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'dark satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then science made a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering science was then, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? It wasn't just the changes to society and landscape that Blake took exception to, either: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebutted, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery shows there is still plenty to be in awe of.

Mainstream cinema frequently paints a very black-and-white, A-versus-B picture of the world (think classic westerns or war films). But science rarely fits into such neat parcels: consider how the more accurate general theory of relativity lives comfortably alongside its Newtonian predecessor. In addition, it's very tricky to make interesting drama within a traditional narrative structure that uses scientist protagonists, unless it's a disaster movie (even the likes of Jurassic Park falls within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems a shame that so ubiquitous a form of entertainment consistently portrays such a lack of sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?). Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters, planning to take over the world...