Friday 26 August 2016

The benefit of hindsight: the truth behind several infamous science quotes

With utmost apologies to Jane Austen fans, it is a truth universally acknowledged that most people misinterpret science as an ever-expanding corpus of knowledge rather than as a collection of methods for investigating natural phenomena. Those who hold the former misapprehension may be tempted to question science as a whole whenever a high-profile practitioner makes an authoritative statement that is later proven - in a scientific sense - to be incorrect.

Amongst the more obvious examples of this are the numerous quotations from prominent STEM (Science, Technology, Engineering and Mathematics) professionals that are inaccurate to such an extreme as to appear farcical in light of later evidence. I have already discussed the rather vague art of scientific prognostication in several connected posts, but now want to examine directly several quotations concerning applied science. Whereas many quotes probably deserve the contempt in which they are popularly held, I believe the following require careful reading and knowledge of their context before any meaningful judgement can be attempted.

Unlike Hollywood plots, STEM subjects are frequently too complex for simple black-versus-white analysis. Of course there have been some risible opinions espoused by senior scientists, many of which - luckily - remain largely unknown to the wider public. The British cosmologist and astronomer Sir Fred Hoyle had a large number of these all to himself, from continued support for the Steady State theory long after the detection of the cosmic microwave background radiation, to the even less defensible claims that the Natural History Museum's Archaeopteryx fossil is a fake and that flu germs are really alien microbes!

Anyhow, here's the first quote:

1) Something is seriously wrong with space travel.

Richard van der Riet Woolley was the British Astronomer Royal at the dawn of the Space Age. His most infamous quote is the archetypal instance of Arthur C. Clarke's First Law:  "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."

Although a prominent astronomer, van der Riet Woolley had little knowledge of the practical mechanics that would be required for spaceflight. During the 1930s the British Interplanetary Society was developing detailed (although largely paper-only) studies for a crewed lunar landing mission. In 1936 van der Riet Woolley publicly criticised such work, stating that the development of even an unmanned rocket would present fundamental technical difficulties. Bear in mind that this was only six years before the first V2 rocket, which was capable of reaching an altitude of just over 200km!

In 1956, only one year before Sputnik 1 - and thirteen years prior to Apollo 11 - the astronomer went on to claim that near-future space travel was unlikely and a manned lunar landing "utter bilge, really". Of course this has been used as ammunition against him ever since, but the quote deserves some investigation. Van der Riet Woolley also revealed that his primary objection had apparently shifted (presumably post-V2 and its successors) from an engineering problem to an economic one, stating that it would cost as much as a "major war" to land on the moon.

This substantially changes the flavour of his quote, since it is after all reasonably accurate. In 2010 dollars, Project Apollo had an estimated cost of about US$109 billion - incidentally about 11% of the cost of the contemporary Vietnam War. In addition, we should bear in mind that a significant amount of the contractors' work on the project is said to have consisted of unpaid overtime. Is it perhaps time to reappraise the stargazer as an economic realist rather than a reactionary curmudgeon?
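As a quick sanity check on that comparison - using only the figures quoted above, not an independent estimate - the implied bill for the Vietnam War in 2010 dollars is:

\[ \frac{\$109\ \text{billion}}{0.11} \approx \$990\ \text{billion} \]

In other words, getting on for a trillion dollars, which puts the "major war" remark into rather sharp perspective.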

Indeed, had Apollo been initiated in a subsequent decade, there is reasonable evidence to suggest it would have failed to leave the ground, so to speak. The uncertainty of the post-Vietnam and Watergate period, followed by the collapse of the Soviet Union, suggests America's loss of faith in technocracy would have effectively cut Apollo off in its prime. After all, another colossal American science and engineering project, the $12 billion Superconducting Super Collider particle accelerator, was cancelled in 1993 after being deemed unaffordable. Yet up to that point only about one-sixth of its estimated budget had been spent.

In addition, van der Riet Woolley was not alone among STEM professionals: for three decades from the mid-1920s, Lee De Forest, inventor of the triode vacuum tube, is said to have claimed that space travel was impractical. Clearly, the Astronomer Royal was not an isolated voice in the wilderness but part of a large consensus opposed to the dreamers in the British Interplanetary Society and their ilk. Perhaps we should allow him his pragmatism, even if it appears a polar opposite to one of Einstein's great aphorisms: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science."

Talking of whom…

2) Letting the genie out of the bottle.

In late 1934 an American newspaper carried this quotation from Albert Einstein: "There is not the slightest indication that (nuclear energy) will ever be obtainable. It would mean that the atom would have to be shattered at will." This seems rather amusing, considering the development of the first self-sustaining nuclear chain reaction only eight years later. But Einstein was first and foremost a theorist, a master of the thought experiment; his father's work in electrical engineering was not noticeably carried on by the son. There is obviously a vast world of difference between imagining riding a beam of light and the practical difficulties of assembling brand-new technologies with little in the way of precedent. So why did Einstein make such a definitive prediction?

It may also have been wishful thinking on Einstein's part; as a pacifist he would have dreaded the development of a new super weapon. As the formulator of the equivalence between mass and energy, he could have felt in some way responsible for initiating the avalanche that eventually led to Hiroshima and Nagasaki. Yet there is no clear path between E=mc² and a man-made chain reaction; it took a team of brilliant experimental physicists and engineers, in addition to theorists, to achieve a practical solution, via the immense budget of $26 billion (in 2016 dollars).
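To see why the equation nevertheless hinted at such destructive potential, here is a rough back-of-the-envelope figure (an illustration only - a real fission bomb converts far less than all of its fuel's mass). Annihilating just one gram of matter entirely would release:

\[ E = mc^{2} = (10^{-3}\,\text{kg}) \times (3 \times 10^{8}\,\text{m/s})^{2} = 9 \times 10^{13}\,\text{J} \approx 21\ \text{kilotons of TNT} \]

That is of the same order as the yield of the Hiroshima bomb: the gulf Einstein saw was not in the energy available but in the engineering needed to release it.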

It is hardly as if the good professor was alone in his views either, as senior officials also doubted the ability to harness atomic fission for power or weaponry. In 1945 when the Manhattan Project was nearing culmination, the highest-ranking member of the American military, Fleet Admiral William Leahy, apparently informed President Truman that the atomic bomb wouldn't work. Perhaps this isn't as obtuse as it sounds, since due to the level of security only a very small percentage of the personnel working on the project knew any of the details.

Leahy clearly knew exactly what the intended outcome was, but even as "an expert in explosives" he had no understanding of the complexity of the engineering involved. An interesting associated fact is that despite being a military man, the Admiral considered the atomic bomb unethical for its obvious potential as an indiscriminate killer of civilians. Weapons of mass destruction lack any of the valour or bravado of traditional 'heroic' warfare. Is it possible that this martial leader wanted the bomb to fail for moral reasons, a case of heart over mind? In which case, is this a rare example of the pacifism of the world's best-known scientist being in total agreement with a military figurehead?

Another potential cause is the paradigm shift that harnessing the power of the atom required. In the decade prior to the Manhattan Project, the New Zealand-born physicist Ernest Rutherford had referred to the possibility of man-made atomic energy as "moonshine", whilst another Nobel laureate, the American physicist Robert Millikan, had expressed similar sentiments in the 1920s. And this from men who were pioneers in understanding the structure of the atom!

As science communicator James Burke vividly described in his 1985 television series The Day the Universe Changed, major scientific developments often require substantial reappraisals in outlook, seeing beyond what is taken for granted. The cutting edge of physics is often described as being ruled by theorists in their twenties: eager young Turks who are more prepared to ignore precedent. When he became a pillar of the establishment, Einstein ruefully commented: "To punish me for my contempt for authority, fate made me an authority myself."

Perhaps, then, such fundamental shifts in technology as the development of space travel and nuclear fission require equally revolutionary changes in mindset, and we shouldn't judge the authors of our example quotes too harshly. Then again, if you are an optimist, Clarke's First Law might seem applicable here, in which case authority figures with some knowledge of the subject in hand should take note of the ingenuity of our species before declaring anything impossible. If there is a moral to this story, it is that - the speed of light in a vacuum and the Second Law of Thermodynamics aside - you should never say never...

Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states that the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses, not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo their usage of antibacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use at work, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot-desking. Therefore something is clearly going wrong, regardless of approach!
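That "unintentional boost" is easy to see in a toy model. The sketch below (Python, with every figure invented purely for illustration - these are not measured kill rates) assumes each spray wipes out almost all of the susceptible bacteria but only a fraction of the resistant ones, after which the survivors regrow to the original population size. The resistant share climbs alarmingly quickly:

    # Toy model: repeated antibacterial "culls" enriching a resistant minority.
    # All figures are invented for illustration, not measured values.

    def simulate(sprays=10, pop_size=10_000, resistant_fraction=0.01,
                 kill_susceptible=0.99, kill_resistant=0.20):
        resistant = pop_size * resistant_fraction
        susceptible = pop_size - resistant
        for spray in range(1, sprays + 1):
            # Each cleaning kills most susceptible cells but far fewer resistant ones.
            susceptible *= 1 - kill_susceptible
            resistant *= 1 - kill_resistant
            # Survivors regrow to roughly the original population size,
            # preserving their post-cull proportions.
            total = susceptible + resistant
            susceptible = pop_size * susceptible / total
            resistant = pop_size - susceptible
            print(f"after spray {spray:2d}: resistant share = {resistant / pop_size:.1%}")

    simulate()

Real desk biota are of course messier - resistance often carries a fitness cost, and regrowth isn't instantaneous - but the direction of travel is the point.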

The best known of the super bugs has to be methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks unconnected to healthcare environments. Therefore popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter, community-associated form (CA-MRSA) is therefore at least as great a risk as the hospital variant, and often affects younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to eradicate totally by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice, I might add, to anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be dosed as a precaution - frankly I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can be used to promote growth implies profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics, four in 1989 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive method would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough any more; after all, if medical practice wants to prevent the spread of resistant bacteria it needs to minimise its use of antibiotics.

The effects agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its antibacterial properties - guess that proves I'm not a research chemist, then!). In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multinationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...