Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses, not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo the use of anti-bacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use myself, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some microbes are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot-desking, so something is clearly going wrong, whichever approach we take!
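To put some (entirely invented) numbers on that unintentional Darwinian boost, here is a minimal Python sketch of repeated spraying acting as a selection pressure. The survival rates and the regrow-to-capacity model are illustrative assumptions of mine, not measured values:

    # A toy model: each 'spray' kills most susceptible bacteria but far fewer
    # resistant ones, then the surviving population regrows to fill the desk.
    SUSCEPTIBLE_SURVIVAL = 0.01    # assumed: spray kills 99% of susceptible cells
    RESISTANT_SURVIVAL = 0.80      # assumed: resistant strain largely shrugs it off
    CARRYING_CAPACITY = 1_000_000  # desk biota regrows to this total between sprays

    def spray_cycle(susceptible, resistant):
        # Selection step: differential survival under the antibacterial agent.
        susceptible *= SUSCEPTIBLE_SURVIVAL
        resistant *= RESISTANT_SURVIVAL
        # Regrowth step: scale both strains back up to the carrying capacity,
        # preserving their post-spray ratio (a crude stand-in for regrowth).
        total = susceptible + resistant
        if total > 0:
            scale = CARRYING_CAPACITY / total
            susceptible *= scale
            resistant *= scale
        return susceptible, resistant

    # Start with the resistant strain as a tiny minority (0.1% of the population).
    s, r = CARRYING_CAPACITY * 0.999, CARRYING_CAPACITY * 0.001
    for cycle in range(1, 11):
        s, r = spray_cycle(s, r)
        print(f"cycle {cycle:2d}: resistant fraction = {r / (s + r):.1%}")

Even with these made-up figures, the once-rare resistant strain dominates within a handful of spray-and-regrow cycles - which is rather the point.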

The best-known of the super bugs has to be methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks unconnected with healthcare environments. In other words, popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter, community-associated form (CA-MRSA) is therefore at least as great a risk as the hospital variant, often affecting younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to eradicate totally by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice, I might add, for anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be given antibiotics prophylactically - frankly I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can also be used to promote growth implies that profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics, four in 1999 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive approach would be to develop drugs with similar effects on animal growth but no role in human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough anymore; after all, if medical practice wants to prevent the spread of resistant bacteria, it needs to minimise the use of antibiotics.

The effects agricultural usage is having are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of livestock-associated MRSA (LA-MRSA). What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become usable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its anti-bacterial properties - guess that proves I'm not a research chemist, then!). In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Tuesday 21 June 2016

Military intelligence: how martial sci-tech does science few favours

I recently read an article about the USA's latest aircraft carrier, the USS Gerald R. Ford, that contained two bewildering facts: that at a combined research and construction cost of around US$18 billion it is the most expensive warship ever built; and that although it is only the first of three ships to be built in the class - and has an intended lifespan of half a century - it may already be obsolete.

So if potential aggressor nations now have the anti-ship missile technology to sink the carrier, is it little more than an enormous waste of taxpayer funds? There are reports of war games and simulations over the past three decades which fundamentally undermine the Victorian notion of technological progress - that bigger, stronger, faster equals better. This is particularly apt if your opponent uses 'unfair' and/or 'underhand' tactics such as stealth systems and guerrilla strategies. Why, then, are these colossal projects still being funded?

The USS Gerald R. Ford is merely the (admittedly very large) tip of an enormous iceberg concerning military expenditure of recent decades. Just to drive the point home, here's a few other recent examples:
  1. The US Navy's aircraft carrier version of the Lightning II Joint Strike Fighter is the F-35C, with some estimates suggesting each combat-ready aircraft costs up to $337 million.
  2. The US Air Force's F-22 Raptor programme was shut down after only 187 operational aircraft were built, as the price per airframe was even higher, around $350 million.
  3. The apotheosis of combat aircraft has to be the B-2 Spirit stealth bomber. Only 21 were ever built, at a whopping $737 million each, excluding the research and development costs, which may double or even triple this number.
  4. So as not to seem unfairly biased against the USA, other nations have their share of extravagant military expenditure too. For example, South Korea's K2 Black Panther is the most expensive main battle tank ever built, at around US$8.5 million per unit.
So who's to blame for all this? The USS Gerald R. Ford, for example, was approved during George W. Bush's administration but is only nearing completion eight years after he left office. At least in democracies, politicians usually come and go in less than a decade, whilst defence contractors last much longer. Could the armaments sector be duping administrations into giving it a lifeline? A large proportion of manufacturing has migrated to developing nations, but due to the sensitive nature of the sector, advanced military technology is one of the few areas still concentrated within the developed West.

It's difficult to collate anything like exact figures, but the proportion of STEM (Science, Technology, Engineering and Mathematics) professionals worldwide who work on military projects is frequently given as 20% to 25%. Is it feasible that such a high level of involvement in an area that is both secretive and horrendously expensive may be damaging the public's attitude to science in general?

After all, no other sector has access to such enormous amounts of taxpayers' funds without being subject to some form of public scrutiny. Then again, since the early 1980s we have been sold a vision of military technology that is a mostly one-sided glorification of armaments and of the requirement for ever-increasing expenditure in the name of freedom.

How many mainstream Hollywood movies since 1986's Top Gun - including plenty of sci-fi epics - can be seen as glossy advertisements for advanced weaponry? It may seem odd considering the conventional portrayal of movie scientists, but homages to the military-industrial complex show little sign of abating: the sophistication of the technology is praised, whilst those who develop it are damned as untrustworthy schemers outside mainstream society. It's a curious phenomenon!

However, developing advanced technology for military purposes is hardly new. The ancient Greek Archimedes developed anti-ship devices whilst Leonardo da Vinci wrote effusive letters to prospective patrons about his land, sea and even aerial weapons, albeit some were of dubious practicality.

Today's society is supposedly more refined than those earlier times, yet whilst a concerted effort is being made to attract more women to STEM subjects, the macho nature of armaments presumably ensures the sector remains male-dominated. If proof were needed of the interest in all things explosive, the global success of the TV show Mythbusters should be a good indicator. If an example of the crazy nature of unrestrained masculinity needs delineating, then how about atomic bomb pioneer Edward Teller's promotion of nuclear devices for civil engineering projects? For every J. Robert Oppenheimer there were far more Tellers.

It isn't just the sheer cost of contemporary military projects that can lead to the ire of taxpayers. There have been some almost farcical instances of under-performance, such as the degradation of the B-2's anti-radar coating by high levels of humidity (never mind rain). It's easy to blame the scientists and engineers in such circumstances; after all, the politicians and generals leave the cutting-edge technology to the experts! But talk about over-promise and under-deliver...

One area that presumably didn't exist before the Twentieth Century's development of weapons of mass destruction - and one that cannot be blamed on STEM professionals - is the deliberate use of civilians as guinea pigs. From the US and British atomic bomb tests that affected local populations as well as military personnel, to the cloud-seeding experiments over heavily-populated areas that may have led to fatal downpours, it seems no-one is safe from their own armed forces.

Of course, a large proportion of the degradation of the image of scientists as authority figures may have occurred during the Cold War, when it became apparent that military technocrats of the period earned their reputation as 'architects of the apocalypse'. There's obviously a lot of complexity around this issue. Arguments range back and forth on such topics as how, once the Apollo moon landings had proved America's technological superiority over the Soviet Union, the project was rapidly wound up; or how the more right-wing elements of society felt when that same know-how was stalemated by markedly inferior forces in Vietnam.

The space shuttle was another victim of military requirements, the orbiter's unprecedented size being needed to carry the large spy satellites of the day - together with the intention to fly two of them from Vandenberg Air Force Base on 'shadow' missions. In a sense, the military could be seen to have had their fingers in many leading but nominally civilian pies.

This isn't to say that there haven't been productive examples of military technology modified for civilian usage, from early manned spacecraft launched on adapted ICBMs to the ARPANET providing a foundation for the Internet.

Even so, it is easy to look at the immense worldwide expenditure on weapons development and wonder what could be achieved if even a few percent of that funding were redirected elsewhere. There's no doubt about it: the sheer quantity, sophistication and expense of modern military hardware raises some legitimate public concerns as to the role of science and technology in the name of 'defence'. Especially if $18 billion worth of aircraft carrier is little more than a showy piece of machismo that belongs to the last half century, not the next.