Tuesday 18 June 2013

Deserving dollars: should mega budget science be funded in an age of austerity?

With the UK narrowly avoiding a triple-dip recession, even as France slides back into one of its own, I thought I would bite the bullet and examine some of the economics of current science. At a time when numerous nations are feeling severe effects of the downturn, it is striking that there is a multitude of science projects with budgets larger than the GDP of some smaller nations. So who funds these ventures, and are they value for money, or even worthwhile, in these straitened times? Here are a few examples of current and upcoming projects, with the less well known the project, the more information supplied:

National Ignition Facility

The world's most powerful laser was designed with a single goal: to generate net energy from nuclear fusion by creating temperatures and pressures similar to those in the cores of stars. However, to state that the NIF has not lived up to expectations would be something of an understatement. According to even the most conservative sources, the original budget of the Lawrence Livermore National Laboratory project has at the very least doubled, if not quadrupled, to over US$4 billion, whilst the facility became operational around five years behind schedule.

I first learned of the project some years ago thanks to a friend who knew one of the scientists involved. The vital statistics are astonishing, both for the scale of the facility and the energies involved. But it seems that there may be underlying problems with the technology. Over-reliance on computer simulations, the dismissal of discouraging experimental results from precursor projects, the vested interests of project staffers and over-confident promises of military spin-offs have all been suggested as reasons why history may judge the facility a white elephant. So if you are looking for an archetypal example of how non-scientific factors can cripple research, this may well be it.

Unlike the other projects discussed here, the National Ignition Facility is funded solely by one nation, the USA. Of course, it could be argued that four billion dollars would be a bargain if the project succeeded, and that it is today's time-precious society that needs to learn patience in order to appreciate the long timescales required to overcome the immense technological challenges. Nuclear fusion would presumably solve many of today's - and the foreseeable future's - energy requirements whilst being rather more environmentally friendly than either fossil fuels or fission reactors. The potential rewards are plain for all to see.

However, the problems are deep-rooted, leading to arguments against the development of laser-based fusion per se. Alternative fusion projects such as the Joint European Torus and the $20 billion ITER - see an earlier post on nuclear fusion research for details - use longer-established methods. My verdict in a nutshell: the science was possibly unsound from the start and the money would be better spent elsewhere. Meanwhile, perhaps the facility could get back a small portion of its funding if Star Trek movies continue to hire the NIF as a filming location!

The International Space Station

I remember the late Carl Sagan arguing that the only benefit of the ISS that couldn't be achieved via cheaper projects - such as, during the Space Shuttle era, the European Space Agency's Spacelab - was research into the deleterious health effects of long-duration spaceflight. So at $2 billion per year to run, is it worthwhile, or just another example of a fundamentally flawed project? After all, as it stands the station includes such non-scientific facets as being the ultimate tourist destination for multi-millionaires!

Sometimes described as a lifeline for the American and Russian aerospace industries (or even as a way to prevent disaffected scientists in the latter from working for rogue states), the station leaves me unable to find a persuasive argument as to why the money would not have been better spent elsewhere. It is true that there has been investigation into vaccines for salmonella and MRSA, but after twelve years of permanent crewing on board the station, just how cost-effective has this research been? After all, similar studies were carried out on Space Shuttle flights over the preceding decades, suggesting that the ISS was not vital to these programmes. The Astronomer Royal Lord Martin Rees has described it as a 'turkey in the sky', siphoning funds that could have been spent on a plethora of unmanned missions such as interplanetary probes. But as we should be aware, money not spent on one project does not automatically become available for projects elsewhere.

On a positive scientific note, the station has played host to the $2 billion Alpha Magnetic Spectrometer - a key contender in the search for dark matter - which would presumably have difficulty finding a long-duration orbital platform elsewhere. But then this is hardly likely to excite those who want immediate, practical benefits from such huge expenditure.

The ISS has no doubt performed well as a test bed for examining the deterioration of the human body during long periods in space, if anything seriously weakening the argument for a manned Mars mission in the near future. Perhaps the one other area in which the station has excelled is as a focal point for promoting science to the public, but surely those who follow in Sagan's footsteps - the UK's Brian Cox for one - can front television series with a similar goal for the tiniest fraction of the cost?

The Large Hadron Collider

An amazing public-relations success story, considering how far removed the science and technology are from everyday experience, the world's largest particle accelerator requires around $1 billion per year to operate on top of a construction budget of over $6 billion. With more than 10,000 scientists and engineers involved, the facility is currently in the midst of a two-year upgrade, giving its international research community plenty of time to analyse the results. After all, the Higgs boson, a.k.a. the 'God particle', has been found… probably.

So if the results are confirmed, what next? Apparently the facility can be re-engineered for a wide variety of purposes, ranging from immediately pragmatic biomedical research on cancer and radiation exposure to the long-term search for dark matter. This combination of practical benefits with extended fundamental science appears to be as good a compromise as any among similar-scale projects. Whether similar research could be carried out by smaller, more specialised projects I don't know. Does anyone?

As for the future of mega-budget schemes, there are various projects in development extending into the next decade. The Southern Hemisphere is playing host to two large international collaborations: the Square Kilometre Array is due to begin construction in 2016 across sites in eleven nations - its headquarters remaining in the UK - but it will be around eight years before this $2 billion radio telescope array is fully operational. Meanwhile the equally unimaginatively named European Extremely Large Telescope is planned for a site in Chile, with an even longer construction period and a price tag approaching $1.5 billion. Both projects are being designed for a variety of purposes, from dark matter investigation to the search for small (i.e. Earth-sized) extra-solar planets with atmospheres showing signs of biological modification.

At this point it is pertinent to ask: do extremely ambitious science projects have to come with equally impressive price tags? Personally, I believe that with a bit more ingenuity a lot of useful research can be undertaken on far smaller budgets. Public participation in distributed computing projects such as Folding@home and SETI@home, in which raw data is parcelled out for processing on home computers, is about as modest an approach as is feasible for handling such large amounts of information.
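To give a flavour of the work-unit model such projects rely on, here is a deliberately simplified sketch - purely illustrative, not the projects' actual client software. The function names and threshold are invented, and local worker processes stand in for volunteers' home machines:

    # Toy sketch of volunteer computing in the style of SETI@home / Folding@home.
    # A central server splits raw data into small "work units"; each unit is
    # analysed independently (here by local processes standing in for volunteers'
    # PCs), and the partial results are merged at the end.
    from multiprocessing import Pool
    import random

    def analyse_work_unit(chunk):
        # Stand-in analysis: count samples above an arbitrary threshold.
        return sum(1 for sample in chunk if sample > 0.999)

    def split_into_work_units(data, unit_size):
        # Cut the raw data stream into fixed-size chunks.
        return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

    if __name__ == "__main__":
        raw_data = [random.random() for _ in range(1_000_000)]  # pretend telescope feed
        work_units = split_into_work_units(raw_data, unit_size=10_000)
        with Pool() as pool:  # the 'volunteers', in miniature
            partial_results = pool.map(analyse_work_unit, work_units)
        print("candidate signals found:", sum(partial_results))

The point of the design is that each work unit is independent, so the analysis scales simply by recruiting more machines rather than buying a bigger one.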

An example of a long-term project on a comparatively small budget is the US-based EarthScope programme, which collects and analyses data including eminently practical research into seismic detection. With a construction cost of about $200 million and an annual budget of around a mere $125 million, this seems a relative bargain for a project that combines wide-scale theoretical targets with short-term, pragmatic gains. But talking of practical goals, there are other scientific disciplines crying out for a large increase in funding. Will the explosive demise of a meteor above the Russian city of Chelyabinsk back in February act as a wake-up call for more research into locating and deflecting Earth-crossing asteroids and comets? After all, the 2014 NASA budget for asteroid detection projects is barely over the hundred-million-dollar mark!

I will admit that enormous projects have some unique advantages, such as bringing together researchers from the funding nations in ways that may lead to fruitful collaboration; the sheer number of scientists gathered together for long periods counts for more than a few days spent at an international conference or seminar. Even so, I cannot help but feel that the money for many of the largest-scale projects could be better used elsewhere, solving some of the immediate problems facing our species and ecosystem.

Unfortunately, the countries involved offer their populations little say in how public money is spent on research. But then, considering the appalling state of science education in so many nations, as well as the short shrift that popular culture usually gives the discipline, perhaps that isn't so surprising after all. If we want to make mega-budget projects more accountable, we will need to make fundamental changes to the status of science in society. Without increased public understanding of the research involved, governments are unlikely to grant us a choice.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena and delineating cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism. In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation consistent with the evidence is usually to be preferred.

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. Although his work is frequently mythologised (I side with the rolling-weights-down-inclined-planes camp rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa brigade), Galileo most likely devised both actual and thought experiments to probe fundamental questions, such as the separate effects of air resistance and gravity.
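(As an aside on just how far simple geometry could go: the traditionally quoted figures for Eratosthenes' measurement - the precise length of the stadion is still debated - give

$$C \approx \frac{360^{\circ}}{7.2^{\circ}} \times 5{,}000 \text{ stadia} = 50 \times 5{,}000 = 250{,}000 \text{ stadia}$$

where 7.2° is the difference in the Sun's noon altitude between Syene and Alexandria at the summer solstice, and 5,000 stadia the estimated distance between the two cities - a result that, depending on which stadion is assumed, may be within a few per cent of the modern value for the Earth's circumference.)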

Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass-bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely aimed at understanding how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever before seen such miniature marvels as bacteria. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who devised the cataloguing methodology still in use today. The New Zealand-born physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definition there is no bedrock for more sophisticated research.

The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that largely untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose ranks Charles Darwin very nearly joined) who worked on the quantity-over-quality principle, collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to turn the results into practical applications. Although in the television series Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew holds around 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration. In addition to the biota yet to be described in the scientific record, the existing catalogues are undergoing major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomists, work that has so far uncovered some 600,000 superfluous designations. It isn't just plants, either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of the natural world, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore how phenomena occur and how events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be described as holding a magnifying glass up to nature, reducing a phenomenon to an approximation and explaining how that approximation works?

For example, Newton took Galileo's and Kepler's astronomical work and ran with it, producing his Law of Universal Gravitation. The 'how' in this case is the inverse-square law, with its universal gravitational constant, describing how bodies orbit their common centre of mass. However, Newton was unable to explain what caused the force to act across empty space - an explanation that had to wait for stage three.
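For reference, the law in its familiar textbook form reads

$$F = \frac{G\,m_1 m_2}{r^2}$$

where F is the attractive force between two masses m₁ and m₂ separated by a distance r, and G is the gravitational constant - a number that sets the strength of the attraction while saying nothing about what transmits it across empty space. That silence is precisely the gap stage three would later address.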

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest that there is a feedback cycle in which knowing which questions to ask is at least as important as gaining answers - the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Science is therefore a long way from identifying all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds plenty of secrets, and probably will continue to do so for some time to come.
Though microelectronics in general and computers in particular have allowed experiments in such fields as quantum teleportation, considered close to impossible by the finest minds only half a century ago, there are several reasons why computer processing power is approaching a theoretical maximum for current manufacturing techniques and materials. The near future may therefore see a slowing down of the sort of leading-edge experimental science that has been achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. The physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, then just where do you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result is that the boundary between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek ever more accurate approximations of phenomena, phase three includes the search for why one theory should be preferred over another. A prominent example may help elucidate. Following Galileo in phase one and Newton in phase two, Einstein's General Relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality, owing to their incompatibility with quantum mechanics. Herein lies the rub!

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and, more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously minded scientists to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes each with different fundamental constants (and therefore including a lucky few capable of producing life). The obvious issue here is whether Occam's Razor would favour the former over the latter. As the Astronomer Royal Lord Martin Rees notes, this is veering into metaphysical territory, an area that, religiously inclined scientists aside, is usually avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way towards turning what to most people is still a purely philosophical notion into a workable theory. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever directly observed quarks, which in the early days of their formulation were sometimes regarded as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...