Showing posts with label FoldingAtHome. Show all posts

Tuesday, 18 June 2013

Deserving dollars: should mega budget science be funded in an age of austerity?

With the UK narrowly avoiding France's fate of a triple-dip recession, I thought I would bite the bullet and examine some of the economics of current science. At a time when numerous nations are feeling the severe effects of the downturn, it is ironic that there are a multitude of science projects with budgets larger than the GDPs of some smaller nations. So who funds these ventures, and are they value for money, or even worthwhile, in these straitened times? Here are a few examples of current and upcoming projects; the less well known the project, the more information supplied:

National Ignition Facility

The world's most powerful laser was designed with a single goal: to generate net energy from nuclear fusion by creating temperatures and pressures similar to those in the cores of stars. However, to state that the NIF has not lived up to expectations would be something of an understatement. According to even the most conservative sources, the original budget of the Lawrence Livermore National Laboratory project has at the very least doubled, if not quadrupled, to over US$4 billion, whilst the facility became operational five years behind schedule.

I first learned of the project some years ago thanks to a friend who knew one of the scientists involved. The vital statistics are astonishing, both for the scale of the facility and the energies involved. But it seems that there may be underlying problems with the technology. Over-reliance on computer simulations, denial of deleterious experimental results on precursor projects, the vested interests of project staffers and over-confidence in the potential for military advances have all been suggested as causes of what history may judge a white elephant. So if you are looking for an archetypal example of how non-scientific factors can cripple research, this may well be it.

Unlike all the other projects discussed, the National Ignition Facility is funded solely by one nation, the USA. Of course, it could be argued that four billion dollars would be a bargain if the project succeeded, and that it is today's time-precious society that needs to learn patience in order to appreciate the long-term timescales required to overcome the immense technological challenges. Nuclear fusion would presumably solve many of today's - and the foreseeable future's - energy requirements whilst being rather more environmentally friendly than either fossil fuels or fission reactors. The potential rewards are plain for all to see.

However, the problems are deep-rooted, leading to arguments against the development of laser-based fusion per se. Alternative fusion projects such as the Joint European Torus and the $20 billion ITER - see an earlier post on nuclear fusion research for details - use longer-established methods. My verdict in a nutshell: the science was possibly unsound from the start and the money would be better spent elsewhere. Meanwhile, perhaps the facility could get back a small portion of its funding if Star Trek movies continue to hire the NIF as a filming location!

The International Space Station

I remember the late Carl Sagan arguing that the only benefit of the ISS that couldn't be achieved via cheaper projects - such as, during the Space Shuttle era, the European Space Agency's Spacelab - was research into the deleterious health effects of long-duration spaceflight. So at $2 billion per year to run, is it worthwhile, or just another example of a fundamentally flawed project? After all, the station as it stands includes such non-scientific facets as being the ultimate tourist destination for multi-millionaires!

Sometimes referred to as a lifeline for the American and Russian aerospace industries (or even a way to prevent disaffected scientists in the latter from working for rogue states), I have yet to find a persuasive argument as to why the money would not have been better spent elsewhere. It is true that there has been investigation into vaccines for salmonella and MRSA, but after twelve years of permanent crewing on board the station, just how good value for money has this research been? After all, similar studies were carried out on Space Shuttle flights in the previous few decades, suggesting that the ISS was not vital to these programmes. The Astronomer Royal Lord Martin Rees has described it as a 'turkey in the sky', siphoning funds that could have been spent on a plethora of unmanned missions such as interplanetary probes. But as we should be aware, it usually isn't the case that money not spent on one project automatically becomes available for projects elsewhere.

On a positive scientific note, the station has played host to the $2 billion Alpha Magnetic Spectrometer - a key contender in the search for dark matter - which would presumably have difficulty finding a long-duration orbital platform elsewhere. But then this is hardly likely to excite those who want immediate, practical benefits from such huge expenditure.

The ISS has no doubt performed well as a test bed for examining the deterioration of the human body due to living in space, if anything seriously weakening the argument for a manned Mars mission in the near future. Perhaps one other area in which the station has excelled has been that of a focal point for promoting science to the public, but surely those who follow in Sagan’s footsteps - the U.K.'s Brian Cox for one - can front television series with a similar goal for the tiniest fraction of the cost?

The Large Hadron Collider

An amazing public-relations success story, considering how far removed the science and technology are from everyday mundanity, the world's largest particle accelerator requires $1 billion per year to operate on top of a construction budget of over $6 billion. With a staff of over 10,000, the facility is currently in the midst of a two-year upgrade, giving its international research community plenty of time to analyse the results. After all, the Higgs boson, A.K.A. the 'God particle', has been found… probably.

So if the results are confirmed, what next? Apparently, the facility can be re-engineered for a wide variety of purposes, varying from immediately pragmatic biomedical research on cancer and radiation exposure to the long-term search for dark matter. This combination of practical benefits with extended fundamental science appears to be as good a compromise as any compared to similar-scale projects. Whether similar research could be carried out by more specialised projects is unknown. Does anyone know?

As for the future of mega-budget schemes, there are various projects in development extending into the next decade. The Southern Hemisphere is playing host to two large international collaborations: the Square Kilometre Array is due to begin construction in eleven nations - excluding its UK headquarters - in 2016, but it will be around eight years before this $2 billion radio telescope array is fully operational. Meanwhile the equally unimaginatively-named European Extremely Large Telescope is planned for a site in Chile, with an even longer construction period and a price tag approaching $1.5 billion. Both projects are being designed for a variety of purposes, from dark matter investigation to searching for small (i.e. Earth-sized) extra-solar planets with biologically-modified atmospheres.

At this point it is pertinent to ask: do extremely ambitious science projects have to come with equally impressive price tags? Personally, I believe that with a bit more ingenuity a lot of useful research can be undertaken on far smaller budgets. Public participation in distributed computing projects such as Folding@home and Seti@home, in which raw data is processed on home computers, is about as modest an approach as is feasible for handling such large amounts of information.
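The idea behind these @home projects can be sketched in a few lines: carve the raw dataset into fixed-size work units, farm each unit out for processing, then merge the partial results. This is only a conceptual sketch of mine (real projects send units to volunteer machines over the internet and verify the returned results); the function names are my own invention, and a local thread pool stands in for the distributed fleet:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_units(data, unit_size):
    """Cut the raw dataset into fixed-size work units, as @home projects do."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def process_unit(unit):
    """Stand-in for the real number-crunching a volunteer's PC performs."""
    return sum(x * x for x in unit)

data = list(range(1000))
units = split_into_units(data, 100)

# In a real project each unit would go to a different volunteer machine;
# a thread pool stands in for that distributed fleet here.
with ThreadPoolExecutor() as pool:
    partial_results = list(pool.map(process_unit, units))

total = sum(partial_results)  # the project server merges the returned results
```

The appeal of the pattern is that the units are independent, so losing (or deliberately duplicating) a few of them doesn't derail the overall computation.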

An example of a long-term project on a comparatively small budget is the US-based Earthscope programme, which collects and analyses data including eminently practical research into seismic detection. With a construction cost of about $200 million and an annual budget of around a mere $125 million, this seems to be a relative bargain for a project that combines wide-scale, theoretical targets with short-term, pragmatic gains. But talking of practical goals, there are other scientific disciplines crying out for a large increase in funding. Will the explosive demise of a meteor above the Russian city of Chelyabinsk back in February act as a wake-up call for more research into locating and deflecting Earth-crossing asteroids and comets? After all, the 2014 NASA budget for asteroid detection projects is barely over the hundred-million-dollar mark!

I will admit there are some unique advantages to enormous projects, such as the bringing together of researchers from the funding nations in ways that may lead to fruitful collaboration. This is presumably due to the sheer number of scientists gathered together for long periods, as opposed to spending just a few days at an international conference or seminar, for instance. Even so, I cannot help but feel that the money for many of the largest-scale projects could be better used elsewhere, solving some of the immediate problems facing our species and ecosystem.

Unfortunately, the countries involved offer their populations little in the way of voice as to how public money is spent on research. But then considering the appalling state of science education in so many nations, as well as the short shrift that popular culture usually gives to the discipline, perhaps it isn’t so surprising after all. If we want to make mega-budget projects more accountable, we will need to make fundamental changes to the status of science in society. Without increased understanding of the research involved, governments are unlikely to grant us choice.

Sunday, 30 December 2012

Software Samaritans: in praise of science-orientated freeware

In the midst of the gift-giving season it seems an appropriate time to look at a source of presents that keeps on giving, A.K.A. the World Wide Web. In addition to all the scientific information that can be gleaned with comparatively little effort, there is an immense amount of fantastic freeware available to non-professionals. I have found that it breaks down into three distinct types of application:
  1. Simulated experiments such as microscope simulators or virtual chemistry laboratories
  2. Distributed computing projects, which are applications that do not require any user effort other than downloading and installation
  3. Applications with specific purposes that actively aid amateur science practice, such as planetariums
I have to admit to not having any experience with the first category, but examples such as the molecular biology application Gene Designer 2.0, The Virtual Microscope and Virtual (chemistry) Labs - all suitable for school and university students - are astonishing in their ability to extend conventional textbook and lecture-based learning. All I can say is: I wish I had access to such software when I was at school!

I have a bit more experience with distributed computing projects, having been a volunteer on Seti@home back in its first year (1999-2000). Only the second large-scale project of its type, Seti@home has the grandiose aim of discovering radio signals broadcast by alien civilisations. All the user has to do is download and install the application, which then runs when the computer is idling, like a glorified screensaver. In this particular case, the signal-processing software is able to search for extra-terrestrial transmissions that might be only 10% the strength detectable by earlier surveys, using data collected by the giant Arecibo radio telescope. The application has proved remarkably successful, having been downloaded to over 3 million personal computers.
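At its heart, that kind of search is a frequency-domain peak hunt: a narrowband transmission that is invisible in the raw samples stands out sharply once a Fourier transform concentrates its power into a single frequency bin. Here's a toy illustration of my own (assuming NumPy; the real Seti@home pipeline is vastly more sophisticated, testing many Doppler drift rates and bandwidths):

```python
import numpy as np

def find_narrowband_peak(samples, sample_rate):
    """Return (frequency, power ratio) of the strongest spectral bin.

    A toy version of a narrowband search: take the power spectrum and
    report the bin that stands out most against the mean bin power.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    spectrum[0] = 0.0                        # ignore the DC component
    peak_bin = int(np.argmax(spectrum))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[peak_bin], spectrum[peak_bin] / spectrum.mean()

# A weak 1 kHz tone buried well below the noise floor is easy to spot
# in the frequency domain, even though it's invisible in the raw data.
rng = np.random.default_rng(42)
t = np.arange(8192) / 44100.0
samples = 0.2 * np.sin(2 * np.pi * 1000 * t) + rng.normal(0.0, 1.0, t.size)
freq, snr = find_narrowband_peak(samples, 44100.0)
```

The longer the stretch of data transformed, the narrower each bin and the fainter the tone that can be pulled out of the noise, which is exactly why those millions of idle home computers are useful.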

But if this project is a bit blue sky for you, there are plenty of others with more down-to-earth objectives. For example, Folding@home and Rosetta@home are fantastic opportunities for all of us non-professionals to help molecular biologists studying protein folding in order to develop cures for diseases such as HIV, Alzheimer's, and Huntington's. So far, the research has generated over a hundred research papers, but the complexity of the subject means there's plenty of room for additional computers to get involved for many years to come.

The third class of software supplies the user with the same sort of functionality as commercially-available applications, but in many cases surpasses them in terms of capabilities and quantity of data. These tend to congregate into a few classes or themes, suitable for usage amongst amateurs of variable capability and commitment.

One popular category is planetarium applications such as Stellarium, which has plenty of features for city-bound (i.e. restricted-vision) enthusiasts such as myself. It even includes a night vision mode, red-tinted so as to keep the observer's eyes adjusted to the darkness. Unfortunately my telescope camera software doesn't have an equivalent, and as I cannot reduce the laptop screen brightness until after I've achieved focus, I'm left stumbling and squinting until my eyes readjust. Stellarium seems reasonably accurate with regard to stars and planets, but I've never managed to check whether the satellite trajectories conform to reality.

For anyone lucky enough to live in an environment free of light pollution, there are more sophisticated free applications, such as Cartes du Ciel-SkyChart, which allows you to create printable charts as well as remotely control telescope drives. If you are a real expert at the telescope then C2A (Computer Aided Astronomy) is the bee's knees in planetarium software, even able to simulate natural light pollution across the lunar cycle and allowing you to create your own object catalogues!

As an aside, what gets me with these applications is how they calculate the positioning of celestial objects from any location on Earth, at any time, in any direction, and at varied focal lengths. After all, there is a well-known issue with calculating the gravitational interactions of more than two celestial objects, known as the n-body problem. So how do the more sophisticated planetarium applications work out positioning for small objects such as asteroids? I used to have enough trouble writing basic gravity and momentum effects in ActionScript when building games in Adobe Flash! All I can say is that these programmers appear to be mathematical geniuses compared to someone of my limited ability.
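Part of the answer, as far as I can tell, is that planetarium software doesn't need to solve the full n-body problem at all: planets and asteroids can be placed using published orbital elements (or precomputed ephemerides), which reduces the task to two-body geometry. The classic numerical step is solving Kepler's equation; here's a minimal sketch (the function name and the Mars-like figures are just mine, for illustration):

```python
import math

def eccentric_anomaly(mean_anomaly, eccentricity, tol=1e-12):
    """Solve Kepler's equation E - e*sin(E) = M by Newton's method.

    Given the mean anomaly M (radians, a simple linear function of time)
    and the orbit's eccentricity e, this yields the eccentric anomaly E,
    from which the object's position on its ellipse follows directly.
    """
    E = mean_anomaly if eccentricity < 0.8 else math.pi  # starting guess
    for _ in range(50):
        delta = (E - eccentricity * math.sin(E) - mean_anomaly) / (
            1.0 - eccentricity * math.cos(E))
        E -= delta
        if abs(delta) < tol:
            break
    return E

# Position in the orbital plane for a Mars-like orbit
# (semi-major axis a in AU, eccentricity e dimensionless).
a, e = 1.524, 0.0934
E = eccentric_anomaly(mean_anomaly=1.0, eccentricity=e)
x = a * (math.cos(E) - e)
y = a * math.sqrt(1.0 - e * e) * math.sin(E)
```

Rotating that (x, y) into the observer's sky frame is then just a chain of coordinate transformations, so no gravity simulation ever has to run in real time.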

Processing astrophotography images

Generating Jupiter: from raw planetary camera frame to final processed image

Back to the astronomy freeware. Once I've aligned my telescope courtesy of Stellarium and recorded either video or a sequence of stills using the QHY5v planetary camera (I wonder if they'll give me any freebies for plugging their hardware?), I need to intensively process the raw material to bring out the details. For this image processing I use another free application called RegiStax, which again astonishes me with the genius of its programmers, not to mention their generosity. As a regular user of some extremely complex (and expensive) commercial image-editing applications since the late 1990s, I undertook a little research into how such software actually works. All I can say is that unless you are interested in Perlin noise functions (a form of procedural gradient noise), stochastic patterns, Gaussian distributions and Smallest Univalue Segment Assimilating Nucleus (SUSAN) algorithms - nice! - you might just want to accept that these applications are built by programmers who, like the planetarium software builders mentioned above, have advanced mathematics skills beyond the comprehension of most of us.
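That said, the core trick behind stacking software is at least easy to state, even if the alignment and wavelet-sharpening mathematics isn't: averaging N aligned frames suppresses uncorrelated noise by roughly the square root of N. A bare-bones sketch of mine of just that averaging step (assuming NumPy; RegiStax itself obviously does far more):

```python
import numpy as np

def stack_frames(frames):
    """Average a list of aligned frames to suppress random noise.

    Averaging N frames reduces uncorrelated noise by roughly sqrt(N),
    which is the core idea behind stacking planetary video frames
    (real stackers also align each frame first and sharpen afterwards).
    """
    return np.mean(np.stack(frames), axis=0)

# Simulate 100 noisy captures of the same 'planet' and stack them.
rng = np.random.default_rng(0)
truth = np.zeros((64, 64))
truth[24:40, 24:40] = 1.0                    # a crude planetary disc
frames = [truth + rng.normal(0.0, 0.5, truth.shape) for _ in range(100)]
stacked = stack_frames(frames)

noise_single = np.std(frames[0] - truth)     # noise in one raw frame
noise_stacked = np.std(stacked - truth)      # noise after stacking
```

With 100 frames the residual noise drops to around a tenth of a single exposure's, which is why even a cheap planetary camera shooting video can produce such detailed final images.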

So in case you weren't aware, the World Wide Web provides far more to the amateur scientist or student than just a virtual encyclopaedia: thanks to the freeware Samaritans you can now do everything from finding the position of millions of astronomical objects to examining electron microscope images of lunar dust. It’s like having Christmas every day of the year!