Tuesday 27 October 2020

Bursting the bubble: how outside influences affect scientific research

In these dark times, when some moron (sorry, non-believer in scientific evidence) can easily reach large numbers of people on social media with their conspiracy theories and pseudoscientific nonsense, I thought it would be an apt moment to look at the sort of issues that block the initiation, development and acceptance of new scientific ideas. We are all aware of the long-term feud between some religions and science but aside from that, what else can influence or inhibit both theoretical and applied scientific research?

There are plenty of other factors, from simple national pride to the ideologies of the far left and right, that have prohibited theories considered inappropriate. Even some of the greatest twentieth-century scientists faced persecution; Einstein was one of many whose papers were destroyed by the Nazis simply for falling under the banner of 'Jewish science'. At least this particular form of state-selective science was relatively short-lived: in the Soviet Union, theories deemed counter to dialectical materialism were banned for many decades. A classic example was Stalin's promotion of the crackpot biologist Trofim Lysenko, who denied the modern evolutionary synthesis and whose scientific opponents were ruthlessly persecuted.

Even in countries with freedom of speech, if there is a general perception that a particular area of research has negative connotations then, no matter how unfounded that perception, public funding may suffer accordingly. From the high-profile adulation of STEM in the 1950s and 1960s (ironic, considering the threat of nuclear war), subsequent decades have seen decreasing trust in both science and its practitioners. For example, the Ig Nobel awards have for almost thirty years been a prominent way of publicising scientific projects deemed frivolous or a waste of resources. A similar attitude is frequently heard in arts graduate-led mainstream media; earlier this month, a BBC radio topical news comedy complimented a science venture that was seen as "doing something useful for once."

Of course, this attitude is commonly related to how research is funded, the primary question being why large amounts of resources should go towards keeping STEM professionals employed if their work fails to generate anything of immediate use. I've previously discussed this contentious issue, and despite the successes of the Large Hadron Collider and Laser Interferometer Gravitational-Wave Observatory, there are valid arguments in favour of postponing such projects until our species has dealt with fundamental issues such as climate change mitigation.

There are plenty of far less grandiose projects that could benefit from even a few percent of the resources given to the international, mega-budget collaborations that gain the majority of headlines. Counter to the 'good science but wrong time' argument is the serendipitous nature of research; many unforeseen inventions and discoveries have been made by chance, with few predictions hitting the mark.

The celebrity-fixated media tends to skew the public's perception of scientists, representing them more often as solitary geniuses than as team players. This has led to oversimplified distortions, such as that inflicted on Stephen Hawking for the last few decades of his life. Hawking was treated as a wise oracle on all sorts of science- and future-related questions, some far from his field of expertise. This does neither the individuals involved nor the scientific enterprise any favours. It makes it appear as if a mastermind can pull rabbits out of a hat, rather than hardworking groups spending years on slow, methodical and - let's face it - what appears from the outsider's viewpoint to be somewhat dull research.

The old-school caricature of the wild-haired, lab-coated boffin is thankfully no longer in evidence, but there are still plenty of popular misconceptions that even dedicated STEM media channels don't appear to have removed. For example, almost everyone I meet fails to differentiate between the science of palaeontology and the non-science of archaeology, the former of course usually being solely associated with dinosaurs. If I had to condense the popular media approach to science, it might be something along these lines:

  • Physics (including astronomy). Big budget and difficult to understand, but sometimes exciting and inspiring
  • Chemistry. Dull but necessary, focusing on improving products from food to pharmaceuticals
  • Biology (usually excluding conventional medicine). Possibly dangerous, both to human ego and our ethical and moral compass (involve religion at this point if you want to) due to both working theories (e.g. natural selection) and practical applications, such as stem cell research. 

Talking of applied science, a more insidious form of pressure has sometimes been applied by industry, either to keep consumers purchasing its products or to prevent them moving to rival brands. Various patents, such as those for longer-lasting products, have been snapped up and hidden by companies protecting their interests, while the treatment meted out to scientific whistle-blowers has been notorious. Prominent examples range from Rachel Carson's exposé of DDT, which led to attacks on her credibility, to industry lobbying of governments to prevent the banning of CFCs after they were found to be destroying the ozone layer.

When the might of commerce is combined with wishful thinking by the scientist involved, it can lead to dreadful consequences. Despite a gathering body of evidence for smoking-related illnesses, the geneticist and tobacco industry spokesman Ronald Fisher - himself a keen pipe smoker - argued that the relationship between smoking and lung cancer was more complex than a simple causal link. The sector used his prominence to denigrate the truth, no doubt shortening the lives of immense numbers of smokers.

If there's a moral to all this, it is that even at a purely theoretical level science cannot be isolated from all manner of activities and concerns. Next month I'll investigate negative factors within science itself that have had deleterious effects on this uniquely human sphere of accomplishment.

Wednesday 24 January 2018

Low-key wonders: how small-scale innovation can aid the developing world

The success of mega-budget science experiments such as the Large Hadron Collider (LHC) and Laser Interferometer Gravitational-Wave Observatory (LIGO) has quite rightly generated widespread praise for these technological marvels. This has led to plenty of discussion regarding similar international endeavours now in the pipeline, such as the Square Kilometre Array (SKA). However, numerous far smaller, cheaper projects have been largely overlooked, despite their potential to offer practical improvements to millions of humans and, in some cases, to the environment as well. Although not nearly as glamorous as their far larger counterparts, these innovative schemes - at least in application if not necessarily in technology - are surely as important and deserve more attention than they have so far received.

The projects in question are aimed towards improving the quality of life in developing nations and as such tend to fall into one of a few key categories:
  1. Fuel efficiency and non-fossil fuel energy sources
  2. Water, nutrition and food preparation
  3. Medicine and hygiene
  4. Low-cost consumer electronics
The companies and inventors conceiving these schemes are based around the world in both developing and developed countries, with most having little if any association with multi-national manufacturers. Indeed, such has been the lack of interest from traditional industry that some of the projects have relied on a few far-sighted entrepreneurs or crowdfunding schemes. In some cases it appears that the less sophisticated the technology being developed, the more successful the product; clearly, lack of funding for research and development can limit the efficiency and reliability of new devices.

Although the World Bank estimates that the crowdfunding market could generate ninety to ninety-five billion US dollars by 2030, the lack of secure financial infrastructure and limited ecommerce experience in developing nations mean that its infoDev global partnership programme is finding it tricky to help small-scale innovation take off in these countries.

1. Fuel efficiency and non-fossil fuel energy sources

An irregular electricity supply, if available at all, is still a prominent problem in developing countries, so millions of poor households rely on dangerous and inefficient forms of lighting and cooking. Kerosene lamps, for example, in addition to causing health issues from smoke inhalation, are responsible for three percent of the world's carbon dioxide emissions. One simple yet effective solution comes in the form of the GravityLight, whereby a bag of slowly descending ballast drives a generator to power an LED light for about twenty minutes before it needs resetting.
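The physics is worth a quick sanity check. Here's a minimal back-of-the-envelope sketch in Python - the 12 kg ballast, 1.8 m drop and 50% drivetrain efficiency are my own illustrative assumptions, not the product's actual specification:

# Back-of-the-envelope check on gravity-powered lighting. The ballast mass,
# drop height and efficiency are illustrative guesses, not official specs.

G = 9.81             # gravitational acceleration, m/s^2
mass_kg = 12.0       # assumed bag of ballast (rocks, sand, earth)
drop_m = 1.8         # assumed descent distance
efficiency = 0.5     # assumed generator and drivetrain efficiency

usable_j = mass_kg * G * drop_m * efficiency   # usable energy, joules
runtime_s = 20 * 60                            # the quoted ~20 minute runtime

print(f"Usable energy: {usable_j:.0f} J")
print(f"Average LED power over 20 minutes: {usable_j / runtime_s:.2f} W")
# ~0.09 W: modest, but enough for a low-power LED - and it costs no fuel.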

Other devices are less innovative but still very useful, such as Princeton University-developed SunSaluter, one of several compact solar panel products being designed to optimise energy collection, in this particular instance with the ability to rotate and so follow the sun across the sky. Another alternative energy scheme currently being prototyped is called ROTOR and uses a small floating device to generate hydro-electric power. These local-level systems are not only environmentally friendly but would relieve poor families of having to buy fossil fuels such as kerosene. Unfortunately, many are still at the development stage and lack of funding usually means slow progress in implementation.
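To see why following the sun matters, here's a toy model comparing a fixed horizontal panel with a perfect tracker; it assumes idealised clear-sky geometry and ignores atmosphere and diffuse light, so it overstates the real-world gain:

import math

# Toy comparison of a fixed horizontal panel versus a sun-tracking panel
# over an idealised 12-hour day. No atmosphere or diffuse light, so the
# gain shown here is an upper bound on what real trackers achieve.

STEPS = 1000
fixed = tracking = 0.0
for i in range(STEPS):
    angle = math.pi * i / (STEPS - 1)   # sun sweeps horizon to horizon
    elevation = math.sin(angle)         # solar altitude factor, 0..1
    fixed += max(elevation, 0.0)        # fixed panel: output falls with angle
    tracking += 1.0                     # tracker: always face-on to the sun

print(f"Tracking collects ~{tracking / fixed:.2f}x a fixed panel's energy")
# Prints ~1.57x (pi/2) for this toy model; real single-axis trackers are
# usually quoted nearer 1.2-1.4x once atmosphere and cloud are included.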

Another invention that utilises existing components without any moving parts is the Eco-Cooler, which uses halved plastic bottles to drastically reduce temperatures in houses without needing a power source. This may not be cutting-edge technology per se, but as per a previous post from 2010, this simple, ingenious solution may prove to a wider public how they can help themselves and the environment simultaneously.

2. Water, nutrition and food preparation

If water is the new oil then devices that can heat and/or purify it at the same time as saving money and lessening environmental impact cannot be far behind in importance. Inventions in this category range from incremental improvements (i.e. more efficient versions of conventional products such as the Berkeley-Darfur Stove) to the innovative Jompy Water Boiler prototype, which heats water to purify it at the same time as cooking food and saving fuel.
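The fuel at stake is easy to underestimate, so here's a minimal worked example of the energy needed to pasteurise a litre of water, using textbook constants and assuming perfect heat transfer (which no real stove achieves):

# Rough cost of pasteurising one litre of water, using textbook constants
# and assuming perfect heat transfer - which no real stove achieves.

SPECIFIC_HEAT = 4186        # J/(kg*K) for liquid water
KEROSENE_J_PER_KG = 43e6    # approximate energy density of kerosene

mass_kg = 1.0               # one litre of water is ~1 kg
t_start, t_target = 20.0, 65.0   # ~65 C held for minutes pasteurises water

heat_j = mass_kg * SPECIFIC_HEAT * (t_target - t_start)
kerosene_g = heat_j / KEROSENE_J_PER_KG * 1000

print(f"Heat required: {heat_j / 1000:.0f} kJ")
print(f"Kerosene equivalent at 100% efficiency: {kerosene_g:.1f} g")
# ~188 kJ, or under 5 g of kerosene in theory; open stoves often run at
# 10-20% efficiency, so the real fuel burden is several times larger.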

Water purification systems are being tested, as are waste recyclers that can convert household organic waste at low cost into drinking water or even cooking gas. These devices are being developed to use little or no power to operate; in the case of Indian conglomerate Tata's Swach water filter, a combination of traditional rice husk ash and nanosilver forms the active ingredients. As I've discussed elsewhere, nanosilver is not the most environmentally friendly of substances, but at twenty-five US dollars this device has become widespread over the past eight years, presumably saving the lives of innumerable children in regions without a safe water supply.

At the other end, so to speak, the UK's Cranfield University has received funding from the Bill and Melinda Gates Foundation to complete development of the self-powered Nano Membrane Toilet. This is one of several such designs that don't require connection to plumbing as they work without an external water supply or outflow. Indeed, the Cranfield design is a net producer of water and possibly even energy too.

Developing countries are also seeking ways to improve nutrition themselves, such as the seventeen African nations involved in the Sweetpotato for Profit and Health Initiative. This ten-year programme aims to reduce vitamin A deficiency by breeding a new strain of sweet potato, especially aimed at households with very young children. It may not involve cutting-edge genetic modification, but just the work required to overcome the social and economic conservatism of the region is probably as key to success as the agricultural science.

3. Medicine and hygiene

In a crossover between nutrition and medicine, the advertising and marketing agency Grey Singapore has been involved with the distribution of the Life Saving Dot, an iodine-rich bindi designed to counter iodine deficiency among rural Indian women. In this instance, the use of a traditional design means that the product shouldn't face suspicion from its target market.

In 2010 a German former teacher, Martin Aufmuth, began developing a simple method to quickly produce pairs of spectacles without the need for a power supply. His OneDollarGlasses are now selling worldwide, further proof that low-tech ingenuity can generate enormous benefits.

More high-tech schemes are also in development that could prove to be extremely efficient yet relatively low-cost life savers. Médecins Sans Frontières has been studying the use of both 3-D printing and virtual reality for setting up field hospitals while the e-NABLING the Future project coordinates volunteers who can supply 3-D printed items such as prosthetics. The disaster-relief NGO Field Ready aims to provide 'faster, cheaper and better' aid via the manufacture of 3-D printed elements, including medical items, a sure sign that this technology is probably the best method of rapidly producing custom components in regions lacking sophisticated infrastructure.

Solar power is also being co-opted to replace conventional batteries in devices such as hearing aids, with the Botswana-based Deaftronics and its Solar Ear unit a pioneer in this field. Presumably, as smart clothing technology becomes more common, such devices will be able to use the wearer's own motion to supply the necessary power.

Pharmaceutical distribution and illness diagnosis techniques are also on the verge of radical improvements, particularly in Africa. An example of the former is the Ghana-based mPedigree's use of a free SMS code to confirm that a pharmaceutical is genuine. MIT research is aiding the latter, thanks to a series of paper strip tests for conditions ranging from Ebola to dengue fever.
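The idea behind such SMS verification is simple enough to sketch. The following is a hypothetical illustration of the general one-time-code scheme, not mPedigree's actual system, codes or API:

# Hypothetical sketch of one-time pack-code verification of the kind
# mPedigree popularised; not their actual protocol, codes or API.

issued_codes = {"7F3K9Q2M": "unused", "2XB81LZP": "unused"}  # per-pack registry

def verify(code: str) -> str:
    """Simulate the server-side lookup triggered by an incoming SMS."""
    status = issued_codes.get(code)
    if status is None:
        return "FAKE: code never issued - do not use this medicine."
    if status == "used":
        # A repeated code suggests a counterfeiter has cloned a genuine pack.
        return "WARNING: code already checked - pack may be a copy."
    issued_codes[code] = "used"   # one-time use defeats simple duplication
    return "OK: genuine product."

print(verify("7F3K9Q2M"))   # OK: genuine product.
print(verify("7F3K9Q2M"))   # WARNING: code already checked...
print(verify("NOTACODE"))   # FAKE: code never issued...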

4. Low-cost consumer electronics

The first example I came across of such devices was British inventor Trevor Baylis' wind-up radio, developed in 1995. Having been rejected by mainstream radio manufacturers, Baylis was lucky to gain the support of entrepreneurs so that he could achieve mass-production.

One of the few major companies to take an interest in the bottom end of the market has been Vodafone, whose 150 and 250 model mobile phones appeared in 2010 and were aimed solely at developing nations; the importance of rapid yet cheap communication in rural areas should not be underestimated. Other devices have not been so lucky with their manufacturers: the world's cheapest tablets, the Indian Government-promoted Ubislate/Aakash range, have suffered from so many design and build issues that they are unlikely to satisfy their intended market any time soon.

Although the Aakash fiasco may inhibit other Western corporations from wanting to engage in similar projects, the mini paradigm shifts that some of these projects have engendered could well generate a two-way interaction between developed and developing nations. Rather than playing safe by fiddling with small iterations based on existing designs, the potential for wholly new products manufactured by smarter, more efficient methods has been given a solid proving ground by some of the examples described above. This 'trickle up' method may prove to be the way in which multinationals get involved in this level of project; needless to say, the timing couldn't be more apt.

From long-lasting, low-voltage light bulbs to non-fossil fuel road vehicles, there is a multitude of examples of how big business has traditionally stifled innovation if it meant potential loss of profit. In some cases, shortened product lifespans and incremental upgrade release cycles have forced consumers to participate in a planned obsolescence programme, at the cost of the wider environment as well as customers' bank balances. With talk of a several trillion US dollar funding gap in the United Nations' sustainable development goals - which the USA is now more than ever unwilling to subsidise - any means to replace aid relief with self-sustaining processes and local manufacturing are to be welcomed. There's enormous potential out there for developing nations to improve dramatically without relying on charitable hand-outs or the dubious support of big business. Hopefully the flow of inventors, entrepreneurs and volunteers will continue building that future.

Tuesday 28 November 2017

Research without borders: why international cooperation is good for STEM

I've just finished reading Bryan Sykes' (okay, I know he's a bit controversial) The Seven Daughters of Eve, about the development of mitochondrial DNA research for population genetics. One chapter mentions Dr Sykes' discovery of the parallel work of the mathematician Hans-Jürgen Bandelt, whose network diagrams provided a structure perfectly suited to explaining Sykes' own evolutionary branching results. This discovery occurred largely by chance, suggesting that small research groups must rely either on serendipity or on knowledge of the latest professional papers in order to find other teams whose work might be useful.

This implies that the more international the character of scientific and technological research, the more likely there will be such fortuitous occurrences. Britain's tortuous path out of the European Union has led various organisations on both sides of the Channel to claim that this can only damage British STEM research. The Francis Crick Institute, a London-based biomedical research centre that opened last year, has staff originating from over seventy nations. This size and type of establishment cannot possibly rely on being supplied with researchers from just one nation. Yet EU scientists resident in Britain have felt 'less welcome' since the Brexit referendum, implying a potential loss of expertise in the event of a mass withdrawal.

In recent years, European Union research donations to the UK have exceeded Britain's own contributions by £3 billion, meaning that the additional £300 million newly announced for research and development over the coming four years is only ten percent of what the EU has provided - and the UK Government is clearly looking to the private sector to make up the shortfall. It should also be recognised that although there are high numbers of non-British nationals working in Britain's STEM sector, the country also has a fair number of its own STEM professionals working overseas in EU nations.

The United Kingdom is home to highly expensive, long-term projects that require overseas funding and expertise, including the Oxfordshire-based Joint European Torus nuclear fusion facility. British funding and staff also contribute to numerous big-budget international projects, from the EU-driven Copernicus Earth observation satellite programme to the non-EU CERN. The latter is best-known for the Large Hadron Collider, the occasional research home of physicist and media star Brian Cox (how does he find the time?) and involves twenty-two key nations plus researchers from more than eighty other countries. Despite the intention to stay involved in at least the non-EU projects, surveys suggest that post-Brexit there will be greater numbers of British STEM professionals moving abroad. Indeed, in the past year some American institutions have actively pursued the notion of recruiting more British scientists and engineers.

Of course, the UK is far from unique in being involved in so many projects requiring international cooperation. Thirty nations are collaborating on the US-based Deep Underground Neutrino Experiment (DUNE); the recently-successful Laser Interferometer Gravitational-Wave Observatory (LIGO) involves staff from eighteen countries; and the Square Kilometre Array radio telescope project utilises researchers of more than twenty nationalities. Although the USA has a large population when compared to European nations, one report from 2004 states that approaching half of US physicists were born overseas. Clearly, these projects are deeply indebted to non-nationals.

It isn't just STEM professionals that rely on journeying cross-border, either. Foreign science and technology students make up considerable percentages in some developed countries: in recent years, over 25% of the USA's STEM graduate students and even higher numbers of its master's degree and doctorate students were not born there. Canada, Australia, New Zealand and several European countries have similar statistics, with Indian and Chinese students making up a large proportion of those studying abroad.

As a small nation with severely limited resources for research, New Zealand does extremely well out of the financial contributions from foreign students. Each PhD student spends an average of NZ$175,000 on fees and living costs, never mind additional revenue from the likes of family holidays, so clearly the economics alone make sense. Non-nationals can also introduce new perspectives and different approaches, potentially lessening inflexibility due to cultural mindsets. In recent years, two New Zealand-based scientists, microbiologist Dr Siouxsie Wiles and nanotechnologist Dr Michelle Dickinson (A.K.A. Nanogirl), have risen to prominence thanks to their fantastic science communication work, including with children. Both were born in the UK, but New Zealand sci-comm would be substantially poorer without their efforts. Could it be that their outside perspective homed in on a need that locally-raised scientists failed to recognise?

This combination of open borders for STEM professionals and international collaboration on expensive projects proves if anything that science cannot be separated from society as a whole. Publicly-funded research requires not only a government willing to see beyond its short-term spell in office but a level of state education that satisfies the general populace as to why public money should be granted for such undertakings. Whilst I have previously discussed the issues surrounding the use of state funding for mega-budget research with no obvious practical application, the merits of each project should still be discussed on an individual basis. In addition, as a rule of thumb, it seems that the larger the project, the higher the percentage of non-nationals required to staff it.

The anti-Brexit views of prominent British scientists such as Brian Cox and the Astronomer Royal, Lord Rees of Ludlow, are well known. Let's just hope that the rising xenophobia and anti-immigration feeling that led to Brexit doesn't stand for 'brain exit'. There's been enough of that already and no nation - not even the USA - has enough brain power or funding to go it alone on the projects that really need prompt attention (in case you're in any doubt, alternative energy sources and climate change mitigation spring to mind). Shortly before the Brexit referendum, Professor Stephen Hawking said: "Gone are the days when we could stand on our own, against the world. We need to be part of a larger group of nations." Well if that's not obvious, I don't know what is!

Friday 11 August 2017

From steampunk to Star Trek: the interwoven strands between science, technology and consumer design

With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras for example now reach worldwide sales of over five million per year. That's hardly a niche market!

Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that while miniaturisation has reduced energy consumption for many smaller pieces of technology, from the Frankenstein laboratory appearance of valve-based computing and room-sized mainframes to the smart watch et al., the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken the previous perennial favourite example of space rockets when it comes to power usage.

The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made, for example telescopes or crystal radios. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravitational wave detectors or CRISPR-Cas9 genome editing tools any time soon.

The current trend in favour - or at least acknowledgement - of sustainable development is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) equates with progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to have been five main phases since the late Victorian period:
  1. Imperial steam
  2. Streamlining and speed
  3. The Atomic Age
  4. Minimalism and information technology
  5. Virtual light

1) Imperial steam

In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that seemingly without effort has an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.

I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of the design of this phase, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that never went beyond the aeolipile toy turbine and the Antikythera mechanism.

2) Streamlining and speed

From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, whereby the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific Class steam locomotives.

3) The Atomic Age

By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. Ironically, somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (classical physics) model of the atom itself gained a key place in post-war design.

Combined with rockets and space the imagery could readily be termed 'space cadet', but it wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised X-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (which famously gave Crick and Watson the edge in winning the race to understand the structure of DNA).

Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.

4) Minimalism and information technology

From the early 1970s the bright, optimistic designs of the previous quarter century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied yet refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video games consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.

5) Virtual light

With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to computer holographic interfaces in many Hollywood blockbusters represent both the actual awesome power required by the likes of the Large Hadron Collider and the visually unspectacular reality of lasers and quantum teleportation; the ultimate fusion (sorry, couldn't resist that one) was the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek Into Darkness.

Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.

Monday 30 January 2017

Hold the back page: 5 reasons science journalism can be bad for science

Although there's an extremely mixed quality to television science documentaries these days (with the Discovery Channel firmly at the nadir) - and in stark contrast to the excellent range of international radio programmes available - the popular press bombards us daily with news articles discussing science and technology. Both traditional print and online publications reach an enormous percentage of the public who would never otherwise read stories connected to STEM (Science, Technology, Engineering and Mathematics). Therefore these delivery channels and the journalists who write material for them face an immense challenge: how to make science accessible and comprehensible as well as interesting. How well they are doing can be judged by the general public's attitude towards the subject...which is currently not that great.

In November 2016 Oxford Dictionaries stated that their Word of the Year was 'post-truth', which refers to 'circumstances in which objective facts are less influential...than appeals to emotion and personal belief.' Clearly, this is the antithesis of how good science should proceed. Combine this with the enormous output of social media, which gives the impression that anyone's opinion is as valid as a trained professional's, and you can see why things aren't going well for critical thought in general. Did you know that a Google search for 'flat earth' generates over 12 million results? What a waste of everyone's time and data storage! As they said about Brexit: pride and prejudice have overcome sense and sensibility. Here then are five reasons why popular science journalism, mostly covering general news publications but occasionally dipping into specialist magazines too, can be detrimental to the public's attitude towards science.

1) Most science writers on daily newspapers or non-specialist periodicals don't have any formal science training. Evolutionary biologist Stephen Jay Gould once pointed out that journalists have a tendency to read summaries rather than full reports or scientific papers, thus distancing themselves from the original material before they even write about it. The problem is that an approach that works for the humanities may not be suitable for science stories. We're not critiquing movies or gourmet cuisine, folks!

As a humorous example of where a lack of research has led to a prevalent error, a 1984 April Fools' Day spoof research paper by American journalism student Diana ben-Aaron was published in 350 newspapers before the original publisher admitted that Retrobreeding the Woolly Mammoth was phoney. One of the facts that ben-Aaron made up (the true figure still remains unknown) is that the woolly mammoth had fifty-eight chromosomes. This number is now ubiquitous across the World Wide Web from Wikipedia to the Washington Post, although I'm pleased to see that the National Geographic magazine website correctly states the situation. Clearly, anyone who follows the President Trump approach that "All I know is what's on the Internet" isn't going to get the correct answer.

This isn't to say that even a scientifically-trained journalist would understand stories from all sectors: the pace of advance in some fields is so fast that no-one can afford the time to maintain a sophisticated understanding of areas beyond their own specialism. But it isn't just particular research that is a concern: general concepts and methodology can be ignored or misunderstood, whilst a lack of mathematical training can easily restrict an understanding of how statistics work, with error bars and levels of significance often overlooked or misrepresented.

Related to this ambiguity and margin for error, journalists love to give definitive explanations, which is where there can be serious issues. Science is a way of finding ever more accurate explanations for the universe, not a collection of unchangeable laws (excepting the Second Law of Thermodynamics, of course). Therefore today's breakthrough may be reversed by tomorrow's report of sample contamination, unrepeatable results or other failure. It's rarely mentioned that scientists are willing to live with uncertainty - it's a key component of the scientific enterprise, after all. Yet in the event of an about-turn or setback it's usually the scientists involved who get blamed, with accusations ranging from wasting public money to taking funding from something more worthwhile. Meanwhile, the journalist who wrote the original distorted account rarely gets held responsible. As for one-sided scare stories such as the nicknaming of GM crops as 'Frankenfoods', these lower what should be a serious public debate to an infantile level that is extremely difficult to reverse.

2) How many science documentaries have you seen where the narrator says something along the lines of “and then the scientists found something that stunned them”? Such is the nature of story-making today, where audiences are deemed to have such short attention spans that every five minutes they require either a summary of the last ten minutes or a shock announcement. This week I saw a chart about bias within major news organisations: both CNN and USA Today were labelled as 'sensational or clickbait'. I've repeatedly read about scientists who were prompted by journalists towards making a controversial or sensational quote, which if published would distort their work but provide a juicy headline. It seems that limiting hyperbole is a critical skill for any scientist being interviewed.

Journalists don't owe invertebrate palaeontologists, for example, a free lunch, but there is a lot of good professional and occasionally amateur science being conducted away from the spotlight. Concentrating on the more controversial areas of research does little to improve science in the public's eye. Even reporting of such abstract (but mega-budget) experiments as the Large Hadron Collider seems to be based around headlines about 'The God Particle' (nearly six million results on Google), A.K.A. the Higgs boson (fewer than two million results). Next thing, they'll be nicknaming the LHC 'The Hammer of Thor' or something equally cretinous. Although come to think of it…

The World Wide Web is far worse than printed news, with shock headlines ('It Was The Most XXX Ever Found - "It Blew My Mind," Expert Says') and over-inflated summaries that would make even lowbrow tabloids blush. Even specialist periodicals are not immune to the syndrome, with New Scientist magazine being particularly at fault. In 2009 it published the silly headline 'Darwin was wrong' which drew the ire of many biologists whilst providing a new form of ammunition for creationists. In 2012 their special 'The God Issue' turned out to contain less than fifteen pages on religion - but then it is meant to be a popular science periodical! In this vein the Ig Nobels seem to get more attention than the Nobel Prizes as journalists look for a quirky man-bites-dog angle to convince the public that a science story is worth reading.

3) Talking of which, journalists want to reach the widest possible audience, and looking for a human angle is a prominent way to lure in readers. The two most recent Brian Cox television documentary series, Human Universe and Forces of Nature, have concentrated on stories around families and children, with the science elements being interwoven almost effortlessly into the narrative.

In print and digital formats this bias means that the focus is frequently on articles that might directly affect humanity, especially medical, agricultural and environmental stories. This puts an unbalanced emphasis on certain areas of science and technology, leaving other specialisations largely unreported. This might not appear bad in itself, but lack of visibility can cause difficulties when it comes to maintaining public funding or attracting private philanthropy for less commercial and/or more theoretical science projects.

Another method used to make science more palatable is to concentrate on individual geniuses rather than team efforts. I assume only a very small proportion of the public know that theoretical physicists tend to do their best work before they are thirty years old, yet the seventy-five-year-old Stephen Hawking (whose name is now a trademark, no less) is quoted almost every week as if he were Moses. He's well worth listening to but even so, Professor Hawking seems to have become a spokesperson for almost any aspect of science the media want a quote on.

4) With competition tougher than ever thanks to social media and smartphone photography, journalists face ever tighter deadlines to publish before anyone else. This can obviously lead to a drop in accuracy, with even basic fact-checking sometimes lacking. For example, a year or two ago I sent a tweet to the British palaeopathologist and presenter Dr Alice Roberts, pointing out that the BBC Science and Environment News web page stated humans were descended from chimpanzees! She must have contacted them fairly rapidly as the content was corrected soon after, but if even the BBC can make such basic blunders, what hope is there for less reputable news-gathering sources? As with much of contemporary business, the mentality seems to be to get something to market as quickly as possible and, if it happens to be a smartphone that frequently catches fire, deal with that later. The recent Samsung Galaxy Note 7 debacle is the gadget equivalent of the BBC error: beating the opposition takes precedence over exactitude.

It's one thing to define science as striving towards ever more accurate descriptions of aspects of reality rather than being a series of set-in-stone commandments, but publishing incorrect details of basic, well-established facts can only generate mistrust of journalists by both scientific professionals and members of the public who discover the mistake. Surely there's time for a little cross-checking with reference books and/or websites in order to prevent the majority of these howlers? Having said that, I find it scary that a major media organisation can commit such blunders. I wonder what the outcry would be if the BBC's Entertainment and Arts News page claimed that Jane Austen wrote Hamlet?

5) Finally, there's another explanation that has less to do with the science journalists themselves and more with what constitutes a newsworthy story. Negativity is the key here, and science news is swept along with it. For example, the BBC Science and Environment News web page currently has three articles on climate change and animal extinctions, an expensive project technology failure, earthquake news and a pharmaceutical story. Like a lot of political reporting, coverage of STEM subjects concentrates on the bad side of the fence. Unfortunately, the dog-bites-man ordinariness of, for example, 'Project X succeeds in finding something interesting' usually precludes it from being deemed media-worthy. The ethos seems to be either find a unique angle or publish something pessimistic.

One tried and tested method to capture attention is to concentrate on scandal and error: science is just as full of problems as any other aspect of humanity. Of course it is good to examine the failure of high-tech agriculture that led to the UK's BSE 'mad cow' disease outbreaks in the 1980s and 90s, but the widespread dissemination of the supposed link between MMR and autism has caused immense damage around the world, thanks to a single report being unthinkingly conveyed as rock-hard evidence.

Bearing in mind that journalism is meant to turn a profit, perhaps we shouldn't be surprised at how misrepresented scientific research can be. It's difficult enough to find the most objective versions of reality, considering all the cognitive bias in these post-truth times. There are no obvious answers as to how to resolve the issue of poor quality science reporting without either delaying publishing and/or employing scientifically-trained staff. The market forces that drive journalism unfortunately mean that STEM stories rarely do science justice and often promote a negative attitude among the rest of mankind. Which is hardly what we need right now!

Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al., there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity revolving around genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European émigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Isn't this bizarre in itself, considering that we judge ourselves so much more rational than all other animals, and that the human brain is the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but still much imitated in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, and even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then science made a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering science was, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebuffed, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery show there is still plenty to be in awe of.

Mainstream cinema frequently paints a very 'A versus B' picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity lives alongside its Newtonian predecessor. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems a shame that so ubiquitous a form of entertainment consistently portrays such a lack of sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters, planning to take over the world...

Tuesday 26 January 2016

Spreading the word: 10 reasons why science communication is so important

Although there have been science-promoting societies since the Renaissance, most of the dissemination of scientific ideas was played out at royal courts, religious foundations or for similarly elite audiences. Only since the Royal Institution lectures of the early 19th century and such leading lights as Michael Faraday and Sir Humphry Davy has there been any organised communication of the discipline to the general public.

Today, it would appear that there is a plethora - possibly even a glut - on the market. Amazon.com carries over 192,000 popular science books and over 4,000 science documentary DVD titles, so there's certainly plenty of choice! Things have dramatically improved since the middle of the last century when, according to the late evolutionary biologist Stephen Jay Gould, there was essentially no publicly-available material about dinosaurs.

From the ubiquity of the latter (especially since the appearance of Steven Spielberg's original 1993 Jurassic Park) it might appear that most science communication is aimed at children - and, dishearteningly, primarily at boys - but this really shouldn't be so. Just as anyone can take evening courses in everything from pottery to a foreign language, why shouldn't the public be encouraged to understand some of the most important current issues in the fields of science, technology, engineering and mathematics (STEM), at the same time hopefully picking up key methods of the discipline?

As Carl Sagan once said, the public are all too eager to accept the products of science, so why not the methods? It may not be important if most people don't know how to throw a clay pot on a wheel or understand why a Cubist painting looks as it does, but it certainly matters as to how massive amounts of public money are invested in a project and whether that research has far-reaching consequences.
Here then are the points I consider the most important as to why science should be popularised in the most accessible way - although without oversimplifying the material to the point of distortion:

1. Politicians and the associated bureaucracy need basic understanding of some STEM research, often at the cutting edge, in order to generate new policies. Yet as I have previously examined, few current politicians have a scientific background. If our elected leaders are to make informed decisions, they need to understand the science involved. It's obvious, but then if the summary material they are supplied with is incorrect or deliberately biased, the outcome may not be the most appropriate one. STEM isn't just small fry: in 2010 the nations with the ten highest research and development budgets had a combined spend of over US$1.2 trillion.

2. If public money is being used for certain projects, then taxpayers are only able to make valid disagreements as to how their money is spent if they understand the research (military R&D excepted of course, since this is usually too hush-hush for the rest of us poor folk to know about). In 1993 the US Government cancelled the Superconducting Super Collider particle accelerator as it was deemed good science but not affordable science. Much as I love the results coming out of the Large Hadron Collider, I do worry that the immense amount of funding (over US$13 billion spent by 2012) might be better used elsewhere on other high-technology projects with more immediate benefits. I've previously discussed both the highs and lows of nuclear fusion research, which surely has to be one of the most important areas in mega-budget research and development today?

3. Criminal law serves to protect the populace from the unscrupulous, but since the speed of scientific advance and technological change runs way ahead of legislation, public knowledge of the issues could help prevent miscarriages of justice or at least the wasting of money. The US population has spent over US$3 billion on homeopathy, despite a 1997 report by the President of the National Council Against Health Fraud that stated "Homeopathy is a fraud perpetrated on the public." Even a basic level of critical thinking might help in the good fight against baloney.

4. Understanding of current developments might lead to reliance as much on the head as the heart. For example, what are the practical versus moral implications of embryonic stem cell research (a field made all the more potent by President Obama's State of the Union pledge to cure cancer)? Or what about the pioneering work in xenotransplantation: could the next few decades see the use of genetically-altered pig hearts to save humans, and if so, would patients with strong religious convictions agree to such transplants?

5. The realisation that much popular journalism is sensationalist and has little connection to reality. The British tabloid press labelling of genetically-modified crops as 'Frankenstein foods' is typical of the nonsense that clouds complex and serious issues for the sake of high sales. Again, critical thinking might more easily differentiate biased rhetoric from 'neutral' facts.

6. Sometimes scientists can be paid to lie. Remember last century's campaigns, complete with scientific support, claiming that smoking tobacco is good for you or that lead in petrol is harmless? Or the DuPont Corporation refusing to stop CFC production, on the basis that profit should outweigh environmental degradation and the resulting increase in skin cancer? Whistle-blowers have often been marginalised by industry-funded scientists (think of the initial reaction to Rachel Carson over DDT), so it's doubtful anything other than knowledge of the issues would penetrate the slick corporate smokescreen.

7. Knowing the boundaries of the scientific method - what science can and cannot tell us, and what should be left to other areas of human activity - is key to understanding where the discipline should fit into society. I've already mentioned the moral implications and whether research can be justified by its potential outcome, but conversely, are there habits and rituals, or just societal conditioning, that blind us to what could be achieved by public lobbying of governments?

8. Nations may be enriched as a whole by cutting out nonsense and focusing on solutions for critical issues - for example, by not having to waste time and money explaining that global warming and evolution by natural selection are successful working theories backed by a mass of evidence. Notice how uncontroversial most astronomical and dinosaur-related popularisations are, then compare them with accounts of the evolution of our own species. Enough said!

9. Improving the public perception of scientists themselves. The prevailing image still seems to be that of lone geniuses, emotionally removed from the rest of society and frequently promoting their own goals above the general good. Apart from the obvious ways in which this conflicts with points already made, much research is undertaken by large, frequently multi-national teams - think of the Large Hadron Collider. Such knowledge might help retire the juvenile Hollywood science hero (rarely a heroine) and increase support for the sustained efforts that require substantial public funding (nuclear fusion being a perfect example).

10. Reducing parochialism, sectarianism and their associated conflicts, which if anything appear to be on the increase. It's a difficult problem, and science popularisation is unlikely to be a key player, but let's face it: any help here must be worth trying. Neil deGrasse Tyson's attitude is worth mentioning: our ideological differences seem untenable against a cosmic perspective. Naïve perhaps, but surely worth the effort?

Last year Bill Gates said: "In science, we're all kids. A good scientist is somebody who has redeveloped from scratch many times the chain of reasoning of how we know what we know, just to see where there are holes." The more the rest of us understand this, the better the chance we might notice the holes in other spheres of thought we currently consider unbending. That can only be a good thing, if we wish to survive our turbulent technological adolescence.

Tuesday 18 June 2013

Deserving dollars: should mega-budget science be funded in an age of austerity?

With the UK narrowly avoiding France's fate of a triple-dip recession, I thought I would bite the bullet and examine some of the economics of current science. At a time when numerous nations are feeling severe effects of the downturn, it is ironic that there are a multitude of science projects with budgets larger than the GDP of some smaller nations. So who funds these ventures, and are they value for money, or even worthwhile, in these straitened times? Here are a few examples of current and upcoming projects, with more detail supplied for the lesser-known ones:

National Ignition Facility

The world's most powerful laser was designed with a single goal: to generate net energy from nuclear fusion by creating temperatures and pressures similar to those in the cores of stars. However, to state that the NIF has not lived up to expectations would be something of an understatement. According to even the most conservative sources, the original budget of the Lawrence Livermore National Laboratory project has at the very least doubled, if not quadrupled, to over US$4 billion, whilst the facility became operational five years behind schedule.

I first learned of the project some years ago thanks to a friend who knew one of the scientists involved. The vital statistics are astonishing, both for the scale of the facility and the energies involved. But there appear to be underlying problems with the technology. Over-reliance on computer simulations, denial of unfavourable experimental results from precursor projects, the vested interests of project staffers and over-confident promises of military applications have all been suggested as causes of what history may judge a white elephant. So if you are looking for an archetypal example of how non-scientific factors can cripple research, this may well be it.

Unlike all the other projects discussed, the National Ignition Facility is funded solely by one nation, the USA. Of course, it could be argued that four billion dollars would be a bargain if the project succeeded, and that it is today's time-precious society that needs to learn patience in order to appreciate the long timescales required to overcome such immense technological challenges. Nuclear fusion would presumably solve many of today's - and the foreseeable future's - energy requirements whilst being rather more environmentally friendly than either fossil fuels or fission reactors. The potential rewards are plain for all to see.

However, the problems are deep-rooted, leading to arguments against the development of laser-based fusion per se. Alternative fusion projects such as the Joint European Torus and the $20 billion ITER - see an earlier post on nuclear fusion research for details - use longer-established methods. My verdict in a nutshell: the science was possibly unsound from the start and the money would be better spent elsewhere. Meanwhile, perhaps the facility could recoup a small portion of its funding if Star Trek movies continue to hire the NIF as a filming location!

The International Space Station

I remember the late Carl Sagan arguing that the only benefit of the ISS that couldn't be achieved via cheaper projects - such as, during the Space Shuttle era, the European Space Agency's Spacelab - was research into the deleterious health effects of long-duration spaceflight. So at $2 billion per year to run, is it worthwhile, or just another example of a fundamentally flawed project? After all, the station even includes such non-scientific facets as being the ultimate tourist destination for multi-millionaires!

The station is sometimes referred to as a lifeline for the American and Russian aerospace industries (or even a way to prevent disaffected scientists in the latter from working for rogue states), but I have been unable to find a persuasive argument as to why the money would not have been better spent elsewhere. It is true that there has been investigation into vaccines for salmonella and MRSA, but after twelve years of permanent crewing on board the station, just how good value for money has this research been? After all, similar studies were carried out on Space Shuttle flights in the preceding decades, suggesting that the ISS was not vital to these programmes. The Astronomer Royal Lord Martin Rees has described it as a 'turkey in the sky', siphoning funds that could have been spent on a plethora of unmanned missions such as interplanetary probes. But as we should be aware, it usually isn't the case that money not spent on one project automatically becomes available for projects elsewhere.

On a positive scientific note, the station has played host to the $2 billion Alpha Magnetic Spectrometer - a key contender in the search for dark matter - which would presumably have difficulty finding a long-duration orbital platform elsewhere. But then this is hardly likely to excite those who want immediate, practical benefits from such huge expenditure.

The ISS has no doubt performed well as a test bed for examining the deterioration of the human body in space, results which if anything seriously weaken the argument for a manned Mars mission in the near future. Perhaps the one other area in which the station has excelled is as a focal point for promoting science to the public, but surely those who follow in Sagan's footsteps - the UK's Brian Cox for one - can front television series with a similar goal for the tiniest fraction of the cost?

The Large Hadron Collider

An amazing public-relations success story, considering how far removed the science and technology are from everyday mundanity, the world's largest particle accelerator requires $1 billion per year to operate on top of a construction budget of over $6 billion. With a staff of over 10,000, the facility is currently in the midst of a two-year upgrade, giving its international research community plenty of time to analyse the results. After all, the Higgs boson, a.k.a. the 'God particle', has been found… probably.

So if the results are confirmed, what next? Apparently the facility can be re-engineered for a wide variety of purposes, from immediately pragmatic biomedical research on cancer and radiation exposure to the long-term search for dark matter. This combination of practical benefits and extended fundamental science appears to be as good a compromise as any among projects of similar scale. Whether the same research could be carried out by smaller, more specialised facilities is unclear - does anyone know?

As for the future of mega-budget schemes, there are various projects in development extending into the next decade. The Southern Hemisphere is playing host to two large international collaborations: the Square Kilometre Array is due to begin construction in eleven nations - excluding its UK headquarters - in 2016, but it will be around eight years before this $2 billion radio telescope array is fully operational. Meanwhile the equally unimaginatively-named European Extremely Large Telescope is planned for a site in Chile, with an even longer construction period and a price tag approaching $1.5 billion. Both projects are being designed for a variety of purposes, from dark matter investigation to searching for small (i.e. Earth-sized) extra-solar planets with biologically-modified atmospheres.

At this point it is pertinent to ask: do extremely ambitious science projects have to come with equally impressive price tags? Personally I believe that with a bit more ingenuity a lot of useful research can be undertaken on far smaller budgets. Public participation in distributed computing projects such as Folding@home and SETI@home, in which raw data is processed on volunteers' home computers, is about as modest an approach as is feasible for such large amounts of information.
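For readers unfamiliar with how these schemes work, here is a minimal sketch of the volunteer-computing pattern in Python. Every name and the toy 'analysis' below are my own inventions for illustration, and bear no relation to the real SETI@home or BOINC software: a central server chops raw data into small, independent work units, volunteers' machines each process one, and the results are returned for aggregation.

    # A minimal sketch of the volunteer-computing pattern (illustrative only,
    # not the real BOINC or SETI@home API).

    from dataclasses import dataclass

    @dataclass
    class WorkUnit:
        unit_id: int
        samples: list  # a small slice of the raw signal data

    def split_into_units(raw_data, unit_size):
        # Server side: divide the data stream into independent chunks.
        return [WorkUnit(i // unit_size, raw_data[i:i + unit_size])
                for i in range(0, len(raw_data), unit_size)]

    def process(unit):
        # Client side: runs on a volunteer's machine. The 'analysis' here is
        # just a stand-in - find the strongest signal in the chunk.
        return unit.unit_id, max(unit.samples)

    if __name__ == "__main__":
        raw_data = [0.1, 0.3, 9.7, 0.2, 0.4, 0.1, 5.5, 0.2]  # toy signal
        results = [process(u) for u in split_into_units(raw_data, unit_size=4)]
        # Server side again: aggregate and report the best candidate.
        print(max(results, key=lambda r: r[1]))

Because each chunk can be processed in isolation, the server needs nothing more exotic than an ordinary internet connection to harness thousands of processors it doesn't own - which is precisely why the approach is so cheap.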

An example of a long-term project on a comparatively small budget is the US-based EarthScope programme, which collects and analyses data for purposes that include eminently practical seismic research. With a construction cost of about $200 million and an annual budget of around a mere $125 million, this seems a relative bargain for a project that combines wide-scale theoretical targets with short-term pragmatic gains. But talking of practical goals, there are other scientific disciplines crying out for a large increase in funding. Will the explosive demise of a meteor above the Russian city of Chelyabinsk back in February act as a wake-up call for more research into locating and deflecting Earth-crossing asteroids and comets? After all, the 2014 NASA budget for asteroid detection projects is barely over the hundred million dollar mark!

I will admit that enormous projects have some unique advantages, such as bringing together researchers from the funding nations in ways that may lead to fruitful collaboration - presumably because of the sheer number of scientists gathered together for long periods, as opposed to spending just a few days at an international conference or seminar. Even so, I cannot help but feel that the money for many of the largest-scale projects could be better used elsewhere, solving some of the immediate problems facing our species and ecosystem.

Unfortunately, the countries involved offer their populations little say in how public money is spent on research. But then, considering the appalling state of science education in so many nations, as well as the short shrift that popular culture usually gives the discipline, perhaps that isn't so surprising after all. If we want to make mega-budget projects more accountable, we will need to make fundamental changes to the status of science in society. Without increased understanding of the research involved, governments are unlikely to grant us the choice.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism.  In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation is usually correct. 

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. His work is frequently mythologised (I side with the rolling-weights rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa brigade), but Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.
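In modern notation, the key regularity those rolling weights revealed - a textbook piece of stage one cause-and-effect delineation - is that the distance covered by a uniformly accelerating body grows with the square of the elapsed time:

    \[ d = \tfrac{1}{2} a t^{2} \]

so doubling the time quadruples the distance, whatever the slope of the ramp.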

Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely intended to understand how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever before seen such miniature marvels as bacteria. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who started the cataloguing methodology in use today. New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definition there is no bedrock for more sophisticated research.

The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that largely untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose ranks Charles Darwin almost joined) who worked on the quantity-over-quality principle in collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late seventeenth century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to utilise the results in practical applications. Although in the BBC television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew contains 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration.  In addition to the biota yet to be described in scientific records, the existing catalogues are in the process of major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, having so far led to the finding of 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of natural elements, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore how phenomena occur and how events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be described as holding a magnifying glass up to nature, reducing a phenomenon to an approximation and explaining how that approximation works?

For example, Newton took Galileo and Kepler's astronomical work and ran with it, producing his Law of Universal Gravitation. The 'how' in this case is the inverse-square law, complete with gravitational constant, describing how bodies orbit their common centre of mass. However, Newton was unable to delineate what caused the force to act across infinite, empty space - an explanation that had to wait for stage three.
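The law itself is compact enough to quote in full: the attractive force between two masses m1 and m2 separated by a distance r is

    \[ F = \frac{G\,m_{1} m_{2}}{r^{2}} \]

where G is the gravitational constant. Notice that the equation describes how the force varies but is completely silent on why it exists at all - the very gap Newton conceded with his famous 'hypotheses non fingo'.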

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the twentieth century, the techniques of modern science suggest a feedback cycle in which knowing which questions to ask is at least as important as gaining answers - the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Science is therefore a long way from recognising all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.

Though microelectronics in general and computers in particular have allowed experiments in such fields as quantum teleportation - considered close to impossible by the finest minds only half a century ago - there are several reasons why computer processing power is approaching a theoretical maximum under current manufacturing techniques and materials. The near future may therefore see a slowing down of the sort of leading-edge experimental science achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. Physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, just where can you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result is that the boundary between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek ever more accurate approximations of phenomena, phase three includes the search for why one theory is thought to be correct over another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's General Relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality due to their incompatibility with quantum mechanics. Herein lies the rub!
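The contrast with stage two can be caught in a single formula. Leaving aside the cosmological constant, Einstein's field equations read

    \[ G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu} \]

with the left-hand side encoding the curvature of spacetime and the right-hand side its matter and energy content: mass tells spacetime how to curve, and curved spacetime tells mass how to move. Unlike Newton's inverse-square law, this is an attempt at a 'why' as well as a 'how'.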

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes, each with different fundamental constants (and therefore including a lucky few capable of producing life). The obvious issue here is that Occam's Razor would seem to suggest the former over the latter. As Astronomer Royal Sir Martin Rees states, this is veering into metaphysical territory, usually an area avoided like the plague except by those scientists with religious convictions. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way towards creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark, which in the first days of their formulation were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Wednesday 27 February 2013

An index of possibilities: is science prognostication today worthwhile or just foolish?

A few evenings ago I saw the International Space Station. It was dusk, and walking home with the family we were looking at Jupiter when a moving bright light almost directly overhead got our attention. Too high for an aircraft, too large for a satellite; a quick check on the Web when we got home confirmed it was the ISS. 370 kilometres above our heads, a hundred-metre-long, permanently crewed construction confirmed everything I had read in my childhood: we had become a space-borne species. But if so few of the other scientific and technological advances I was supposed to be enjoying in adulthood have come true, has the literature of science prediction in these areas also changed markedly?

It is common to hear nowadays that science is viewed as just one of many equally valid methods of describing reality. So whilst most homes in the developed world contain a myriad of up-to-date high technology, many of the users of these items haven't got the faintest idea how they work. Sadly, neither do they have much interest in finding out. It's a scary thought that more and more of the key devices we rely on every day are designed and manufactured by a tiny percentage of specialists in the know; we are forever increasing the ease with which our civilisation could be knocked back to the steam age - if not the stone age.

Since the products of such advanced technology are now familiar in the domestic environment and not just in the laboratory, why are there seemingly fewer examples of popular literature praising the ever-improving levels of knowledge and application, compared to Arthur C. Clarke's prophetic 1962 classic Profiles of the Future and the less critical imitators that so caught my attention as a child? Has familiarity led the non-scientist to find little interest or inspiration in what is now such an integrated aspect of our lives? In frequently equating scientific advance with cutting-edge consumerism we are committing an enormous error, downplaying far more interesting and important aspects of the discipline whilst cutting ourselves off from the very processes by which we can gain genuine knowledge.

Therefore there is something of an irony: non-scientists either disregard scientific prognostication as impractical idealism ("just give me the new iPad, please") and/or consider themselves much more tech-savvy than the previous generation (not an unfair observation, if for obvious reasons - my pre-teen children can work with our 4GB laptop whilst my first computer had 48KB of RAM). Of course it's not all doom and gloom. Although such landmark experiments as the New Horizons mission to Pluto have gone largely unnoticed, at least by anyone I know, the Large Hadron Collider (LHC) and Mars Curiosity rover receive regular attention in popular media.
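Incidentally, that hardware comparison deserves to be spelled out. Assuming both figures refer to memory:

    \[ \frac{4\ \text{GB}}{48\ \text{KB}} = \frac{4 \times 2^{30}}{48 \times 2^{10}} \approx 87{,}000 \]

a nearly ninety-thousand-fold increase between my first computer and my children's laptop.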

Perhaps the most regularly occurring theme in science news articles over the past decade or so has been climate change, but with the various factions and exposé stories confusing the public on an already extremely complex issue, could it be that many people are turning their backs on reading about postulated technological advances because (a) technology may have greatly contributed to global warming; and (b) they don't want to consider a future that could be extremely bleak unless we ameliorate or solve the problem? The Astronomer Royal and former President of the Royal Society Martin Rees is one of many authors to offer a profoundly pessimistic view of mankind's future. His 2003 book Our Final Hour suggests that either by accident or design, at some point before AD 2100 we are likely to initiate a technological catastrophe here on Earth, and that the only way to guarantee our species' survival is to establish colonies elsewhere as soon as possible.

But there are plenty of futurists with the opposite viewpoint to Rees and like-minded authors, including the grandly-titled World Future Society, whose annual Outlook reports are written with the aim of inspiring action towards improving our prospects. Most importantly, by including socio-economic aspects they may fare better than Arthur C. Clarke and his generation, whose space cadet optimism now seems hopelessly naïve.

One way to increase the accuracy of near-future extrapolation may be for specialists to concentrate on their own areas of expertise. To this end, many scientists and popularisers have concentrated on trendy topics such as nanotechnology, with Ray Kurzweil perhaps the best-known example. This isn't to say that there aren't some generalist techno-prophets still around, but Michio Kaku's work along these lines has proved very mixed in quality, whilst the BBC Futures website is curiously old school, with plenty of articles on macho projects (e.g. military and transport hardware) that are mostly still in the CAD program and will probably remain that way for many years to come.

With so many factors influencing which science and technology projects get pursued, it is worth asking whether in-depth scientific knowledge is actually any better a guide to accurate prognostication than a passing acquaintance with current developments, with luck perhaps playing the primary role. One of my favourite examples of art-inspired science is the iPad, released to an eager public in 2010 some twenty-three years after the fictional PADD was first shown on Star Trek: The Next Generation (TNG) - although ironically the latter is closer in size to non-Apple tablets. In an equally interesting reversal, there is now a US$10 million prize on offer for the development of a hand-held Wi-Fi health monitoring and diagnosis device along the lines of the Star Trek tricorder. No doubt Gene Roddenberry would have been pleased that his optimistic ideas are being implemented so rapidly; but then even NASA have at times hired his TNG graphic designer!

I'll admit that even I have made my own modest if inadvertent contribution to science prediction. In an April Fools' post in 2010 I light-heartedly suggested that perhaps sauropod dinosaurs could have used methane emissions as a form of self-defence. Well, not quite, but a British study in the May 2012 edition of Current Biology hypothesises that the climate of the period could have been significantly affected by dino-farts. As they say, truth is always stranger than fiction…

Monday 29 October 2012

From geek to guru: can professional scientists be successful expositors (and maintain careers in both fields)?

The recent BBC TV series Orbit: Earth's Extraordinary Journey gave me food for thought: although presenter Helen Czerski is a professional physicist she was burdened with a co-presenter who has no formal connection with science, namely Kate Humble. You have to ask: why was Humble needed at all? I'll grant that there could have been a logistics issue, namely getting all the locations filmed in the right season within one year, but if that was the case why not use another scientist, perhaps from a different discipline? Were the producers afraid a brace of scientists would put the public off the series?

The old days of senior figures pontificating as if in a university lecture theatre are long gone, with blackboard diagrams and scruffy hair replaced by presenters keen to prove their non-geek status by participating in what essentially amount to danger sports in the name of illustrating examples. Okay, so the old style could be very dry and hardly likely to inspire the non-converted, but did Orbit really need a non-scientist when Helen Czerski (who is hardly new to television presenting) can deliver to camera whilst skydiving? In addition, there are some female presenters - a prominent British example being Alice Roberts - who have been allowed to solely present several excellent series, albeit involving science and humanities crossovers (and why not?).

But going back to Kate Humble, some TV presenters seem to cover such a range of subject matter that it makes you wonder if they are just hired faces with no real interest in (and/or knowledge of) what they are espousing: "just read the cue cards convincingly, please!" Richard Hammond - presenter of the light entertainment show Top Gear and the (literally) explosive Brainiac: Science Abuse - has flirted with more in-depth material in Richard Hammond's Journey To The Centre Of The Planet, Richard Hammond's Journey To The Bottom Of The Ocean and Richard Hammond's Invisible Worlds. Note the inclusion of his name in the titles - just in case you weren't aware who he is. Indeed, his Top Gear co-presenter James May has been genre-hopping in a similar vein, with James May's Big Ideas, James May's Things You Need to Know, James May on the Moon and James May at the Edge of Space amongst others, again providing a hint as to who is fronting the programmes. Could it be that public opinion of scientists is poor enough - part geek, part Dr Strangelove - to force producers to employ non-scientist presenters with a well-established TV image, even if that image largely consists of racing cars?

Popular science books from Cosmos to A Brief History of Time

Having said that, science professionals aren't infallible communicators: Sir David Attenborough, a natural sciences graduate and fossil collector since childhood, made an astonishing howler in his otherwise excellent BBC documentary First Life. During an episode that ironically included Richard 'Mr Trilobite' Fortey himself, Sir David described these organisms as being so named due to their head/body/tail configuration. In fact, the group's name stems, somewhat obviously, from their three lobes - the central and two lateral lobes of their body plan. It was an astounding slip-up, and it left me wondering whether anyone on these series double-checks the factual content, just to make sure it wasn't copied off the back of a cereal packet.

Another possible reason for using non-science presenters is that in order to make a programme memorable, producers aim to differentiate their expositors as much as possible. I've already discussed the merits of two of the world's best known scientists, Stephen Hawking and Richard Dawkins, and the unique attributes they bring to their programmes, even if in Dawkins' case this revolves around his attitude to anyone who has an interest in any form of unproven belief. I wonder if he extends his disapprobation to string theorists?

What is interesting is that whereas the previous generation of popular science expositors achieved fame through their theories and eventual bestselling popularisations, the current crop, of whom Helen Czerski is an example, have become well known directly through television appearances. That's not to say that the majority of people who have heard of Stephen Hawking and Richard Dawkins have read The Selfish Gene or A Brief History of Time. After all, the former was first published in 1976 and achieved renown in academic circles long before the public knew of Dawkins. Some estimates suggest as few as 1% of the ten million or so buyers of the latter have actually read it in its entirety; indeed, there has been something of a small industry in readers' companions, not to mention Hawking's own A Briefer History of Time, intended to convey in easier-to-digest form some of the more difficult elements of the original book. In addition, the US newspaper Investors Business Daily published an article in 2009 implying they thought Hawking was an American! So can you define fame solely as being able to identify a face with a name?

In the case of Richard Dawkins it could be argued that he had a remit as a professional science communicator, at least from 1995 to 2008, due to his position during this time as the first Simonyi Professor for the Public Understanding of Science. What about other scientists who have achieved some degree of recognition outside their fields of study thanks to effective science communication? Theoretical physicist Michio Kaku has appeared in over fifty documentaries and counting, and has written several bestselling popular science books, whilst if you want a sound bite on dinosaurs, Dale Russell is your palaeontologist. But it's difficult to think of any one scientist capable of inspiring the public as much as Carl Sagan post-Cosmos. Sagan, though, was the antithesis of the shy and retiring scientist stereotype and faced peer accusations of deliberately cultivating fame (and of course fortune), to the extent of jumping on scientific bandwagons solely to gain popularity. As a result, at the height of his popularity and with a Pulitzer Prize-winning book behind him, Sagan failed to gain entry to the US National Academy of Sciences. It could be argued that no-one has taken his place because they don't want their scientific achievements belittled or ignored by the senior science establishment: much better to claim to be a scientist with a sideline in presenting than a communicator with a science background. So in this celebrity-obsessed age, is it better to be a scientific shrinking violet?

At this point you might have noticed that I've missed out Brian Cox (or Professor Brian Cox, as it states on the cover of his books, just in case you thought he was an ex-keyboard player who had somehow managed to wangle his way into CERN). If anyone could wish to be Sagan's heir - and Cox admits to Sagan as a key inspiration - then surely he is that scientist. With a recent guest appearance as himself on Doctor Who and an action-hero-like credibility, his TV series having featured him flying in a vintage supersonic Lightning jet and quad-biking across the desert, Cox is an informal, seemingly non-authoritative version of Sagan. A key question: will he become an egotistical prima donna and find himself divorced from the Large Hadron Collider in return for lucrative TV and tie-in book deals?

Of course, you can't have science without communication. After all, what's the opposite of popular science: unpopular science? The alternative to professionals enthusing about their subject is to have a mouth-for-hire, however well presented, delineating material they neither understand nor care about. And considering the power that unthinking celebrities appear to wield, it's vital that science gets the best communicators it can, recruited from within its own ranks. The alternative can clearly be seen in last year's celebrity suggestion that the oceans are salty due to whale sperm. Aargh!