Monday 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how disagreements with religious texts can lead the devout to deny key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, used to promote Creationist ideas rather than scientifically-gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species, and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or over-simplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. However, even this is not completely accurate. Darwin's hypothesis was more complete than Wallace's, in the sense of being entirely scientific and therefore testable; Wallace, on the other hand, maintained that there must have been divine intervention in the creation of our species, making us different from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection but were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life', which he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense, he is a minor figure compared to Darwin, whose hypothesis was backed by thorough research.

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said for Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which, although highly speculative, contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century contained other combinations of ideas and real-world inspiration that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine the inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider it to be superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on the way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to state that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidnas. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals 187 million years ago; therefore, they have spent approximately three times as long on their own evolutionary journey as they did as part of the wider mammalian lineage (a quick check of that arithmetic follows below). As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. But we still include monotremes amongst the mammals rather than as a hybrid or separate class; both platypus and echidna have fur, are warm-blooded and produce milk (although with a unique delivery system!). This allows their inclusion in Mammalia; does this mean we arbitrarily allow certain traits and discard others?
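As an aside, the 'three times as long' claim is easy to sanity-check. Here's a minimal back-of-the-envelope sketch in Python, assuming a mammalian lineage origin of roughly 250 million years ago (my figure for illustration; the report may use different dates):

```python
# Back-of-the-envelope check of the monotreme timeline (millions of years ago).
# The 250 Mya origin for the mammalian lineage is an assumption for
# illustration; published estimates vary.
MAMMAL_LINEAGE_ORIGIN_MYA = 250
MONOTREME_SPLIT_MYA = 187

shared = MAMMAL_LINEAGE_ORIGIN_MYA - MONOTREME_SPLIT_MYA  # time within the wider lineage
alone = MONOTREME_SPLIT_MYA                               # time on their own branch

print(f"Shared: {shared} My, separate: {alone} My, ratio: {alone / shared:.1f}x")
# With these dates the ratio comes out at roughly 3x.
```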

Would it be fair to say that the boundaries we make between organisms are more for our convenience than a reflection of the underlying reality? Are you happy to label birds as 'avian dinosaurs' and if not, why not? With their feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was the teeth, bony tail and crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which, along with the possible discovery of bird-like air sacs, means that they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including leatherback turtles, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others to be endothermic (i.e., fully warm-blooded). In this respect, then, some meat-eating dinosaurs may have had more in common with us mammals than with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate the history of life. Science communication to the public is fundamental to our society but it needs to present the awkward complexities of evolution via all the tortured pathways of natural selection if it is not to fall victim to those who prefer myths of the last few thousand years to the history of countless millennia, as revealed in the genes and rocks waiting for us to explore.


Thursday 9 November 2017

Wonders of Creation: explaining the universe with Brian Cox and Robin Ince

As Carl Sagan once said: "if you wish to make an apple pie from scratch, you must first invent the universe." A few nights ago, I went to what its promoters bill as 'the world's most successful and significant science show', which in just over two hours presented a delineation of the birth, history and eventual death of the universe. In fact, it covered just about everything from primordial slime to the triumphs of the Cassini space probe, only lacking the apple pie itself.

The show in question is an evening with British physicist and presenter Professor Brian Cox. As a long-time fan of his BBC Radio show The Infinite Monkey Cage I was interested to see how the celebrity professor worked his sci-comm magic with a live audience. In addition to the good professor, his co-presenter on The Infinite Monkey Cage, the comedian Robin Ince, also appeared on stage. As such, I was intrigued to see how their combination of learned scientist and representative layman (or 'interested idiot' as he styles himself) would work in front of two thousand people.

I've previously discussed the trend for extremely expensive live shows featuring well-known scientists and (grumble-grumble) the tickets for Brian Cox were similarly priced to those for Neil deGrasse Tyson earlier this year. As usual, my friends and I went for the cheaper seats, although Auckland must have plenty of rich science fans, judging by the almost packed house (I did notice a few empty seats in the presumably most expensive front row). As with Professor Tyson, the most expensive tickets for this show included a meet and greet afterwards, at an eye-watering NZ$485!

When Cox asked if there were any scientists in the audience, there were very few cheers. I did notice several members of New Zealand's sci-comm elite, including Dr Michelle Dickinson, A.K.A. Nanogirl, who had met Ince on his previous Cosmic Shambles LIVE tour; perhaps the cost precluded many STEM professionals from attending. As I have said before, such inflated prices can easily lead to only dedicated fans attending, which is nothing less than preaching to the converted. In which case, it's more of a meet-the-celebrity event akin to a music concert than an attempt to spread the wonder - and rationality - of science.

So was I impressed? The opening music certainly generated some nostalgia for me, as it was taken from Brian Eno's 1983 soundtrack for Al Reinert's feature-length documentary on the Apollo lunar missions. Being of almost the same age as Professor Cox, I confess to having bought the album on vinyl in my teens - and I still have it! Unlike Neil deGrasse Tyson's show, the Cox-Ince evening was an almost non-stop visual feast, with one giant screen portraying a range of photographs and diagrams, even a few videos. At times, the images almost appeared to be 3D, seemingly hanging out of the screen, with shots of the Earth and various planets and moons bulging onto the darkened stage. I have to admit to being extremely impressed with the visuals, even though I had seen some of them before. Highlights included the Hubble Space Telescope's famous Ultra-Deep Field of the earliest galaxies and the montage of the cosmic microwave background taken by the WMAP probe.

The evening (okay, let's call it a cosmology lecture with comic interludes) began, as per Neil deGrasse Tyson's show, with the age and scale of the universe, then progressed through galaxy formation and a few examples of known extra-solar planets. However, the material was also bang up to date, as it included the recent detections of gravitational waves at LIGO and the creation of heavy elements such as gold and platinum in neutron star collisions.

Our universe: a potted history

Professor Cox also took us through the future prospects of the solar system and the eventual heat death of the universe, generating a few "oohs" and "aahs" along the way. Interestingly, there was little explanation of dark matter and dark energy; perhaps it was deemed too speculative a topic to do it justice. Black holes had a generous amount of attention though, including Hawking radiation. Despite having an audience primarily of non-STEM professionals (admittedly, a show of hands found a large proportion of them to be The Infinite Monkey Cage listeners), a certain level of knowledge was presupposed and there was little attempt to explain the basics. Indeed, at one point an equation popped up - and it wasn't E=mc². How refreshing!

Talking of which, there was a brief rundown of Einstein's Special and General Theories of Relativity, followed by the latter's development into the hypothesis of the expanding universe and eventual proof of the Big Bang model. Einstein's Cosmological Constant and his initial dismissal of physicist-priest Georges Lemaître's work were given as examples that even the greatest scientists sometimes make mistakes, showing that science is not a set of inviolable truths that we can never improve upon (the Second Law of Thermodynamics excluded, of course). Lemaître was also held up to be an example of how science and religion can co-exist peacefully, in this case within the same person.

Another strand, proving that Cox is indeed deeply indebted to Carl Sagan (aren't we all?), was his potted history of life on Earth, with reference to the possibility of microbial life on Mars, Europa and Enceladus. The lack of evidence for intelligent extra-terrestrials clearly bothers Brian Cox as much as it did Sagan. However, Cox appeared to retain his scientific impartiality, suggesting that - thanks to the multi-billion-year gap between the origin of life and the evolution of multi-cellular organisms - intelligent species may be extremely rare.

For a fan of crewed space missions, Cox made surprisingly little mention of future space travel, concentrating instead on robotic probes such as Cassini. The Large Hadron Collider also didn't feature in any meaningful way, although one of the audience questions around the danger of LHC-created black holes was put into perspective next to the natural black holes that might be produced by cosmic ray interactions with the Earth's atmosphere; the latter's energies of up to 10⁸ TeV (tera electron volts) far exceed anything the LHC can generate, and we've not been compressed to infinity yet.
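It's worth spelling out the scale of that comparison. A minimal sketch, assuming the LHC's roughly 13 TeV proton-proton collision energy (my figure, not one quoted during the show):

```python
# Compare LHC collision energies with the most energetic cosmic rays.
LHC_TEV = 13           # approximate proton-proton collision energy (assumption)
COSMIC_RAY_TEV = 1e8   # ~10^20 eV, the highest-energy cosmic rays observed

ratio = COSMIC_RAY_TEV / LHC_TEV
print(f"Cosmic rays carry ~{ratio:,.0f} times the LHC's collision energy")
# If billions of years of such collisions in the atmosphere haven't produced
# a dangerous black hole, the far weaker LHC is unlikely to either.
```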

Robin Ince's contributions were largely restricted to short if hilarious segments, but he also made a passionate plea (there's no other word for it) for the readability of Charles Darwin and his relevance today. He discussed Darwin's earthworm experiments and made short work of the American evangelicals' "no Darwin equals no Hitler" nonsense, concluding with one of his best jokes: "no Pythagoras would mean no Toblerone".

One of the friends I went with admitted to learning little that was new, but as stated earlier I really went to examine the sci-comm methods being used and their effect on the audience. Cox and Ince may have covered a lot of scientific ground but they were far from neglectful of the current state of our species and our environment. Various quotes from astronauts and the use of one of the 'pale blue dot' images of a distant Earth showed the intent to follow in Carl Sagan's footsteps and present the poetic wonder of the immensity of creation and the folly of our pathetic conflicts by comparison. The Cox-Ince combination is certainly a very effective one, as any listeners to The Infinite Monkey Cage will know. Other science communicators could do far worse than to follow their brand of no-nonsense lecturing punctuated by amusing interludes. As for me, I'm wondering whether to book tickets for Richard Dawkins and Lawrence Krauss in May next year. Their tickets are slightly cheaper than those for both Brian Cox and Neil deGrasse Tyson. Hmmm…

Sunday 26 February 2017

Wondering about the wanderer: the life and times of the monarch butterfly in New Zealand

This summer has seen a proliferation of monarch butterflies in my garden. Over the past five years there's been little change in planting - except for a few additional self-seeded swan plants (a.k.a. milkweed, Gomphocarpus fruticosus and similar species) - so why am I now seeing so many more of the Kahuku/Wanderer than in previous years? This summer has seen a mixture of wet and dry weeks but no extreme in either direction, when compared to the previous four summers in this house. Is that the secret: just a balance of weather conditions, or is there more to it than that? As I pointed out in a recent post, a cluster of swan plants several streets away has seen very few monarch butterflies. Let's have a look at the details.

Monarch caterpillar

My experience:

Although common enough in all except the coldest regions of New Zealand, Danaus plexippus is not a native species but seemingly self-introduced at some point within the last 150 years. Its large size and colourful wing markings have led to its popularity in art and science. I've seen paintings, collages, sculptures and jewellery utilising its patterns, which contrast vividly with New Zealand's predominantly green appearance.

Swan plants, the monarch caterpillar's almost sole food source, are readily available from garden centres, and buying one can lead to large numbers of self-seeded plants, aiding the spread of the monarch. I've found this year that even young plants under 50cm tall have had eggs laid on them. I've also noticed that the swan plants in my back garden host more than twice as many caterpillars as those in the front garden, despite the latter garden being much larger and having a lot more vegetation. I've even noticed that some caterpillars in the front garden disappear shortly after starting to pupate; perhaps the denser planting attracts or hides more predators?

Monarch chrysalis

Lifecycle:

The eggs are usually found on the underside of leaves and tend to be more conspicuous than the first instar (freshly-hatched) caterpillars. Apparently, larger caterpillars will munch through both eggs and smaller caterpillars without noticing, so it's a monarch-eat-monarch world out there! I've had to move some caterpillars once they reach a decent size in order to prevent them eating their entire plant and starving to death. Females can lay hundreds of eggs in their lifetime at a rate of up to 40 per day, so monarch care sites recommend destroying later eggs to allow the earlier individuals to survive. In general, the warmer the weather, the quicker the caterpillars grow to full size before pupating. However, it has been noted that butterflies that hatch in the autumn can survive over winter, often in colonies, their lifespan extended from the two months typical of same-summer breeders to up to nine months. Unlike in their North American homeland, New Zealand monarchs do not migrate enormous distances.

Monarch chrysalis about to hatch

Predation:

Despite absorbing toxins from milkweed, both caterpillars and butterflies are predated by a range of other animals. I've occasionally found a pair of wings on the ground, which is a good indication of predation by the South African praying mantis, Miomantis caffra. Other introduced invertebrates such as wasps will also attack monarchs. It's interesting that these predators tend to have originated in Europe, Africa and Asia, yet the monarch evolved in North America; clearly, the former aren't too specialised to handle alien prey. That, of course, is what has happened in general to New Zealand's native birds and reptiles, with European mustelids and rodents and Australian possums finding a veritable feast amongst the kiwi and company.

Caring for monarchs:

Apart from removing caterpillars from overcrowded plants, my only other assistance is to rehang any fallen chrysalis and move the occasional pre-pupating wanderer into a wood-and-wire cage until it metamorphoses; although I have found one chrysalis about eight metres from the closest swan plant, a fully-grown wandering caterpillar might just prove too tempting a morsel. Otherwise I tend to leave nature to do its thing; after all, it's hardly an endangered species. Many caterpillars disappear before reaching pupation due to a combination of disease and predation, and any swan plant that gets completely eaten may leave the incumbent caterpillars to starve. Darwin was famously inspired by Thomas Malthus' An Essay on the Principle of Population, so it's great to be able to see such a theory in action in your own garden!

Monarch butterfly

Public interest:

Despite being neither native nor endangered, there are various New Zealand-based citizen science projects studying them, such as by fitting wing tags for tracking purposes. Much as I am in favour of direct public engagement in science, I wonder if the effort wouldn't be better redirected towards endangered native species. As I've previously discussed, if visually attractive poster species get much of the attention, where does that leave the smaller, more drab, less conspicuous critters that may be more important?

I'm still at a loss as to what has caused this summer's proliferation of monarch butterflies in my garden. There are just as many other summer species as usual, such as adult cicadas and black crickets, and seemingly as many monarch predators such as praying mantises. But as I've mentioned before, perhaps what appear to human eyes to be similar conditions are not so to these colourful creatures. Although quantifying those conditions in detail is somewhat beyond the capability of this amateur entomologist!

Wednesday 27 July 2016

Resistance is futile: the ongoing war against super bugs

As I'm currently three days into an irritating cough (aren't they all?) accompanied by a sore throat, I've just taken a soothing lozenge. The packet states the lozenges contain a combination of two antibacterial agents which aim to help kill the bacteria causing the infection. However, the packet also notes - in a somewhat smaller font size - that there is no clinical proof an antibacterial agent will reduce the severity or duration of the infection. Could this be because common colds and influenza are caused by viruses not bacteria? I don't suppose the pharmaceutical industry could possibly be duping an ignorant public in the name of profit margins?

Working in a hot desking environment, I frequently remind colleagues not to overdo the use of anti-bacterial sprays on their desks, keyboards, mice and telephones. Not that I'm exactly certain how damaging the company-supplied sprays are, environmentally speaking: for all I know, they may be good enough to destroy all the 'bad' bacteria, but I'd rather be safe than sorry. Instead, I recommend the method I use at work, namely washing my hands before eating. Simple, and hopefully less likely to encourage super bugs.

It seems to me that basic hygiene is preferable to the chemical war on microbes, since (a) some are beneficial, including for building immunity; and (b) some strains may survive the cull and lead to a desk biota high in resistant bacteria: after all, isn't that just Darwinian natural selection being given an unintentional boost? Unfortunately, there has been a large increase in sick leave since we moved from conventional offices to hot-desking. Therefore something is clearly going wrong, regardless of approach!
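That 'unintentional boost' is easy to caricature in a few lines of code. The toy model below is purely illustrative - every number in it is invented - but it shows how repeatedly killing off most of the susceptible bacteria while sparing a hardier minority soon hands the desk over to the resistant strain:

```python
# Toy model of selection for resistance. Each cleaning cycle kills 99.9% of
# susceptible cells but only 20% of resistant ones; both then regrow 50-fold.
# All figures are invented for illustration; only the trend matters.
susceptible, resistant = 1_000_000.0, 10.0

for cycle in range(10):
    susceptible *= 0.001 * 50   # spray, then regrowth
    resistant *= 0.8 * 50

share = resistant / (susceptible + resistant)
print(f"Resistant share after 10 cycles: {share:.1%}")
# The resistant strain goes from one in a hundred thousand to essentially
# the whole population - natural selection with an unintentional boost.
```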

The best-known of the super bugs has to be Methicillin-resistant Staphylococcus aureus (MRSA), beloved of news journalists but very few others. Although the resistance was first recognised around 1960, the past twenty-five years or so have seen a plethora of scientific reports describing outbreaks outside healthcare environments. Popular news articles about super bugs in hospitals - and the over-use of antibiotics that has aided their increase in range - therefore only started hitting the headlines after the bacteria had already spread to other types of locale.

This latter community-associated or CA-MRSA is therefore at least as great a risk as the hospital variant, often affecting younger people. MRSA naturally occurs in several percent of the population anyway, so it would be difficult to totally eradicate by any foreseeable method. Many common antibiotics are already useless against MRSA, which can be spread by direct skin contact as well as via objects - such as computer keyboards and mice I might add, to anyone considering converting their offices to hot desking. In addition, the far less well-known methicillin-sensitive Staphylococcus aureus (MSSA) is also on the increase.

Another key reason for the increase in resistant microbes is the use of antibiotics on farmed animals. Whilst it might seem sensible for densely-packed livestock to be inoculated - frankly, I don't mind paying more for free range rather than battery-farmed eggs, but I realise that isn't an option for many - the discovery in the 1940s that antibiotics can be used to promote growth implies that profit is yet again the key factor here. Far from being a simple precautionary measure against the spread of infection, livestock and poultry have been given pharmaceuticals in order to maximise produce without an associated increase in feeding costs.

In 1969 the Swann report on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine recommended an end to their use as growth promoters. After a long period of inertia, the European Union eventually banned such usage for eight antibiotics: four in 1989 and a further four in 2006. Unfortunately many other nations, including the USA, are still pumping enormous amounts of pharmaceuticals into farm animals.

I've found very little in the way of research projects that seek to lessen this dependency. Possibly the least disruptive method would be to develop drugs that have similar effects on animal growth but aren't required as human medicine. Perhaps the pharmaceutical giants just aren't finding antibiotic development profitable enough any more; after all, if medical practice wants to prevent the spread of resistant bacteria, it needs to minimise the use of antibiotics.

The effects of agricultural usage are wide-ranging, from pathogens crossing from livestock to humans and back again, to infections spreading to pets and even into wild animals such as flies and rodents. However, the USA seems to have made little effort to follow the EU, with about 80% of the antibiotics sold there being used on farm livestock. Yet another MRSA variant, CC398, has been gaining ground, particularly in pigs, and can transfer to humans in the form of LA-MRSA. What price a cheap bacon sandwich?

It isn't as if the American scientific establishment hasn't been amassing data to support the case for stopping the practice, which over the past half century or so has led to other, less well-known strains such as Campylobacter coli gaining resistance. Despite high levels of infected produce, large-scale recalls and perhaps over 100,000 deaths per annum in the USA alone (farm workers and food processors can pick up strains, not just the end consumer), commerce appears to be winning over common sense.

It isn't completely bad news: research by the University of Southampton indicates that copper might become useable as an inhibitor (which seems strange - I thought silver might be the metal of choice, considering its anti-bacterial properties - guess that proves I'm not a research chemist, then!) In addition, some of the main fast food chains have started to cut down on buying produce from antibiotic-pumped livestock. But is this too little, much too late? With most pharmaceutical production in the hands of a few giant multi-nationals, the human race is largely beholden to a very small number of executives. My suggestion would be...err...just don't get ill? Or work in a hot desking environment. Or leave your home, ever...hmm...

Monday 28 September 2015

Resurrecting megafauna: the various problems of de-extinction


The record-breaking success of Jurassic World proves that if there's anything a lot of people want to see in the animal kingdom it is species that are both large and fierce. Unfortunately, in these post-glacial times that type of fauna has been much reduced and will no doubt wane even further - not that I particularly wish to encounter an apex predator at close quarters, you understand.

Hollywood, of course, has much to answer for. There was plenty of poor science in the original Jurassic Park movie - the use of gap-filling frog DNA being a far worse crime in my book than the over-sized velociraptors (think Achillobator and similar species) - but the most recent film in the franchise has pointedly ignored the advances in dinosaur knowledge made in the intervening period. Perhaps a CGI test of a feathered T. rex looked just too comical?

In contrast, the amount of publicly-available material discussing de-extinction has increased exponentially in the two decades since Jurassic Park was released, with the line between fact and fiction well and truly blurred. That's not to say that an enormous amount hasn't been learned about the DNA of extinct species during this period. I recently watched a rather good documentary on the National Geographic channel (yes, it does occasionally happen) about the one-month-old baby mammoth Lyuba, recovered in Siberia almost forty-two thousand years after she died. The amount of genetic information that has been recovered from mammoths is now extremely comprehensive, but then they were alive until almost yesterday on geological timescales. Needless to say, the further back in time a creature existed, the more problematic it is to retrieve any genetic material.

A lot has been written about the methods that have been, or could in the near future be, used to resurrect ancient animals. Some procedures involve the use of contemporary species as surrogate parents, such as elephants standing in for mother mammoths. But it seems fair to say that all such projects are finding difficulties rather greater than originally anticipated. One common misconception is that any resurrected animal would be a pure example of its kind. Even the numerous frozen mammoth carcasses have failed to supply anywhere near a complete genome, and of course it isn't just a case of filling in gaps as per a jigsaw puzzle: one primary issue is how to know where each fragment fits into the whole (the sketch below illustrates the problem). Our knowledge of genetics may have advanced enormously since Watson and Crick's landmark 1953 paper, but genetic engineering is still incredibly difficult even with species that are alive today. After all, Dolly the sheep wasn't a pure clone, but had nuclear DNA from one donor and mitochondrial DNA from another.
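The 'jigsaw' problem is essentially that of sequence assembly: a fragment can only be placed if it overlaps another whose position is known. A minimal greedy sketch (using invented fragments, not real mammoth data) shows the principle:

```python
# Greedy overlap assembly: repeatedly merge the pair of fragments with the
# longest suffix/prefix overlap. The fragments are invented for illustration.
def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

fragments = ["GATTACAGG", "ACAGGCCTA", "CCTATTGC"]

while len(fragments) > 1:
    # Find the ordered pair with the longest overlap and merge it.
    n, a, b = max((overlap(a, b), a, b)
                  for a in fragments for b in fragments if a is not b)
    fragments.remove(a)
    fragments.remove(b)
    fragments.append(a + b[n:])

print(fragments[0])  # GATTACAGGCCTATTGC
```

With three clean, unique fragments the answer pops straight out; with millions of short, error-prone reads full of repeated sections, knowing where each piece truly belongs becomes the hard part.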

Therefore, instead of resurrecting extinct species, we would be engineering hybrid genomes. Jurassic World took this process to the extreme with Indominus rex, a giant hybrid of many species including cuttlefish! Some research suggests that most of the original genes of any species over a million years old - and therefore including all dinosaurs - might never be recovered. Something terrible-lizard-ish may be built one day, but it would be closer to, say, a chicken with added teeth, a long bony tail and a serious attitude problem. In fact, George Lucas has been a key funder of the chickenosaurus project, which has aims along these lines. Let's hope he doesn't start building an army of them, totally obedient clones, ready for world domination…oh no, that was fiction, wasn't it?

But if - or more likely, when - creating variants of extinct species becomes possible, should we even attempt it? Apart from the formidable technical challenges, a lot of the drive behind it seems to be for populating glorified wildlife parks, or even worse, game reserves. The mock TV documentary series Prehistoric Park, for example, only contained large animals from various periods, frequently fierce carnivores, with no attention given to less conspicuous creatures or indeed flora. This gee-whiz mentality seems to follow a lot of the material written about de-extinction, masking some very serious long-term issues in favour of something akin to old-style menageries. Jurassic Park, in fact.

A big question that would be near impossible to answer in advance is whether such a species would be able to thrive or even survive in a climate far removed from the original, unless there was major genetic engineering just for such adaptive purposes. Again, the further back the animal lived, the less likely it is that there is a contemporary habitat close to the original. It may be possible to recreate glacial steppes suitable for some mammoth species, but what about the Earth of ten million or one hundred million years ago? Prehistoric Park got around the issue for its Carboniferous megafauna by housing them in a high oxygen enclosure, which is certainly a solution, if something of a fire hazard!

Any newly-created animal will lack the symbiotic microbial fauna and flora of the original era, but I've not seen much that tackles this issue. I suppose there could be a multi-stage process, starting with deliberate injections of material in vitro (or via the host mother). But once the animal is born, it will have to exist with whatever the local environment/habitat has to offer. The chimerical nature of the organism may help provide a solution, but again this takes the creature even further from the original.

Then there is the rather important issue of food. To his credit, Michael Crichton suggested in Jurassic Park that herbivorous dinosaurs swallowing gizzard stones might accidentally eat berries that their metabolism couldn't handle. It would be extremely expensive to maintain compounds large enough for megafauna that are constantly kept free of wind-blown, bird-dropped and otherwise invasive material dangerous to the animals.

If the hybrids were allowed free rein, what if they escaped or were able to breed naturally? Given a breeding population (as opposed to, say, sterilised clones), evolution via natural selection may lead them in a new direction. It would be wise to consider them as an integral part of the ecosystem into which they are placed, remembering Darwin's metaphor of ten thousand sharp wedges. Is there a possibility that they could out-compete modern species or in some other way exacerbate the contemporary high rate of extinction?

I've previously discussed the dangers of deliberate introduction of foreign species for biological control purposes: surely introducing engineered hybrids of extinct species is the ultimate example of this process? Or would there be a complete ban on natural reproduction for resurrected species, with each generation hand-reared from a bank of genetic material? At this point it should be clear that it isn't just the nomenclature that is confusing.

Some research has been undertaken to investigate the de-extinction of species whose demise during the past few centuries can clearly be blamed on humans, obvious examples being the Tasmanian tiger and the nine species of New Zealand moa. It could be claimed that this has more to do with alleviating guilt than serving a useful purpose (assuaging crimes against the ecosystem, as it were), but even in these cases the funds might be better turned towards more pressing issues. After all, a large proportion of amphibian species are currently endangered, largely due to direct human action. That's not to say that such money would then be available, since, for example, a wealthy business tycoon who wants to sponsor mammoth resurrection - and they do exist - wouldn't necessarily transfer their funding to engineering hardier crops or revitalising declining pollinating insect species such as bees.

As it happens, even species that existed until a few hundred years ago have left few useable fragments of DNA, the dodo being a prime example. That's not to say that it won't one day be retrievable, as shown by the quagga, which was the first extinct species to have its DNA recovered, via a Nineteenth Century pelt.

As Jeff Goldblum's chaos mathematician says in Jurassic Park, "scientists were so preoccupied with whether or not they could that they didn't stop to think if they should". Isn't that a useful consideration for any endeavour into the unknown? If there's one thing that biological control has shown, it is to expect the unexpected. The Romans may have enjoyed animal circuses, but we need to think carefully before creating a high-tech living spectacle, giving rather more consideration to the wider picture than currently appears to be the case.



Tuesday 23 December 2014

Easy fixes: simple corrections of some popular scientific misconceptions

A few months ago I finally saw the film 'Gravity', courtesy of a friend with a home theatre system. Amongst the numerous technical errors - many pointed out on Twitter by Neil deGrasse Tyson - was one that I hadn't seen mentioned: how rapidly Sandra Bullock's character acclimatised to the various space stations and spacecraft immediately after removing her EVA suit helmet. As far as I am aware, the former have nitrogen-oxygen atmospheres whilst the suits are oxygen-only, necessitating several hours of acclimatisation.

I may of course be wrong on this, and of course dramatic tension would be pretty much destroyed if such delays had to be woven into the plot, but it got me thinking that there are some huge fundamental errors propagated in non-scientific circles. Therefore my Christmas/Hanukkah/holiday season present is a very brief, easy-on-the-brain round-up of a few of the more obvious examples.

  1. The Earth is a perfect sphere.
    Nope, technically I think the term is 'oblate spheroid'. Basically, a planet's spin squashes the mass so that the polar diameter is less than the equatorial diameter. Earth is only about 0.3% flatter along its polar axis, but if you look at a photograph of Saturn you can see a very obvious squashing (a quick check of the figure follows below).
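That 0.3% falls straight out of the two radii. A minimal sketch, using the commonly quoted reference-ellipsoid values (approximate and from memory, so treat them as assumptions):

```python
# Flattening f = (equatorial - polar) / equatorial, radii in kilometres.
# Values are the commonly quoted WGS84 figures, rounded.
equatorial_km = 6378.1
polar_km = 6356.8

f = (equatorial_km - polar_km) / equatorial_km
print(f"Earth's flattening: {f:.3%}")  # about 0.33%
```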

  2. Continental drift is the same thing as plate tectonics.
    As a child I often read that these two were interchangeable, but this is not so. The former is the hypothesis that landmasses have moved over time, whilst the latter is the mechanism now accepted to account for this, with the Earth's crust moving over the slowly-flowing (but largely solid) mantle in large segments or plates.

    Meteorologist Alfred Wegener suggested the former in 1912, but it was largely pooh-poohed until the discovery of sea-floor spreading half a century later supplied the latter. As Carl Sagan often said, "extraordinary claims require extraordinary evidence".

  3. A local increase in cold, wet weather proves that global warming is a fallacy.
    Unfortunately, chaos theory shows that even the minutest of initial changes can cause major differences in outcome, hence weather forecasting being far from an exact science; a cold snap in one region says little about the long-term global trend.

    However, there is other evidence for the validity of this theory, fossil fuel lobbyists and religious fundamentalists aside. I haven't read anything to verify this, but off the top of my head I would suggest that glacial meltwater may dilute and divert the warm seawater that currently travels north-east across the Atlantic from the Gulf of Mexico (and prevents north-western Europe from having cold Canadian eastern seaboard-style winters). And then the Isles of Scilly off the Cornish coast may face as frosty a winter as the UK mainland!

  4. Evolution and natural selection are the same thing.
    Despite Charles Darwin's On the Origin of Species having been published in 1859, this mistake is as popular as ever. Evolution is simply the notion that a population within a parent species can slowly differentiate to become a daughter species, but until Darwin and Alfred Russel Wallace independently arrived at natural selection, there really wasn't a credible hypothesis for the mechanism (a toy simulation follows this entry).

    This isn't to say that there weren't attempts to provide one, it's just that none of them fit the facts quite as well as the elegant simplicity of natural selection. Of course today's technology, from DNA analysis to CAT scans of fossils, provides a lot more evidence than was available in the mid-Nineteenth Century. Gregor Mendel's breeding programmes were the start of genetics research that led to the modern evolutionary synthesis that has natural selection at its core.
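To make the mechanism concrete, here's a minimal simulation - every figure in it is invented - in which a variant with a small reproductive edge spreads through a population purely via differential reproduction, which is natural selection stripped to its core:

```python
import random

# Toy natural selection: variant 'B' has a 5% reproductive edge over 'A'.
# Population size, advantage and generation count are all invented figures.
random.seed(1)
POP, ADVANTAGE, GENERATIONS = 1000, 1.05, 100

population = ['A'] * 990 + ['B'] * 10   # 'B' starts as a rare variant

for _ in range(GENERATIONS):
    weights = [ADVANTAGE if ind == 'B' else 1.0 for ind in population]
    population = random.choices(population, weights=weights, k=POP)

print(f"'B' frequency after {GENERATIONS} generations: "
      f"{population.count('B') / POP:.0%}")
# A small, consistent edge is enough for 'B' to spread - no foresight needed.
```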

  5. And finally…freefall vs zero gravity.
    Even orbiting astronauts have been known to say that they are in zero gravity when they are most definitely not. The issue is due to the equivalence of gravity and acceleration, an idea worked on by luminaries such as Galileo, Newton and Einstein. If you find yourself in low Earth orbit - as all post-Apollo astronauts have been - then clearly you are still bound by our planet's gravity.

    After all, the Moon is roughly a thousand times further from the Earth's surface than the International Space Station (ISS), but it is kept in orbit by the Earth's pull (okay, so there is the combined Earth-Moon gravitational field, but I'm keeping this simple). By falling around the Earth at a certain speed, objects such as the ISS maintain a freefalling trajectory: too slow and the orbit would decay, causing the station to spiral inwards to a fiery end, whilst too fast would cause it to fly off into deep space (a sketch of the numbers follows this list).

    You can experience freefall yourself via such delights as an out-of-control plummeting elevator or a trip in an arc-flying astronaut training aircraft A.K.A. 'Vomit Comet'. I'm not sure I'd recommend either! Confusingly, there's also microgravity and weightlessness, but as it is almost Christmas we'll save that for another day.
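For the curious, that 'certain speed' has a simple form for a circular orbit: v = √(GM/r), with r measured from the Earth's centre. A minimal sketch, taking the ISS altitude as roughly 400 km (my assumption):

```python
import math

# Circular orbital speed v = sqrt(G * M / r), r measured from Earth's centre.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24       # kg
R_EARTH = 6.371e6        # m
ISS_ALTITUDE = 4.0e5     # m, roughly 400 km (assumption)
MOON_DISTANCE = 3.844e8  # m, mean Earth-Moon distance

for name, r in [("ISS", R_EARTH + ISS_ALTITUDE), ("Moon", MOON_DISTANCE)]:
    v = math.sqrt(G * M_EARTH / r)
    print(f"{name}: {v / 1000:.2f} km/s")
# ISS: ~7.67 km/s; the Moon, much further out, needs only ~1.02 km/s.
```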
There are no doubt numerous other, equally fundamental errors out there, which only goes to show that we could do with much better science education in our schools and media. After all, no-one would make so many mistakes of similar magnitude regarding the humanities, would they? Or, like the writer H.L. Mencken, would I be better off appreciating that "nobody ever went broke underestimating the intelligence of the (American) public"? I hope not!

Saturday 15 March 2014

Cutting remarks: investigating five famous science quotations

If hearing famous movie lines being misquoted seems annoying, then misquoted or misused science citations can be exasperating, silly or downright dangerous. To this end, I thought that I would examine five well-known science quotations to find the truth behind the soundbite. By placing the accurate (as far as I'm aware) words in the wider context in which they were said/written down/overheard by someone down the hallway, I may be able to recover the intended meaning, rather than the autopilot definition frequently used. Here goes:

1) God does not play dice (Albert Einstein)

Possibly Einstein's most famous line, it sounds like the sort of glib comment that could be used by religious fundamentalists to denigrate science in two opposing fashions: either Einstein is being facetious and therefore sacrilegious, or he supports an old-fashioned version of conventional Judeo-Christian beliefs in which God can be perceived in the everyday world. Talk about having your cake and eating it!

Einstein is actually supposed to have said: "It is hard to sneak a look at God's cards. But that he would choose to play dice with the world...is something that I cannot believe for a single moment." This gives us much more material to work with: it was actually a quote Einstein himself supplied to a biographer. Some years earlier he had communicated with physicist Max Born along similar lines: "Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."

So here is the context behind the quote: Einstein's well-known disbelief in the fundamental nature of quantum mechanics. As I've discussed in a previous post, Einstein's opinions on the most accurate scientific theory ever devised were completely out of step with the majority of his contemporaries - and physicists ever since. Of course we haven't yet got to the bottom of it; speaking as a non-scientist, I find the Copenhagen Interpretation nonsense. But then, many physicists have said something along the lines of: if you think you understand quantum mechanics, you haven't understood it. Perhaps at heart, Einstein was stuck in a Nineteenth Century mindset, unable to conceive of fundamental limits to our knowledge or that probability lies at the heart of reality. He spent decades looking for a deeper, more obviously comfortable, cause behind quantum mechanics. And as for his interest in the 'Old One', Einstein frequently denied belief in a Judeo-Christian deity and referred to himself as an agnostic: the existence of any presence worthy of the name 'God' being "the most difficult in the world". Now there's a quote worth repeating!

2) Science is a way of thinking much more than it is a body of knowledge (Carl Sagan)

As I've mentioned before, Bill Bryson's A Short History of Nearly Everything is chock full of the results of scientific investigation but rarely stops to consider the unique aspects that drive the scientific method, or even define the limits of that methodology. Sagan's full quote is: "Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility. If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along."

It is interesting because it states some aspects of science that are rarely discussed, such as its subjective rather than purely objective nature. As human beings, scientists bring emotions, selective memory and personal preferences into their work. In addition, the socio-cultural baggage we carry is hardly ever discussed until a paradigm shift (or just the plain, old-fashioned passage of time) lets us recognise the idiosyncrasies and prejudices embedded in research. Despite being subject to our frailties and the zeitgeist, these limitations, once recognised, are part of the strength of the discipline: they allow us, at least eventually, to discover their effect on what was once considered the most dispassionate branch of learning.

Sagan's repeated use of the word sceptical is also of great significance. Behind the multitude of experimental, analytical and mathematical methods in the scientific toolkit, scepticism should be the universal constant. As well as aiding the recognition of the biases mentioned above, the sceptical approach allows parsimony to take precedence over authority. It may seem a touch idealistic, especially for graduate students having to kowtow to senior faculty when seeking research positions, but open-minded young Turks are vital in overcoming the conservative old guard. Einstein's contempt for authority is well-known; he described unthinking respect for it as the greatest enemy of truth. I haven't read Stephen Jay Gould's Rocks of Ages: Science and Religion in the Fullness of Life, but from what I understand of his ideas, the distinction concerning authority marks a clear boundary worthy of his Non-Overlapping Magisteria.

3) The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain an agnostic (Charles Darwin)

From the original publication of On the Origin of Species in 1859 to the present day, one of the most prominent attacks on natural selection by devoutly religious critics has been the improbability of life starting without divine intervention. If we eventually find microbial life on Mars - or larger organisms on Titan, Europa or Enceladus - this may turn the tide against such an easy target, but one thing is for certain: Darwin did not attempt to detail the origin of life itself. Although he stated in a letter to a fellow scientist: "But if (and Oh! What a big if!) we could conceive in some warm little pond, with all sorts of ammonia and phosphoric salts, light, heat, electricity etc., present, that a protein compound was chemically formed ready to undergo still more complex changes", there are no such broad assumptions in his public writings.

As it turns out, Darwin may have got some of the details correct, although the 'warm little pond' is more likely to have been a deep-sea volcanic vent. But we are still far from understanding the process by which inert chemicals started to make copies of themselves. It's been more than sixty years since Harold Urey and Stanley Miller at the University of Chicago produced amino acids simply by recreating the conditions then thought to have existed on the early Earth. Despite numerous variations on this classic experiment in subsequent decades, we are little closer to comprehending the origin of life. So it was appropriate that Darwin, who was not known for flights of fancy (he once quipped "My mind seems to have become a kind of machine for grinding general laws out of large collections of facts"), kept speculation out of his strictly evidence-based publications.

Just as Darwin has been (at times, deliberately) misquoted by religious fundamentalists determined to undermine modern biology, his most vociferous disciple today, Richard Dawkins, has also been selectively quoted to weaken the scientific arguments. For example, printing just "The essence of life is statistical improbability on a colossal scale", as opposed to the full text from The Blind Watchmaker discussing cumulative natural selection, is a cheap literary device - and one that succeeds unless the reader is astute enough to investigate the original source material.

4) Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: 'Ye must have faith.' (Max Planck)

Thomas Henry Huxley (A.K.A. Darwin's Bulldog) once wrote that "Science is organized common sense where many a beautiful theory was killed by an ugly fact." But that was back in the Nineteenth Century, when classical physics ruled and scientists predicted a time in the near future when they would understand all the fundamentals of the universe. In these post-modern, quantum mechanical times, uncertainty (or rather, Uncertainty) is key, and common sense goes out of the window with the likes of entanglement, etc.

Back to Planck. It seems fairly obvious that his quote tallies closely with the physics of the past century, in which highly defined speculation and advanced mathematics join forces to develop hypotheses into theories long before hard evidence can be gleaned from the experimental method. Some of the key players in quantum physics have even furthered Copernicus' preference for beautiful mathematics over observation and experiment. Consider the one-time Lucasian Professor of Mathematics Paul Dirac's partiality for the beauty of equations over experimental results, even though he considered humanity's progress in maths to be 'feeble'. The strangeness of the sub-atomic world could be seen as a vindication of these views; another of Planck's quotes is "One must be careful, when using the word, real."

Leaving aside advanced physics, there are examples in the other scientific disciplines that confirm Planck's view. In the historical sciences, you can never know the full story. For example, fossils can provide some idea of how and when a species diverged into two daughter species, but not necessarily where and why (vis-à-vis ecological 'islands' in the wider sense). Not that this lack of precision should be taken as doubt of validity. As evolutionary biologist Stephen Jay Gould once said, a scientific fact is something "confirmed to such a degree that it would be perverse to withhold provisional assent." So what might appear to primarily apply to one segment of the scientific endeavour can be applied across all of science.

5) Space travel is utter bilge (Richard van der Riet Woolley, Astronomer Royal)

In 1956 the then-Astronomer Royal made a prediction that was thoroughly disproved five years later by Yuri Gagarin's historic Vostok 1 flight. The quote has been used ever since as an example of how blind obedience to authority is unwise. But Woolley's complete quote was considerably more ambiguous: "It's utter bilge. I don't think anybody will ever put up enough money to do such a thing...What good would it do us? If we spent the same amount of money on preparing first-class astronomical equipment we would learn much more about the universe...It is all rather rot." He went on to say: "It would cost as much as a major war just to put a man on the moon." In fact, the latter appears to be quite accurate, and despite the nostalgia now aimed at the Apollo era, the lack of any follow-up only reinforces the notion that the race to the moon was simply the ultimate example of Cold War competition. After all, only one trained geologist ever got there!

However, I'm not trying to defend the edited version of Woolley's inopportune statement since he appears to have been an armchair naysayer for several decades prior to his most famous quote. Back in 1936, his review of Rockets Through Space: The Dawn of Interplanetary Travel by the first president of the British Interplanetary Society (BIS) was even more pessimistic: "The whole procedure [of shooting rockets into space]...presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." Again, it might appear in hindsight that Woolley deserves scorn, were it not for the fact that nearly everyone with some knowledge of space and aeronautics was of a similar opinion, and the opposition were a few 'cranks' and the like, such as BIS members.

The moral of this story is that it is far from difficult to take a partial quote, or a statement out of context, and turn a sensible, realistic attitude (for its time and place) into an easy piece of fun. A recent tweet I saw was a plaintive request to read what Richard Dawkins actually says, rather than what his opponents claim he says. In a worst-case scenario, quote-mining makes it possible to imply the very opposite of an author's intentions. Science may not be one hundred percent provable, but it's by far the best approach we have to finding out that wonderful thing we humans call 'the truth'.

Monday 27 January 2014

An index of possibilities: defining science at a personal level

"If a little knowledge is dangerous, where is the man who has so much as to be out of danger?" - T.H. Huxley

With a sense of revitalisation following the start of a new year - and since misconceived notions of the scientific method are legion - I thought I should put my cards on the table and delineate my personal ideas of what I believe science to be.

I suppose you could say it's a self-learning exercise as much as anything. Most people consider science the least comprehensible of all disciplines: removed from everyday experience, accessible only to a select few (a.k.a. an intellectual elite), and lacking the creativity that drives so many other aspects of our lives. But hopefully the incredible popularity of British physicist Brian Cox and other photogenic scientist-cum-science-communicators is more than a passing fad and will help in the long term to break down this damaging myth. Science is both part and parcel of our existence and will only increase in importance as we try to resolve such vital issues as environmental degradation whilst still providing enough food and water for an ever-increasing population (fingers very much crossed on that one, folks!)

So here goes: my interpretation of the scientific method in ten bite-size, easy-to-swallow, chunks.
  1. A large amount of science is not difficult to comprehend
    Granted, theoretical high-energy physics is one of several areas of science difficult to describe meaningfully in a few short sound bites. But amidst the more abstruse volumes aimed at a popular readership there are some gems that break down the concepts to a level that retains the essential details without resorting to advanced mathematics. Evolutionary biologist Stephen Jay Gould noted that the fear of incompetence put many intelligent enthusiasts off learning science as a leisure activity, but with the enormous size of the popular science sections in many bookstores - there are over 840,000 books in Amazon.com's science section - there is no longer an excuse for not dipping a toe. Leaving physics aside, there are plenty of areas of science that are easy to understand too, especially the 'historical' disciplines such as palaeontology (more on that later).
  2. Science is not a collection of facts but a way of exploring reality
    This is still one of the most difficult things to convey. Bill Bryson's prize-winning best seller A Short History of Nearly Everything reminds me of the genre of boys'-own bumper books of true facts that were still around when I was a child: Victorian-style progress with a capital 'P', and science just a compilation of theories and facts akin to, say, history. The reality is of course rather more complicated. The scientific method is a way of examining nature via testable questions that can be resolved to a high degree of certainty using simplified models, either through practical experiments (repeatable and under 'laboratory conditions'), these days supplemented by computer simulations, or via mathematics.
  3. Science requires creativity, not just rigour
    The stereotype of scientists as rational, unemotional beings has been broken down over the past thirty years or so, but many non-scientists still have little idea of the creative thinking that can be involved in science, particularly in cutting-edge theorising. From Einstein's thought experiments - such as imagining what it would be like to ride alongside a beam of light - to the development of string theory, which has little likelihood of experimental confirmation in the near future, scientists need creative thought at least as much as data collation and hard mathematics.
  4. Scientists are only human
    Scientists are far from immune to the conditioned paths of thought ingrained by their social and cultural background. Rather than all scientists being equally adept at developing particular hypotheses, they are subject to the same whims and sense of normality as everyone else. In addition, individual idiosyncrasies can hinder a career. I've discussed previously how Einstein (who famously quipped that fate punished his contempt for authority by making him an authority himself) refused to accept some aspects of quantum theory long after his contemporaries had.
    Scientists could be said then to follow the stereotype visible elsewhere, namely that young radicals frequently evolve into old conservatives.
  5. If there's no proof, is it still science?
    Thomas Henry Huxley (a.k.a. Darwin's Bulldog) once said that the 'deepest sin against the human mind is to believe things without evidence'. Yet scientific hypotheses are sometimes formed prior to any support from nature or real-world experimentation. Although Charles Darwin had plenty of evidence from artificial selection when he wrote On the Origin of Species, the fossil record at the time was extremely patchy and he had no knowledge of Mendelian inheritance. In addition, the most prominent physicists of his day were unaware of nuclear fusion, and so their theories of how stars shone implied a solar system far too young for natural selection to be the primary mechanism of evolution. By sticking to his ideas in spite of these issues, was Darwin a poor scientist? Or is it feasible that many key advances require a leap of faith - a term unlikely to please Richard Dawkins - due to the lack of solid, physical evidence?
  6. Are there two schools of science?
    New Zealand physicist Ernest Rutherford once disparagingly remarked something along the lines of physics being the only real science, the other so-called scientific disciplines being mere stamp collecting. I prefer to think of science as being composed of historical and non-historical disciplines, only occasionally overlapping. For instance, cutting-edge technological applications of physics require repeatable, falsifiable experiments - hence the deemed failure of cold fusion - whilst the likes of meteorology, evolutionary biology and palaeontology deal in innumerable historical events and/or the complexities of chaos theory. As such, they are unlikely to provide duplicate circumstances for testing, and often cannot even be broken down into simplified models that can be accurately tested.
  7. An accepted theory is not necessarily final
    A theory doesn't have to be the absolute end of a quest. For example, Newton's law of universal gravitation had to wait over two centuries for Einstein's general theory of relativity to explain the mechanism behind the phenomenon. Although quantum mechanics is the most accurate theory ever developed (in terms of the match between theory and experimental results), the root cause of its effects is yet to be understood, with wildly varying interpretations offered instead. The flip side is that a hypothesis may fit the facts, but without an explanatory mechanism scientists may reject it as untenable. A well-known instance of this scientific conservatism (albeit for good reasons) was Alfred Wegener's hypothesis of continental drift, which only achieved orthodoxy decades later once plate tectonics was discovered.
  8. Scientific advance rarely proceeds by eureka moments
    Science is a collaborative effort. Few scientists work in a vacuum (except astronauts, of course!) Even the greatest of 'solo' theories, such as universal gravitation, was on the cards during Newton's lifetime, with contemporaries such as Edmond Halley working along similar lines. Unfortunately, our predilection for simple stories with identifiable heroes means that team leaders and thesis supervisors often receive the credit when many researchers have worked towards a goal. In addition, the priority rule is based on first publication, not on when a scientist formulated the idea, so many theories are named after scientists who were not the earliest discoverers or formulators. The work of unsung researchers is frequently neglected in favour of a simplified approach that glorifies one pioneer at the expense of many others.
  9. Science is restricted by the necessity of using language to describe it
    Richard Dawkins has often railed against Plato's idealism (a.k.a. essentialism), using the phrase 'the tyranny of the discontinuous mind'. I recall a prime example of this from my childhood, whilst contemplating a plastic model kit I had of a Neanderthal. I wondered how the human race had evolved: specifically, how could parents of a predecessor hominid species give birth to a modern human, i.e. a child of a different species? Of course, such discontinuity is nonsense, but it is surprising how frequently our minds interpret the world in this format of neat boundaries. A large part of the problem is how to treat transitional states as the norm when our language is bound up with fixed categories. In addition, we rely on metaphor and analogy to describe aspects of the universe that do not conform to everyday experience, the nature of quantum probability being an obvious example. As with the previous point on our innate need for heroes, we are always constructing narratives, thus restricting our ability to understand nature at a fundamental level.
  10. Science does not include a moral dimension
    Science, like nature, is neither moral nor immoral and cannot provide a framework for human behaviour. Of course, this doesn't prevent scientists from being greedy or stupid, or even just naïve: witness British evolutionary biologist J.B.S. Haldane, who recommended the use of poison gas as a war weapon on the grounds that it was more humane than conventional weapons (in terms of the ratio of deaths to temporary incapacitations). This suggests that non-scientists should be involved in the decision-making process for the funding of some science projects, especially those with clear applications in mind. But for this to be tenable, the public needs to be considerably more scientifically literate than at present. Otherwise the appalling scare-mongering engendered by the likes of the British tabloid press - think genetically modified crops labelled as 'Frankenstein foods' - will only make matters far worse. GM crops themselves are a perfect example of why the Hollywood approach of clear-cut heroes and villains fails with most of science. Reality is rarely black or white but requires careful analysis of the myriad shades of grey.
In conclusion, it might be said that there are as many variants of science as there are human beings. Unlike in many other disciplines, admitting mistakes and ignorance is a clear strength: as Darwin stated in The Descent of Man, 'Ignorance more frequently begets confidence than does knowledge.' Above all, there are aspects of science that are part and parcel of our everyday experience, and as such we shouldn't just consider it as something to save for special occasions.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined, I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read anything along similar lines, so let me explain, safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism. In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that of competing explanations, the one requiring the fewest assumptions is usually to be preferred.

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. Although his work is frequently mythologised (I follow the rolling-weights rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa brigade), Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.

Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely an attempt to understand how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever before seen such miniature marvels as bacteria. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who started the cataloguing methodology in use today. New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definition there is no bedrock for more sophisticated research.

The repetitive, largely practical nature of this phase in such disciplines as geology and taxonomy meant that untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose ranks Charles Darwin almost joined) who worked on the quantity-over-quality principle in collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to utilise the results in practical applications. Although in the BBC television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew contains 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration. In addition to the biota yet to be described in scientific records, the existing catalogues are undergoing major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, an effort that has so far identified 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of natural elements, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore how phenomena occur and how events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then the second phase might be described as holding a magnifying glass up to nature: reducing a phenomenon to an approximation and explaining how that approximation works.

For example, Newton took Galileo's and Kepler's astronomical work and ran with it, producing his Law of Universal Gravitation. The 'how' in this case is the inverse-square law, with its gravitational constant, describing how bodies orbit their common centre of gravity. However, Newton was unable to delineate what caused the force to act across infinite, empty space, an explanation that had to wait for stage three.
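To make the 'how' concrete, here's a minimal sketch (my own illustration using textbook values, not something from the original post) of just how much work Newton's single equation does while remaining entirely silent on the 'why':

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# It describes the force, and hence the orbits, with great accuracy,
# yet says nothing about what gravity actually is.
G = 6.674e-11        # gravitational constant (m^3 kg^-1 s^-2)
m_earth = 5.972e24   # mass of the Earth (kg)
m_moon = 7.342e22    # mass of the Moon (kg)
r = 3.844e8          # mean Earth-Moon distance (m)

force = G * m_earth * m_moon / r**2
print(f"Earth-Moon attraction: {force:.2e} N")   # ~1.98e20 N

# Basic mechanics then locates the common centre of gravity the pair orbit:
barycentre = r * m_moon / (m_earth + m_moon)
print(f"Barycentre: {barycentre / 1000:.0f} km from Earth's centre")
# ~4,670 km out, i.e. inside the Earth itself (radius ~6,371 km)
```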

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest a feedback cycle in which knowing which questions to ask is at least as important as gaining answers; the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), have each been principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Science is therefore a long way from recognising all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.

Though microelectronics in general and computers in particular have allowed the execution of experiments considered close to impossible by the finest minds only half a century ago - quantum teleportation, for example - there are several reasons why computer processing power is approaching a theoretical maximum using current manufacturing techniques and materials. The near future may therefore see a slowing down in the sort of leading-edge experimental science that has been achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. Physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, then just where do you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result of this is that the boundaries between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek ever more accurate approximations of phenomena, phase three includes the search for why one theory should be correct rather than another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's General Relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality, due to their incompatibility with quantum mechanics. Herein lies the rub!

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists, to the invocation of the laws of probability via a multiverse hypothesis, requiring an immense number of universes, each with different fundamental constants (and therefore including a lucky few capable of producing life). But here's the obvious issue: wouldn't Occam's Razor suggest the former is more likely than the latter? As Astronomer Royal Sir Martin Rees states, this is veering into metaphysical territory, an area which, except for scientists with religious convictions, is usually avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way to creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark; in the first days of their formulation, quarks were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Friday 15 March 2013

Preaching to the unconverted: or how to convey science to the devout

It's said that charity begins at home; likewise, perhaps, the promotion of science. A recent conversation I had with a pious Mormon started me thinking: just how do you promote science, both the method and the uncomfortable facts, to someone who has been raised to mistrust the discipline? Of course, there is a (hopefully) very small segment of the human race that will continue to ignore the evidence even when it is presented right in front of them, but considering those on the front line - such as biology teachers and 'outed' atheists in the U.S. Bible Belt - how do you present a well-reasoned set of arguments to promote the theory and practice of science?

It's relatively easy for the likes of Richard Dawkins to argue his case to large audiences of professionals or sympathetic listeners, but what is the best approach when endorsing science to a Biblical literalist on a one-to-one basis? The example above involved explaining just how we know the age of the Earth. This not being the first time I'd been asked, I was fully prepared to expound on the likes of uranium series dating (there's a minimal worked example of the principle just before this post's summary), and not having to mention the 'D' words (Darwin or Dawkins) made it a relatively easy task. To aid any fans of science who might find themselves in a similar position, I've put together a small toolkit of ideas, even if the conversation veers into that ultimate of controversial subjects, the evolution of the human race:
  1. A possible starting point is to be diffident, explaining the limitations of science and dispelling the notion that it is merely the catalogue of sundry facts it is sometimes described as (for example, in Bill Bryson's A Short History of Nearly Everything). It is difficult but nonetheless profitable to explain how once-accepted elements of scientific knowledge can ostensibly be surpassed by later theories, only to maintain their usefulness on a special-case basis. A good illustration of this is Newton's Law of Universal Gravitation, which describes the force of gravity but not what creates it. Einstein's General Theory of Relativity provides a solution, but Newton's Law is much easier to use, being accurate enough even to guide spacecraft. And since General Relativity cannot be combined with quantum mechanics, there is probably another theory waiting to be discovered…somewhere. As British astrophysicist and populariser John Gribbin has often pointed out, elements at the cutting edge of physics are sometimes only describable via metaphor, there being nothing within human experience that can be used as a comparison. Indeed, no-one has ever observed a quark, and in the early days of the theory some deemed it just a convenient mathematical model. As for string theory, it's as bizarre as many a creation myth (although you might not want to admit that bit).
  2. Sometimes (as can be seen with Newton and gravity) the 'what' is known whilst the 'why' isn't. Even so, scientists can use these partial theories to extrapolate potential 'truths' or even exploit them via technology. Semiconductors require quantum mechanics, a theory that no-one really understands; indeed, no less a figure than Einstein refused to accept many of its implications. There are many competing interpretations, some clearly more absurd than others, but that doesn't stop it being the most successful scientific theory ever, in terms of the correspondence between the equations and experimental data. So despite the uncertainty - or should that be Uncertainty (that's a pun, for the quantum mechanically-minded) - the theory is a cornerstone of modern physics.
  3. As far as I know, the stereotype of scientists as wild-haired, lab-coated, dispassionate and unemotional beings may stem from the Cold War, when the development of the first civilisation-destroying weapons led many to point their fingers at the inventors rather than their political paymasters. Yet scientists can be as creative as artists. Einstein conducted thought experiments, often aiming for a child-like simplicity, in order to obtain results. The idea that logic alone makes a good scientist is clearly bunkum. Hunches and aesthetics can prove as pivotal as experimental data or equations.
  4. Leading on from this, scientists are just as fallible as the rest of us. Famous examples range from Fred Hoyle's belief in the Steady State theory (and, strangely, that the original Archaeopteryx fossils are fakes) through to the British scientific establishment's forty-year failure to recognise that the Piltdown Man finds were crude fakes. However, it isn't always as straightforward as these examples: Einstein's greatest blunder - the cosmological constant - was abandoned after the expansion of the universe was discovered, only to reappear in recent years thanks to dark energy. And of course mistakes can prove more useful than finding the correct answer the first time!
  5. There are numerous examples of deeply religious scientists, from Kepler and Newton via Gregor Mendel, the founder of genetics, to the contemporary British particle physicist the Reverend John Polkinghorne. Unlike the good versus evil dichotomy promoted by Hollywood movies, it's rarely a case of us versus them.
  6. Although there are searches for final theories, such as a Grand Unified Theory of the fundamental forces, one aspect of current science that differs profoundly from the attitudes of a century or so ago is the acceptance that we may never find a final set of solutions. Indeed, a good experiment should generate as many new questions as it answers.
  7. If you feel that you're doing well, you could explain how easy it is to be fooled by non-existent patterns, and that our brains aren't really geared up for pure logic. It's quite easy to apparently alter statistics using left- or right-skewed graphs, or by putting a logarithmic scale on one axis. In addition, we recognise correlations that just aren't there but which we would like to think are true. In the case of my Mormon colleague, he was entrenched in the notion of UFOs as alien spacecraft! At this point you could even conduct an experiment: make two drawings, one of a constellation and one of evenly-spaced dots, and ask them to identify which one is random. Chances are they will pick the latter (there's a small code sketch after the image below showing why truly random points look so 'unrandom'). After all, every culture has seen pictures in the random placements of stars in the night sky (or the face of Jesus in a piece of toast).
[Image: Ursa Major (see what you like) vs evenly-spaced dots]
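For the digitally inclined, here's a minimal sketch of that experiment (entirely my own illustration, not from the original post). Genuinely random points clump and leave gaps - exactly the features our brains insist on reading as patterns:

```python
import random

# Scatter 100 points at random across a 10 x 10 grid of cells, then count
# the empty cells. Despite averaging one point per cell, genuine randomness
# leaves plenty of gaps and, by the same token, plenty of 'constellations'.
random.seed(42)   # any seed will do; fixed here so the run is repeatable

size = 10
points = [(random.random() * size, random.random() * size)
          for _ in range(size * size)]

occupied = {(int(x), int(y)) for x, y in points}
print(f"{size * size - len(occupied)} of {size * size} cells are empty")
# Expect roughly a third to be empty, since (1 - 1/100)^100 is about 0.37.
```

Evenly-spaced dots, by contrast, leave no gaps at all, which is precisely why they look 'random' to us while the genuinely random set doesn't.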
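And since the conversation that prompted this post concerned the age of the Earth, here is the promised worked example of the principle behind uranium series dating. It is heavily simplified and entirely my own sketch - real uranium series work involves multiple decay chains and corrections for any initial lead - but the core arithmetic is this:

```python
import math

# U-238 decays to Pb-206 with a half-life of ~4.468 billion years, so the
# ratio of daughter (Pb-206) to parent (U-238) atoms locked inside a mineral
# records how long ago that mineral crystallised.
HALF_LIFE_YEARS = 4.468e9
decay_constant = math.log(2) / HALF_LIFE_YEARS   # lambda in N = N0 * exp(-lambda * t)

def age_in_years(pb206_per_u238: float) -> float:
    """Age of a mineral, assuming it formed containing no Pb-206 at all."""
    return math.log(1 + pb206_per_u238) / decay_constant

# A crystal in which the Pb-206 has grown to equal the remaining U-238
# (a ratio of 1.0) must have sat through exactly one half-life:
print(f"{age_in_years(1.0):.2e} years")   # ~4.47e9: about the age of the Earth
```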

So to sum up:
  1. There's a fuzzy line at the cutting edge of physics and no-one understands what most of it means;
  2. We've barely started answering fundamental questions, and there are probably countless more we don't even know to ask yet;
  3. Science doesn't seek to provide comforting truths, only gain objective knowledge, but...
  4. ...due to the way our brains function we can never remove all subjectivity from the method;
  5. No one theory is the last word on a subject;
  6. Even prominent scientists easily make mistakes;
  7. And most of all, science is a method for finding out about reality, not a collection of carved-in-stone facts.
So go out there and proselytise. I mean evangelise. Err...spread the word. Pass on the message. You get the picture: good luck!

Monday 27 February 2012

Predators vs poisons: the ups and downs of biological control

Ever since Darwin, islands and island groups have been recognised as prominent natural laboratories of evolution. Their isolation leads to the radiation of species from a single common ancestor, the finches and giant tortoises of the Galapagos Islands providing a classic example. But a small population restricted in range also means that many island species are extremely susceptible to external factors, with rapid extinction the ultimate result - as can be seen from the dodo onwards. Living as I do on an island (New Zealand counts within the terms of this discussion, as I will explain) has led me to explore what a foreign invasion can do to a local population.

Either through direct hunting or the actions of imported Polynesian dogs and rats, almost half of New Zealand's native vertebrate fauna was wiped out within a few centuries of humans arriving; so much for the myth of pre-technological tribes living in ecological harmony! But the deliberate introduction of one species to prey on another is now a much-practised and scientifically supported technique. One of the late Stephen Jay Gould's most moving essays concerned the plight of the Partula genus of snails on the Society Islands of Polynesia. The story starts with the introduction of edible Achatina snails to the islands as food, only for some to escape and become an agricultural pest. In 1977 the cannibal wolfsnail Euglandina was brought in as a method of biological control, the idea being that it would eat the crop munchers. Unfortunately, the latest wave of immigrant gastropods ignored the Achatina and went after the local species instead. The results were devastating: in little more than a decade, many species of Partula had become extinct in their native habitat.

(As an interesting aside, the hero of Gould's Partula vs. Euglandina story is gastropod biologist Henry Crampton, whose half-century of research into the genus is presumably no longer relevant in light of the decimation of many of its species. Yet Crampton, born in 1875, worked in typical Victorian quantitative fashion, and during a single field trip managed to collect 116,000 specimens from just one island, Moorea. I have no idea how many individual snails existed at the time, but to me the removal of this enormous number from the breeding population in the name of scientific research was unlikely to do the genus any favours. I wonder whether comparable numbers of organisms are still being collected by researchers today: somehow I doubt it!)

The Society Islands are not the only place where the deliberate introduction of Euglandina has led to the unintended devastation of indigenous snail species: Hawaii's native Achatinella and Bermuda's Poecilozonites have suffered a similar fate to Partula. Gould used the example of Partula as a passionate plea (invoking 'genocide' and 'wholesale slaughter') to prevent further inept biological control programmes, but do these examples justify banning the method outright?

The impetus for this post came from a recent visit to my local wetlands reserve, where my daughters played junior field biologists and netted small fish in order to examine them in a portable environment container (alright, a jam jar) - before, of course, returning them to the stream alive. The main fish species they caught was Gambusia, which originates from the Gulf of Mexico but was introduced to New Zealand in the 1930s as a predator of mosquito larvae. However, akin to Euglandina, it has had a severe impact on many other fish species and is now rightly considered a pest. In fact, it's even illegal to keep them in a home aquarium, presumably just in case you accidentally aid their dispersal. Australia has also tried introducing Gambusia to control its mosquito population, but there is little data to show it works there either. Australia also provides a good illustration of environmental degradation via second- and third-hand problems originating from deliberate introduction: the cane toad was imported to control several previously introduced beetle species but instead rapidly decimated native fauna, including the amphibians and reptiles further up the food chain, via toad-vectored diseases.

[Image: Gambusia affinis, the aggressive mosquito fish - a big problem in a small fish]

This isn't to say that there haven't been major successes with the technique. An early example concerns a small insect called the cottony cushion scale, which began to have a major impact on citrus farming in late Nineteenth Century California. It was brought under control by the introduction of several Australian fly and beetle species, without any obvious collateral damage, as the military might phrase it. But considering the extinction history of New Zealand since humans arrived, I've been amazed to discover just how many organisms have been deliberately introduced here as part of biological control schemes, many in the past quarter-century. For instance, twenty-one insect and mite species have been brought over to stem the unrestrained growth of weeds such as ragwort and gorse, although the rates of success have been extremely mixed (old man's beard proving a complete failure, for example). As for controlling unwelcome fauna, a promising recent research programme involves the modification of parasites to inhibit possum fertility. This is something of a necessity, considering that possums (first imported from Australia in the 1830s and now numbering around sixty million) are prominent vectors of bovine tuberculosis.

Stephen Jay Gould was a well-known promoter of the importance of contingency within evolution, arguing that a re-run of any specific branch of life would most likely lead to a different outcome. So the question has to be asked: how do biologists test the effect of an outsider species on an ecosystem under laboratory conditions, when only time will show whether the outcome is as intended? No amount of research will show whether an unknown factor might, at an unspecified time during or after the eradication programme, have a negative impact. It could once have been argued that the relative cheapness of biological control compared to alternatives such as poisons or chemicals made it the preferable option, but I imagine the initial costs, involving lengthy testing cycles, mean that it is no longer a cut-price alternative.
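To see how quickly even a crude model exposes the danger, here's a toy predator-prey sketch. It is entirely my own invention with made-up parameters, not drawn from any of the programmes mentioned above, and it mimics the Partula scenario: a control agent introduced to eat a pest, but with a stronger appetite for a native species:

```python
# A Lotka-Volterra-style system with logistic prey growth, integrated by
# the Euler method: two prey species (the target pest and a native), plus
# one introduced predator that prefers the native. All numbers invented.
dt, steps = 0.01, 40_000                # integrate over 400 time units
pest, native, predator = 40.0, 40.0, 5.0

r_pest, r_native, K = 0.8, 0.6, 100.0   # prey growth rates and carrying capacity
a_pest, a_native = 0.05, 0.20           # attack rates: the native is preferred!
b, d = 0.1, 0.4                         # predator conversion efficiency and death rate

for _ in range(steps):
    eaten_pest = a_pest * pest * predator
    eaten_native = a_native * native * predator
    pest += dt * (r_pest * pest * (1 - pest / K) - eaten_pest)
    native += dt * (r_native * native * (1 - native / K) - eaten_native)
    predator += dt * (b * (eaten_pest + eaten_native) - d * predator)

print(f"pest: {pest:.1f}  native: {native:.2e}  predator: {predator:.2f}")
# With these invented numbers the predator thrives on the abundant pest
# while its preferred native prey is driven towards extinction: the
# Partula outcome in miniature.
```

Of course, a model like this only reveals the factors we thought to build into it; Gould's point is precisely that the real world keeps its unknown factors off the parameter list.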

Considering recent developments in genetic modification (GM), I wonder whether researchers have been looking into ways of minimising such unforeseen dangers. For example, what about tailoring the lifespan of the control organism, so that once the original invasive species has been eliminated, the predator also rapidly dies out (perhaps through something as simple as being unable to switch to an alternative food source, of which there are already many examples in nature)? Or does that sound too much like the replicant-designing Dr Eldon Tyrell in Blade Runner?

One promising recent use of GM organisms as a biological control method has been part of the fight to eradicate disease-carrying (female) mosquitoes. Any female offspring of the genetically altered male mosquitoes are incapable of flight and thus unable to infect humans or indeed reproduce. However, following extremely positive cage-based testing in Mexico, researchers appear to have got carried away with their achievements, and before you could say 'peer review' they conducted assessments directly in the wild in Malaysia, where I assume there is little GM regulation or public consultation. Test results from one location were thus extrapolated to another with a very different biota, without regard for knock-on effects, such as what unwelcome species might come out of the woodwork to fill the gap in the ecosystem. When the stakes are so high, the sheer audacity of the scientists involved appears breathtaking. Like Dr Tyrell, we play god at our peril; let us hope we don't come to an equally sticky end at the hands of our creations...