Friday, 21 December 2018
The Twelve (Scientific) Days Of Christmas
As Christmas approaches and we get over-saturated in seasonal pop songs and the occasional carol, I thought it would be appropriate to look at a science-themed variation to this venerable lyric. So without further ado, here are the twelve days of Christmas, STEM-style.
Wednesday, 12 December 2018
New neurons: astrocytes, gene therapy and the public fear of brain modification
Ever since the first cyberpunk novels of the early 1980s - and the massive increase in public awareness of the genre thanks to Hollywood - the idea of artificially-enhanced humans has been a topic of intense discussion. Either via direct augmentation of the brain or the development of a brain-computer interface (BCI), the notion of Homo superior has been associated with a dystopian near-future that owes much to Aldous Huxley's Brave New World. After reading about current research into repairing damaged areas of the brain and spinal cord, I thought it would be good to examine this darkly-tinged area.
Back in 2009 I posted about how science fiction has to some extent been confused with science fact; coupled with the fairly appalling quality of much mainstream media coverage of science stories, this has led to public fear where none is necessary and a lack of concern where there should be heaps. When it comes to anything suggestive of enhancing the mind, many people immediately fall back on pessimistic fictional examples, from Frankenstein to Star Trek's Borg. This use of anti-scientific material in the consideration of real-world STEM is not an optimal response, to say the least.
Rather than working to augment normal humans, real research projects on the brain are usually funded on the basis that they will generate improved medical techniques for individuals with brain or spinal cord injuries. However, a combination of the fictional tropes mentioned above and the plethora of internet-disseminated conspiracy theories, usually concerning alleged secret military projects, have caused the public to concentrate on entirely the wrong aspects.
The most recent material I have read concerning cutting-edge work on the brain covers three teams' use of astrocytes to repair damaged areas. This is an alternative to converting induced pluripotent stem cells (iPSCs) into nerve cells, an approach that has shown promise for many other types of cell. Astrocytes are amazing things, able to connect with several million synapses. Apparently Einstein's brain had far more of them than usual in the region associated with mathematical thinking. The big question is whether this accumulation was due to nature or nurture - the latter being the heavy exercise Einstein demanded of that region of his brain.
Astrocyte research for brain and spinal cord repair has been ongoing since the 1990s, in order to discover if they can be reprogrammed as functional replacements for lost neurons without any side effects. To this end, mice have been deliberately brain-damaged and then attempts made to repair that damage via converted astrocytes. The intention is to study if stroke victims could be cured via this method, although there are hopes that eventually it may also be a solution for Parkinson's disease, Alzheimer's and even ALS (motor neurone disease). The conversion from astrocyte to neuron is courtesy of a virus that introduces the relevant DNA, although none of the research has as yet proven that the converted cells are fully functional neurons.
Therefore, it would seem we are some decades away from claiming that genetic manipulation can cure brain-impairing diseases. But geneticists must share some of the blame for giving the public the wrong impression. The hyperbole surrounding the Human Genome Project gave both the public and medical workers a false sense of optimism regarding the outcome of the genome mapping. In the late 1990s, a pioneer gene therapist predicted that by 2020 virtually every disease would include gene therapy as part of the treatment. We are only just over a year short of this date, but most research is still in first-phase trials - and only concerns diseases that don't have a conventional cure. It turned out that the mapping was just the simplest stage of a multi-part programme to understand the complexities of which genes code for which disorders.
Meanwhile, gene expression in the form of epigenetics has inspired a large and extremely lucrative wave of pseudo-scientific quackery that belongs in the same genre as homeopathy, crystal healing and all the other New Age flim-flam that uses real scientific terminology to part the gullible from their cash. The poor standard of science education outside of schools (and in many regions, probably within them too) has led to the belief that changing your lifestyle can fix genetic defects or effect cures of serious brain-based illnesses.
Alas, although gene expression can be affected by environmental influences, we are ultimately at the mercy of what we inherited from our parents. Until the astrocyte research has been verified, or a stem cell solution found, the terrible truth is that the victims of strokes and other brain-based maladies must rely upon established medical treatments.
This isn't to say that we may in some cases be able to reduce or postpone the risk with a better lifestyle; diet and exercise (of both the body and brain) are clearly important, but they won't work miracles. We need to wait for the outcome of the current research into astrocytes and iPSCs to find out if the human brain can be repaired after devastating attacks from within or without. Somehow I doubt that Homo superior is waiting round the corner, ready to take over the world from us unenhanced humans…
Wednesday, 13 June 2018
Debunking DNA: A new search for the Loch Ness monster
I was recently surprised to read that a New Zealand genomics scientist, Neil Gemmell of Otago University, is about to lead an international team in the search for the Loch Ness monster. Surely, I thought, that myth has long since been put to bed and is only something exploited for the purposes of tourism? I remember some years ago that a fleet of vessels using side-scan sonar had covered much of the loch without discovering anything conclusive. Combined with the fact that the most famous photograph is a known fake, and the lack of evidence from the plethora of tourist cameras (never mind those of dedicated Nessie watchers) trained on the spot, the conclusion seems obvious.
I've put together a few points that don't bode well for the search, even assuming that Nessie is a 'living fossil' (à la coelacanth) rather than a supernatural creature; the usual explanation is a cold water-adapted descendant of a long-necked plesiosaur - last known to have lived in the Cretaceous Period:
- Loch Ness was formed by glacial action around 10,000 years ago, so where did Nessie come from?
- Glacial action implies no underwater caves for hiding in
- How could a lone creature maintain a long-term population (the earliest mentions date back thirteen hundred years)?
- What does such a large creature eat without noticeably reducing the loch's fish population?
- Why have no remains ever been found, such as large bones, even on sonar?
However, I then read that, separate from the headline-grabbing monster hunt, the expedition's underlying purpose concerns environmental DNA (eDNA) sampling, a type of test never before used at Loch Ness. Gemmell's team have proffered a range of scientifically valid reasons for their project (a toy code sketch of the eDNA idea follows the list):
- Survey the loch's ecosystem, from bacteria upwards
- Demonstrate the scientific process to the public (presumably versus all the pseudoscientific nonsense surrounding cryptozoology)
- Test for trace DNA from potential but realistic causes of 'monster' sightings, such as large sturgeon or catfish
- Understand local biodiversity with a view to conservation, especially as regards the effect caused by invasive species such as the Pacific pink salmon.
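To make that third point concrete, here is a minimal sketch - in Python, with invented sequences and species - of the core idea behind eDNA metabarcoding: short DNA fragments filtered from water samples are compared against a reference library of known barcodes, and each read is assigned to its closest match. Real surveys use far more rigorous pipelines and curated databases; this toy version simply scores base-by-base identity.

```python
# Toy eDNA metabarcoding: assign each 'read' from a water sample to the
# closest entry in a reference library. Sequences and species are invented
# for illustration; real pipelines use tools such as BLAST.

REFERENCE = {
    "Atlantic salmon": "ATGGCACTATTCCTAACA",
    "European eel":    "ATGGCTCTTTTCGTTACA",
    "Wels catfish":    "ATGACACTGTTCGTAACC",
}

def identity(a: str, b: str) -> float:
    """Fraction of matching bases over the shared length of two sequences."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

def best_match(read: str) -> tuple[str, float]:
    """Return the reference species with the highest identity score."""
    scores = {species: identity(read, seq) for species, seq in REFERENCE.items()}
    top = max(scores, key=scores.get)
    return top, scores[top]

sample_reads = ["ATGGCACTATTCCTAACA", "ATGACACTGTTCGTAACC", "ATGGCTCTTTTCGTAACA"]
for read in sample_reads:
    species, score = best_match(read)
    print(f"{read} -> {species} ({score:.0%} identity)")
```

If catfish or sturgeon sequences turned up with high identity while nothing matched any known reptile, that would be a tidy, if undramatic, explanation for many 'monster' sightings.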
Imagine if NASA could only get funding for Earth observation missions by including the potential to prove whether our planet was flat or not? (Incidentally, you might think a flat Earth was just the territory of a few nutbars, but a poll conducted in February this year suggests that fully two percent of Americans are convinced the Earth is a disk, not spherical).
Back to reality. Despite the great work of scientists who write popular books and hold lectures on their areas of expertise, it seems that the media - particularly Hollywood - are the primary source of science knowledge for the general public. Hollywood's version of de-extinction science, particularly for ancient species such as dinosaurs, seems to be far better known than the relatively unglamorous reality. Dr Beth Shapiro's book How to Clone a Mammoth, for example, is an excellent introduction to the subject, but would find it difficult to compete alongside the adventures of the Jurassic World/Park films.
The problem is that many if not most people want to believe in a world that is more exciting than their daily routine would suggest, with cryptozoology offering itself as an alternative to hard science thanks to its vast library of sightings over the centuries. Of course it's easy to scoff: one million tourists visit Loch Ness each year but consistently fail to find anything; surely in this case absence of evidence is enough to prove evidence of absence?
The Loch Ness monster is of course merely the tip of the mythological creature iceberg. The Wikipedia entry on cryptids lists over 170 species - can they all be just as suspect? The deep ocean is the best bet today for large creatures new to science. In a 2010 post I mentioned that the still largely unexplored depths could possibly contain unknown megafauna, such as a larger version of the oarfish that could prove to be the fabled sea serpent.
I've long had a fascination with large creatures, both real (dinosaurs, of course) and imaginary. When I was eight years old David Attenborough made a television series called Fabulous Animals and I had the tie-in book. In a similar fashion to the new Loch Ness research project, Attenborough used the programmes to bring natural history and evolutionary biology to a pre-teen audience via the lure of cryptozoology. For example, he discussed Komodo dragons and giant squid, comparing extant megafauna to extinct species such as the woolly mammoth and to mythical beasts, including the Loch Ness Monster.
A few years later, another television series that I avidly watched covered some of the same ground, namely Arthur C. Clarke's Mysterious World. No fewer than four episodes covered submarine cryptozoology, including the giant squid, sea serpents and of course Nessie him (or her) self. Unfortunately the quality of such programmes has plummeted since, although as the popularity of the (frankly ridiculous) Finding Bigfoot - seven years and still running - shows, the public have an inexhaustible appetite for this sort of stuff.
I've read it estimated that only about ten percent of extinct species have been discovered in the fossil record, so there are no doubt some potential surprises out there (Homo floresiensis, anyone?). However, the evidence - or lack thereof - seems firmly stacked against the Loch Ness monster. What is unlikely, though, is that the latest expedition will dampen the spirits of the cryptid believers. A recent wolf-like corpse found in Montana, USA, may turn out to be a coyote-wolf hybrid, but this hasn't stopped the Bigfoot and werewolf fans from spreading X-Files-style theories across the internet. I suppose it's mostly harmless fun, and if Professor Gemmell's team can spread some real science along the way, who am I to argue with that? Long live Nessie!
Monday, 11 September 2017
Valuing the velvet worm: noticing the most inconspicuous of species
Most of the recent television documentaries or books I've encountered that discuss extra-terrestrial life include some description of the weirder species we share our own planet with. Lumped together under the term 'extremophiles', these organisms appear to thrive in environments hostile to most other life forms, from the coolant ponds of nuclear reactors to the boiling volcanic vents of the deep ocean floor.
Although this coverage has rightly gained attention for these often wonderfully-named species (from snottites to tardigrades), there are numerous other lifeforms scarcely noticed by anyone other than a few specialists, quietly going about their unassuming business. However, they may provide a few useful lessons for all of us, including that we should acknowledge there may be unrecognised problems generated when we make rapid yet radical modifications to local environments.
There is a small, unassuming type of creature alive today that differs little from a marine animal present in the Middle Cambrian period around five hundred million years ago. I first read about onychophorans in Stephen Jay Gould's 1989 exposition on the Burgess Shale, Wonderful Life; although those fossil marine lobopodians are not definitively onychophorans, they are presumed to be ancestral. More commonly known by one genus, peripatus, or even more colloquially as velvet worms, there are at least several hundred species around today, possibly many more. The velvet component of their name is due to their texture, but they bear more resemblance to caterpillars than to worms. They are often described as the 'missing link' between arthropods and worms, but as is usually the case with that phrase, it is wildly inappropriate in the context of biological classification. The key difference from the Burgess Shale specimens is that today's velvet worms are fully terrestrial: there are no known marine or freshwater species.
Primarily resident in the southern hemisphere, the largely nocturnal peripatus shun bright light and require humid conditions to survive. Although there are about thirty species here in New Zealand, a combination of their small size (under 60mm long) and loss of habitat means they are rarely seen. The introduction of predators such as hedgehogs - who of course never meet peripatus in their northern hemisphere home territory - means that New Zealand's species have even more to contend with. Although I frequently (very carefully) look under leaf litter and inside damp logs on bush walks in regions known to contain the genus Peripatoides - and indeed where others have told me they have seen them - I have yet to encounter a single specimen.
There appears to be quite limited research, with fewer than a third of New Zealand species fully described. However, enough is known about two species to identify their population status as 'vulnerable'. One forest in the South Island has been labelled an 'Area of Significant Conservation Value' thanks to its population of peripatus, with the Department of Conservation relocating specimens prior to road development. Clearly, they had better luck locating velvet worms than I have had! It isn't just New Zealand that lacks knowledge of home-grown onychophorans either: in the past two decades Australian researchers have increased the number of their known species from just seven to about sixty.
Their uncanny resemblance to the Burgess Shale specimens, despite their transition from marine to terrestrial environments, has led velvet worms to be described by another well-worn phrase, 'living fossils'. However, is this short-hand in any way useful, or is it a lazy and largely inaccurate term? The recent growth in sophisticated DNA analysis suggests that even when outward anatomy has changed little, the genome itself may vary widely. Obviously DNA doesn't preserve in fossils and so any such changes cannot be tracked from the Cambrian specimens, but the genetic variation found in other types of organisms sharing a similar appearance shows that reliance on external anatomy alone can be deceptive.
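As a simple illustration of that last point, the p-distance between two aligned sequences - the proportion of sites at which they differ - is among the most basic measures used in molecular taxonomy. The fragments below are invented, but they show how two outwardly similar lineages can still be genetically far apart:

```python
# p-distance: the proportion of differing sites between two aligned
# sequences. The fragments are invented purely for illustration.

def p_distance(seq_a: str, seq_b: str) -> float:
    """Fraction of aligned positions at which the two sequences differ."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    differences = sum(a != b for a, b in zip(seq_a, seq_b))
    return differences / len(seq_a)

lineage_a = "ATGGCACTATTCCTAACAGGTCTT"  # hypothetical 'lookalike' species A
lineage_b = "ATGGCTCTCTTCCTGACAGGCCTA"  # hypothetical 'lookalike' species B
print(f"{p_distance(lineage_a, lineage_b):.1%} of sites differ")
```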
Due to lack of funding, basic taxonomic research, the bedrock of cladistics, is sadly lacking. In the case of New Zealand, some of the shortfall has been made up for by dedicated amateurs, but there are few new taxonomists learning the skills to continue this work - which is often seen as dull and plodding compared to the excitement of, for example, genetics. Most people might ask what interest there could be in such tiny, insignificant creatures as peripatus. After all, how likely would you be to move an ant's nest in your garden before undertaking some re-landscaping? But as shown by the changing terminology from 'food chains' to 'food webs', in most cases we still don't understand how the removal of one species might generate a domino effect on a local ecosystem.
I've previously discussed the over-reliance on 'poster' species such as giant pandas for environmental campaigns, but mere aesthetics don't equate to importance, either for us or for ecology as a whole. It is becoming increasingly clear that by weight the majority of our planet's biomass is microbial. Then come the insects, with the beetles prominent both by number of species and of individuals. We large mammals are really just the icing on the cake, and certainly when it comes to Homo sapiens, the rest of the biosphere would probably be far better off without us, domesticated species aside.
It would be nice to value organisms for themselves, but unfortunately our market economies require the smell of profit before they will lift a finger. Therefore if their usefulness could be ascertained, it might help generate greater financial incentive to support the wider environment. Onychophorans may seem dull, but several aspects of them are both interesting in themselves and might also provide something fruitful for us humans.
Firstly, they have an unusual weapon in the form of a mechanism that shoots adhesive slime at prey. Like spider silk, might this prove an interesting line of research for the materials or pharmaceutical industries? After all, it was the prickly burrs of certain plants that inspired the development of Velcro, whilst current studies of tardigrades (the tiny 'water bears' living amongst the mosses) are investigating their near-indestructibility. If even a single, tiny species becomes extinct, that genome is generally lost forever: who knows what insights it might have led to? Although museum collections can be useful, DNA does decay, and contamination leads to immense complexities in unravelling the original organism's genome. All in all, it's much better to have a living population to work on than to rely on what can be pieced together post-extinction.
In addition, for such tiny creatures, velvet worms have developed complex social structures; is it possible that analysis of their brains might be useful in computing or artificial intelligence? Of course such payoffs are unlikely - and extinction is nothing if not natural - but the current extinction rate is far greater than it has been outside of mass extinctions. Losing a large and obvious species such as the Yangtze River dolphin (and that was despite it being labelled a 'national treasure') is one thing, but how many small, barely-known plants and animals are going the same way without anyone noticing? Could it be that right now some minute, unassuming critter is dying out and that we will only find out too late that it was a vital predator of crop-eating pests like snails or disease vectors such as cockroaches?
It has been said that ignorance is bliss, but with so many humans needing to be fed, watered and treated for illness, now more than ever we need as much help as we can get. Having access to the complex ready-made biochemistry of a unique genome is surely easier than attempting to synthesise one from scratch or recover it from a long-dead preserved specimen? By paying minimal attention to the smallest organisms that lie all around us, we could be losing so much more than just an unobtrusive plant, animal or fungus.
We can't save every species on the current endangered list, but more attention could be given to the myriad life forms that get side-lined by the cute and cuddly flagship species, usually large animals. Most of us would be upset by the disappearance of the eighteen hundred or so giant pandas still left in the wild, but somehow I doubt their loss would have as great an impact on the surrounding ecosystem as that of some far less well-known flora or fauna. If you think that's nonsense, then consider the vital roles that bees and dung beetles play in helping human agriculture.
Although the decimation of native New Zealand wildlife has led to protective legislation for all our vertebrates and a few famous invertebrates such as giant weta, the vast majority of other species are still left to their own devices. Not that ecosystems in most other countries are given any more support, of course. But without funding for basic description and taxonomy, who knows what is even out there, never mind whether it might be important to humanity? Could it be that here is a new field for citizen scientists to move into?
Needless to say, the drier climes brought on by rising temperatures will not do peripatus any favours, thanks to its need to remain in damp conditions. Whether by widespread use of the poison 1080 (in the bid to create a pest-free New Zealand by 2050) or the accidental importation of a non-native fungus such as those decimating amphibians worldwide and causing kauri dieback in New Zealand, there are plenty of ways that humans could unwittingly wipe out velvet worms et al. So next time you watch a documentary on the demise of large, familiar mammals, why not spare a thought for all those wee critters hiding in the bush, going about their business and trying to avoid all the pitfalls us humans have unthinkingly laid for them?
Friday, 11 August 2017
From steampunk to Star Trek: the interwoven strands between science, technology and consumer design
With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras, for example, now reach worldwide sales of over five million per year. That's hardly a niche market!
Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that whilst miniaturisation has reduced the energy consumption of many smaller devices - from the Frankenstein-laboratory appearance of valve-based computing and room-sized mainframes down to the smartwatch et al - the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken that perennial favourite example, the space rocket, when it comes to power usage.
The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made - telescopes or crystal radios, for example. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravitational-wave detectors or CRISPR-Cas9 genome editing tools any time soon.
The current trend in favour - or at least acknowledgement - of sustainable development is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) equates to progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to be five main phases since the late Victorian period:
- Imperial steam
- Streamlining and speed
- The Atomic Age
- Minimalism and information technology
- Virtual light
1) Imperial steam
In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that seemingly without effort has an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.
I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of the design of this phase, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that had failed to go beyond the aeolipile toy turbine and the Antikythera mechanism.
2) Streamlining and speed
From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, whereby the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific Class steam locomotives.
3) The Atomic Age
By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. It seems to have been an ironic move that somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (classical physics) model of the atom itself gained a key place in post-war design.

Combined with rockets and space the imagery could readily be termed 'space cadet', but it wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised x-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (which famously gave Crick and Watson the edge in winning the race to understand the structure of DNA).
Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.
4) Minimalism and information technology
From the early 1970s the bright, optimistic designs of the previous quarter century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied if refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video games consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.
5) Virtual light
With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to computer holographic interfaces in many Hollywood blockbusters represent both the actual awesome power required by the likes of the Large Hadron Collider and the visually unspectacular realities of lasers and quantum teleportation - the ultimate fusion (sorry, couldn't resist that one) being the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek: Into Darkness.
Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.
Monday, 28 September 2015
Resurrecting megafauna: the various problems of de-extinction
The record-breaking success of Jurassic World proves that if there's anything a lot of people want to see in the animal kingdom it is species that are both large and fierce. Unfortunately, in these post-glacial times that type of fauna has been much reduced and will no doubt wane even further - not that I particularly wish to encounter an apex predator at close quarters, you understand.
Hollywood, of course, has much to answer for. There was plenty of poor science in the original Jurassic Park movie - the use of gap-filling frog DNA being a far worse crime in my book than the over-sized velociraptors (think Achillobator and similar species) - but the most recent film in the franchise has pointedly ignored the advances in dinosaur knowledge made in the intervening period. Perhaps a CGI test of a feathered T. rex looked just too comical?
In contrast, the amount of publicly available material discussing de-extinction has increased exponentially in the two decades since Jurassic Park was released, with the line between fact and fiction well and truly blurred. That's not to say that an enormous amount hasn't been learned about the DNA of extinct species during this period. I recently watched a rather good documentary on the National Geographic channel (yes, it does occasionally happen) about the one-month-old baby mammoth Lyuba, recovered in Siberia almost forty-two thousand years after she died. The amount of genetic information that has been recovered from mammoths is now extremely comprehensive, but then they were alive until almost yesterday at geological timescales. Needless to say, the further back in time a creature existed, the more problematic it is to retrieve any genetic material.
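To put rough numbers on that, one widely cited estimate - from a 2012 study of moa bones by Allentoft and colleagues - gives DNA a half-life of around 521 years at about 13°C, with cold conditions such as permafrost slowing the decay enormously. The sketch below assumes simple per-bond exponential decay and an arbitrary fifty-fold slowdown for frozen remains, so treat it as a back-of-envelope illustration rather than real kinetics:

```python
# Back-of-envelope DNA survival under per-bond exponential decay.
# The ~521-year half-life comes from the 2012 moa study (at roughly 13 C);
# the fifty-fold permafrost slowdown is an assumption purely for illustration.

MOA_HALF_LIFE_YEARS = 521

def intact_bond_fraction(years: float, half_life: float) -> float:
    """Fraction of DNA backbone bonds still unbroken after a given time."""
    return 0.5 ** (years / half_life)

scenarios = [
    ("moa bone, temperate burial", 42_000, MOA_HALF_LIFE_YEARS),
    ("mammoth in permafrost (assumed 50x slower)", 42_000, MOA_HALF_LIFE_YEARS * 50),
    ("dinosaur remains, same generous assumption", 66_000_000, MOA_HALF_LIFE_YEARS * 50),
]
for label, age, half_life in scenarios:
    print(f"{label}: {intact_bond_fraction(age, half_life):.2e} of bonds intact")
```

Even with a generous half-life, the dinosaur figure underflows to zero - which is rather the point.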
A lot has been written about the methods that have been, or could in the near future be, used to resurrect ancient animals. Some procedures involve the use of contemporary species as surrogate parents, such as elephants standing in for mother mammoths. But it seems fair to say that all such projects are encountering difficulties rather greater than originally anticipated. One common misconception is that any resurrected animal would be a pure example of its kind. Even the numerous frozen mammoth carcasses have failed to supply anywhere near a complete genome, and of course it isn't just a case of filling in gaps as per a jigsaw puzzle: one primary issue is how to know where each fragment fits into the whole. Our knowledge of genetics may have advanced enormously since Watson and Crick's landmark 1953 paper, but genetic engineering is still incredibly difficult even with species that are alive today. After all, Dolly the sheep wasn't a pure clone, but had nuclear DNA from one donor and mitochondrial DNA from another.
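To make the jigsaw analogy concrete, here's a minimal Python sketch of the placement problem. The sequences and the find_placements helper are invented purely for illustration, not any lab's actual pipeline: a short fragment that happens to come from a repetitive stretch of genome matches a reference at several positions equally well, so the sequence alone can't tell you where it belongs.

```python
# Toy illustration of the 'jigsaw' problem: a short fragment taken from
# a repetitive region of the genome matches the reference at several
# positions equally well, so its true home cannot be determined from
# the sequence alone. Sequences here are invented for demonstration.

def find_placements(reference: str, fragment: str) -> list[int]:
    """Return every position at which the fragment matches exactly."""
    return [i for i in range(len(reference) - len(fragment) + 1)
            if reference[i:i + len(fragment)] == fragment]

# A made-up stand-in for a related species' genome, containing a repeat.
reference = "ACGTTACGGACGTTACGGTTACGTTACGG"
fragment = "ACGTTACGG"  # a nine-base 'recovered' fragment

print(find_placements(reference, fragment))  # -> [0, 9, 20]: three equally valid homes
```

Real ancient-DNA pipelines face this at vastly greater scale - reads a few dozen bases long, chemically damaged, mapped against genomes of billions of bases - which is why a close living relative is needed as scaffolding, and why the end result is a reconstruction rather than a recovery.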
Therefore, instead of resurrecting extinct species we would be engineering hybrid genomes. Jurassic World took this process to the extreme with Indominus rex, a giant hybrid of many species including cuttlefish! Some research suggests that most of the original genes of any species over a million years old – and that therefore includes all dinosaurs – might never be recovered. Something terrible lizard-ish may be built one day, but it would be closer to, say, a chicken with added teeth, a long bony tail and a serious attitude problem. In fact, George Lucas has been a key funder of the 'chickenosaurus' project, which has aims along these lines. Let's hope he doesn't start building an army of them, totally obedient clones, ready for world domination… oh no, that was fiction, wasn't it?
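To see why a million years is treated as a practical barrier, consider a back-of-envelope sketch assuming the roughly 521-year half-life for short DNA fragments that Allentoft and colleagues estimated from moa bones in 2012. The numbers below are illustrative only; real decay rates vary enormously with temperature and burial conditions.

```python
import math

# Back-of-envelope DNA decay, assuming the ~521-year half-life for short
# (roughly 240 bp) fragments estimated from moa bones by Allentoft et al.
# (2012). Illustrative only: real rates depend heavily on temperature,
# water and burial chemistry.
HALF_LIFE_YEARS = 521

def log10_survival(age_years: float) -> float:
    """log10 of the fraction of short fragments expected to remain intact."""
    return (age_years / HALF_LIFE_YEARS) * math.log10(0.5)

for label, age in [("mammoth (Lyuba)", 42_000),
                   ("one million years", 1_000_000),
                   ("end-Cretaceous", 66_000_000)]:
    print(f"{label:>17}: about 1 in 10^{-log10_survival(age):,.0f} fragments survive")
```

A gram of permafrost bone contains an astronomical number of DNA copies, so odds of one in 10^24 still leave mammoth fragments to sequence; no conceivable number of copies survives odds of one in 10^578.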
But if – or more likely, when – creating variants of extinct species becomes possible, should we even attempt it? Apart from the formidable technical challenges, a lot of the drive behind it seems to be for populating glorified wildlife parks or, even worse, game reserves. The mock TV documentary series Prehistoric Park, for example, contained only large animals from various periods, frequently fierce carnivores, with no attention given to less conspicuous creatures or indeed flora. This gee-whiz mentality pervades a lot of the material written about de-extinction, masking some very serious long-term issues in favour of something akin to old-style menageries. Jurassic Park, in fact.
A big question, near-impossible to answer in advance, is whether such a species would be able to thrive or even survive in a climate far removed from the original, unless major genetic engineering were undertaken for just such adaptive purposes. Again, the further back the animal lived, the less likely it is that a contemporary habitat close to the original exists. It may be possible to recreate glacial steppe suitable for some mammoth species, but what about the Earth of ten million or one hundred million years ago? Prehistoric Park got around the issue for its Carboniferous megafauna by housing them in a high-oxygen enclosure, which is certainly a solution, if something of a fire hazard!
Any newly-created animal will lack the symbiotic microbes (gut flora and the like) of the original era, but I've not seen much that tackles this issue. I suppose there could be a multi-stage process, starting with deliberate injections of material in vitro (or via the host mother). But once the animal is born it will have to exist on whatever the local environment and habitat have to offer. The chimerical nature of the organism may help provide a solution, but again this takes the creature even further from the original.
Then there is the rather important issue of food. To his credit, Michael Crichton suggested in Jurassic Park that herbivorous dinosaurs swallowing gizzard stones might accidentally eat berries that their metabolism couldn't handle. It would be extremely expensive to keep compounds large enough for megafauna constantly free of wind-blown, bird-dropped and otherwise invasive material dangerous to the animals.
If the hybrids were allowed free rein, what if they escaped or were able to breed naturally? Given a breeding population (as opposed to, say, sterilised clones), evolution via natural selection may lead them in a new direction. It would be wise to consider them an integral part of the ecosystem into which they are placed, remembering Darwin's metaphor of ten thousand sharp wedges. Is there a possibility that they could out-compete modern species or in some other way exacerbate the contemporary high rate of extinction?
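Ecologists would frame that last question with something like the classic Lotka-Volterra competition model. Here's a toy version, every parameter invented purely for illustration, showing how even a modest competitive edge lets an introduced population displace a native one:

```python
# Toy Lotka-Volterra competition between a native species (N) and an
# introduced hybrid (H). Every parameter here is invented for
# illustration; real ecological risk assessment is far more involved.
r_n, r_h = 0.5, 0.5        # intrinsic growth rates
K_n, K_h = 1000.0, 1000.0  # carrying capacities
a_nh, a_hn = 1.2, 0.8      # competition coefficients: the hybrid presses harder

N, H = 1000.0, 10.0        # start: natives at capacity, a handful of escapees
dt = 0.1
for _ in range(20_000):    # simple Euler integration over 2,000 time units
    dN = r_n * N * (1 - (N + a_nh * H) / K_n)
    dH = r_h * H * (1 - (H + a_hn * N) / K_h)
    N, H = max(N + dN * dt, 0.0), max(H + dH * dt, 0.0)

print(f"long-run outcome: natives ≈ {N:.0f}, hybrids ≈ {H:.0f}")
# With these numbers the natives are driven towards zero - competitive
# exclusion - despite a hundredfold head start.
```

None of this predicts what a real resurrected hybrid would do, but it shows how quickly "could they out-compete modern species?" becomes a quantitative question rather than a rhetorical one.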
I've previously discussed the dangers of deliberate introduction of foreign species for biological control purposes: surely introducing engineered hybrids of extinct species is the ultimate example of this process? Or would there be a complete ban on natural reproduction for resurrected species, with each generation hand-reared from a bank of genetic material? At this point it should be clear that it isn't just the nomenclature that is confusing.
Some research has been undertaken to investigate the de-extinction of species whose demise during the past few centuries can clearly be blamed on humans, obvious examples being the Tasmanian tiger and the nine species of New Zealand moa. It could be claimed that this has more to do with alleviating guilt than serving a useful purpose (assuaging crimes against the ecosystem, as it were), but even in these cases the funds might be better directed towards more pressing issues. After all, a substantial fraction - by some estimates around a third - of amphibian species are currently threatened, largely due to direct human action. That's not to say that such money would then be available: a wealthy business tycoon who wants to sponsor mammoth resurrection - and they do exist - wouldn't necessarily transfer their funding to engineering hardier crops or revitalising declining pollinating insects such as bees.
As it happens, even species that existed until a few hundred years ago have left few usable fragments of DNA, the dodo being a prime example. That's not to say that it won't one day be retrievable, as shown by the quagga, which was the first extinct species to have its DNA recovered, via a nineteenth-century pelt.
As Jeff Goldblum's chaos mathematician puts it in Jurassic Park, scientists "were so preoccupied with whether or not they could that they didn't stop to think if they should". Isn't that a useful consideration for any endeavour into the unknown? If there's one thing that biological control has shown, it is to expect the unexpected. The Romans may have enjoyed animal circuses, but we should give rather more consideration to the wider picture than currently appears to be the case before creating a high-tech living spectacle of our own.