Sunday, 18 July 2021

The uncertainty principle: does popular sci-comm imply more than is really known?

Over the years I've examined how ignorance in science can be seen as a positive thing and how it can be used to define the discipline, a key contrast to most religions. We're still a long way from understanding many fundamental aspects of the universe, but the religious fundamentalist (see what I did there?) mindset is seemingly unable to come to terms with this position and so incorporates lack of knowledge into arguments disparaging science. After all, the hackneyed train of thought goes, scientific theories are really just that: ideas, not something proven beyond all possible doubt. Of course this isn't the case, but thanks to the dire state of most school science education - with its emphasis on exams and fact-stuffing rather than on what science really is (a group of methods, not a collection of facts), let alone anything that tries to teach critical thinking - you can see why some people fall prey to the disinformation that most science isn't proven to any degree of certainty.

With this in mind, you have to wonder what proportion of general-audience science communication describes theories with far more certainty than is warranted, glossing over the dearth of data and the consequent partial reliance on inferred reasoning. Interestingly, the complete opposite used to be a common claim; in the nineteenth century, for example, the composition of stars was thought to be forever unknowable, yet thanks to spectroscopy that particular 'unknowable' became measurable from the 1860s onwards. It is presumably the speed of technological change today that has reduced that negativity, but it can play into the anti-rationalist hands of religious hardliners if scientists claim absolute certainty for any particular theory (the Second Law of Thermodynamics excepted).

As it is, many theories are based on a limited amount of knowledge (both evidential and mathematical) and rely on experts filling in the gaps. As an aside, the central tenet of evolution by natural selection really isn't one of these: the various sources of evidence, from fossils to DNA, provide comprehensive support for the theory. However, there are numerous other areas which rely on a smattering of physical evidence and a lot of inference. This isn't to say the latter is wrong - Nobel-winning physicist Richard Feynman once said that a scientific idea starts with a guess - but to a non-specialist this approach can appear somewhat slapdash.

Geophysics appears to rely on what a layman might consider vague correlations rather than exact matches. For example, indirect observation techniques such as measuring seismic waves have allowed the mapping of the interior composition of the Earth; unless you are an expert in the field, the connection between the experimental results and such clear-cut zones can seem more like guesswork. Similarly, geologists have been able to create maps of the continental plates dating back around 600 million years, before which the position of land masses hasn't been so much vague as completely unknown.
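
To see why it isn't guesswork, here's a minimal sketch of the classic two-layer refraction exercise from introductory seismology (a toy of my own with assumed numbers, nothing like the global tomography actually used to map the deep interior): the distance at which refracted arrivals overtake direct ones pins down the depth of a hidden velocity boundary.

    # Toy seismic refraction example (illustrative only): in a flat two-layer
    # model, the 'crossover distance' - where waves refracted along a deeper,
    # faster layer start arriving before the direct waves - reveals the depth
    # of the boundary between the layers.
    import math

    def boundary_depth(crossover_km: float, v1: float, v2: float) -> float:
        """Depth to the interface, given the crossover distance and the
        assumed seismic velocities (km/s) of the upper and lower layers."""
        return 0.5 * crossover_km * math.sqrt((v2 - v1) / (v2 + v1))

    # Hypothetical survey: direct and refracted P-waves arrive together
    # 120 km from the source, with assumed layer velocities of 6 and 8 km/s.
    print(f"Boundary depth: {boundary_depth(120, 6.0, 8.0):.1f} km")

Real surveys use many stations and far richer models, but the principle - measured travel times constraining structure we can never see directly - is the same one that, scaled up, maps the mantle and core.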

The time back to the Cambrian is less than fifteen percent of the age of our 4.5-billion-year-old planet. This (hopefully) doesn't keep the experts up at night, as well-understood geophysical forces mean that rock is constantly being subducted underground, to be transformed and so no longer available for the record. In addition, for its first 1.3 billion years the planet's surface would have been too hot to allow plates to form. Even so, the position of the continental crust from the Cambrian period until today is mapped to a high level of detail at frequent time intervals; this is because enough is known of the mechanisms involved that if a region at the start of a period is in position A and is later found at position Z, it must have passed through intermediate positions B through Y en route.
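
As a crude illustration of that last inference (entirely my own toy, with made-up coordinates; real reconstructions use finite rotations on a sphere rather than straight lines):

    # If a crustal block is reconstructed at position A in the Cambrian and is
    # found at position Z today, the known plate-motion mechanisms let us fill
    # in the intermediate positions B to Y at regular time steps.
    def interpolate_positions(pos_start, pos_end, t_start_ma, t_end_ma=0, steps=10):
        """Linear interpolation of (lat, lon) between two reconstructed positions."""
        lat0, lon0 = pos_start
        lat1, lon1 = pos_end
        track = []
        for i in range(steps + 1):
            f = i / steps  # fraction of the journey completed
            age = t_start_ma + f * (t_end_ma - t_start_ma)
            track.append((age, lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)))
        return track

    # Hypothetical block: position A at 540 Ma, position Z at the present day.
    for age, lat, lon in interpolate_positions((-30.0, 10.0), (45.0, 170.0), 540):
        print(f"{age:5.0f} Ma: lat {lat:6.1f}, lon {lon:6.1f}")

In reality the motions are rotations about poles on a sphere and are anything but constant in speed, but the logic of constraining the in-between states from well-understood mechanisms is the same.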

One key geological puzzle related to the building and movement of continental rock strata is known as the Great Unconformity: essentially a 100-million-year gap in the record that occurs in numerous locations worldwide, covering the period when complex multicellular life arose. In some locales the gap expands both forwards and backwards to as much as a billion years of missing rock; that's a lot of vanished material! Most of the popular science I've read tends to downplay the absent strata, presumably because in the 150 years since the Great Unconformity was first noticed there hasn't been a comprehensive resolution of its cause. The sheer scale of the issue might seem to suggest a profound level of ignorance within geology. Yes, it is a challenge, but it doesn't negate the science in its entirety; on the other hand, it's exactly the sort of problem that fundamentalists can use as ammunition to promote their own versions of history, such as young Earth creationism.

In recent decades, the usually conservative science of geology has been examining the evidence for an almost global glaciation nicknamed 'Snowball Earth' (or 'Slushball Earth', depending on how widespread you interpret the evidence for glaciation). It appears to have occurred several times in the planet's history, with the strongest evidence for the episodes between 720 and 635 million years ago. What is so important about this era is that it is precisely the time (at least in geological terms) when, after several billion years of microbial life, large, sophisticated multicellular organisms rapidly evolved, culminating in the inaccurately-titled Cambrian explosion.

All in all then, the epoch in question is extremely important. But just how are the Great Unconformity, global glaciation and the evolution of complex biota connected? Since 2017, research including work from three Australian universities has led to the publication of the first tectonic plate map centred on this critical period. Using various techniques, including measuring the oxygen isotopes within zircon crystals, the movements of the continents have been reconstructed further back in time than ever before. The resulting hypothesis is a neat one (perhaps overly so, although it appears to be tenable): the top three to five kilometres of surface rock was first eroded by glacial activity, then washed into the oceans - where the minerals kick-started the Ediacaran and early Cambrian biota - before being subducted by tectonic activity.

The conclusion doesn't please some skeptics, but the combined evidence, including the erosion of impact craters and a huge increase in sedimentation during the period, gives further support, with the additional inference that the eroded material raised the seafloor and so created an immense expanse of shallow marine environments available as new ecological niches. In addition, the glacial scouring of the primary biominerals calcium carbonate, calcium phosphate and silicon dioxide into the oceans altered the water chemistry and could have paved the way for the first exoskeletons and hard shells, both by providing their source material and by generating a need for them in the first place, as protection from that altered chemistry.

Deep-time thermochronology isn't a term most of us are familiar with, but the use of new dating techniques is beginning to suggest solutions to some big questions. Not that there aren't plenty of other fundamental questions (the nature of non-baryonic matter and dark energy, anyone?) still to be answered. The scale of the unknown should not be used to denigrate science; not knowing something doesn't mean science isn't the tool for the job. One of its more comforting aspects (at least to its practitioners) is that good science always generates more questions than it answers. Expecting simple, easy, straightforward solutions should be left to those human endeavours that relish just-so stories. While working theories are often elegant and simpler than the alternatives, we should treat the filling in of gaps as a necessity, not as a weapon with which to invalidate the scientific method or its discoveries.

Tuesday, 15 June 2021

Meat-free marvels: does a vegetarian diet reduce your risk of disease?

Is it me, or are there times when contemporary diet trends appear to verge on pseudoscientific crankery? While I briefly mentioned potentially dangerous items such as raw water and unpasteurised milk a few years ago, it's surprising how many fad diets in developed nations bear a suspicious resemblance to the traditional ingredients of non-Western societies.

Superfoods are a particularly overhyped element of this faddish arena; the marketing suggests they can help achieve perfect 'balance' and 'wellness' in the body. Some assertions go much further, with consumption of the likes of kombucha claimed as something of a miracle cure. While the pseudocereal quinoa is sold in the West as the 'grain of the gods', it is unlikely to give the partaker any super powers. It certainly didn't save the Inca and Anasazi - who cultivated it in pre-Columbian America - from the rapid collapse of their civilisations; they apparently suffered from disease and famine as much as any other society.

There is a scientific basis for recommending certain non-meat items, from the antioxidants in tea and coffee to the vitamin D in mushrooms, while various plants and vegetable oils contain omega-3 fatty acids. But a recent report has concluded that a vegetarian diet may have a marked positive effect on overall health compared to one with regular meat consumption. The research was conducted by the University of Glasgow, with the data showing substantial reductions in disease biomarkers for non-meat eaters: cholesterol and products linked to increased risk of cancers, cardiovascular disease, and liver and kidney problems were all lower in vegetarians. However, even once risk factors such as age, alcohol and nicotine intake had been accounted for, the study was unable to provide an underlying reason for the positive results.

Apart from suggesting that vegetarians eat more fibre, fruit, vegetables and nuts - some of which have known health benefits - the report's conclusion also noted that the avoidance of processed meat products and red meat, rather than the positive effect of those items alone, may have contributed to the results. As someone who hasn't eaten meat in over thirty years, I find the research extremely interesting, although I think there are many other factors that should be considered, with the report forming just one part of the debate.

For example, the data was drawn from around 420,000 people living in the UK alone, rather than from a variety of nations and environments. In the past century, the diet and lifestyle of most people in the West has changed enormously, with quick-to-prepare meat dishes such as burgers and sausages remaining at the forefront despite the replacement of physically demanding lives with predominantly sedentary ones. In other words, the diet hasn't changed to match the alteration in lifestyle. It's little wonder that obesity has outranked malnourishment in some nations.

In addition, it is thought that several billion people, predominantly in less developed regions, consume insect protein on a regular if not daily basis. This is a profoundly different diet to that of Western meat eaters, with the latter's concentration on domesticated species such as cattle, horses, sheep and goats, and poultry. Although game, bush meat and exotic species such as crocodile are eaten in many regions, these form a much smaller element of the human diet.

In contrast, vegetarians in many regions can eat an enormous variety of plants and fungi. The geographic and seasonal availability of many fruits and vegetables is expanding too: until a few years ago I hadn't heard of jackfruit, but the tinned unripe variety is now available from many stores here in New Zealand. So in both time and space, there's no such thing as a typical vegetarian diet! Nor does this account for the differences between lacto-vegetarians and vegans; it would certainly be more time-consuming to plan a diet with an adequate mix of proteins in the absence of eggs and dairy products. It would therefore be interesting to conduct research into the health differences between these two groups.

Although some of the blame for poor health and obesity has been placed on processed and refined foods, there is an ever-increasing array of prepared vegetarian products, often marketed as meat substitutes for meatatarians wanting to cut down on their consumption of animal flesh. My daughters (regular meat eaters) and I have a penchant for fake bacon made of wheat, pea and soy, and I also eat a variety of meat-free sausages and burgers, as well as Quorn products.

Many companies are now jumping on the bandwagon, with products that aim to replicate the taste and texture of the real thing. Brands such as Beyond Meat and Impossible Foods have seen a rapid rise to international success, while the UK bakery chain Greggs has benefitted from its tasty (if high-fat) Quorn-based vegan sausage roll becoming one of its top five selling products. The range of processed foods suitable for vegetarians has therefore grown out of all proportion to that available several decades ago. Could it be that these have detrimental health effects compared to the less refined ingredients traditionally eaten by Western vegetarians (and still eaten in developing nations)?

Just as there are shedloads of books claiming that epigenetics will allow you to self-improve your DNA through your lifestyle, diet gurus play upon similar fears (and gullibility) to encourage people to eat all sorts of weird stuff that at best maintains equilibrium and at worst can lead to serious health issues. I personally think that much wider research, undertaken across all sorts of regions and societies, needs to be done before a vegetarian diet can be claimed to be distinctly superior to a meat-based one. Of course, a reduction in ruminant farming is good for the planet in general - both for saving water and for reducing methane - but as far as diet equates to health, I still think that moderation and a sensible attitude are the key factors. Nevertheless, the Glasgow study certainly is...wait for it...food for thought!

Friday, 14 May 2021

Weedbusting for a better world: the unpleasant truth about invasive plants

There's been a lot written about New Zealand's Predator Free 2050 programme, including my own post from 2016, but while the primary focus has been on fauna, what about invasive species of flora? Until recently it was easy to think of plants as a poor man's animals, with little in the way of the complex behaviour that characterises the life of vertebrates and many invertebrates. However, that's been changing thanks to studies showing that the life of plants is actually rather complex - and includes the likes of chemical signalling. Although plants might not have the emotional impact of animals, land vegetation alone has about one thousand times the mass of terrestrial fauna. So they're important - and then some!

A few months ago I was volunteering on the sanctuary island of Motuihe, less than an hour's boat ride from downtown Auckland. Our group was charged with cutting down woolly nightshade, a soil-poisoning plant native to South America. Destroying these evil-smelling shrubs made me wonder how and why they were introduced to New Zealand in the first place, considering they don't look particularly attractive and their berries are poisonous to humans. Like so many exotic plant species, they were apparently introduced deliberately as a decorative garden plant, though frankly I can't see why.

As with so many similar stories from around the world, New Zealand has been inundated with large numbers of non-native plant species. Unlike woolly nightshade, some were introduced for practical purposes, such as radiata pine for timber and gorse for hedging, while others arrived accidentally as seeds in soil. In many cases these are stories of greed and incompetence, for which later generations have paid a heavy price.

Although there were pioneering lone voices who, from as early as the late nineteenth century, could see the deleterious effects of exotic plant species on native vegetation, it wasn't until the last half century that any serious effort was made to promote the removal of such plants. British botanist and presenter David Bellamy was one of the first scientists to popularise this message, starring in a 1989 television advert explaining why Clematis vitalba (AKA Old man's beard) needed eradicating. Bellamy went on to present the tv series Moa's Ark, which drew attention to the country's unique biota and the dangers it faced from poorly managed development.

Given his botanical background, it's perhaps not surprising that rather than treating plants as the backdrop to dramas of the animal kingdom, Bellamy made them central to the ecosystem, claiming that we should put nature before culture. Although lacking the dynamic aspects of fauna, invasive weeds (by definition, aren't weeds just plants in the wrong place?) such as Old man's beard can put on up to ten metres of growth in a single season. You only have to look around a suburban garden - mine included - to see that constant vigilance is required to remove the likes of self-seeded wattle and climbing asparagus before they take hold and smother native species.

It isn't just on land that we face this issue: freshwater systems can easily be choked by the likes of Elodea canadensis, a North American pondweed that has escaped from its ornamental aquarium environment (thanks to highly irresponsible people, of course) and been spread by boats and fishing equipment, clogging and stagnating streams and lakes. What is worrying is that it is far from being the worst of the fifty or so non-native aquatic plants that threaten New Zealand's waterways. Considering that around three-quarters of the introduced species in this environment have a detrimental effect, the point is clear: exotic flora is anything but benign.

So what can be - and is being - done? Thanks to numerous volunteer groups, sanctuaries for rare native species (principally fauna, but occasionally flora too) are keeping invasive weeds at bay. Outside these protected environments, annual weeding programmes aim to reduce wilding pines, but the issue here is that commercial interest still maintains the upper hand. Whether for timber plantations or carbon sequestration, species such as Douglas fir continue to be planted, allowing their seed to spread far and wide on the wind. Luckily, there are numerous websites to help the public identify and destroy pest plants, many of them aimed specifically at New Zealanders.

Clearly, this isn't an issue that will ever go away. With most Government-led efforts focusing on pest animal species, the eradication of invasive plants has received far less support and so remains comparatively unknown. Perhaps it would be good if schools undertook a compulsory programme, including practical work, on the identification and removal of non-native pest flora? Trapping and poisoning invasive animals can be a complex business, but weeding is child's play by comparison. Everyone can help out: in effect, this is a form of citizen science that has a positive, practical effect on the environment. Why not start with your garden today?


Thursday, 1 April 2021

Zapping zombies: how the US military uses the entertainment industries as a recruitment tool

We hear a lot about gamification these days. As video games edge closer to simulating the real world, and Hollywood blockbusters come more and more to resemble video games, it's little wonder that businesses are using the gaming concept as a learning tool. If you've noticed an eerie similarity between the plethora of military sci-fi movies, combat video games and the technology used by the United States' armed forces, you might be interested to learn that this is no coincidence.

Developed at MIT in 1962, Spacewar! is frequently cited as the earliest combat video game. Of course, it was written for an institutional research computer, and it took a long time before sufficiently high-quality visuals - with sound effects - could be installed in gaming arcades, followed in the early 1980s by games written for the first generation of ready-assembled home microcomputers.

Hollywood capitalised on the rapidly burgeoning video game market - both at home and in arcades - via movies such as 1984's The Last Starfighter, in which an expert arcade player finds himself recruited into an alien war. In other words, the game he excels at is really a simulator designed to discover and hone players who can then use their gaming skills in genuine space combat.

So how does this fiction compare to the real world? Specialist aviation publications have been full of articles with titles such as 'Do Gamers Make Better Drone Operators Than Pilots?' - the answer being that, in addition to obvious skills such as good hand-eye coordination, gamers are used to facing no personal risk while playing (except possibly RSI) and so remain calm under pressure. The conclusion is that these traits may give gamers an edge when controlling drones, although not, it has to be said, larger, conventionally piloted aircraft.

The big question is: how deep is the military's involvement in the development, promotion and assessment of video games that teach combat skills? The relationship certainly appears to go back many decades, considering that the MIT graduate students who developed Spacewar! were funded by the Pentagon. With the development of much more lifelike virtual worlds, the US military has taken a front seat in both producing games that hone useful skills and creating realistic simulators for training its warfighters.

There is a complex feedback loop between these two spheres, and in 1999 the Department of Defense set up the Institute for Creative Technologies to work across them. Games such as Full Spectrum Warrior (2003) and its non-commercial officer-training stablemate Full Spectrum Command attempted to portray realistic combat scenarios, pitting players against enemies who frequently resemble their real-life counterparts.

America's Army (2002) was the first of a series of (initially free) video games that began as propaganda and recruitment tools and then became a widespread commercial franchise. Marines and Special Forces soldiers were amongst the combat veterans involved in the development of these games. In addition, the developers were allowed to scan weapons (in order to build realistic digital simulations) and even shoot them on a firing range so as to experience their physical attributes at first hand. Needless to say, the potential for glorification of violence led to opposition from various quarters.

It isn't just the software that has crossed over between the military and civilian life: weaponry and control systems also feed back between the real world and combat simulations, easing the move from game playing to the genuine article. Of course, skills such as leadership and team cooperation are also honed by these games. The idea is that they reduce the cost of recruitment and training; the free version of America's Army, with 1.5 million downloads in its first month (and a whopping 40 million over the following six years), showed just how effective they could be.

Going in the other direction, US armed forces personnel have taken part in campaigns such as Operation Phantom Fury, which, let's face it, has more than a touch of the Xbox or PlayStation about it. I assume this is also part of the process of ensuring a smooth transition between young combat game players and activities in the real-world military. The pipeline is unlikely to diminish any time soon, seeing as China is now following America's lead; its Glorious Mission online video game, aimed at potential recruits as well as enlisted service personnel, already has over 300 million players.

The US military gaming sector has also started to diversify. To minimise complaints - already prevalent in the gaming sector, due to the implacable enemy often being a group of Muslim fundamentalists - there needed to be a new target that wouldn't raise the ire of any particular nation or ethnic group. To this end, the Call of Duty series of games has introduced reanimated dead soldiers, AKA zombies, as opponents. Bearing in mind that in the past ten years there have been over fifty video games featuring zombie antagonists, it's clear that this theme is just as popular as invading aliens and terrorist zealots. Perhaps it's not surprising that doomsday preppers and survivalist groups are often said to be getting ready for the zombie apocalypse!

Recently released - although heavily redacted - files suggest that as well as developing and promoting video games centred on combat simulation, the Department of Defense has also secretly collected players' data in order to understand their demographics. This is presumably so it can tailor recruitment and training programmes for recruits with a gaming background. The same information also hints that Hollywood too is being used by the military-industrial complex to promote its own agenda. It sounds a bit far-fetched, but the facts speak for themselves.

The US military have long taken an interest in how Hollywood portrays them. Ronald Reagan's White House held screenings of Red Dawn (1984) and WarGames (1983), with the former gaining the Pentagon's approval while the latter was not well received (hardly surprising, if you know the plot). Gung-ho space marine movies started back in the mid-1980s with the likes of Predator and Aliens, but really took off in the mid-1990s with blockbusters such as Independence Day, Stargate and Starship Troopers.

Hollywood hasn't looked back since, and as well as the US military fighting off hordes of alien invaders, there are plenty of zombie movies - more than 170 worldwide in the past decade - along with numerous zombie-themed tv series. Of course, this genre usually features civilians fighting the living dead, but the firearm-laden format nonetheless resembles its military counterparts. Critics have been keen to note that just as the alien invasion films of the 1950s and 1960s were thinly-disguised Cold War allegories, so zombie movies carry a subtext about the unpredictable nature of global terrorism - and imply that readiness to engage the perceived enemy is a patriotic duty.

So what is the underlying connection between these genres and the Pentagon? Even a minimum of research will reveal that a fair number of the Department of Defense's advanced weaponry projects, from the F-22 Raptor tactical fighter to the Global Hawk surveillance UAV (that's an Unmanned Aerial Vehicle to you and me), have been truncated, in both these cases with only about half the number of units built compared to the original proposals. The funding for those cancelled vehicles is being redirected elsewhere, and Hollywood is the most likely recipient, the money being used for both movies and tv shows that follow the DoD agenda.

And how does the Pentagon know it's getting value for money? As more people book cinema tickets online and via their smartphones, the DoD is able to build frighteningly detailed profiles of those adolescents with the aptitude and skills it is looking for. Thanks to tv subscription services, it is also much easier to see exactly who is watching how much of what.

By immersing America's youth in popular entertainment across a variety of channels that both gives a homely familiarity to the military and allows niche targeting of potential recruits, the Pentagon is saving money on blanket advertising while promoting its own values as a mainstream cultural element. Thanks to a business culture that embeds military-derived phrases ('locked and loaded', 'SWAT team', 'strategic planning', etc.), the distance between the armed forces and civilian life has been much reduced since the anti-war ethos of the 1970s. So if you're a teenager who plays certain types of video games and/or watches these sorts of movies and tv shows, don't be surprised if you start receiving recruitment adverts tailored closely to your personality profile. To paraphrase the Village People: they want you as a new recruit!

Monday, 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how conflicts with religious texts can lead the devout to reject key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, that is used to promote Creationist ideas rather than scientifically-gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species, and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or oversimplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - although only a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. Even this needs qualifying, however: Darwin's hypothesis was the more complete of the two, in the sense of being entirely scientific and therefore testable. Wallace, on the other hand, maintained there must have been divine intervention in the creation of our species, setting us apart from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection, but who were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life', which he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense he is a minor figure compared with Darwin and the thorough research the latter undertook to back up his hypothesis.

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said of Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which although highly speculative contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century offered other combinations of ideas and real-world observation that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine the inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider its owner superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on the way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to claim that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidnas. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals around 187 million years ago; they have therefore spent approximately three times as long on their own evolutionary journey as they did as part of the wider mammalian lineage. As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. But we still include monotremes amongst the mammals rather than as a hybrid or separate class; both the platypus and echidnas have fur, are warm-blooded and produce milk (although with a unique delivery system!), which allows their inclusion in Mammalia. Does this mean we arbitrarily allow certain traits and discard others?

Would it be fair to say that the boundaries we draw between organisms are more for our convenience than a reflection of the underlying reality? Are you happy to label birds as 'avian dinosaurs', and if not, why not? With feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was the teeth, the bony tail and a crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which, along with the possible discovery of bird-like air sacs, means they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including the leatherback turtle, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others endothermic (i.e., fully warm-blooded). In this respect, then, some meat-eating dinosaurs may have had more in common with us mammals than with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate its history. Science communication to the public is fundamental to our society, but it needs to present the awkward complexities of evolution, with all the tortured pathways of natural selection, if it is not to fall victim to those who prefer the myths of the last few thousand years to the history of countless millennia revealed in the genes and rocks waiting for us to explore.