Friday, 14 May 2021

Weedbusting for a better world: the unpleasant truth about invasive plants

There's been a lot written about New Zealand's Predator Free 2050 programme, including my own post from 2016, but while the primary focus has been on fauna, what about the invasive species of flora? Until recently it was easy to think of plants as poor man's animals, with little in the way of the complex behaviour that characterises the life of vertebrates and many invertebrates. However, that's been changing thanks to studies showing that the life of plants is actually rather complex - and includes the likes of chemical signalling. Although plants might not have the emotional impact of animals, land vegetation alone has about one thousand times the mass of terrestrial fauna. So they're important - and then some!

A few months ago I was volunteering on the sanctuary island of Motuihe, less than an hour's boat ride from downtown Auckland. Our group was charged with cutting down woolly nightshade, a soil-poisoning plant native to South America. Destroying these evil-smelling shrubs made me wonder how and why they were introduced to New Zealand in the first place, considering they don't look particularly attractive and their berries are poisonous to humans. Like so many exotic plant species, they were apparently deliberately introduced as a decorative garden plant, though frankly I can't see why.

As with so many similar stories from around the world, New Zealand has been inundated with large numbers of non-native floral species. Unlike woolly nightshade, some were introduced for practical purposes, such as radiata pine for timber and gorse for hedging, while others arrived accidentally as seeds in soil. In many cases these are stories of greed and incompetence, for which later generations have paid a heavy price.

Although there were pioneering lone voices who from as early as the late nineteenth century could see the deleterious effects of exotic plant species on native vegetation, it wasn't until the last half century that any serious effort was made to promote their removal. British botanist and presenter David Bellamy was one of the first scientists to popularise this message, starring in a 1989 television advert to explain why Clematis vitalba (AKA Old man's beard) needed eradicating. Bellamy then went on to present the tv series Moa's Ark, which drew attention to the country's unique biota and the dangers it faced from poorly managed development. 

Given his botanical background, it's perhaps not surprising that rather than treating plants as the background to dramas of the animal kingdom, Bellamy made them central to the ecosystem, claiming that we should put nature before culture. Although they lack the dynamic appeal of fauna, invasive weeds (by definition, aren't weeds just plants in the wrong place?) such as Old man's beard can gain up to ten metres in a single growing season. You only have to look around a suburban garden - mine included - to see that constant vigilance is required to remove the likes of self-seeded wattle and climbing asparagus before they take hold and smother native species.

It isn't just on land that we face this issue: freshwater systems can easily be choked by the likes of Elodea canadensis, a North American pondweed that has escaped from its ornamental aquarium environment (thanks to highly irresponsible people, of course) and been spread by boats and fishing equipment, clogging and stagnating streams and lakes. What is worrying is that it is far from being the worst of the fifty or so non-native aquatic plants that threaten New Zealand's waterways. Considering that around three-quarters of all invasive species in this environment have a detrimental effect, the point is clear: introduced aquatic flora does far more harm than good.

So what can be - and is being - done? Thanks to numerous volunteer groups, sanctuaries for rare native species (principally fauna, but occasionally flora too) are keeping invasive weeds at bay. Outside these protected environments, annual weeding programmes aim to reduce wilding pines, but the issue here is that commercial interests still maintain the upper hand. Whether for timber plantations or carbon sequestration, species such as Douglas fir continue to be planted, allowing their seed to spread far and wide on the wind. Luckily, there are numerous websites to help the public identify and destroy pest plants, with plenty of online resources available for New Zealanders.

Clearly, this isn't an issue that will ever go away. With most Government-led efforts focusing on pest animal species, the eradication of invasive plants has been given far less support, and so they remain comparatively unknown. Perhaps it would be good if schools undertook a compulsory programme, including practical work, in the identification and removal of non-native pest flora? Trapping and poisoning invasive animals can be a complex business, but weeding is child's play by comparison. Everyone can help out: in effect, this is a form of citizen science with a positive, practical effect on the environment. Why not start with your garden today?


Thursday, 1 April 2021

Zapping zombies: how the US military uses the entertainment industries as a recruitment tool

We hear a lot about gamification these days. As video games edge closer to simulating the real world, while Hollywood blockbusters more and more resemble video games, it's little wonder that businesses are using the gaming concept as a learning tool. If you've noticed an eerie similarity between the plethora of military sci-fi movies, combat video games and the technology used by the United States' armed forces, then you might be interested to learn that this is no coincidence.

Developed at MIT in 1962, Spacewar! is frequently cited as the earliest combat video game. Of course, it was written for mainframe computers, and it took a long time before sufficiently high-quality visuals - with sound effects - could be installed in gaming arcades, followed in the early 1980s by games written for the first generation of ready-assembled home microcomputers.

Hollywood capitalised on the rapidly burgeoning video game market - both at home and in arcades - via movies such as 1984's The Last Starfighter, in which an expert arcade player finds himself recruited into an alien war. In other words, the game he excels at is really a simulator designed to discover and hone players who can then use their gaming skills in genuine space combat.

So how does this fiction compare to the real world? Specialist aviation publications have been full of articles with titles such as 'Do Gamers Make Better Drone Operators Than Pilots?' - the answer being that, in addition to obvious skills such as good hand-eye coordination, gamers are used to not being at personal risk when playing video games (except possibly RSI) and so remain calm under pressure. The conclusion is that these traits may give gamers an edge for controlling drones, although not, it has to be said, larger, conventionally piloted aircraft.

The big question is how deep the military's involvement goes in the development, promotion and assessment of video games that teach combat skills. The relationship certainly appears to go back many decades, considering that the MIT graduate students who developed Spacewar! were funded by the Pentagon. With the development of much more lifelike virtual worlds, the US military has taken a front seat in both producing games that hone useful skills and creating realistic simulators for training its warfighters.

There is a complex feedback loop between these two spheres, and in 1999 the Department of Defense set up the Institute for Creative Technologies to work across them. Games such as Full Spectrum Warrior (2003) and its non-commercial officer-training stablemate Full Spectrum Command attempted to portray realistic combat scenarios, with enemies who frequently resemble their real-life counterparts.

America's Army (2002) was the first of a series of (initially free) video games that began as propaganda and recruitment tools and then became a widespread commercial franchise. Marines and Special Forces soldiers were amongst the combat veterans involved in the development of these games. In addition, the developers were allowed to scan weapons (in order to build realistic digital simulations) and even shoot them on a firing range so as to experience their physical attributes at first hand. Needless to say, the potential for glorification of violence led to opposition from various quarters.

It isn't just the software that has crossed over between the military and civilian life: weaponry and control systems also feed back between the real world and combat simulations, easing the move from game playing to the genuine article. Of course, skills such as leadership and team cooperation are also being honed by these games. The idea is that they reduce the cost of recruitment and training; the free version of America's Army, with 1.5 million downloads in its first month (and a whopping 40 million over the following six years), proved just how effective they could be.

Going in the other direction, US armed forces personnel have taken part in campaigns such as Operation Phantom Fury, which, let's face it, has more than a touch of the Xbox or PlayStation about it. I assume this is also part of the process to ensure a smooth transition between young combat game players and activities in the real-world military. This channel is unlikely to diminish any time soon, seeing as China is now following America's lead; its Glorious Mission online video game, aimed at potential recruits as well as enlisted service personnel, already has over 300 million players.

The US military gaming sector has also started to diversify. To minimise complaints - already prevalent in the gaming sector, due to the implacable enemy often being a group of Muslim fundamentalists - there needed to be a new target that wouldn't raise the ire of any particular nation or ethnic group. To this end, the Call of Duty series of games has introduced reanimated dead soldiers, AKA zombies, as opponents. Bearing in mind that in the past ten years there have been over fifty video games featuring zombie antagonists, it's clear that this theme is just as popular as invading aliens and terrorist zealots. Perhaps it's not surprising that doomsday preppers and survivalist groups are often said to be getting ready for the zombie apocalypse!

Recently released - although heavily redacted - files suggest that as well as developing and promoting video games centred on combat simulation, the Department of Defense has also secretly collected players' data in order to understand their demographics. This is presumably in order to tailor recruitment and training programmes for recruits with a gaming background. The same information also hints that Hollywood too is being used by the military-industrial complex to promote its own agenda. It sounds a bit far-fetched, but the facts speak for themselves.

The US military have long taken an interest in how Hollywood portrays them. Ronald Reagan's White House held screenings of Red Dawn (1984) and WarGames (1983), with the former gaining the Pentagon's approval while the latter was not well received (hardly surprising, if you know the plot). Gung-ho space marine movies started back in the mid-1980s with the likes of Predator and Aliens, but really took off in the mid-1990s with blockbusters such as Independence Day, Stargate and Starship Troopers.

Hollywood hasn't looked back since, and as well as the US military fighting off hordes of alien invaders, there are plenty of zombie movies - over 170 worldwide over the past decade - along with numerous zombie-themed tv series. Of course, this genre usually features civilians fighting against the living dead, but nonetheless the firearm-laden format resembles its military counterparts. Critics have been keen to note that just as the alien invasion films of the 1950s and 1960s were thinly-disguised Cold War allegories, so zombie movies carry a subtext about the unpredictable nature of global terrorism - and imply that readiness to engage the perceived enemy is a patriotic duty.

So what is the underlying connection between these genres and the Pentagon? Even a minimum of research will reveal that a fair number of the Department of Defense's advanced weaponry projects, from the F-22 Raptor tactical fighter to the Global Hawk surveillance UAV (that's an Unmanned Aerial Vehicle to you and me), have been truncated, in both these cases with only about half the number of units built compared to the original proposals. The funding for those cancelled vehicles is being redirected elsewhere, and Hollywood is the most likely recipient, the money being used for both movies and tv shows that follow the DoD agenda.

And how does the Pentagon know it's getting value for money? As more people book cinema tickets online and via their smartphones, the DoD is able to build frighteningly detailed profiles of those adolescents with the aptitude and skills they are looking for. Thanks to tv subscription services, it is also much easier to see exactly who is watching how much of what.

By immersing America's youth in popular entertainment across a variety of channels that both gives a homely familiarity to the military and allows niche targeting for potential recruits, the Pentagon is saving money on blanket advertising while promoting its own values as a mainstream cultural element. Thanks to a business culture that embeds military-derived phrases ('locked and loaded', 'SWAT team', 'strategic planning', etc.), the distance between the armed forces and civilian life has been much reduced since the anti-war ethos of the 1970s. So if you're a teenager who plays certain types of video games and/or watches these sorts of movies and tv shows, don't be surprised if you start receiving recruitment adverts tailored closely to your personality profile. To paraphrase the Village People: they want you as a new recruit!

Monday, 15 March 2021

Distorted Darwin: common misconceptions about evolution and natural selection

A few months ago, I discussed how disagreements with religious texts can lead the devout to reject key scientific theories; presumably this is a case of fundamentalists denying the fundamentals? Of all the areas of scientific research that cause issues today, it is evolutionary biology that generates the most opposition. This is interesting in so many ways, not least because the primary texts of the Abrahamic religions have little to say on the topic beyond the almost universal elements seen in creation myths, namely that one or more superior beings created all life on Earth and that He/They placed humanity at the zenith.

Thanks to opposition to the modern evolutionary synthesis, there is a plethora of misinformation, from material taken out of context to complete falsehoods, that is used to promote Creationist ideas rather than scientifically-gleaned knowledge. Even those with well-meaning intentions often make mistakes when condensing the complexity of the origin and history of life into easy-to-digest material. I've previously written about the concepts of evolutionary bushes rather than ladders, concurrent rather than consecutive radiation of sister species, and speciation via punctuated equilibrium (i.e., the uneven pace of evolution), so here are a few other examples where the origin, implications and illustrations of natural selection have been distorted or simplified to the point of inaccuracy.

I've previously mentioned that Charles Darwin was the earliest discoverer - though only a decade or two ahead of Alfred Russel Wallace - of natural selection, and not, as is often written, of evolution per se. However, even that framing is not completely accurate: Darwin's hypothesis was more complete than Wallace's, in the sense of being entirely scientific and therefore testable. Wallace, on the other hand, maintained there must have been divine intervention in the creation of our species, making us different from all other life forms.

In addition, there were several precursors who partially formulated ideas regarding natural selection, but who were unable to promote a consistent, evidence-based hypothesis to anywhere near the extent that Darwin achieved. For example, as early as 1831 the Scottish agriculturalist Patrick Matthew published some notes on what he termed 'new diverging ramifications of life', which he thought must occur after mass extinctions. Nevertheless, he failed to expand and fully explain his ideas, seemingly unaware of where they could lead. In this sense, he is a minor figure compared with Darwin, whose hypothesis was backed by thorough research.

Darwin appears to have been unaware of Matthew's ideas, although the same could not be said of Robert Chambers' (anonymous) 1844 publication Vestiges of the Natural History of Creation, which although highly speculative contained some kernels of truth about the mechanisms behind biological evolution. Just as Thomas Malthus' 1798 An Essay on the Principle of Population inspired Darwin, so the mid-nineteenth century offered other combinations of ideas and real-world inspiration that provided an ideal background for the formulation of natural selection. In other words, the conditions were ready for those with the correct mindset to uncover the mechanism behind evolution. What Darwin did was to combine the inspiration with an immense amount of rigour, including examples taken from selective breeding.

Another frequently quoted fallacy is that evolution always maintains a single direction from earlier, simpler organisms to later, more complex ones. I've covered this before in discussions of the evolution of our own species, as many popular biology accounts seek parallels between technological progress and a central branch of animal evolution leading ever upwards until it produced us. 

Modern techniques such as genetic analysis and sophisticated examination of fossils - including scanning their internal cavities - have negated this appealing but incorrect idea. For example, mammals evolved around the same time as the dinosaurs (and over one hundred million years before flowering plants), while parasitic species often have a far more rudimentary structure than their ancestors.

Despite this, we still see countless illustrations showing a clear-cut path from primordial organisms 'up' to Homo sapiens. No-one who has seen the cranial endocast of a dinosaur would consider its owner superior to even the least intelligent of mammals, although the later medium-sized carnivorous species were on their way to developing a bird-like brain-to-body mass ratio. Yet throughout the Jurassic and Cretaceous periods, dinosaurs filled most ecological niches at the expense of the mammals; you would be hard-pressed to claim that the latter were the dominant type of land organism during the Mesozoic!

Research published last year shows that New Zealand's unique tuatara, the sole remaining member of the Rhynchocephalia, is a reptile that shares some genetic similarities with the Monotremata, the egg-laying mammals known as the platypus and echidnas. In addition, a report from the beginning of this year states that the ancestors of today's five monotreme species diverged from all other mammals 187 million years ago; they have therefore spent approximately three times as long on their own evolutionary journey as they did as part of the lineage shared with other mammals. As a result of retaining many ancestral features, the platypus genome is in some ways more like that of birds and reptiles than that of placental and marsupial mammals. Yet we still include monotremes amongst the mammals rather than treating them as a hybrid or a separate class: both platypus and echidna have fur, are warm-blooded and produce milk (although with a unique delivery system!), and that is enough for their inclusion in Mammalia. Does this mean we arbitrarily allow certain traits and discard others?
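As a rough check on that 'three times' figure, here is a back-of-envelope sketch, assuming (purely for illustration - the report's own figure for the origin of the mammalian lineage isn't quoted here) that the lineage began around 250 million years ago:

\[ \underbrace{250 - 187}_{\text{shared history, Myr}} \approx 63, \qquad \frac{187\ \text{Myr alone}}{63\ \text{Myr shared}} \approx 3. \]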

Would it be fair to say that the boundaries we draw between organisms are more for our convenience than a reflection of the underlying reality? Are you happy to label birds as 'avian dinosaurs', and if not, why not? With feathers, nests and even underground burrows, some dinosaurs were clearly part of the way there; physiologically, it was teeth, a bony tail and a crocodilian-type brain that differentiated them from birds. Scans of fossils show that dinosaur hearts may have been more like those of birds than of other reptiles, which, along with the possible discovery of bird-like air sacs, means that they could have had something of the former's more active lifestyle.

This doesn't confirm that they were warm-blooded: today there are eight species, including leatherback turtles, that are mesothermic and therefore lie between warm- and cold-blooded metabolisms. Eggshell analysis suggests that some of the theropod (carnivorous) dinosaurs could have been warm-blooded, but as dinosaurs existed for around 165 million years it may be that some evolved to be mesothermic and others to be endothermic (i.e., fully warm-blooded). In this respect then, some meat-eating dinosaurs especially may have had more in common with us mammals than they did with other reptiles such as lizards and snakes.

All this only goes to show that there is far more to life's rich pageant than the just-so stories still used to illustrate the history of life. Science communication to the public is fundamental to our society but it needs to present the awkward complexities of evolution via all the tortured pathways of natural selection if it is not to fall victim to those who prefer myths of the last few thousand years to the history of countless millennia, as revealed in the genes and rocks waiting for us to explore.


Friday, 19 February 2021

Science, society & stereotypes: examining the lives of trailblazing women in STEM

I was recently flicking through a glossily illustrated Australian book on the history of STEM when I found the name of a pioneer I didn't recognise: Marjory Warren, a British surgeon who is best known today as the 'mother of modern geriatric medicine'. Looking in the index I could find only two other women scientists - compared to over one hundred and twenty men - in a book five hundred pages long! The other two examples were Marie Curie (of course) and American astronomer Vera Rubin. Considering that the book was published in 2008, I was astounded by how skewed this seemed to be. Granted that prior to the twentieth century, few women had the option of becoming involved in science and mathematics; but for any history of STEM, wouldn't the last century contain the largest proportion of subject material?

I therefore thought it would be interesting to choose case studies from the twentieth century to see what sort of obstacles - unique or otherwise - women scientists faced until recently. If you ask most people to name a female scientist then Marie Curie would probably top the list, although a few countries might have national favourites: perhaps Rosalind Franklin in the UK or Rachel Carson in the USA, for example. Rather than choose the more obvious candidates such as these, I have selected four women I knew only a little about, ordered by their date of birth.

Barbara McClintock (1902-1992) was an American cytogeneticist who was ahead of her time in terms of both research and social attitudes. Although her mother didn't want her to train as a scientist, she was lucky to have a father who thought differently to the accepted wisdom - which was that female scientists would be unable to find a husband! McClintock's abilities showed early in her training, leading to post-graduate fellowships which in turn generated cutting-edge research.

At the age of forty-two, Barbara McClintock was only the third woman to be elected to the US National Academy of Sciences. However, her rapid rise within the scientific establishment didn't necessarily assist her: such was the conservative nature of universities that women were not allowed to attend faculty meetings. 

After her earlier research had gained broad acceptance, McClintock's work moved into what today would broadly come under the term epigenetics. Several decades ahead of its time, it was seen as too radical by most of her peers, and after facing intense opposition she temporarily stopped publishing her results. It is unlikely that being a woman was entirely responsible for the hostility to her work; similar resistance has frequently met the avant-garde throughout STEM. It seems that only when other researchers found results similar to McClintock's did the more hidebound sections of the discipline re-examine their negative attitude towards her work.

There has been a fair amount of discussion as to whether it was because McClintock was female, or because of her secretive personality (both at home as well as at work, for she never married) - or a combination of both - that delayed her receipt of the Nobel Prize in Physiology or Medicine. Even by the slow standards of that particular awards committee, 1983 was rather late in the day. However, by then she had already been the recipient of numerous other awards and prizes.

Regardless of the recognition it gave her, Barbara McClintock relished scientific research for the sake of uncovering nature's secrets. In that regard, she said: "I just have been so interested in what I was doing and it's been such a pleasure, such a deep pleasure, that I never thought of stopping...I've had a very, very, satisfying and interesting life."

Tikvah Alper (1909-1995) was a South African radiobiologist who worked on prions - otherwise known as 'misfolded' or 'rogue' proteins - and their relationship to certain diseases. Her outstanding abilities were recognised early, allowing her to study physics at the University of Cape Town. She then undertook post-graduate work in Berlin with the nuclear fission pioneer Lise Meitner, only to be forced to leave before completing her doctorate due to the rise in anti-Semitism in Germany.

Having had her research curtailed by her ethnicity, Alper was initially also stymied on her return to South Africa thanks to her private life: due to the misogynist rules of that nation's universities, married women were not allowed to remain on the faculty. Therefore, along with her husband, the veterinary medicine researcher Max Sterne, she continued her work from home. However, her talents were eventually acknowledged and she was made head of the Biophysics section at the South African National Physics Laboratory in 1948. Then, only three years later, Alper's personal life intervened once again; this time, she and her husband were forced to leave South Africa due to their opposition to apartheid.

After a period of unpaid research in London, Alper turned to studying the effects of radiation on different types of cells, rising to become head of the Medical Research Council Radiopathology Unit at Hammersmith Hospital. Alper's theories regarding prions were eventually accepted into the mainstream, and even after retirement she continued working, writing a renowned textbook, Cellular Radiobiology, in 1979.

Alper's life suggests she was very much a problem solver, tackling anything that she felt needed progressing. As a result of this ethos she worked on a wide range of issues, from the standing of women in science and society to the injustice of apartheid, even learning and teaching sign language after one of her sons was born profoundly deaf. Despite being forced to leave several nations for different reasons - none of them because she was a woman - Alper was someone who refused to concede defeat. In that respect she deserves much wider recognition today.

Dorothy Crowfoot Hodgkin (1910-1994) was interested in chemistry, in particular crystals, from a young age. Although women of her generation were encouraged in this area as a hobby, it was highly unusual for them to seek paid employment in the field. Luckily, her mother encouraged her interest and gave Hodgkin a book on x-ray crystallography for her sixteenth birthday, a gift which determined her career path. 

After gaining a first-class honours chemistry degree at Oxford, she moved to Cambridge for doctoral work under the x-ray crystallography pioneer J.D. Bernal. Not only did Hodgkin then manage to find a research post in her chosen field, working at both Cambridge and Oxford, she was also able to pursue cutting-edge work labelled as too difficult by her contemporaries. Hodgkin and her colleagues achieved ground-breaking results in critical areas, resolving the structures of penicillin, vitamin B12 and insulin.

Hodgkin gained international renown, appearing to have faced few of the difficulties experienced by her female contemporaries. In addition to having a well-equipped laboratory at Oxford, she was elected to the Royal Society in 1947 and became its Wolfson Research Professor in 1960. She was also awarded the Nobel Prize in Chemistry in 1964 - the only British woman to have been a recipient to date. Other prestigious awards followed, including the Royal Society's Copley Medal in 1976; again, no other woman has yet received that award.

Presumably in response to the loss of four maternal uncles in the First World War, Hodgkin was an active promoter of international peace. During the 1950s her views were deemed too left wing by the American government and she had to obtain special permission to enter the United States to attend science conferences. Ironically, the Soviet Union honoured her on several occasions, admitting her as a foreign member of its Academy of Sciences and later awarding her the Lenin Peace Prize. She also communicated with her Chinese counterparts and became committed to nuclear disarmament, both through CND and the Pugwash Conferences.

Her work on insulin, itself of enormous importance, is just one facet of her life. Ironically, as someone associated with left-wing politics, she is often remembered today as one of Margaret Thatcher's lecturers; despite their different socio-political leanings, they maintained a friendship into later life. All this was achieved despite the increasing disability Hodgkin suffered from her mid-twenties due to chronic rheumatoid arthritis, which left her with very little dexterity. Clearly, Dorothy Hodgkin was a dauntless fighter in her professional and personal life.

Marie Tharp (1920-2006) was an American geologist best known for her oceanographic cartography of the Atlantic Ocean floor. Despite following the advice of her father (a surveyor) and taking an undergraduate degree in humanities and music, Tharp also took a geology class; perhaps helping her father as a child boosted her interest in the subject. That class enabled her to complete a master's degree in geology, thanks to the dearth of male students during the Second World War. Certainly, it was an unusual avenue for women to pursue; at the time less than four percent of all earth sciences doctorates in the USA were awarded to women.

From a modern perspective, geology during the first half of the twentieth century appears to have been exceedingly hidebound and conservative. Tharp found she could not undertake field trips to uncover fossil fuel deposits, as women were only allowed to do office-based geological work - one explanation for this sexism being that having women on board ship brought bad luck! In fact, it wasn't until 1968 that Tharp eventually joined an expedition. 

However, thanks to painstaking study of her colleague Bruce Heezen's data, Tharp was able to delineate geophysical features such as the mid-Atlantic ridge and consider the processes that generated them. Her map of the Atlantic Ocean floor was far more sophisticated than anything previously created, giving her insights denied to both her contemporaries and her predecessors. As such, Tharp suspected that the long-denigrated continental drift hypothesis, as envisaged by Alfred Wegener three decades previously, was correct. It was here that she initially came unstuck, with Heezen labelling her enthusiasm for continental drift as 'girl talk'. Let's hope that phrase wouldn't be used today!

In time though, yet more data (including the mirrored magnetic striping either side of the mid-Atlantic ridge) proved Tharp correct. Heezen's incredulity was replaced by acceptance, as continental drift was reformulated via seafloor spreading to become the theory of plate tectonics. Mainstream geology finally approved what Wegener had proposed, and Marie Tharp was a fundamental part of that paradigm shift. 
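To see why that magnetic evidence was so persuasive, here is a minimal worked example using purely illustrative numbers rather than figures from Heezen and Tharp's own data: if an anomaly corresponding to a geomagnetic reversal dated to about 2.6 million years ago lies roughly 25 km from the ridge axis on both flanks, then new crust must be forming at the axis and moving outwards at a half-spreading rate of roughly

\[ v_{\text{half}} = \frac{d}{t} \approx \frac{25\ \text{km}}{2.6\ \text{Myr}} \approx 10\ \text{mm/yr}. \]

It is the mirror symmetry of the stripes about the ridge that points to crust being created there and carried away - that is, to seafloor spreading rather than any other mechanism.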

What is interesting is that despite receiving many awards in her later years, including the National Geographic Society's Hubbard Medal in 1978, her name is mentioned far less often than those of other pioneers of plate tectonics such as Harry Hess, Frederick Vine, Drummond Matthews and even Heezen. It's unclear if Tharp's comparative lack of recognition is due to her being female or because she was only one of many researchers working along similar lines. Her own comment from the era suggests that simply being a woman scientist was reason enough to dismiss her work: she noted that other professionals viewed her ideas with attitudes ranging "from amazement to skepticism to scorn."

There are countless other examples that would serve as case studies, including women from non-Western nations, but these four show the variety of experiences women scientists underwent during the twentieth century, ranging from a level of misogyny that would be unthinkable today to an early acceptance of the value of their work and treatment seemingly no different from that of their male colleagues. I was surprised to find such a range of circumstances and attitudes, proving that few things are as straightforward as they are frequently portrayed. These examples also show that whatever culture people grow up in, the majority consider its values to be perfectly normal; a little bit of thought - or hindsight - shows that just because something is the norm doesn't necessarily mean it's any good. When it comes to attitudes today, you only have to read the news to realise there's still some way to go before women in STEM are treated the same as their male counterparts.

Monday, 25 January 2021

Ignorance is bliss: why admitting lack of knowledge could be good for science

"We just don't know" might be one of the best phrases in support of the scientific method ever written. But unfortunately it carries an inherent danger: if a STEM professional - or indeed an amateur scientist/citizen scientist - uses the term, it can be used by those wishing to disavow the subject under discussion. Even adding "- yet" to the end of it won't necessarily improve matters; we humans have an unfortunate tendency to rely on gut instinct rather than rational analysis for our world model, hence - well, just about any man-made problem you care to name, now or throughout history.

Even though trust in scientists and the real-world application of their work may have taken an upswing thanks to rapid vaccine development during the current pandemic, there are many areas of scientifically-gleaned knowledge that are still as unpopular as ever. Incidentally, I wonder whether, if it weren't for much stricter laws in most countries today, we would have seen far more of the quackery that arose during the 1918 Spanish flu epidemic. During that period, low-tech 'cures' included gas inhalation, enemas and blood-letting, the first of these being about as safe as last year's suggestion to drink bleach. I've seen very little about alternative cures, no doubt involving crystals, holy water or good old-fashioned prayer, but then I probably don't mix in those sorts of circles (and certainly don't have that type of online cookie profile). But while legislation might have prevented alternative pandemic treatments from being advertised as legitimate and effective, it hasn't helped other areas of science that suffer from widespread hostility.

Partly this is due to the concept - at least in liberal democracies - of free speech and the idea that every thesis must surely have an antithesis worthy of discussion. Spherical planets not your bag, baby? Why not join the Flat Earth Society. It's easy to be glib about this sort of thing, but there are plenty of more serious examples of anti-scientific thinking that show no sign of abating. The key element that disparate groups opposing science seem to have in common is simple: it all comes down to where science disagrees with the world picture they learnt as a child. In most cases this can be reduced even further to just two words: religious doctrine.

This is where a humble approach to cutting-edge research comes in. Humility has rarely been a key characteristic of fictional scientists; Hollywood, for example, has often depicted (usually male) scientists as somewhere on a crude line between power-crazed megalomaniacs and naive, misguided innocents. The more sensational printed volumes and tv documentaries communicating scientific research to a popular audience likewise frequently eschew ambiguities or dead-ends in favour of a this-is-how-it-is approach. Only, quite often, that isn't how it works at all. Doubts and negative results are not only a key element of science, they are a fundamental component; only by discarding failures can an hypothesis (or, if you prefer the description of the brilliant-yet-humble physicist Richard Feynman, a guess) be narrowed down towards an answer.

There are plenty of examples where even the most accomplished of scientists have admitted they don't know the answer to something in their area of expertise, such as Sir Isaac Newton being unable to resolve the ultimate cause of gravity. As it was, it took over two centuries for another genius - Albert Einstein - to figure it out. Despite all the research undertaken over the past century or so, the old adage remains as true as ever: good science creates as many new questions as it answers. Key issues today that are unlikely to gain resolution in the next few years - although never say never - include the nature of dark energy (and possibly likewise of dark, i.e. non-baryonic, matter) and the ultimate theory behind quantum mechanics.

Of course, these questions, fascinating though they are, hold little appeal to most people; they are just too esoteric and far removed from everyday existence to be bothered about. So what areas of scientific knowledge or research do non-scientists worry about? As mentioned above, usually it is something that involves faith. This can be broken down into several factors:

  1. Disagreement with a key religious text
  2. Implication that humans lack a non-corporeal element, such as an immortal soul
  3. Removal of mankind as a central component or focal point for the universe 

These obviously relate to some areas of science - from a layman's viewpoint - far more than others. Most non-specialists, even religious fundamentalists, don't appear to have an issue with atomic theory and the periodic table. Instead, cosmology and evolutionary biology are the disciplines most likely to raise their ire. Neither is in any sense complete; the number of questions still being asked far exceeds the answers so far gleaned from research. The former has yet to understand what 96% of the universe is composed of, while the latter is still piecing together the details of the origin and development of life on our planet, from primordial slime up to Donald Trump (so possibly more of a sideways move, then).

Herein lies the issue: if scientists claim they are 'certain' about the cause of a particular phenomenon or feature of reality, but further research confirms a different theory, then non-scientists can legitimately ask why the new idea is any more final than the previous one. In addition, the word 'theory' is prone to misinterpretation, implying a mere notion rather than an hypothesis (a guess, if you like) that hasn't yet failed any test thrown at it, be it practical experiment, digital simulation or mathematical construction. Bill Bryson's best-selling A Short History of Nearly Everything is an example of how science can be done a disservice by material meant to promote it, in that the book treats science as an ever-expanding body of knowledge rather than as a collection of methods used to explore answerable questions about life, the universe and, of course, everything.

Perhaps one answer to all this would be for popular science journalism, from books written by professional scientists to short news items, to include elements related to what is not yet known. The simplistic approach that avoids the failures only serves to strengthen the opinion that experts are arrogant believers in their own personal doctrines, as inflexible and uncompromising as holy writ. 

Unfortunately, in efforts to be both concise and easy to comprehend, much science communication renders the discipline in this manner, avoiding dissension and doubt. In addition, the often wonderful - and yet to be resolved - subtleties of research are neglected. For example, the majority of specialists agree that birds are descended from theropod (i.e. carnivorous) dinosaurs, and yet the primary growth axis on the forelimbs of the two groups differs. This issue has not been satisfactorily answered, but the vast weight of evidence, from both fossils and experimentation, still makes theropod ancestry the most plausible reading of this particular part of the phylogenetic tree. Further research, especially in embryology, may one day find a more complete solution.

Ultimately then, science education would probably benefit from acknowledging the boundaries of uncertainty, where they exist. This may help allay fears that the discipline wants to impose absolutes about everything; in most areas (the second law of thermodynamics excepted) we are still in the early stages of understanding. This doesn't mean that the Earth may be flat or only six thousand years old, but it does mean that science usually works in small steps, not giant paradigm shifts that offer the final say on an aspect of reality. After all, if scientists already knew everything about a subject, there wouldn't be any need for further research. What a boring world that would be!