Friday, 19 February 2021

Science, society & stereotypes: examining the lives of trailblazing women in STEM

I was recently flicking through a glossily illustrated Australian book on the history of STEM when I found the name of a pioneer I didn't recognise: Marjory Warren, a British surgeon who is best known today as the 'mother of modern geriatric medicine'. Looking in the index, I could find only two other women scientists - compared to over one hundred and twenty men - in a book five hundred pages long! The other two examples were Marie Curie (of course) and the American astronomer Vera Rubin. Considering that the book was published in 2008, I was astounded by how skewed this seemed. Granted, prior to the twentieth century few women had the option of becoming involved in science and mathematics; but for any history of STEM, wouldn't the last century supply the largest proportion of subject material?

I therefore thought it would be interesting to choose case studies from the twentieth century to see what sorts of obstacles - unique or otherwise - women scientists faced until recently. If you ask most people to name a female scientist, Marie Curie would probably top the list, although a few countries might have national favourites: perhaps Rosalind Franklin in the UK or Rachel Carson in the USA. Rather than choose the more obvious candidates such as these, I have selected four women I knew only a little about, ordered by their date of birth.

Barbara McClintock (1902-1992) was an American cytogeneticist who was ahead of her time in terms of both research and social attitudes. Although her mother didn't want her to train as a scientist, she was lucky to have a father who thought differently to the accepted wisdom - which was that female scientists would be unable to find a husband! McClintock's abilities showed early in her training, leading to post-graduate fellowships which in turn generated cutting-edge research.

At the age of forty-two, Barbara McClintock was only the third woman to be elected to the US National Academy of Sciences. However, her rapid rise within the scientific establishment didn't necessarily assist her: such was the conservative nature of universities that women were not allowed to attend faculty meetings. 

After McClintock published her research to broad acceptance, her work moved into what today would broadly come under the term epigenetics. Several decades ahead of its time, it was seen as too radical by most of her peers, and after facing intense opposition she temporarily stopped publishing her results. It is unlikely that being a woman was entirely responsible for the hostility to her work; similar resistance has frequently been experienced throughout the STEM avant-garde. It seems that only when other researchers found similar results did the more hidebound sections of the discipline re-examine their negative attitude towards McClintock's work.

There has been a fair amount of discussion as to whether it was her being female, her secretive personality (at home as well as at work, for she never married), or a combination of both that delayed her receipt of the Nobel Prize in Physiology or Medicine. Even by the slow standards of that particular awards committee, 1983 was rather late in the day. However, by then she had already received numerous other awards and prizes.

Regardless of the recognition it gave her, Barbara McClintock relished scientific research for the sake of uncovering nature's secrets. In that regard, she said: "I just have been so interested in what I was doing and it's been such a pleasure, such a deep pleasure, that I never thought of stopping...I've had a very, very, satisfying and interesting life."

Tikvah Alper (1909-1995) was a South African radiobiologist who worked on prions - otherwise known as 'misfolded' or 'rogue' proteins - and their relationship to certain diseases. Her outstanding abilities were recognised early, allowing her to study physics at the University of Cape Town. She then undertook post-graduate work in Berlin with the nuclear fission pioneer Lise Meitner, only to be forced to leave before completing her doctorate due to the rise in anti-Semitism in Germany.

Having had her research curtailed by her ethnicity, Alper was initially also stymied on her return to South Africa thanks to her private life: due to the misogynist rules of that nation's universities, married women were not allowed to remain on the faculty. Therefore, along with her husband the veterinary medicine researcher Max Sterne, she continued her work from home. However, eventually her talents were acknowledged and she was made head of the Biophysics section at the South African National Physics Laboratory in 1948. Then only three years later, Alper's personal life intervened once again; this time, she and her husband were forced to leave South Africa due to their opposition to apartheid.

After a period of unpaid research in London, Alper turned to studying the effects of radiation on different types of cells, rising to become head of the Medical Research Council Radiopathology Unit at Hammersmith Hospital. Alper's theories regarding prions were eventually accepted into the mainstream, and even after retirement she continued working, writing a renowned textbook, Cellular Radiobiology, in 1979.

Alper's life suggests she was very much a problem solver, tackling anything she felt needed progress. As a result of this ethos she worked on a wide range of issues, from the standing of women in science and society, to the injustice of apartheid, even to learning and teaching sign language after one of her sons was born profoundly deaf. Despite being forced to leave two countries for quite different reasons - neither because she was a woman - Alper refused to concede defeat. In that respect she deserves much wider recognition today.

Dorothy Crowfoot Hodgkin (1910-1994) was interested in chemistry, in particular crystals, from a young age. Although women of her generation were encouraged in this area as a hobby, it was highly unusual for them to seek paid employment in the field. Luckily, her mother encouraged her interest and gave Hodgkin a book on x-ray crystallography for her sixteenth birthday, a gift which determined her career path. 

After gaining a first-class honours chemistry degree at Oxford, she moved to Cambridge for doctoral work under the x-ray crystallography pioneer J.D. Bernal. Not only did Hodgkin then manage to find a research post in her chosen field, working at both Cambridge and Oxford, but she was also able to pursue cutting-edge work labelled too difficult by her contemporaries. She and her colleagues achieved ground-breaking results in critical areas, resolving the structures of penicillin, vitamin B12 and insulin.

Hodgkin gained international renown, appearing to have faced few of the difficulties experienced by her female contemporaries. In addition to having a well-equipped laboratory at Oxford, she was elected to the Royal Society in 1947 and became its Wolfson Research Professor in 1960. She was also awarded the Nobel Prize in Chemistry in 1964 - the only British woman to have been a recipient to date. Other prestigious awards followed, including the Royal Society's Copley Medal in 1976; again, no other woman has yet received that award.

Presumably in response to the loss of four maternal uncles in the First World War, Hodgkin was an active promoter of international peace. During the 1950s her views were deemed too left-wing by the American government and she had to obtain special permission to enter the United States to attend science conferences. Ironically, the Soviet Union honoured her on several occasions, admitting her as a foreign member of its Academy of Sciences and later awarding her the Lenin Peace Prize. She also communicated with her Chinese counterparts and became committed to nuclear disarmament, both through CND and the Pugwash conferences.

Her work on insulin, itself of enormous importance, is just one facet of her life. Curiously, as someone associated with left-wing politics, she is often remembered today as one of Margaret Thatcher's lecturers; despite their different socio-political leanings, the two maintained a friendship into later life. All this was achieved despite the increasing disability Hodgkin suffered from her mid-twenties due to chronic rheumatoid arthritis, which left her with seemingly minimal dexterity. Clearly, Dorothy Hodgkin was a dauntless fighter in both her professional and personal life.

Marie Tharp (1920-2006) was an American geologist best known for her oceanographic cartography of the Atlantic Ocean floor. Despite following the advice of her father (a surveyor) and taking an undergraduate degree in humanities and music, Tharp also took a geology class; perhaps helping her father as a child had boosted her interest in the subject. This enabled her to complete a master's degree in geology, thanks to the dearth of male students during the Second World War. It was certainly an unusual avenue for women to be interested in: at the time, fewer than four percent of all earth sciences doctorates in the USA were awarded to women.

From a modern perspective, geology during the first half of the twentieth century appears to have been exceedingly hidebound and conservative. Tharp found she could not undertake field trips to uncover fossil fuel deposits, as women were only allowed to do office-based geological work - one explanation for this sexism being that having women on board ship brought bad luck! In fact, it wasn't until 1968 that Tharp eventually joined an expedition. 

However, thanks to painstaking study of her colleague Bruce Heezen's data, Tharp was able to delineate geophysical features such as the mid-Atlantic ridge and consider the processes that generated them. Her map of the Atlantic Ocean floor was far more sophisticated than anything previously created, giving her insights denied to both her contemporaries and her predecessors. As such, Tharp suspected that the long-denigrated continental drift hypothesis, as envisaged by Alfred Wegener three decades previously, was correct. It was here that she initially came unstuck, with Heezen labelling her enthusiasm for continental drift as 'girl talk'. Let's hope that phrase wouldn't be used today!

In time though, yet more data (including the mirrored magnetic striping on either side of the mid-Atlantic ridge) proved Tharp correct. Heezen's incredulity was replaced by acceptance, as continental drift was reformulated via seafloor spreading into the theory of plate tectonics. Mainstream geology finally approved what Wegener had proposed, and Marie Tharp was a fundamental part of that paradigm shift.
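The logic behind that striping is worth a moment's unpacking: new crust forms at the ridge axis, locks in the geomagnetic polarity of its day, then moves outwards at a roughly constant rate, so equal distances either side of the ridge record the same sequence of reversals. Here is a minimal Python sketch of the idea - my own illustration rather than anything from Tharp or Heezen, assuming a half-spreading rate of about 25 km per million years and a simplified reversal chronology:

```python
# A minimal sketch (illustrative only) of how seafloor spreading produces
# mirrored magnetic stripes. Assumes a constant half-spreading rate and a
# simplified reversal chronology; real rates and boundary dates vary.

HALF_RATE_KM_PER_MYR = 25.0  # ~2.5 cm per year, a plausible Atlantic figure

# Approximate polarity-reversal boundaries, in millions of years ago:
# Brunhes/Matuyama ~0.78, Matuyama/Gauss ~2.58, Gauss/Gilbert ~3.6
REVERSAL_BOUNDARIES_MYR = [0.78, 2.58, 3.6]

def polarity_at(distance_km: float) -> str:
    """Polarity recorded in crust at a given distance from the ridge axis.

    Age depends only on |distance|, so the answer is identical on either
    flank - hence the mirrored striping."""
    age_myr = abs(distance_km) / HALF_RATE_KM_PER_MYR
    # Count how many reversal boundaries the crust's age predates
    crossings = sum(1 for b in REVERSAL_BOUNDARIES_MYR if age_myr >= b)
    return "normal" if crossings % 2 == 0 else "reversed"

if __name__ == "__main__":
    for d in (-100, -50, -10, 10, 50, 100):
        print(f"{d:>5} km from ridge: {polarity_at(d)}")
```

Run as-is, it prints identical polarities at matching distances on either flank - the mirrored stripes in miniature.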

What is interesting is that despite receiving many awards in her later years, including the National Geographic Society's Hubbard Medal in 1978, her name is mentioned far less often than those of other pioneers of plate tectonics such as Harry Hess, Frederick Vine, Drummond Matthews, even Heezen. It's unclear whether Tharp's comparative lack of recognition is due to her being female or because she was only one of many researchers working along similar lines. Her own comment from the era suggests that simply being a woman scientist was reason enough to dismiss her work: she noted that other professionals viewed her ideas with attitudes ranging "from amazement to skepticism to scorn."

There are countless other examples that would serve as case studies, including women from non-Western nations, but these four show the variety of experiences women scientists underwent during the twentieth century, ranging from a level of misogyny that would be unthinkable today to early acceptance of the value of their work and treatment seemingly no different from that of their male colleagues. I was surprised to find such a range of circumstances and attitudes, proving that few things are as straightforward as they are frequently portrayed. These examples also show that whatever culture people grow up in, the majority consider its values to be perfectly normal; a little thought - or hindsight - shows that just because something is the norm doesn't mean it's any good. As for attitudes today, you only have to read the news to realise there's still some way to go before women in STEM are treated the same as their male counterparts.

Monday, 25 January 2021

Ignorance is bliss: why admitting lack of knowledge could be good for science

"We just don't know" might be one of the best phrases in support of the scientific method ever written. But unfortunately it carries an inherent danger: if a STEM professional - or indeed an amateur scientist/citizen scientist - uses the term, it can be used by those wishing to disavow the subject under discussion. Even adding "- yet" to the end of it won't necessarily improve matters; we humans have an unfortunate tendency to rely on gut instinct rather than rational analysis for our world model, hence - well, just about any man-made problem you care to name, now or throughout history.

Even though trust in scientists and the real-world application of their work may have taken an upswing thanks to rapid vaccine development during the current pandemic, there are many areas of scientifically-gleaned knowledge that are still as unpopular as ever. Incidentally, I wonder whether, were it not for much stricter laws in most countries today, we would have seen far more of the quackery that arose during the 1918 Spanish flu pandemic. During that period, low-tech 'cures' included gas inhalation, enemas and blood-letting, the first about as safe as last year's suggestion to drink bleach. I've seen very little about alternative cures this time around, no doubt involving crystals, holy water or good old-fashioned prayer, but then I probably don't mix in those sorts of circles (and certainly don't have that type of online cookie profile). But while legislation might have prevented alternative pandemic treatments from being advertised as legitimate and effective, it hasn't helped other areas of science that suffer from widespread hostility.

Partly this is due to the concept - at least in liberal democracies - of free speech and the idea that every thesis must surely have an antithesis worthy of discussion. Spherical planets not your bag, baby? Why not join the Flat Earth Society. It's easy to be glib about this sort of thing, but there are plenty of more serious examples of anti-scientific thinking that show no sign of abating. The key element that disparate groups opposing science have in common is simple: it comes down to where science disagrees with the world picture they learnt as a child. In most cases this can be reduced even further to just two words: religious doctrine.

This is where a humble approach to cutting-edge research comes in. Humility has rarely been a key characteristic of fictional scientists; Hollywood, for example, has often depicted (usually male) scientists as somewhere on a crude line between power-crazed megalomaniacs and naive, misguided innocents. The more sensational printed volumes and TV documentaries communicating scientific research to a popular audience likewise frequently eschew ambiguities or dead-ends in favour of a this-is-how-it-is approach. Only, quite often, that isn't how it works at all. Doubts and negative results are not a regrettable by-product of science but a fundamental component; only by discarding failures can the search for an answer to an hypothesis (or, if you prefer the description of the brilliant-yet-humble physicist Richard Feynman, a guess) be narrowed down.

There are plenty of examples where even the most accomplished of scientists have admitted they don't know the answer to something in their area of expertise, such as Sir Isaac Newton being unable to resolve the ultimate cause of gravity. As it was, it took over two centuries for another genius - Albert Einstein - to figure it out. Despite all the research undertaken over the past century or so, the old adage remains as true as ever: good science creates as many new questions as it answers. Key issues today that are unlikely to be resolved in the next few years - although never say never - include the nature of dark energy (and possibly likewise for dark, i.e. non-baryonic, matter) and the ultimate theory behind quantum mechanics.

Of course, these questions, fascinating though they are, hold little appeal to most people; they are just too esoteric and far removed from everyday existence to be bothered about. So what areas of scientific knowledge or research do non-scientists worry about? As mentioned above, usually it is something that involves faith. This can be broken down into several factors:

  1. Disagreement with a key religious text
  2. Implication that humans lack a non-corporeal element, such as an immortal soul
  3. Removal of mankind as a central component or focal point for the universe 

These obviously relate to some areas of science - from a layman's viewpoint - far more than others. Most non-specialists, even religious fundamentalists, don't appear to have an issue with atomic theory and the periodic table. Instead, cosmology and evolutionary biology are the disciplines likely to raise their ire. Neither is in any sense complete; the number of questions still being asked far exceeds the answers so far gleaned from research. The former has yet to understand what 96% of the universe is composed of (dark energy and dark matter, roughly in a two-to-one ratio on current estimates), while the latter is still piecing together the details of the origin and development of life on our planet, from primordial slime up to Donald Trump (so possibly more of a sideways move, then).

Herein lies the issue: if scientists claim they are 'certain' about the cause of a particular phenomenon or feature of reality, but further research confirms a different theory, then non-scientists can legitimately ask why the new idea should be any more final than the previous one. In addition, the word 'theory' is prone to misinterpretation, implying a mere notion rather than an hypothesis (guess, if you like) that hasn't yet failed any tests thrown at it, be they practical experiments, digital simulations or mathematical constructions. Bill Bryson's best-selling A Short History of Nearly Everything is an example of how science can be done a disservice by material meant to promote it, in that the book treats science as an ever-expanding body of knowledge rather than as a collection of methods used to explore answerable questions about life, the universe and, of course, everything.

Perhaps one answer to all this would be for popular science journalism, from books written by professional scientists to short news items, to include elements related to what is not yet known. The simplistic approach that avoids the failures only serves to strengthen the opinion that experts are arrogant believers in their own personal doctrines, as inflexible and uncompromising as holy writ. 

Unfortunately, in efforts to be both concise and easy to comprehend, much science communication appears to render the discipline in this manner, avoiding dissension and doubt. In addition, the often wonderful - and yet-to-be-resolved - subtleties of research are neglected. For example, the majority of specialists agree that birds are descended from theropod (i.e. carnivorous) dinosaurs, and yet the primary growth axis on the forelimbs of the two groups differs. This issue has not been satisfactorily answered, but the vast collection of evidence, from both fossils and experimentation, still supports theropod ancestry as the most plausible reading of this particular phylogenetic tree. Further research, especially in embryology, may one day find a more complete solution.

Ultimately then, science education would probably benefit from acknowledging the boundaries of uncertainty where they exist. This may help allay fears that the discipline wants to impose absolutes about everything; in most areas (the second law of thermodynamics excepted) we are still in the early stages of understanding. This doesn't mean that the Earth may be flat or only six thousand years old, but it does mean that science usually works in small steps, not giant paradigm shifts that offer the final say on an aspect of reality. After all, if scientists already knew everything about a subject, there wouldn't be any need for further research. What a boring world that would be!

Monday, 14 December 2020

Biomaterial bonanza: putting plastics out of a job

With the midwinter festival rapidly approaching (at least for the Northern Hemisphere) - traditionally a time of gift-giving - wouldn't it be great to say that humanity can offer a present to the entire planet? The amount of plastic-based products manufactured every year is somewhere between three hundred and four hundred million tons, about fifty percent of which is single-use or disposable.

Presumably if you've got any sort of interest whatsoever in the world around you (and how your children will get on) then you have been replacing disposable plastic items with reusable non-plastic, or at least biodegradable, alternatives. But are the companies producing the latter guilty of subtle greenwashing?

A friend recently told me that he had put some allegedly biodegradable plastic bags into his compost heap, only to retrieve them - albeit with some holes in them - a year or so later. Bearing in mind there isn't an internationally recognised standard for just what counts as biodegradable, is it surprising that the wool (sorry, polyester) is being pulled over consumers' eyes?

A report last year summarised a three-year research programme at the UK's University of Plymouth, offering clear evidence that many types of allegedly biodegradable bags do not break down when buried in soil or underwater. Although the material did decay in open air, it merely fragmented into smaller pieces of plastic rather than degrading into simpler molecules.

Recent studies by Tel Aviv University and the Goethe Universität in Frankfurt go even further in putting allegedly eco-friendly materials in a bad light. Both claim that not just biodegradable plastics but even those based on starch and cellulose contain numerous toxic chemicals. Such materials are used in food and drink packaging. So where do we go from here?

Last year I wrote a post about the potential of chitosan, a genuinely biodegradable material made from marine arthropod carapaces (i.e. shellfish discards) that can be produced in an eco-friendly process. 

There now appear to be several other materials with the potential to replace traditional plastics. A group at the University of Science and Technology of China has developed a lightweight but durable material using mica and cellulose-derived nanofibre that has more than double the strength of high-performance petroleum-based plastics.

Another alternative to plastic that utilises surprising source material has been developed by a student at the University of Sussex in the UK. Lucy Hughes has used red algae to augment discarded fish scales and fish skin to produce a single-use translucent substance called MarinaTex. In addition to its use of material otherwise destined for landfill, MarinaTex - which biodegrades within six weeks - is the antithesis of conventional plastics in that the red algae component makes its production carbon positive! 

The bad news is that both materials are still at the research and development stage and there is no indication of when they would be ready for commercial mass-production. Crucially, there doesn't appear to be any news of large corporations buying the research for implementation; why is it that so many paradigm-shifting projects are having to be developed by crowd-funded start-ups rather than established multi-nationals? 

Surely there are enough ethical executives out there to pick these projects up once the research has shown such potential? But as I've no doubt mentioned before, we are living in a world where the largest national economy - the United States, of course - spends more each year on pet grooming products than on nuclear fusion research. Will future historians dub our era the Decades of Dubious Sanity?

Meanwhile, immense amounts of plastic are dumped in landfill and the oceans, polluting everything - microplastics now even turn up in the food we eat. Isn't it time these researchers were given the backing they need to convert smart ideas into ecosystem saviours? After all, no-one, no matter how wealthy, can opt out of the planetary biosphere!

Monday, 23 November 2020

Self-destructive STEM: how scientists can devalue science

Following on from last month's exploration of external factors inhibiting the scientific enterprise, I thought it would be equally interesting to examine issues within the sector that can negatively influence STEM research. There is a range of factors that vary from the sublime to the ridiculous, showing that science and its practitioners are as prey to the whims of humanity as any other discipline. 

1) Conservatism

The German physicist Max Planck once said that a "new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." With peer review of submitted articles, it's theoretically possible that a new hypothesis could be prevented from seeing the light of day due to being in the wrong place at the wrong time; or more precisely, because the reviewers personally object to the ideas presented.

Another description of this view is the old saw that a new idea passes through three stages - first ridiculed, then violently opposed, finally accepted as self-evident - before the old guard take the theories of the young Turks as orthodoxy. One key challenge is the dislike shown by established researchers towards outsiders who promote a new hypothesis in a specialisation they have no formal training in.

A prominent example of this is the short shrift given to meteorologist Alfred Wegener when he described continental drift to the geological establishment; it took over thirty years and a plethora of evidence before plate tectonics was found to correlate with Wegener's seemingly madcap ideas. More recently, some prominent palaeontologists wrote vitriolic reviews of the geologist-led account of the Chicxulub impact as the main cause of the K-T extinction event. 

This also shows the effect impatience may have; if progress in a field is slow or seemingly negative, it may be prematurely abandoned by most if not all researchers as a dead end.

2) Putting personal preferences before evidence 

Although science is frequently sold to the public as having a purely objective attitude towards natural phenomena, disagreements at the cutting edge are common enough to become cheap ammunition for opponents of STEM research. When senior figures within a field disagree with younger colleagues, it's easy to see why there might be a catch-22 situation in which public funding is only available where there is consensus, and yet consensus can only be reached when sufficient research has placed an hypothesis on a fairly firm footing.

It is well known that Einstein wasted the last thirty or so years of his life trying to find a unified field theory without including quantum mechanics. To his tidy mind, the uncertainty principle and entanglement didn't seem to be suitable as foundation-level elements of creation, hence his famous quote usually truncated as "God doesn't play dice". In other words, just about the most important scientific theory ever didn't fit into his world picture - and yet the public's perception of Einstein during this period was that he was the world's greatest physicist.

Well-known scientists in other fields have negatively impacted their reputations late in their careers. Two well-known examples are the astronomer Fred Hoyle and the microbiologist Lynn Margulis. Hoyle promoted increasingly fruity ideas as he got older, including the claim that the Archaeopteryx fossil at London's Natural History Museum was a fake. Margulis, for her part, stayed within her area of expertise, endosymbiotic theory for eukaryotic cells, claiming her discoveries could account for an extremely wide range of biological functions, including the cause of AIDS. It doesn't take much to realise that if two such highly esteemed scientists can publish nonsense, then uninformed sections of the public might want to question the validity of a much wider range of established scientific truths.

3) Cronyism and the academic establishment

While nepotism might not appear often in the annals of science history, there have still been plenty of instances in which favoured individuals gain a position at the expense of others. This is of course a phenomenon as old as natural philosophy, although thankfully the rigid social hierarchy that affected the careers of nineteenth century luminaries such as physicist Michael Faraday and dinosaur pioneer Gideon Mantell is no longer much of an issue. 

Today, competition for a limited number of places in university research faculties can lead to results as unfair as in any humanities department. A congenial personality and an ability to self-publicise may tip the balance in gaining tenure as a faculty junior, while scientists with poor interpersonal skills can fare badly. Reputations can even be denigrated after death, as happened to DNA pioneer Rosalind Franklin in James Watson's memoirs.

As opponents of string theory are keen to point out, graduates are often forced to jump on bandwagons in order to gain vital grants or academic tenure. This suggests that playing safe by studying contemporary 'hot' areas of research is preferred to investigating a wider range of new ones. Nobel laureate and former Stephen Hawking collaborator Roger Penrose describes this as being particularly common in theoretical physics, whereby the new kids on the block have to join the entourage of an establishment figure rather than strike out with their own ideas.

Even once a graduate student has gained a research grant, it doesn't mean that their work will be fairly recognised. Perhaps the most infamous example of this occurred with the 1974 Nobel Prize in Physics. One of the two recipients was Antony Hewish, who gained the prize for his "decisive role in the discovery of pulsars". Yet it was his student Jocelyn Bell who argued for the signal's astronomical origin while Hewish was claiming it to be man-made interference.

4) Jealousy and competitiveness

Although being personable and a team player can be important, anyone deemed to be too keen on self-aggrandising may attract the contempt of the scientific establishment. Carl Sagan was perhaps the most prominent science communicator of his generation but was blackballed from the US National Academy of Sciences due to being seen as too popular! This is despite some serious planetary astronomy in his earlier career, including work on various Jet Propulsion Laboratory probes. 

Thankfully, attitudes towards sci-comm have started to improve. The Royal Society has advocated the notion that prominent scientists should become involved in promoting their field, since public engagement has commonly been judged by STEM practitioners as the remit of those at the lower end of scientific ability. Even so, there remains the perception that those engaged in communicating science to the general public are not proficient enough for a career in research. Conversely, research scientists should be able to concentrate on their work rather than having to spend large amounts of their time seeking grants or undertaking administration - but such ideals are not likely to come to pass in the near future!

5) Frauds, hoaxes and general misdemeanours 

Scientists are as human as everyone else and given the temptation have been known to resort to underhand behaviour in order to obtain positions, grants and renown. Such behaviour has been occurring since the Enlightenment and varies from deliberate use of selective evidence through to full-blown fraud that has major repercussions for a field of research. 

One well-known example is the Piltdown Man hoax, which wasn't uncovered for forty years. This was due more to the material fitting in with contemporary social attitudes than to the quality - or lack thereof - of the finds. However, other than generating public attention on how scientists can be fooled, it didn't damage science in the long run.

A far more insidious instance is that of Cyril Burt's research into the heritability of intelligence. After his death, others tried to track down Burt's assistants, only to find they didn't exist. This cast serious doubt on the reliability of both his data and his conclusions; even worse, his work was used by several governments in the late twentieth century as the basis for social engineering.

Scandals are not unknown in recent years, providing ammunition for those wanting to deny recognition of fundamental scientific theories (though rarely their practical applications). In this age of social media, it can take only one person's mistake - deliberate or otherwise - to set in motion a global campaign that rejects the findings of science, regardless of the evidence in its favour. As the anti-vaccination lobby has proven, science communication still has a long way to go if we are to combine the best of both worlds: a healthy scepticism with an acceptance of how the weird and wonderful universe really works, and not how we would like it to.

Tuesday, 27 October 2020

Bursting the bubble: how outside influences affect scientific research

In these dark times, when some moron (sorry, non-believer in scientific evidence) can easily reach large numbers of people on social media with their conspiracy theories and pseudoscientific nonsense, I thought it would be an apt moment to look at the sort of issues that block the initiation, development and acceptance of new scientific ideas. We are all aware of the long-term feud between some religions and science but aside from that, what else can influence or inhibit both theoretical and applied scientific research?

There are plenty of other factors, from simple national pride to the ideologies of the far left and right, that have prohibited theories considered inappropriate. Even some of the greatest twentieth-century scientists faced persecution; Einstein was one of the many whose papers were destroyed by the Nazis simply for falling under the banner of 'Jewish science'. At least this particular form of state-selective science was relatively short-lived: in the Soviet Union, theories deemed counter to dialectical materialism were banned for many decades. A classic example was Stalin's promotion of the crackpot biologist Trofim Lysenko, who denied the modern evolutionary synthesis and whose scientific opponents were ruthlessly persecuted.

Even in countries with freedom of speech, if there is a general perception that a particular area of research has negative connotations then, no matter how unfounded that perception, public funding may suffer accordingly. From the seemingly high-profile adulation of STEM in the 1950s and 1960s (ironic, considering the threat of nuclear war), subsequent decades have seen decreasing trust in both science and its practitioners. For example, the Ig Nobel awards have for almost thirty years been a high-profile way of publicising scientific projects deemed frivolous or a waste of resources. A similar attitude is frequently heard in arts graduate-led mainstream media; earlier this month, a BBC radio topical news comedy complimented a science venture for "doing something useful for once."

Of course, this attitude is commonly related to how research is funded, the primary question being: why should large amounts of resources go towards keeping STEM professionals employed if their work fails to generate anything of immediate use? I've previously discussed this contentious issue; despite the successes of the Large Hadron Collider and the Laser Interferometer Gravitational-Wave Observatory, there are valid arguments that such mega-projects could have been postponed until our species had dealt with fundamental issues such as climate change mitigation.

There are plenty of far less grandiose projects that could benefit from even a few percent of the resources given to the international, mega-budget collaborations that gain the majority of headlines. Counter to the 'good science but wrong time' argument is the serendipitous nature of research; many unforeseen inventions and discoveries have been made by chance, with few predictions hitting the mark.

The celebrity-fixated media tends to skew the public's perception of scientists, representing them more often as solitary geniuses than as team players. This has led to oversimplified distortions, such as that inflicted on Stephen Hawking for the last few decades of his life. Hawking was treated as a wise oracle on all sorts of science- and future-related questions, some far from his field of expertise. This does neither the individuals involved nor the scientific enterprise any favours. It makes it appear as if a mastermind can pull rabbits out of a hat, rather than hardworking groups spending years on slow, methodical and - let's face it, from the outsider's viewpoint - somewhat dull research.

The old-school caricature of the wild-haired, lab-coated boffin is thankfully no longer in evidence, but there are still plenty of popular misconceptions that even dedicated STEM media channels don't appear to have removed. For example, almost everyone I meet fails to differentiate between the science of palaeontology and the non-science of archaeology, the former of course usually being solely associated with dinosaurs. If I had to condense the popular media approach to science, it might be something along these lines:

  • Physics (including astronomy). Big budget and difficult to understand, but sometimes exciting and inspiring
  • Chemistry. Dull but necessary, focusing on improving products from food to pharmaceuticals
  • Biology (usually excluding conventional medicine). Possibly dangerous, both to human ego and our ethical and moral compass (involve religion at this point if you want to) due to both working theories (e.g. natural selection) and practical applications, such as stem cell research. 

Talking of applied science, a more insidious form of pressure has sometimes been applied by industry, either to keep consumers purchasing its products or to prevent them moving to rival brands. Various patents, such as those for longer-lasting products, have been snapped up and hidden by companies protecting their interests, while the treatment meted out to scientific whistle-blowers has been legendary. Prominent examples range from Rachel Carson's exposé of DDT, which led to attacks on her credibility, to industry lobbying of governments to prevent the banning of CFCs after they were found to be destroying the ozone layer.

When the might of commerce is combined with wishful thinking on the part of the scientist involved, it can lead to dreadful consequences. Despite a gathering body of evidence for smoking-related illnesses, the geneticist and tobacco industry consultant Ronald Fisher - himself a keen pipe smoker - argued for a more complex relationship between tobacco and lung disease. The industry used his prominence to obscure the truth, no doubt shortening the lives of immense numbers of smokers.

If there's a moral to all this, it is that even at a purely theoretical level science cannot be isolated from all manner of activities and concerns. Next month I'll investigate negative factors within science itself that have had deleterious effects on this uniquely human sphere of accomplishment.