Wednesday 20 November 2013

Newton and Einstein: fundamental problems at the heart of science

As previously discussed, Arthur C. Clarke's First Law is as follows: "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong." Now there have been many examples of prominent scientists who were proved wrong but refused to give up their pet idea - think astronomer Fred Hoyle and the Steady State Theory - or who bizarrely negated their own hypothesis, such as natural selection's co-discoverer Alfred Russel Wallace and his supernatural explanation of the human mind.

But although with hindsight we can easily mock pioneers who failed to capitalise on a theory that later proved canonical (assuming any theory except the second law of thermodynamics can ever be said to be the final word on the matter), there are some scientists who followed profoundly unorthodox paths of thought. In fact, I would go so far as to say that certain famous figures would find it almost impossible to maintain positions in major research institutes today. This might not matter if they were run-of-the-mill scientists, but I'm talking about two of the key notables of the discipline: Sir Isaac Newton and Albert Einstein.

The public perception of scientists has changed markedly over the past half century, from rational authority figures, via power-mad destroyers, to the uncertainties of today, when the often farcical arguments surrounding climate change have further undermined faith in scientific 'truth'. But the recognition of Newton and Einstein's achievements has never wavered, making them unassailable figures in the history of science. Indeed, if there were ever to be two undisputed champions of physics, or even of all science - as chosen by contemporary scientists, let alone the public - this contrasting pair would likely be among the most popular. Yet underneath their profound curiosity and dogged search for truth there are fundamental elements of their personal research that make the offbeat ideas of Wallace, Hoyle & co. appear only mildly idiosyncratic by comparison.

1) Sir Isaac Newton
While some historians have tried to pass off Newton's non-scientific work as typical of his age, his writings on alchemy, eschatology and the occult in general are at least as voluminous as those on physics. Some of the more recent examinations of his work have suggested that without these pseudo-scientific studies, Newton would not have gained the mind-set required to generate the scientific corpus he is renowned for. Although he claimed to have no need for hypotheses or 'occult qualities', preferring to examine natural phenomena in order to gain understanding, many of Newton's surviving notes suggest the very opposite. Whether he was using numerology to calculate the date of the end of the world, or alchemy to search for the Philosopher's Stone, the real Newton was clearly a many-faceted man. This led the economist (and owner of some of Newton's papers) John Maynard Keynes to label him "the last of the magicians". Indeed, key aspects of Newton's personality appear entirely in tune with pseudo-science.

It is well known that Newton was a secretive man, given to hiding his discoveries for decades and unwilling to share his theories. This was due partly to his wish to avoid wasting time on the less intelligent (i.e. just about everybody else) and partly to his fear of plagiarism, having frequently experienced conflicts with contemporary natural philosophers. To some extent this unwillingness to publish only exacerbated the issue, such as when Leibniz published his version of calculus some years after Newton had completed his unpublicised 'fluxions'.

Today, establishing scientific priority relies upon prompt publication, but Newton's modus operandi was much closer to the technique of the alchemist. Far from being merely an unsystematic forerunner of chemistry, alchemy was a subjective discipline, couched in metaphor and the lost wisdom of 'ancient' sages (who, after Newton's time, were frequently exposed as early Medieval or Ptolemaic Egyptian frauds). The purity of the practitioner was deemed fundamental to success, and various pseudo-scientific 'influences' could prevent repeatability of results.

In addition, such knowledge as could be discovered was only to be shared between a few chosen adepts, not disseminated to a wide audience for further examination and discussion. In personality then, Newton was far more like the pre-Enlightenment alchemist than many of his contemporaries. He believed in a sense of his own destiny: that he had been chosen by God to undertake the sacred duty of decoding now-hidden patterns in the universe and history. When Descartes postulated a 'clockwork universe', Newton opposed it on the grounds that it had no place for a constantly intervening deity. And surprising as it may seem, in that respect he had a lot in common with Einstein.

2) Albert Einstein
Einstein was in many ways a much more down-to-earth (and fully rounded) human being than Newton. Whereas the latter frequently neglected such basic human activities as food and sleep, Einstein indulged in pipe tobacco and playing the violin - shades of Sherlock Holmes, indeed! However, he was just as much a determined thinker when it came to solving fundamental riddles of nature. A good anecdote, possibly true, tells of how, whilst searching for a makeshift tool to straighten a bent paperclip, Einstein came across a box of new paperclips. Yet rather than simply use a new one, he shaped it into the tool required to fix the original paperclip. When questioned, he replied that once he had started a task it was difficult for him to curtail it.

But one of the oft-quoted remarks about him is that Einstein would have been better off spending his last two or three decades fishing, rather than pursuing a unified field theory. The reason is that despite being a pioneer of the quantum theory of light, he could not accept some of the concepts of quantum mechanics, in particular that it was a fundamental theory based on probability rather than simply an approximation to some as-yet-unknown deeper level of nature.

Even today there are only interpretations of quantum mechanics, not a complete explanation of what is actually occurring. However, Einstein considered these interpretations more akin to philosophy than science, and believed that following, for example, the Copenhagen interpretation prevented deeper inquiry into the true nature of reality. Unfortunately for him, the majority of physicists climbed aboard the quantum mechanics bandwagon, leaving Einstein and a few colleagues to try to find holes in such strange predictions as entanglement, which Einstein dismissed with the unflattering phrase "spooky action at a distance".

Although it was only some decades after his death that such phenomena were experimentally confirmed, Einstein insisted that the counter-intuitive aspects of quantum mechanics merely showed the theory's incompleteness. So what lay at the heart of his fundamental objections? After all, his creative brilliance had shown itself in his discovery of the mechanism behind Newtonian gravitation, no mean feat for so bizarre a theory. But his glorious originality came at a price: as with many other scientists and natural philosophers, from Johannes Kepler via Newton to James Clerk Maxwell, Einstein sought answers that were aesthetically pleasing. In effect, the desire for truth was driven by a search for beautiful patterns. Like Newton, Einstein wanted to understand the mind of God, however different the two men's concepts of a deity were (in Einstein's case, looking for the secrets of the 'old one').

By believing that at the heart of reality there is a beautiful truth, did Einstein hamper his ability to come to terms with such ugly and unsatisfying concepts as the statistical nature of the sub-atomic world? In this respect he seems old-fashioned, even quaint, by the exacting standards required - at least in theory - in contemporary research institutes. Critical thinking unhampered by aesthetic considerations has long been shown to be a myth when it comes to scientific insights, but did Einstein take aesthetics too far in his inability to accept the most important physics developed during the second half of his life? In some respects, his work after the mid-1920s seems as anachronistic as Newton's pseudo-scientific interests.

Even from these minimal sketches, it is difficult to believe that Newton would ever have gained an important academic post were he alive today, whilst Einstein, certainly in the latter half of his life, would probably have been relegated to a minor research laboratory at best. So although they may be giants in the scientific pantheon, it is an irony that neither would have gained such acceptance by the establishment had they been working now. If there's a moral to be drawn here, presumably it is that even great scientists are just as much a product of their time as any other human being, even if they occasionally see further than us intellectual dwarves.

Tuesday 14 May 2013

What, how and why? Are there 3 stages to science?

Not being philosophically inclined I was recently surprised to find myself constructing an armchair thesis: it had suddenly dawned on me that there might be three, broad phases or stages to the development of scientific ideas. I'm fairly certain I haven't read about anything along similar lines, so let me explain,  safe in the knowledge that if it's a load of fetid dingo's kidneys, it's entirely of my own doing.

Stage 1

Stage one is the 'what' phase: simply stated, it is about naming and categorising natural phenomena, a delineation of cause and effect. In a sense, it is about finding rational explanations for things and events at the expense of superstition and mysticism.  In addition, it utilises the principle of parsimony, otherwise known as Occam's (or Ockham's) Razor: that the simplest explanation is usually correct. 

Although there were a few clear moments of stage one in Ancient Greece - Eratosthenes' attempt to measure the size of the Earth using Euclidean geometry being a prime example - it seems to have taken off in earnest with Galileo. Although his work is frequently mythologised (I side with the rolling-weights-down-inclined-planes brigade rather than the dropping-objects-from-the-Leaning-Tower-of-Pisa one), Galileo most likely devised both actual and thought experiments to test fundamental findings, such as the separate effects of air resistance and gravity.

Of course, Galileo was primarily interested in physics, but the other areas of science followed soon after. Systematic biology came to the fore in such practical work as the anatomical investigations of William Harvey - pioneer in the understanding of blood circulation - and the glass bead microscopes of Antony van Leeuwenhoek. The work of the latter, interestingly enough, was largely aimed at understanding how small-scale structure in edible substances created flavours. It's also worth thinking about how this research expanded horizons: after all, no-one had ever before seen such miniature marvels as bacteria. I wonder how difficult the engravers of illustrated volumes found it, working from sketches and verbal descriptions of sights they had never seen themselves? But then again, no-one has ever directly imaged a quark either…

Talking of biology, we shouldn't ignore Carl Linnaeus, the Swedish scientist who devised the cataloguing methodology still in use today. The New Zealand physicist Ernest Rutherford may have disparagingly referred to all branches of science other than physics as mere stamp collecting, but quite apart from the wild inaccuracy of his statement, it seems obvious that without agreed standards of basic definitions there is no bedrock for more sophisticated research.

The repetitive, largely practical aspect of this phase in such disciplines as geology and taxonomy meant that largely untrained amateurs could make major contributions, such as the multitude of Victorian parsons (whose ranks Charles Darwin very nearly joined) who worked on the quantity-over-quality principle in collecting and cataloguing immense amounts of data. Of course, Darwin went far beyond phase one, but his work built on the evaluation of evolutionary ideas (try saying that three times fast) that numerous predecessors had discussed, from the Ancient Greeks to John Ray in the late Seventeenth Century.

This isn't to say that stage one science will be finished any time soon. The Human Genome Project is a good example of a principally descriptive project that generated many surprises, not least that it is proving more difficult than predicted to utilise the results in practical applications. Although in the BBC television series The Kingdom of Plants David Attenborough mentioned that the Royal Botanic Gardens at Kew holds 90% of known plant species, there are still plenty of remote regions - not to mention the oceans - yet to yield all their secrets to systematic scientific exploration. In addition to the biota yet to be described in the scientific record, the existing catalogues are undergoing major reorganisation. For example, the multitude of duplicate plant names is currently being addressed by taxonomic experts, an effort that has so far identified some 600,000 superfluous designations. It isn't just plants either: a recent example was the announcement that DNA evidence suggests there is probably only a single species of giant squid rather than seven. It may sound tedious and repetitive, but without comprehensive labelling and description of the natural world, it would be impossible to progress to the next stage.

Stage 2

Who was the first person to move beyond cataloguing nature to in-depth analysis? We'll probably never know, but bearing in mind that some of the Ionian philosophers and Alexandrian Greeks performed practical experiments, it may well have been one of them.

By looking to explore how phenomena occur and how events unfold the way they do, our species took a step beyond description to evaluation. If art is holding a mirror up to nature, then could the second phase be described as holding a magnifying glass up to nature, reducing a phenomenon to an approximation and explaining how that approximation works?

For example, Newton took Galileo and Kepler's astronomical work and ran with it, producing his Law of Universal Gravitation. The 'how' in this case is the inverse-square law, with its gravitational constant, which explained how bodies orbit their common centre of mass. However, Newton was unable to delineate what caused the force to act across infinite, empty space, an explanation that had to wait for stage three.
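
To make the 'how' concrete, here's a rough back-of-the-envelope sketch in Python, using approximate textbook values for the gravitational constant, the masses and the Earth-Moon distance (the figures and the example itself are just my illustrative choices), showing how far the law alone can take you:

import math

G = 6.674e-11        # gravitational constant in m^3 kg^-1 s^-2 (approximate)
m_earth = 5.972e24   # mass of the Earth in kg (approximate)
m_moon = 7.35e22     # mass of the Moon in kg (approximate)
r = 3.844e8          # mean Earth-Moon distance in metres (approximate)

# Newton's inverse-square law: the attractive force between the two bodies
force = G * m_earth * m_moon / r**2

# Kepler's third law in its Newtonian form: the period of the orbit about
# the pair's common centre of mass
period = 2 * math.pi * math.sqrt(r**3 / (G * (m_earth + m_moon)))

print(f"Force: {force:.2e} newtons")
print(f"Orbital period: {period / 86400:.1f} days")  # roughly 27 days, the sidereal month

Everything here is 'how': the strength of the force and the orbit that results. Nothing in it even hints at what gravity actually is.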

In contrast to the smug, self-satisfied attitude of scientists at the beginning of the Twentieth Century, the techniques of modern science suggest a feedback cycle in which knowing which questions to ask is at least as important as gaining answers; the adage in this case being 'good experiments generate new questions'. Having said that, some of the largest and most expensive contemporary experiments, such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Large Hadron Collider (LHC), were each principally designed to confirm a single hypothesis.

As recent evidence has shown, even some of the fundamentals of nature, including dark matter and dark energy, are only just being recognised. Science is therefore a long way from identifying all first principles, let alone understanding them. Closer to home, that most complex of known objects, the human brain, still holds a lot of secrets, and probably will continue to do so for some time to come.

Though microelectronics in general and computers in particular have allowed experiments in such fields as quantum teleportation, considered close to impossible by the finest minds only half a century ago, there are several reasons why computer processing power is approaching a theoretical maximum with current manufacturing techniques and materials. The near future may therefore see a slowing down in the sort of leading-edge experimental science that has been achieved in recent decades. But how much progress has been made in phase three science?

Stage 3

This is more difficult to define than the other two phases and can easily veer into philosophy, a discipline that gets a poor press from many professional scientists. The physicist Richard Feynman, for example, is supposed to have disparaged it as 'about as useful to scientists as ornithology is to birds'. Despite this - and the probability that there are as many philosophies of science as there are philosophers - it's easy to see that the cutting edge of science, particularly theoretical physics, generates as much discussion over its validity as any work of art. If you've read one of the myriad critiques of superstring theory, for example, then you will know that it can be viewed as a series of intellectual patterns (accompanied by diabolical equations) that may never be experimentally confirmed. In that case, is string theory really just a collection of philosophical hypotheses, unproven by experiment or observation and likely to remain so? The minuteness of the scale (an underwhelming description if ever there was one) makes the prospect of directly recording strings themselves - as opposed to their effects - highly unlikely.

If that is the case, then just where can you draw the line between science and philosophy? Of course, one of the fundamental tenets of a valid hypothesis is that it makes testable predictions that no other hypothesis can account for. But with over a century of theories that increasingly fail to follow common sense or match everyday experience, perhaps this is a sign of approaching maturity in science, as we finally advance beyond the crude limitations of our biological inheritance and its limited senses. Surely one key result of this is that the boundaries between new ideas promulgated by scientists and the thoughts of armchair philosophers will become increasingly blurred? Or is that just fighting talk?

Whereas scientists engaged in phase two investigations seek ever more accurate approximations for phenomena, phase three includes the search for why one theory is thought to be correct over another. A prominent example may help elucidate. Further to Galileo in phase one and Newton in phase two, Einstein's General Relativity, which explains the cause of gravity via the curvature of spacetime, is clearly an example of phase three. Of course, contemporary physicists would argue that Einstein's equations are already known to lack finality, due to their incompatibility with quantum mechanics. Herein lies the rub!

One problem that has caused dissension amongst many scientists is a possibly even more 'ultimate' question: why is the universe finely tuned enough for life - and, more than that, intelligent life - to exist? The potential answers cover the entire gamut of human thought, from the conscious design principle supported by some religiously-minded scientists, to the invocation of the laws of probability in a multiverse hypothesis, requiring an immense number of universes, each with different fundamental constants (and therefore including a lucky few capable of producing life). But an obvious issue arises: wouldn't Occam's Razor suggest the former is more likely than the latter? As the Astronomer Royal Sir Martin Rees has noted, this is veering into metaphysical territory, an area which, except for scientists with religious convictions, is usually avoided like the plague. However, it may eventually become possible to run computer models that simulate the creation of multiple universes and so, as bizarre as it seems, go some way towards creating a workable theory out of something that to most people is still a purely philosophical notion. Talk about counting angels on a pinhead!

I can't say I'm entirely convinced by my own theory of three stages to science, but it's been interesting to see how the history and practice of the discipline can be fitted into it. After all, as stated earlier, no-one has ever observed a quark - and quarks, in the first days of their formulation, were sometimes seen as purely mathematical objects anyway. So if you're doubtful I don't blame you, but never say never...

Friday 15 March 2013

Preaching to the unconverted: or how to convey science to the devout

It's said that charity begins at home, and perhaps the same is true of science communication: a recent conversation I had with a pious Mormon started me thinking about just how you promote science, both the method and the uncomfortable facts, to someone who has been raised to mistrust the discipline. Of course, there is a (hopefully) very small segment of the human race that will continue to ignore the evidence even after it is presented right in front of them, but thinking of those on the front line - such as biology teachers and 'outed' atheists in the U.S. Bible Belt - how do you present a well-reasoned set of arguments to promote the theory and practice of science?

It's relatively easy for the likes of Richard Dawkins to argue his case when he has large audiences of professionals or sympathetic listeners, but what is the best approach when endorsing science to a Biblical literalist on a one-to-one basis? The conversation above involved explaining just how we know the age of the Earth. This not being the first time I've been asked, I was fully prepared to hold forth on the likes of uranium-series dating, and not having to mention the 'D' words (Darwin or Dawkins) made it a relatively easy task. To aid any fans of science who might find themselves in a similar position, I've put together a small toolkit of ideas, even if the conversation veers into that ultimate of controversial subjects, the evolution of the human race:
  1. A possible starting point is to be diffident, explaining the limitations of science and dispelling the notion that it is merely the catalogue of sundry facts it is sometimes described as (for example, in Bill Bryson's A Short History of Nearly Everything). It is difficult but nonetheless profitable to explain the concept that once-accepted elements of scientific knowledge can be surpassed by later theories, yet maintain their usefulness on a special-case basis. A good illustration of this is Newton's Law of Universal Gravitation, which explains the force of gravity but not what creates it. Einstein's General Theory of Relativity provides a solution, but Newton's law is much easier to use, being accurate enough even to guide spacecraft. And since General Relativity cannot be combined with quantum mechanics, there is probably another theory waiting to be discovered…somewhere. As the British astrophysicist and populariser John Gribbin has often pointed out, elements at the cutting edge of physics are sometimes only describable via metaphor, there being nothing within human experience that can be used as a comparison. Indeed, no-one has ever observed a quark, and in the early days of the theory some deemed it just a convenient mathematical model. As for string theory, it's as bizarre as many a creation myth (although you might not want to admit that bit).
  2. Sometimes (as can be seen with Newton and gravity) the 'what' is known whilst the 'why' isn't. Even so, scientists can use the partial theories to extrapolate potential 'truths' or even exploit them via technology. Semi-conductors require quantum mechanics, a theory that no-one really understands. Indeed, no less a figure than Einstein refused to accept many of its implications.  There are many competing interpretations, some clearly more absurd than others, but that doesn't stop it being the most successful scientific theory ever, in terms of the correspondence between the equations and experimental data. So despite the uncertainty - or should that be Uncertainty (that's a pun, for the quantum mechanically-minded) - the theory is a cornerstone of modern physics.
  3. As far as I know, the stereotype of scientists as wild-haired, lab-coated, dispassionate and unemotional beings may stem from the Cold War, when the development of the first civilisation-destroying weapons led many to point their fingers at the inventors rather than their political paymasters. Yet scientists can be as creative as artists. Einstein conducted thought experiments, often aiming for a child-like simplicity, in order to obtain results. The idea that logic alone makes a good scientist is clearly bunkum. Hunches and aesthetics can prove as pivotal as experimental data or equations.
  4. Leading on from this, scientists are just as fallible as the rest of us. Famous examples range from Fred Hoyle's belief in the Steady State theory (and, strangely, his claim that the original Archaeopteryx fossils were fakes) through to the British scientific establishment's forty-year failure to recognise that the Piltdown Man finds were crude forgeries. However, it isn't always as straightforward as these examples: Einstein's greatest blunder - the cosmological constant - was abandoned after the expansion of the universe was discovered, only to reappear in recent years as a way of accounting for dark energy. And of course mistakes can prove more useful than finding the correct answer the first time!
  5. There are numerous examples of deeply religious scientists, from Kepler and Newton via Gregor Mendel, the founder of genetics, to the contemporary British particle physicist the Reverend John Polkinghorne. Unlike the good versus evil dichotomy promoted by Hollywood movies, it's rarely a case of us versus them.
  6. Although there are searches for final theories, such as a Grand Unified Theory of the fundamental forces, one way in which current science differs profoundly from the attitudes of a century or so ago is the acceptance that a final set of solutions may never be found. Indeed, a good experiment should generate as many new questions as it answers.
  7. If you feel that you're doing well, you could explain how easy it is to be fooled by non-existent patterns and that our brains aren't really geared up for pure logic. It's quite easy to make statistics appear to say something different by using left- or right-skewed graphs, or by putting a logarithmic scale on one axis. In addition, we see correlations that just aren't there but which we would like to think are true. In the case of my Mormon colleague, he was entrenched in the notion of UFOs as alien spacecraft! At this point you could even conduct an experiment: make two drawings, one of a constellation and one of evenly-spaced dots, and ask them to identify which one is random. Chances are they will pick the latter (see the short sketch below for generating the two sorts of pattern). After all, every culture has seen pictures in the random placements of stars in the night sky (or the face of Jesus in a piece of toast).
Ursa Major (see what you like) vs evenly-spaced dots
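
If you'd rather generate the two patterns than draw them, here's a rough sketch in Python; the point counts, panel sizes and crude text rendering are just arbitrary choices of mine:

import random

def random_dots(n=25, size=10):
    # genuinely random placements, which tend to clump and leave gaps
    return [(random.uniform(0, size), random.uniform(0, size)) for _ in range(n)]

def even_dots(rows=5, cols=5, size=10):
    # a regular lattice, which most people mistake for randomness
    return [((i + 0.5) * size / cols, (j + 0.5) * size / rows)
            for i in range(cols) for j in range(rows)]

def show(points, size=10, grid=20):
    # crude character rendering so no plotting library is needed
    canvas = [[' '] * grid for _ in range(grid)]
    for x, y in points:
        canvas[min(int(y / size * grid), grid - 1)][min(int(x / size * grid), grid - 1)] = '*'
    print('\n'.join(''.join(row) for row in canvas))

show(random_dots())   # chance 'constellations', clusters and empty patches
print('-' * 20)
show(even_dots())     # the evenly-spaced panel people tend to call 'random'

The genuinely random panel almost always contains clumps and voids, which is exactly why our pattern-hungry brains read constellations, faces and UFOs into it.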

So to sum up:
  1. There's a fuzzy line at the cutting edge of physics and no-one understands what most of it means;
  2. We've barely started answering fundamental questions, and there are probably countless more we don't even know to ask yet;
  3. Science doesn't seek to provide comforting truths, only to gain objective knowledge, but...
  4. ...due to the way our brains function we can never remove all subjectivity from the method;
  5. No one theory is the last word on a subject;
  6. Prominent scientists easily make mistakes;
  7. And most of all, science is a method for finding out about reality, not a collection of carved-in-stone facts.
So go out there and proselytise. I mean evangelise. Err...spread the word. Pass on the message. You get the picture: good luck!

Sunday 3 January 2010

What's in a label? How words shape reality

With the start of a new year it seems appropriate to look at how our perception of the universe is created via language - after all, there's no position in space identifying an orbital starting point. We grow up with a notion of reality that is largely defined by convenience and historical accidents embedded in our language and therefore our thought patterns (and vice versa). For at least the last six hundred years many societies have called our planet Earth, whilst of course Ocean would be more appropriate. Whilst this is just an obvious chauvinism for a land-based species, there are other terms that owe everything to history. We count in base ten, run zero longitude through the Greenwich Meridian and usually show the Earth from one perspective, despite there not being an arrow in our galaxy stating 'this way up' (but then, had the Ancient Egyptians' view prevailed, Australia and New Zealand would be in the Northern Hemisphere).

So how far can we go with constructs? Our calendar is an archaic, sub-optimal mish-mash: because the Roman year originally began in March (with Quintilis and Sextilis later renamed July and August), the last four months of the year are inaccurately named seven through ten. The changeover from the Julian to the Gregorian calendar varied from nation to nation, meaning well-known events such as the birth of George Washington and the Bolshevik Revolution have several dates depending on the country defining that piece of history. As for the majority of humans agreeing that we are now in AD 2010, thanks to a fifteen-hundred-year-old mistake by Dionysius Exiguus our current year should really be at least AD 2014, if we accept that an historical figure called Jesus of Nazareth was born during the lifetime of Herod the Great. It appears that even the fundamentals that guide us through life are at best subjective, and in many cases far from accurate.

The philosopher of science Thomas Kuhn argued that all scientific research is a product of the culture of the scientists engaged in it, so whilst we might argue that Galileo was the first scientist in a strictly modern sense of the word, can there be a definitive boundary between the quasi-mystical thought processes of Copernicus and Kepler (and even Newton) and those of the modern exponents typified by Einstein and Hawking? Whilst we would like to believe in a notion of pure objectivity, scientists are just as subjective as everyone else, and their theories are therefore built on assumptions directly related to history, both cultural and biological.

We use labels to comfort ourselves, even boost our egos, via unconscious assumptions that look ever more ridiculous as we delve deeper into the mysteries of creation. For example, the past sixty-five million years is a period frequently named 'the Age of Mammals'. Yet as Stephen Jay Gould was fond of pointing out, most of the world's biomass is microbial, and we macroscopic life forms are comparative newcomers, restricted to a far narrower range of environments than bacteria, protists and other small-scale organisms.

Despite such sense-expanding tools as infra-red telescopes and electron microscopes, we still process sensory input and use primarily audio-visual output to define scientific theories and methodology. We are in thrall to the languages we use to define our thoughts, both conversational language and mathematics. Although the lingua franca of science has varied over the centuries, all its languages, from Latin to English, have one thing in common: they are used to tell us stories. At a basic level, the history of science is riddled with fables and apocrypha, from Newton being hit by an apple (and inventing the reflecting telescope) to Galileo dropping weights from the Leaning Tower of Pisa, even Columbus believing the world was a sphere (he didn't - he thought it was pear-shaped!)

So if scientific history cannot be relied upon, what about the hypotheses and theories themselves? In the words of John Gribbin, we construct 'Just So' stories to create a comprehensible version of reality. Presumably this reliance on metaphor will only increase as our knowledge becomes further divorced from everyday experience while our technology fails to keep pace in confirming new theories; for example, it is far from likely that we will ever be able to directly view a superstring.

In addition, language doesn't just restrict our ideas: if a term has a scientific sense that differs from its vernacular meaning, problems frequently arise. A classic example would be 'quantum leap', which to most people means an enormous step forward but to physicists is an electron's minuscule change of energy level. However, even personal computer pioneer Sir Clive Sinclair used the term in its former meaning for his 1984 Quantum Leap microcomputer (at least I assume he did, although QL owners may disagree...)

Speaking of which, perhaps when we finally build (or machines build for us) computers capable of true artificial intelligence, new ways of exploring the universe not tied down to conventional linguistic-based thought patterns may arise. Then again, since we will be the parents of these machines, this may not be feasible. As one of Terry Pratchett's characters stated: "I think perhaps the most important problem is that we are trying to understand the fundamental workings of the universe via a language devised for telling one another where the best fruit is." But all things considered, we haven't done that badly so far.
