The Search for Meaning

It is difficult, perhaps even impossible, to pinpoint the exact moment that human beings first began to wonder about the nature of the world and their relationship to it. Nonetheless, some of the earliest attempts to do so fit quite comfortably into the category of myth. From the Greek mythos, meaning ‘story’, myths are commonly believed to be nothing more than fictions concocted by our ancient ancestors to explain the unknown. There are schools of thought, however, that hold the word originally referred to stories grounded in, or at least related in some way to, true events.


Either way, myths are traditional or legendary tales―usually grand or heroic―used to explain many things: the origins of cultural practices, natural phenomena, and, of course, creation itself―which is to say, how all things came to be.


There are numerous creation myths, and they can be found in almost every culture on Earth. They range from simple stories involving significant characters or beings, through to others that are far broader in principle. For example, the basic idea of the Hindu creation myth says that all things in the Universe arose out of the self-sacrifice of Purusha, a primordial or cosmic giant who had a thousand heads, faces, thighs, arms, eyes, and feet. In accordance with this myth, everything in existence―including the Sun and the moon, as well as the structural basis of the Indian caste system itself―is a rearrangement of Purusha.


Somewhat less specific are the creation myths of the Australian Aborigines. These stories refer to an early period called the ‘Dreamtime’, being the very foundation of life itself. In this time, the pre-existing ancestral spirits transformed a world of things and conditions into the structures of today. The ways of life, the law, and the moral code were set down to be followed eternally. The Dreamtime was the period of fashioning, organising, and moulding an unordered world, and is the foundation of the Aboriginal world view.


The Dao creation myth describes more of a process rather than specific characters being responsible for the formation of things. The essence of this myth says that in the beginning there was a featureless, yet complete, ‘something’. It was silent, amorphous, and stood alone and unchanging. It was called the ‘Way’, and it gave birth to unity, from which arose duality. Duality then created trinity, and this gave birth to the myriad creatures. This myth, or way of seeing, is simple and elegant, and incorporates many of the ideas that are evident in Origin as a model―from the greatest simplicity arises infinite complexity.


Religion is another vehicle people use to understand the world they live in and feel at home within it. Although religion incorporates mythology to some degree, it goes further, describing ways people should live in order to satisfy the gods or deities (or the belief systems that support the respective religion) and live joyous, meaningful and fulfilling lives. There are more than 4,000 different religions in the world today, and it is not the purpose of this work to explore them, but merely to acknowledge the value they hold for many people. When it comes to understanding how the Universe works, however, most people these days seem to want facts, proof and evidence, formulas and equations that give tangible and demonstrable results beyond sheer belief.


Enter science. Although, once again, it would be difficult to say exactly when science began, it is an approach that emerged from the impulse of philosophical enquiry. The word ‘philosophy’ comes from the Greek philo (from philein, meaning ‘to love’) and sophia (meaning ‘knowledge’ or ‘wisdom’)―together, the ‘love of wisdom’. In the broadest sense, then, philosophy relates to us as the ‘wanting to understand all things’, and therefore encompasses everything that might help us towards reaching this goal. By this definition, myth and religion fall under the umbrella of philosophy, as do science, metaphysics, theology, and pretty much every field of enquiry towards learning and understanding. Not surprisingly, this is why those who attain the highest academic award in any subject at university receive a Doctor of Philosophy (Ph.D.), no matter what the field.


The fact that philosophy underpins and essentially defines every attempt we make to understand the world can also be demonstrated through a fun experiment that you can do yourself right now. Think of any word you like, and enter it into a Wikipedia search. Then follow the first hyperlink in the main body of information about the word (excluding any italicised words). If you continue doing this you will, in almost every case, eventually end up on the philosophy page! This exercise helps demonstrate that all enquiry towards understanding the world and ourselves leads to philosophy.


Considering this area of philosophical enquiry, one of the first and most famous figures of whom humanity has some historical record is Socrates, who lived in the period 470 - 399 BCE. He is usually recognised and credited as one of the major contributors to the foundations of Western philosophy, so it is valuable to consider what he did in his time.


Socrates was an enigmatic and powerful philosopher. His style was to ask challenging questions, forcing people to think for themselves so that they might grow in wisdom and come to know themselves. He saw that ultimate wisdom and happiness came only through such knowing―a perspective that I, too, believe would be exceedingly beneficial, perhaps even necessary, in our quest to thoroughly understand our Universe and be happy in it.


Some of Socrates’ main contributions to humanity lie in the field of ethics. He held strong ideas about right and wrong and the way people should live. He believed, for example, that only philosophers―who had the greatest knowledge, virtue and ability―were the ‘right’ kind of people to govern others―a bold and challenging perspective that would ultimately bring about his demise. Socrates lived in a period of significant political unrest in Athens, and his virtuous views and values began to clash with those held by the majority of people, especially those in power. He questioned the Athenians’ beliefs in their gods, encouraged the youth to question everything, and generally ‘stirred the pot’ by exposing the logical flaws in the thinking of politicians. Although many admired these challenges to the status quo, and the often-humorous ways in which Socrates presented them, others became angry and felt that he threatened their very way of life. Consequently, Socrates was eventually brought before the courts and charged with ‘not recognising the gods recognised by the state, but introducing novel divinities and corrupting the young’. He was sentenced to death by hemlock poisoning, although when asked to propose his own punishment, he suggested ‘a wage paid by the government and free dinners for the rest of his life instead, to finance the time he spent as Athens’ benefactor’.


Perhaps the most significant contribution made by Socrates, especially in the context of this work, was that of the ‘Socratic Method’. The philosopher Plato first described this method in the Socratic Dialogues, a series of discussions between Socrates and other individuals of his time. The Socratic Method is a technique which breaks any problem down into a series of questions. The idea is that through reflecting on and answering these questions, the solution a person (or group of people) seeks will become apparent. It is a process of hypothesis elimination, which finds the better or more correct hypotheses by a systematic identification and elimination of hypotheses that lead to contradictions. This approach to problem solving has been a significant influence on what is known to us today as the ‘scientific method’.


The scientific method usually begins with observing the world we live in and questioning what arises from those observations. This questioning then leads to the development of hypotheses that might explain the phenomena observed. The best hypotheses lead to predictions that can be tested or measured in some way, and depending on how well the results match the predictions, this becomes a measure of the plausibility of the hypotheses. This process may then lead to the development of a general theory.


Socrates developed a method that helped lead to the current scientific method, and its very nature is at least one of the reasons he never recorded any of his thoughts and ideas himself. He was more interested in asking provocative questions than in having any specific philosophy attributed to his name. Therefore, a picture of the man can be formed only through his influence and impression on others. The value of much of his work stems from its interpretation by Plato, Socrates’ most prized student.


Plato (429 - 347 BCE) is one of the world’s best-known and most widely read philosophers of all time, and the work and ideas he presented throughout his life still influence our thinking today. The idea of ‘Platonic love’, named after him, is one of the better-known examples of his influence. Platonic love is a kind of love that is nonsexual, or chaste. It is derived from the idea that the perceived or felt love for another person can inspire the mind and soul in the direction of spiritual matters. In the modern, popular sense, it is seen as an affectionate, nonsexual relationship between two people.


In the context of this work, however, and its quest to understand the nature of existence and how things work, what is interesting is that Plato considered the world we live in as ‘not real’. He called the world we observe on a daily basis the material or perceived world, and regarded it rather as a representation of an essential world within or beneath it, which he called the ‘World of Forms’, or the ‘World of Ideas’. In this World of Forms, things were true, unchanging, eternal, and never faded or died. It was a perfect place that had always been there, always would be, and it informed the material world as an ever-changing approximation of itself. This is a powerful concept that we will later explore in more depth and endeavour to link to the modern principles of quantum mechanics. The connection between the two worlds also forms the basis of the first hint of the body-mind connection; the body, of course, existing in the material world, and the mind (or soul) having its place in the eternal World of Forms.


Elaborating on the work of Plato was Aristotle (384 - 322 BCE), Plato’s finest student, who contributed significantly to many fields of study including logic, physics, astronomy, meteorology, zoology, metaphysics, theology, psychology, politics, economics, ethics, rhetoric, and poetics. Important in the context of this work is that he was also the first recorded person to put forward a clear and practical description of the workings of the Universe―one that would evolve into the basic and agreed-upon model that we have today.


Around 350 BCE, in his work On the Heavens, Aristotle theorised that the Earth was spherical and fixed as the centre of the Universe and that the observed planets revolved around it. He also regarded the spherical, celestial bodies in the heavens as having a soul. This was an extension of Plato’s beliefs (although, unlike Plato, Aristotle did not believe that the soul was eternal), and this idea―or the reason that ideas of this nature emerge―will become apparent as the Origin model develops.


Aristotle’s proposal was the first recorded departure from the flat Earth belief held since the first days of self-conscious (self-aware) humans. Some five hundred years later, early in the second century, the Greek scientist and mathematician Claudius Ptolemy consolidated this proposal, describing the Earth as surrounded by fixed concentric spheres or shells which carry the planets. What he described is a model of the cosmos known as ‘Geocentrism’―geo meaning the Earth, and centrism meaning the centre―and this became the collectively accepted belief for the next fifteen hundred years―no doubt in part because the Church strongly endorsed the idea.


Christian scholars associated certain passages from the Old Testament with the geocentric model, such as ‘the world also shall be established that it shall not be moved’, and ‘let there be lights in the firmament of the heaven’.


‘Firmament’ is an ancient word for a structure that supports all creation. One aspect of the firmament is its appearance as a vault in the sky, rather like viewing the inside of a sphere. Interestingly, this word describes something critical that will be discussed in great detail in the Origin model. In addition, the geocentric model allowed for considerable interpretation by the Church, and the possibility of incorporating a ‘Heaven and Hell’ in the unknown, outer region beyond the ‘lights in the firmament’. There are similar references to geocentricity in the Qur’an, and Islamic astronomers also supported the model for an equal period.


The Ptolemaic model prevailed for a long time, and it was not until the mid-16th century that Nicolaus Copernicus put forward another idea―one now proven and in alignment with our current knowledge.


Copernicus was a Prussian priest, astronomer and mathematician, and is now commonly regarded as the person who first suggested that the Earth was not the centre of the Universe, as Aristotle had theorised. From his observations of the movement of celestial bodies, he claimed that the Sun was the centre (of the Solar System), and that the Earth moved around it along with the other observable planets. A model of this configuration is known as ‘Heliocentrism’―helio meaning the Sun and centrism, once again, the centre.


On reflection, it is quite surprising that this idea has been attributed to Copernicus. The proposal that the Sun was fixed, and that the Earth revolved around it, was first made much earlier by the Greek astronomer Aristarchus of Samos, around 270 BCE. Curiously, this was less than a century after Aristotle had proposed the geocentric model, which had gained traction while Aristarchus’ ideas faded into the background.


In a time when geocentrism was the accepted model among the people―with complete support from the Church―Copernicus was hesitant about publishing any of his controversial proposals. He did not want to risk the scorn ‘to which he would expose himself on account of the novelty and incomprehensibility of his theses’. Consequently, he first presented the theory of heliocentrism in his major publication On the Revolutions of the Celestial Spheres close to the time of his death.

In the late 16th to early 17th century, some fifty years later, two men―the Italian Galileo Galilei and the German Johannes Kepler―began to study and take a keen interest in the work of Copernicus. In 1608, a new and very exciting invention from the Netherlands hit the scene and piqued Galileo’s curiosity. The German-born Dutch spectacle maker Hans Lippershey had discovered that glass lenses placed in a tube somehow made objects viewed through it appear closer and larger. The invention, of course, was the telescope.


Galileo developed a special version of the telescope to peer at the skies, and began discovering some fascinating things. He observed the beauty and detail of the moon’s craters and the fact that there were many, many more stars in the heavens than anyone had ever imagined.


When he focused his telescope on Jupiter one night, he saw four objects (moons) circling the planet, and it was this most famous discovery that would have major consequences for astronomy―and for Galileo himself. From a scientific point of view, that night Galileo had gathered the evidence required to finally prove that the Earth was not the centre of the Universe and that Copernicus’s theory of heliocentrism was correct. If there were other bodies in the Universe in orbit, then perhaps the Earth itself could be in orbit too. When this idea was integrated with the observations and measurements of the other planetary paths and the Sun and the moon, it all began to add up. This finally spelled the end of Ptolemy’s geocentrism, and ultimately the end of Galileo’s freedom as well.


Due to the regard Christian religion held for geocentrism, Galileo’s work and the idea of heliocentrism were met with significant opposition from the Catholic Church. In fact, his findings were so confronting to the Church that in 1616 the Roman Inquisition declared heliocentrism to be formally heretical, stating that it was ‘foolish and absurd in philosophy’. Heliocentric books were banned and Galileo was ordered to refrain from holding, teaching or defending heliocentric ideas. He didn’t, however, and in 1633 he was brought before the inquisitor, found ‘vehemently suspect of heresy’, and ordered to spend the remainder of his years under house arrest. Galileo slowly went blind and in 1642 died from fever and heart palpitations. It took the Church more than 300 years to clear his name of heresy, when on October 31, 1992, Pope John Paul II expressed ‘regret for the Church’s treatment of the scientist’.


Johannes Kepler, the German mathematician, focused much of his work on the motion of celestial bodies, and eventually showed that the planets orbited in ellipses, not perfect circles as Galileo had believed. To this day, the laws describing the orbits of planetary bodies are known in his name as ‘Kepler’s laws of planetary motion’.

Thinking that Changed the World

Following on closely after the lives and work of Galileo and Kepler―and also somewhat inspired by them―came Isaac Newton, who is regarded as one of the most influential scientists of all time. He made major contributions in the fields of mathematics, light, and motion, but is best known for his work on gravity.


Isaac Newton was born on Christmas day 1642, three months after the death of his father, after whom he was named. He was an unusually brilliant man, yet despite his genius―or perhaps because of it―he was also troublesome, argumentative, and capable of extremely odd behaviour. It is recorded that at one point in his life he stuck a long needle into his eye just to see what would happen, and on another occasion, he stared at the Sun for as long as he could bear to see what effect it might have on his vision.


Throughout his life, Newton was well known for his outbursts of temper and vitriolic attacks upon his contemporaries. One of the more famous was his carefully orchestrated campaign to destroy the reputation of Gottfried Leibniz, the German polymath, who he (wrongly) believed had stolen his discovery of calculus (the mathematical study of how things change). Newton discredited Leibniz’s work in any way he could, claiming that it was plagiarised or that his discoveries revealed nothing new. Similarly, he had ongoing arguments with the English polymath, Robert Hooke, who was also a major contributor in understanding the nature of light. One of the biggest disagreements between them arose over whether light was a particle or a wave. This question, as we will discover later, is still a mystery and an area of much confusion and debate. The journey towards understanding the different perspectives and apparent dual nature of light has also contributed a great deal to the emergence of the modern physics of quantum mechanics.


In 1687, Newton published his Philosophiæ Naturalis Principia Mathematica [The Mathematical Principles of Natural Philosophy]. This work presented a science that exceeded anything that had come before. In it, Newton formulated the laws governing the motion of bodies and gravitation, and laid the foundations for classical mechanics. He described the workings of a Universe that ran as predictably as clockwork and could be explained and represented mathematically. Even today, Newton’s laws and equations still hold true for most practical applications.


In the context of this work, it is gravity that we are most interested in, for reasons that will become clear. Newton’s hallmark law of universal gravitation states that all bodies have mass and attract each other in a way that is proportional to their masses and inversely proportional to the square of the distance between them, i.e. the further away from each other they are, the weaker the force of attraction between them. Although this law does not define what gravity is, it does explain and describe the movement of the planets and how the Universe is held together.
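In modern notation (my own illustration, not Newton’s original wording), the law reads F = G·m₁·m₂/r². A short Python sketch, using standard modern values for the Earth and the moon, gives a feel for the numbers:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r**2
# (illustrative sketch; masses and distance are standard modern values)

G = 6.674e-11  # gravitational constant, N*m^2/kg^2

def gravitational_force(m1, m2, r):
    """Attractive force in newtons between masses m1 and m2 (kg) separated by r (m)."""
    return G * m1 * m2 / r**2

earth_mass = 5.972e24   # kg
moon_mass = 7.342e22    # kg
distance = 3.844e8      # m, mean Earth-moon distance

f = gravitational_force(earth_mass, moon_mass, distance)
print(f"Earth-moon attraction = {f:.2e} N")   # about 2e20 newtons

# Doubling the distance quarters the force: the inverse-square relationship
print(gravitational_force(earth_mass, moon_mass, 2 * distance) / f)  # ≈ 0.25
```

The last line shows the essence of the inverse-square law: at twice the separation, the pull is one quarter as strong.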


What is not so well known about Isaac Newton is that he was also a deeply religious man and wrote more on religion in his time than he did on science. An essay appended to his Principia (the General Scholium) deals mostly with his religious views. In it he revealed, for example, that he saw a ‘monotheistic God as the masterful creator whose existence could not be denied in the face of the grandeur of all creation’. Perhaps one of the most remarkable of his analyses in the General Scholium concerned his view that our power to freely move our bodies at will could give us insights into God’s relationship with creation. This is an idea I will explore and use to help explain the conditions that birthed us as creative beings in the later sections of this work.


For the next 200 years or so, the Newtonian worldview prevailed and provided the framework for the emergence of the industrial revolution. These were exciting times, and throughout the 18th and 19th centuries physicists began to think that most of the laws of physics were now understood, and that it wouldn’t take long to reach an understanding of how the Universe worked. This is accurately reflected in the words attributed to the Scottish mathematical physicist William Thomson (Lord Kelvin), who in the year 1900 is reported to have said: ‘There is nothing new to be discovered in physics now. All that remains is more and more precise measurement’.


But five years later, along came a 26-year-old patent clerk with a wild new idea. He had been developing a theory based on a synthesis of Newtonian mechanics and the relationship between electric and magnetic fields as espoused by the mathematical physicist James Maxwell. Together this synthesis indicated a completely different perspective on space, time, and light. That young man was Albert Einstein, and what he presented to the world in 1905 was the famous special theory of relativity that shook the foundations of physics to the core.



Today, Albert Einstein’s name is one that is often linked with the word ‘genius’, and he was certainly a man capable of thinking outside the square. Isaac Newton’s model of the Universe had claimed that ‘space’ was a three-dimensional absolute, that ‘time’ was also an absolute, and that there was little or no relationship between them, except that together they formed an arena in which events could take place. Einstein, in contrast, argued that space and time were not separate entities at all, but were inextricably entwined. Hence, he coined the phrase ‘space-time’ as the representation of a single, four-dimensional continuum (three dimensions of space, and one of time). Further, Einstein had realised that the speed of light always appeared to be constant, no matter what the speed of the observer.


As a 16-year-old, the young German-born Albert had entertained an extraordinary question: ‘What would the world look like if I were sitting on a beam of light?’ Contemplation of this question led him to the discovery that, unlike the speed of anything else we know, the ‘speed of light’ is entirely unusual. Our familiar experience of speed is that it is a cumulative phenomenon, not something independent and constant. To explain: if you throw a ball straight out from you at, say, 50 km/hour, then the ball is travelling―and will continue to travel―at 50 km/hour (ignoring wind resistance and gravity). If you now jump in your car and drive along at 100 km/hour, then throw the ball forward at 50 km/hour, the speed of the ball is 150 km/hour―the component speeds are cumulative.


When it comes to light, however, if you shine a light while standing still, it will travel away from you at the speed of light (represented by the letter ‘c’), which is roughly one billion km/hour. Now if you are driving in your car at 100 km/hour and then shine that same light, the speed of the light is still a billion km/hour, not a billion plus 100. So, what does this mean?
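A brief numerical sketch of my own makes the contrast concrete. In special relativity, speeds combine not by simple addition but by Einstein’s velocity-addition rule, u′ = (u + v)/(1 + uv/c²), which is indistinguishable from ordinary addition at everyday speeds yet always returns c when one of the speeds is c:

```python
# Einstein's velocity-addition rule: u' = (u + v) / (1 + u*v/c**2)
# (illustrative sketch; c is rounded to one billion km/h as in the text)

c = 1e9  # speed of light, km/h (rough figure)

def combine(u, v):
    """Relativistically combine two parallel speeds u and v (km/h)."""
    return (u + v) / (1 + u * v / c**2)

# Everyday speeds: the ball thrown from the moving car
print(f"{combine(100, 50):.6f}")   # 150.000000 — indistinguishable from simple addition

# Light shone from the moving car still travels at c, not c + 100
print(f"{combine(c, 100):.1f}")    # ≈ 1000000000.0 — still the speed of light
```

At human speeds the correction term uv/c² is vanishingly small, which is why we never notice it in daily life.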

To understand, we need to backtrack a little, to the work of James Maxwell, which expanded upon that of Michael Faraday in the early 1800s. In his time, Faraday explored and revealed much about the behaviour of electric and magnetic fields, but saw them as separate entities. It was Maxwell who showed that they were, in fact, different aspects of a single entity, and coined the term ‘electromagnetism’ to describe the relationship. He discovered that disturbances in electromagnetic fields generate energy that travels in a wavelike manner and at the speed of light. From that he deduced that light itself must be an electromagnetic wave.


The waves we are most familiar with in our experience travel in a medium, for example, ocean waves are energy carried by water, and sound waves are energy carried by air. The speed of those waves is a measurement with respect to the medium they travel in. Logically, then, if light was a wave, that raised the question, ‘What is the medium in which light travels?’


That question set physicists of the time on a long and unproductive search for the mysterious medium that allowed light to travel and against which its speed was measured. Many believed that this substance was ether (or the ‘luminiferous’ ether, because it carried light), a term that harked back to Plato’s days when it was seen as the non-physical ‘something’ that filled the region of space beyond the terrestrial. But the ether proved to be entirely elusive, and it required the likes of Einstein with his theory of relativity to finally shine some light on the matter (pun intended!).


Einstein was famous for his simplicity. In fact, he is quoted to have said, ‘Everything should be made as simple as possible, but no simpler’. Rather than continuing the fruitless search for the ether, Einstein simply dismissed the notion of it entirely. So, in answer to the question of the medium by which the speed of light was measured, he declared that it was measured with respect to anything and everything!


To understand the implications of this, and how it gave rise to the special theory of relativity, let’s look at a specific scenario. Imagine that you have just taken delivery of the latest and greatest automobile ever made―the very new and fancy Tesla EM Drive Superblaster, a vehicle capable of a top speed of 600 million km/hour. It’s the first of its kind and, of course, you can’t wait to show it off to your mates. So, you invite them around, and after due time is spent ogling the fine lines of this new super supercar, you fire it up, ready to impress. Someone challenges you to see if you can beat the speed of light, and although you know that your top speed is only a bit more than half that, you can’t resist the challenge, nor the opportunity to show everyone―including yourself―what this machine can do. So off you go, you against the light, and after racing for one hour (to keep the maths simple) you have, of course, covered exactly 600 million km. The light, on the other hand, has gone 1 billion km. When you get back you compare notes, and from your mates’ perspective, they see the difference in distance covered and conclude that: first, you lost the bet; and second, more importantly, the light must have been moving away from you at exactly 400 million km/hour.

You agree that you lost the bet, but you cannot agree on the speed that the light was moving away from you. You measured the speed from the car during the race, and from your perspective, the light was moving away from you at its constant 1 billion km/hour. My example, of course, is a somewhat ridiculous scenario, but it demonstrates the point. Both perspectives are correct and yet they yield conflicting results.
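The two ‘correct but conflicting’ measurements can be checked with a few lines of arithmetic, using the round figures from the story (a sketch of my own, not from the text). In the mates’ frame the gap between car and light grows at c − v; transforming the light’s speed into the driver’s frame, however, uses the relativistic rule u′ = (u − v)/(1 − uv/c²), which returns c again:

```python
# The race, in the round numbers used in the story
c = 1e9    # speed of light, km/h
v = 6e8    # the car's top speed, km/h

# Mates' frame: after one hour the light has covered 1e9 km and the car 6e8 km,
# so they see the gap between car and light growing at:
gap_rate = c - v
print(f"{gap_rate:,.0f} km/h")   # 400,000,000 km/h — the mates' conclusion

# Driver's frame: transform the light's speed with the relativistic rule
u_prime = (c - v) / (1 - c * v / c**2)
print(f"{u_prime:,.0f} km/h")    # 1,000,000,000 km/h — the light still recedes at c
```

Both numbers are ‘right’: the first is a separation rate measured by bystanders, the second is the speed the driver actually measures, and relativity is what reconciles them.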


On the face of it, this makes no sense at all. The only logical conclusion is that because speed is the measure of distance travelled divided by the time taken to travel it, and because the speed of light is constant, the example I have just given implies that somehow time and space must be affected by speed. And this is exactly what Einstein’s theory states―that not only do things move through space, they also move through time, and there is a direct relationship between them that must always equal the speed of light. As speed increases, time slows down; this effect is known as time dilation.
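Quantitatively, the slowing of time is governed by what physicists call the Lorentz factor, γ = 1/√(1 − v²/c²): a moving clock runs slow by exactly this factor. A small sketch (the formula is standard; the speeds chosen are my own examples):

```python
import math

def lorentz_factor(v, c=1.0):
    """gamma = 1 / sqrt(1 - v^2/c^2), with v given in the same units as c."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# At everyday speeds gamma is indistinguishable from 1 — no noticeable effect
print(lorentz_factor(0.0001))    # ≈ 1.000000005

# At 80% of light speed, gamma = 5/3: moving clocks run at 3/5 the normal rate
print(lorentz_factor(0.8))       # ≈ 1.6667

# One hour aboard a ship at 0.8c corresponds to about 100 minutes for a stationary observer
print(60 * lorentz_factor(0.8))  # ≈ 100 minutes
```

The same factor governs the other effects mentioned below: lengths contract by γ, and the effective mass grows by γ.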


Although the technology and equipment were not available to test Einstein’s theory at the time, these relationships have now been tested many times and proven to be correct. In fact, the reason we can all enjoy the convenience and extreme accuracy of moment-to-moment guidance from GPS (Global Positioning System) navigation these days is due to the understanding of, and accounting for, the time dilation described by relativity. The 24 satellites that form the basis of the system travel at speeds of about 14,000 km/hour, and this is enough to cause time differences between them and the Earth. If these differences were not accounted for, your GPS might prompt you to turn long before you had reached the turnoff and, of course, the effect would only escalate over time, making the technology useless.
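Using the figures just quoted, the size of the velocity effect is easy to estimate (a rough sketch of my own; the full GPS correction also involves gravity’s effect on time, which lies beyond this simple estimate):

```python
import math

c = 299_792_458.0     # speed of light, m/s
v = 14_000 / 3.6      # 14,000 km/h converted to m/s (≈ 3,889 m/s)

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Fractional difference in clock rate between satellite and ground
fraction = gamma - 1.0
print(fraction)                                     # ≈ 8.4e-11

# Accumulated clock offset over one day (86,400 seconds)
drift = fraction * 86_400
print(f"{drift * 1e6:.1f} microseconds per day")    # ≈ 7 microseconds per day

# Ranging error that offset would cause if left uncorrected
print(f"{drift * c / 1000:.1f} km per day")         # ≈ 2.2 km per day
```

A few microseconds sounds negligible, but light covers roughly 300 metres per microsecond, so the positional error would grow by kilometres every day without the correction.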

Time dilation is perhaps a little difficult to comprehend, but things get even more quirky. Special relativity goes further to predict that not only does time distort as things speed up, but also length decreases and mass increases. Einstein could only derive these predictions from his theory, and once again had no way of proving them in his day. But many experiments have since been done which show that the predictions are entirely accurate. So, what are we to make of all this?


Put simply, special relativity shows us that time and space are related and entirely subjective. We all carry our own measure, which is valid, yet different from that of anyone in constant relative motion (i.e. anyone who is moving with respect to you). Einstein’s theory of special relativity was a ground-breaking work, and yet he was not entirely happy with it because it conflicted with Newton’s theory of gravity. The prime concern was that if the speed of light was the fastest that anything could travel, it was also the fastest speed at which any influence could reach and affect something else. This conflicted with the apparent fact that the effects of gravity were ever-present and immediate over vast distances. For the next seven or eight years, Einstein knuckled down once again and began working on a theory of gravity that would complement his special relativity. Finally, in 1915 he emerged with the results―the general theory of relativity.


As mentioned earlier, Einstein’s ideas were shaped to a large degree by the work of Faraday and Maxwell on electromagnetism. A mysterious aspect of electromagnetic fields is their ability to influence things without touching them, and it is this fact that directed Einstein’s initial search for a theory of gravity. If electromagnetism generates a field of influence, then perhaps in a similar way gravity also might generate a gravitational field. Coupled with this major insight was Einstein’s discovery that gravity was, in fact, an acceleration. He stumbled upon this idea and referred to it as ‘the happiest thought of his life’, and it certainly was a perspective that would have a significant impact. This idea is counter-intuitive, but not difficult to grasp once we think it through.


When your body is at rest, or when it is moving with constant velocity (like in a car that is going a steady 100 km/hour), no net forces are acting upon it and you do not feel anything unusual. But when you press on the accelerator, you feel a force pushing you back into the seat, just as gravity presses you down into your chair right now. And that very comparison is the clue. The effects of acceleration and the effects of gravity are indistinguishable from one another, and Einstein called this relationship the principle of equivalence. From this he deduced that gravity was not so much a force but rather an acceleration, and that it is this acceleration which creates a gravitational field that distorts (curves or warps) space-time.


To fully grasp why this is so, let us go back to our Tesla Superblaster. If we were to travel along at a fixed speed and shine our light once again, this time straight up rather than straight ahead, the beam would move away from us in a straight line. If we then opened the throttle and accelerated to top speed, the beam of light would begin to curve. Light always travels the shortest distance between two points, which in flat space is a straight line, so the curved beam we are witnessing indicates that space itself is somehow curving in the cabin of our car. This is the basis of general relativity, which encompasses gravity in terms of the principles revealed in special relativity.


The full implications of the relativity theories, which generally speaking are now combined and referred to today as the theory of relativity―or more simply just relativity―are complex and far-reaching, and it is not the purpose of this work to explore them any more than I have. The main points of relativity that I have extracted and highlighted are that the speed of light is considered to be a constant, that experience is unique and relative to each and every observer, and that space-time and gravity are related. In combination, this indicates that relativity is something extremely significant in determining the nature of what we call ‘our reality’.


Reality, of course, is something well worth pondering, and it would seem that our view of it has moved from the more objective to the more subjective as time has marched on. The Newtonian model described a fixed, deterministic existence that could be measured, explained and predicted by mathematical formulae. The Einsteinian model shook that up a great deal, as we have discovered. But nothing has shaken science more than the weird and baffling, yet very well-established, physics of quantum mechanics, which describes a Universe where nothing is stable or fixed at all, but rather a vast probability field awaiting final determination upon measurement by an observer.


Quantum Mechanics

Quantum mechanics had its beginnings around the same time that Einstein was rolling out his theories of relativity. In the late 1890s, the German physicist Max Karl Ernst Ludwig Planck was puzzled as to why hot metal glowed red. Just as Newton’s observation of a falling apple produced the science that underpins mechanical engineering, and Einstein’s contemplation of a falling man produced the science that helps us fly spaceships to Mars, this simple observation and questioning was the first step in the development of a science that has enabled possibly the most influential technology ever―the computer.


The only way Planck could explain the phenomenon he observed was to hypothesise that energy was released in distinct packages rather than in a continuous stream. He called these packages quanta, and it is this name and principle that lie behind the term ‘quantum mechanics’ (quantum theory or quantum physics―the terms are interchangeable). Effectively, he showed that the very structure of existence was an assembly of quanta of certain discrete values―multiples of a particular constant, now known as the Planck constant. This is not unlike the fact that the population of the world is made up of discrete packages called people, some bigger, some smaller, but all people together making up the world population. Now this leads us back to the nature of light (energy)―is it a wave or is it a particle? In the 17th century, Isaac Newton and Robert Hooke argued furiously over which it was, as did many others. Today, light and all electromagnetic energy is considered to be both a wave and a particle. This phenomenon is known as wave/particle duality, and it is the cornerstone of quantum physics.
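Planck's idea can be put into numbers very simply: the energy of a single quantum of light is its frequency multiplied by the Planck constant. A small sketch in Python, using red light as my illustrative example:

```python
H = 6.626e-34          # Planck constant, in joule-seconds
C = 2.998e8            # speed of light, in metres per second

wavelength = 700e-9    # red light, about 700 nanometres (illustrative)
frequency = C / wavelength     # roughly 4.3e14 cycles per second
energy = H * frequency         # the energy of ONE quantum, in joules

# Any amount of red light carries a whole-number multiple of this
# tiny package (about 2.8e-19 joules), never half a quantum.
```

The packages are so small that light looks perfectly smooth and continuous to us, just as a distant crowd looks like a single mass rather than individual people.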


It all began in 1801 when Thomas Young from Great Britain devised a very simple and clever experiment to support his wave theory of light. The experiment is known as the ‘double slit’, and it is something very simple to do without expensive equipment. He showed that when light was shone through two narrow vertical slits in a card and onto a screen, a very distinct pattern of wave interference emerged, and this was irrefutable evidence that light was a wave. See Figure A.


               Figure A ― Wave Interference Pattern generated by shining light through two slits
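The pattern Young saw can be reproduced on paper with simple trigonometry: the brightness at each point on the screen depends on whether the two waves arrive in step or out of step. Here is a small sketch in Python; the slit and screen dimensions are illustrative assumptions of mine:

```python
import math

WAVELENGTH = 700e-9    # red light, in metres (illustrative)
SLIT_GAP = 0.1e-3      # distance between the two slits, metres (illustrative)
SCREEN_DIST = 1.0      # distance from slits to screen, metres (illustrative)

def brightness(x):
    """Relative brightness at position x on the screen (small-angle
    approximation): 1.0 where the waves reinforce, 0.0 where they cancel."""
    phase = math.pi * SLIT_GAP * x / (WAVELENGTH * SCREEN_DIST)
    return math.cos(phase) ** 2

# Bright bands repeat at regular intervals (7 millimetres here),
# with dark bands exactly halfway between them.
spacing = WAVELENGTH * SCREEN_DIST / SLIT_GAP
```

Plotting the brightness across the screen reproduces the alternating light and dark bands of Figure A.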

But then along came Einstein once again, with his work on the photoelectric effect (which won him the Nobel Prize in 1921), and showed that light was unquestionably a particle, now known as the photon. This agreed with Planck’s idea of the quantum being a discrete package of energy.


So back to the double slit experiment. If light did come in discrete little packages called photons, then logically, when these photons were fired at the two slits, they would go through both of them and appear on the screen as two vertical lines (being an accumulation of zillions of dots of light) directly behind the slits. As an analogy, consider the act of throwing paintballs at a wall. Between you and the wall there is a shield with two vertical slits in it, big enough to allow the balls to pass through. If you are a good shot, you might be able to get most of them through the slits (some might hit the shield, in the same way the photons would), but the result will be that after throwing enough paintballs, they will make an accumulation of marks on the wall in the shape of two vertical lines behind the shield.


However, when the experiment was done, this was not the result obtained. Instead, just as Young had shown a hundred years or so earlier, the interference pattern appeared, indicating that light was a wave. Very strange. Einstein’s work showed that light was a particle, but the double slit experiment continued to show that it was a wave. The scientists performing these experiments then decided that somehow the photons must be interfering with each other to create the wave-like appearance. So, they decided to fire single photons through the slits, one at a time. The unexpected and disturbing result was that once again the interference pattern appeared (Figure A).


The next move was to examine the interaction at the point where the photons were passing through the slits, and to this end, the scientists placed detectors at both slits to determine and measure which slit the photons were passing through. And now the most baffling thing occurred. The photons, after being measured, then appeared as expected―an accumulation of dots of light appearing as two bands on the screen behind! See Figure B.


Figure B ― Two bands of accumulated dots appearing when the photons are measured at the slits

This result was utterly inconceivable. But the spookiness didn’t stop there. They then discovered that if the sensors were activated but not recording any information about what was sensed, the result was once again an interference pattern. This suggested that the result of a photon appearing as a particle was not so much that it was being sensed, but rather that the information about it was being recorded.


This phenomenon has baffled scientists from the first time it was discovered through experimentation right up to this very day. Many people have considered the problem and developed explanations that have helped define and describe the science of quantum mechanics. One contributor was a man named Erwin Schrödinger, who saw the interference pattern of light appearing on the screen as representing a probability distribution, or ‘bell curve’. A curve of this nature describes the normal distribution of a variable. The highest point on the curve―the top of the bell―represents the most probable event. All other possible occurrences are distributed symmetrically on either side of the most likely event, creating a downward-sloping curve on each side of the peak. See Figure C.

Figure C ― Bell curve representation of a wave interference pattern

Bell curves can be ascribed to almost any statistical observation, and suggest that nothing is absolute or certain, but rather that there is a greater or lesser likelihood of something occurring, as the curve describes. This discovery was a significant piece of the wave/particle duality puzzle, and Schrödinger went on to suggest that a particle was not so much a fixed thing but rather a probability distribution. This perspective helped explain the phenomenon, and from it a highly successful mathematics was developed that predicts the outcomes of many experiments in particle physics with extremely high accuracy.
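The shape Schrödinger had in mind is described by a single, well-known formula. A small sketch in Python, with the centre and spread chosen purely for illustration:

```python
import math

def bell_curve(x, centre=0.0, spread=1.0):
    """Height of the normal distribution at x. Highest at the centre
    (the most probable event), falling away symmetrically on both sides."""
    return (math.exp(-((x - centre) ** 2) / (2 * spread ** 2))
            / (spread * math.sqrt(2 * math.pi)))

peak = bell_curve(0.0)                            # the most likely outcome
left, right = bell_curve(-1.0), bell_curve(1.0)   # equal heights: points
                                                  # equally far from the
                                                  # centre are equally likely
```

Nothing on the curve is ever zero: every outcome remains possible, merely more or less likely, which is exactly the spirit of quantum mechanics.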


Quantum mechanics shows quite clearly that whatever the true nature of particles is, they do not really exist as fixed items until they are measured and recorded as such. This discovery effectively settled the argument about whether light was a wave or a particle, and confirmed that it can be either.


Erwin Schrödinger continued to contribute a great deal to the science of quantum mechanics throughout his life, but his name is perhaps best known today for his famous thought experiment involving a cat. ‘Schrödinger’s cat’ has been popularised through the media in recent times and is almost a household name. To understand the purpose and implications of the experiment, we must backtrack just a little and talk about Niels Bohr.


Bohr was a Danish physicist and another major contributor to the field of quantum mechanics. He is perhaps now best known for his development of the very reasonable and useful model of the atom―the ‘Bohr model’ of atomic structure that describes electrons orbiting a nucleus of protons and neutrons. His interpretation of the double slit experiment, proposed in the 1920s and famously known as the ‘Copenhagen interpretation’, was that quantum particles do not exist in one state or another, but in all possible states simultaneously. This principle is known as quantum superposition. It is only when we observe its state that a quantum particle is essentially forced to choose one probability―its wave function collapses to become what we perceive.


However difficult this may be to comprehend, the idea might be conceivable or understandable for microscopic, fundamental particles. But when the interpretation is applied to larger, let us say more ‘real’ objects―like a cat―it’s a little more difficult to accept. Through his hypothetical cat experiment, Schrödinger attempted to demonstrate this point as follows: a cat is placed and sealed in a box, along with a bit of radioactive material and a device for detecting radiation. If any decay of the radioactive material is detected, the device triggers a hammer which breaks a flask containing hydrocyanic acid, which would then kill the cat.


The experiment takes place within a fixed period of one hour, which is long enough that some of the radioactive material could decay, but short enough that it is also possible none would. Effectively, there is a 50/50 chance as to whether the cat lives or dies. Since the cat cannot be observed during its time in the box, it cannot be said with any degree of certainty whether the cat is alive or dead. The Copenhagen interpretation, therefore, would say that while the cat is in the box, it exists in all possible states―some kind of zombie reality of both life and death, which is counter-intuitive and also somewhat ridiculous. And this is precisely the problem with quantum mechanics―it describes occurrences at the sub-atomic level beautifully and accurately, but fails when applied to cats and larger objects, especially the Universe itself.
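The 50/50 figure is simply the mathematics of radioactive half-life: if the material's half-life equals the length of the experiment, the chance of at least one decay is exactly one half. A small sketch in Python; the one-hour half-life is my illustrative assumption:

```python
import math

def chance_of_decay(hours_elapsed, half_life_hours):
    """Probability that at least one decay has occurred after the given time."""
    rate = math.log(2) / half_life_hours
    return 1.0 - math.exp(-rate * hours_elapsed)

p = chance_of_decay(1.0, 1.0)   # 0.5: the cat's fate is a coin toss
```

Leave the box sealed for two hours instead and the odds shift to three-to-one against the poor cat, but until someone looks, the Copenhagen interpretation insists that only the probabilities are real.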


Because quantum mechanics fails to reconcile the smallest with the largest, one of the prime goals of theoretical physics today is to resolve this and formulate an all-encompassing theory that explains everything, at all scales and all times. This ultimate theory, of course, is the ToE we spoke about in the introduction, and there have been many attempts to crack it. Einstein spent the last twenty-five years of his life trying to formulate a ‘Unified Field Theory’ but failed to develop anything conclusive. Many have claimed that he consequently ‘wasted’ the remaining years of his life, but overall Albert Einstein contributed more to understanding the mysteries of the Universe than any other person before him or since.


The Theory of Everything

There have been numerous attempts to discover the Unified Field Theory that Einstein sought, and many physicists believe that the highly theoretical, modern-day string theory is a strong contender. However, string theory has not yet provided satisfactory evidence or gained sufficient support, and is far from offering any practical application for us in the world.


For this reason, the two most accepted theories that remain today are Einstein’s relativity and quantum mechanics. Together they stand as two mighty pillars of modern physics, and yet they compete with, rather than complement, one another, and therefore cannot ever provide the ultimate solution.


To understand the challenge a little better, and to help deepen the framework for our discussion of Origin as a model that might solve the issue, we now need to talk about the four fundamental forces in the Universe which most modern physicists agree upon as being real and true. It is my view that the quest to understand the nature of all things and develop a comprehensive ToE must be something that directly links to our ability to understand and reconcile these forces, because it is these forces alone (and the relationship between them) that give rise to matter, motion, and interaction―which is pretty much what life is all about from a scientific point of view.


The four forces are gravity, electromagnetism, the strong nuclear force, and the weak nuclear force. We have discussed gravity previously, and it is, of course, the force that we are all most familiar with experientially, because it is that which prevents us from floating off into space! Gravity has an infinite reach, cannot be absorbed or transformed, nor can it be shielded against. According to current understanding, gravity is the weakest of the four forces because it has an almost negligible effect on sub-atomic structure in comparison to the other forces that occur within the atom. It is also considered to be the most mysterious force in the Universe because we do not know precisely what it is, or why it even occurs. To understand gravity is therefore essential in coming to a complete understanding of the nature of existence.
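Just how weak gravity is can be shown with a simple comparison: take two protons and divide the gravitational pull between them by the electrical repulsion between them. Because both forces fall off with the square of distance, the separation cancels out of the ratio entirely. A small sketch in Python, using commonly quoted values for the constants:

```python
G = 6.674e-11              # gravitational constant
K = 8.988e9                # Coulomb (electrical) constant
PROTON_MASS = 1.673e-27    # kilograms
PROTON_CHARGE = 1.602e-19  # coulombs

# The distance between the protons cancels when we take the ratio,
# so no separation needs to be assumed:
ratio = (G * PROTON_MASS ** 2) / (K * PROTON_CHARGE ** 2)

# Gravity between two protons is weaker than their electrical
# repulsion by a factor of roughly 10 to the power 36.
```

Gravity dominates our everyday experience only because large bodies like the Earth are electrically neutral overall, leaving gravity, feeble as it is, unopposed.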


The strongest of the four fundamental forces is the strong nuclear force and it binds the nucleus of an atom together. If it were not for the strong nuclear force, atoms could not exist, and in turn, matter could not be.


The electromagnetic force is a type of physical interaction between electrically charged particles, which is to say, the interaction of attraction or repulsion between positive and negative charges. This force plays an important part in determining the properties, and therefore the appearance, of everything we encounter in daily life. Everything is made up of matter, and that matter is a construction of atoms and molecules that form and link together courtesy of the electromagnetic force.


The weak nuclear force is more complicated to explain. It is stronger than gravity but weaker than the strong force that binds sub-atomic particles into a nucleus. While the other forces tend to hold things together, the weak force plays a role in things coming apart. ‘Things coming apart’ at the sub-atomic level is called radioactive decay, the process by which the nucleus of an unstable atom loses and emits energy. If the weak force did not exist, many types of matter―like uranium and plutonium―would be more stable, and consequently something like nuclear power would never be an option for us. A good example of the weak force in action is the initiation of the fusion reaction that converts hydrogen into helium, and the best example of that at work is the Sun.


These are the four fundamental forces of the Universe, and the first mighty theoretical pillar of physics―general relativity―focuses on only one of them to successfully explain the workings of the Universe on a large scale, i.e. the movement and relationship of celestial bodies, the stars, planets, galaxies, etc. That force is gravity. The other pillar―quantum mechanics―explains the workings of the Universe on an extremely small scale, i.e. atomic and sub-atomic particles, molecules, etc. It effectively describes the remaining three forces (electromagnetism, the strong and weak nuclear forces), but fails to incorporate gravity. Effectively, these two theories are mutually incompatible, and yet they are both extremely successful in their respective domains. Years of research by scientists have accurately confirmed predictions made by each, and both underpin very powerful and valuable technologies. Therefore, inarguably a successful ToE must be able to reconcile gravity (as represented by general relativity) with the principles of quantum mechanics, and the race is now on in physics to derive a theory named, not surprisingly, quantum gravity.


I hope that as we have briefly travelled through the history of physics as it relates to understanding the Universe and the nature of existence, it has become increasingly apparent that gravity is the hot topic, and understanding it is key. Quite clearly, for us to come to know what it is beyond the mere acceptance that it exists, would be something truly valuable. But before we have any hope of achieving that, we must now explore the beginning of our Universe in an entirely new way, and describe the original condition from which gravity resulted. Then, and only then, can all other conditions, relationships, and occurrences be derived with any degree of certainty. Surprisingly, this will be much easier than you might imagine.