Caveat: The following is based on my lay understanding of physics-based literature that I’ve read. I am not a quantum physicist nor any other type of physicist for that matter.
Several years ago French physicist Alain Aspect conducted a test of a proposition first formulated by John Bell in 1964 (Bell’s Theorem). Bell’s Theorem sets a limit that any local picture of reality must respect. Local here means that if you do something to x, it cannot have any effect on y if the two are separated by enough distance that, even at the speed of light, a signal from x could not cross the distance to y in the time it takes to measure y. Bell was reacting to the prediction of quantum physics that two particles (see the note on particles at the end) that have interacted with one another are from that point on entangled. Entangled means that when something is done to one (x) it will instantly affect the other (y), and the distance between the two is of no consequence. This is what Albert Einstein once referred to as “spooky action at a distance.” In short, what quantum physics predicted was that, at root, reality is non-local, that is, unbounded by space-time. Thus, experimental results that respect Bell’s limit would support locality, and results that violate it would support non-locality.
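Bell’s limit can be made concrete with a little arithmetic. In the standard CHSH form of Bell’s inequality, a certain combination of measurement correlations, S, must stay between −2 and +2 in any local theory, while quantum mechanics predicts values as large as 2√2 ≈ 2.83 for entangled pairs. Here is a minimal numerical sketch (my own illustration, using the textbook correlation formula for entangled spin-1/2 pairs and the angle settings that maximize S):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for entangled spin-1/2 pairs
    # measured along directions a and b (angles in radians)
    return -math.cos(a - b)

# Detector settings that maximize the CHSH combination
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, -math.pi / 4

S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)

print(abs(S))  # ≈ 2.828 (2 * sqrt(2)), exceeding the local limit of 2
```

Any local hidden-variable account must keep |S| ≤ 2; Aspect-style experiments measure values near 2.83, which is what the text means by a result supporting non-locality.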
It was not until near the end of the twentieth century that it became technically possible to conduct a controlled experiment testing the theorem. This experiment was done by Aspect, and the results supported non-locality. This resolved a debate that had gone on for decades between Albert Einstein and Niels Bohr (both Nobel Laureates in physics) in Bohr’s favor. Unfortunately, neither lived to see the debate resolved. The finding has been replicated and extended by subsequent experiments by other physicists, much to the chagrin of many in the physics community who are committed to a local view and choose to ignore the implications of the experiments.
Another paradoxical experimental outcome has been the wave/particle duality established by the famous double-slit experiment. As I understand it, the traditional double-slit experiment observes that when particles (e.g., photons or electrons) are directed at a panel with two slits, the particles produce an interference pattern on a sheet of film behind the panel. Think about dropping several pebbles close together into a pool of still water. Each pebble produces a ripple pattern, and because they are close together, the ripples interfere with one another, forming a complex pattern. This is called an interference pattern. In the experiment, the only way this pattern could be produced is if the particles went through both slits in a wave form rather than a particle form. If the particles had been in a particle form, they would have produced two separate bands on the film, indicating that no interference took place. If you repeat the experiment with detectors set up to identify which particles go through each of the slits as they pass through, what you get is a particle pattern on the film behind the panel. The implication is that observing and knowing which particles pass through each slit causes the wave form to collapse into a particle form. If you redo the experiment and take away the detectors, you once again get a pattern on the film indicating interference, and thus the particles must have gone through the slits as waves.
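For readers who like to see the wave picture quantitatively: the bright-and-dark fringe pattern follows from simple trigonometry on the path difference between the two slits. A small sketch (the slit separation and wavelength are illustrative values I chose, not from the article):

```python
import math

# Two-slit interference: relative intensity on the film at angle theta,
# for slit separation d and wavelength lam (idealized point slits)
def intensity(theta, d, lam):
    phase = math.pi * d * math.sin(theta) / lam
    return math.cos(phase) ** 2

d = 1e-6      # slit separation: 1 micrometer (illustrative)
lam = 500e-9  # wavelength: 500 nm (green light)

# Bright fringe at the center; dark fringe where the path difference
# between the two slits equals half a wavelength
print(intensity(0.0, d, lam))                    # 1.0 (constructive)
theta_dark = math.asin(lam / (2 * d))
print(round(intensity(theta_dark, d, lam), 6))   # 0.0 (destructive)
```

The alternating maxima and minima are the interference pattern described above; two streams of classical particles would instead pile up in two plain bands.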
In 1978 Princeton University’s John Wheeler proposed a thought experiment hypothesizing that the critical factor in the outcome of the famous double-slit experiment was not simply measurement of movement through the slits. This proposal is known as the “delayed choice” experiment, and it proposes that it is the decision to measure (made before an outcome is actually observed), not the measurement at the slits, that determines the observed outcome. Andrew Truscott, at the Australian National University (ANU), ran one of the most recent experimental tests of John Wheeler’s “delayed choice” thought experiment (click here). The result again confirmed Wheeler’s predicted outcome. Even if you wait and decide to make your measurement just before the “particle” hits the target film, after it has passed through the slits, you get a particle pattern on the screen instead of the expected interference pattern. In other words, it is the conscious decision, and the implementation of that decision, that determines the outcome, whether the decision is made before or after the “particle” passes through the slits. There is a concrete illustration of what is going on in this experiment offered by retired NASA physicist Tom Campbell, which you can see here. If you want a more detailed explanation click here.
In other words, Wheeler’s thought experiment asked what would happen if you did not use a detector until after the particles had passed through the slits and were about to hit the film. That is, measure the end result rather than the movement through the slits. The “particles” had already passed through the slits and, based on the prior experiments, should be in a wave form given that no measurement was made at the slits. Passing through the slits in a wave form is the only explanation for the interference pattern observed when the state of the particles has not been assessed at the slits. However, if the measurement is taken just before the particles hit the film, you get a particle pattern on the film, which implies that the particles did not pass through the slits as waves but as particles. The measurement just before the particles hit the film appears to retroactively affect the particles prior to their passing through the slits. Think of jumping off of a high dive into a swimming pool. Once you jump, you cannot reverse the action and return to the diving board, but the experiment seems to imply that at the quantum level this is possible. The lead researcher, Truscott, in the ANU experiment said about the result, “It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it.” This result also supports non-locality because it implies conscious actions can produce results that are outside of space-time (i.e., locality). Accepting that macro reality is built upon the principles that appear to govern micro reality, we may be due for some serious revisions of our conception of the nature of reality.
The results from testing Wheeler’s proposal have also been described as retroactive causation. What this is intended to convey is that the effect of the delayed-choice measurement actually went backward in time and changed the state of the “particles” before they passed through the slits. However, given the earlier experiment discussed that confirmed reality at the quantum level to be non-local (not bound by space-time), it may be unnecessary to invoke “time travel” to explain the results. Campbell has argued that a better explanation of the obtained results is the one that Wheeler himself proposed. According to Campbell, Wheeler thought that a particle is actually neither a wave nor a particle, though potentially either. It is an information packet. Campbell suggests that measurement is tantamount to a query of the information packet, which provides a data stream defining one of the potential outcomes available in the information. In this scenario the critical variable is not where the query is made (at the slits or at the film) but that it is made. In short, the slits only appear to cause the outcome. The real cause is whether a query is made or not. When you know does not matter so much as that you know. Wheeler thought reality was at root constructed from information. Campbell agrees and suggests that what we actually experience is a self-evolving virtual reality. Campbell is not the first to suggest this possibility. Three physicists (Silas R. Beane, Zohreh Davoudi and Martin Savage) published a paper in 2012 proposing that the universe appears to have characteristics similar to a computer simulation.
Recently, a physicist (Menas Kafatos) and a philosopher of science (Robert Nadeau) wrote a book (The Conscious Universe) explaining the debate and exploring the implications of Aspect’s experimental findings. In their view, the implication is that given the 12-14 billion-year age of the universe, every particle comprising the universe has had more than enough time to have interacted with every other particle. In short, every particle comprising the entire universe is entangled with every other particle. They propose that entanglement, non-locality, order and the manifestation of the physical dimension out of a wave of probabilities through measurement or observation require that consciousness be a fundamental aspect of the universe and a primary, not an emergent, property. Thus, if conscious intent, as many experiments suggest, is required for a particle to be manifest out of a range of probable outcomes present in the quantum field, then consciousness is primary and matter an emergent property.
Their interpretation of universal entanglement is that the universe is an undivided whole. This has serious consequences for both the ontological (matter is primary) and epistemological (understanding the whole from the parts; i.e., reductionism) foundations on which science has been based since the time of Newton. They argue that in the case of the universe the whole cannot be known from studying the parts because an indivisible whole cannot be the sum of its parts. Further, they argue that this imposes an event horizon on human scientific knowledge. There is a point beyond which analytic study of apparent parts will yield no useful results. They do think that science can play a role in expanding human knowledge, just that it has an inherent limitation beyond which it cannot pass. They also suggest that for science to make much further progress it must undertake a serious examination and revision of its paradigm (reductionist materialism).
The authors also explore at some length the role of Bohr’s Principle of Complementarity, which in physics is the tenet that a complete knowledge of phenomena on the quantum scale requires a description of both wave and particle properties. However, Bohr himself thought the principle to be more generally applicable and discussed some of its potential macro applications in such fields as biology and psychology. Kafatos and Nadeau think that many of the phenomena in the physical world and human culture can be thought of as complementary pairs such as good and evil, logic and intuition, life and death, male and female, thinking and feeling, order and chaos, etc. Each pair comprises a whole that defies complete understanding when examined as separate phenomena. They advocate more holistic approaches to the study of such phenomena.
One possibility explored is that the whole might be knowable through an intuitive process referred to as “knowing by being,” which is equated with reports by mystics through the ages. They suggest that it may be possible for an individuated aspect of universal consciousness to intuitively access the source and experience the whole (infinite mind, God, etc.). However, the knowledge of mystics is private and largely subjective, whereas scientific knowledge is public and relatively objective. Each has a legitimate claim on its particular knowledge and way of knowing, and both are experiential as opposed to being mere beliefs. The authors also point out that given their mutually exclusive but complementary natures, neither is capable of validating the other. They discuss the Indian system of yoga known as Kashmir Shaivism as possibly having the most to say to people from western culture about knowing by being. For a discussion of what yoga has to offer western science, read the free ebook by Donald DeGracia, PhD, titled What is Science?
End Note: It has been said that physicists have retained from the 19th century the use of the label “particle” for particular phenomena even though they know better. Think of an atom, which is generally thought to be a critical building block of the physical world. Our generic atom consists of an electromagnetic field populated with various “particles” such as protons, neutrons and electrons. What are these “particles” in an atom? We lay people are inclined to think of them as very small bits of matter; however, they are actually “excited states” of the field. Think of the ocean with waves arising from the surface. The wave is still the same water the ocean is composed of, just in a different state. One might say that an ocean wave is an excited state of the underlying ocean. Further, like a particle, a wave consists of nothing more or less than what it arose from, i.e., the ocean, which is analogous to the atom’s electromagnetic field. Or, to quote Albert Einstein, “There is no place in this new kind of physics both for the field and matter, for the field is the only reality.”
It has been suggested that one think of an atom as a field one hundred yards across with a green pea in the center (to represent the nucleus) and a BB at the outer edge (to represent an electron). This leaves a lot of room or unexcited area. Someone calculated that if you took a human being and removed everything except the “particles” in each atom comprising that individual and then repeated the process with every person on the planet, one could fit the human race inside the volume of a sugar cube. So, how is it that things composed of “matter” comprised of these atoms appear to be so dense? Why can’t you easily stick your hand through a wall? The answer seems to be something similar to opposing lines of force associated with the vibratory quality of the excited states within atoms. By way of analogy, think about attempting to push the like poles of two magnets together. Anyone who has attempted this has observed a considerable resistance for no apparent reason.
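The sugar-cube claim is easy to sanity-check with rough numbers (these figures are my own approximations, not from the article):

```python
# Back-of-envelope check of the "sugar cube" claim: a nucleus is
# roughly 1e-15 m across versus roughly 1e-10 m for the whole atom,
# so the "occupied" volume is a tiny fraction of the atom's volume.
nucleus_r = 1e-15          # m, approximate nuclear radius
atom_r = 1e-10             # m, approximate atomic radius
occupied_fraction = (nucleus_r / atom_r) ** 3   # about 1e-15

body_volume = 0.07         # m^3, roughly a 70 kg human
population = 8e9           # approximate world population

total = population * body_volume * occupied_fraction
print(total * 1e6, "cm^3")  # ~0.56 cm^3 -- about one sugar cube
```

With these rough inputs the answer lands near half a cubic centimeter, which is indeed about the volume of a sugar cube.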
Paradigms are conceptual models that serve an umbrella function for theories in diverse areas of study. For example, the current paradigm in science (see What is Science?) is scientific materialism. This paradigm serves an umbrella function for theories about such things as physical processes, biological processes and behavioral processes. It has its origins in the scientific revolution inspired by the thinking of Nicolaus Copernicus in the sixteenth century and Isaac Newton in the seventeenth century. Scientific materialism as a paradigm assumes that everything is comprised of physical particles (the principle of physicalism, the root assumption of the paradigm) governed by cause-and-effect relationships (the principle of causal determinism), that change is continuous (the principle of continuity), that phenomena occur within a finite space and over finite periods of time (the principle of locality), that phenomena have objective existence independent of observation (the principle of strong objectivity) and that phenomena can be understood by reducing them to their essential components (the principle of reductionism), which implies that phenomena are assembled from the bottom up, piece by piece. All theories falling under the umbrella share these basic assumptions. See Goswami’s Quantum Philosophy (Part I) and Goswami’s Philosophical Alternative for more detail.
One tenet of science as a methodology is that it holds to certain principles about the nature of knowledge. One of these principles is that our knowledge consists of models of reality, not elucidations of reality itself. In other words, what we know is always considered to be an approximation, never truth. Another principle is that what we know is held as tentatively valid until shown otherwise. How we know is through creating explanations for what appear to be related observations or facts about phenomena in the world. These explanations (a.k.a. theories) are then used to derive hypotheses that can be experimentally tested. Successful tests of hypotheses derived from theory increase the confidence that we can have in the explanation or theory. Confirmation of a hypothesis is sometimes possible by successful prediction of an outcome, such as the prediction of planetary motion based on a theoretical model or explanation of the forces governing such motion. In other and more confounded cases, confirmation of a hypothesis is sought through statistical testing in which a conclusion is reached based on probability calculations. The typical standard in such cases is p ≤ .05, which means the observed result would be expected by chance only 5 times in 100, or 1 time in 20. A standard such as this can, of course, produce some false positives, but it is considered an acceptable error rate for theory testing (see “What is Science?”).
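As a concrete illustration of the p ≤ .05 criterion (my own example, not from the article): suppose a subject correctly calls 15 of 20 coin flips. The probability of doing at least that well by pure chance is a simple binomial sum:

```python
from math import comb

# Probability of getting 15 or more of 20 fair coin flips right
# purely by chance (binomial distribution with p = 0.5)
n = 20
p_value = sum(comb(n, k) for k in range(15, n + 1)) / 2 ** n
print(round(p_value, 4))  # ≈ 0.0207 -- below .05, so "significant"
```

Since 0.0207 is below .05, the result would conventionally be judged statistically significant, though with a 1-in-20 criterion some false positives are inevitable, as the text notes.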
The flaw in this system is that a paradigm can come to be so central to the scientific process that it begins to be viewed as Truth. Once this happens, its assumptions acquire the status of dogma. When this occurs, theories subsumed by the paradigm become inoculated against results that are contrary to dogma, which also means contrary to the theory or theories grounded in the paradigm. Once this happens, science has become scientism. It appears that contemporary science is grappling with the problem of scientism. With the advent of quantum mechanics in the early twentieth century, the basic assumptions of scientific materialism were challenged. Experimental evidence refutes or strongly questions the validity of the principles or assumptions of scientific materialism enumerated above. Resistance to this challenge has been evident in a variety of fields that have simply ignored the challenges and continued to act as if nothing had changed. This is especially true in the cases of the biological and behavioral sciences. Many physical sciences have found ignoring the shifting paradigm more difficult. However, even in the physical sciences the tendency has been to attempt to limit the shift to effects occurring at the micro level and preserve the paradigm at the macro level. Unfortunately, experimental evidence is accumulating that demonstrates that quantum effects can also be detected, and thus have effects, at the macro level.
Another source of challenge to scientific materialism that became evident during the twentieth century was the results from psi experiments (e.g., see Spirituality and Religion). One early body of experimentation was that done by J.B. Rhine at Duke University. Rhine produced evidence that certainly should have caused some serious questioning of the adequacy of scientific materialism, but he and his results were widely rationalized away because they were inconsistent with the prevailing paradigm, suggesting that the assumptions of the paradigm had become dogma. Later in the twentieth century a large body of research was accumulated under the leadership of Robert Jahn at Princeton University’s Engineering Anomalies Research (PEAR) laboratory. This work too was rationalized away to maintain the integrity of the paradigm or, if you prefer, to preserve the dogma of scientism. In both cases, the correct scientific response should have been intense investigation rather than out-of-hand dismissal.
There are of course researchers who continue to pursue investigation into these challenges to scientific materialism (e.g., see the Society for Scientific Exploration). A large group of open-minded investigators have formed an organization (the Academy for the Advancement of Post-materialist Science) dedicated to finding a new and better paradigm. Whether or not a new paradigm is justified, careful investigation of challenges should be applauded, not ridiculed as is often the case by those wedded to scientism.
Traditionally science and the educated public have held a Newtonian view of the world, which is in most respects a common sense view rooted philosophically in materialism. The materialist model is reductionist and holds that all macro phenomena can be reduced to the basic building blocks of matter, i.e. atoms. The quantum model superseded this model nearly a century ago. However, the materialist model was not supplanted but subsumed. One can think of the materialist model as a special case subsumed within the quantum model, which works well enough for many purposes but has been shown to be capable of only inaccurate approximations when tasked with describing the reality underlying the world and indeed the universe. To see an outline comparing scientific materialism with Goswami’s alternative paradigm click here.
By way of analogy, think of a computer with a huge amount of RAM or working memory. Within this “working memory” there is nestled a small reserved area, which might be thought of as having a shell that partitions it off from the rest of working memory. Within this reserved area a self-evolving virtual reality program is running. The program has to follow certain rules, which impose limits on what it can produce but still allow a number of degrees of freedom for its operation. From the sheltered perspective of the virtual reality program, the reality created by the program is all there is, and the vast field of “working memory” within which it runs goes undetected. Think of the huge “working memory” as the unified field of consciousness, the shell around the reserved area as space/time, the self-evolving virtual reality program as the material model of reality and the rules that govern the operation of the program as classical (Newtonian) physics (see Figure below). With the advent of quantum physics, cracks have been opened in the shell. Through these cracks, the inhabitants of this world are beginning to get glimpses of a broader and deeper perspective on reality.
To appreciate the quantum perspective one needs to look at its impact on the defining aspects of the materialist model. The first aspect is causal determinism, or the hypothesis that the world is a machine like a mechanical clock. Events proceed in a linear fashion, where A is the antecedent for B and B is the antecedent for C and so on. In other words, classical determinism requires the identification of the originating cause and the end result. Experimental studies in quantum physics demonstrate that the exact position and velocity of an electron cannot both be known. In the Newtonian model, classical determinism depends upon being able to specify exactly both initial position and initial velocity. If these cannot be specified with precision, classical determinism is out the window because the beginning point for the causal chain can never be known. Thus, all one can do is create probability distributions (bell curves) for both variables and identify probable values for them. The two distributions of values together represent a wave of possibilities. Heisenberg, one of the co-founders of quantum mechanics, expressed this finding in his now famous uncertainty principle. What is left is statistical determinism.
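Heisenberg’s principle can be stated as Δx·Δp ≥ ħ/2. To see why this wrecks classical determinism for an electron, consider the minimum velocity uncertainty implied by confining an electron to an atom-sized region (the constants are standard values; the confinement size is my illustrative choice):

```python
# Heisenberg's uncertainty principle: dx * dp >= hbar / 2
hbar = 1.054571817e-34   # J*s, reduced Planck constant
m_e = 9.1093837e-31      # kg, electron mass

dx = 1e-10               # m, atom-sized confinement (illustrative)
dp = hbar / (2 * dx)     # minimum momentum uncertainty
dv = dp / m_e            # corresponding velocity uncertainty

print(dv)                # ~5.8e5 m/s -- enormous on the atomic scale
```

An irreducible velocity uncertainty of hundreds of kilometers per second is why no exact starting point for a causal chain can be specified at this scale.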
Why don’t we experience the effects of statistical determinism in everyday life? Planck’s constant h fixes the scale at which quantum effects are large. Fortunately, h is small, which means that quantum effects are “large” and easily observed only at the micro level. The small value of h hides quantum effects at the macro level. However, even macro objects have been demonstrated to retain some aspect of the wave of possibilities from which they collapsed. The wave aspect of a collapsed possibility continues to spread out over its probability distribution extremely slowly. Collapsed waves or objects (comprised of particles) are still governed by statistical determinism, but the collapsed wave spreads so slowly that its inherent uncertainty can be ignored for all practical purposes. However, even though it is hardly detectable with the most sophisticated instrumentation, the continuing spread of the collapsed wave implies that there remains some connection to the wave of possibilities that existed prior to collapse and material manifestation.
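The role of the small value of h can be illustrated with de Broglie wavelengths, λ = h/(mv), which set the scale at which wave behavior shows up (the electron and baseball figures are my own illustrative numbers):

```python
# Why quantum effects hide at the macro level: compare the de Broglie
# wavelength of an electron with that of a baseball
h = 6.62607015e-34                 # J*s, Planck's constant

electron = h / (9.11e-31 * 1e6)    # electron: m = 9.11e-31 kg, v = 1e6 m/s
baseball = h / (0.145 * 40.0)      # baseball: m = 145 g, v = 40 m/s

print(electron)   # ~7e-10 m -- atomic scale, wave effects visible
print(baseball)   # ~1e-34 m -- absurdly small, wave effects invisible
```

The electron’s wavelength is comparable to the size of an atom, so its wave nature matters; the baseball’s is roughly twenty orders of magnitude smaller than an atomic nucleus, which is why everyday objects show no detectable wave behavior.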
One way to think of this process might be to imagine that a wave of possibilities is like a continuous loop of images, where there are 6 images of A, 5 images of B, 4 images of C, 3 images of D, 2 images of E and 1 image of F, for 21 images in all. Thus, if one slows down the loop until one image becomes the focus, you have the collapse of the wave of possibilities. Statistical determinism tells us that the image that becomes the focus is most likely to be image A (p = 6/21, or about .29) but could be image F (p = 1/21, or about .05). The loop (wave) has taken on the appearance of a single frame (particle) or collapsed possibility wave (see Figure below). However, recall that one has only slowed down the loop, not frozen it. Thus, the loop is still progressing but in very slow motion. Whether you or other observers will ever detect this slow movement depends upon how long and with how much precision you observe the image. Even though one now observes only a single frame, that frame still retains a “hidden” connection to the loop. This analogy also illustrates the difficulty of identifying a linear chain of causation within a loop (wave of possibilities).
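The loop analogy can be simulated directly. Each “collapse” below draws one frame at random from a 21-frame loop with 6 A’s down to 1 F (a sketch of my own):

```python
import random
from collections import Counter

# 21-frame loop: 6 A's, 5 B's, 4 C's, 3 D's, 2 E's, 1 F
loop = list("AAAAAABBBBBCCCCDDDEEF")

random.seed(1)  # fixed seed so the run is reproducible
collapses = Counter(random.choice(loop) for _ in range(100_000))

# Over many "collapses" the observed frequencies approach the odds
# statistical determinism assigns: A at 6/21 (~0.29), F at 1/21 (~0.05)
print(collapses["A"] / 100_000)
print(collapses["F"] / 100_000)
```

Any single collapse is unpredictable; only the distribution over many collapses is determined, which is the sense in which statistical determinism replaces classical determinism.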
The second aspect of the materialist model is continuity or the hypothesis that all change is continuous. Experimental studies confirm that atomic energies exist at discontinuous energy levels, which are fixed. Thus, an electron cannot exist at intermediate energy levels residing between fixed levels. When an electron changes orbits, which are at fixed distances from the nucleus, it goes from one discrete energy level (orbit) to another in a single quantum leap. The electron’s change in orbit provides evidence for spatial discontinuity. This is further illustrated by the phenomenon known as quantum tunneling. This can be observed in transistors in which an electron disappears from one side of a barrier and reappears on the other side without passing through the barrier. More concretely, think about standing with your back to the wall of an empty room. You look to your left and there is your mother standing against the wall to your left. You look to your right and your mother is now standing against the wall to your right. You had a clear view of the entire room and you never detected your mother’s transit from the wall on the left to the wall on the right. The move was not a progressive transit of space over time but instantaneous. Your mother simply disappeared from one location and reappeared at a different location.
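The tunneling phenomenon mentioned above has a standard quantitative estimate: the probability of crossing a classically forbidden barrier falls off exponentially with barrier width, T ≈ exp(−2κL) with κ = √(2m(V−E))/ħ. A rough sketch (the 1 eV barrier height and 1 nm width are my illustrative choices, not from the article):

```python
import math

# WKB-style estimate of the probability that an electron crosses a
# barrier it classically cannot: T ~ exp(-2 * kappa * L)
hbar = 1.054571817e-34   # J*s, reduced Planck constant
m_e = 9.1093837e-31      # kg, electron mass
eV = 1.602176634e-19     # J per electron-volt

V_minus_E = 1.0 * eV     # barrier exceeds the electron's energy by 1 eV
L = 1e-9                 # barrier width: 1 nm

kappa = math.sqrt(2 * m_e * V_minus_E) / hbar
T = math.exp(-2 * kappa * L)
print(T)   # a small but nonzero probability of appearing on the far side
```

Small as it is, this nonzero probability is what makes tunneling observable in devices such as transistors and tunnel diodes.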
The third aspect is locality or the hypothesis that all effects and their causes occur in space with a finite velocity over a finite amount of time. Before quantum mechanics, all influences were assumed to be local, i.e., taking a certain amount of time to travel through a certain amount of space. Think about your mother walking from one side of the room described above to the other side. However, in quantum mechanics the discontinuous collapse of a sprawling possibility wave is instantaneous and therefore nonlocal. Think of your mother as a wave of possibilities, one of which is that she will manifest on the right side of the room. When that possibility is collapsed, your mother instantly materializes on the right side of the room. A possibility wave exists in transcendent potentia, that is, outside of space and time, which is why, when it collapses and becomes manifest within space-time, the effect is instantaneous. Nonlocal correlation (Einstein’s spooky action at a distance) between quantum objects has been experimentally verified and confirms that a transcendent domain is part of reality, which contradicts the locality assumption of the materialist model and affirms non-locality.
The fourth aspect is strong objectivity or the hypothesis that the material world is independent of observers (consciousness). However, as we’ve seen, the wave is transcendent and the particle is manifest. What then causes the transition from wave to particle? It is widely accepted that observation or measurement produces the collapse. Mathematician John von Neumann suggested that the operative property in observation or measurement is consciousness, since an instrument cannot observe anything. Think of a telescope pointed at the moon. Is the telescope observing the moon? Or is it the astronomer looking at the moon through the telescope who is observing the moon? While not conclusively demonstrated, it appears that consciousness chooses where a wave will manifest as a particle in a particular event. Thus, how can there be strong objectivity in physics if consciousness has the power to choose material reality? If consciousness causes wave collapse, the material world (collapsed waves) cannot be independent of observers.
The fifth aspect is reductionism or the hypothesis that every material phenomenon can be reduced to its essential components. If reductionism is correct, then all of physical reality can be reduced to elementary material particles. In other words, everything arises from the bottom up as aggregates of material particles coalesce into ever-larger objects, including us. However, if consciousness is needed to collapse waves of possibility into material actuality (particles), which is top down causation, one has mutually exclusive causal mechanisms. In such a case, reductionism ultimately fails.
In the materialist model, phenomena such as consciousness are considered as epiphenomena or secondary properties arising from matter. Thus, all non-material phenomena such as mind, thought and consciousness can ultimately be reduced to matter by considering them epiphenomena of the brain. From this view arises a dualism such that mind and body are from different classes of phenomena, i.e., a difference in kind. However, if consciousness has the causal power to determine material reality, how can it be a derivative of matter? The long, progressive build up to a material brain capable of producing consciousness could never take place, if consciousness is required for the collapse of a possibility wave into a particle of matter to begin with. Thus, while reductionism and bottom up causation may be a useful way of looking at phenomena within the context of the classical worldview, its utility is limited when it comes to understanding the ultimate nature of reality.
Thus, quantum physics has demonstrated empirically that the principles of causal determinism, continuity and locality do not in the final analysis hold up. Quantum physics has also raised serious doubts about but not yet empirically demonstrated that strong objectivity and reductionism are likewise ultimately invalid. These principles do work reasonably well in that subset of quantum reality that we think of as Newtonian or classical physics, which underlies the material model of reality. They just aren’t suitable for grasping the underlying nature of reality.
Given the above, what are the implications for how we view the nature of reality? Amit Goswami, emeritus professor of physics at the University of Oregon, offers some thoughts on this question. The implications he draws are radical and generally considered extreme by many physicists because they turn the world upside down. According to Goswami’s interpretation, Consciousness is the ground of all being and matter exists only as a possibility within consciousness. Thus, there is nothing but consciousness or, as some might say, God is all that is. You and I are material manifestations of God, as are plants, bacteria, insects, fish, animals, chairs, shirts, houses, shovels, pistols, water, earth, planets and stars, ad infinitum. From this one might jump to the conclusion that all things are interconnected. However, Goswami argues that this is an over-interpretation and that any two things are only potentially interconnected. They do, however, always have in common their origins in the unified field of consciousness.
Goswami says that dualism is an illusion. The belief that the mind is distinct from the brain, or that spirit is distinct from matter, or that man is distinct from God, is all an illusion. There is only the unified field of consciousness. Everything is a manifestation of consciousness. Consciousness permeates and fills our being, and our brain is a conduit for waves of possibility. When the self focuses upon a possibility, wave collapse takes place and there is awareness of the object of the choice. The presence of awareness implies a split between subject and object. One might ask: if we can affect reality by our choices, why isn’t there constant conflict and chaos? For example, everyone who buys a lottery ticket would choose to have the winning number, but obviously everyone’s choice can’t prevail. Goswami suggests that the reality that is commonly perceived is created by what might be thought of as a consensus of consciousness. We have the freedom to make choices that affect us but not consensus reality. We can’t personally change consensus reality.
The apparent split responsible for dualism is the product of the dependent co-arising of the subject that chooses and the objects of awareness. Herein objects refer to anything that is perceived as “not me” and can include ideas or thoughts as well as material objects. The consciousness from which both the subject and the object arise identifies with the subject pole of the dyad. This gives rise to the mistaken perception that there is a subject independent of objects. This mistake or illusion is necessary in order for experience, as we know it, to occur. The basis for this mistaken perception or illusion is self-reference, which is not unlike the circular meaning in the statement “I am a liar.” In this sentence the predicate defines the subject and the subject redefines the predicate, the predicate then redefines the subject, setting up an endless oscillation. This is called a tangled hierarchy. The meaning in this statement seemingly forever eludes us, as does the recognition that I (ego) and it (object) arise from the same source.
For Goswami the subject-object split is an epiphenomenon. If we don’t identify with the subject in the subject-object dyad we can escape the illusion. This state of consciousness is what the American mystic Franklin Merrell-Wolff called introception, which denotes consciousness without an object (and thus also without a subject). To experience a state of pure consciousness is to achieve enlightenment or bliss consciousness. The illusion of self develops as choices are made, memories are formed and habitual responses are established and reinforced. As this process unfolds, the range of free choice constricts and consciousness repeatedly collapses conditioned outcomes from among the myriad possibilities actually available. Thus, personal identity or what we call ego is created through a conditioned pattern of perception and response. Habitually adhering to this conditioned pattern in making choices is what the psychic Edgar Cayce referred to as following the path of least resistance or collapsing for oneself what is the most probable outcome or possibility. The freedom to make creative choices is always present but seldom exercised. Understanding that we have this freedom and the exercise of it allows us to step beyond ignorance and discover our true nature.