Caveat: The following is based on my lay understanding of the physics-based philosophical literature that I’ve read. I am not a quantum physicist nor any other type of physicist for that matter.
Some years ago the French physicist Alain Aspect conducted a test of a proposition first formulated by John Bell in 1964 (Bell’s Theorem). Bell’s Theorem shows that if the nature of reality is local, then the correlations between certain measurements cannot exceed a fixed statistical limit (Bell’s inequality). What local means is that if you do something to x, it cannot have any effect on y if the two are separated by enough distance that, even at the speed of light, the effect could not cross the distance between x and y in the time it takes to measure y. Bell was reacting to the prediction of quantum physics that two particles (see the note on particles at the end) that have interacted with one another are from that point on entangled. What this means is that when something is done to one (x), it will instantly affect the other (y), and the distance between the two is of no consequence. This is what Albert Einstein once referred to as “spooky action at a distance.” In short, what quantum physics predicted was that at root reality is non-local, meaning not bound by space-time. Thus, an experiment in which Bell’s inequality held would support locality, and a violation of the inequality would support non-locality.
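For readers who want to see the shape of the limit Bell derived, here is the CHSH form of his inequality, the version Aspect’s experiments actually tested. The formula is my own addition for illustration, not something from the sources discussed here; E(a, b) denotes the measured correlation between detector setting a on one particle and setting b on the other.

```latex
% CHSH form of Bell's inequality: any local theory must satisfy
\[
  S \;=\; \bigl| E(a,b) - E(a,b') + E(a',b) + E(a',b') \bigr| \;\le\; 2 .
\]
% Quantum mechanics predicts that entangled pairs, measured at
% suitably chosen detector angles, can reach
\[
  S_{\mathrm{QM}} \;=\; 2\sqrt{2} \;\approx\; 2.83 ,
\]
% and Aspect observed S > 2, the non-local result.
```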
It was not until the early 1980s that it became technically possible to conduct a controlled experimental test of the theorem. This experiment was done by Aspect, and the results supported non-locality. It resolved, in Bohr’s favor, a debate that Albert Einstein and Niels Bohr (both Nobel Laureates in physics) had carried on for decades. Unfortunately, neither lived to see the debate resolved. The finding has been replicated and extended in subsequent experiments by other physicists, much to the chagrin of many in the physics community who are committed to a local view and choose to ignore the implications of the experiments.
Recently, a physicist (Menas Kafatos) and a philosopher of science (Robert Nadeau) wrote a book (The Conscious Universe) explaining the debate and exploring the implications of Aspect’s experimental findings. In their view, given the 12-14 billion year age of the universe, every particle comprising the universe has had more than enough time to have interacted with every other particle. In short, every particle in the entire universe is entangled with every other particle. They propose that entanglement, non-locality, order, and the manifestation of the physical dimension out of a wave of probabilities through measurement or observation require that consciousness be a fundamental aspect of the universe, a primary rather than an emergent property. Thus, if conscious intent is required for a particle to be manifest out of a range of probable outcomes present in the quantum field, as many experiments suggest, then consciousness is primary and matter an emergent property.
Their interpretation of universal entanglement is that the universe is an undivided whole. This has serious consequences for both the ontological (matter is primary) and epistemological (understanding the whole from the parts, i.e., reductionism) foundations on which science has been grounded since the time of Newton. They argue that in the case of the universe the whole cannot be known by studying the parts, because an indivisible whole is not the sum of its parts. Further, they argue that this imposes an event horizon on human scientific knowledge: there is a point beyond which analytic study of apparent parts will yield no useful results. They do think that science can continue to expand human knowledge; they simply hold that it has an inherent limit beyond which it cannot pass. They also suggest that for science to make much further progress it must undertake a serious examination and revision of its paradigm (reductionist materialism).
The authors also explore at some length the role of Bohr’s Principle of Complementarity, which in physics is the tenet that complete knowledge of phenomena on the quantum scale requires a description of both wave and particle properties. Bohr himself, however, thought the principle more generally applicable and discussed potential macro-level applications in such fields as biology and psychology. Kafatos and Nadeau think that many phenomena in the physical world and in human culture can be thought of as complementary pairs, such as good and evil, logic and intuition, life and death, male and female, thinking and feeling, and so on. Each pair comprises a whole that defies complete understanding when its members are examined as separate phenomena. They advocate more holistic approaches to the study of such phenomena.
One possibility explored is that the whole might be knowable through an intuitive process referred to as “knowing by being,” which is equated with reports by mystics through the ages. They suggest that it may be possible for an individuated aspect of universal consciousness to intuitively access the source and experience the whole (The One, The Absolute, The Unified Field, God, etc.). However, the knowledge of mystics is private and subjective, whereas scientific knowledge is public and objective. Each has a legitimate claim on its particular knowledge and way of knowing, and both are experiential as opposed to being mere beliefs. The authors also point out that, given their mutually exclusive but complementary natures, neither is capable of validating the other. They discuss the Indian system of yoga known as Kashmir Shaivism as possibly having the most to say to people from western culture about knowing by being. For a discussion of what yoga has to offer western science, read the free ebook by Donald DeGracia, PhD, titled What is Science?.
Andrew Truscott, at the Australian National University (ANU), ran the most recent experimental test of John Wheeler’s “delayed choice” thought experiment, first proposed in 1978 (click here for video). The result again confirmed Wheeler’s predicted outcome. Retired NASA physicist Tom Campbell offers a concrete illustration of what is going on in this experiment, which you can see by clicking here. If you want a more detailed explanation, click here.
As I understand it, the traditional double-slit experiment observes that when atoms are directed at a panel with two slits, the atoms produce an interference pattern on a sheet of film behind the panel. Think about dropping several pebbles close together into a pool of still water. Each pebble produces a ripple pattern, and because they are close together the ripples interfere with one another, forming a more complex pattern. This is called an interference pattern. In the experiment, the only way this pattern could be produced is if the atoms went through both slits in a wave form rather than a particle form. Had the atoms been in a particle form, they would have produced two separate and similar bands on the film, indicating that no interference took place. If you repeat the experiment with detectors set up to identify which slit each atom passes through, what you get is a particle pattern on the film behind the panel. The implication is that observing and knowing which slit each atom passes through causes the wave form to collapse into a particle form. If you redo the experiment and take away the detectors, you once again get a pattern on the film indicating interference, and thus the atoms must have gone through the slits as waves.
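To make the two patterns concrete, here is a small Python sketch of the idealized intensity recorded on the film in each case. This is my own illustration, not the ANU setup; the wavelength, slit spacing, and geometry are arbitrary ballpark values. The key point is structural: with no detectors the two amplitudes add before being squared, producing fringes; with detectors the probabilities add and the fringes vanish.

```python
import numpy as np

# Idealized far-field double-slit model (illustrative values only):
# two point slits separated by d, film positions x, wavelength lam,
# slit-to-film distance L.
lam, d, L = 5e-7, 2e-6, 1.0
x = np.linspace(-0.5, 0.5, 1001)          # positions on the film (m)

# Phase difference between the two paths at each film position.
phase = 2 * np.pi * d * x / (lam * L)

# No detectors: amplitudes from the two slits add, THEN we square.
# The resulting cross term (2*cos(phase)) is the interference fringes.
intensity_wave = np.abs(1 + np.exp(1j * phase)) ** 2

# Detectors at the slits: which-path knowledge removes the cross term,
# so the probabilities add instead, a smooth fringe-free sum.
intensity_particle = np.abs(1) ** 2 + np.abs(np.exp(1j * phase)) ** 2

print(intensity_wave[:5])       # oscillates between 0 and 4 across the film
print(intensity_particle[:5])   # constant 2 everywhere: no interference
```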
Wheeler’s thought experiment asked what would happen if you did not use a detector until after the atoms had passed through the slits and were about to hit the film. In short, the atoms had already passed through the slits and, based on the prior experiments, should be in a wave form, because the decision to measure and determine their state was delayed until after they had passed through the slits. If no measurement is taken at the slits, the expected pattern on the film is an interference pattern. However, if the measurement is taken just before the atoms hit the film, you get a particle pattern, which implies that the atoms did not pass through the slits as waves but as particles. In short, the measurement taken just before the atoms hit the film appears to retroactively affect the atoms prior to their passing through the slits. Think of jumping off a high dive into a swimming pool. Once you jump, you cannot reverse the action and return to the diving board, but the experiment seems to imply that at the quantum level this is possible. Truscott, the lead researcher in the ANU experiment, said of the result, “It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it.”
The results have also been described as retroactive causation, which is intended to convey that the effect of the delayed-choice measurement actually went backward in time and changed the state of the atoms before they passed through the slits. However, given the earlier experiment confirming that reality at the quantum level is non-local (not bound by space-time), it may be unnecessary to invoke “time travel” to explain the results. Campbell has argued that a better explanation of the obtained results is the one Wheeler himself proposed. According to Campbell, Wheeler thought that an atom is actually neither a wave nor a particle, though potentially both; it is an information packet. Campbell suggests that measurement is tantamount to a query of the information packet, which provides a data stream defining one of the potential outcomes available in the information. In this scenario the critical variable is not where the query is made (at the slits or at the film) but that it is made. In short, the slits only appear to cause the outcome. The real cause is whether or not a query is made. When you know does not matter so much as that you know. Wheeler thought reality was at root constructed from information. Campbell agrees and suggests that what we actually experience is a self-evolving, virtual reality (click here for a video explanation). Campbell is not the first to suggest this possibility. Three physicists published a paper in 2012 proposing that the universe appears to have characteristics similar to a computer simulation (click here for their paper abstract).
What we’ve seen in the above experiments are confirmations of two thought experiments that were finally put to an empirical test. One suggests that all of reality is a singular whole; the other, that this whole could be nothing more nor less than a massive database used to virtually manifest what we think of as the universe. The underlying nature of reality is truly mysterious.
End Note: It has been said that physicists have retained from the 19th century the label “particle” for certain phenomena even though they know better. Think of an atom, which is generally regarded as a critical building block of the physical world. Our generic atom consists of an electromagnetic field populated with various “particles” such as protons, neutrons and electrons. What are these “particles” in an atom? We lay people are inclined to think of them as very small bits of matter; however, they are actually “excited states” of the field. Think of the ocean with waves arising from its surface. A wave is still the same water the ocean is made of, just in a different state. One might even say an excited state, compared to the underlying ocean.
It has been suggested that one think of an atom as a field a hundred yards across, with a green pea at the center (representing the nucleus) and a BB at the outer edge (representing an electron). This leaves a lot of room, or unexcited area. Someone calculated that if you removed everything from a human being except the “particles” in each atom comprising that individual, and then repeated the process with every person on the planet, you could fit the entire human race inside the volume of a sugar cube. So how is it that things composed of “matter” comprised of these atoms appear to be so dense? Why can’t you easily stick your hand through a wall? The answer seems to be something similar to opposing lines of force associated with the vibratory quality of the excited states within atoms. By way of analogy, think about attempting to push the like poles of two magnets together. Anyone who has tried this has felt considerable resistance for no apparent reason.
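The sugar-cube claim is easy to sanity-check with rough numbers. Here is a back-of-the-envelope calculation in Python; it is my own illustration, and the radii, body volume, and population figures are order-of-magnitude assumptions rather than values from any of the sources above.

```python
# Rough check of the "humanity in a sugar cube" claim.
# All numbers are order-of-magnitude assumptions.

atom_radius = 1e-10        # typical atomic radius (m)
nucleus_radius = 1e-15     # typical nuclear radius (m)

# Fraction of an atom's volume occupied by the "particles":
# volume scales as radius cubed, so the ratio is (1e-5)**3 = 1e-15.
occupied_fraction = (nucleus_radius / atom_radius) ** 3

body_volume = 0.07         # volume of an average human body (m^3)
population = 8e9           # people on the planet

total_matter = population * body_volume * occupied_fraction
print(f"{total_matter:.2e} m^3")         # ~5.6e-07 m^3
print(f"{total_matter * 1e6:.2f} cm^3")  # ~0.56 cm^3, about a sugar cube
```

With these ballpark inputs the result comes out to roughly half a cubic centimeter, so the claim is at least the right order of magnitude.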
Paradigms are conceptual models that serve an umbrella function for theories in diverse areas of study. For example, the current paradigm in science (see What is Science?) is scientific materialism. This paradigm serves an umbrella function for theories about such things as physical processes, biological processes and behavioral processes. It has its origins in the scientific revolution inspired by the thinking of Nicolaus Copernicus in the sixteenth century and Isaac Newton in the seventeenth century. Scientific materialism as a paradigm posits that everything is comprised of physical particles (the principle of materiality) governed by cause-and-effect relationships (the principle of causal determinism); that change is continuous (the principle of continuity); that phenomena occur within a finite space and over finite periods of time (the principle of locality); that phenomena have objective existence independent of observation (the principle of strong objectivity); and that phenomena can be understood by reducing them to their essential components (the principle of reductionism), which implies that phenomena are assembled from the bottom up, piece by piece. All theories falling under the umbrella share these basic assumptions. See Goswami’s Quantum Philosophy (Part I) and Goswami’s Philosophical Alternative for more detail.
One tenet of science as a methodology is that it holds to certain principles about the nature of knowledge. One of these principles is that our knowledge consists of models of reality, not elucidations of reality itself. In other words, what we know is always considered an approximation, never truth. Another principle is that what we know is held as tentatively valid until shown otherwise. How we know is through creating explanations for what appear to be related observations or facts about phenomena in the world. These explanations (a.k.a. theories) are then used to derive hypotheses that can be experimentally tested. Successful tests of hypotheses derived from a theory increase the confidence we can have in the explanation or theory. Confirmation of a hypothesis is sometimes possible through successful prediction of an outcome, such as the prediction of planetary motion based on a theoretical model of the forces governing such motion. In other and more confounded cases, confirmation of a hypothesis is sought through statistical testing, in which a conclusion is reached based on probability calculations. The typical standard in such cases is p <= .05, which means the observed result would be expected by chance only 5 times in 100, or 1 time in 20. Standards such as this can, of course, produce some false positives, but this is considered an acceptable error rate for theory testing (see “What is Science?“).
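As a concrete illustration of the p <= .05 convention, here is a small Python calculation for a toy card-guessing experiment of the kind discussed later. The scenario and numbers are my own invention for illustration: 100 guesses at cards with a 1-in-5 chance of success, and a subject who scores 30 hits.

```python
from math import comb

# Toy ESP-style card-guessing experiment (illustrative numbers only):
# 100 guesses, chance success probability 1/5, subject scores 30 hits.
n, p, k = 100, 0.2, 30

# One-sided p-value: the probability of 30 or more hits arising from
# chance alone, summed over the exact binomial distribution.
p_value = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
print(f"p = {p_value:.4f}")   # roughly 0.01, below the .05 threshold
```

Since a score this extreme would be expected by chance only about once in a hundred runs, the conventional standard would reject chance as the explanation.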
The flaw in this system is that a paradigm can become so central to the scientific process that it begins to be viewed as Truth. Once this happens, its assumptions acquire the status of dogma, and theories subsumed by the paradigm become inoculated against results that are contrary to the dogma, which also means contrary to the theories grounded in the paradigm. At that point, science has become scientism. It appears that contemporary science is grappling with this problem. With the advent of quantum mechanics in the early twentieth century, the basic assumptions of scientific materialism came under challenge. Experimental evidence refutes or strongly questions the validity of the principles of scientific materialism enumerated above. Resistance to this challenge has been evident in a variety of fields that have simply ignored the challenges and continued to act as if nothing had changed. This is especially true of the biological and behavioral sciences. The physical sciences have found ignoring the shifting paradigm more difficult. Even there, however, the tendency has been to limit the shift to effects occurring at the micro level and to preserve the paradigm at the macro level. Unfortunately for that strategy, experimental evidence is accumulating that quantum effects can also be detected, and thus have consequences, at the macro level.
Another source of challenge to scientific materialism that became evident during the twentieth century was the results of psi experiments (e.g., see Spirituality and Religion). One early body of experimentation was that done by J.B. Rhine at Duke University. Rhine produced evidence that certainly should have caused some serious questioning of the adequacy of scientific materialism, but he and his results were widely rationalized away because they were inconsistent with the prevailing paradigm, suggesting that the assumptions of the paradigm had become dogma. Later in the twentieth century a large body of research was accumulated under the leadership of Robert Jahn in Princeton University’s engineering anomalies laboratory. This work too was rationalized away to maintain the integrity of the paradigm, or if you prefer, to preserve the dogma of scientism. In both cases, the proper scientific response would have been intense investigation rather than out-of-hand dismissal.
There are, of course, researchers who continue to investigate these challenges to scientific materialism (e.g., see the Society for Scientific Exploration). From the work of these open-minded investigators a new and better paradigm may slowly emerge. Whether or not a new paradigm is ultimately justified, careful investigation of such challenges should be applauded, not ridiculed, as is often the case from those wedded to scientism.