This is a response to an essay by Robert C. Koons that was recently reproduced in a collection entitled The Hard Labor of Christian Apologetics. (1) The essay by itself is available here:
Robert C. Koons is a professor of philosophy and a Christian apologist. Note that I did not include philosopher in that list, as apologists differ from philosophers in the most important respect. A philosopher starts with questions and sets out on a path along which he hopes to discover the truth. That ultimate truth is, of course, never found, but the great philosopher brings us new and amazing treasures along the way. The apologist begins with a claim of truth and packages it in an attempt to sell it to us, much as any other salesman would. (I do not mean to suggest that there are no great philosophers who are also Christian. For example, I greatly value my classes with Paul Ricoeur during my graduate years, and I admire Kierkegaard.) We have today several well-known apologists who package their offerings in the trappings of philosophy, and I would include Koons among them. We can see this at work in his opening paragraph:
“Quantum mechanics is one of the most successful theories in the history of science. In some form, it is here to stay. The quantum discoveries of the 20th century transformed our understanding of the natural world. In fact, the quantum revolution is a theologically wholesome development, reconciling our scientific view with the possibility of human agency and knowledge.”
As foreshadowed, Koons will attempt to resurrect Thomism by resurrecting the Aristotelian metaphysics that forms the intellectual foundation of Aquinas's thought, all for the sole purpose of restoring the intellectual justification for Christianity so severely damaged over the past centuries. His pitch here is to acknowledge the decline of Christian justification with the advent of classical physics born of the scientific method, and the subsequent displacement of classical physics by the discovery of quantum reality. He builds from these uncontroversial facts to the surprising claim that our quantum discoveries are in accord with the physics and metaphysics of Aristotle, who was right all along. Therefore, Christianity is saved:
“The Greek philosopher Aristotle (382-322 BC) had a theory of nature that offered a number of advantages from the Christian viewpoint. While Aristotle recognized a profound difference between human beings and other “substances” (i.e., fundamental entities), based on our unique rationality, he avoided dualism, and he conceived of human aspirations as continuous with the striving of all natural things to their essential ends (i.e., teleology), providing an objective basis for norms in ethics, aesthetics, and politics.”
He proceeds from this to claim that quantum physics accords specifically with Aristotle's notions of teleology, the four causes, and the possibility of objective knowledge, none of which, however, has any applicability to the most elemental reality we yet know: quantum field theory. Koons asserts:
“The quantum revolution of the last 100 years has transformed the image of nature in profound ways, reviving Aristotelian modes of understanding. Physicists first discovered in the early 20th century that the energy of isolated systems cannot vary continuously but must jump from one discrete level (quantum) to another. This apparently modest discovery has profound implications for all of science. It actually constituted a kind of “Scientific Counter-Revolution,” reviving the Aristotelian conception of nature in at least three ways: rehabilitating teleology, unseating the microscopic world from its privileged position, and securing the ontological autonomy of chemistry and thermodynamics (and potentially also the autonomy of biology and psychology) from mere physics.”
Let’s begin by clarifying the meaning of teleology and how Aristotle believed it to exist; this is important because Koons will attempt to expand that meaning:
a: the study of evidences of design in nature
b: a doctrine (as in vitalism) that ends are immanent in nature
c: a doctrine explaining phenomena by final causes
New Latin teleologia, from Greek tele-, telos end, purpose + -logia -logy
Aristotle understood this as nature’s design of specific intent in everything that exists. For example, a rock always returns to its lowest position when dropped because in its essence it is meant to be there. Koons then claims, mostly without argument:
“Classical mechanics can be formulated in either of two ways: in terms of differential equations, based on Newton’s laws of motion, or in terms of integral equations, relying on the conservation of energy (the Hamiltonian method). The Newtonian model is completely bottom-up, but the Hamiltonian is Aristotelian, being both holistic and teleological. The total energy of a closed system is a holistic property: it cannot be reduced to the properties of the system’s constituents,”
He starts with a true statement: the Newtonian formulation expresses motion by applying calculus to discrete measures, while the Hamiltonian method calculates the total energy of motion in a closed system; the two yield the same answers while describing different perspectives on reality. It is also true that only the Hamiltonian method can be applied to quantum events, but that in no way implies teleology. Quite the opposite: Hamiltonian calculation is necessary for reasons that will undercut Koons’s later arguments. There are no components or substances at the quantum field level. There is only the exchange of energy among the interacting quantum fields, which requires a holistic measure. No components or essences, just the interplay of the oscillations of quantum fields. There is no way to apply the concept of intention in an indeterminate world of quantum superpositions of events, events being this interplay among the fields, which is what the Hamiltonian is measuring. Anything resembling intent would be our emergent perception in our decohered environment. Carlo Rovelli describes this elemental reality in “The Order of Time”:
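The equivalence the two formulations is easy to see in miniature. A minimal sketch (with hypothetical unit mass and spring constant, not any example of Koons’s) integrates a harmonic oscillator the Newtonian way, step by local step, and then checks the Hamiltonian quantity, the total energy of the closed system:

```python
# Harmonic oscillator with m = k = 1 (illustrative values only).
# Newtonian (differential) picture: integrate F = -k x step by step.
dt, steps = 0.001, 10_000
x, v = 1.0, 0.0                      # initial position and velocity
for _ in range(steps):
    a = -x                           # F = -k x, with m = k = 1
    v += a * dt                      # purely local, "bottom-up" update
    x += v * dt

# Hamiltonian (integral) picture: the total energy of the closed
# system, H = v**2 / 2 + x**2 / 2, is an invariant of the motion.
energy = v**2 / 2 + x**2 / 2
print(round(energy, 2))              # prints 0.5, the initial energy
```

The stepwise updates know nothing of the whole; the conserved energy is a property of the system in total. That is all the “holism” of the Hamiltonian method amounts to here, and nothing in it requires intent.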
“On closer inspection, in fact, even the things that are most “thinglike” are nothing more than long events. The hardest stone, in the light of what we have learned from chemistry, from physics, from mineralogy, from geology, from psychology, is in reality a complex vibration of quantum fields, a momentary interaction of forces, a process that for a brief moment manages to keep its shape, to hold itself in equilibrium before disintegrating again into dust, a brief chapter in the history of interactions between the elements of the planet, a trace of Neolithic humanity, a weapon used by a gang of kids, an example in a book about time, a metaphor for an ontology, a part of a segmentation of the world that depends more on how our bodies are structured to perceive than on the object of perception—and, gradually, an intricate knot in that cosmic game of mirrors that constitutes reality. The world is not so much made of stones as of fleeting sounds, or of waves moving through the sea.
If the world were, however, made of things, what would these things be? The atoms, which we have discovered to be made up in turn of smaller particles? The elementary particles, which, as we have discovered, are nothing other than the ephemeral agitations of a field? The quantum fields, which we have found to be little more than codes of a language with which to speak of interactions and events? We cannot think of the physical world as if it were made of things, of entities. It simply doesn’t work.” (2)
Which brings us back to Aristotle’s rock at rest in its intended place on the ground, and the law of conservation of momentum that describes this situation. What if we launch that rock into space, free of Earth’s gravitational field and atmospheric friction? Now its intended design appears to be eternal motion through the cosmos. What quantum reality teaches us is that there is no intention, just momentary correlations of events that come and go. Everything depends on momentary perspective and relations. Relativity, not teleology.
In other works, Koons has tried to equate potential energy with teleology, as in the example of heat having the teleological designation to boil water, even at a waterless point in the universe. Besides conflating possible events with teleology, this claim will only further complicate his troubles when we get to causation itself.
This problem of causation appears as Koons transitions to “The measurement problem”:
“A quantum particle doesn’t typically have any position or momentum at all: it has merely the potential to interact with macroscopic systems in various locations. Thus, the quantum world cannot be a complete basis for the macroscopic world.”
He thinks to solve this mystery through the application of Aristotle’s notion of potentiality:
“Aristotle offers a ready answer to this puzzle. The microscopic constituents of macroscopic objects exist only as potentialities for interaction. They are only virtually present, except when they are activated.”
However, this answer rests on several crucial errors:
1. Koons asserts that the decohered environment is what is physically “real” and that the quantum level is mere ideal (virtual) potentiality. (3) In fact, quantum computers work on the principle that superpositions exist physically (not virtually), and the intent is to have the superposed states work on solutions simultaneously prior to decoherence. The power of quantum computing rests entirely on the physical existence of entanglement and superposition.
2. From this assumption, Koons views the quantum state as in service to our decohered environment, waiting for causes to allow it to become real, much like Pinocchio waiting to become a real boy.
3. He ignores the most important advances in coming to an understanding of decoherence of the past 30 years.
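The point in (1) about the physical reality of superposition can be sketched in a few lines of statevector arithmetic (a toy model, not the API of any actual quantum-computing library): applying a Hadamard gate twice returns a qubit to its initial state only because the two superposed amplitudes physically interfere and cancel, something a merely classical 50/50 coin flip could never do.

```python
import numpy as np

# A single qubit as a vector of amplitudes (a bare statevector sketch).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # the |0> state

plus = H @ zero   # equal superposition: both amplitudes 1/sqrt(2)
back = H @ plus   # second Hadamard: the amplitudes interfere

# Amplitudes here are real, so squaring gives measurement probabilities.
print(np.round(plus**2, 2))   # [0.5 0.5]
print(np.round(back**2, 2))   # [1. 0.] -- the |1> branch cancelled out
```

If the state after the first gate were only our ignorance of a fact already settled, the second gate would leave the odds at 50/50; the cancellation happens only because both branches of the superposition exist physically at once.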
Let’s start with Wojciech Zurek’s Existential Interpretation and quantum Darwinism. (4)
Zurek begins with the question of whether decoherence is entirely a physical process or entirely an epistemological reduction, and ultimately proposes a combination of the two:
The overarching open question of the interpretation of quantum physics—the “meaning of the wave function”—appears to be in part answered by these recent developments. Two alternatives are usually listed as the only conceivable answers. The possibility that the state vector is purely epistemological (that is, solely a record of the observer’s knowledge) is often associated with the Copenhagen Interpretation (Bohr 1928). The trouble with this view is that there is no unified description of the Universe as a whole: The classical domain of the Universe is a necessary prerequisite, so both classical and quantum theory are necessary and the border between them is, at best, ill-defined. The alternative is to regard the state vector as an ontological entity—as a solid description of the state of the Universe akin to the classical states. But in this case (favored by the supporters of Everett’s Many Worlds Interpretation), everything consistent with the universal state vector needs to be regarded as equally “real.”
The view that seems to be emerging from the theory of decoherence is in some sense somewhere in between these two extremes. Quantum state vectors can be real, but only when the superposition principle—a cornerstone of quantum behavior—is “turned off” by einselection. Yet einselection is caused by the transfer of information about selected observables. Hence, the ontological features of the state vectors—objective existence of the einselected states—is acquired through the epistemological “information transfer.” (4)
He had earlier described a density matrix analysis of “Schrödinger’s Cat” and the mechanics of how decoherence could physically take place as a decay of the two off-diagonal peaks:
Figure 3: Evolution of the Density Matrix for the Schrödinger Cat State in Figure 2. (a) This plot shows the density matrix for the cat state in Figure 2 in the position representation ρ(x, x′) = φ(x)φ∗(x′). The peaks near the diagonal (green) correspond to the two possible locations of the particle. The peaks away from the diagonal (red) are due to quantum coherence. Their existence and size demonstrate that the particle is not in either of the two approximate locations but in a coherent superposition of them. (b) Environment-induced decoherence causes decay of the off-diagonal terms of ρ(x, x′). Here, the density matrix in (a) has partially decohered. Further decoherence would result in a density matrix with diagonal peaks only. It can then be regarded as a classical probability distribution with an equal probability of finding the particle in either of the locations corresponding to the Gaussian wave packets.
The point I wish to make here is that he starts with a purely physical quantum reality. The perturbation occurring when the system is introduced into the (decohered) environment reduces the off-diagonal peaks, creating a probability distribution (impossible prior to that introduction, given the absence of causality in the elemental quantum state), which almost instantaneously decoheres into the most probable (or even solely possible) eigenstate within the timeline of that particular environment.
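These mechanics can be illustrated with a two-outcome toy model (illustrative numbers, not Zurek’s actual calculation): the off-diagonal terms of the density matrix, which carry the coherence, decay exponentially under environmental monitoring, leaving only the diagonal, i.e., a classical probability distribution.

```python
import numpy as np

# Density matrix of an equal superposition (a two-outcome "cat" state):
# the off-diagonal terms carry the quantum coherence.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)

def decohere(rho, gamma, t):
    """Suppress the off-diagonal terms by exp(-gamma * t), leaving the
    diagonal (the classical probabilities) untouched -- a standard toy
    model of environment-induced decoherence."""
    out = rho.copy()
    out[0, 1] *= np.exp(-gamma * t)
    out[1, 0] *= np.exp(-gamma * t)
    return out

late = decohere(rho, gamma=1.0, t=20.0)
print(np.round(late.real, 6))
# The diagonal 0.5s survive as a classical probability distribution;
# the off-diagonal coherences have decayed to essentially zero.
```

Nothing in this process consults the classical world for permission to become real: the quantum state is physical throughout, and what changes is only where its information has leaked.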
He couples this with observation by a subject as an element of this process, but solely from the physical decoherence within a brain:
In particular, the process of decoherence we have described above is bound to affect the states of the brain: Relevant observables of individual neurons, including chemical concentrations and electrical potentials, are macroscopic. They obey classical, dissipative equations of motion. Thus, any quantum superposition of the states of neurons will be destroyed far too quickly for us to become conscious of the quantum “goings on”. Decoherence, or more to the point, environment-induced superselection, applies to our own “state of mind”.
One might still ask why the preferred basis of neurons becomes correlated with the classical observables in the familiar universe. It would be, after all, so much easier to believe in quantum physics if we could train our senses to perceive nonclassical superpositions. One obvious reason is that the selection of the available interaction Hamiltonians is limited and constrains the choice of detectable observables. There is, however, another reason for this focus on the classical that must have played a decisive role: Our senses did not evolve for the purpose of verifying quantum mechanics. Rather, they have developed in the process in which survival of the fittest played a central role. There is no evolutionary reason for perception when nothing can be gained from prediction. And, as the predictability sieve illustrates, only quantum states that are robust in spite of decoherence, and hence, effectively classical, have predictable consequences. Indeed, classical reality can be regarded as nearly synonymous with predictability.
Behind this is the principle that probability can only be conceived in terms of bits of information, which required an evolutionary adaptation that reduced perception to the minimum bits necessary to create an orderly sketch of the environment: just enough to guess correctly often enough to survive, and to remain blind to everything else. The reduction of chaos to a semblance of order requires what Carlo Rovelli will later call “blurring” in his relational interpretation, while decoherence, as the somewhat arbitrary depiction of a closed system within the environment, creates the initial arrow of entropy, which Zurek (and Max Tegmark) went on later to explain as:
Entropy can only increase when interacting with the environment
Entropy can only decrease when it interacts with a subject
The point of the exposition above is to clarify our understanding of physical reality and of what is fundamental. When Koons claims that quantum reality alone cannot explain classical reality, he falsely assumes that it is the classical that is real while the quantum remains virtually ready to become real. Koons is certainly correct when he says “the quantum world cannot be a complete basis for the macroscopic world.” Certainly something is added, but nothing physical or teleological. Rather, it is our subjective reduction of fundamental reality. As Rovelli said above: “a part of a segmentation of the world that depends more on how our bodies are structured to perceive than on the object of perception.” Or, as Zurek put it, the immediate decoherence in the information transfer from quantum state to our decohered brain.
This is a fundamental example of how Aristotelian metaphysics obscures the nature of fundamental reality. Of course, Aristotle could not possibly have had any inkling of any of this, and necessarily speculated solely from his own decohered environment. But as these speculations pertain only to the representations constructed in space, time, and causality, he offers no entrée into a more fundamental reality that is arational, non-causal, and without spacetime. Fundamental reality is simply impenetrable via Aristotle.
Koons then focuses on the “causes” of decoherence from the perspective of our decohered environment, willfully remaining within Aristotle’s unwillful ignorance. Let’s again return to Rovelli’s “The Order of Time” as he describes causality as now understood:
“In our experience, the notion of cause is thus asymmetrical in time: cause precedes effect. When we recognize in particular that two events “have the same cause,” we find this common cause in the past, not in the future. If two waves of a tsunami arrive together at two neighboring islands, we think that there has been an event in the past that has caused both. We do not look for it in the future. But this does not happen because there is a magical force of “causality” going from the past to the future. It happens because the improbability of a correlation between two events requires something improbable, and it is only the low entropy of the past that provides such improbability. What else could? In other words, the existence of common causes in the past is nothing but a manifestation of low entropy in the past. In a state of thermal equilibrium, or in a purely mechanical system, there isn’t a direction to time identified by causality.
The laws of elementary physics do not speak of “causes” but only of “regularities,” and these are symmetrical with regard to past and future. Bertrand Russell noted this in a famous article, writing emphatically that “The law of causality . . . is a relic of a bygone age, surviving, like the monarchy, only because it is erroneously supposed to do no harm.”103 He exaggerates, of course, because the fact that there are no “causes” at an elementary level is not a sufficient reason to render obsolete the very notion of cause.104 At an elementary level there are no cats either, but we do not for this reason cease to bother with cats. The low entropy of the past renders the notion of cause an effective one.
But memory, causes and effects, flow, the determined nature of the past and the indeterminacy of the future are nothing but names that we give to the consequences of a statistical fact: the improbability of a past state of the universe.”
“Causes, memory, traces, the history itself of the becoming of the world that unfolds not only across centuries and millennia of human history but in the billions of years of the great cosmic narrative—all this stems simply from the fact that the configuration of things was “particular” a few billion years ago.105
And “particular” is a relative term: it is particular in relation to a perspective. It is a blurring. It is determined by the interactions that a physical system has with the rest of the world. Hence causality, memory, traces, the history of the happening of the world itself can only be an effect of perspective: like the turning of the heavens; an effect of our peculiar point of view in the world. . . . Inexorably, then, the study of time does nothing but return us to ourselves.”
The important considerations here are:
1. There is no “magical force of causality,” but rather our representation arising from the increase in entropy.
2. Causality does not exist at the level of elementary physics, but rather emerges as part of our constructive determination of the world after decoherence.
3. It always exists in relation to a perspective, not as elemental reality (our rock now cruising the cosmos). Causation emerges with our representation of the increase in entropy, and “does nothing but return us to ourselves.”
This accords well with Zurek’s Existential Interpretation, in which only quantum states that do not conflict with events in the decohered environment (absent the off-diagonal peaks) can einselect, reinforcing the appearance of classical causality. And as with Rovelli, decoherence is impossible without a reductive subject; it is not something entirely objective.
This renders Aristotle’s “top down” and “bottom up” irrelevant to the understanding of fundamental reality. There is no purpose and no design, just the appearance of order and causality due to our blurring of all but a tiny slice of reality, representing it as a subsystem, and intuiting time and causality as the inevitable increase in entropy, the inevitable receding of the appearance of order back into primordial chaos: the interplay of entropy decreasing through a subject and increasing as quantum information is exposed to the environment.
Koons nevertheless concludes:
“Quantum mechanics re-affirms what Aristotelians have known all along: that the world’s ultimate constituents are not the extremely small and simple particles of physics, but much larger, composite bodies with irreducibly holistic and teleological properties and powers. This puts us firmly on the path toward recognizing that even more complex bodies–namely persons and other living organisms–can also be metaphysically fundamental entities, with irreducibly biological and psychological properties and powers.”
But quantum physics tells us no such thing. Rather, the world is elementally waves within fields, which we represent at emergent levels as “things.” There are no metaphysical entities at all, and no such thing as irreducible properties; there is only the temporary interplay of these waves within fields. As Zurek describes in his quantum Darwinism, things appear to be permanent when they are nearest pointer states (where the quantum position and the decohered position come nearest to being equal in the border area), because pointer states are resilient to perturbations from the environment. This allows multiple copies of their information to proliferate from multiple perspectives, permitting a survival much as in Darwinian evolution. It has nothing at all to do with irreducible properties or purpose, merely the accidental correlation between the coherent reality and the pointer state in decoherence.
No metaphysics is capable of penetrating fundamental reality because all metaphysics are projections of reason, number, space, and time – our fundamental categories of representational thought – onto a realm where these simply don’t exist. In attempting to do so, we tightly seal ourselves off from the proper questions and approaches. We have constructed dichotomies that have only led to error.
1. Metaphysical/physical (or material/immaterial) – there is only the physical, which at bottom is energy as oscillations in a field.
2. Macroscopic/microscopic – size as the imposition of our intuition of space on a reality that knows nothing of space. Coherent quantum reality infuses all of existence; it is not confined to our concept of the microscopic. Quanta themselves are just temporary excitations along a wave.
Even if we were to posit a dichotomy between what we can know representationally/reductively and what remains hidden in quantum superposition and entanglement, we would be closer to our situation but still overly reductive. It really comes down to degrees of reduction. In addition to our rational mode of experience, we also experience esthetically, which is itself less reductive. Is it also more direct? As Zurek explained, we sit in a photon-rich environment where we perceive nothing directly: photons convey secondary information about whatever they last encountered, and we only know things through perturbations caused by probes. But are we more directly knowledgeable through sensation? Emotion? Music? Germinative poetic metaphor as opposed to reductive scientific metaphor? Less reductive, to be sure, which opens up the middle space heretofore obscured by the dichotomy of material/immaterial. And it necessitates a rethinking of who we are as elementally quantum beings entangled in a reality that is at bottom chaotic, void of spacetime, and impervious to our representational modes of cognition. In light of that, what really counts as knowledge, and how do we approach it?
The honest approach is to refrain from metaphysical/theological speculations; to refrain from attempts to retrofit obsolete ideologies. These approaches are merely an anthropomorphic attempt to tell the universe what it is from our own illusions. This is the ultimate failure of all reductive approaches, even the science of physics itself, which, as Heidegger demonstrated, is itself an offspring of metaphysics. Physics does serve the crucial function, however, of pointing to the prevailing mystery, within which beckons the most questionable. If ultimate reality cannot be described reductively, we need to advance our abilities to think non-reductively (esthetically); i.e., to allow reality to instruct us in our esthetic thinking of what is presented in experience. An approach where we ourselves expand rather than reduce the world. An approach germinative rather than reductive.
(2) The Order of Time
(3) In the everyday use of the term. The Leggett inequalities proved that quantum states are not only nonlocal but also non-real. This is often misunderstood as meaning not existing, when Leggett is merely using the technical definition of not having two or more measurable characteristics. It does not deny the fundamental waves or energy of quantum fields and their entangled interplay.