The Quantum Thomist

Musings about quantum physics, classical philosophy, and the connection between the two.


The Philosophy of Quantum Physics 7: Is Aristotelian Philosophy of Nature Obsolete?
Last modified on Tue Aug 6 19:14:59 2024


Introduction

I am having a look at different philosophical interpretations of quantum physics. This is the seventh post in the series. The first post gave a general introduction to quantum wave mechanics, and presented the Copenhagen interpretations. I have subsequently discussed spontaneous collapse, the Everett interpretation, Pilot Wave, consistent histories interpretations, and quantum Bayesian interpretations.

In this post, I am going to do something a little different. The first reason why this is different is that this post is more of a summary and review of a book than an outline of a particular position. The book in question is Robert Koons' Is St Thomas's Aristotelian Philosophy of Nature Obsolete? This book was released a couple of years ago, and I have been meaning to review it for some time. Professor Koons is a philosopher, rather than a physicist, specialising in Thomistic and scholastic philosophy. Indeed, he is among the world's leading contemporary Thomistic philosophers, and a writer who has to be respected.

The second way in which this post differs from my previous ones is that it primarily looks at a different question. There are two fundamental questions that quantum physics attempts to answer.

  1. Given our knowledge of the present state of the universe, what is the probability that the universe will evolve into a particular future state?
  2. Given some knowledge of the underlying structure of a material, what properties (macroscopic and microscopic) would we expect to observe for that material?

I will call these the question of dynamics and the question of statics. Quantum physics very successfully allows us to answer both questions (at least if we exclude the effects of gravity). The goal of the philosophy of quantum physics is, or ought to be, to explain why quantum physics is successful, in terms of a proposed underlying ontology and metaphysics.

Most of the interpretations I have looked at so far address the first of these questions, and largely ignore the second. There are perhaps reasons for this. Firstly, maybe it is thought that once you have an ontology that explains the dynamics, then the statics will fall into place as well. Secondly, maybe the study of statics is just a lot less intellectually sexy than the dynamics: after all, the most obviously weird parts of quantum physics are related to the dynamics. Thirdly, the philosophy behind mechanistic physics contains its own theory of statics, and people naively carry this forward into quantum physics: since the weirdness lies in the dynamics, it is natural to suppose that the statics is unaffected by the move from classical to quantum.

The question of dynamics is raised first in a physics course (where we start by introducing and solving the Schroedinger equation), and is the topic of the cutting edge of particle physics research. The question we all want answered is what lies behind the standard model, allowing us to unify quantum physics and general relativity. The open questions are raised in the dynamics, and maybe this is as far as the philosophers of science get. However, a physics student would then move on to spend far more time studying, for example, condensed matter theory or atomic physics, which concern more the question of statics. These are hugely important branches of physics, and to the physicist they are certainly no less sexy or interesting than the more fundamental questions of physics, where dynamics are more important. But condensed matter physics and other such fields are more complex and harder than just solving the Schroedinger equation or drawing some Feynman diagrams for isolated particles. Not in the sense that more brain power is required to perform cutting edge research -- that's about the same between particle physics and condensed matter physics. But it has a higher barrier of entry -- you need to know more before you can start appreciating it -- and it is far more varied and covers a far larger range of phenomena.

However, most philosophers of quantum physics concentrate on the dynamics. This is true of my own work as well as many others. I briefly discuss statics, but not in the depth that the topic deserves.

Robert Koons' book is unusual in that it spends a relatively small proportion of the text discussing the dynamics. It still devotes two chapters to it, culminating in a discussion of Pruss' travelling forms interpretation (which Koons advocates), and he does consider the interpretations I have looked at in previous posts. Instead, it mainly concentrates on the statics. There is nothing wrong with this approach. It is work that needs to be done.

Obviously, Robert Koons and I both adopt a similar underlying philosophical background, and as such I find myself in agreement with a lot of his aims and intentions in writing this book. I will discuss how much I agree with his conclusions and how he gets there as I go through the review.

The book has eight chapters.

  1. Introduction
  2. The rise and fall of physicalism
  3. Hylomorphism and the Quantum World
  4. Deriving Hylomorphism from Chemistry and Thermodynamics
  5. Hylomorphism and the Measurement Problem
  6. The Many-Worlds Interpretation of Quantum Mechanics
  7. Biology and Human Sciences
  8. Substances, Accidents, and Quantitative Parts

Is Aristotelian Natural Philosophy necessary and concordant with modern science?

Aristotle's philosophy in general, especially his natural philosophy, metaphysics, and ethics, was neglected for a long period of time. This was due to the rise of early modern mechanism and then empiricism, which initially promised to be a powerful explanatory framework working in tandem with the sciences. However, modernism didn't fulfil its early promise, or offer us all the answers. In the last forty years or so, quite a few philosophers have turned back to Aristotle and his medieval successors to see if they can offer a better solution to the problems of modernism than following the post-modern and counter-enlightenment world-view. Developments have been made in particular in ethics and the approach to science.

In particular, Professor Koons will concentrate on hylomorphism, the form-matter distinction that lies at the heart of Aristotle's philosophy of nature, metaphysics, and epistemology.

But, Koons argues, this is all for nothing if the challenge from modern science cannot be overcome.

So, we must ask two questions: can we have an Aristotelian metaphysics (a theory of being as such) without an Aristotelian philosophy of nature? And can we have an Aristotelian philosophy of nature without Aristotelian natural science?

To answer Yes to either of these questions would be to cut Aristotle's philosophy (and successors such as Thomism) off from natural science. It would be to adopt either Aristotelianism without hylomorphism, or to suggest that hylomorphism has no implications for the practices of the sciences. This makes Aristotle's philosophy unfalsifiable. Convenient for those who want to work without any great trouble, but hardly an approach which will endear itself to a mainstream scholarship sceptical of Aristotle's metaphysics.

Professor Koons suggests that we should answer both questions No. I would agree with him. In my view, to answer the first question Yes cuts out a core part of Aristotle's system, and destroys its internal consistency to such an extent that one's philosophy can no longer really be regarded as Aristotelian. Professor Koons notes that it would undermine the doctrines of transubstantiation (which, as a Protestant, I would reject anyway, but it is important for Roman Catholics such as Koons), and of the union between soul and body in a living creature (which I do agree would be a problem). The second question should also be answered No. Aristotelian philosophy has always been empirical in nature. It might have slightly different aims to modern science, and less precision and certainty, but it is applied to the same objects. Hylomorphism's search for essences and modern science's search for mathematical laws seem different, but could just be the same end discussed in different ways. Aristotle's natural science is certainly different from a positivist or Humean conception of science, but those conceptions are just modern theories about natural science, not natural science itself.

So instead, Koons rightly concludes:

Our natural philosophy should be continuous with our natural science, albeit at a higher level of generality and a deeper level of philosophical explanation. This does inevitably entail that Aristotelian natural philosophy (and so, ultimately, Aristotelian metaphysics) is in principle falsifiable, or at least subject to disconfirmation by empirical results. Now I think this is a plus.

Koons believes that, unlike classical physics and chemistry (from Galileo to Rutherford), which seemed to count against hylomorphism, the quantum revolution of the twentieth century is consistent with it, and in fact provides empirical support for it. The conflict was never between hylomorphism and modern science, but between hylomorphism and a certain philosophy of science. That philosophy of science developed alongside classical physics, and as classical physics proved more successful than Aristotle's physics, it was natural that people should gravitate towards what Professor Koons calls microphysical materialism. But things have changed since the quantum revolution in physics. This challenges microphysical materialism in various and profound ways, and thus ought to have led to a change of perspective on the philosophy. However, most philosophers, and even most physical scientists (who rarely take much of an interest in philosophy), have not appreciated this. The new image of nature has revived an Aristotelian understanding across a wide range of scientific disciplines.

This claim obviously needs to be backed up, which is what Koons will attempt to do in the rest of the book.

Physicalism

This chapter lays out the main elements of hylomorphism, and describes the anti-Aristotelian revolution of the seventeenth century. I will firstly summarise Professor Koons' text, and then offer my own thoughts.

Hylomorphism is the belief that all substances are composed of matter and form. Matter is whatever the thing is made of. This might be some simpler substance (leading to a hierarchical relationship between substances), or, for the most fundamental particles, it would be prime matter. Prime matter doesn't really exist by itself (it would always be combined with some form to make a substance), but is a useful concept. Material explanation is bottom up.

Formal explanation is top down. It looks at the substance as a whole, and allows us to identify its essence, or what makes it what it is and not some other type of thing.

The next component of hylomorphism is the importance given to powers. These come in two classes, active powers and passive powers. An active power is the ability to induce change in something else. A passive power is the ability to be changed by something else. There was, of course, a criticism of discussing physics in terms of powers: the claim that naming a power asserts without really explaining anything. I think this criticism fails. Firstly, the hylomorphist will try to provide a link between the powers and the underlying form of the substance, and explain why a particular form leads to a particular set of powers. In older times, this sort of explanation was difficult, because we lacked the required knowledge of physics. But now we can make an attempt at it. Secondly, the same criticism can be applied to the forces of classical physics. Ultimately, saying that everything generates a gravitational force which acts on all other particles, and that force equals mass times acceleration, asserts but does not explain how those forces arise and why they have that effect. Again, with more advanced physics we can attempt to answer those questions, but Newton could not do so. The replacement of mysterious powers with mysterious forces doesn't really gain us anything.

This notion of powers leads to the notion of efficient and final causes. Efficient causality corresponds to passive powers; it asks which entities were involved in inducing the change in the entity under study. Final causality looks more at the active powers of a substance. It is a directedness towards an end, and asks what particular entities and states can emerge from the thing we are looking at. Note that efficient and final causality both link one set of substances with another set of substances; they are different from the notions of causality used in classical physics, which concentrate more on events.

Thus in the hylomorphic understanding of nature, substances exist at many levels of scale and composition. For this reason, we cannot give a complete description of the world just by focusing on the microscopic description, and treating larger objects as just straight-forwardly reducible to a collection of microscopic parts.

From the late middle ages, hylomorphism was attacked. Three key elements of the system were abandoned. Firstly, the matter/form distinction was replaced with matter as such, which had an inherent nature of its own. Secondly, the model of causal powers was replaced with a model of interaction based on abstract laws of motion. Thirdly, the notions of formal and final causality were abandoned. (Although I am not sure that these are three independent points: the first point entails the removal of form, and the second the removal of final causality.)

The first change was made by the likes of Scotus and Occam, and involved introducing the idea of matter in itself as providing an intelligible nature. (Although Koons doesn't discuss this explicitly, the replacement of philosophical realism with nominalism was also a key component of this philosophical change.) The question then becomes: what are the dispositions of matter? Descartes proposed that matter is disposed to travel in inertial motion except when there is contact with some other matter. The action at a distance seen in the straight-forward understandings of classical gravity and electromagnetism is problematic for this picture. However, Descartes' proposal was the leading idea for long enough to replace the Aristotelian notion of powers with a notion centred around laws of motion (or laws of nature).

The removal of formal and final causality followed from these two developments. The new mathematical laws also encouraged people to focus on predictive abilities of science. It seemed that the concept of natural ends did not contribute anything to either our understanding of these laws, or our ability to use the laws to gain control over matter. As argued by the likes of Descartes and Francis Bacon, any notion of natural ends or powers, if they exist, must be redundant, as they would ultimately find their explanation in what are in this philosophy the more fundamental concepts of matter and the laws of motion.

There are problems with this microphysical picture. One of these is that it struggles to account for human thought and agency. With the rejection of hylomorphism, two possible pictures emerged. The first was Cartesian dualism, where the soul is a mental entity apart from the body, radically divorced from the world of matter. As such, it is not bound by the restrictions of mechanistic physics. This idea however fails to explain how the soul interacts with the body, or its correlations with brain states. The second alternative was a monistic materialism, which makes human agency only apparent. This, however, does not accord with our most basic observations about ourselves, and is ultimately self-refuting: if we are just pre-programmed machines, then any beliefs we hold are merely a result of our programming, rather than of weighing the options and choosing the one that best fits the facts. There is no reason to suspect that such beliefs would be true. This would include the belief in monistic materialism.

The end result of microphysicalism is the idea that everything is ultimately explicable in terms of, and only in terms of, its microphysical structure, defined as the intrinsic properties and relations between the most fundamental material constituents. The Aristotelian, on the other hand, accepts that things are explained in terms of the microphysics (and that there are some cases where things genuinely are the sum of their parts, such as in how an ideal gas can be reduced to the motion of the individual gas molecules -- even though the gas molecules are not reducible to the sum of their own parts), but says that this is only part of the explanation.

In the microphysicalist view, only the most fundamental layer of matter (if that) can be thought of as substances in themselves. Secondary qualities of composite beings, such as colour, smell, and taste, need to be denied as mere illusions, as they are not part of the fundamental layer of matter. Defining observations and experiment is also difficult. How can we define, without introducing circularity, what is meant by an observation of the physical world if the only terms of reference allowed are the microphysical world? The best you can do is say that in an observation certain fundamental particles interact with certain other fundamental particles and change their state. But then how is this different from any other event, including those which are not observations? Well, you can say that the second set of particles constitute a brain state. But in microphysicalism, the notion of a brain state doesn't correspond to anything in reality; it is only meaningful as a human-constructed shorthand to describe the particular set of fundamental particles that constitute it. So you are back to saying that an observation is when one set of particles interacts with another set of particles. As such, the brain state cannot be referred to when we try to explain or define a concept such as observation while accepting the premises of microphysicalism.

Hylomorphism, on the other hand, does not have this problem, because it does admit macroscopic substances as fundamental, with some of those substances containing brain states as a quantitative part. This inability to define observation (and consequently experiment) undermines the scientific endeavour on which the materialist philosophy is based. Experiments require macroscopic setups, but if the macroscopic world is only one of misleading appearances, then so is the experiment. And this radical materialism stands in tension with our self-perception as rational agents. If our brains are just a particular aggregate of particles in motion, and contain no more than what was present in those particles, then where do things such as thoughts and intentionality come from? This is not a problem for hylomorphism, which admits to final causes in general, with thoughts and intentionality one example of them. Ethical principles are also difficult to reconcile with an extreme materialism, but fit neatly into a hylomorphic system.

Thus if physics requires us to reject hylomorphism, and the rejection of hylomorphism leads to numerous philosophical problems, including related to the methods used to establish physics, then we clearly have a major problem on our hands.

However, the rejection of hylomorphism was based on Newtonian physics. Whatever the unpleasant consequences of materialism, they must surely be accepted if that is what classical mechanics demands and classical mechanics is correct. But when Newton is overthrown by the quantum revolution, this argument no longer holds. Quantum physics has clearly put microphysicalism on the defensive. It does not follow from the science. One can only continue to believe in it despite and in the face of the scientific evidence.

The relationship between possible quantum states and the Aristotelian notion of potentiality was first pointed out by Heisenberg. Potentiality was no longer something redundant, but key to understanding physics. For example, in Feynman's sum over histories approach, one's empirical predictions must take into account the sum total of the potential paths of the actual particle, regardless of which of those paths was actually taken.
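
As a toy illustration of how unactualised paths still shape the predictions, here is a minimal sketch of my own (not an example from Koons' book; the action values are chosen purely for illustration):

```python
# Toy sketch (my own illustration, not from Koons): in the sum-over-histories
# picture the amplitude for an outcome is the sum of exp(i S / hbar) over every
# potential path, so paths that are never actualised still shape the statistics.
import numpy as np

hbar = 1.0  # work in natural units

def amplitude(actions):
    """Total amplitude from a set of potential paths with the given actions."""
    return sum(np.exp(1j * S / hbar) for S in actions)

# Two potential paths (think: two slits) whose actions differ by delta.
for delta in (0.0, np.pi, 2 * np.pi):
    A = amplitude([0.0, delta])
    print(f"action difference {delta:5.3f}  ->  probability {abs(A)**2:5.3f}")

# With delta = pi the two potential paths cancel exactly: the outcome is
# forbidden even though neither path is individually impossible.
```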

Teleology also arises in quantum physics. Firstly, this is directly implied by the re-emergence of potentiality. But, more significantly, it is visible in the Lagrangian form of quantum physics. Classical physics can be expressed in two mathematically equivalent ways. Firstly, there are the differential equations of Newton's equations of motion, which most easily conform to a mechanistic philosophy. Then there are least action principles, which most easily conform to a teleological philosophy. If physical objects travel by the path that minimises the action, then that in itself suggests a particular end, namely the minimisation of the action. The total action of a system, which depends on the system as a whole and not its individual parts, becomes fundamental in understanding motion, in contrast to microphysicalism which states that only the individual particles themselves are fundamental. In classical physics, either approach can be used. In quantum physics, however, the Lagrangian approach underlying the least action principle becomes mandatory. The two main approaches to quantum physics are the path integral approach, which directly uses the Lagrangian, and the canonical approach, which was constructed by adapting the Hamiltonian formulation of classical physics, which in turn was constructed as a reformulation of Lagrangian mechanics.
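
To make the least action point concrete, here is a small numerical check of my own (not from the book): for a projectile with fixed endpoints, the Newtonian trajectory gives a smaller action than any of a family of hypothetically deformed paths.

```python
# Minimal numerical illustration (mine, not Koons'): for a projectile in a
# uniform gravitational field, the Newtonian trajectory makes the action
# S = integral of (kinetic - potential) energy dt smallest among a family of
# hypothetical paths sharing the same endpoints.
import numpy as np

g, m, T, N = 9.8, 1.0, 1.0, 4000
t = np.linspace(0.0, T, N)
dt = t[1] - t[0]

def action(x):
    v = np.gradient(x, dt)                 # velocity along the path
    lagrangian = 0.5 * m * v**2 - m * g * x
    return np.sum(lagrangian) * dt         # simple discretised action integral

x_newton = 0.5 * g * t * (T - t)           # solves x'' = -g with x(0) = x(T) = 0
wiggle = np.sin(np.pi * t / T)             # endpoint-preserving deformation

for eps in (0.0, 0.1, 0.5, 1.0):
    print(f"eps = {eps:3.1f}   S = {action(x_newton + eps * wiggle):8.4f}")
# The action is smallest at eps = 0.0, i.e. on the actual Newtonian path.
```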

The Lagrangian approach also makes clear the holistic nature of the physics. The Hamiltonian for complex substances is not separable into that of the most fundamental substances. The causal powers are irreducibly joint powers.

Another blow given by quantum physics to a microphysicalist materialism arises from the non-separability seen in phenomena such as entanglement.

The measurement problem also undermines all three of the premises of microphysicalism (it is not clear what Koons means by the "three premises", but possibly the three rejections of aspects of Aristotelian philosophy which he used to start his discussion of materialism). In the Copenhagen interpretation, a quantum particle typically lacks any specified location or momentum at all, but merely has the potential to interact with macroscopic systems as if it had a precise location. Thus the quantum world cannot be a complete basis for the macroscopic world. So what is the relationship between the macroscopic and quantum worlds? It cannot be entirely bottom-up, because (at least in this interpretation of quantum physics) the quantum world is ill defined in the absence of measurement. Neither can it be entirely top-down, since macroscopic objects are still composed of their microscopic parts. One solution to this problem is hylomorphism, which denies that macroscopic substances can be reduced to their microscopic parts.

The Copenhagen interpretation is, of course, only one of a number of interpretations of quantum physics (as opposed to classical physics, where generally only the mechanistic interpretation was accepted). Professor Koons outlines the philosophy of "Pluralistic Quantum Hylomorphism," inspired by Heisenberg and defended by Wolfgang Smith, Nancy Cartwright and Stanley Grove.

In this view, the world consists of a variety of domains, each at a different level of scale. Most of these domains are fully classical (where all operators describing observables commute), but at least one is quantum (where there are non-commuting operators describing observable properties, so if a substance definitely has one property, then some other property would be wholly undefined). There are three possible general philosophies of science (the positivist, which makes our own interests and assumptions crucial to understanding science, the realist, which states that there is a single unified structure to reality, realised at a single scale, and the structural, where reality consists of a plurality of different structures at different scales). While the realist perspective underlay classical physics, Professor Koons claims that quantum physics has offered support for the structuralist philosophy.

The hylomorphic interpretation combines features of both the Copenhagen and objective collapse approaches. It is fully realist about the microphysics, contrary to the Copenhagen approach, but is ontologically pluralistic, contrary to objective collapse. It lacks the dualism of Copenhagen, with its clear distinction between classical and quantum, but instead offers a layered approach where you start off fully quantum at the bottom, end up fully classical at the top, and with multiple dappled scales between them. (If quantum and classical are distinguished by whether or not variables commute, I guess that this can be thought of by considering the size of the commutators between two operators [A,B] in relation to the size of the system. If [A,B] = 0, or sufficiently small that it can't be measured, for all pairs of operators, then you have classical physics. If it is on the scale of ℏ times the number of particles, then the system is fully quantum; as it becomes smaller and the tails of the wavefunction are suppressed after decoherence you move from the quantum to the classical, and of course it might reduce to zero faster for some pairs of operators than others.) The quantum hylomorphism interpretation both embraces realism about classical objects, and can make sense of experimental practices. And it fits the actual practice of scientists well.
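
To put a little flesh on that guess, here is a minimal sketch of my own (it illustrates my parenthetical gloss above, not anything Koons says):

```python
# Sketch (my own illustration): spin-1/2 observables do not commute, so a
# definite value of S_x leaves S_z genuinely undefined; for a macroscopic
# angular momentum the same algebra holds, but the non-commutativity is
# negligible relative to the scale of the system.
import numpy as np

hbar = 1.054e-34  # J s

sx = hbar / 2 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = hbar / 2 * np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = hbar / 2 * np.array([[1, 0], [0, -1]], dtype=complex)

comm = sx @ sy - sy @ sx
print(np.allclose(comm, 1j * hbar * sz))   # True: [S_x, S_y] = i*hbar*S_z

# Relative size of the commutator for spin-1/2: order one, i.e. fully quantum.
print(np.linalg.norm(comm) / (np.linalg.norm(sx) * np.linalg.norm(sy)))

# For a macroscopic angular momentum L ~ 1 J s the same relation gives a
# relative non-commutativity of order hbar / L ~ 1e-34: for all practical
# purposes the variables commute, which is the sense in which such a system
# behaves classically.
print(hbar / 1.0)
```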

There are a few apparent tensions between quantum physics and an Aristotelian philosophy. Firstly, there appears to be a violation of the law of non-contradiction or the excluded middle arising from superposition.

The theorem of John S Bell demonstrated that, if quantum theory is correct, we cannot suppose that individual particles take definite paths -- that is we cannot assume that quantum probabilities merely reflect our ignorance of which path is actual.

I'm personally not sure about this quote. There are two understandings of "paths". The most familiar one (to me) is in the context of the path integral formulation of quantum physics, where essentially we consider every possible and potential history of states a particle could take between A and B, calculate the amplitude for each of these histories, and sum up those amplitudes to get a total amplitude that the particle will arrive at B. This representation is not directly related to Bell's theorem. Possibly Koons means instead by a definite path that all the quantum variables are fixed and determined between the origin of that particle and its measurement. This is certainly one way of interpreting Bell's theorem (if we dismiss the possibility of non-local contributions to the dynamics). I'm not sure that this undermines the idea that quantum probabilities merely reflect our ignorance. Psi-epistemic interpretations are not ruled out by Bell's theorem (and certainly not if they allow for some non-local influence of quantum events).
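
For concreteness, here is a minimal sketch of my own (not from Koons' book) of what Bell's theorem, in its CHSH form, does constrain: the singlet-state correlations predicted by quantum theory exceed the bound that any local hidden-variable account must satisfy.

```python
# Sketch of the CHSH form of Bell's theorem (my own illustration): for the
# spin singlet, quantum theory predicts E(a, b) = -cos(a - b) for measurements
# along directions a and b. With the standard choice of angles the CHSH
# combination reaches 2*sqrt(2), whereas any local hidden-variable account is
# bounded by 2.
import numpy as np

def E(a, b):
    return -np.cos(a - b)   # singlet-state correlation

a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), "vs local bound 2, quantum bound", 2 * np.sqrt(2))
```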

Koons argues that if the superposition in the wavefunction reflects something in reality rather than our knowledge, superposition seems to imply that in some sense the particle can be in two places at once, which seems to violate the principle of non-contradiction. I am not sure that it does, even on a psi-ontic interpretation -- one needs the additional premise that a particle being in one place logically excludes it being in another place at that time, which someone who believes that the wavefunction (including in a state of superposition) is real would reject. What would cause a contradiction would be if the wavefunction could be in two different superposed states in the same basis. But that isn't permitted in quantum physics.
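
To illustrate the point in elementary terms (my own sketch, not Koons'): a superposition over two positions is a single state vector assigning amplitudes, not a simultaneous assertion of two incompatible definite positions.

```python
# Tiny sketch (my own): a superposition is one well-defined state vector, not
# two contradictory assignments. The Born rule gives probabilities for where
# the particle would be found, but the state itself is a single object.
import numpy as np

x1 = np.array([1.0, 0.0])            # "particle at position 1"
x2 = np.array([0.0, 1.0])            # "particle at position 2"
psi = (x1 + x2) / np.sqrt(2)         # a single, normalised superposed state

print(np.linalg.norm(psi))                   # 1.0: one legitimate state vector
print(abs(psi @ x1)**2, abs(psi @ x2)**2)    # 0.5, 0.5: Born-rule probabilities
```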

Koons argues that this objection concerning superposition fails to take into account the distinction between actuality and potentiality. In a double slit experiment, the electron is not a substance, but is rather a feature of the actual source that generates the electrons. This electron has the unactualised potential of passing through either slit; but when actualised it will always be in one place at a time.

Next, there appear to be violations of the causal principle. The apparent indeterminacy of quantum physics is not really an issue, since there is a difference between being determined and being caused. Aristotle's philosophy has always accepted that rational agents can act as indeterministic causes. Quantum physics merely extends this to non-rational substances as well. The spontaneous creation of particles from the vacuum seems to violate the principle that nothing comes from nothing. But Koons does not regard virtual particles as substances, but only as potential actions of substances. The particle is generated by a quantum field, which represents a potential action. Only when the particle is detected does it become actual; and it is only actual particles which are subject to the principle that nothing arises from nothing.

The issue of non-local correlations poses significant problems for microphysical materialists, but no difficulty for Aristotelians. Aristotle, believing the speed of light to be infinite, accepted spontaneous influence at a distance. He was, of course, wrong about light, but this example shows that Aristotle's philosophy has no problem with the general concept of correlations at a distance.

So that is my summary of Koons' account (obviously abridged -- for full details you should read the book). What do I make of it?

I have some reservations about his presentation of Pluralistic Quantum Hylomorphism. I agree with much of it. Certainly, one can consider quantum physics in terms of different levels, starting from the most fundamental layer (that we know of) of leptons, quarks, photons, gluons and so on, to a layer that describes the simplest bound states (protons, pions, Kaons, and so on), and then gradually up the chain through atoms and molecules to the macroscopic layer. These levels are linked through the framework of effective field theory, and there are genuinely different Hamiltonians, and consequently different descriptions of the theory, at each level. I also agree that this structure is naturally described by hylomorphism, where for each level the matter is taken to be the more fundamental layer and the form is imposed on that layer. I am less enthused about his need for a clear distinction between the quantum and the classical layers. It is true that in general (with some exceptions) as you get further away from the fundamental layer of physics, classical physics becomes a more accurate approximation to the description of the physics. And the point arises where the approximation is so good that there is no point in adding in the additional complexities of quantum physics. But there isn't a sharp divide where we can say "below this scale is quantum and above it is classical." It is a matter of a gradual change through different degrees. This makes life harder for Professor Koons, as he needs a clear distinction at some point. He accepts the larger scale objects as substances, while denying that fundamental particles can be regarded as substances (instead these only exist virtually within substances defined in less fundamental layers of physics). At some point, he needs a clear distinction between not-a-substance and is-a-substance, and ideally that ought to be reflected in the physics, perhaps by the quantum-classical distinction. So I think there is a bit more work needed here.

Leaving that aside, I am in agreement with his general conclusions in this chapter. It is abundantly clear that the microphysical materialism that developed alongside classical physics is dead, both from its own internal contradictions, and most significantly from its disagreements with quantum physics. The pilot wave and Everett interpretations are probably closest to allowing that philosophy to continue. Koons will discuss both those interpretations in later chapters, so his focus on just the Copenhagen, objective collapse and his own Pluralistic Quantum Hylomorphism interpretations will be rectified in due course. I also agree with Koons (perhaps more controversially) that the solution to both the internal problems of microphysical materialism and its conflict with quantum physics is to bring back elements of Aristotle's philosophy. But we also shouldn't ignore the developments of post-scholastic philosophy and science -- it has got many things wrong, but not everything. Microphysical materialism is wrong in many of its underlying premises, but still has some correct premises and has developed some important conclusions. I disagree with it, but (unlike post-Hegelian philosophy) I can respect it, because it is a rational position and the world could have been like that. Scholastic philosophy was broadly correct in its underlying premises, but made mistakes in working through the detailed consequences of those premises (particularly with regards to the physical sciences, but also in some other areas). These mistakes were partly due to lack of knowledge, and partly because of some misleading approaches. We should seek to combine the best conclusions drawn from the modern sciences, and use those to critique and correct the work of the scholastic philosophers, rather than, as was done in the late medieval and Renaissance periods, abandon them completely. This would resolve the various problems of modern philosophy, and I think that quantum physics makes clear that it would support rather than undermine the underlying principles of hylomorphism and moderate realism. And in this I think Koons would agree with me.

There are a few points where I think Professor Koons was in error in this chapter.

  1. While I agree with Professor Koons that quantum physics relies on the concept of potentiality, I am not convinced that Feynman's sum over histories is the best place to show this, or at least it is more complex than he seems to suggest. The problem I have is that he discussed the sum over histories as though it referred to the path of a single particle, with one of those paths actual and the rest potential, but the potentiality still having an effect on the overall outcome. My difficulty with this is that the sum over histories is used to calculate an amplitude, which is converted to a probability, which is used to predict the outcome for an ensemble of particles. It is quite possible that within that ensemble (which would be of infinite size for the measured frequency distribution to precisely match the calculated probability distribution) each path is sampled by one particle or another. So I think more work needs to be done to establish an equivalence between the possible paths and Aristotelian potentiality.

    Of course, there are paths which sum up to a zero amplitude, and hence zero probability, and these will not be frequented by any particle. They still contribute to the sum, they are still potential paths (in the sense that they are not forbidden by any conservation law or symmetry), but they would never be actualised. So possibly Professor Koons' example can still be rescued on this basis, but it needs more work. I think there are better examples to use when considering the relationship between quantum physics and Aristotelian potentia.

  2. I think that Professor Koons is correct to say that in classical physics the differential equation approach most naturally supports a mechanistic philosophy, while the Lagrangian approach most naturally supports a teleological philosophy. But does this apply to quantum physics? The first problem is that there are more than just two ways of formulating classical physics. There is also, for example, the Hamiltonian approach. Perhaps Professor Koons included this within the overall Lagrangian approach, since there is a close mathematical relationship between the Hamiltonian and the Lagrangian. But Hamiltonian mechanics once again reduces to solving differential equations, and as such is possibly more in tune with a mechanistic philosophy.

    In quantum physics, one also has a choice between different formulations. The most significant of these are the canonical approach, which is closest in inspiration to the Hamiltonian formulation of classical physics, and the path integral approach, which is closest to the Lagrangian formulation. The canonical approach does not strike me as any more teleological than Hamiltonian classical mechanics. Even in the path integral approach to quantum physics, there is no longer a principle of least action, which was the main driver for saying that Lagrangian classical physics is teleological. I won't dispute that in the path integral approach one needs to look at the overall picture -- the total amplitude for each outcome -- and maybe one can say it is teleological on this account. The principle of least action can be derived from the path integral by considering destructive interference away from the path of minimal action (I sketch this numerically just after this list). But there are still complexities here of which Professor Koons seems to be unaware. I agree with him that quantum physics is teleological, but there are better ways of establishing that (such as the other argument he gives in terms of powers).

  3. I think that he has misconstrued what Bell's theorem shows. Bell's theorem (or rather its experimental confirmation) shows that there must be non-local correlations between certain events in quantum physics which cannot be explained by a classical hidden variables theory (i.e. one in which all the properties were fixed when the entangled particles last directly interacted with each other). Koons, however, seems to relate it to the problem of superposition, particularly between different possible paths to an outcome, and claims that Bell's theorem rules out an explanation of that superposition in terms of our ignorance of which state is actual. In other words, he is stating that Bell's theorem rules out psi-epistemic theories. But this is not what Bell's theorem shows. Psi-epistemic interpretations such as consistent histories or QBism both imply violations of the Bell inequalities, exactly as in standard quantum physics. At best, Bell's theorem can be said to rule out certain types of hidden variables theories, but not all psi-epistemic theories. There are other no-go theorems which do purport to rule out psi-epistemic theories, such as the PBR theorem I discussed in the previous post. But, as I stated in that post, I think it has enough loopholes that the major psi-epistemic theories escape its grasp.
  4. I also disagree with Professor Koons when he states that virtual particles are not real. By virtual particles, I mean particles which are created and then annihilated as the system passes from its initial to its final state. They are not observed, but are crucial to the calculation. I would agree that virtual particles are (by definition) unobserved. As such, we cannot know which virtual particles were created and annihilated as the system moved from initial to final state. But that does not mean that they are not real. We know that creation and annihilation of particle states happen. It would be amazing if it only happened the minimum amount required to produce the observed outcome. Obviously we tend to work in an unrenormalised basis, meaning that the virtual particles used in the Feynman diagrams won't correspond to reality. But they will still have renormalised counterparts. Certain experimental results (e.g. the Lamb shift, or the dynamical Casimir effect) can only be theoretically calculated by assuming the existence of virtual particles. Thus we have good reasons to infer their actual existence, even if they are not observed.

    Also of interest are fictitious particles such as phonons or ghosts. Phonons arise in the effective field theories used in condensed matter physics, where they mediate the vibrational modes of atoms, and are used in a similar way to gauge Bosons in particle physics. Like virtual particles, they are not directly observed, but only indirectly through their effects. However, I don't see how a supporter of hylomorphism in particular can deny their real existence, any more than they can deny the real existence of the other particles which emerge from the effective field theory (such as protons, neutrons, molecules, and atoms). Ghosts, on the other hand, arise as an artefact of gauge fixing. This is an important part of the theoretical calculation, but it is something we introduce into the theory rather than arising inevitably from the theory. I would treat ghosts as just a mathematical artefact rather than something physical.
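
Here is the sketch promised under point 2: a toy free-particle calculation of my own (with an arbitrary one-parameter family of deformed paths) showing that contributions from paths far from the classical one tend to cancel, which is how the least action principle re-emerges from the path integral.

```python
# Toy free-particle sketch (mine, not from the book): deform the classical
# straight-line path by an amount a. The action S(a) is stationary at a = 0,
# so exp(i S / hbar) varies slowly for near-classical paths and oscillates
# rapidly far from them; summed over a bundle of paths, the far-from-classical
# contributions largely cancel.
import numpy as np

hbar, m, T = 1.0, 1.0, 1.0
t = np.linspace(0.0, T, 2000)
dt = t[1] - t[0]
x_classical = t                                   # straight line from x(0)=0 to x(T)=1

def action(a):
    x = x_classical + a * np.sin(np.pi * t / T)   # endpoint-preserving deformation
    v = np.gradient(x, dt)
    return np.sum(0.5 * m * v**2) * dt

for centre in (0.0, 2.0, 4.0):
    bundle = np.linspace(centre - 0.5, centre + 0.5, 400)
    mean_amp = np.mean([np.exp(1j * action(a) / hbar) for a in bundle])
    print(f"paths near a = {centre:3.1f}:  |mean amplitude| = {abs(mean_amp):5.3f}")
# The bundle around the classical path adds coherently; the distant bundles
# largely wash out.
```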

Hylomorphism and the quantum world

This chapter sketches the basic requirements of an Aristotelian philosophy of nature, and in particular its balance between formal and material modes of explanation. It introduces a notion of ontological escalation corresponding to relations between different levels of scale. It asks which entities in quantum physics can qualify as Aristotelian substances.

Deriving Hylomorphism

In this section, he suggests that there is no evidence from physics against hylomorphism, and that it is at least as well supported as the microphysicalist alternative.

Professor Koons outlines four ontological options:

  1. Powerism, where causal powers are fundamental.
  2. Hypotheticalism, where facts expressed by subjunctive conditionals are fundamental.
  3. Nomism, where causal laws of nature are fundamental.
  4. Neo-Humeanism, where none of these are fundamental, but all are grounded in the mosaic of categorical qualities distributed across spacetime.

Koons states that the choice is now between powerism and Neo-Humeanism, with the other two having fallen out of favour. I am not convinced that Koons is correct in so quickly dismissing Nomism. My reading of Maudlin, for example, is that he would support this view (albeit maybe expressing it slightly differently, as just that the laws of nature are fundamental). Koons' complaint about Nomism is that it requires attributing a causal power to the laws themselves. I have some sympathy with this complaint, although its strength depends on what we take the laws of physics to be. If the laws of physics are taken to be some abstract principle, then I agree with Professor Koons that we cannot reasonably attribute a causal power to them. If they are taken to be a description or partial description of how some immaterial being interacts with the universe, then I have no objections to attributing active causal powers to that immaterial being.

Koons also dismisses Neo-Humeanism, as it fails to provide adequate accounts of the directionality of time and causality, of dispositions and powers, of objective probability, or of scientific induction.

So that leaves a powers ontology, with its forms, processes, and active and passive powers. Aristotelian forms are required to explain why various powers are clustered together into the same object. Otherwise we would be left with just massive coincidences that the same powers are always associated with the same type of substance.

The powers based ontology is an updated version of scholastic philosophy. The neo-Humean ontology is what has developed from modernism.

In the powers view, science is primarily concerned with identifying forms and explaining why and how they are related to various powers. While mathematics can be a useful tool, it needs to be supplemented in order to get a complete picture. Ultimately we are concerned with real natural kinds rather than mathematical abstractions. These kinds can exist at any scale. But our initial thoughts concern what we directly observe at the macroscopic level (such as biological organisms, or measurement instruments in physics). In the modern view, science is primarily about discovering mathematical relations which either explain or govern observable phenomena. The ultimate goal is to find a single set of laws that apply to all interactions at all scales. Ultimately, in this picture, we are most concerned with the fundamental layer of reality. Compound substances are just regarded as emerging from the sum of their parts.

Since the scientific revolution, the modern view has dominated. Many theists have tried to accommodate this by embracing dualism or some form of idealism or holism. This approach is problematic for various reasons, such as the interaction problem. This leaves three options: atomism, where only the smallest particles are metaphysically real; monism, where only the universe as a whole is metaphysically real; or pluralism, where metaphysically fundamental entities can be found at any scale. Professor Koons argues for pluralism, as it alone resolves problems concerning human agency and human epistemology.

Professor Koons outlines the notion of ontological escalation. This is based on the idea that the world consists of a number of levels of compositional scale. Except for the smallest scale, all entities are composed from smaller beings, and their powers and causal relations are partially grounded in facts about the smaller scale entities. Except for the largest scale, some entities have powers or causal relations which are partially grounded in facts concerning the larger scale entities.

Thus, for your standard entity in the middle of the sequence of scales, knowledge of the smaller scale entities of which it is composed is required to fully understand it, but equally there are other parts of our representation where we cannot understand the smaller scale entities without reference to the layers above them. The same is true when we look up from the standard entity to larger objects. Dependence on the smaller scale corresponds to material causation; dependence on larger scale entities corresponds to formal causation.

This is similar to, but not quite the same as, the concept of emergence, where the constituent entities do survive the fusion into composite beings.

Professor Koons defines a substance as an entity that exists in the most fundamental meaning of that term. Substances constitute the most important layer of reality. They can be composite, but are then metaphysically prior to and more important than their parts. This definition supposes that substances are irreducible (they cannot be reduced to more fundamental metaphysical beings), orthogonal (they do not overlap), and complete (everything can be described in terms of one substance or another). This is known as the tiling constraint. The substance can be composed of parts which are unified by a substantial form. The form is not simply a description of the nature of the parts and their arrangement in space, in part because the substantial form grounds and explains those natures and that arrangement. It also explains the bonds between the parts, and any powers which arise from their joining together into a single substance. The powers of a substance are not reducible to the sum of the powers of the individual parts. Matter plays the role of individuation. Parts can have natures which can be considered as being dependent or independent of the whole: integral and virtual parts respectively. An accident, or abstract particular, similar to the modern concept of a trope, is a real entity in its own right, but bound to and arising from a single substance.

So what are the world's substances? Koons lists six possible candidates.

  1. Organisms
  2. Inorganic beings
  3. Artefacts
  4. Groups of organisms or inorganic beings
  5. Elementary particles
  6. The cosmos as a whole

An organism is a living being, and the paradigm case of a substance. Each living organism has certain powers and potentialities which are grounded in the various parts of the organism, albeit dependent on those parts being in good working order. However, many of these powers are emergent, i.e. their functions can only be explained in terms of their role in the organism itself.

The inorganic beings are non-living equivalents of organisms, in that the powers of the being depend on the being as a whole rather than just reducing to its individual parts. An isolated molecule, a rock, or a metal crystal would be examples of such a being. These would also qualify as substances.

An artefact is not a substance in its own right, because it is merely the sum of its individual parts. The powers of an artefact can be explained entirely in terms of the powers and arrangements of those parts. An artefact also depends on extrinsic facts, and it only has a clear identity, and only persists as that entity, because some human says so.

Groups of organisms or inorganic beings likewise are not substances, as the nature and properties of their parts (the individual beings) are not grounded in the group as a whole. A grain of sand on the beach is not dependent on the rest of the sand on the beach in the same way that a living cell depends on, and is defined in terms of, the animal of which it is a part. The individual grain of sand is a substance, but the totality of all the sand is merely a collection of substances.

The cosmos as a whole is not a substance. For example, the powers of an organism are not explained in terms of its place in the universe as a whole. Empirical knowledge depends on an interaction between the observer and the object being studied. This relies on the object under study being isolated from its wider environment. We also cannot observe the universe; we only see individual things within the universe.

Professor Koons argues that fundamental particles do not qualify as substances. In quantum physics, particles lose their individual identities as a result of being incorporated into quantum systems. When there is a unified or entangled system, there is no distinct identity associated with either particle. This, he continues, replaces classical Maxwell-Boltzmann statistics with Bose-Einstein or Fermi-Dirac statistics. His main argument for this is based on the indistinguishability of particle states. He also claims that the number of particles in relativistic quantum field theory can vary according to one's frame of reference. He also states that, outside Bohmian mechanics, particles lack a definite position and do not follow definite trajectories. These only appear in the context of a measurement, when the particle becomes entangled with a larger system. There are thus, according to Koons, good reasons to reject the idea that individual particles ought to be regarded as substances.
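
As a reminder of what the change of statistics amounts to, here is a trivial counting sketch of my own (not from the book): two particles shared between two single-particle states.

```python
# Illustration (mine, not Koons'): two particles distributed over two
# single-particle states. Distinguishable (classical Maxwell-Boltzmann)
# counting gives 4 microstates; indistinguishable bosons (Bose-Einstein) give
# 3; indistinguishable fermions (Fermi-Dirac, with the exclusion principle)
# give only 1.
from itertools import product, combinations_with_replacement, combinations

states = ["A", "B"]

maxwell_boltzmann = list(product(states, repeat=2))               # labelled particles
bose_einstein = list(combinations_with_replacement(states, 2))    # unlabelled, repeats allowed
fermi_dirac = list(combinations(states, 2))                       # unlabelled, no repeats

print(len(maxwell_boltzmann), maxwell_boltzmann)  # 4
print(len(bose_einstein), bose_einstein)          # 3
print(len(fermi_dirac), fermi_dirac)              # 1
```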

I don't fully agree with Professor Koons here. In most cases, his conclusion is correct. A free quark, for example, is not observed except in extreme combinations of temperature, chemical potential, and magnetic field. It becomes bound, together with gluons and other gauge Bosons, into a proton, neutron, or other particle. The compound particle would then be the Aristotelian substance (unless it itself is absorbed into some larger compound being). This would be the case for almost all fundamental fermions in the universe. They are bound into a hadron, atom, or larger substance, and thus lose their individuality.

However, it is possible to have free and isolated leptons, such as electrons or neutrinos, and also photons. True, we cannot observe them until they become entangled with some sort of detector, but that doesn't mean that they are not there. And I think it quite reasonable to treat these (or, at least, the renormalised counterpart of the electron or neutrino) as substances in their own right.

So what of Professor Koons' reasons for rejecting the idea of fundamental particles as substances? Firstly, he considers entangled systems of particles. Not every particle is entangled, and even for those that are, entanglement states only that certain measurements of those particles are correlated. This strikes me as a significantly weaker statement than saying that they lose their individual identities entirely. Then he states that in most interpretations of quantum physics the wavefunction of the particle is spread out, potentially across the universe. This is only the case in certain interpretations of quantum physics. Koons rightly mentions the pilot wave interpretation as an exception; I would also add many of the psi-epistemic interpretations. I should also say that it is not just fundamental particles which have wavefunctions spreading out over space, but also many compound particles, including some which Professor Koons would be happy to label as substances.

Koons' most interesting point relates to the particle number being different in different reference frames. I don't have access to the sources he cites, so I can only guess that he is referring to the Unruh effect. This is related to additional radiation which is claimed to be observed in an accelerating reference frame (with an equivalent effect in a strong gravitational field). I can't see how there can be a difference when comparing non-accelerating inertial frames (as long as one correctly redefines the region in which one counts particles), since the number operator (integrated over volume) is Lorentz invariant. When discussing accelerating reference frames, one is starting to move into semi-classical quantum field theory. Here things get more complicated; for example, one needs to be careful when defining the vacuum state or what is meant by a particle. This is an area where I need to do more research myself, so I am reluctant to make any more comments than this at this time. But I will say that the Unruh effect doesn't just apply to fundamental particles, but to any physical being, including those Professor Koons would regard as substances, so I don't think that Professor Koons can use this as a reason for excluding isolated fundamental particles (such as electrons or neutrinos) from being substances. He would need to find a reason for saying why the effect does not prevent compound substances from being substances, and yet that reason is not applicable once we get to the fundamental level.

Deriving hylomorphism

Professor Koons' thesis is dependent on the idea that one cannot reduce the objects of chemistry and biology to more fundamental physical constituents. The objects of chemistry and biology are themselves fundamental.

So what is meant by reduce? Professor Koons discusses both the dynamical laws describing an object and the phase space describing that object. If these are the same as the laws and space describing the parts of that object, then there is this sort of reduction. So, for example, in a mechanistic Newtonian physics, the forces acting on a compound object (both internal and external) are just the forces acting on the individual parts of that object. (The Newtonian dynamical laws of course just reduce to the three laws of motion coupled with a list of all the applicable forces.) The phase space of allowed states of the compound object is just the product of the phase spaces of each of its parts. There is nothing additional or changed when these come together to form a new object. However, many alternative accounts of reduction only focus on the dynamical laws, and ignore the importance of the phase space.

In classical physics, the problem of whether we can reduce the phase space is straight-forward. The phase space for every object, no matter its scale, consists of its location, momentum, and perhaps a few other properties. The same type of phase space is thus used for everything. The momentum of a compound object is just the sum of the momenta of its parts. The location of the compound object is spread over the locations of its parts. The problem of reducing the phase space doesn't occur. For this reason, people can get away with just considering whether the dynamical laws reduce.

But in quantum physics, this is no longer the case. The way I would argue this is to look at effective field theory. You perform a change of variables, so that the Hamiltonian and the creation and annihilation operators are completely different, perhaps integrating out unneeded variables. Since the underlying phase space is the Fock space defined by the creation and annihilation operators, the phase space for the effective field theory is not the same as that for the original theory. Thus one argues that the compound entities do not reduce to those of the fundamental particles. There is also a non-linear change in the Hamiltonian operator, so the dynamical laws also do not straight-forwardly reduce to those of their fundamental parts as they do in a Newtonian mechanism.
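
A rough numerical sketch of the sort of change of variables I have in mind (my own illustration, using a single-mode Bogoliubov transformation as a stand-in for the field-theory case):

```python
# Sketch (mine, not from the book): a Bogoliubov transformation
# b = cosh(r) a + sinh(r) a_dagger defines new creation/annihilation operators
# obeying the same commutation relation, but the vacuum of the b-modes is not
# the vacuum of the a-modes, so the two descriptions build different Fock spaces.
import numpy as np

N, r = 60, 0.5                                  # truncated oscillator basis, squeeze parameter
a = np.diag(np.sqrt(np.arange(1, N)), k=1)      # annihilation operator in the number basis
ad = a.conj().T

b = np.cosh(r) * a + np.sinh(r) * ad
bd = b.conj().T

# Same algebra, away from the truncation edge of the basis:
comm = b @ bd - bd @ b
print(np.allclose(comm[:N-5, :N-5], np.eye(N)[:N-5, :N-5], atol=1e-8))

# The state annihilated by b (lowest eigenvector of b_dagger b) is not the
# a-vacuum: its expected a-quanta number is sinh(r)^2, not zero.
vals, vecs = np.linalg.eigh(bd @ b)
b_vacuum = vecs[:, 0]
n_a = np.real(b_vacuum.conj() @ (ad @ a) @ b_vacuum)
print(n_a, "expected ~", np.sinh(r)**2)
```

The analogue in an effective field theory would involve infinitely many modes, but the moral is the same: the new variables carry their own vacuum and their own ladder of states.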

Professor Koons' approach is to draw a clear distinction between fundamental particles, described by particle physics, and compound thermal substances, described by thermodynamics and statistical physics. Thermal substances have properties, such as temperature and entropy, which are absent in the fundamental particles. He states that the fundamental particles contain only finitely many degrees of freedom (such as the location and momentum of each particle). Thermal substances, on the other hand, have infinitely many degrees of freedom. Or, at least, the number of degrees of freedom is sufficiently large that it can be well approximated by a model that uses an infinite number of degrees of freedom. This is a potential rather than an actual infinity. It counts the number of possible states rather than the number of states which are actually occupied. In classical physics, constructing an infinite number of degrees of freedom from a finite number of parts, each with a finite number of degrees of freedom, would be impossible, but quantum physics operates by different rules.

It is common when studying quantum systems to take infinite limits. One example of this is the continuum limit, where the limit is taken as the number of particles approaches infinity while the density is kept constant. This limit can introduce new structures into the representation. It is crucial to allow a description of phase transitions, entropy and other thermodynamic phenomena. The methods used in contemporary condensed matter physics and chemistry allow theorists to introduce selection rules which distinguish between different phases of matter. If these models are to be genuinely explanatory, then the use of the continuum limit must be justified in ontological terms. The molecules really cooperate in such a way that they fuse into a dynamic Aristotelian continuum in space. The molecules form a continuous field of matter, with an infinite number of distinct sub-systems. The quantum particles then no longer relate to our three-dimensional space as discrete, separate units, but as a single, cooperating mass, with new dynamics and a new Hamiltonian. This is contrary to microphysicalism. The escalation does not depend on any new force. It is due to a need to re-express the degrees of freedom describing the physical state.
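
To make the limit in question explicit (my notation rather than Koons'), the continuum or thermodynamic limit takes

$$ N \to \infty, \qquad V \to \infty, \qquad \frac{N}{V} = \text{constant}, $$

and it is only in this limit that quantities such as the free energy density can develop the non-analytic behaviour associated with phase transitions; for any finite N the partition function is a finite sum of analytic terms.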

Professor Koons proposes that the substantial form of a thermal substance defines an appropriate topology and algebra. The underlying Hilbert space represents the microstates in each representation. The ontic states of each substance correspond to a subset of the linear functionals on the algebra. Each thermal substance corresponds to a set of mutually commuting observables in the quantum algebra, representing the substance's essential properties. These are represented by disjoint spaces rather than vectors. Because the observables are mutually commuting, thermal substances do not exhibit superposition in their essential properties. Accidental properties can still be observed in the virtual parts of the substances, and these can exhibit superposition.
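
In the algebraic formulation Koons is drawing on, a state is a positive, normalised linear functional on the algebra of observables rather than a vector in a Hilbert space; schematically (standard notation, not specific to Koons),

$$ \omega : \mathcal{A} \to \mathbb{C}, \qquad \omega(A^\dagger A) \ge 0, \qquad \omega(\mathbb{1}) = 1. $$

Different phases of matter then correspond to unitarily inequivalent (disjoint) representations of the same algebra, which is how disjoint spaces, rather than superposable vectors, enter the picture.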

Professor Koons goes on to relate this idea of ontological escalation to six phenomena, among them the irreversibility of time, spontaneous symmetry breaking, and phase transitions.

I won't go into details for all these points -- you would have to consult his book and the references he provides. Professor Koons seeks a way of grounding the irreversibility of time in the continuum limit. A finite system cannot ground the irreversibility of time, so we need to consider the continuum limit to explain it. I am not sure this is wise or necessary; I prefer to see the irreversibility of time as something more fundamental to the universe, which ought to be regarded as an assumption of the physical model rather than a conclusion from it (similar to how we assume that there are 3+1 dimensions with an underlying Riemannian geometry, rather than extract this from the physics). Since the irreversibility of time is so fundamental to the model, I think any other approach would just become circular.

Spontaneous symmetry breaking is when a symmetry of the Hamiltonian is broken by a "random" selection of one of a number of ground states. The consequences of this phenomenon play an important role in both particle and condensed matter physics. Professor Koons notes that a system with infinite degrees of freedom is required to exhibit spontaneous symmetry breaking, which he relates to the continuum limit at the heart of his model of ontological escalation. Professor Koons states that phase transitions are an example of spontaneous symmetry breaking. Against his argument here, I note that the source he relies on only cites one particular example of a phase transition. This does not show that all phase transitions are due to the same mechanism. However, the idea that phase transitions require an infinite system is, I think, well established. A phase transition is described by a discontinuous change in some observable, the order parameter. To have a discontinuous change as the solution to a finite system of differential equations (without any infinities in the parameters to those equations) is not possible. One has to go to a system with an infinite number of degrees of freedom. Avoiding this conclusion requires either denying that transitions between phases can happen, or supposing that the infinite model is a useful approximation to what could be explained by a finite model (i.e. that the change in the order parameter is not really discontinuous, but only a very, very sharply varying continuous function). Neither of these fits well with current research into the observed phenomena.
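
To illustrate the point that a finite system cannot produce a genuine discontinuity, here is a small sketch of my own (not taken from Koons' book, and with a function name of my own invention) using the mean-field Curie-Weiss model, for which the equilibrium magnetisation can be computed exactly at any system size:

```python
import math

def avg_abs_magnetisation(N, T, J=1.0):
    """Exact <|m|> for the Curie-Weiss (mean-field Ising) model with N spins.

    H = -(J / 2N) * (sum_i s_i)^2, so a configuration with k up-spins has total
    spin M = 2k - N and Boltzmann weight C(N, k) * exp(J * M^2 / (2 * N * T)).
    For finite N this is a finite sum of analytic terms, so <|m|>(T) is smooth;
    a sharp transition only appears in the limit N -> infinity.
    """
    beta = 1.0 / T
    log_w, mags = [], []
    for k in range(N + 1):
        M = 2 * k - N
        log_w.append(math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
                     + beta * J * M * M / (2.0 * N))
        mags.append(abs(M) / N)
    shift = max(log_w)                       # guard against overflow
    Z = sum(math.exp(lw - shift) for lw in log_w)
    return sum(m * math.exp(lw - shift) for m, lw in zip(mags, log_w)) / Z

# Near the mean-field critical temperature T_c = J the curve sharpens as N
# grows, but for any finite N it remains a smooth function of T.
for N in (20, 200, 2000):
    print(N, [round(avg_abs_magnetisation(N, T), 3) for T in (0.8, 1.0, 1.2)])
```

The curve through the critical region steepens as N increases, but only the infinite system exhibits a genuine non-analyticity, which is the point Koons is relying on.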

So that is Professor Koons' presentation (albeit in summary and with a few comments of my own scattered here and there). What do I make of it? To my mind, it is a bit mixed. I don't actually disagree with his conclusions concerning how fundamental particles are absorbed into thermodynamic substances, or his characterisation of substantial forms in terms of the topology and algebra appropriate for those substances. I think that he ought to have mentioned decoherence when discussing how classical properties emerge, which would provide a more rigorous account than the one he supplied.

I saw two big weaknesses in his presentation. Firstly, there is his claim that fundamental particles, as described by particle physics, exhibit a finite number of degrees of freedom. This is correct in wave mechanics, but incorrect in quantum field theory (the actual theory of these particles). The idea that these particles are characterised by location and momentum -- non-commuting observables which cannot simultaneously take definite values -- is a simple mistake. The second error is in his discussion of the six phenomena. He states that an infinite system is required to exhibit them. This is correct (albeit, as stated, I have reservations about this approach to explaining why time objectively has a direction). However, he then jumps to say that the required infinity arises from the continuum limit. I am not convinced this follows. For example, spontaneous symmetry breaking (e.g. in the Higgs model or the QCD vacuum) and phase transitions (e.g. between a quark gluon plasma and confined quarks) are observed in particle physics models which do not make use of the continuum limit (even though that limit is important in condensed matter physics and quantum chemistry). I don't think that his model adds much to the discussion of these phenomena.

But as stated I think his overall conclusion is reasonable. He just could have done a better job of arguing for it.

Back to Professor Koons' narrative. He discusses four objections.

  1. Anti-realism concerning thermodynamic properties and phenomena. I agree with Professor Koons that this is not reasonable. Such properties are observed. If they disagree with your metaphysics, then that is a reason to abandon your metaphysics rather than suppose they don't exist.
  2. It could be argued that infinite models are merely mathematical conveniences. The infinite models serve as a useful approximation to reality, but reality is in practice finite-dimensional with steep functions rather than genuine discontinuities. Professor Koons rightly argues that this doesn't work. Certain theories do require a genuine singularity to make their predictions.
  3. The microphysicist could complain that the cost of abandoning atomism is too high. Professor Koons rightly dismisses it. If there are philosophies which explain quantum physics better than atomism, then there is no good reason why we should hang onto atomism. If that means rejecting much of the philosophy we hold dear, then so be it. It is much better to reject one's cherished philosophy than to reject scientific findings which contradict it. One can, and ought to challenge the science, but there comes a point where you have to admit that those challenges fail, and in the case of quantum physics and microphysicalist atomism, we are well past that point.
  4. Coupling with the electromagnetic field can provide the infinite number of degrees of freedom. Advocates of this view would agree that there are an infinite number of degrees of freedom at the thermodynamic level, agree that this is required to explain the phenomena Professor Koons discusses, but disagree that the process of ontological escalation is needed to provide this infinity. And, as you would note from my objections above, I do have sympathies for this particular criticism.

    Professor Koons offers three responses.

    • He states that we don't really know that the electromagnetic field entails an infinite number of degrees of freedom, mentioning the use of energy cutoffs and other regulators as a possible way of avoiding it. I would disagree here. The purpose of adding a regulator -- and not all regulators remove the infinite number of degrees of freedom (e.g. dimensional regularisation, or Pauli-Villars regularisation) -- is so that we can keep track of the singularities in a calculation in an unrenormalised basis in a systematic way. We then transform to a renormalised basis, which removes the singularities, and then remove the regulators. As soon as we remove the regulators we are back to an infinite number of degrees of freedom in the renormalised theory.
    • Professor Koons notes that it is not sufficient for a system to have an infinite number of degrees of freedom, there must also be an infinite number of sub-systems, with a non-separable Hilbert space representation. I agree, which is why I don't disagree with Professor Koons' overall model of ontological escalation when one switches to an effective field theory to describe a compound substance. But this isn't really an answer to the objection that the continuum limit is not required to produce the infinity needed to explain his six phenomena.
    • Thirdly, he states that it is not merely enough to add an infinite number of degrees of freedom to the system; these have to have relevance to the phenomena in question. I wouldn't dispute this; but Professor Koons needs to demonstrate rather than merely assert that only the continuum limit can introduce the correct infinite degrees of freedom to explain the phenomena.

As stated, I don't disagree with Professor Koons' overall idea. But I feel that the way he argues for it is insufficiently rigorous. He would be better off considering decoherence and effective field theory as a means to get to his conclusion.

The measurement problem

In this chapter, Professor Koons considers the Copenhagen, Bohm and objective collapse models. As before I will start by paraphrasing his argument, before offering my own comments.

What are the probabilities that are the result of quantum calculations? The traditional answer is that they are the probabilities for measurement results. But then we come to the issue of what is a measurement. If we just discuss things in a loose way initially before trying to pin the definition down specifically, then a quantum measurement would occur when a human experimenter and some instrument interact with a quantum system. Yet the macroscopic systems are assumed to also ultimately be governed by quantum dynamics. This leads to an infinite regress. The observer should also be treated as a quantum system, meaning that his brain states would also only be assigned probabilities rather than definite values.

However, in the hylomorphic philosophy, it is denied that macroscopic entities can be completely represented in terms of their microscopic parts. Thermal substances have classical, mutually commuting, properties, and as such it is wrong to suppose that they are governed by a quantum dynamics. A key assumption in the argument outlined in the previous paragraph is undermined, evading the paradox in its conclusion.

A macroscopic thermal substance has classical properties. It therefore cannot be in a superposition. It is wrong to say that the cat would be in a superposition of alive and dead states, or an ice cube in a superposition of frozen and melted states. It would be in one state or the other with certainty, even if we don't know which state it is in until we look.

Measurement thus occurs when a quantum system interacts with a thermal substance. This need not involve an observer, or consciousness. We thus avoid any implications of idealism. So a thermal substance will have a definite location in space, even if the individual quantum particles, which are just momentary accidents of the thermal substance, do not. When not actualised by measurement, individual quantum particles are merely powers of interaction, and usually non-localised ones.

This avoids a dualism between quantum and classical entities; rather, we have complementary entities (substances and accidents, form and matter) existing in mutual dependency.

In the hylomorphic picture, observers and instruments are substances (or composed of substances). Substances are not solely composed of quantum particles. The states of substances are not reducible to the states of the underlying particles. Thus there is no inconsistency in supposing that substances have classical properties exempt from superposition, and thus always have definite outcomes. If we try to solve the measurement problem with an alternative philosophy, dependent on powers alone and not the matter/form distinction, we would have to attribute those powers only to quantum particles. This includes both active and passive powers. Solving the measurement problem (as Professor Koons has expressed it) requires that observers have non-quantum passive powers. Quantum particles do not have this capacity, and so observers (and other macroscopic substances) cannot be reducible to just the quantum particles of which they are composed. There must also be something else determining their nature, which plays the same role as the form of hylomorphism.

There are certain constraints on any solution to the measurement problem. It must endorse the fact that our sensory perception of physical events is generally reliable. It must allow for reliable memory of past observations. The events we sample must be a reliable representation of the totality of the events. Professor Koons claims that each of the alternative interpretations of quantum physics must fail at least one of these tests. For example, the non-locality of quantum physics raises questions about the reliability of our sensations, because we appear to observe just our local environment, but our brains and senses could be entangled with distant objects which influence what we believe to be observations of local conditions.

With these introductory remarks, Professor Koons proceeds to discuss the Copenhagen interpretations, dualist interpretations (where human consciousness leads to a collapse of the wavepacket), the pilot wave interpretations, objective collapse theories, and (in the next chapter) the Everett many worlds interpretation. The power ontology of Hylomorphism would represent a sixth interpretation.

The Copenhagen Interpretations

The old Copenhagen view of Bohr was too narrowly dualist in its division between the classical and quantum worlds. The wavefunction evolves in a deterministic manner. Observable results must then be deduced from the theory using Born's rule. This gives testable probabilities for the result of measuring some classical parameter. This assumes that we can use classical physics to describe the measuring instruments. It leads to an implicit inconsistency if we adopt the microphysical materialist assumption that compound objects are reducible to their parts. The dividing line between quantum and classical is also not so clear cut, as quantum computers or supercooled fluids are macroscopic systems that possess quantum properties. We cannot consistently describe all macroscopic objects in purely classical terms.
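
For completeness, the Born rule referred to here: if the system is prepared in state |ψ⟩ and the measured observable has eigenstates |a_i⟩, the probability of obtaining the result a_i is

$$ P(a_i) = |\langle a_i | \psi \rangle|^2, $$

and on the Copenhagen view it is only at this point, when a classically described instrument registers a result, that probabilities enter the description.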

An alternate model by Primas postulates that the wavefunction collapses when it becomes correlated with a classical property of a disjoint system. Substances can have both classical and quantum properties (by virtue of their virtual parts). Interaction with the classical properties of entities in the environment drive quantum superpositions to eigenstates in a short period of time. This solution provides a continuous rather than discrete collapse into given states. (This model builds on the notion of quantum decoherence.) Professor Koons believes that his hylomorphic model is similar to the Copenhagen model, but with hylomorphism replacing the dualism between the classical and quantum entities, and making use of Primas' description of a gradual objective collapse.

The pilot wave interpretation

Like the hylomorphic interpretation, the Bohmian approach adopts a realist stance towards the classical world. It seems to offer neo-Humeans and microphysicalists the best chance of surviving the quantum revolution. However, Professor Koons does not regard it as epistemologically adequate, as it casts doubt on the reliability of our senses and memories.

Koons' argument is adapted from a work by Brown and Wallace.

  1. To be empirically adequate, the pilot wave model must give an account not just of the pointer settings of measuring instruments, but also our perceptions of those settings.
  2. Mental states depend on all the functional features of the brain, and not just the locations of particle states alone.
  3. In the pilot wave interpretation, this requires that the basis of mental states is influenced by the cosmic wavefunction, which leads to the radical non-locality of the brain state.
  4. In the absence of a pervasive and stable decoherence linking brain states and sensible objects, the brain states do not fix particle positions. Two pairs of brain-object relational states can be functionally indistinguishable, even if they involve radically different particle positions and trajectories. The brain cannot be reliable in both tracking functional states and particle positions.
  5. Non-local quantum effects threaten to destroy any reliable correlation between brain states and particle positions.
  6. The Pilot wave theory raises various technical problems related to the widespread application of decoherence (Koons cites two papers on this point). Thus brain states cannot be perfectly correlated with particle positions in the Pilot wave approach.
  7. Evolution is insufficient to resolve this problem, as it is indifferent as to whether we can track particle positions.

I will leave readers to consult Koons' book or the cited papers to flesh out these points. But the summary is that the radically non-local character of Bohmian mechanics casts doubt on our ability to reliably perceive the objects around us. Our brain states are also influenced by distant configurations of the pilot wave, which in turn are influenced by distant configurations of matter. As such our perception of a local object will be influenced in subtle ways by events which are happening far away, and so cannot be completely trusted. Since all science, including quantum physics, assumes that our senses are generally reliable, and the pilot wave interpretation assumes quantum physics, it ultimately becomes self-defeating.

One concern with this argument is that Bell's theorem suggests that we cannot evade some non-local influence in physics, leading to correlations in measurement results between two entangled particles. Does this suggest that whatever our philosophy of quantum physics we will need some non-locality, meaning that the objection would cover not just the pilot wave interpretation but every interpretation? I'm not so sure. In the pilot wave interpretation, the non-local dynamics affects every particle. In other interpretations, the non-local effect merely affects correlations in measurement outcomes between two entangled particles. If the particles in the brain are not entangled with any particles outside the brain, then we need not posit non-local influences on the brain. This objection then would not apply to those interpretations, but only to the pilot wave interpretation and perhaps a few others which also require some form of universal non-local dynamics.
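
To make the kind of non-local correlation at issue concrete, here is a short sketch of my own (not from Koons' book) computing the standard CHSH combination of correlations for a pair of spins in the singlet state. Any local hidden-variable account is bounded by 2; the quantum prediction reaches 2√2:

```python
import numpy as np

# Pauli matrices; spin observable measured along angle theta in the x-z plane.
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def spin(theta):
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>) / sqrt(2), in the basis |00>, |01>, |10>, |11>.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| spin(a) (x) spin(b) |psi> between the two measurements."""
    return float(np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi))

# Standard CHSH angles: any local hidden-variable model satisfies |S| <= 2.
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))   # ~2.828, exceeding the local bound of 2
```

The correlations themselves are what every interpretation has to account for; the disagreement is over whether doing so requires a universal non-local dynamics (as in the pilot wave theory) or something weaker.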

The objective collapse interpretation

Professor Koons offers several ways in which he proposes that the hylomorphic interpretation improves over the objective collapse interpretation.

  1. The objective collapse interpretation requires some unknown and speculative mechanism to trigger the collapses.
  2. The hylomorphist can account for the stability of various beings such as molecules and cells. The objective collapse model needs to be combined with a further account of ontology. He argues that the flash ontology has flashes too rare to account for the continued existence of molecules. The matter density interpretation has problems in verifying the reliability of our sense perception.
  3. Objective collapse models struggle to explain the reliability of memory, and particularly that we have memories that seem to span a continuous range of times rather than just the moments when there was a collapse event. The rate of collapse events is a free parameter in any version of the spontaneous collapse interpretation, which needs to be fixed. There are known phenomenological issues in making it either too low or too high, but the advocates of this interpretation hope that there is somewhere in the middle which escapes all the problems. Koons (following Pruss) asks whether we can determine this rate without begging the question. If the rate of collapse events is too low, then there is no reason to think that our apparent memory of a continuum of past times is reliable. We would only have data on whatever we are observing at the moment there is an objective collapse event. Between these times, we would just rely on a reconstruction or extrapolation, which is bound to introduce false memories. Thus our memory would be unreliable. This picture can be avoided if the rate of collapses is sufficiently high: then our perception of nature would be close enough to a continuum that we wouldn't notice the difference. But arguments against the rate of collapse events being too low assume the reliability of our memory, and thus such arguments would be circular. (A rough numerical sketch of the scales involved follows this list.)
  4. Objective collapse theories also fail to explain perception. Our eyes can detect a very small number of photons -- too small to trigger an objective collapse event.
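
To give a feel for the numbers behind point 3 above, here is a rough sketch of my own, using the per-particle collapse rate usually quoted for the GRW model (the particle counts and the rate are illustrative values, not taken from Koons' book):

```python
# Rough orders of magnitude for spontaneous-collapse ("hit") rates in a GRW-type
# model. The per-particle rate of ~1e-16 per second is the value usually quoted
# for GRW; the point is the enormous gap between an isolated particle and a
# macroscopic body, which is what makes the parameter hard to fix without
# presupposing the reliability of memory.
LAMBDA = 1e-16   # expected collapses per particle per second (illustrative GRW value)

systems = [
    ("single electron",                1),
    ("large molecule (~1e6 nucleons)", 1e6),
    ("dust grain (~1e15 nucleons)",    1e15),
    ("human brain (~1e27 nucleons)",   1e27),
]

for label, n in systems:
    rate = LAMBDA * n            # expected collapses per second for the whole system
    wait = 1.0 / rate            # mean time between collapses, in seconds
    print(f"{label:32s} ~{rate:.1e} hits/s, mean wait ~{wait:.1e} s")
```

A single particle essentially never collapses on its own (a mean wait of order 10^16 seconds), while anything macroscopic collapses almost continuously; the worry Koons raises is about what fixes the rate in between without assuming the very reliability of memory that the choice is supposed to secure.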

My own comments and thoughts

I think that Professor Koons makes some interesting points in this chapter. What particularly caught my eye was his statement that it is necessary for the interpretation of quantum physics to give grounds to (rather than undermine) the assumption that our own senses are reliable, and the way he applied this to critique the various interpretations. For example, the objection he raised to the pilot wave model differs from my own concerns about it.

I am not convinced that his description of the measurement problem fully captured it. He focuses on the relationship between the macroscopic scale, where we appear to get definite measurement results, and the microscopic quantum dynamics, where states can be in superposition and are indeterminate. This is certainly part of the problem, although one which I would say is largely answered by decoherence. However, there are other aspects of the measurement problem, such as the apparent need for two different mechanisms (Schroedinger evolution of the wavefunction and collapse), where the Born rule comes from, and in particular the question of why the system ends up in that particular state rather than another. I think that all of these issues surround measurement in Bohr's original Copenhagen interpretation (which did not incorporate decoherence, as that was not understood until later). For psi-ontic models (apart from the pilot wave) there is also the problem of the relationship between the wavefunction (which is a description of an individual particle, i.e. part of the real world) and the probability (which is a prediction concerning frequency distributions for an ensemble of experimental results in the limit of an infinite number of runs of the experiment, i.e. something which only exists in the abstract). Different interpretations of quantum physics address some or all of these problems, although some lead to others in their place.

So I think Professor Koons' discussion was reasonable as far as it went, and raised some interesting thoughts, but was incomplete. This is significant because he needs to show that his own preferred travelling forms interpretation correctly addresses the measurement problem. He is not going to do that unless he has a complete expression of the measurement problem. I am also concerned that he made no mention of the second major problem of quantum physics, namely the "spooky action at a distance" that arises in entanglement scenarios, and in particular how to reconcile this with special relativity.

The many worlds and travelling forms interpretations

In this chapter, Professor Koons considers the many worlds interpretation, and offers his alternative.

The many worlds interpretation

Professor Koons describes the many worlds interpretation as one in which all the macroscopic results predicted by a superposition are equally real. These are sometimes thought of in terms of branches of reality into different worlds. The early approaches suffered from a problem related to the preferred basis. Precisely what superposition you have depends on the basis, but the choice of basis is arbitrary. There will be one basis where there is no superposition, and only one state is occupied, and another basis where the wavefunction is split across several states. In the mathematical formalism, both of these representations are equally valid. But if we think of many worlds in terms of branching into different realities, then there is a clear difference, as in one case the universe will not branch and in the other case it will branch several times. But these are meant to describe the same physics and the same real-life situation.

The resolution to this came with the discovery of decoherence, where entanglement with a macroscopic system selects a basis by strongly suppressing superpositions in that basis. Decoherence explains why an apparently classical world emerges from a quantum system. What it does not explain is which particular result happens -- that is still determined probabilistically by Born's rule. The Everett interpretation provides an answer to this question -- all the results happen in different "branches" of the universe (albeit that we have to be careful with the language here, as there are different understandings of what exactly it is that branches and whether branching is the right way of thinking about the interpretation, as I discussed in my own discussion of the many worlds interpretation). Meanwhile decoherence seemingly resolves the preferred basis problem of the Everett interpretation. I say seemingly because I am not entirely convinced that it explains what happens between measurements or other decoherence events.
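
Schematically (standard decoherence notation, not something specific to Koons' presentation): when the system becomes entangled with its environment,

$$ \Big(\sum_i c_i |s_i\rangle\Big)|E_0\rangle \;\longrightarrow\; \sum_i c_i |s_i\rangle |E_i\rangle, \qquad \rho_S = \sum_{i,j} c_i c_j^* \langle E_j | E_i \rangle\, |s_i\rangle\langle s_j| . $$

The environment states corresponding to macroscopically distinct outcomes rapidly become nearly orthogonal, so the off-diagonal terms of the reduced density matrix are suppressed in the basis picked out by the interaction. That explains the disappearance of interference between branches, but not which diagonal entry is realised.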

The other major problem with the many worlds approach (which I focused on in my own review) concerns the interpretation of probabilities in this model.

Professor Koons attacks the underlying philosophy of functionalism, used in the Many worlds approach to link the world we observe in experiment and the underlying quantum reality.

The most manifest and naive problem with the many worlds approach is bridging the gap between the reality it supposes, where there are many different versions of the world, and our perception that there is just a single reality. Wallace's solution is to say that all features of the manifest image should be reduced to functional roles realised by the quantum wavefunction. This builds on ideas by Mill, Carnap, Russell, and in particular Lewis and Ramsey.

The idea concerns how we can extend a model of the fundamental reality which uses a particular language or set of definitions, when we expand those definitions to also include emergent features. This adds constants, function symbols, and predicates that signify an emergent, non-fundamental level of reality. A theory of the emergent world is then realised in the base model.

Suppose that all truths about physical objects are realised by subjective sense experience. So we start with a base language which describes our sense data and a subjunctive conditional. There will be many models for this language, with a set of worlds W, an actual world, an interpretation function, and so on. We can then enlarge this language by adding terms referring to physical objects in space time. We can then express conditional relationships between the sense experiences and physical objects. In general there will be many theories in the expanded language consistent with the phenomenological model. Too many theories to cope with, so we add various constraints to reduce the number of them.

In Wallace's presentation of this philosophy, the fundamental model is built on the language of pure quantum physics. This requires a model that contains a domain of worlds (representing the space of possible wavefunctions, not the "many worlds" of the interpretation), each with a single wavefunction evolving through time, and a system to evaluate subjunctive conditionals. This system describes relations of comparative similarity between the worlds -- a supplement to pure quantum theory.

In Wallace's proposal, everything that fulfils a functional role in the emergent theory must be entities and sets of entities found in the correct model expressed in the language of quantum physics. There is no room for constraints on the model based on causal connections between emergent and quantum mechanical entities, purely semantic conditions, or metaphysical priority. Everything in the philosophy must ultimately rest on and be derived from quantum physics; you can't have additional assumptions seeping in from the emergent theory, because the emergent theory is constructed from the foundational theory rather than standing on its own.

So how should we extend the language of quantum mechanics to cover the emergent domain of our experience? The proposals from Mill, Russell, and Lewis don't work in this context. We have just started with the language of pure quantum theory, which tells us nothing of the world we experience. The only constraints we have available to reduce the number of possible theories of the emergent world are those of internal consistency. We must thus consider every possible language and every possible theory to describe the emergent world.

So what are the problems with this functionalist account?

  1. The account leads to a radical indeterminacy of content that would afflict all scientific theories. In the view of functionalism, all the entities and properties of an emergent theory are just useful fictions. All that matters is whether models in that theory agree with experiment, at least approximately (within the relevant precision). An emergent theory contains an interpretation function that extends the interpretation, I, of the fundamental theory to the language that describes the emergent theory.

    The problem is that this map is underdetermined. There are many extensions to I which will also be consistent with the truth of the emergent theory. So we can't say what realises a predicate in the emergent (macroscopic) theory. The traditional approaches to functionalism understood the foundational theory to be our phenomenological experience and the emergent theory whatever lies behind it. Without constraints, the emergent (microscopic) theory is underdetermined, so we add various constraints (taken from real-world data or our common intuitions) to constrain it. The constraints have to be based on the level described by the fundamental theory. The many worlds approach, on the other hand, reverses this order. It is the representation which corresponds to what we observe which is underdetermined. And we cannot add constraints to reduce this vagueness, because we would be basing those constraints on the emergent theory. These constraints are just as important as the data we take from quantum physics in defining the interpretation. But to include them would violate the assumption that everything we need to understand the world is ultimately determined by the fundamental theory. In Wallace's approach, there are no phenomenological qualia on either the foundational or emergent side of the mapping which sufficiently constrain the data.

    For example, there is no objective fact of the matter about whether the quantum probability of a given branch (such as the one we are experiencing) is high or low. It becomes a meaningless question. If a given emergent world is in a low probability segment of the wavefunction, there would be an alternative interpretation function (which might involve a different permutation of quantum objects) that assigns it a high probability. Emergent realities in which the statistics disconfirm quantum physics would be as real as our own and possess the same status in regard to the underlying quantum probabilities.

  2. The account implies that any consistent story of the world (no matter how fantastic) would count as equally real. Let TE be a target theory of the emergent world. It has a realisation in the model of quantum physics MQM. Let II be the interpretation function that links TE to MQM, mapping between the appropriate domain of the quantum world and the language appropriate to the theory and model. Now consider a theory TB, which uses the same interpretation function but a different model of the quantum world MC. However, there will also be another interpretation function IB which links the true model of quantum physics MQM with TB. Thus TB can be said to emerge from the quantum world just as TE does. This would be true for any theory of the macroscopic world. So we can say whatever we want about the emergent world of macroscopic objects, and, under the criteria used by Wallace, we would be bound to be telling the truth as long as the theory satisfies the minimal criteria of internal consistency. Thus all such worlds are equally real.
  3. If this is true, then any theory of the macroscopic world is true by definition. Consequently, it is impossible for any scientific theory to be wrong. And if no theory can be falsified, neither can any theory be confirmed. For example, all the evidence we have for, say, special relativity could be misleading (in terms of metaphysics rather than epistemology). Our experiments and observations depend on the language and ideas in the emergent theory, which is required to interpret those observations. If the emergent theory is underdetermined, then neither can we conclude anything definite from our observations. If we accept the assumptions behind Wallace's model of functionalism, then the evidence could have been produced by some other mechanism.
  4. And this will also be true for the theory of quantum physics. If we cannot interpret our theories of the emergent world realistically, then no belief in such a theory can be objective. Consequently the many worlds theory undermines our confidence in the correctness of quantum physics. Yet it assumes the correctness of quantum physics; that is the only reason we might adopt an Everettian interpretation. In this way the many worlds theory is self-defeating.

Professor Koons states briefly that he thinks that this argument can also be used to undermine pilot wave interpretations, but without going into details. That might be the case if some or all pilot wave interpretations require some form of functionalism to map the quantum wavefunction and macroscopic world, but I am not sure that all ways of interpreting the pilot wave need this. That might, however, just be my lack of understanding of them.

I am also not fully convinced that it defeats many worlds interpretations in general. If successful, it defeats Wallace's particular interpretation. But there might be other models within the general family of many worlds interpretations which use a different means of relating the landscape of quantum physics with our perception (even if nobody has yet proposed a viable alternative model). Professor Koons says nothing in criticism of these other models. Although that might be the point, since he himself advocates for an interpretation based on many worlds with an alternative to Wallace's functionalism.

Consistent histories

Professor Koons briefly mentions the consistent histories approach to quantum physics during this discussion, but I am not sure that he understands it. He seems to regard it as a corollary to many worlds which particularly emphasises the importance of decoherence and resolving the preferred basis problem. I can sort of see where he is coming from, since consistent histories also outlines a family of possible future trajectories, which in some respects are similar to the branches of the many worlds theory. However, he fails to make the clear distinction: the many worlds theory is a psi-ontic theory which treats all of those branches as equally real, while the consistent histories approach is psi-epistemic, where the histories only exist in the theorist's notebook as he attempts to calculate probabilities for possible events that follow from a set of initial conditions. In consistent histories, only one of the histories is actually real (i.e. describes what actually happens) on any single run of the experiment. Professor Koons doesn't highlight this distinction. As such, I don't think that consistent histories requires the functionalism to map between the wavefunction and the macroscopic world. The wavefunction is not real in consistent histories, but merely an abstraction introduced by the theorist to help him calculate probabilities. Reality in consistent histories consists of the appropriate physical particles, whether isolated fundamental particles or macroscopic composite objects, depending on whether we need the standard model or some effective theory to describe the physics at the scale we are considering. It doesn't take much to make this approach consistent with hylomorphism.

Travelling forms

So how does Professor Koons understand the solution to the problem of relating the macroscopic and quantum worlds? He suggests that we need to construct the semantics and theory simultaneously. This requires a top-down ontological constraint in addition to the bottom-up constraint provided by quantum physics. In other words, he takes Wallace's functionalism, but removes the assumption that everything in the philosophy must ultimately arise from the level of quantum physics. He also allows top down constraints coming from the notion of form, which exists at the macroscopic level we are familiar with. These represent a set of natural kinds of emergent entities with a fixed real essence. We cannot limit the fundamental structure of reality to just the particles of the standard model. We need a description of the macroscopic world that co-exists with quantum physics. This, of course, is precisely what is provided by hylomorphism. The macroscopic objects would then be dependent on but not wholly determined by the quantum realm.

The existence of a composite macro-object on a branch of the overall wavefunction would be causally dependent on there being a composite object on that same branch in the past with the correct causal powers. This fits well with Aristotle's vision where the generation of new composite substances is always the result of the corruption of pre-existent substances. The quantum wavefunction provides a material cause. But for this to be actual, it needs to be united with a formal cause reflecting the essence of a real macroscopic object. This formal cause would select one of the branches of the wavefunction as real. The other branches are relegated to having potential existence only (in the Aristotelian sense).

In this travelling forms interpretation, all branches but one still have the underlying wavefunction describing the same microscopic particles. But although they have, from the microphysical perspective, everything needed for the potential existence of macroscopic objects, no actual composite entities correspond to these branches. They are compositional zombies.

Whether a branch corresponds to actual physical objects depends on two factors. 1) Is it sufficiently well decohered? 2) Has it been occupied by composite objects in the past, with the power to either persist or generate new objects? Now when there is a branching event, only one of the subsequent branches will contain the actual object, with a probability to be determined by Born's rule.

My own comments and thoughts

I thought Professor Koons' criticisms of the Everett interpretation were interesting, although I would like to see them fleshed out in more detail than in his book (which provides more details than I have listed here) to properly appreciate them. The issue is my own: I am not a professional philosopher (only an amateur one), and thus there are gaps in my knowledge of philosophy. The functionalist approaches of the likes of Carnap and Lewis are in one of those gaps. Thus I need things explained in full detail, step by step, without assuming some background knowledge, to properly understand what Wallace is trying to do, and what Professor Koons is refuting in it. As such, I can't really predict what sort of response Wallace would raise against Professor Koons' objections to his functionalism.

As stated, I also think that Professor Koons' treatment of consistent histories is rather too brief, and throws it in with many worlds when they are different things.

Professor Koons' travelling forms interpretation is effectively to combine hylomorphism with the many worlds interpretation. As stated above, I think hylomorphism is the way to go in understanding substance and effective field theories. But hylomorphism alone doesn't describe the dynamics. So it needs to be combined with some model of the dynamics; and we may as well pick one of the established interpretations to work as a basis. Professor Koons has picked the Everett interpretation.

His (and Pruss's) travelling forms approach keeps the ontic nature of the wavefunction, and admits to the branching, but adds in the element of the substantial form to prioritise one particular branch. This, I think, solves the main problems with the Everett interpretation: the difficulty in interpreting the meaning of probabilities in quantum physics (which I raised in my own post), and the difficulty in relating the wavefunction to the macroscopic world (which is Professor Koons' main concern).

However, although he might have solved some problems, I fear that he has reintroduced others. Firstly, there is the measurement problem. I see this as the problem that there are two competing mechanisms, entirely unrelated and very different in character, that act on an ontic wavefunction. The problem is in reconciling these. Which one takes precedence at any given time? How can we say that the wavefunction evolves according to the Schroedinger equation, when it also every now and then suddenly jumps to a defined state? Decoherence by itself still has the wavefunction evolving according to the Schroedinger equation. It selects a basis, but does not explain the indeterministic jumps into a particular state in that basis. If the two mechanisms are separate, then why should the jumps occur when the system decoheres? If they are related, then why are they of such different character, and why can't they be combined into a single equation or framework? In the Copenhagen interpretation, these two steps are a) the deterministic and local Schroedinger evolution and b) the indeterminate and non-localised wavefunction collapse. In the travelling forms proposal, you have 1) the deterministic and local Schroedinger evolution and 2) the indeterminate and non-local selection of which branch becomes real.
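
Stated baldly, the two mechanisms in question have the standard textbook form

$$ i\hbar \frac{\partial}{\partial t}|\psi\rangle = \hat H |\psi\rangle \qquad \text{versus} \qquad |\psi\rangle \;\to\; \frac{\hat P_k |\psi\rangle}{\lVert \hat P_k |\psi\rangle \rVert} \quad \text{with probability } \lVert \hat P_k |\psi\rangle \rVert^2, $$

the first deterministic, continuous and linear, the second stochastic, discontinuous and non-linear. In the travelling forms picture the second step is recast as the form selecting one branch as actual, but the structural mismatch between the two remains.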

Other interpretations do solve this problem, mainly by denying that one of the two steps happens in the real world. For example, the pilot wave and Everett interpretations resolve it by denying that there is a wavefunction collapse (in the pilot wave interpretation by supposing that what we observe is a particle which, although guided by the wavefunction, always has a fixed position; the Everett interpretation supposes that every branch remains real). Or the psi-epistemic interpretations deny that the Schroedinger evolution takes place in the real world (but only describes our uncertain knowledge of the location of the particle), while the particles themselves, the only things in reality, undergo indeterministic but local motion and always have a definite state. But the travelling forms interpretation cannot use the Everett evasion, because it selects one of the branches as real while the others remain potential. Nor can it use the psi-epistemic evasion, as it treats the wavefunction as real. Nor can it use the pilot wave evasion, because it isn't a pilot wave theory.

Then there is the problem of non-local correlations in measurements in entanglement experiments. The Everett interpretation attempts to evade this by denying that there is a single measurement result. The pilot wave interpretation allows non-locality in the equations of motion (which, like many physicists, I view as problematic, but some people are willing to accept). The standard psi-epistemic interpretations evade Bell's theorem by pointing to various technical steps in Bell's proof of his inequalities which rely on assumptions their interpretation rejects. These arguments of the psi-epistemic advocates are unimpeachable as far as they go, except that while they show that their interpretations correctly violate Bell's inequalities, they do not address the philosophical problem of correlated non-local observations. The mathematics shows that the probability of those histories with forbidden pairs of results is zero, but there is no satisfactory explanation of that mathematical result, or mechanism which explains the non-local correlations. They have the mathematics which predicts non-local correlations, but I am not aware of an explanation of that mathematics in terms of an underlying ontology that allows for non-local correlations. But even if we ignore my concerns over the traditional psi-epistemic interpretations, none of these evasions is available to the travelling forms interpretation. It accepts that there is a single measurement result, it does not have a non-local guidance equation, and, since it makes the wavefunction ontic, the technical difficulties in the proof of Bell's inequality highlighted by the consistent histories advocates do not apply.

Can the travelling forms interpretation be rescued? I think to do so, it would have to be moved closer to either the pilot wave or consistent histories (psi-epistemic) approach. I would imagine that the latter would be more acceptable to Koons (given his criticisms of the pilot wave interpretation), but he would still need a more satisfactory way of explaining the non-local correlations.

Biology and human sciences

In this chapter, Professor Koons considers the implications of the Aristotelian model for our understanding of biology. This chapter is of less interest to me, so I will just skim over it in this review. Essentially, he argues that the form of the organism (the soul) provides another top-down constraint on the underlying matter, in this case the various molecules that make up the cells. So the molecules exist virtually within the substance of the organism, and are subsumed into it. This different form implies that the various parts of the organism can exhibit different powers, this time related to the organism as a whole. This conception of the soul is different from that expressed in post-Cartesian philosophy (and criticised by atheists), and does not introduce any new force. Instead, it provides a metaphysical ground and explanation for the standard scientific explanations of the organism (and a better explanation than that provided by microphysical mechanism).

Teleology is grounded in the various powers of substances, determined by their form, and exists outside human thought and intention. Biological teleology depends on an appropriate metaphysics that recognises causal powers at the level of biological organs and organisms. Evolution presupposes reproduction, which presupposes powers directed towards a particular end. The form and powers also help us to identify the normal state of the organism, since a substance is supposed to produce a particular effect in given circumstances if its nature contains the appropriate power.

Hylomorphism also provides a better explanation of epistemology, because the secondary qualities (such as colour) we seem to perceive really exist in the (macroscopic) substances, while a microphysicalist believes that the fundamental substances are microscopic and lack these qualities. This leaves him with the problem of philosophically explaining how they emerge, and how to avoid epistemological doubt.

Aristotle's system does not presuppose the eternal fixity of biological species, so there is no conflict with biological evolution. Some might raise an objection concerning how proper accidents arise from contingent accidents. A proper accident is one which emerges from the specific nature of the substantial form. Otherwise it is contingent. The question is whether some property can be a contingent accident of an ancestral form, but a proper accident of the organism that (eventually) evolved from it. Koons answers this by proposing that during evolution a population comes to realise substantial forms that belong simultaneously to two different species. Species here is determined by the metaphysical category of whether the accident in question is contingent or proper. A genetic species, on the other hand, is determined by whether the genes are sufficiently similar that the two organisms can successfully interbreed. I read Koons as saying that the population would be of one genetic species, but contain two different metaphysical species. Environmental selection pressures would then cause one of the metaphysical species to become dominant over the other, and after that the genetics will continue to drift further apart.

Obviously, you ought to consult Professor Koons' book for more details, but this chapter all looks reasonable to my non-expert (in the field of biology) eyes.

Substances, accidents and parts

In this chapter, Professor Koons develops the hylomorphic model to deal with some empirical facts about remnants and vestiges of substances.

Professor Koons begins this chapter by outlining what the Aristotelian substance is and its importance and attributes. A substance is an entity with a defined nature. It is a fundamental, unified, persisting bearer of causal powers. Substances can be composite, and can undergo intrinsic change.

Accidents are abstract particulars of an entity which are not universal to the substance. Accidents can be either proper (flowing necessarily from a substance's essence), or contingent (i.e. which can vary from one particular instance of the substance to another). The form of a substance is the ground from which accidents are derived, and these are distinct from the properties of the substances.

A finger or heart can be thought of as a quantitative part. These are contained within a substance, but are not substances in themselves. It is something which can be investigated in part in its own right. A part is not fundamental, as it cannot be fully understood except with reference to the wider substance. Quantitative parts of substances are neither accidents nor substances. No substance can have other substances as parts. There are also non-quantitative parts, such as the form or underlying prime matter.

Can accidents persist beyond the substance? For those accidents which relate to the whole substance rather than a quantitative part, the accident must reside in a substance, and as such it is impossible for accidents to persist after their substances cease to exist. Despite concluding this, Koons does not seem certain. He does not seem to take this as an essential part of the Aristotelian system, and is willing to revise his view in the light of new information.

Koons offers two counterexamples to the rule that accidents cannot persist beyond their substance. Firstly, there are accidents in the category of action, and the effects of an action can continue and persist after the agent who originally performed the action ceases to exist. I would concur with this, although this is obviously a distinct case (I am not sure I would classify it under an accident). One cannot transfer this to other types of accidents such as colour or shape. Koons agrees that this is the only example where an accident can naturally persist after the substance. Secondly, Koons states that the rule that an accident cannot persist without its substance can be overridden by divine power. He cites transubstantiation as an example. I can see why a Roman Catholic would have to say this, but the counterexample is dependent on this particular view of the Eucharist being correct, which is disputed. Indeed, as a Protestant, one of the reasons I reject transubstantiation is its violation of the general rule that accidents cannot persist outside their substances. God cannot do the impossible. The real presence of Christ in the Eucharist is clearly taught in scripture and the Church Fathers, and as such ought not to be denied, but transubstantiation is not the only way of expressing this, and other approaches avoid transubstantiation's philosophical problems. My own view is that the precise mechanism by which Christ is present in the Eucharist is not specified, so it is wrong for us to be too dogmatic about any view unless it contradicts scripture or reason. If I had to choose a model, I would tend towards Luther's sacramental union. The bread and wine do not become a different nature entirely, but instead take on a divine nature in addition to their own, while still remaining bread and wine. The analogy is with the incarnation, which does not deny the human nature of Christ, but joins it with the divine nature into a unified person: "not by conversion of the Godhead into flesh, but by taking of the manhood into God; not by confusion of substance, but by unity of person." The incarnation does not undermine the human nature of Christ, but brings the human and divine natures together. I think that transubstantiation goes too far in contradicting philosophical principle, and further than is necessary to satisfy the testimony of scripture and the earliest Church Fathers. Of course, the symbolic Zwinglian view of the host is also to be rejected, as it clearly contradicts the New Testament. The Lutheran and -- although I am less comfortable with this -- Calvinist understandings take a more reasonable middle position.

For Koons, quantum particles are not individual things -- neither substances nor quantitative parts of substances, but merely aspects of potential actions of the substance which emits them. Substances are not composed of particles in the sense that the solar system is composed of the sun and planets. Instead we have a continuum of matter with the potential to act in a quantised way. The metaphor of the particle is appropriate to describe this quantisation of action. For example, we see distant stars via the photons emitted from them. If photons just represent the actions of the stars, then they can persist after the stars die.

Koons goes on to discuss the relationship between substances and their accidents, if accidents can exist after the substance's demise. The relationship only requires that both exist in potentiality. For an accident to exist in actuality, it is necessary for its substance to either be actual or have been actual in the past. Every accident must receive its existence at some point from its substance, and is individuated by its substance. Thus no accident can be transferred from one substance to another. This is analogous to how a soul can persist after its body's death, but they are individuated because each soul was once joined with a particular bit of matter to create a particular individual.

My concern with Koons' discussion here is that he talks about the persistence of accidents, and much of his discussion seems to treat accidents in general. But he only demonstrated this persistence for one class of accidents, namely actions. (And, as stated, I would prefer to use a different word to distinguish actions from properties and individual states of matter, as they are sufficiently different to require their own terminology to avoid confusion.) I think this is more a matter of presentation than an actual error in his thought.

Koons next discusses the relation of quantitative parts to their substance. What prevents a substance from being merely a collection of parts? What relationship exists between the substance and the parts? Can a substance do anything above what is done by the parts separately? The substance has a single substantial form, which grounds its parts, while a mere heap consists of many substances, and the properties of each part do not depend on its being in the collection. A part of a substance, on the other hand, does depend on the wider substance to ground its properties and powers. As such, one cannot consider the powers of the parts apart from the substance, because they ultimately depend on how the parts fit into the wider substance.

So each quantitative part receives its existence and nature from the substance. Its quasi-essence is defined in terms of its teleological function within the whole. None of this prevents the part from continuing to exist beyond the end of the wider substance, just as a skull persists beyond the death of the man. The accidents of the parts are also accidents of the whole, but can survive beyond the demise of the whole.

When a brown cow dies, the carcass is also brown (at least for a while). According to Aristotelianism, nothing persists as numerically the same thing through death. So how can the continuities in accidents such as colour be explained? The accidents can no longer persist in the cow, as the cow is no more, and the cow by itself is not sufficient to explain the corpse, since its natural powers would maintain it as a living cow. The Thomist cannot appeal to an extrinsically grounded law of nature, since substances and the forms of substances ground all dynamic laws. Koons solves this problem by appealing to the notion of maverick parts: a maverick part is a part of an organism which ceases to fulfil its original function. The active powers of these parts combine with the passive powers of the cow to produce a corpse with the correct properties.

Professor Koons next addresses the problem of how to individuate thermal substances. Although he believes that the question can only really be fully answered by empirical enquiry, he does offer a few suggestions. Sharp boundaries or discontinuities in space and time would distinguish one substance from another. Collective powers, developed from the whole substance, are a necessary condition for unity. And the integration of the fundamental or essential powers into the whole is a sufficient condition of unity.

I ought to state my own answer to this question at this point. As usual, I tend to think in terms of quantum physics. At each level of physics, there is an actual or effective Hamiltonian, which describes the different energy levels of the substance, how the substance evolves in time, and how it interacts with other substances. The actual state of the substance will be described by an energy eigenstate of the Hamiltonian operator, or a superposition of those states. Different energy eigenstates of the same Hamiltonian will all correspond to different potential states of the same substance. That much I think is clear. But we should also think about what happens to the Hamiltonian when we deform it slightly, e.g. put the substance under pressure, or place it in a strong magnetic field. Here we can distinguish between continuous change, such as when the atoms of a crystal are pushed together under compression, and discontinuous change, for example when a radioactive nucleus undergoes decay. In the latter case, there is a high potential wall separating the energy levels governing the parent nucleus from the energy levels that represent the two daughter nuclei. The daughter state has a lower energy, which is what allows the decay, but it is separated from the initial parent state by a potential barrier which, in classical physics, the parent particle would not be able to cross. The parent nucleus can jiggle around in its own potential well -- these would be continuous changes -- but in classical physics cannot move outside that well without a large influx of energy. In quantum physics, however, there is the possibility of quantum tunnelling, where there is a small but non-zero probability that the system will suddenly jump to the state of lower potential energy. This represents a discontinuous change in the substance; there is no continuous deformation of the energy levels that will allow you to pass from the parent nucleus to the daughter nuclei. This jump from one energy state to another happens instantaneously (at least in the theory). Thus we can identify the daughter nuclei as substances distinct from the parent nucleus.
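To make the tunnelling point a little more concrete, here is a toy calculation of my own (nothing like this appears in Koons' book, and the numbers are purely illustrative): the standard textbook transmission probability for a particle of energy E incident on a rectangular barrier of height V0 > E and width a. Classically the transmission would be exactly zero; quantum mechanically it is small but non-zero.

```python
import numpy as np

# Transmission probability for a particle of mass m and energy E < V0
# tunnelling through a rectangular barrier of height V0 and width a.
# Standard textbook result: T = 1 / (1 + V0^2 sinh^2(kappa a) / (4 E (V0 - E)))
def transmission(E, V0, a, m=9.109e-31, hbar=1.055e-34):
    kappa = np.sqrt(2.0 * m * (V0 - E)) / hbar   # decay constant inside the barrier
    return 1.0 / (1.0 + (V0**2 * np.sinh(kappa * a)**2) / (4.0 * E * (V0 - E)))

# Purely illustrative numbers: an electron with 1 eV of energy hitting a
# 2 eV barrier of width 1 nm. Classically the transmission would be zero.
eV = 1.602e-19
print(transmission(E=1.0 * eV, V0=2.0 * eV, a=1.0e-9))  # small but non-zero
```

With these illustrative numbers the transmission probability comes out at around 10^-4: tiny, but non-zero, so given enough attempts the discontinuous jump will eventually happen.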

There are various field configurations which are not possible to reach from other field configurations by continuous deformations of the underlying fields. These belong to what are known as different homotopy classes. This is well studied at the level of particle physics (the area I am most familiar with), and there are known examples where the same principle applies in condensed matter physics, although I am not aware of how far the principle has been extended to truly macroscopic substances. There are also questions around phase transitions, and whether they can be used to distinguish substances. I think that the application of topology and phase transitions to the more complex effective Hamiltonians which describe macroscopic substances can be used to distinguish between different substances.
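The simplest example of different homotopy classes is a phase field defined on a closed loop, whose winding number cannot be changed by any continuous deformation. The short Python sketch below (again, my own toy illustration rather than anything from the book) computes that winding number for two example configurations.

```python
import numpy as np

# Winding number of a map from the circle to U(1), i.e. a phase field
# phi(theta) defined on a closed loop. Configurations with different
# winding numbers lie in different homotopy classes: no continuous
# deformation connects them.
def winding_number(phase):
    # Sum the (wrapped) phase differences around the loop and divide by 2*pi.
    diffs = np.angle(np.exp(1j * np.diff(np.append(phase, phase[0]))))
    return int(round(diffs.sum() / (2.0 * np.pi)))

theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
print(winding_number(2.0 * theta))      # field winds twice around the loop: class n = 2
print(winding_number(np.sin(theta)))    # deformable to a constant field: class n = 0
```

Two configurations with different winding numbers cannot be smoothly deformed into one another without the field becoming singular somewhere, which is the sense in which topology draws sharp boundaries between otherwise similar configurations.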

Conclusion

In conclusion, I found Professor Koons' book somewhat mixed. It is very good in its articulation and defence of hylomorphism as a philosophy of quantum physics, and presents a case which I think is difficult to counter. I am not enthusiastic about his claim that fundamental particles cannot be substances. I agree that they are not substances when subsumed into a more complex being, but I see no good objection to saying that an isolated photon or electron can count as an Aristotelian substance. We might not be able to observe these beings directly (we can only observe them when they become entangled with a measuring device, whether mechanical or our own eyes), but we can infer that isolated particles (or at least particles as isolated as anything else) do exist between measurements.

With regards to the dynamics, Professor Koons offered an interesting response to several of the standard philosophies, taking a different approach to my own. He did not consider in detail the psi-epistemic interpretations, which I think is an unfortunate omission. I am also not enthusiastic about his preferred travelling forms interpretation. He sets this up as a variation of the many-worlds interpretation, but modified to remove the difficulties of many-worlds. I agree that it removes those difficulties, but I think this comes at a cost, in that it re-introduces some problems which are solved in the many-worlds interpretation.

So this is a good work, thought-provoking in places, and well worth purchasing and reading. It gets a lot right, but I don't think that it gets everything right.

Reader Comments:

1. Michael Brazier
Posted at 21:15:10 Monday August 5 2024



In the "Deriving Hylomorphism" section, the body text has the subtitle style. You may want to correct that.

Koons' claim that quantum particles are never substances in their own right was what I had the most difficulty believing when I read it. I didn't spot how he failed to distinguish between psi-ontic and psi-epistemic interpretations, though I agree that it's a significant fault.

Regarding transubstantiation - I don't think it raises any more ontological difficulties than do cases where everyone agrees that accidents persist through substantial change, for instance an organism's death. Just as the continuity of the organism's matter grounds its accidents when the organism's form departs, the continuity of the matter underlying bread and wine allows their accidents to persist when God replaces their form. Luther's theory of consubstantiation breaks the tiling constraint, for Christ expressly said the bread is His body, and only His human nature has a body. The Eucharist could be God and bread as Christ is God and man, but it can't be man and bread simultaneously.

2. Matthew
Posted at 05:47:30 Saturday September 14 2024



Hello again! I have a couple of comments, one regarding your evaluation of the travelling forms interpretation, and another regarding Koons' critique of pilot-wave theory. (Because of course I do.) :)

***On the travelling forms interpretation***

IIRC, you misunderstand the travelling forms interpretation in your critique here:

"However, although he might have solved some problems, I fear that he has reintroduced others. Firstly, there is the measurement problem. I see this as the problem that there are two competing, entirely unrelated and very different in character, mechanisms that act on an ontic wavefunction. ... In the travelling forms proposal, you have 1) the deterministic and local Schroedinger evolution and 2) the indeterminate and non-local selection of which branch becomes real."

The quantum state in the travelling forms interpretation, just as in the many-worlds interpretation and in pilot-wave theory, never undergoes collapse. It always evolves according to the Schrodinger equation. The "selection of which branch becomes real" happens entirely on the side of the forms and which determinate properties they take on when the branching in the quantum state occurs. In this way the travelling forms interpretation is actually very similar to a pilot-wave theory (and in fact is not mutually exclusive with pilot-wave theory; e.g., the forms could among other things specify the positions of particles that make up the substances).

***On Koons' argument against pilot-wave theory***

So, your summary of the argument is "the radically non-local character of Bohmian mechanics casts doubt on our ability to reliably perceive the objects around us." You also write:

"In the pilot wave interpretation, the non-local dynamics of the particle affects every particle. In other interpretations, the non-local effect merely affects correlations in measurement outcomes between two entangled particles. If particles in the brain is not entangled with any particles outside the brain, then we need not posit non-local influences on the brain. This objection then would not apply to those interpretations, but only the pilot wave interpretation and perhaps a few others which also require some form of universal non-local dynamics."

But this is simply not the case. The configuration of particles in system A can influence the motion of particles in system B arbitrarily far away *if the two systems are entangled*, but if they are not entangled (more precisely, *if the decohered branch of the wavefunction, containing the actual configuration, can be written as a product state*), then it is demonstrably the case that the configuration of system A does not affect the behaviour of system B, nor does B affect A.

And to be clear, the kind of non-local influences that occur in PWT between particles that *are* entangled are such that, for example, a pair of particles in a singlet state are guaranteed to produce opposite results of spin along the same axis when measured by distant S-G devices - in other words, they produce the motions of the particles that will conform to the predictions of standard quantum mechanics. There is no reason to expect any greater non-local influence on our perceptions from entanglement with distant particles in PWT than there is in orthodox QM, and in fact, every reason to believe that such influence would be no more pervasive than orthodox QM predicts (which is to say, not pervasive at all). If entanglement with distant particles threatens the reliability of our perceptions and completely undermines PWT's epistemic foundations, then all of quantum theory is in the same boat.

You write that Koons adapted his argument from a paper by Brown and Wallace. Most of the points 1-7 from Koons' argument do not appear in the paper you cite, so I'm not certain what other support Koons has for them. But the pilot-wave theorist can easily deny his point 2, or the implication from 2 to 3. 4 and 5 seem poorly supported, and the paper you link under point 6 doesn't support the conclusion either. So I'm really not sure where this argument is coming from.

I can, however, comment on Brown and Wallace's argument, which is something of a variation on the "many-worlds in denial" argument against pilot-wave theory. The basic argument is that the wavefunction itself is sufficient to solve the measurement problem (via many-worlds theory) and the particle positions of pilot-wave theory are superfluous. But this is question-begging: it assumes that many-worlds theory is successful in accounting for the macroscopic world we observe - something which Koons denies, in his argument against Everettian functionalism! Moreover, it tendentiously ignores the roles that PWT holds the wavefunction and the particles to perform.

You mention that Koons thinks his argument against Everettian functionalism can also be wielded against PWT, but I think it is more accurate to say it can be wielded against any kind of microphysicalism that follows functionalist lines for explaining the existence of macroscopic objects. I simply don't believe PWT is beholden to microphysicalism - just because it ascribes definite positions to particles, doesn't mean it has to say that those particle positions are a complete description of reality. (E.g., like I said earlier - PWT and the travelling forms interpretation are not mutually exclusive.)


