The Quantum Thomist

Musings about quantum physics, classical philosophy, and the connection between the two.


Is God a failed Hypothesis? Part 8: Fine Tuning
Last modified on Sun Jan 31 22:59:39 2021


Introduction

This is the eighth post in a series on Victor Stenger's book God: The failed hypothesis. In this post, I intend to review his fifth chapter, The uncongenial universe.

A lot of the contemporary discussion around physics and theism revolves around fine-tuning arguments (also known as the anthropic principle). When constructing a physical theory, the contemporary physicist needs to make a number of decisions. The first is whether he wants to construct a classical or quantum theory. Secondly, he needs to decide how many dimensions the theory will have, and its topology and geometry. Thirdly, there are the symmetries which constrain the theory. Fourthly, he needs to decide what particles will populate his theory. And fifthly, he needs to fix various constants and parameters, for example describing the strength of the interactions between different particles. Having done all that, the theoretical physicist can get to work. He can write down all possible interactions between particles which are both self-consistent and consistent with those principles, and begin calculating what properties his new universe will have and what it will look like. I won't say that this calculation is always easy, but in most cases we know various methods for doing it, at least to a good approximation.

Of course, we are most interested in exploring the actual physical universe, which has its own geometry and dimensions. However, particularly in condensed matter physics, one can construct systems which obey a wide range of different rules. For example, one can create one or two dimensional systems, or propose weird topologies, or populate the experimental system with strange combinations of particles. In particle physics, the experimentalist is stuck with the universe as we have it, so the efforts of the theoretician are most closely directed there. But we can also spread our wings and think just as "easily" about other possible universes, where some of the parameters and particles are a little bit different, or even a lot different.

The symmetries and quantum and classical nature of the universe ought to be derived from more fundamental philosophical principles. The number of dimensions and the enumeration of particles are fairly easy to confirm experimentally (at least if you are willing to neglect anything with a mass greater than the energy of your accelerator). But the parameters cannot be determined by either philosophical or theoretical principles. There doesn't seem to be any reason why they are as they are. They have to be directly measured. And so the natural question to ask is "What if they were a little bit different?"

Now it is important to emphasise that we are here concentrating on dimensionless parameters. These are parameters which are not expressed in terms of some sort of units. For example, you can measure the speed of light in terms of meters per second, or furlongs per the length of time it takes to drop a ball off the top of the tower at Pisa. You will get a different number whichever unit of measurement you use. It thus makes no sense to ask "What if the speed of light were a little bit smaller," because effectively all you will be doing by making the speed of light a bit smaller is to redefine what you mean by the meter. Nothing in the universe cares what your definition of the meter is, and everything would continue exactly as it is now.
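To make the point concrete, here is a minimal sketch in Python (using approximate SI values) showing that redefining the unit of length changes the number you quote for the speed of light, but leaves the fine structure constant exactly where it was:

```python
import math

# Approximate SI values.
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, kg m^2 / s
eps0 = 8.854e-12    # vacuum permittivity, C^2 s^2 / (kg m^3)
e    = 1.602e-19    # elementary charge, C

def alpha(c, hbar, eps0, e):
    """Fine structure constant: a dimensionless combination of the four."""
    return e**2 / (4 * math.pi * eps0 * hbar * c)

# Redefine the unit of length: 1 "new metre" = s old metres.
# Numerical values rescale according to the power of metres in each quantity.
s = 2.0
c_new    = c / s         # metres^1  -> divide by s
hbar_new = hbar / s**2   # metres^2  -> divide by s^2
eps0_new = eps0 * s**3   # metres^-3 -> multiply by s^3
e_new    = e             # no metres involved

print(alpha(c, hbar, eps0, e))                   # ~ 1/137
print(alpha(c_new, hbar_new, eps0_new, e_new))   # the same number
print(c, c_new)                                  # but c's numerical value has changed
```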

But a dimensionless constant remains the same no matter what your choice of units. An example is the fine structure constant -- which describes the strength of the electromagnetic force, and ultimately the size of the electric charge. This constant is hard-wired into the structure of the universe. The experimental physicist can't change it. He is just stuck with it.

Well, nearly. There is one complication which I usually omit, but I need to discuss here because Professor Stenger makes reference to it. The "constants" are not actually constant. They depend on the renormalisation scale. Renormalisation is related to the resolution at which you look at the universe. For example, you can use a grid to split the universe into small cubes, average over everything inside those cubes, and consider interactions between the things in different cubes. The laws of physics, including the fine structure constant, will depend on the size of those cubes. (If you are working strictly in a continuum, then you would still introduce some sort of energy cut-off, which will also imply some sort of scale.) You can write down an equation which describes how the parameters change as you vary the scale. And they will change. So when we say that the fine structure constant is roughly 1/137, we mean the fine structure constant at a low energy renormalisation scale. But the solution to that equation still needs one free dimensionless parameter. So the parameter we are interested in is not strictly the fine structure constant, but the input into the solution of the Callan-Symanzik equation which determines the fine structure constant at each energy scale, and it is this parameter whose value is important to fine-tuning arguments.

So we have thirty or so parameters in the standard models of particle physics and cosmology. We don't know of any reason why they can't take any value between zero and infinity. The question is what happens if we change any of those parameters. Theoretical physicists can and have performed those calculations, and it is found that there are dramatic effects. Increase the mass of the neutron a little bit in relation to the proton, and neutrons would decay too easily, and you would have nothing in the universe except hydrogen. Decrease the mass of the neutron a bit, and you would have no hydrogen. And so on. There are a hundred or so known conditions with similar effects. For many of these conditions, if they were violated, then any form of life would be impossible. For example, life requires that complex information can be stored, passed on, and read by various molecules. Good luck doing that if everything is only made up of hydrogen. Stars rely on the fusion of hydrogen into helium (and then the heavier elements). Good luck with that if there is no hydrogen to burn.

So, although these parameters could be anywhere within an almost limitless range, if the universe is to support life, then there is only a very narrow window in which they could be. So if you were randomly throwing darts at the universe, with wherever you hit determining what the parameters will be, the chances of you hitting that sweet spot are very remote.

There are only three ways in which you can beat long odds:

  1. The most incredible luck.
  2. A large number of samples. For example, if you have a million to one odds, but randomly pick a million samples, then the chances are that at least one of them will hit the jackpot (a quick check of the arithmetic follows this list).
  3. Bias the selection process so you get the result you want.
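The arithmetic behind option 2 is worth seeing once. With million-to-one odds per trial and a million independent trials, the chance of at least one success is about 63 percent -- not certain, but respectable. A quick sketch:

```python
# Chance of at least one success in n independent trials of probability p each.
p = 1e-6            # million-to-one odds per sample
n = 1_000_000       # a million samples
print(1 - (1 - p)**n)   # ~ 0.632, roughly 1 - 1/e
```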

You might respond "Of course the universe can support life -- we are living in it." But the question is why the universe is such that life is supported. If we have two philosophies, one of which gives a good reason why we should expect a life supporting universe, and the other of which relies on the most unlikely chance, and you are given a life supporting universe and have to choose between them, you are naturally going to pick the philosophy that offers the best explanation for that fact.

Option 2 requires a multiverse. Alongside our own universe, there is a near infinite array of other universes, each with different parameters chosen by some random process. The vast majority of these universes won't support life -- they would either have no hydrogen, or nothing but hydrogen, and so on. But one or two universes would support life. So the probability that somewhere within the multiverse there is at least one universe which supports life is quite good. And, of course, we would find ourselves in one of those universes. This might sound like science fiction, but there are speculative theoretical frameworks where such a multiverse might arise. It would, of course, be impossible to confirm experimentally.

Option 3 implies that the laws of physics are determined by some sort of intelligence. This intelligence would have to exist outside the universe (otherwise it too would be bound by the laws and thus unable to explain them), and would have power to determine which laws govern the universe. This naturally leads to either theism or deism. This conclusion for obvious reasons has caught the interest of various theologians, apologists and philosophers of religion.

So as far as most of the scientists and philosophers who discuss fine-tuning are concerned, the debate is between options 2 and 3. Theists and deists will generally produce arguments against the reasonableness of option 2, and argue that option 3 is superior. Atheists will generally produce arguments in favour of the reasonableness of option 2, and say that it should be preferred because things like the problem of evil and so on undermine option 3. There is quite a debate over this, but fortunately I don't have to discuss it here, because Professor Stenger takes another approach.

Professor Stenger advocates for option 1. He does so by arguing that fine tuning is nowhere near as severe as is commonly supposed.

This chapter is divided into two parts. The first part responds to fine tuning arguments. The second part makes the argument against theism that if God's goal in creating the universe was the creation of life, then a surprisingly large amount of the universe is inhospitable, and thus wasted. Since these two arguments are largely disconnected, I'll discuss the first part of the chapter in this post, and the second part in another.

Bad fine tuning arguments

Some philosophers and theologians, when discussing fine tuning, have taken the idea and gone too far with it. One of these I have already mentioned: to discuss "fine-tuning" of dimensional constants such as the speed of light or Planck's constant. Professor Stenger explains why such arguments are bogus, and I completely agree with him. Professor Stenger also criticises in a similar way an argument based on neutrino masses. The argument was that if the neutrino mass was 10^-34 kg rather than 10^-35 kg, then the extra mass would cause the universe to collapse. This seems to be an objection of a similar sort, since the mass is not a dimensionless quantity. However, the neutrino mass is determined by various dimensionless parameters, including an either direct or indirect coupling to the Higgs Boson (depending on quite how you want to explain the neutrino mass), and it is these parameters which are fine tuned. But Professor Stenger is quite correct to comment that the mass is not fine tuned to one part in 10^34. One has to express the degree of fine tuning in terms of the variation of those fundamental dimensionless parameters, which will be considerably smaller.

The second bad example that Professor Stenger discusses concerns the earth and its place in the solar system. There are certainly unusual features about the earth which play an important role in the development of life. It is just the right size, just the right distance from just the right type of star, has just the right amount of tectonic activity, has a large moon, and so on. I have seen arguments that this is another instance of fine tuning (Professor Stenger cites a work by Ward and Brownlee). There are two problems I have always had with such accounts. Firstly, there are a lot of stars in the galaxy, and a lot of galaxies in the universe. Each one could, in principle, have a life-generating planet. The chances of any particular world hitting all the right criteria are certainly small. But the law of large numbers comes into play. Are the chances of a world being suitable to generate life really so small in comparison to the vast number of planets in the universe? Secondly, the second standard atheist objection to fine tuning also seems to be in play. While life as we know it might require a very earth-like world and conditions, why should we assume that no different form of life could evolve in other conditions? There is a reasonable argument to be made that there would be no life if the universe were made of hydrogen alone, or if there were no stars. It is much harder to argue that no other type of life, or even complex multi-cellular life, based around a different biology, would be possible if the gravitational force were too strong or the world too cold for the life we know about to exist and evolve.

So I think Professor Stenger is quite correct to dismiss arguments of this sort. At the very least, such arguments are far weaker than the fine-tuning arguments built around the physical parameters.

Is there fine tuning?

So, as mentioned, Professor Stenger's approach to the problem (for him) of fine tuning is to deny it. He outlined his argument in detail in the book The fallacy of fine-tuning, which was published a few years after God, the Failed Hypothesis. I will here just respond to the points made in this particular chapter; the reader can turn to The fallacy of fine-tuning, and the responses to it, for more details.

He starts by listing five of the key fine tuning conditions. He calls these the most important, although I am not sure why the conditions can be ranked in order of importance, or why he picks these particular five. You would have thought that the most important conditions would be those which provide the tightest bounds; that is, those for which the region they exclude as being unsuitable for life comes as close as possible to the actual observed parameters.

  1. The electromagnetic force is 39 orders of magnitude stronger than the gravitational force. If they were of comparable strength, stars would have quickly collapsed. He doesn't go into details, but I assume that the reason he compares the electromagnetic force (which is not involved so much in fusion or stellar processes in comparison to the strong and weak nuclear forces) is that it controls the various chemical reactions which drive the key interactions that allow living organisms to grow, reproduce and so on.
  2. The vacuum energy of the universe (or dark energy) is 120 orders of magnitude smaller than some calculations suggest, unless there is another factor causing an almost complete, but not totally complete, cancellation.
  3. The electron's mass is less than the difference between the masses of the proton and neutron, allowing neutron decay and the stability of the hydrogen atom.
  4. The neutron is just the right amount heavier than the proton to allow stable heavy elements.
  5. The Hoyle resonance, an excited carbon state which allows the creation of carbon atoms via nuclear fusion.

Professor Stenger then makes a series of responses:

  1. His first response is that some of the constants, such as the fine structure constant, vary according to the renormalisation scale. He then goes on to say that, according to current understanding, the four known forces (gravity, electromagnetism, and the weak and strong interactions) were unified at the big bang as one force. All four forces were of equal strength. However, due to spontaneous symmetry breaking, the forces were split into the basic kinds we have today and their strengths evolved to their current values. Life just had to wait until the forces were separated by the right amount.
  2. He considers four key parameters: the electron and proton masses, and the strengths of the electromagnetic and strong interactions. He performed a study showing that the lifetimes of stars remained long enough when these parameters were varied over ten orders of magnitude.
  3. Investigators into fine tuning vary a single parameter while keeping all the others constant. If you vary them all together, you can remain within a life permitting region.
  4. These investigators also assume that the parameters are independent of each other.
  5. The physicist Anthony Aguirre has independently examined universes in which six parameters are varied, and concluded that there was no fine tuning problem.
  6. Craig Hogan has done similar work.
  7. A group led by Nakamura has looked in particular at heavy element formation in stars, and found that it does not depend strongly on the exact parameters of star formation.
  8. All the basic parameters will be determined by theories unifying gravity with the standard model, such as string theory.
  9. The Hoyle state is not significant in the formation of carbon in stars. Rather, carbon production relies on a different excited state, which is only 20% lower than the required threshold -- not such a close call.
  10. The original calculation for the vacuum energy was incorrect, and it is likely that a correct calculation will give a result of zero. The observed value is non-zero, but very small, so it is plausible that some currently unknown physics could allow us to calculate the correct value in terms of other parameters. There is an upper bound on the vacuum energy needed for life to emerge, but if the correct value is zero plus a small correction, it is not so surprising that that bound is met. Nor is it unreasonable to suppose that there is some unknown physics coming into play -- the calculation depends on the interaction between quantum physics and general relativity, which is poorly understood.
  11. In other universes, there might be life not based around carbon. We only know about one particular set of facts, U1, about the universe. The fine tuning argument proposes that they could be different, U2. In that case, we cannot use U1 to tell us anything about U2. Unfortunately, Professor Stenger doesn't expand on this argument, but it seems to be based on empiricist assumptions: that our knowledge only comes from observation. In any case, we have no idea about what other forms of life might be possible with other physical parameters. Theists have the burden of proof of showing that no other set of parameters could have supported life.
  12. Why does an omnipotent God need to rely on getting the universe just right? He could have created us to live in a hard vacuum if he liked.
  13. Why should God be required to build a system of laws which requires such fine balancing? After all, God is omnipotent -- why rely on such tricks, or make it so hard for life to evolve?

Response

Quite a long collection of objections. I should first of all reference the article by Luke Barnes which has responded to Professor Stenger's arguments in depth. I should note that I personally have not done any research on this area, so I am responding based on my reading of other people's work.

Varying parameters

Professor Stenger is quite correct that the fundamental constants vary according to the renormalisation scale. This is well known -- including by the people who performed the calculations that determined that there was fine tuning. Does Professor Stenger consider it plausible that those people didn't take that into account?

The "constants" depend on the scale according to a differential equation, the Callan-Symanzik equation, but that equation still has a free parameter. This means that there isn't a single solution to the equation, but a family of solutions, parametrised by a single number. This number we have to read in from experiment. So despite the variation under renormalisation, there is still a constant controlling the strength of each of the various interactions.
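As a hedged illustration (this is the one-loop running in pure QED with a single charged fermion, not the full standard model evolution), the renormalisation group equation and its solution take the form

$$ \mu \frac{d\alpha}{d\mu} = \frac{2\alpha^2}{3\pi}, \qquad \alpha(\mu) = \frac{\alpha(\mu_0)}{1 - \frac{2\alpha(\mu_0)}{3\pi}\ln(\mu/\mu_0)}. $$

The equation fixes how the coupling changes with the scale mu, but the boundary value alpha(mu_0) -- roughly 1/137 at low energies -- is the free dimensionless number that has to be read in from experiment.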

Professor Stenger's claim that all four forces were unified is just speculation. There are, of course, ideas which attempt to unify them, though none of them are proven and many are already discredited. Including gravity with the others is particularly controversial. The main justification for saying that the electromagnetic, weak and strong nuclear forces were part of a unified theory which was spontaneously broken into the three forces we know today comes from supersymmetry. It was found that if supersymmetry were true, the solutions of the renormalisation equations for the coupling constants of all three theories line up at a single point. This is certainly interesting, if inconclusive. If it were true, then it would mean that the relative strengths of the three forces were not an accident. It wouldn't remove fine-tuning, but it would reduce the number of parameters we need to consider.

Supersymmetry is a proposed discrete symmetry of an extended standard model linking Fermion fields with Boson fields and vice versa. It was originally proposed as a putative solution to the hierarchy problem -- why isn't the observed Higgs mass pushed up to infinity (or at least to the energy scale where gravity becomes important) by radiative corrections from heavier particles? This remains one of the outstanding problems in particle physics; the standard model "solution" relies on some incredible cancellations through fine tuning, which is seen as unsatisfactory. Bosons and Fermions contribute to the correction to the Higgs mass with opposite signs, so the idea was that if you link each Fermion field with a Boson field, you can get a precise cancellation.

If supersymmetry were exact, then for every Fermion field you would expect to see a Boson field with the same mass and similar properties. So, as well as the electron, we would see the selectron with the same mass. Obviously this would have been observed if it existed, so proponents of supersymmetry have to propose that the symmetry is not exact but broken. This means that the selectron will still exist, but at a higher mass. In order to still solve the hierarchy problem, there is a limit to how high this mass can be. At least for the simplest models, it should be within the range of the LHC at CERN, and it has not been observed. This means that the idea of supersymmetry is in trouble, and with it the idea that there is a single point where the renormalised couplings of the three forces meet.

There are other issues with the idea that the three forces are actually a unified force that was spontaneously broken. The weak force and the strong force differ from each other. Firstly, the weak force is a chiral gauge theory, while the strong force isn't. Fermion fields are described by four components, which represent four different particles: for example left-handed electrons, right-handed electrons, left-handed anti-electrons (or positrons), and right-handed anti-electrons. (Left and right-handedness is related to the particle's spin, but not tied to any particular axis.) The force-carrying fields in the electro-weak theory couple differently to the left and right handed particles. In the strong interaction, the coupling is the same. This is a fundamental difference between the strong nuclear force and the electro-weak sector, and it is difficult to see how it could have arisen if the forces were unified.

Equally, the mass eigenstates of the strong and electroweak contributions to the Hamiltonian operator are different. There are three families of fermions. Among the quarks, we have, for example, the down, strange and bottom quarks. The gauge fields interact with these different types of quarks in the same way -- the only difference is that they have different masses. This means that you can write the Hamiltonian operator describing the time evolution of these quarks either as three separate equations, with the same gauge term and an individual mass, or as one matrix equation containing all the information, where the masses are represented by a diagonal matrix. Expressing it as a matrix means that you can easily consider transformations analogous to rotations which combine the different types of fermion field -- so rather than having a down quark creation operator, a strange quark creation operator and a bottom quark creation operator, you can use a basis where the creation operators contain a bit of all three fields. Of course, the penalty for doing this is that the mass matrix will no longer be diagonal, making it harder to calculate things. So a bit pointless, you might think. Except that is what happens in nature. There are two different sectors in the standard model -- the electro-weak and the strong -- and they act upon the quark fields as expressed in slightly different bases. So the heaviest quark as far as the strong interaction is concerned is actually a linear combination of the three quark fields as far as the electro-weak sector is concerned. If all the forces were originally unified into a single force, then this would be unnatural.
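To sketch what "slightly different bases" means (this is textbook notation, not anything specific to Professor Stenger's discussion): the down-type quark mass term generated by the Higgs sector is a general 3x3 matrix in the basis in which the weak interaction looks simple, and it is diagonalised by separate unitary rotations of the left- and right-handed fields,

$$ \mathcal{L}_{\text{mass}} = -\bar{d}_L \, M_d \, d_R + \text{h.c.}, \qquad M_d = U_L \, \mathrm{diag}(m_d, m_s, m_b) \, U_R^\dagger . $$

The mismatch between the up-type and down-type left-handed rotations is what survives in the W Boson couplings as the CKM mixing matrix, $V_{\text{CKM}} = U_L^{u\,\dagger} U_L^{d}$. Conventions differ between textbooks, but the point is just that the basis which diagonalises the masses is not the basis in which the weak interaction is diagonal.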

This is before we consider spontaneous symmetry breaking. This is a real phenomenon, and it is important. It relates to the ground, or minimum energy, state of a quantum field. Sometimes there is not just one ground state, but a large number of them. These are linked together by a symmetry of the action. However, in real life the field can't occupy all of these states at once. It has to pick one of them. This breaks the symmetry -- one state is preferred over the others. We say that the symmetry is broken by the vacuum state, or spontaneously broken. It is one of the few cases in physics where a symmetry of the action is not mirrored by a symmetry in the physical world. However, there is a tell-tale sign of spontaneous symmetry breaking: it either leads to the emergence of a massless Boson for each symmetry broken (known as a Goldstone Boson), or, if it is a gauge symmetry that's broken, the Bosons that mediate the gauge field become massive. There is spontaneous symmetry breaking of both types in the standard model. Firstly, there is the spontaneous symmetry breaking of the Higgs field, which leads to the W and Z gauge Bosons becoming massive. Secondly, there is an approximate symmetry of the strong interaction known as chiral symmetry (related to left and right-handedness), which would be exact if the quark masses were zero. The up and down quark masses are light enough that we can still see a remnant of the effects of this symmetry. The symmetry is spontaneously broken in such a way that we expect to see three massless (or near-massless, given that the symmetry isn't exact) Bosons. And we do; these are the pions.
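The standard toy example (a single complex field with a U(1) phase symmetry, rather than any particular standard model field) shows how this works. Take the potential

$$ V(\phi) = \lambda \left( |\phi|^2 - \tfrac{v^2}{2} \right)^2 , $$

whose minima form a circle $|\phi| = v/\sqrt{2}$, all related by the phase rotation $\phi \to e^{i\theta}\phi$. Writing $\phi(x) = \tfrac{1}{\sqrt{2}}\,(v + h(x))\,e^{i\pi(x)/v}$ about one chosen minimum, the field $h$ acquires a mass while $\pi$ costs no energy to vary: $\pi$ is the Goldstone Boson. If the broken symmetry is instead a gauge symmetry, that would-be Goldstone mode is absorbed and the gauge Boson becomes massive.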

So for spontaneous symmetry breaking, we first of all need a field with a set of degenerate ground states related by the particular symmetry that is broken, and we expect to see either massless Bosons or massive gauge Bosons. There is (at least to my knowledge) no such field whose ground states are connected by the symmetry of the underlying groups of the gauge fields, and all the massless particles and massive gauge Bosons are either accounted for by the spontaneous symmetry breaking in the standard model, or are gauge fields which (in the absence of spontaneous symmetry breaking) are forced to be massless by the gauge symmetry and don't have the properties we expect for Goldstone Bosons. Perhaps there is some idea proposed to circumvent all this which I am not aware of (I am a standard model person, and not an expert in beyond the standard model physics, so that is quite possible), but if so, it is speculative, unproven, and has to overcome a number of obstacles.

And that is before we come to the question of gravity. Gravity is very different from the other forces. The symmetry it is based on relates to space and time, not the gauge symmetries of the other forces. Obviously string theory claims to incorporate gravity into a quantum theory, but as far as I know it doesn't do so by merging gravity with the other forces. Furthermore, string theory is notoriously bad at actually making useful predictions -- it is consistent with pretty much anything, so can't be used to explain fine-tuning except via a multiverse, and it requires supersymmetry for its mathematical consistency. As I mentioned above, supersymmetry is in trouble, and string theory will fall with it.

So Professor Stenger's first objection is a) irrelevant, since what is true in it is already accounted for by those who perform the fine-tuning calculations; and b) based on speculative and quite possibly incorrect physics. Fine tuning arguments are based on the best known physics. It is fully accepted that maybe some new idea in the future will come along and explain some of the coincidences. If that happens, then the argument will have to be reassessed. But it is equally true that a theory of quantum gravity might make the fine tuning problem worse, or leave it unchanged. So arguing on the basis of unknown physics could go either way. We have to construct arguments based on what we know now, not atheistic wishful thinking about what we might know in the future. Even if some of the parameters are dependent on each other, and determined by a single parameter in the unified theory, that does not remove the fine-tuning. It will only lead to more stringent requirements on that single parameter.

Professor Stenger's study of stars

Professor Stenger's study of stars is not usually regarded as being very good. There is a full discussion by Luke Barnes here, but the main objections are:

  1. It relies on incorrect stellar physics
  2. He picks a fairly loose example of fine-tuning
  3. He fails to consider other fine-tuning conditions
  4. Professor Stenger's choice of measure is biased towards life-giving universes

Professor Stenger's case is based around his next objection, which I will discuss here. He claimed that if you vary just one parameter, while keeping all the others constant, then you indeed find that you can't vary it very much before you run into catastrophic problems. However, if you vary all the parameters at once, you can weave a much longer path through the space of parameters, allowing you (in this case) to have long-living stars.

No advocate for fine tuning denies this. The problem is that there is more than one fine tuning condition. So you can avoid the stellar lifetime problem by varying the parameters as Professor Stenger does. And you can avoid (for example) the proton decay problems by changing all the parameters together in another way. But if you vary the parameters in such a way that the stellar lifetime condition that Professor Stenger considers remains satisfied, you will break the proton decay condition. That's the real issue in fine tuning: if you manoeuvre to continue to satisfy one condition, you will break a whole host of others.

Professor Stenger calls out the advocates of fine-tuning for only varying one parameter at a time. If they did this, then it would be a serious error on their part. But they don't do it. Professor Stenger instead makes the mistake of only considering one fine-tuning condition at a time. It is fully acknowledged that each condition by itself only provides a fairly weak constraint on the space of possible universes. It is when you put them all together that you get the strong constraint.

For example, take a square piece of paper. The axes on the paper represent the possible values of two of the parameters. Draw a line from one corner of the paper to the other. That represents one fine tuning condition. Everything above the line leads to a universe unsuitable for life. Everything below it is OK. And that looks fine -- you still have half the page. Now draw another line down the other diagonal, representing another fine tuning condition. Discard everything above the line. Again, this line by itself only discards half the possible space, but if you put the two together, the available parameter space becomes smaller. Now draw a third line, horizontal, about three fifths of the way from the top of the page. Discard everything below it. By my reckoning, the allowed area is down to 1 percent of the original space. Thus even though each condition by itself doesn't seem to be so stringent, the combined effect leaves us with only a limited area to manoeuvre. Professor Stenger is saying "Look, the first condition isn't so bad, so fine-tuning can't be bad," and "Look, the second condition isn't so bad, so fine tuning can't be bad," but ignores the problem of putting all the conditions together. Those who argue for fine-tuning, on the other hand, don't make the mistake he attributes to them. They know what their conditions mean, and know that if you vary multiple parameters each condition in itself is not so bad.
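A quick Monte Carlo check of the paper-and-pencil example (a sketch; the three "conditions" are just the three lines described above, drawn on the unit square):

```python
import random

random.seed(0)
N = 1_000_000
hits = 0
for _ in range(N):
    x, y = random.random(), random.random()   # a random "universe" on the unit square
    below_first_diagonal  = y < x       # condition 1: keep below the line from (0,0) to (1,1)
    below_second_diagonal = y < 1 - x   # condition 2: keep below the line from (0,1) to (1,0)
    above_horizontal      = y > 0.4     # condition 3: keep above the line three fifths down the page
    if below_first_diagonal and below_second_diagonal and above_horizontal:
        hits += 1

print(hits / N)   # ~ 0.01: each cut alone is mild, but together they leave about 1% of the square
```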

The measure is basically how you define the probability that you get any particular physical parameter. So what is the probability that the fine structure constant is 1/137, plus or minus 1 percent? The question, as posed, is actually meaningless. You can only define a probability in terms of some distribution. For example, you could select a uniform range (over some finite interval), or a Gaussian distribution centred on the physical value, or a host of other possibilities. Which one you select ought to be determined by whatever mechanism generates universes. The problem faced in fine-tuning arguments is that we don't know what that mechanism is. We thus don't know what measure to use. So while we can say that the parameters can only vary over a tiny range of values if the universe is to be able to support life, we can't assign a probability to that range, at least not in an objective way, and not with our current knowledge. This is a genuine unsolved problem for fine-tuning arguments. It is evaded by noting that whatever the true probability distribution happens to be, the probability of a life giving universe is going to be absolutely tiny.

But whatever the correct probability distribution describing the possible values of the physical parameters is, it is not going to be biased towards the actual values in an atheistic model. Professor Stenger's paper makes that mistake: he biases his probability distribution such that the real physical values are most probable, and you can't vary too far from them. Thus it is not surprising that he finds that it is highly probable that there is a life giving universe. But his methodology introduces bias -- and as I wrote earlier, solving fine tuning by introducing bias in the sampling leads one inexorably to the conclusion that God exists and is responsible for it all. Not what Professor Stenger wants to do.
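A toy illustration of why the choice of measure matters so much (this is not Professor Stenger's actual calculation, just a sketch of the general point): ask for the probability that a parameter lands within one percent of its observed value under two different assumed distributions.

```python
import math

alpha_obs = 1 / 137.0                          # the observed value of some dimensionless parameter
lo, hi = 0.99 * alpha_obs, 1.01 * alpha_obs    # "within one percent of the observed value"

# Measure 1: uniform prior over [0, 1] -- no preference for the observed value.
p_uniform = (hi - lo) / 1.0

# Measure 2: Gaussian prior centred on the observed value with a 1% width --
# i.e. a measure that has already assumed the answer.
sigma = 0.01 * alpha_obs
def normal_cdf(x, mu, s):
    return 0.5 * (1 + math.erf((x - mu) / (s * math.sqrt(2))))
p_gaussian = normal_cdf(hi, alpha_obs, sigma) - normal_cdf(lo, alpha_obs, sigma)

print(p_uniform)    # ~ 1.5e-4
print(p_gaussian)   # ~ 0.68 -- the "fine tuning" has been defined away by the choice of measure
```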

Independence of the parameters

The independence or otherwise of the parameters is only relevant when calculating the precise odds of the universe being as it is. Since the calculation of those odds is never done precisely, because of the measure problem, objections of this sort only amount to a minor correction to the chances. If two of the parameters were not independent, then the odds that the universe is fine tuned by chance will shorten, but almost certainly not by enough to avoid the conclusion that the only two reasonable options are design and a multiverse.

Equally, our best theory is that the parameters are independent. We have to present the case based on our best available knowledge -- we can't speculate based on degrees of dependence in theories which we don't know and don't know whether or not they are true. If it later becomes established that the parameters are dependent on each other, then we would have to revise any conclusions drawn from fine tuning. That's how science works: when new data comes in which disagrees with your previous model, you change the model, and any conclusions drawn from it, so it fits both the old and new data. But it would be foolish to do so now when there is no evidence to support the claim.

Aguirre, Hogan and Nakamura

Apart from his own paper, in opposition to the hundreds of papers which suggest fine tuning, Professor Stenger has presented three counter-examples.

Firstly, there is this paper by Anthony Aguirre. It is an interesting paper, related to cosmological fine-tuning. Since I am not a cosmologist, I am not the best person to critique the paper; I note that Luke Barnes, who is a cosmologist, accepts its findings. The paper considers six basic parameters: the photon to baryon ratio, the cosmological constant, the spatial curvature scale, the ratio of non-baryonic dark matter to baryonic matter, the lepton to baryon ratio, and the amplitude of the variations in the density of matter (called Q). An earlier paper on fine tuning by Tegmark and Rees assumed a particular cosmological model known as the hot big bang model, which maps what happened in our universe. They presented a fairly stringent limit on the density perturbations Q. The fine tuning in this case concerns galaxy formation: on one hand, galaxies would not cool quickly enough; on the other, stars would be packed so densely that there wouldn't be stable planetary orbits. However, Aguirre showed that there are other models, based around a cold big bang model (where there is absolute zero temperature at the start of the universe), which can also fit the criteria for life.

He searches for sets of parameters which allow for the formation of a main-sequence star with a moderate fraction of heavy elements such as carbon, nitrogen, oxygen, etc. The star must burn steadily and without significant disturbance (e.g., which would disrupt planetary orbits) for more than an "evolutionary timescale" consistent with how long it took for life to evolve on this planet.

Aguirre's conclusion is that our particular regime of parameter space is not the only one which could support life, at least as far as stellar evolution and his conditions are concerned. However, a few criticisms can be made of the work as a refutation of fine-tuning. Firstly, his allowed region of parameter space, while less restrictive than that of Tegmark and Rees, is still very restrictive. Secondly, he only considers some of the fine tuning criteria. It might be that other criteria rule out the larger variation (although I am not an expert on the topic, so it equally might not be the case). The other criticism of this work is that it assumes a cold big bang model, which we know did not happen in our own universe. The question then would be "Why are we in this hot big bang sector of parameter space, with its very limited room for manoeuvre, rather than the cold big bang, with its larger range of universes which support life?"

Hogan's paper discusses the standard model parameters. He mentions that a few criteria, such as the age of the universe and the stability of DNA, are not fine tuned, but instead follow from roughly the time that we would expect it to take for life to emerge on the one hand, and from the underlying physical laws and symmetries on the other. These are not criteria that advocates for the fine tuning argument generally champion. However, the main focus of the paper is on how grand unification can be used to create links between some of the key parameters, so that they are no longer independent of each other. He concludes that this is true for some of the parameters, but that there is still enough freedom in the particle masses that a multiverse is needed to select a life-giving universe.

As such, I fail to see why Professor Stenger cited this paper as leading to similar conclusions to Aguirre's study. The relevant parts of the paper support the idea that there is fine tuning, and just discuss it in the context of (speculative) grand unified theories.

Nakamura's paper is again a short cosmology paper. It looks at one fine tuning requirement: the requirement for sufficient supernovae in the first generation of stars to scatter heavier elements around the galaxy. The key parameter is the initial mass function of the galactic disk, which describes the proportion of stars in the galaxy that have a particular mass. This function is known for the galaxy as it is today, but the question concerns what it looked like when the galaxies were first formed. That is unknown (or at least was when the paper was written back in 1996). They argue that the number of large stars which could supernova depends on the total mass of all the stars in the galaxy and the mass of the smallest stars. If the smallest and most common first generation stars had been brown dwarfs, as we observe today, then there would have been too few supernovae to scatter the heavy elements around the galaxy without fine tuning of parameters. However, there was instead a proposal that the mass of the smallest first generation stars is given by a combination of the proton mass and the Planck mass, which comes out at about the same mass as our own sun. If you plug this into their formula, then you find a large number of supernovae in that first generation. This is likely to be true regardless of the actual form of the initial mass function. They thus conclude that regardless of the initial mass function, there would be enough heavy elements to form planets.
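I don't have Nakamura's paper in front of me, so take this as an assumption on my part, but the combination of the proton and Planck masses usually meant in this context is the Chandrasekhar-type mass -- the cube of the Planck mass divided by the square of the proton mass -- and it does indeed come out at around a solar mass:

```python
M_planck = 2.18e-8     # kg
m_proton = 1.67e-27    # kg
M_sun    = 1.99e30     # kg

M_char = M_planck**3 / m_proton**2   # Chandrasekhar-type mass scale
print(M_char, M_char / M_sun)        # ~ 3.7e30 kg, i.e. roughly two solar masses
```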

I'm not a cosmologist, so can't comment on the quality of research in Nakamura's paper. I'll assume that it is valid.

We can then compare the paper against what Professor Stenger said about it:

And, theoretical physicists at Kyoto University in Japan have shown that heavy elements needed for life will be present in even the earliest stars independent of what the exact parameters for star formation might have been.

Again, I fail to see how the actual paper supports Professor Stenger's case. Nakamura's paper is not a wide ranging discussion of the parameters for star formation. Instead, it discusses one scenario, concludes that a bit of unknown physics won't change the conclusion that heavy elements can form, and introduces a relatively weak fine-tuning condition on the proton mass and the strength of gravity.

So of these three papers, only Aguirre's actually does what Professor Stenger claims of it. Aguirre's paper is a warning that researchers into fine tuning need to be careful: there might be other regions in parameter space, aside from our own, which might support life. It is discussed in the fine-tuning literature, and regarded as insufficient to change the conclusion that the only options are the multiverse or design.

String theory and determination of parameters

String theory is renowned for not being able to make any predictions about low energy (standard model) physics. There are just too many possible universes consistent with it. Maybe when Professor Stenger was writing, there was a hope that a unified theory would force the parameters to take the particular values that they do, but that hope, if it existed, has diminished. Instead, the flexibility of string theory is usually used to support the multi-verse model. The other candidate unified theories equally show no evidence of determining a single set of the physical parameters.

We should also not be speculating about unknown physics. At the moment, we have to judge on the basis of the information and evidence we have. It is, however, likely that any unified theory will also depend on constants. At some point, we will reach the limit of physical explanations, and have to turn to a philosophical explanation of those theories. But the best a philosophy can do is demand self-consistency. I can see how there might be a philosophical explanation of why the laws of physics have the basic structure they do, or of the symmetries that constrain them. Ultimately these are discrete statements; you either have the symmetry or you don't. There is no middle ground. But these constants are different, because they are real numbers. If the universe is consistent with a certain parameter taking the value x, then why would that consistency break down if the value was instead a tiny bit larger? There is no obvious reason why the rest mass of the electron should be 0.511 MeV rather than 0.512 MeV (or, more accurately, whatever those numbers imply about the electron-Higgs coupling, which is the relevant dimensionless constant). Possibly the constants we ponder over aren't fundamental, and are instead derived from something else, but that something else will still have to be fine tuned. I might be wrong here, and if so any conclusions would have to be changed, but at the moment we cannot hope that a grand unified theory will solve the fine tuning problem by fixing the standard model constants to particular values without introducing other constants which equally need fine tuning.
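For reference, the translation from the electron mass to the dimensionless electron-Higgs coupling uses the standard tree-level relation m = y v / sqrt(2), with the Higgs vacuum expectation value v of about 246 GeV. A quick sketch:

```python
import math

v   = 246.0        # GeV, Higgs vacuum expectation value
m_e = 0.000511     # GeV, electron mass

y_e = math.sqrt(2) * m_e / v
print(y_e)   # ~ 2.9e-6: the dimensionless coupling behind the 0.511 MeV figure
```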

The Hoyle state

The 20% figure (actually it is closer to 16%) is based on the difference between the energy level of the 0+ carbon excitation and the required energy to fuse three helium atoms. However, this is not the number we are actually interested in. What is needed is the allowed variation in the underlying fundamental parameters.

That the 0+ resonance exists is not the question. It follows from the structure of the basic theory. The problem is related to the energy of the resonance and its width. The target energy has to be within the width of the resonance in order for Carbon to be produced. This produces limits on the fine structure constant and the strength of the strong interaction which have been tentatively calculated to be one part in 100000 and 0.4% respectively (see Barnes' paper for the references). There is still some uncertainty in these calculations, but the figures are probably in this sort of range. This will again be a band in parameter space rather than two absolute limits, so you could escape the limits by varying both parameters together, but then you would violate other fine tuning requirements. (Of course, vary the parameters enough, and another Carbon resonance will move into the target window.)

Equally, discussing the difference as a percentage of the observed value of the parameter is misleading. To use these percentages as the measure of the degree of fine tuning is to bias the calculation towards the observed value. We should be comparing it against the whole range of possible values of the parameter, and they would be a much smaller percentage of that phase space.

So Professor Stenger's objection here is poor even by his standards. His 16% variation takes an incorrectly calculated percentage of the incorrect observable. The Hoyle state is still a good example of fine tuning.

Vacuum energy calculation

The 1 in 10^120 fine tuning of the vacuum energy needed to get the correct cosmological constant is obviously a fairly extreme example of fine tuning, if correct. But here I actually agree with Professor Stenger that we shouldn't have confidence in this calculation, though not for the same reason. The original calculation of the vacuum energy has always struck me as being rather ad-hoc, based on various assumptions which need not be correct, and arbitrarily cutting the integration off at the scale at which gravity becomes important. It is based on a particular philosophical view of the quantum vacuum. We can avoid the problem by accepting a different model of the vacuum, with different assumptions. The whole area is too reliant on the interface between gravity and quantum theory for us to take any calculation here too seriously.
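For what it is worth, the usual back-of-the-envelope version of that number (a sketch only: it cuts the vacuum energy off at the Planck scale and compares with an observed dark energy density of roughly 10^-47 GeV^4; the exact exponent depends on the cutoff chosen) goes like this:

```python
planck_mass = 1.22e19                 # GeV -- the scale at which the integral is arbitrarily cut off
rho_vacuum_naive = planck_mass**4     # ~ 2e76 GeV^4, the naive quantum field theory estimate
rho_lambda_observed = 1e-47           # GeV^4, roughly the observed dark energy density

print(rho_vacuum_naive / rho_lambda_observed)   # ~ 1e123 -- the famous ~120 orders of magnitude
```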

Non-carbon based life

Could life be based on something other than carbon and oxygen? Perhaps. But what Professor Stenger needs to avoid fine tuning is for life to be based on just hydrogen to boron -- the elements below carbon in the periodic table. The problem then is that life needs sufficient chemical complexity to store, read, and pass on information. That in turn requires molecules large enough to do so, which in turn needs an atom which is capable of forming three or four covalent bonds. The only atom in the periodic table before carbon capable of doing that is boron. But in practice, boron doesn't form long chain molecules out of just itself, beryllium, lithium and hydrogen. These molecules would have to be sufficiently complex to store information. The burden is, I think, on Professor Stenger to propose a model where boron based life is possible.

His argument is also flawed because of its over-reliance on the empiricist world view, based on the false assumption that we can only know what we directly observe. Just because we can't observe universes with other parameters, doesn't mean that we can't say anything about them.

Why would God need fine tuning?

Since God is omnipotent, why would he need fine tuning? Wouldn't it be more miraculous if he created life in a universe which ought not to be able to support it, given the laws of physics?

The problem with this argument is that it assumes that the laws of physics operate independently of God. If instead we adopt theism, then the laws of physics are a description of God's activity. Now let us suppose that we have the sort of universe that Professor Stenger is advocating for. The bulk of the universe is governed by deistic laws hostile to life, but there is a bubble somewhere where God constantly and miraculously intervenes to ensure that living processes continue. For this to work, and for there to be no interruptions in the life-supporting bubble, God's interventions in that bubble would have to be reliable and regular. In that bubble, those life forms would interpret that regularity as laws of physics, and they will uncover that they were supportive of life. They wouldn't know that those "laws" were actually due to the continual miraculous intervention of God. They wouldn't know that life unfriendly laws were the norm elsewhere in the universe, and if they did find it out, they would create a more general -- and still fine tuned -- rule explaining the universe both inside and outside their bubble. In other words, it is logically inconsistent to say that there could be life in a universe whose laws don't support life. God's omnipotence does not entail that the self-contradictory could happen.

So how big is this life permitting bubble? Is it just a solar system? A galaxy? An observable universe? Why should God create a universe with one set of laws in one place and another set in another, given that God is outside of space and time, and therefore to be expected to relate to each point in space in the same way?

And there might be good reasons why God wanted to create a regular and simple universe. Why not create the universe as we observe it? "The heavens declare the glory of God, and the sky his handiwork." While I am sure that Professor Stenger would have regarded the words of the Psalmist as naive, if God wanted to produce that sense of awe through the heavens in early human society, then it would be difficult to do so if there were no heavens (or at least no hydrogen in the heavens, and hence no visible stars). Now, of course, we know that there is far more to the heavens than just what is visible, and can explain everything by detailed mathematical laws. But there is still the same sense of awe, only directed to the magnificence of those laws. At least there ought to be.

So why would God create a fine-tuned universe? In part, because omnipotence doesn't remove the need for logical consistency. And, with what freedom remains, because that's the universe God, for whatever reason, wanted to create, to bring glory to Himself, and to display His love to His creation.

Conclusion

The argument from fine tuning isn't a robust argument for God's existence, unless you have independent grounds for ruling out a multiverse. I don't want to discuss the pros and cons of the multiverse here. Equally, it is not a robust argument for the existence of a multiverse, unless you have independent grounds for denying the existence of God. Furthermore, the existence of a multiverse (in this sense) wouldn't rule out the existence of God. I think the fine tuning argument is useful as part of a deductive argument which takes the existence of God as a premise and asks what sort of universe we might expect as a consequence, as a check of self-consistency. It limits the possibilities for the sort of universe we could live in if God existed, and thus improves the explanatory power of "the God hypothesis."

What fine tuning arguments do achieve is to embarrass those atheists whose arguments rest on the supposed lack of evidence for theism. I would, of course, deny that there is a lack of evidence for theism. But it puts the atheist in the same position that they claim for the theist. The fine tuning argument shows that atheism can only be true if there is a multiverse. But, aside from fine-tuning and some theoretical speculations, there is no empirical evidence for the multiverse. Thus atheism suffers from the problem of requiring something with no direct evidence for it, while the only indirect evidence supports the existence of God just as well.

Professor Stenger obviously tried to avoid this problem by denying the force of the fine-tuning argument. But none of his objections, at least in God, the failed hypothesis, stand up to scrutiny. The fine tuning argument still looks as strong as ever. So, unless some dramatic new evidence comes in, it is either theism, deism or the multiverse. Your choice.

Next time, I will complete my look at this chapter, and ask whether the general hostility of the universe to life is an argument against God's existence.





Reader Comments:

1. Gerlof
Posted at 11:04:57 Tuesday February 2 2021

Life

Very insightful. Thank you!

How would you define 'life' though?

2. Nigel Cundy
Posted at 18:28:17 Tuesday February 2 2021

Life

Good question, and probably one that is disputed by the experts. I'm not sure how the people who argue about the anthropic principle would define life. But I would personally define a living organism as a complex inhomogeneous organism (i.e. not a crystalline substance such as a salt crystal) which has an intrinsic tendency towards growth, reproduction, and the absorption of nutrients from outside itself.

3. Michael Brazier
Posted at 01:16:39 Wednesday February 3 2021

The Multiverse

The notable thing about the multiverse is that, not only is there no empirical evidence for it, but there could not be such evidence, even in principle. Any object from a universe with different physical constants, if somehow transferred to this universe where we could observe it, would be altered by that very transfer into an object that could have arisen in this universe, or else cease to exist.

In contrast, empirical evidence that God exists and sustains the universe at every moment can easily exist - a miracle is, precisely, such evidence. So for arguments based on "lack of evidence", the believer in the multiverse is actually much worse off than the theist. The theist knows what evidence of God's interest in the cosmos would look like; the multiverse theorist can't even suggest how one universe could influence another.



