
The Second Law of Thermodynamics Does Not Prohibit a Decrease of Entropy in a Closed System

5 August, 2006
creationist thermodynamics – not completely wrong

A beneficial mutation is a decrease of entropy in a closed system.

Which is apparently prohibited by the Second Law.

But, very, very tiny decreases in entropy are permitted.

Which is exactly how evolution by natural selection happens.

The Big Misconception

One of the biggest misconceptions about the Second Law of Thermodynamics is that it prohibits a decrease of entropy in a closed system.

It doesn't. What it does prohibit, and here I am being loose with the meaning of the word "prohibit", is a large decrease of entropy in a closed system. (What do I mean by "large"? I'll get to that soon.)

It is this myth of absolute prohibition that underlies the persistence of the creationist second-law-of-thermodynamics argument against the theory of evolution. Until the defenders of evolution recognise that evolution does require the occurrence of entropy decreases within closed systems, and that the Second Law does not prohibit these decreases, this argument will not fade into the obscurity that it deserves.

So What Does the Second Law of Thermodynamics Really Say?

The big achievement of statistical mechanics is the unification of two different notions of entropy. The first notion is entropy as the integral of \(dQ/T\) for reversible processes, where \(Q\) represents heat energy and \(T\) is temperature. The second notion of entropy is the negative logarithm of a probability, i.e. \(-\log P\), where \(P\) is the probability.

To properly reconcile these two notions of entropy, you have to introduce a constant, which is called Boltzmann's constant, named after Ludwig Boltzmann, who discovered the relationship between these two types of entropy. The two equations of entropy are:

\(dS \geq \frac{dQ}{T}\) (with equality holding only for reversible processes)
and
\(S = -k \log P\) (where \(k\) is Boltzmann's constant)

The value of \(k\) is \(1.3806505 \times 10^{-23}\) joules per kelvin. One way to understand this constant is to realise that the only reason it isn't equal to 1 is that we chose the wrong unit for temperature. If we deem \(k\) to be equal to 1, then the unit of temperature becomes the unit of energy per bit (give or take a factor of \(\log_e 2\), because "bits" represent a logarithm base 2, and the standard definition of entropy uses a logarithm base of \(e\)).
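To make the unit bookkeeping concrete, here is a minimal Python sketch of that conversion. The function names are my own, and the room-temperature figure of 300 K is an illustrative assumption; the value of \(k\) is the one quoted above.

```python
import math

k = 1.3806505e-23   # Boltzmann's constant, in joules per kelvin

def entropy_joules_per_kelvin_to_bits(S):
    """Convert a thermodynamic entropy (J/K) into bits, using S = k * ln(2) * bits."""
    return S / (k * math.log(2))

def energy_per_bit(T):
    """Energy corresponding to one bit of entropy at temperature T (kelvin): k * T * ln(2)."""
    return k * T * math.log(2)

print(entropy_joules_per_kelvin_to_bits(1.0))  # 1 J/K is about 1.0e23 bits
print(energy_per_bit(300.0))                   # at ~300 K, about 2.9e-21 joules per bit
```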

The Second Law Before Boltzmann

The Second Law was known before Boltzmann, when it was stated entirely in terms of the thermodynamic definition of entropy. The probabilistic interpretation of entropy was unknown, and the entropy changes involved were always so large that the corresponding probabilities were so small as to be indistinguishable from zero. Thus the prohibition.

But now we do know the probabilistic interpretation, and we can state that a decrease in entropy (in a closed system) is not prohibited, but that the maximum probability of it occurring is the probability corresponding to the required entropy decrease.

So How Large is "Large"?

What counts as a "large" entropy decrease corresponds to whatever you wish to consider an impossibly "small" probability. If we measure entropy in "bits" (i.e. using logarithms base 2), then 1 bit corresponds to a probability of 50% (which isn't at all "small"), 10 bits corresponds to roughly 0.1% ("smallish") and 20 bits to about one in a million ("quite small", but still not "impossibly small").
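As a quick check on those figures, here is a tiny, purely illustrative Python sketch converting an entropy decrease of \(n\) bits into the corresponding probability \(2^{-n}\):

```python
def probability_of_entropy_decrease(bits):
    """Maximum probability of a spontaneous entropy decrease of the given size, in bits."""
    return 2.0 ** (-bits)

for bits in (1, 10, 20):
    print(bits, probability_of_entropy_decrease(bits))
# 1  bit  -> 0.5        (50%)
# 10 bits -> ~0.00098   (roughly 0.1%)
# 20 bits -> ~9.5e-07   (about one in a million)
```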

On a personal level, there are at most 4,000,000,000 seconds in your life, and if you observed a particular type of event once per second, you would probably never observe an outcome of that type of event which had a probability of 1/1,000,000,000,000, which corresponds to about 40 bits.

If we consider all the people in the world, that adds a factor of less than 8,000,000,000, corresponding to another 33 bits, i.e. 73 bits in total. If we consider events occurring \(10^{23}\) times a second (the fastest timescale normally observed in particle physics), over \(10^{10}\) years (the age of the universe), in positions spaced an atom width apart (about \(10^{-9}\) metres) over the observable universe (of diameter about \(10^{10}\) light years), then this gives a maximum number of observations of about:

\[10^{23} \times 3 \times 10^7 \times 10^{10} \times (10^9 \times 3 \times 10^8 \times 3 \times 10^7 \times 10^{10})^3 = 3^7 \times 10^{142} \approx 2 \times 10^{145}\]

This corresponds to a little less than 500 bits. In other words, in the entire history of the observable universe, it is very unlikely that a spontaneous decrease of entropy of 500 bits has ever occurred within a closed system.
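If you want to check the arithmetic, the same rough estimate can be reproduced in a few lines of Python. The figures below are the deliberately crude order-of-magnitude numbers used above, not precise physical constants, and the variable names are my own.

```python
import math

events_per_second    = 1e23        # fastest timescale normally seen in particle physics
seconds_per_year     = 3e7
age_of_universe_yr   = 1e10
positions_per_metre  = 1e9         # one position per ~1e-9 metres
metres_per_lightyear = 3e8 * 3e7   # speed of light times seconds per year
universe_diameter_ly = 1e10

positions_across_universe = positions_per_metre * metres_per_lightyear * universe_diameter_ly
observations = (events_per_second * seconds_per_year * age_of_universe_yr
                * positions_across_universe ** 3)

print(observations)             # ~2.2e145
print(math.log2(observations))  # ~483 bits, i.e. a little less than 500
```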

For comparison, consider a macroscopic thermodynamic system. A very simple example is 2 grams of hydrogen gas in a closed cylinder, which occupies about 24 litres at room temperature and pressure. Consider the entropy decrease involved in compressing the gas into half of its original volume. It is easy to calculate this decrease from first principles, because the corresponding probability is the probability that every molecule would be found in a chosen half of the original volume, i.e. \(2^{-N}\), where N is the number of molecules, which in this case is equal to Avogadro's number, \(6.02 \times 10^{23}\). In other words, an entropy decrease of 602,000,000,000,000,000,000,000 bits. Which is, as you may notice, way more than 500 bits. Even if we consider this problem with a volume which is one million times smaller in each direction (i.e. a cube about 1/3000 of a millimetre on each side, which is much smaller than you can see with the naked eye), we are still talking about roughly 602,000 bits. And the corresponding probability of \(2^{-602{,}000}\) is so unimaginably small as to be indistinguishable from "it ain't going to happen", and then some.
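Here is the same first-principles calculation as a short Python sketch, for both the full mole of hydrogen and the hypothetical cube that is one million times smaller in each direction:

```python
avogadro = 6.02e23   # number of molecules in 2 grams (1 mole) of hydrogen gas

# The probability that every molecule is found in one chosen half of the volume
# is 2^-N, so halving the volume is an entropy decrease of N bits.
bits_full_mole = avogadro
print(bits_full_mole)            # ~6.02e23 bits, vastly more than 500

# A cube one million times smaller in each direction holds 10^18 times fewer molecules.
bits_small_cube = avogadro / 1e18
print(bits_small_cube)           # ~602,000 bits

# The corresponding probability 2^-602000 underflows to zero in floating point,
# which is a fair numerical summary of "it ain't going to happen".
print(2.0 ** -bits_small_cube)   # 0.0
```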

It follows that it is quite reasonable to be dogmatic about the impossibility of a macroscopic decrease in entropy in a closed system, even though it is incorrect to state that any decrease in entropy is impossible.

Granville Sewell

Granville Sewell is a Professor of Mathematics at the University of Texas at El Paso, and last year (2005) John Wiley and Sons, Inc. published the second edition of his textbook "The Numerical Solution of Ordinary and Partial Differential Equations". Towards the end of this textbook there are some appendices, and in particular there is an Appendix D, titled "Can ANYTHING Happen in an Open System?". Appendix D is a restatement of the infamous creationist Second-Law-of-Thermodynamics argument against evolution.

Of course an appendix in a John Wiley and Sons, Inc. textbook written by a Professor of Mathematics sounds pretty impressive, and we might wonder what motivated John Wiley & Sons to allow such material into a textbook (it's too advanced for a school textbook, so there's nothing to be gained from trying to please American textbook censors, etc.).

Anyway, the appendix contains lots of equations, including the equation of heat diffusion. On page 2 of the appendix, after the equation numbered D.5, Sewell states:

Hence, in a closed system, entropy can never decrease.

And he reiterates this statement and variations upon it on pages 3 and 4, as he makes his argument that evolution cannot occur, not even in an "open" system.

But if you have got this far through my article, you will already realise what the fallacy is: the assumption that the Second Law prevents any decrease of entropy in a closed system. Sewell's equations, with their differentials and integrals, look impressive, but they all assume the macroscopic approximation: that any possible decrease in entropy being considered is "large" in the sense already described. The differential equation for heat diffusion assumes that random variations are so small relative to the volume being considered that they can be ignored. But evolution by natural selection happens a few molecules at a time, often just one molecule at a time, when one DNA base gets replaced by another in a "lucky" (i.e. fitness-increasing) mutation. It's not surprising that, if you make an argument based on differential equations derived in a manner that ignores random variation, you will "disprove" the possibility of evolution by natural selection, whose basic mechanism is random variation. Unsurprising, but, unfortunately for the advancement of creationist "science", not very interesting.

Whether Appendix D will survive into Edition 3 of the textbook is anybody's guess. Granville Sewell might not be embarrassed by it, but I think John Wiley & Sons should be.

And for a fuller analysis of the thermodynamics of evolution by natural selection, you can read my article Evolution and the Second Law of Thermodynamics.
