**Entropy**

**HISTORY**

The last four centuries have witnessed unprecedented progress. In the 18th century Western civilization completed the mapping of the global oceans under sail, the 19th was the age of steam power and the taming of continents, the 20th saw the widespread production of fossil fuels and electricity, and so far the 21st is experiencing a biological revolution.

Central to these endeavors was the development of machinery that surpassed the limits of human muscle power. In the mid to late 19th century, theories for such devices based on easily measured macroscopic quantities emerged, initially for modeling the operation of a steam engine. In particular, Rudolf Clausius and William Thomson (Lord Kelvin) formulated the three modern Laws of Thermodynamics which required the novel and enigmatic concept of “entropy”.

At about the same time, Ludwig Boltzmann was able to derive a formula for entropy as a consequence of the existence of atoms. His elegantly simple equation was

S = k_{B} log W

in which S is the entropy, W is the number of possible arrangements of all the molecules, and k_{B} is a constant named after Boltzmann by Max Planck, who also originated quantum theory.
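As a concrete illustration, the formula can be evaluated numerically. The following minimal Python sketch (the specific arrangement counts are our own examples, not from the text) uses the natural logarithm, as Boltzmann did:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k_B * log(W) for W equally likely arrangements."""
    return k_B * math.log(W)

# A single possible arrangement means zero entropy (perfect order) ...
print(boltzmann_entropy(1))  # 0.0
# ... while more possible arrangements mean more entropy.
print(boltzmann_entropy(2))
print(boltzmann_entropy(1000))
```

Because the logarithm grows slowly, even astronomical values of W give modest entropies in everyday units, which is why k_B is so tiny.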

Boltzmann was brilliant but unfortunately suffered from what we would now call bipolar disorder. While on a summer vacation with his family, he was tragically driven to take his own life. His equation was later inscribed on his tombstone. In time, the predictive success of his ideas thoroughly discredited earlier beliefs, such as the caloric theory that heat was some type of fluid.

But beyond its engineering utility, entropy has profound philosophical implications. It allows us to distinguish between past and future and places implacable constraints on the origin and evolution of the universe. It dictates no less than where we came from, where we are going, and how we will get there.

**FOUNDATIONAL CONCEPT**


What Boltzmann realized is that the number of possible arrangements of molecules is not only a measure of disorder but, more importantly, of their capacity for useful work. Counterintuitively, the extraction of useful work or the evolution of complexity can only happen when a system goes from an ordered state to a disordered one. In this sense, useful work might include driving a piston in a steam engine, the formation of gemstones, or even the growth of living things.

A typical thought experiment involves the rooms of your house. If your clothes are restricted to hangers in your closet, the number of possible arrangements is much smaller than if some were also strewn across the room. In this latter case, the many new locations might be anyplace on the bed, on a dresser, or perhaps carelessly dropped on the floor. We would then say your clothes have a greater number of available arrangements, or that the room is messier or more disordered.

Likewise, if water vapor molecules in steam are constrained at high temperature and pressure to a small space and then allowed to expand into a bigger volume, the number of possible configurations increases. That is to say, there are more places each molecule could be. This process might be used to drive a piston in a steam engine propelling a railroad train. In general the process is

Increased number of possible configurations -> greater disorder -> potential for useful work

In any event, the mathematical description of these phenomena is the modern field of “statistical mechanics”.

**EMERGENCE OF COMPLEXITY**

Two illustrative examples are crystal growth and the growth of living things: both build local order while increasing the total entropy of their surroundings.

**INFORMATION**

One surprising advance was the abstraction of entropy to quantities not previously thought to be related to thermodynamics.

In the late 1940s, Claude Shannon was an electrical engineer at Bell Labs trying to quantify how much information any given message contained. He wanted to use this to determine the shortest possible codes, that is, the greatest data compression, that could be used to transmit a message without any loss of meaning.

In “pig Latin” for instance, the first letter of each word is moved to the end and appended with the syllable “ay”. A standardized coding in which this last syllable was omitted would shorten the transmission packet without any loss of understanding at the receiver which could add it back in later.

Not surprisingly, messages with more complexity are harder to shorten or compress. Here complexity means more apparent randomness, or fewer noticeable patterns; in a random string, each bit or letter is uncorrelated with its neighbors. Any repeating patterns could be removed at the transmitter and re-inserted at the receiver to increase the compression, with the scheme published as a standardized coding format.
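Shannon made this precise with his measure of information content, H = -Σ p_i log2(p_i) bits per symbol: a patterned message scores low and a varied one scores high. A brief Python sketch (the function name and sample strings are illustrative choices of our own):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in Counter(message).values())

# A repetitive message carries little information per symbol ...
print(entropy_bits_per_symbol("aaaaaaab"))
# ... while eight distinct, equally frequent symbols carry the maximum
# possible for an 8-symbol alphabet: 3.0 bits each.
print(entropy_bits_per_symbol("abcdefgh"))  # 3.0
```

The entropy is a lower bound on the average number of bits per symbol any lossless code can achieve, which is exactly the compression limit Shannon was after.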

From this humble beginning we now have the science of “information theory”. But its terms represented novel quantities; in particular, Shannon did not know what to call his new measure of information content, as he described in a somewhat whimsical manner:

“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it ‘entropy’, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage [2].”

**INFINITE QUANTITIES**

The number of possible configurations of any real system is astronomical. And it is true that, to the extent various states are possible, they will eventually be realized by random chance.

As for configurations or arrangements, if one has two balls one of which is red and the other green, they can be placed next to each other along a line in one of two ways. Either the red or green ball can come first.

If they can be separated and placed on a longer line with perhaps 10 distinct locations for each ball, the number of configurations increases. Then one has 10 choices for the first ball and 9 for the other for a total of 90 distinct arrangements.

In a similar manner, if the allowed space is a square with 10 locations on each side, the number of configurations increases to 100 × 99 = 9,900. And if the space is a cube with 10 locations along each edge, the number of possible configurations is 1000 × 999 = 999,000.
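These counts can be verified by brute-force enumeration. A small Python check (the helper name is our own):

```python
from itertools import permutations

def count_arrangements(n_locations: int, n_balls: int = 2) -> int:
    """Count ordered placements of distinguishable balls on distinct sites."""
    return sum(1 for _ in permutations(range(n_locations), n_balls))

print(count_arrangements(2))     # 2 ways on a line of 2 sites
print(count_arrangements(10))    # 10 * 9 = 90 on a line of 10 sites
print(count_arrangements(100))   # 100 * 99 = 9,900 on a 10 x 10 square
print(count_arrangements(1000))  # 1000 * 999 = 999,000 in the cube
```

The pattern n × (n − 1) for two balls generalizes to n!/(n − k)! for k balls, which is why the counts explode so quickly with system size.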

Consider again the clothes on hangers in a closet or strewn about a room. Since each location is possible, the system will in time cycle through each arrangement. But not all arrangements are created equal: the number of arrangements is a measure of the disorder of the system.

In practice, entropy always increases, even though it is theoretically possible for it to decrease.

Because each configuration is different, there will be local microscopic variations. In practice, however, microscopic variations are quickly smoothed out: macroscopic properties come to equilibrium at rates almost too fast to measure, and their variation is vanishingly small.

Entropy is related to the gross statistical properties of a system and is thus really only defined at equilibrium; it applies before and after a transition but not during one. In part this is because there are an astronomical number of possible configurations.

To eliminate infinities in these calculations, the configurations must be quantized, so that the number of arrangements to be counted is discrete rather than continuous.

**ERGODIC HYPOTHESIS**


The basic idea is that a given number of gas molecules in a chamber can arrange themselves in any number of ways. Any particular molecule can be in the middle, near the walls, or anywhere in between. Each instantaneous arrangement of all the molecules is called a configuration. Unfortunately the microscopic details, that is, the position and velocity of each of the molecules that constitute a configuration, are impossible to determine. But what we can measure is some macroscopic property, perhaps the pressure or the temperature, and we can take measurements repeatedly, each taking a short length of time.

What we would like to know is the macroscopic property, e.g. pressure or temperature, averaged over all the configurations. Unfortunately, while we assume all the configurations are equally likely and that the system will eventually experience each and every one, this would take a very long time; in fact the time is so astronomically huge that it boggles the imagination. But this would be the theoretically correct average value.

So what the Ergodic Hypothesis says is that the measurements we can make, i.e. the repeated macroscopic ones, are good enough. That is to say, measuring the temperature with a thermometer which has had time to equilibrate with the gas in the chamber gives exactly the same value as if we had calculated the average over all possible microscopic configurations.

And this works pretty well. The pressure, for instance, is caused by molecules bouncing off the walls and varies a great deal from microsecond to microsecond. Yet the position of a piston under a weight appears to be an unchanging constant from a macroscopic point of view, for as long as anyone has had the patience to study it. From a strict theoretical viewpoint, however, this equivalence has never been absolutely proven and remains a thorn in the side of physicists.
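The idea can be illustrated with a toy model, assuming (purely for demonstration) molecules at independent, uniformly random positions in a unit-length chamber. The time average of many quick measurements should then match the ensemble average over all configurations, which here is exactly 1/2 by symmetry:

```python
import random

random.seed(0)  # reproducible toy run

def left_half_fraction(n_molecules: int) -> float:
    """One quick 'measurement': the fraction of molecules currently
    found in the left half of the chamber."""
    return sum(random.random() < 0.5 for _ in range(n_molecules)) / n_molecules

# Time average over many repeated macroscopic measurements.
samples = [left_half_fraction(1000) for _ in range(200)]
time_average = sum(samples) / len(samples)

# The ensemble average over all configurations is exactly 0.5;
# the ergodic hypothesis says the two should agree.
print(round(time_average, 3))
```

Each individual sample fluctuates around 0.5, just as the pressure fluctuates microsecond to microsecond, but the time average settles rapidly onto the ensemble value.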

**FLOW OF TIME**

While the equations of physics are thought to be reversible in time, entropy is not.

Things that are more mathematically probable are more likely to happen, and they are more probable, by definition, if there are more ways for them to happen. It is easier to lose the lottery than to win it, for instance, because there are more ways to lose by a factor of literally hundreds of millions. The probability of the total entropy of any closed system reversing is far smaller still. Even for small, simple systems, for instance a cubic millimeter of air, the probability is so small as to be insignificant even over time spans vastly longer than the lifetime of the universe. For anything larger, the probabilities decrease again, exponentially.
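The scale of these odds is easy to compute. In a toy model where each molecule is independently on the left or right side of a chamber, the probability that all n end up on one side at once is (1/2)^n; the molecule count below (roughly 2.7 × 10^16 per cubic millimeter of air at room conditions) is an assumed order-of-magnitude figure:

```python
import math

def log10_prob_all_on_one_side(n_molecules: float) -> float:
    """Base-10 logarithm of (1/2)**n, the chance that n independent
    molecules all sit in the same half of the chamber at once."""
    return -n_molecules * math.log10(2)

# Ten molecules: about one chance in a thousand.
print(log10_prob_all_on_one_side(10))
# A cubic millimeter of air (~2.7e16 molecules, an assumed figure):
# the exponent is around -8e15, a probability beyond imagining.
print(log10_prob_all_on_one_side(2.7e16))
```

Working in logarithms is necessary here because the probability itself underflows any floating-point representation long before realistic molecule counts are reached.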

**INITIAL ORDER**


The universe is currently expanding, so it must have been much smaller in the past. We also see many physical processes that are increasing the available configurations, so that entropy is increasing.

Since the greater the volume available to the molecules in a gas, the greater the entropy, Stephen Hawking thought that perhaps if the universe were eventually to begin to contract, entropy might reverse.

He struggled with this idea but finally had to accept the current formulation of entropy, publishing that his prior reservations were unfounded and that entropy would continue to increase even in a contracting universe [3].

But of course ordered states are not always harnessed for human use. For example, the Sun’s uneven heating of the Earth’s surface gives rise to temperature and pressure differences, creating the winds. While the wind has the potential to drive a windmill or an electric turbine, the energy is not always captured. More commonly the energy is simply dissipated, bringing a uniformity to the environment.

The global climate is driven by temperature differences that permit work to be done. If everything gets hotter but remains uniform, the weather is stagnant; energy which is evenly distributed permits no real evolution of the system. The measure of this disordered energy is the entropy, which grows with the number of accessible states. When applied to the universe as a whole, the end point is a “heat death” in which nothing ever changes.

That is not to say that much fundamental work in theoretical physics has not been done in attempting to describe, from a variety of perspectives, the evolution of the very early universe once the initial seeds came into existence by other means.

The universe must have had a beginning in a highly ordered state. Likewise the universe must have an ending.

**PHILOSOPHICAL CONSEQUENCES**

Working out the mathematical details, however, verifies from an entropic as well as a quantum mechanical perspective that such a progression could not have continued indefinitely, and thus does not remove the necessity of a beginning of everything at some point in the past.

Questions related to the origin of natural law bear on considerations of a clockwork universe by raising the possibility of a hidden and higher order of things than can be described by physics alone.

Since the universe was more ordered in the past and has not yet entirely degraded, it could not have existed into the infinite past. Rather the observation is that we are still able to make use of the flow of order to disorder to power our machines and ourselves. Thus the entire universe must have begun at some finite time in the past in a well-ordered state that has attempted ever since to maximize its entropy. This means the universe must have had a moment of creation in what science can only label a “supernatural” event [1].

And this original event is not describable by science. Rather, the absence of nature and natural law logically requires a “super-natural” agency (i.e., above nature). The idea is that if there is nothing physical, no space, energy, matter, time, or any of the fundamental forces, then no physical cause is available to create anything. The universe must have been created “ex nihilo” in a way we can never physically describe. This is a logical requirement of cause and effect, was recognized by the Greeks in learned treatises nearly a millennium before Christ, is a founding principle of modern science, and continues to confound would-be atheists even into the 21st century.

Atheists as we may recall emphatically deny the supernatural and especially the existence of God the Creator. The difficulty of course is that if everything is purely physical governed entirely in principle by natural law, then every observable effect must have a predictable cause. And there must be an infinite regression of this chain of happenings into the unfathomable past. Unfortunately, the manifest observation of entropy says this is impossible.

So as our scientific knowledge advances, atheism is ever more forced into the gaps of what logic, mathematics, and science have yet to discover. And so atheists are forced to believe, with a blind faith bordering on mysticism and superstition, that our well-tested principles of physics are somehow mistaken. The best escape mechanism or rationalization any nihilist has yet offered is that the cosmos instead existed forever. This does remove the logical necessity of a supernatural creation event but also blindly denies most of the understandings and observations of modern science.

And since the ultimate consequence of the anti-scientific atheist delusion is the denial of an absolute moral code that permits one’s libertine and profligate instincts free rein, this is perhaps its greatest attraction. And in a manner analogous to entropy, this mental disorder is also increasing and likewise shows no signs of abating.

**REFERENCES**

1. “Why Physicists Can’t Avoid a Creation Event”, by Lisa Grossman, New Scientist, January 11, 2012; http://www.newscientist.com/article/mg21328474.400-why-physicists-cant-avoid-a-creation-event.html?

2. “Information Theory and Thermodynamics”, by Tribus, *Helvetica Physica Acta* (1963).

3. “The Beginning of Time” lecture by S. Hawking, online at http://www.hawking.org.uk/the-beginning-of-time.html

4. “Relativity, Thermodynamics, and Cosmology” by Tolman, Oxford University Press, Oxford, UK (1934). [Early work on cyclic universes with Einstein; somewhat out of date.]

5. “A Larger Estimate of the Entropy of the Universe,” Egan and Lineweaver, Astrophysical Journal Volume 710 (2010), page 1825.

6. “Inflationary Spacetimes Are Incomplete in Past Directions”, Borde, Guth, and Vilenkin, Phys. Rev. Lett. 90, 151301 (2003).

[Basically, a stricter accounting of entropy already present in an accelerating universe seems to eliminate the possibility of some of the more fanciful cyclic models.]