The Second Law of Thermodynamics
Sometimes people say that life violates the second law of thermodynamics. This is not the case; we know of nothing in the universe that violates that law. So why do people say that life violates the second law of thermodynamics? What is the second law of thermodynamics?

The second law is a straightforward law of physics with the consequence that, in a closed system, you can't finish any real physical process with as much useful energy as you had to start with — some is always wasted. This means that a perpetual motion machine is impossible. The second law was formulated after nineteenth century engineers noticed that heat cannot pass from a colder body to a warmer body by itself.

According to philosopher of science Thomas Kuhn, the second law was first put into words by two scientists, Rudolf Clausius and William Thomson (Lord Kelvin), using different examples, in 1850-51 (2). American quantum physicist Richard P. Feynman, however, says the French physicist Sadi Carnot discovered the second law 25 years earlier (3). That would have been before the first law, conservation of energy, was discovered! In any case, modern scientists completely agree about the above principles.

Thermodynamic Entropy
The first opportunity for confusion arises when we introduce the term entropy into the mix. Clausius invented the term in 1865. He had noticed that a certain ratio was constant in reversible, or ideal, heat cycles. The ratio was heat exchanged to absolute temperature. Clausius decided that the conserved ratio must correspond to a real, physical quantity, and he named it "entropy".

Surely not every conserved ratio corresponds to a real, physical quantity. Historical accident has introduced this term to science. On another planet there could be physics without the concept of entropy. It completely lacks intuitive clarity. Even the great physicist James Clerk Maxwell had it backward for a while (4). Nevertheless, the term has stuck.

The American Heritage Dictionary gives as the first definition of entropy, "For a closed system, the quantitative measure of the amount of thermal energy not available to do work." So it's a negative kind of quantity, the opposite of available energy.

Today, it is customary to use the term entropy to state the second law: Entropy in a closed system can never decrease. As long as entropy is defined as unavailable energy, this paraphrase of the second law is equivalent to the earlier ones above. In a closed system, available energy can never increase. Likewise, unavailable energy – entropy – can never decrease.

A familiar demonstration of the second law is the flow of heat from hot things to cold, and never vice versa. When a hot stone is dropped into a bucket of cool water, the stone cools and the water warms until each is the same temperature as the other. During this process, the entropy of the system increases. If you know the heat capacities and initial temperatures of the stone and the water, and the final temperature of the water, you can quantify the entropy increase in calories or joules per degree.

You may have noticed the words "closed system" a couple of times above. Consider simply a black bucket of water initially at the same temperature as the air around it. If the bucket is placed in bright sunlight, it will absorb heat from the sun, as black things do. Now the water becomes warmer than the air around it, and the available energy has increased. Has entropy decreased? Has energy that was previously unavailable become available, in a closed system?
No, this example is only an apparent violation of the second law. Because sunlight was admitted, the local system was not closed; the energy of sunlight was supplied from outside the local system. If we consider the larger system, including the sun, available energy has decreased and entropy has increased as required. Let's call this kind of entropy thermodynamic entropy. The qualifier "thermodynamic" is necessary because the word entropy is also used in another, nonthermodynamic sense.
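To make the stone-and-water example concrete, here is a minimal sketch; the masses, heat capacities, and starting temperatures are assumed for illustration, not taken from any source. Each body's entropy change is its heat capacity times the logarithm of its temperature ratio, and the sum comes out positive, in joules per degree.

```python
# A worked version of the hot-stone-in-cool-water example (all numbers assumed).
# Each body's entropy change is (mass * specific heat) * ln(T_final / T_initial);
# the stone's entropy falls, the water's rises more, and the total increases.
import math

m_stone, c_stone, T_stone = 1.0, 800.0, 373.15     # kg, J/(kg*K), K (hot stone)
m_water, c_water, T_water = 5.0, 4186.0, 288.15    # kg, J/(kg*K), K (cool water)

# Final temperature from conservation of energy (no heat lost to the surroundings)
C_stone, C_water = m_stone * c_stone, m_water * c_water
T_final = (C_stone * T_stone + C_water * T_water) / (C_stone + C_water)

dS_stone = C_stone * math.log(T_final / T_stone)   # negative: the stone cools
dS_water = C_water * math.log(T_final / T_water)   # positive: the water warms
print(f"T_final  = {T_final:.2f} K")
print(f"dS_total = {dS_stone + dS_water:.3f} J/K")  # positive, in joules per degree
```

With these assumed numbers the total comes to a few dozen joules per degree, and it is always positive, as the second law requires.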
Entropy is also used to mean disorganization or disorder. J. Willard Gibbs, the nineteenth century American theoretical physicist, called it "mixedupness." The American Heritage Dictionary gives as the second definition of entropy, "a measure of disorder or randomness in a closed system." Again, it's a negative concept, this time the opposite of organization or order. The term came to have this second meaning thanks to the great Austrian physicist Ludwig Boltzmann.
Boltzmann modeled a gas as a swarm of tiny molecules in random motion and showed, by statistical reasoning, how their collisions spread heat until the temperature is uniform. The model could also be used to show that two different kinds of gases would become thoroughly mixed. The reasoning he used for mixing is very similar to that for the diffusion of heat, but there is an important difference. In the diffusion of heat, the entropy increase can be measured with the ratio of physical units, joules per degree. In the mixing of two kinds of gases already at the same temperature, if no energy is dissipated, the ratio of joules per degree — thermodynamic entropy — is irrelevant. The non-dissipative mixing process is related to the diffusion of heat only by analogy (5). Nevertheless, Boltzmann used a factor, k, now called Boltzmann's constant, to attach physical units to the latter situation. Now the word entropy has come to be applied to the simple mixing process, too. (Of course, Boltzmann's constant has a legitimate use — it relates the average kinetic energy of a molecule to its temperature.)

Entropy in this latter sense has come to be used in the growing fields of information science, computer science, communications theory, etc. The story is often told that in the late 1940s, John von Neumann, a pioneer of the computer age, advised communication theorist Claude E. Shannon to start using the term "entropy" when discussing information, because "no one knows what entropy really is, so in a debate you will always have the advantage" (6).

Richard Feynman knew there is a difference between the two meanings of entropy. He discussed thermodynamic entropy in the section called "Entropy" of his Lectures on Physics, published in 1963 (7), using physical units, joules per degree, and over a dozen equations (vol I, section 44-6). He discussed the second meaning of entropy in a different section, titled "Order and entropy" (vol I, section 46-5), as follows:
This is Boltzmann's model again. Notice that Feynman does not use Boltzmann's constant. He assigns no physical units to this kind of entropy, just a number (a logarithm). And he uses not a single equation in this section of his Lectures.

Notice another thing. The "number of ways" can only be established by first artificially dividing up the space into little volume elements. This is not a small point. In every real physical situation, counting the number of possible arrangements requires an arbitrary parceling. As Peter Coveney and Roger Highfield say (7.5):

There is, however, nothing to tell us how fine the [parceling] should be. Entropies calculated in this way depend on the size-scale decided upon, in direct contradiction with thermodynamics in which entropy changes are fully objective.
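The parceling problem is easy to exhibit numerically. In the sketch below (the molecule count and the three cell counts are arbitrary assumptions), the same gas in the same box yields three different values of log(W), one for each choice of parceling.

```python
# A small illustrative calculation (not from the article): the "number of ways" W
# to place N gas molecules among M volume elements is W = M**N, so the logical
# entropy log(W) = N*log(M) depends on M -- the arbitrarily chosen parceling.
import math

N = 100                           # number of molecules (an assumed figure)
for M in (10, 100, 1000):         # three different parcelings of the same box
    log_W = N * math.log(M)       # log of the number of arrangements: a pure number
    print(f"M = {M:4d} volume elements  ->  log(W) = {log_W:7.1f}")

# Same gas, same box, three different "entropies." Only the parceling changed.
# Multiplying by Boltzmann's constant would attach joules per degree, but it
# cannot make the choice of M any less arbitrary.
```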
Shannon introduced his own entropy, a measure of information, in his 1948 paper "A Mathematical Theory of Communication" (8). In that paper Shannon attaches no physical units to his entropy and never mentions Boltzmann's constant, k. At one point he briefly introduces K, saying tersely, "The constant K merely amounts to a choice of a unit of measure" (p 11). Although the 55-page paper contains more than 300 equations, K appears only once again, in Appendix 2, which concludes, "The choice of coefficient K is a matter of convenience and amounts to the choice of a unit of measure" (p 29). Shannon never specifies the unit of measure.

This sort of entropy is clearly different. Physical units do not pertain to it, and (except in the case of digital information) an arbitrary convention must be imposed before it can be quantified. To distinguish this kind of entropy from thermodynamic entropy, let's call it logical entropy.
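A short sketch makes Shannon's point about K explicit; the probability distribution below is invented for illustration. His entropy is a pure number, and changing K, or equivalently the base of the logarithm, merely rescales it from bits to nats without ever introducing physical units.

```python
# Shannon's entropy H = -K * sum(p_i * log(p_i)): a pure number whose scale
# depends only on the arbitrary constant K (equivalently, the base of the log).
import math

def shannon_entropy(probs, base):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]         # an example probability distribution (assumed)
print(shannon_entropy(probs, 2))          # 1.75   -- in bits (K chosen for base-2 logs)
print(shannon_entropy(probs, math.e))     # ~1.213 -- in nats (a different choice of K)
# No joules, no degrees: changing K rescales the number but adds no physical units.
```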
It is true that crystals and other regular configurations can be formed by unguided processes. And we are accustomed to saying that these configurations are "organized." But crystals have not been spontaneously "furnished with organs." The correct term for such regular configurations is "ordered." The recipe for a crystal is already present in the solution it grows from — the crystal lattice is prescribed by the structure of the molecules that compose it. The formation of crystals is the straightforward result of chemical and physical laws that do not evolve and that are, compared to genetic programs, very simple.

The rule that things never organize themselves is also upheld in our everyday experience. Without someone to fix it, a broken glass never mends. Without maintenance, a house deteriorates. Without management, a business fails. Without new software, a computer never acquires new capabilities. Never.

Charles Darwin understood this universal principle. It's common sense. That's why he once made a note to himself pertaining to evolution, "Never use the words higher or lower" (9). However, the word "higher" in this forbidden sense appears half a dozen times in the first edition of Darwin's Origin of Species (10). Even today, if you assert that a human is more highly evolved than a flatworm or an amoeba, there are darwinists who'll want to fight about it. They take the position, apparently, that evolution has not necessarily shown a trend toward more highly organized forms of life, just different forms.
Life Is Organization

Seen in retrospect, evolution as a whole doubtless had a general direction, from simple to complex, from dependence on to relative independence of the environment, to greater and greater autonomy of individuals, greater and greater development of sense organs and nervous systems conveying and processing information about the state of the organism's surroundings, and finally greater and greater consciousness. You can call this direction progress or by some other name. — Theodosius Dobzhansky (15)

Progress, then, is a property of the evolution of life as a whole by almost any conceivable intuitive standard.... Let us not pretend to deny in our philosophy what we know in our hearts to be true. — Edward O. Wilson (16)

Life is organization. From prokaryotic cells, eukaryotic cells, tissues and organs, to plants and animals, families, communities, ecosystems, and living planets, life is organization, at every scale. The evolution of life is the increase of biological organization, if it is anything. Clearly, if life originates and makes evolutionary progress without organizing input somehow supplied, then something has organized itself. Logical entropy in a closed system has decreased. This is the violation that people are getting at, when they say that life violates the second law of thermodynamics. This violation, the decrease of logical entropy in a closed system, must happen continually in the darwinian account of evolutionary progress.

Most darwinists just ignore this staggering problem. When confronted with it, they seek refuge in the confusion between the two kinds of entropy. [Logical] entropy has not decreased, they say, because the system is not closed. Energy such as sunlight is constantly supplied to the system. If you consider the larger system that includes the sun, [thermodynamic] entropy has increased, as required.
Another typical example of confusion between the two kinds of entropy comes from a similar book by Tim M. Berra, Evolution and the Myth of Creationism. The following paragraph from that book would seem to indicate that any large animal can assemble a bicycle (18):

For example, an unassembled bicycle that arrives at your house in a shipping carton is in a state of disorder. You supply the energy of your muscles (which you get from food that came ultimately from sunlight) to assemble the bike. You have got order from disorder by supplying energy. The Sun is the source of energy input to the earth's living systems and allows them to evolve.

A rare example of the use of mathematics to combine the two kinds of entropy is given in The Mystery of Life's Origin, published in 1984. Its authors acknowledge two kinds of entropy, which they call "thermal" and "configurational." To count the "number of ways" for the latter kind of entropy they use restrictions which they later admit to be unrealistic. They count only the number of ways a string of amino acids of fixed length can be sequenced. They admit in the end, however, that the string might never form. To impose the units joules per degree onto "configurational" entropy, they simply multiply by Boltzmann's constant (19). Nevertheless, they ultimately reach the following profound conclusion (p 157-158):

In summary, undirected thermal energy is only able to do the chemical and thermal entropy work in polypeptide synthesis, but not the coding (or sequencing) portion of the configurational entropy work.... It is difficult to imagine how one could ever couple random thermal energy flow through the system to do the required configurational entropy work of selecting and sequencing.

In Evolution, Thermodynamics and Information, Jeffrey S. Wicken also adopts the terms "thermal" and "configurational." But here they both pertain only to the non-energetic "information content" of a thermodynamic state, and "energetic" information is also necessary for the complete description of a system. Shannon entropy is different from all of these, and not a useful concept to Wicken. Nevertheless, he says that evolution and the origin of life are not separate problems and, "The most parsimonious explanation is to assume that life always existed" (19.5)!
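For readers who want to see what the "configurational" bookkeeping in The Mystery of Life's Origin amounts to, here is a rough sketch, not the book's own calculation; the chain length is an arbitrary assumption. Counting the sequences of a fixed-length chain gives a pure number, and only the multiplication by Boltzmann's constant dresses it in joules per degree.

```python
# "Configurational entropy" in the style described above (a sketch, not the
# book's own calculation): count the ways W to sequence a chain of amino acids,
# then multiply ln(W) by Boltzmann's constant to force thermodynamic units onto it.
import math

k = 1.380649e-23      # Boltzmann's constant, J/K
n_residues = 100      # length of the hypothetical polypeptide (assumed)
n_amino_acids = 20    # the 20 standard amino acids

ln_W = n_residues * math.log(n_amino_acids)   # ln(20**100), the log of a pure count
S_config = k * ln_W                           # now dressed up in joules per degree

print(f"ln(W)   = {ln_W:.1f}   (a pure number)")
print(f"k*ln(W) = {S_config:.3e} J/K  (units supplied only by the factor k)")
```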
Roger Penrose's treatment of entropy is worth mentioning. In The Emperor's New Mind (20), he nimbly dodges the problem of assigning physical units to logical entropy (p 314, Penrose's italics):
In order to give the actual entropy values for these compartments we should have to worry a little about the question of the units that are chosen (metres, Joules, kilograms, degrees Kelvin, etc.). That would be out of place here, and in fact, for the utterly stupendous entropy values that I shall be giving shortly, it makes essentially no difference at all what units are in fact chosen. However, for definiteness (for the experts), let me say that I shall be taking natural units, as are provided by the rules of quantum mechanics, and for which Boltzmann's constant turns out to be unity: k = 1.

Someday in the future, an extension of quantum theory might provide a natural way to parcel any real physical situation. If that happens, one of the problems with quantifying logical entropy in a real physical situation will be mitigated. But nobody, not even Penrose, is suggesting that this is the case today. And even if that day comes, still we will have no reason to attach thermodynamic units to logical entropy. (Although the word "stupendous" appears again, no "actual entropy values" follow the quoted passage.) (See later comment by Penrose, 2012.)

In The Refrigerator and the Universe (21), Martin Goldstein and Inge F. Goldstein wonder if there is "an irreconcilable difference" between the two kinds of entropy. They begin their consideration of logical entropy by discussing the possible arrangements of playing cards, where the parceling is not arbitrary — the number of possibilities can be counted. When they move to the world of physics, they are not concerned over the fact that parceling must now be done arbitrarily. They are concerned, initially, about attaching physical units to logical entropy. "...Entropy is measured in units of energy divided by temperature.... W [counting microstates] is a pure number" (p 173). But ultimately they apply Boltzmann's constant. No calculations using logical entropy with physical units ensue. The next time they mention logical entropy is in the section "Information and Entropy," where they divide the previous product by Boltzmann's constant to remove the physical units! (A small sketch of this maneuver appears below.)

An ambitious treatment of entropy as it pertains to biology is the book Evolution as Entropy, by Daniel R. Brooks and E. O. Wiley. They acknowledge that the distinction between the different kinds of entropy is important (22):

It is important to realize that the phase space, microstates, and macrostates described in our theory are not classical thermodynamic constructs.... The entropies are array entropies, more like the entropies of sorting encountered in considering an ideal gas than like the thermal entropies associated with steam engines....

In fact the authors acknowledge many kinds of entropy; they describe physical entropy, Shannon-Weaver entropy, cohesion entropy, and statistical entropy, for example. They rarely use or mention Boltzmann's constant. One of their main arguments is that although the progress of evolution seems to represent a reduction in entropy, this reduction is only apparent. In reality, evolution increases entropy as the second law requires. But evolution does not increase entropy as fast as the maximum possible rate. So, by comparison to the maximum possible rate, entropy appears to be decreasing. Our eyes have deceived us!

In another book entitled Life Itself, mathematical biologist Robert Rosen of Columbia University seems to have grasped the problem when he writes, "The Second Law thus asserts that...
a system autonomously tending to an organized state cannot be closed" (23). But immediately he veers away, complaining that the term "organization" is vague. Intent on introducing terms he prefers, like "entailment," he does not consider the possibility that, in an open system, life's organization could be imported into one region from another.

Hans Christian von Baeyer's 1998 book, Maxwell's Demon, is engaging and informative about the scientists who pioneered the second law. The story concludes with an interview of Wojciech Zurek of the Theoretical Division of the Los Alamos National Laboratory. Zurek introduces another kind of entropy, because, "Like all scientific ideas, the concept of entropy, useful as it is, needs to be refurbished and updated and adjusted to new insights. Someday... the two types of entropy will begin to approach each other in value, and the new theory will become amenable to experimental verification" (23.5).
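Returning to the Goldsteins' playing-card example, the maneuver described above can be shown in a few lines; only the standard deck of 52 cards is assumed. The count of arrangements is a pure number, Boltzmann's constant attaches units to it, and dividing by the same constant takes them away again.

```python
# The Goldsteins' playing-card example, in miniature: the number of ways to
# arrange a deck is 52!, and its logarithm is a pure number. Multiplying by
# Boltzmann's constant attaches physical units; dividing by it removes them again.
import math

k = 1.380649e-23                 # Boltzmann's constant, J/K
ln_W = math.log(math.factorial(52))   # ln(52!) ~ 156.4, a pure number
with_units = k * ln_W            # ~2.16e-21 J/K, thermodynamic dress
without_units = with_units / k   # back to the pure number we started with

print(f"ln(52!)      = {ln_W:.1f}")
print(f"k * ln(52!)  = {with_units:.3e} J/K")
print(f"divided by k = {without_units:.1f}  (units gone again)")
```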
Nobel laureate Ilya Prigogine has long argued that nonequilibrium thermodynamics, the study of systems far from equilibrium, can illuminate the origin of biological organization. In From Being To Becoming he writes (24):

It seems that most biological mechanisms of action show that life involves far-from-equilibrium conditions beyond the stability of the threshold of the thermodynamic branch. It is therefore very tempting to suggest that the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have led to a state of matter of increasing coherence.

Some find such passages obscure and tentative. One critic complains that work along the lines advocated by Prigogine fifteen years earlier has borne little fruit subsequently. "I don't know of a single phenomenon he has explained," said Pierre C. Hohenberg of Yale University (25).

Dr. Hubert P. Yockey gives the subject of entropy and biology a probing and insightful treatment in his monograph, Information theory and molecular biology (26). He emphatically agrees that there are different kinds of entropy that do not correlate. "The Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have nothing to do with each other" (p 313). But Shannon entropy (which pertains to information theory) makes no distinction between meaningful DNA sequences that encode life, and random DNA sequences of equal length. (Shannon wrote, "These semantic aspects of communication are irrelevant to the engineering problem.") With no distinction between meaningful and meaningless sequences, Yockey is able to conclude that evolution does not create any paradox for Shannon entropy. Nevertheless, Yockey proves with impressive command of biology and statistics that it would be impossible to find the new genes necessary for evolutionary progress by the random search method currently in favor. He is deeply sceptical of the prevailing theories of evolution and the origin of life on Earth. (Cynthia Yockey, 2005)
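Yockey's point about Shannon entropy is easy to demonstrate; the short sequence below is invented for illustration and is not a real gene. Because Shannon's measure depends only on symbol frequencies, a gene-like sequence and a random shuffle of the same letters receive exactly the same score.

```python
# Yockey's observation in miniature: Shannon entropy depends only on symbol
# frequencies, so a "meaningful" sequence and a random shuffle of the same
# letters score identically. (The sequence is invented, not a real gene.)
import math
import random

def shannon_entropy_bits(seq):
    n = len(seq)
    counts = {symbol: seq.count(symbol) for symbol in set(seq)}
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"   # made-up "coding" DNA
scrambled = list(meaningful)
random.shuffle(scrambled)                                # destroy any "meaning"
scrambled = "".join(scrambled)

print(shannon_entropy_bits(meaningful))   # some value, in bits per symbol
print(shannon_entropy_bits(scrambled))    # exactly the same value
```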
In 1999's The Fifth Miracle (28), theoretical physicist and science writer Paul Davies devotes a chapter, "Against the Tide," to the relationship between entropy and biology. In an endnote to that chapter he writes, "'higher' organisms have higher (not lower) algorithmic entropy..." (p 277, Davies' italics) — another reversal of the usual understanding. He concludes, "The source of biological information, then, is the organism's environment" (p 57). Later, "Gravitationally induced instability is a source of information" (p 63). But this "still leaves us with the problem.... How has meaningful information emerged in the universe?" (p 65). He gives no answer to this question.

The Touchstone of Life (1999) follows Prigogine's course, relying on Boltzmann's constant to link thermodynamic and logical entropy (29). Author Werner Loewenstein often strikes the chords that accompany deep understanding. "As for the origin of information, the fountainhead, this must lie somewhere in the territory close to the big bang" (p 25). "Evidently a little bubbling, whirling and seething goes a long way in organizing matter.... That understanding has led to the birth of a new alchemy..." (p 48-49). Exactly.
IBM physicist Rolf Landauer wrote an article, published in June 1996, which contains an insight that should discourage attempts to physically link the two kinds of entropy. He demonstrates that "there is no unavoidable minimal energy requirement per transmitted bit" (31). Using Boltzmann's constant to tie together thermodynamic entropy and logical entropy is thus shown to be without basis. One may rightly object that the minimal energy requirement per bit of information is unrelated to logical entropy. But this supposed requirement was the keystone of modern arguments connecting the two concepts.
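For scale, here is the figure such arguments usually invoke, computed under the common assumption of room temperature: kT ln 2 joules per bit. Landauer's result says that even this minuscule amount is not an unavoidable cost of transmitting a bit.

```python
# The energy figure usually invoked to tie a bit of information to thermodynamics
# is k*T*ln(2). At an assumed room temperature it is about 3e-21 joules -- and
# Landauer's 1996 result argues that even this is not an unavoidable cost per bit.
import math

k = 1.380649e-23     # Boltzmann's constant, J/K
T = 300.0            # an assumed room temperature, kelvin
print(k * T * math.log(2))   # ~2.87e-21 J per bit
```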
It is surprising that mixing entropy and biology still fosters confusion. The relevant concepts from physics pertaining to the second law of thermodynamics are at least 100 years old. The confusion can be eradicated if we distinguish thermodynamic entropy from logical entropy, and admit that Earth's biosphere is open to organizing input from elsewhere.
References
1. Harold J. Morowitz, Beginnings of Cellular Life: Metabolism Recapitulates Biogenesis, Yale University Press, 1992. p 69.
2. Thomas Kuhn, Black-Body Theory and the Quantum Discontinuity, 1894-1912, The University of Chicago Press, 1978. p 13.
3. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The Feynman Lectures on Physics, v I; Reading, Massachusetts: Addison-Wesley Publishing Company, 1963. section 44-3.
3.5. Julian Barbour, "A History of Thermodynamics" [100-page pdf], 2020. Detailed history and discussion.
4. Harvey S. Leff and Andrew F. Rex, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, 1990. p 6.
5. For technical discussions of this difference see The Maximum Entropy Formalism, Raphael D. Levine and Myron Tribus, eds., The MIT Press, 1979. Also see correspondence with Sergio Rossell beginning 27 Aug 2005 and 26 Sep 2005. Following these exchanges we have changed our text from "...if no heat is exchanged..." to "...if no energy is dissipated...".
6. Myron Tribus and Edward C. McIrvine, "Energy and Information," p 179-188 v 225, Scientific American, September 1971.
7. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The Feynman Lectures on Physics, v I; Reading, Massachusetts: Addison-Wesley Publishing Company, 1963.
7.5. Peter Coveney and Roger Highfield, The Arrow of Time, Ballantine Books, 1990. p 176-177.
8. C. E. Shannon, "A Mathematical Theory of Communication," p 379-423 and 623-656, v 27, The Bell System Technical Journal, July and October 1948. pdf reprint from Harvard.
8.5. [Einstein quoted in] F. Alexander Bais and J. Doyne Farmer, "Physics of Information," SFI Working Paper 07-08-029, 2007. p 35. [Also quoted in] Constantino Tsallis, Murray Gell-Mann and Yuzuru Sato, "Asymptotically scale-invariant occupancy of phase space makes the entropy Sq extensive" [abstract], doi:10.1073/pnas.0503807102, p 15377-15382 v 102, Proc. Natl. Acad. Sci., USA, 25 Oct 2005.
9. Ernst Mayr, Toward a New Philosophy of Biology, Harvard University Press, 1988. p 251.
10. Charles Darwin, On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life, London: John Murray, Albemarle Street, 1859.
11. Lynn Margulis and Dorion Sagan, What Is Life?, Simon and Schuster, 1995. p 44.
12. Stephen Jay Gould, [interviewed in] The Third Culture, by John Brockman, Simon and Schuster, 1995. p 52.
13. Richard Dawkins, [interviewed in] The Third Culture, by John Brockman, Simon and Schuster, 1995. p 84.
14. John Maynard Smith and Eörs Szathmáry, The Major Transitions in Evolution, W.H. Freeman and Company Limited, 1995. p 4.
15. Theodosius Dobzhansky, Studies in the Philosophy of Biology: Reduction and Related Problems, Francisco J. Ayala and Theodosius Dobzhansky, eds., University of California Press, 1974. p 311.
16. Edward O. Wilson, The Diversity of Life, Harvard University Press, 1992. p 187. Wilson acknowledges Charles S. Peirce, who wrote, "Let us not pretend to doubt in philosophy what we do not doubt in our hearts," Collected Papers of Charles Sanders Peirce, v 5, Charles Hartshorne and Paul Weiss, eds., Harvard University Press, 1934.
17. Philip Kitcher, Abusing Science, The MIT Press, 1982. p 90.
18. Tim M. Berra, Evolution and the Myth of Creationism: A Basic Guide to the Facts in the Evolution Debate, Stanford University Press, 1990. p 126.
19. Charles B. Thaxton, Walter L. Bradley and Roger L. Olsen, The Mystery of Life's Origin: Reassessing Current Theories, New York: Philosophical Library, 1984. p 136-142. The website has three online chapters.
19.5. Jeffrey S. Wicken, Evolution, Thermodynamics and Information: Extending the Darwinian Program, Oxford University Press, 1987. p 59.
20. Roger Penrose, The Emperor's New Mind, Oxford University Press, 1989.
21. Martin Goldstein and Inge F. Goldstein, The Refrigerator and the Universe: Understanding the Laws of Energy, Harvard University Press, 1993.
22. Daniel R. Brooks and E. O. Wiley, Evolution as Entropy, second edition; The University of Chicago Press, 1988. p 37-38.
23. Robert Rosen, Life Itself: A Comprehensive Inquiry Into the Nature, Origin and Fabrication of Life, Columbia University Press, 1991. p 114.
23.5. Hans Christian von Baeyer, Maxwell's Demon: Why Warmth Disperses and Time Passes, Random House, 1998. p 165. [review in physicsworld.com by Rolf Landauer, 8 Jan 1999]
24. Ilya Prigogine, From Being To Becoming, New York: W. H. Freeman and Company, 1980. p 123.
25. [quoted in] John Horgan, "From Complexity to Perplexity," p 104-109, Scientific American, June 1995.
26. Hubert P. Yockey, Information theory and molecular biology, Cambridge University Press, 1992.
27. Christoph Adami, Introduction to Artificial Life, Telos (Springer-Verlag), 1998.
28. Paul Davies, The Fifth Miracle, Simon and Schuster, 1999.
29. Werner R. Loewenstein, The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life, Oxford University Press, 1999.
30. Peter Medawar, Pluto's Republic, Oxford University Press, 1984. p 226.
31. Rolf Landauer, "Minimal Energy Requirements in Communication," p 1914-1918 v 272, Science, 28 June 1996.
32. E. P. Wigner, [cited by] E. T. Jaynes, "Gibbs vs Boltzmann Entropies," p 391-398 v 33 n 5, American Journal of Physics, May 1965.