Chaos and Entropy
An Investigation on the Seeming Paradox Between Classical Mechanics and Thermodynamics
Written for the PHYS-3141: Thermodynamics final project as COVID-19 forces the campus to shut down.
1 Introduction
Ever since the 1600s, the Newtonian laws of motion, together with universal gravitation, have allowed for a deterministic approach to motion calculations. From a minute wooden block sliding on a frictionless surface to a planet drifting around our sun, everything seems predictable. Given a description of the state that an object is in, one can reasonably trace along its trajectory to see its past and future.
Such total determinism, as Laplace stated, would allow an all-knowing intellect to see the future before it happens. On the strength of empirical evidence, one holds Newton’s laws to be true. In terms of entropy, such deterministic systems would experience zero change in entropy when undergoing time evolution. This statement seems paradoxical against the second law of thermodynamics, which states that entropy never decreases in an isolated system, and that the entropy of an isolated system remains constant only when all the processes involved are strictly reversible.
Naturally, one might argue that any randomness present in Newtonian systems could contribute to an increase in entropy, since the added randomness effectively makes the future or past states of a system difficult to determine from its initial, unaffected state. However, since mainstream fundamental physics models rule out the existence of such randomness, the entropy involved here should only be discussed within the domain of chaos and of imprecision in measurement.
On paper, one observes that a Newtonian, or classical, deterministic system does satisfy the second law of thermodynamics by maintaining a constant entropy, provided the system is strictly reversible. Recall that this only applies to a system that is absolutely sheltered from all randomness, as random motions are not strictly reversible. It is for this reason that one can claim that a classical system, which is microscopically deterministic, should show an increase in entropy when evolving through time. Another aspect of the second law that one may have overlooked is that an isolated system can also maintain a constant entropy when it is in thermal equilibrium.
Since the second law of thermodynamics is always true, the question is: why would an evolving classical system remain in its constant entropy state when it should be gaining entropy?
To answer this question, this project will first examine a few basic concepts within thermodynamics and classical mechanics and deduce how entropy behaves when classical systems undergo time evolution. It will then take a closer look at what entropy and thermal equilibrium really are in classical mechanics. At last, the project will seek out systems that share similar properties.
2 Background
Before jumping into action, it is critical to have a rough understanding of the scope of this investigation. As stated in the introduction, this project essentially aims to resolve the seeming paradox between classical mechanics and thermodynamics. In doing so, one cannot hide from dealing with chaos.
Intuitively, it seems quite bizarre to think that a simple system, such as a block sliding on a frictionless surface at constant speed, could be unpredictable. Nonetheless, one still needs to recognize that a classical system can be subject to subtle changes in its initial conditions while undergoing deterministic time evolution, which results in behavior that appears randomized, or “chaotic”. One should differentiate stochastic processes from chaotic time evolutions, as the former are not necessarily within the scope of this investigation.
A double pendulum is a typical example of a chaotic classical system. Each individual pendulum, under zero resistance and for small swings, would exhibit simple harmonic motion. When the two pendulums are combined, their collective motion becomes unpredictable. Subtle variations in the initial conditions, including the lengths, masses, initial velocities, etc., result in dramatically different trajectories.
One can write down a general set of equations of motion and quantify how the chaotic time evolution changes when the initial state is altered; computer modelling can also help with visualizing the “chaotic” nature of this classical system, as in the sketch below. In the meantime, non-deterministic time evolutions such as the original Langevin equation describing Brownian motion serve as an excellent example of a genuinely stochastic process, where the randomness is built in rather than merely apparent.
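As a minimal sketch of such a computer model (a Python illustration assuming NumPy and SciPy are available; the masses, lengths, and initial angles are arbitrary choices, not tied to any particular apparatus), one can integrate the standard double-pendulum equations of motion twice, with the second run differing by only a billionth of a radian in one initial angle, and watch the trajectories drift apart:

import numpy as np
from scipy.integrate import solve_ivp

# Arbitrary illustrative parameters: unit masses and lengths, g in m/s^2.
m1 = m2 = 1.0
l1 = l2 = 1.0
g = 9.81

def double_pendulum(t, y):
    """Right-hand side of the double-pendulum equations of motion.
    y = [theta1, omega1, theta2, omega2], angles measured from the vertical."""
    th1, w1, th2, w2 = y
    d = th1 - th2
    den = 2 * m1 + m2 - m2 * np.cos(2 * d)
    a1 = (-g * (2 * m1 + m2) * np.sin(th1)
          - m2 * g * np.sin(th1 - 2 * th2)
          - 2 * np.sin(d) * m2 * (w2**2 * l2 + w1**2 * l1 * np.cos(d))) / (l1 * den)
    a2 = (2 * np.sin(d) * (w1**2 * l1 * (m1 + m2)
          + g * (m1 + m2) * np.cos(th1)
          + w2**2 * l2 * m2 * np.cos(d))) / (l2 * den)
    return [w1, a1, w2, a2]

t_eval = np.linspace(0, 20, 2001)
y0 = [np.pi / 2, 0.0, np.pi / 2, 0.0]                    # reference initial state
y0_perturbed = [np.pi / 2 + 1e-9, 0.0, np.pi / 2, 0.0]   # differs by 1e-9 rad

sol_a = solve_ivp(double_pendulum, (0, 20), y0, t_eval=t_eval, rtol=1e-10, atol=1e-10)
sol_b = solve_ivp(double_pendulum, (0, 20), y0_perturbed, t_eval=t_eval, rtol=1e-10, atol=1e-10)

# Angular separation between the two runs; it grows roughly exponentially
# until it saturates at the size of the accessible region.
separation = np.abs(sol_a.y[2] - sol_b.y[2])
for t in (5, 10, 15, 20):
    i = np.searchsorted(t_eval, t)
    print(f"t = {t:2d} s, |delta theta2| ~ {separation[i]:.2e} rad")

The point of the sketch is not the specific numbers but the trend: a perturbation far below any realistic measurement precision grows until the two trajectories bear no resemblance to each other.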
3 Entropy of Classical Systems
The second law of thermodynamics is essentially an empirical law, which means that it is established upon observations and experiments on specific instances. What, then, are the conditions needed for the second law to hold for a classical system? Could a deterministic classical system gain entropy when evolving through time?
3.1 Reversible Deterministic Motions
In the most fundamental sense, the answer is no. Since all the parameters are predetermined, one can construct a set of equations describing the motion of the system, leaving zero uncertainty about the state of the system anywhere along the axis of time.
Therefore, it does not matter in which direction one decides to trace the motion, as the entire process is reversible: the motion trajectories are independent of the direction of time, as the check below illustrates. The amount of information needed to describe such a state remains constant. In other words, the entropy of this classical system does not increase throughout deterministic time evolution.
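A minimal numerical check of this time-reversal property (using a single frictionless harmonic oscillator as the simplest deterministic stand-in, with arbitrary unit parameters and step size): evolve the state forward, flip the sign of the velocity, evolve for the same duration again, and the initial state is recovered up to round-off error:

# Time-reversal test for a frictionless harmonic oscillator (unit mass, unit
# spring constant), integrated with the leapfrog / velocity-Verlet scheme.
def evolve(x, v, dt, steps):
    for _ in range(steps):
        v += -x * dt / 2       # half kick: acceleration a = -x for k = m = 1
        x += v * dt            # drift
        v += -x * dt / 2       # half kick
    return x, v

x0, v0 = 1.0, 0.0
x, v = evolve(x0, v0, dt=0.01, steps=10_000)   # run forward
x, v = evolve(x, -v, dt=0.01, steps=10_000)    # flip the velocity, run again
print(f"recovered: x = {x:.12f}, v = {-v:.12f} (started from x = {x0}, v = {v0})")
# Up to round-off, the trajectory retraces itself exactly (the recovered
# velocity comes back with its sign flipped once more): no information about
# the state is gained or lost, so the entropy stays constant.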
Some would take the effect of randomness into account to argue that the added randomness forces the state to be described in terms of probabilities. Admittedly, this does help explain the increased entropy when a “randomized” system evolves. But since such randomness is not compatible with mainstream fundamental physics theories, one can safely set it aside here.
3.2 The Argument on Precision
Now one may ask: how would a deterministic classical system gain entropy, then, if all randomness is negated and all parameters of the initial state are known? Well, again, it still should not. In this case, the “state variables” that describe the system effectively trace out a definite contour over time. However, this is the most ideal of situations, in which the measurements of the initial state are exact. (Drossel, 2014)
Obtaining exact knowledge of the initial state would require infinite precision of measurement, which is quite impossible. Going back to Laplace’s theory, only precisely measured states can be totally deterministic. Recalling the sensitivity discussed above, minute variations in initial conditions result in drastically different outcomes. Even without calculation, one can deduce that, generally, a real-world classical system must increase in entropy as it evolves, because one cannot know all state parameters with infinite precision.
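To put a rough number on this (a minimal sketch assuming a chaotic system characterized by a single exponential divergence rate λ, with purely illustrative values): an initial measurement error δ0 grows roughly like δ0·e^(λt), so a prediction stays useful only until the error reaches some tolerance Δ, that is, for a time of about (1/λ)·ln(Δ/δ0). Each million-fold improvement in measurement precision buys only a fixed extra window:

import math

# Illustrative values: lyap is the assumed exponential divergence rate (1/s),
# tol is the error level at which the prediction is considered useless.
lyap = 2.0        # assumed divergence rate, per second
tol = 1.0         # tolerance on the predicted angle, in radians

def prediction_horizon(initial_error):
    """Time until an initial error, growing like e^(lyap * t), reaches tol."""
    return math.log(tol / initial_error) / lyap

for err in (1e-3, 1e-9, 1e-15):
    print(f"initial error {err:.0e} rad -> horizon ~ {prediction_horizon(err):.1f} s")

# Each million-fold improvement in precision adds only ln(1e6)/lyap ~ 6.9 s,
# so infinite precision would be needed for an unlimited prediction horizon.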
3.3 Boltzmann-Planck Equation
Another way of looking at the problem is to study the scope within which these theories work. The second law, and the thermodynamic laws in general, apply to closed macroscopic systems. Classical mechanics, on the other hand, operates on microscopic interacting particles. In other words, entropy is a summed-up property of a collection of microstates within the macroscopic closed system.
A typical representation of the scenario mentioned above is a well-insulated box of ideal gas. One can take the box itself as the boundary through which heat, work, and substance cannot pass. Consider Boltzmann’s equation for entropy:
S = k log W.
The entropy S of this box of ideal gas is defined through the logarithm of the total number of microstates W within the box. That is, entropy counts how many possible ways the particles contained within the system can be arranged. In detail, this equation shows that the second law is statistical, with probability playing an important role, while classical systems at the microscopic level behave in time-reversible, deterministic ways.
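As a minimal numerical sketch of this counting interpretation (a toy lattice model with arbitrary particle and cell numbers, standing in for the actual box of gas): count the ways W of placing N indistinguishable particles among M cells and watch S = k log W grow when the number of available cells is doubled, mimicking a free expansion:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_cells):
    """S = k log W, with W the number of ways to place n indistinguishable
    particles into n_cells distinct cells (at most one particle per cell)."""
    W = math.comb(n_cells, n_particles)
    return k_B * math.log(W)

N = 100
S_before = boltzmann_entropy(N, 1_000)   # particles confined to 1000 cells
S_after  = boltzmann_entropy(N, 2_000)   # available volume (cell count) doubled

print(f"S before expansion: {S_before:.3e} J/K")
print(f"S after  expansion: {S_after:.3e} J/K")
print(f"Entropy increase:   {S_after - S_before:.3e} J/K")
# For N much smaller than the cell count, the increase approaches N * k_B * ln 2,
# the familiar result for doubling the volume available to a dilute gas.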
When calculating equations of motion in classical systems, it is often assumed that a macroscopic classical system shares the same properties as a microscopic one. Yet, since a macroscopic system usually contains an astronomical number of particles, the complexity of particle arrangements skyrockets, which makes it intuitively wrong to assume equivalence between microscopic and macroscopic systems.
One way of reconciling this difference is to assume that such a macroscopic system exhibits the most likely behavior of all the particles involved; within a limited time frame, the result of such a calculation deviates very little from a fully deterministic calculation, even though the uncertainty of the trajectory grows over time. This approximation seems to have worked quite well. That is, entropy increases within multi-particle systems that have not yet reached their equilibria, but appropriate approximations can offer a good-enough result when these systems are treated as deterministic. Consequently, once precision is taken into consideration, even “quasi-static” macroscopic systems such as ideal gases cannot be totally determined by the collection of their microstates, yielding an entropy that increases with the passage of time.
3.4 General Comment about Entropy
History and the evolution of science have taught repeatedly that theories only work within their respective domains. A classical system is often idealized for the sake of simplicity and efficiency: when considering only small masses moving slowly around a small region of space, general relativity does not need to be involved to produce a better prediction. Such a deterministic view certainly clashes with the more generalized, averaged approach taken by thermodynamics. Consider a fundamental equation of thermodynamics:
dE = TdS - PdV
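As a standard worked illustration (the ideal-gas case is assumed here, not spelled out in the surrounding discussion): for a reversible isothermal expansion of n moles of ideal gas, the internal energy does not change, so dE = 0 and TdS = PdV = (nRT/V)dV. Integrating gives ΔS = nR ln(V2/V1), which is positive for an expansion; traversing the same reversible path in the opposite direction gives exactly -ΔS, so around a closed reversible loop the net entropy change is zero.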
During the calculations, one often holds one of the variables constant. This practice itself reflects a degree of deterministic approach, as it is assumed that the initial state of the measured system is precisely known. Also, when considering a reversible process, the evolutionary trajectory of the system forms a closed loop and the entropy is not changed. This already sits uneasily with the claim made by thermodynamics that the state of the system reflects a probability distribution over the collection of microstates, and is therefore seemingly unpredictable in the long run. Take a step back and consider the most miniature system, containing a single gas particle:
dS(Total) = dS(System) + dS(Environment)
As a subsystem of some larger system, its entropy is effectively influenced by its surrounding environment. Take the surrounding environment into this subsystem and form a slightly larger parcel, whose entropy is again dependent on its own environment. Repeat the process and eventually the whole observable universe becomes involved. From a god-like perspective, the case where entropy is held constant seems ridiculously impractical, as spontaneous processes decorate the universe all the time.
4 In Terms of Thermal Equilibrium
Linking classical mechanics with thermodynamics, one can consult the fundamental equation mentioned in the previous section:
dE = dQ + dW = TdS - PdV = TdS - FdL
To force entropy to change, the system must undergo an irreversible process. (Somsikov, 2007) It is often said that entropy appears to reach a maximum as an isolated system evolves towards equilibrium, and it appears so because the observer of such a system lacks a complete understanding of all the degrees of freedom involved. (Henriksson, 2019)
4.1 An Isolated System
For an isolated system, then, no interaction with the environment is permitted across its adiabatic, fixed boundary. The energy of the system therefore cannot change, as both dQ and dW are held at zero. How would such an isolated system behave entropy-wise under constant energy? Consider the system to be a box of ideal gas, in which many particles constantly interact with each other. At any time, there is some probability for each form of interaction to occur, and this distribution of probability, or uncertainty, can be viewed as a way to weigh entropy. When the system reaches thermal equilibrium, all forms of interaction between the particles occur with equal probability, and if this is taken as the end state of the system, tracing backwards to find its initial state becomes impossible.
Admittedly, for an isolated system, if all the initial conditions are known perfectly, then together with the laws of particle motion, the information that describes these particles is never lost. It is in real-world scenarios that such a prediction can only last for a limited time. Broken down to an infinitesimally small segment of time, entropy is conserved, since over such a segment the motion can be predicted with effectively infinite precision.
Relating this idea of entropy to the current pandemic: as the information about the outbreak is constantly updated, the initial state of this nominally closed system keeps breaking its own equilibria. This effectively shows that the evolution of a large-scale, constantly interacting system behaves ever more chaotically, as perfect information about the system can never be achieved.
4.2 Chaos in State Space
Carrying on from the last section, the scale of an isolated system clearly has an impact on its complexity. Formulate a graph in which the different states of an isolated system are labelled. At any given time, starting from some initial state, there is a probability distribution that hints at how the system will evolve. As time passes, the number of possible evolved states grows exponentially. By Boltzmann’s equation, the entropy of such an isolated system, being the logarithm of that count, then grows steadily with time.
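A minimal sketch of this branching picture (assuming, purely for illustration, that each state can evolve into a fixed number b of distinct successor states per time step): the number of reachable states W grows exponentially, while S = k log W grows by a fixed amount k·log(b) each step:

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
branches = 3         # assumed number of distinct successor states per step

for step in range(0, 21, 5):
    W = branches ** step                  # exponentially many reachable states
    S = k_B * math.log(W)                 # Boltzmann entropy of that count
    print(f"step {step:2d}: W = {W:>12d}, S = {S:.3e} J/K")

# W grows exponentially, but S = k log W grows linearly: each step adds k*log(b).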
An interesting case arises when, for some combinations of probability distributions, the system evolves to a state whose possible successor states are all the same as each other. In this instance, all probable outcomes are equivalent, so the observer gains information about the system, which reflects a net loss of entropy. Some would ask whether this violates the second law in some way. Well, the probability of such an evolution occurring, among all the possible combinations of evolutions, is vanishingly small. Consequently, considering all other possible paths of evolution, the overall result, from the perspective of an observer, is a constant loss of information until nothing can be recognized anymore. (Henriksson, 2019)

The illustration in Henriksson’s paper (not reproduced here) provides an insight into how chaos is perceived. In the latter part of the paper, it is summarized that: “Any given observer, whose knowledge about the initial conditions of any given system is limited, tend to lose information about the system at an exponential rate until there is none left.” (Henriksson, 2019)
Entropy, considered in such a fashion, does not increase in its essence. For some all-knowing being, the conditions of such systems are perfectly known, which leaves no surprises in determining the path the system will take. Effectively, this says that the entropy of deterministic systems is actually absent; it is only because observers cannot obtain perfect information about the system that entropy seemingly increases. Put in terms of thermal equilibrium, deterministic systems are always in a state of equilibrium; one just can never perceive it.
4.3 Segue on the Arrow of Time
Seemingly, these discussions of entropy have yielded some interesting philosophical questions about the essence of time. In a universe where all classical or similar systems are deterministic to some observer, the future does not depend on watching the time evolution of these systems unfold, as the tiny fluctuations generated by the interactions between individual systems may be perceived merely as stochastic processes. For such an observer, since any initial condition yields an accurate prediction of any past or future state, the direction of time is irrelevant. Anywhere in the state space of that universe, systems are perfectly in thermal equilibrium.
Deducing from the claims of the previous sections, thermal equilibrium of a classical system essentially means that the state the system is in is made up of a flat distribution of possible states. For such a system, since there is no potential for any of the substates to fall down to an even lower state, the system comes to a halt in its time evolution. If more subsystems are introduced into this system, the evolution kicks off again in search of an averaged potential.
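A minimal numerical check of this “flat distribution” claim (using the Gibbs/Shannon form of the entropy over a few made-up four-state distributions; the numbers are arbitrary): among distributions over the same set of states, the uniform one carries the largest entropy, which matches the usual statistical picture of equilibrium:

import math

def gibbs_entropy(probabilities):
    """S/k = -sum(p * ln p) over the nonzero probabilities of the microstates."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

distributions = {
    "all weight on one state": [1.0, 0.0, 0.0, 0.0],
    "lopsided":                [0.7, 0.2, 0.05, 0.05],
    "flat (equilibrium-like)": [0.25, 0.25, 0.25, 0.25],
}

for name, p in distributions.items():
    print(f"{name:25s} S/k = {gibbs_entropy(p):.4f}")

# The flat distribution attains the maximum value ln(4) ~ 1.3863;
# any bias toward particular states lowers the entropy.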
5 Similar Systems
As this investigation has inquired, any system that is microscopically deterministic would fall within the same category. Intuitively, quantum mechanics serves as a strong candidate. In quantum mechanics, wave functions are fundamental in describing how a system behaves.
5.1 About Quantum Mechanics
When treating time evolution, regardless of the type of Hamiltonian one may choose, the output is usually more pleasant to look at when fewer particles are involved. However, that is yet again an idealized scenario. When thousands if not millions of particles are involved, the wavefunctions can be impossible to render out. Even if the calculations are somehow performed, the time they take might have already skewed the prediction, since the particles keep interacting with each other in the meantime. The particles within the system evolve to reduce their collective potential; in terms of thermodynamics, they evolve towards a thermal equilibrium.
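To see why “impossible to render out” is not an exaggeration, here is a minimal back-of-the-envelope sketch (assuming the simplest case of N two-level particles and 16 bytes per complex amplitude; the particle counts are arbitrary): the wavefunction requires 2^N amplitudes, which outgrows any conceivable memory almost immediately:

# Memory needed to store the full wavefunction of N two-level particles,
# assuming one complex double (16 bytes) per amplitude.
BYTES_PER_AMPLITUDE = 16

for n_particles in (10, 30, 50, 100):
    amplitudes = 2 ** n_particles
    gigabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e9
    print(f"N = {n_particles:3d}: {amplitudes:.3e} amplitudes, ~{gigabytes:.3e} GB")

# Beyond a few dozen particles the state vector no longer fits in any real
# machine, which is why many-body wavefunctions cannot simply be written
# down and evolved directly.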
The process of reaching thermal equilibrium is analogous to a drop of coloring dispersing into a cup of water. One may know when and where the drop of coloring was before it went into the cup, but afterwards that information is lost: how could one ever find that drop of coloring again? This increase of entropy can therefore only exist when the initial state of the wavefunctions is not known with infinite accuracy, or when the evaluations of the wavefunctions are not performed promptly.
In the end, entropy falls back onto spontaneous heat flows, and it seems that quantum mechanics cannot escape this restraint either. As one calculates time evolution with imperfect information, it is evidently impractical to trace back to the initial condition from a later observation. Such chaos recalls the Stern-Gerlach experiments: knowing the end result can help one reason about the process a particle took, but there could be many ways for the particle to have done so.
5.2 Other Investigations on Classical and Quantum Systems
Some papers combine classical mechanics and quantum mechanics in various systems, including systems that are isolated, closed, open, big or small, and in equilibrium or not. This investigation unfortunately cannot dive in with the same depth, given its limited mathematical capabilities. Still, looking through these papers, Hamiltonians are usually discussed in great detail, accompanied by calculations based on more generalized thermodynamic functions in differential form.
Such iterations come back to the statement that the evolution of a classical system from its initial state is governed by its Hamiltonian. Amplifying this implication, such systems, if not all real-world systems, are simply chaotic in their nature. This is why some would say that classical mechanics is just a special case of thermodynamics, in which the state of the classical system is a deterministic state of thermal equilibrium. Luckily, when dealing with non-deterministic systems, statistical mechanics can be of help, while classical mechanics works out reasonable approximations with an assumed deterministic approach.
6 Summary
In essence, neither classical mechanics nor quantum mechanics can truly be made compatible with thermodynamics without the help of statistical mechanics or without breaking some fundamental assumptions in physics. While classical mechanics and quantum mechanics deal with microscopic deterministic systems, thermodynamics deals with macroscopic systems. Since it is often assumed that the initial state of the systems considered is known with infinite precision, performing a direct calculation yields an expected result that, without enough precision, could be ridiculously different from actual experimental results.
In the course of this investigation, regardless of the approach taken, one discovers that chaos itself stands at the decisive point where thermodynamics parts ways with classical mechanics and quantum mechanics. As most systems are sensitive to small changes in their initial states, it is rather pointless to treat a classical system with the added complexity of a macroscopic, non-deterministic approach. On the other hand, thermodynamics does not clash with quantum mechanics as harshly, as both theories deal with probability distributions to some degree.
Generally speaking, it is only by abandoning the idea of infinite precision within total determinism that one can achieve reconciliation between classical mechanics and thermodynamics. To put it another way, thermodynamics and classical mechanics are only seemingly paradoxical when entropy is involved; the differences in their respective scopes of study and their approaches to the problem contribute to that appearance. In the end, classical systems do obey the second law; it just does not always appear that way.
7 References
Drossel. “On the Relation Between the Second Law of Thermodynamics and Classical and Quantum Mechanics.” ArXiv.org, 27 Aug. 2014, arxiv.org/abs/1408.6358.
Talkner, and Hänggi. “Statistical Mechanics and Thermodynamics at Strong Coupling: Quantum and Classical.” ArXiv.org, 27 Nov. 2019, arxiv.org/abs/1911.11660.
Henriksson. “On the Second Law of Thermodynamics.” ArXiv.org, 8 Nov. 2019, arxiv.org/abs/1905.06187.
Somsikov. “Thermodynamics Within the Framework of Classical Mechanics.” ArXiv.org, 15 Jan. 2005, arxiv.org/abs/cond-mat/0501357.
Somsikov. “Irreversibility in Classical Mechanics.” ArXiv.org, 9 Jan. 2006, arxiv.org/abs/physics/0601038.
Somsikov. “The Method of the Description of Dynamics Nonequilibrium Systems within the Frames of the Classical Mechanics.” ArXiv.org, 29 Sept. 2007, arxiv.org/abs/0710.0078.
Itoi, and Amano. “The Second Law of Thermodynamics from Concave Energy in Classical Mechanics.” ArXiv.org, 5 Dec. 2019, arxiv.org/abs/1912.03267.
Kurihara, et al. “Thermodynamics for Trajectories of a Mass Point.” ArXiv.org, 29 July 2014, arxiv.org/abs/1305.3724.
Kupervasser. “The Basic Paradoxes of Statistical Classical Physics and the Quantum Mechanics.” ArXiv.org, 10 Nov. 2013, arxiv.org/abs/0911.2076.