
Entropy (statistical thermodynamics)

The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.

Boltzmann's principle

Ludwig Boltzmann defined entropy as a measure of the number of possible microscopic states (microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties, which constitute the macrostate of the system. A useful illustration is the example of a sample of gas contained in a container. The easily measurable parameters volume, pressure, and temperature of the gas describe its macroscopic condition (state). At a microscopic level, the gas consists of a vast number of freely moving atoms or molecules, which randomly collide with one another and with the walls of the container. The collisions with the walls produce the macroscopic pressure of the gas, which illustrates the connection between microscopic and macroscopic phenomena.

A microstate of the system is a description of the positions and momenta of all its particles. The large number of particles of the gas provides an infinite number of possible microstates for the sample, but collectively they exhibit a well-defined average configuration, which manifests as the macrostate of the system, to which the contribution of each individual microstate is negligibly small. The ensemble of microstates defines a statistical probability distribution over the microstates, and the group of most probable configurations accounts for the macroscopic state. Therefore, the system can be described as a whole by only a few macroscopic parameters, called the thermodynamic variables: the total energy E, volume V, pressure P, temperature T, and so forth. However, this description is relatively simple only when the system is in a state of equilibrium.

Equilibrium may be illustrated with a simple example of a drop of food coloring falling into a glass of water. The dye diffuses in a complicated manner, which is difficult to precisely predict. However, after sufficient time has passed, the system reaches a uniform color, a state much easier to describe and explain.

Boltzmann formulated a simple relationship between entropy and the number of possible microstates of a system, which is denoted by the symbol Ω. The entropy S is proportional to the natural logarithm of this number:

S = kB ln Ω

The proportionality constant kB is one of the fundamental constants of physics, and is named the Boltzmann constant in honor of its discoverer.

Since Ω is a natural number (1,2,3,...), entropy is either zero or positive (ln 1 = 0, ln Ω ≥ 0).
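
As an illustrative sketch (not part of the original article), Boltzmann's formula can be evaluated directly for any assumed microstate count; the Ω values below are arbitrary examples.

    # Minimal sketch of S = kB ln Ω; the Ω values are arbitrary examples.
    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

    def boltzmann_entropy(omega):
        """Entropy in J/K of a system with omega equally likely microstates."""
        return k_B * math.log(omega)

    print(boltzmann_entropy(1))       # 0.0: a unique microstate carries zero entropy
    print(boltzmann_entropy(10**23))  # ~7.3e-22 J/K

Because the logarithm grows slowly, macroscopic entropies of order joules per kelvin correspond to ln Ω of order 10^23, that is, to unimaginably large microstate counts.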

Boltzmann's entropy describes the system when all the accessible microstates are equally likely. It is the configuration corresponding to the maximum of entropy at equilibrium. The randomness or disorder is maximal, and so is the lack of distinction (or information) of each microstate.

Entropy is a thermodynamic property just like pressure, volume, or temperature. Therefore, it connects the microscopic and the macroscopic world view.

Boltzmann's principle is regarded as the foundation of statistical mechanics.

Gibbs entropy formula

The macroscopic state of a system is characterized by a distribution on the microstates. The entropy of this distribution is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if Ei is the energy of microstate i, and pi is the probability that it occurs during the system's fluctuations, then the entropy of the system is

S = −kB Σi pi ln pi

Entropy changes for systems in a canonical state

A system with a well-defined temperature, i.e., one in thermal equilibrium with a thermal reservoir, has a probability of being in a microstate i given by Boltzmann's distribution.
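
As a hedged illustration (not part of the original article), the Gibbs formula above can be evaluated for any assumed discrete distribution; the probabilities below are arbitrary examples (for a system at temperature T they would instead be given by the Boltzmann distribution just described).

    # Sketch: Gibbs entropy S = -kB Σi pi ln pi for assumed discrete distributions.
    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K

    def gibbs_entropy(probs):
        """Gibbs entropy in J/K; probs must sum to 1 (terms with pi = 0 contribute nothing)."""
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    uniform = [0.25, 0.25, 0.25, 0.25]  # equally likely microstates
    peaked = [0.97, 0.01, 0.01, 0.01]   # one nearly certain microstate
    print(gibbs_entropy(uniform), k_B * math.log(4))  # equal values: reduces to kB ln Ω
    print(gibbs_entropy(peaked))                      # lower entropy

When all pi equal 1/Ω the sum reduces to kB ln Ω, recovering Boltzmann's expression.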

Changes in the entropy caused by changes in the external constraints are then given by:

dS = −kB Σi (dpi) ln pi
   = −kB Σi (dpi) (−Ei/(kB T) − ln Z)
   = Σi Ei dpi / T
   = (Σi d(Ei pi) − Σi (dEi) pi) / T

where we have twice used the conservation of probability, Σ dpi = 0.

Now, Σi d(Ei pi) is the expectation value of the change in the total energy of the system.

If the changes are sufficiently slow, so that the system remains in the same microscopic state, but the state slowly (and reversibly) changes, then Σi (dEi) pi is the expectation value of the work done on the system through this reversible process, dwrev.

But from the first law of thermodynamics, dE = δw + δq. Therefore,

dS = δ⟨qrev⟩ / T

In the thermodynamic limit, the fluctuation of the macroscopic quantities from their average values becomes negligible; so this reproduces the definition of entropy from classical thermodynamics, given above.
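
A rough numerical sketch of this identification (not from the original article): for an assumed two-level system whose energy levels are held fixed (so dEi = 0 and all energy exchange is heat), the change in the Gibbs entropy between two nearby temperatures should match d⟨E⟩/T.

    # Sketch: verify dS ≈ δq/T for an assumed two-level system with fixed levels.
    import math

    k_B = 1.380649e-23      # J/K
    E = [0.0, 1.0e-21]      # assumed energy levels in J (fixed, so no work is done)

    def probs(T):
        w = [math.exp(-e / (k_B * T)) for e in E]
        Z = sum(w)
        return [x / Z for x in w]

    def entropy(T):
        return -k_B * sum(p * math.log(p) for p in probs(T))

    def mean_energy(T):
        return sum(p * e for p, e in zip(probs(T), E))

    T, dT = 300.0, 0.01
    dS = entropy(T + dT) - entropy(T)
    dQ = mean_energy(T + dT) - mean_energy(T)   # all heat, since the levels do not move
    print(dS, dQ / T)   # the two values agree to leading order in dT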

The quantity kB is a physical constant known as Boltzmann's constant. The remaining factor of the equation, the entire summation, is dimensionless, since each value pi is a probability and therefore dimensionless, and the logarithm is taken to the base of the dimensionless mathematical constant e. Hence the SI derived units on both sides of the equation are the same as those of heat capacity:

[S] = [kB] = J/K

This definition remains meaningful even when the system is far from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system or as a system in exchange with its surroundings. The set of microstates (with probability distribution) over which the sum is taken is called a statistical ensemble. Each type of statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, varying from a completely isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).

Neglecting correlations (or, more generally, statistical dependencies) between the states of individual particles will lead to an incorrect probability distribution on the microstates and hence to an overestimate of the entropy.[1] Such correlations occur in any system with nontrivially interacting particles, that is, in all systems more complex than an ideal gas.

This S is almost universally called simply the entropy. It can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. Note the above expression of the statistical entropy is a discretized version of Shannon entropy. The von Neumann entropy formula is an extension of the Gibbs entropy formula to the quantum mechanical case.

It has been shown[1] that the Gibbs entropy is equal to the classical "heat engine" entropy characterized by dS = δQ/T, and that the generalized Boltzmann distribution is a necessary and sufficient condition for this equivalence.[2] Furthermore, the Gibbs entropy is the only entropy that is equivalent to the classical "heat engine" entropy under the following postulates:[3]

  1. The probability density function is proportional to some function of the ensemble parameters and random variables.
  2. Thermodynamic state functions are described by ensemble averages of random variables.
  3. At infinite temperature, all the microstates have the same probability.

Ensembles

The various ensembles used in statistical thermodynamics are linked to the entropy by the following relations:[clarification needed]

S = kB ln Ωmic = kB (ln Zcan + βĒ) = kB (ln 𝒵gr + β(Ē − μN̄))

where Ωmic is the microcanonical partition function, Zcan is the canonical partition function, and 𝒵gr is the grand canonical partition function.
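
A small consistency sketch (not part of the original article), using an assumed two-level system: for a canonical distribution, −kB Σi pi ln pi equals kB (ln Zcan + βĒ), as the relation above states.

    # Sketch: check S = kB (ln Z + β Ē) against the Gibbs sum for an assumed two-level system.
    import math

    k_B = 1.380649e-23
    E = [0.0, 2.0e-21]     # assumed energy levels in J
    T = 250.0              # assumed temperature in K
    beta = 1.0 / (k_B * T)

    Z = sum(math.exp(-beta * e) for e in E)       # canonical partition function
    p = [math.exp(-beta * e) / Z for e in E]      # Boltzmann probabilities
    E_bar = sum(pi * e for pi, e in zip(p, E))    # mean energy

    S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)
    S_canonical = k_B * (math.log(Z) + beta * E_bar)
    print(S_gibbs, S_canonical)   # identical up to floating-point rounding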

Order through chaos and the second law of thermodynamics

We can view Ω as a measure of our lack of knowledge about a system. As an illustration of this idea, consider a set of 100 coins, each of which is either heads up or tails up. The macrostates are specified by the total number of heads and tails, whereas the microstates are specified by the facings of each individual coin. For the macrostates of 100 heads or 100 tails, there is exactly one possible configuration, so our knowledge of the system is complete. At the opposite extreme, the macrostate which gives us the least knowledge about the system consists of 50 heads and 50 tails in any order, for which there are 100,891,344,545,564,193,334,812,497,256 (100 choose 50) ≈ 10^29 possible microstates.
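
The counts quoted above can be reproduced directly; a minimal sketch (not part of the original article):

    # Sketch: microstate counts for 100 coins and the corresponding Boltzmann entropies.
    import math

    k_B = 1.380649e-23

    for heads in (100, 75, 50):
        omega = math.comb(100, heads)   # number of microstates in this macrostate
        print(heads, omega, k_B * math.log(omega))

    # math.comb(100, 50) = 100891344545564193334812497256 ≈ 1.01e29, the figure quoted
    # in the text; math.comb(100, 100) = 1 gives ln 1 = 0, i.e. zero entropy.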

Even when a system is entirely isolated from external influences, its microstate is constantly changing. For instance, the particles in a gas are constantly moving, and thus occupy a different position at each moment of time; their momenta are also constantly changing as they collide with each other or with the container walls. Suppose we prepare the system in an artificially highly ordered equilibrium state. For instance, imagine dividing a container with a partition and placing a gas on one side of the partition, with a vacuum on the other side. If we remove the partition and watch the subsequent behavior of the gas, we will find that its microstate evolves according to some chaotic and unpredictable pattern, and that on average these microstates will correspond to a more disordered macrostate than before. It is possible, but extremely unlikely, for the gas molecules to bounce off one another in such a way that they remain in one half of the container. It is overwhelmingly probable for the gas to spread out to fill the container evenly, which is the new equilibrium macrostate of the system.

This is an example illustrating the second law of thermodynamics:

the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value.

Since its discovery, this idea has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the Second Law applies only to isolated systems. For example, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. In contrast, the universe may be considered an isolated system, so that its total entropy is constantly increasing.

Counting of microstates

In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as coarse graining. In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined. It is defined only up to an additive constant. (As we will see, the thermodynamic definition of entropy is also defined only up to a constant.)
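
A toy illustration of this additive constant (not part of the original article), for an assumed single particle confined to a one-dimensional box with a bounded momentum: the number of coarse-grained cells is Ω ≈ (L · 2 p_max)/(δx δp), so shrinking the cell size shifts ln Ω by a constant that is independent of L and p_max.

    # Sketch: the coarse-graining ambiguity for one particle in a 1D box (assumed toy model).
    import math

    L, p_max = 1.0, 1.0e-24   # assumed box length (m) and momentum bound (kg·m/s)

    def ln_omega(dx, dp):
        # Ω ≈ accessible phase-space area divided by the cell area δx·δp
        return math.log((L * 2 * p_max) / (dx * dp))

    a = ln_omega(1e-9, 1e-30)
    b = ln_omega(1e-10, 1e-31)     # cells 100x smaller in phase-space area
    print(b - a, math.log(100.0))  # the difference is exactly ln 100, a pure constant

Entropy differences between macrostates are therefore unaffected by the choice of δx and δp.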

To avoid coarse graining one can take the entropy as defined by the H-theorem.[4]

S = −kB HB = −kB ∫ f(qi, pi) ln f(qi, pi) dq1 dp1 ⋯ dqN dpN

However, this ambiguity can be resolved with quantum mechanics. The quantum state of a system can be expressed as a superposition of "basis" states, which can be chosen to be energy eigenstates (i.e., eigenstates of the quantum Hamiltonian). Usually, the quantum states are discrete, even though there may be an infinite number of them. For a system with some specified energy E, one takes Ω to be the number of energy eigenstates within a macroscopically small energy range between E and E + δE. In the thermodynamic limit, the specific entropy becomes independent of the choice of δE.
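
A brief sketch of this counting (not from the original article), for an assumed particle in a cubic box whose eigenenergies are proportional to nx² + ny² + nz² (proportionality constant dropped): Ω is the number of eigenstates falling in a small window [E, E + δE].

    # Sketch: Ω as the number of eigenstates in [E, E + δE] for an assumed particle
    # in a cubic box, with E(nx, ny, nz) ∝ nx² + ny² + nz².
    def count_states(E, dE, n_max=110):
        count = 0
        for nx in range(1, n_max + 1):
            for ny in range(1, n_max + 1):
                for nz in range(1, n_max + 1):
                    e = nx * nx + ny * ny + nz * nz
                    if E <= e < E + dE:
                        count += 1
        return count

    print(count_states(10000, 50))    # Ω for an assumed energy window
    print(count_states(10000, 100))   # doubling δE roughly doubles Ω at fixed E

Because Ω grows in proportion to δE while ln Ω changes only by about ln 2 when the window is doubled, the entropy per particle is insensitive to the choice of δE in the thermodynamic limit.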

An important result, known as Nernst's theorem or the third law of thermodynamics, states that the entropy of a system at zero absolute temperature is a well-defined constant. This is because a system at zero temperature exists in its lowest-energy state, or ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and (since ln(1) = 0) this means that they have zero entropy at absolute zero. Other systems have more than one state with the same, lowest energy, and have a non-vanishing "zero-point entropy". For instance, ordinary ice has a zero-point entropy of 3.41 J/(mol⋅K), because its underlying crystal structure possesses multiple configurations with the same energy (a phenomenon known as geometrical frustration).

The third law of thermodynamics states that the entropy of a perfect crystal at absolute zero (0 kelvin) is zero. This would suggest that all molecular motion should cease. However, the oscillator equation for predicting quantized vibrational levels shows that even when the vibrational quantum number is 0, the molecule still has vibrational energy[citation needed]:

En = hν0(n + 1/2)

where h is Planck's constant, ν0 is the characteristic frequency of the vibration, and n is the vibrational quantum number. Even when n = 0, the energy En does not vanish; this residual energy is the zero-point energy, in accordance with the Heisenberg uncertainty principle.
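
As a brief numerical sketch (not from the original article), these levels can be evaluated for an assumed characteristic frequency; the value of ν0 below is an arbitrary example of the order typical for molecular vibrations.

    # Sketch: quantized vibrational energies En = h ν0 (n + 1/2) for an assumed frequency.
    h = 6.62607015e-34   # Planck constant in J·s (exact SI value)
    nu0 = 6.0e13         # assumed characteristic vibrational frequency in Hz

    def vib_energy(n):
        return h * nu0 * (n + 0.5)

    for n in range(3):
        print(n, vib_energy(n))   # n = 0 still gives the nonzero zero-point energy h ν0 / 2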

See also

  Boltzmann constant
  Configuration entropy
  Conformational entropy
  Enthalpy
  Entropy
  Entropy (classical thermodynamics)
  Entropy (energy dispersal)
  Entropy of mixing
  Entropy (order and disorder)
  Entropy (information theory)
  History of entropy
  Information theory
  Thermodynamic free energy
  Tsallis entropy

References

  1. ^ a b Jaynes, E. T. (1965). "Gibbs vs Boltzmann Entropies". American Journal of Physics. 33: 391. doi:10.1119/1.1971557.
  2. ^ Gao, Xiang; Gallicchio, Emilio; Roitberg, Adrian (2019). "The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy". The Journal of Chemical Physics. 151 (3): 034113. arXiv:1903.02121. Bibcode:2019JChPh.151c4113G. doi:10.1063/1.5111333. PMID 31325924. S2CID 118981017.
  3. ^ Gao, Xiang (March 2022). "The Mathematics of the Ensemble Theory". Results in Physics. 34: 105230. Bibcode:2022ResPh..3405230G. doi:10.1016/j.rinp.2022.105230. S2CID 221978379.
  4. ^ Boltzmann, Ludwig (January 1995). Lectures on Gas Theory. ISBN 0-486-68455-5.
