
Gibbs paradox

Not to be confused with the Gibbs phenomenon.

In statistical mechanics, a semi-classical derivation of entropy that does not take into account the indistinguishability of particles yields an expression for entropy which is not extensive (is not proportional to the amount of substance in question). This leads to a paradox known as the Gibbs paradox, after Josiah Willard Gibbs, who proposed this thought experiment in 1874–1875.[1][2] The paradox allows for the entropy of closed systems to decrease, violating the second law of thermodynamics. A related paradox is the "mixing paradox". If one takes the perspective that the definition of entropy must be changed so as to ignore particle permutations, then, in the thermodynamic limit, the paradox is averted.

Illustration of the problem

Gibbs himself considered the following problem that arises if the ideal gas entropy is not extensive.[1] Two identical containers of an ideal gas sit side-by-side. The gas in container #1 is identical in every respect to the gas in container #2 (i.e. in volume, mass, temperature, pressure, etc.). There is a certain entropy S associated with each container which depends on the volume of each container. Now a door in the container wall is opened to allow the gas particles to mix between the containers. No macroscopic changes occur, as the system is in equilibrium. The entropy of the gas in the two-container system can be easily calculated, but if the equation is not extensive, the entropy would not be 2S. In fact, the non-extensive entropy quantity defined and studied by Gibbs would predict additional entropy. Closing the door then reduces the entropy again to S per box, in supposed violation of the second law of thermodynamics.

As understood by Gibbs,[2] and reemphasized more recently,[3][4] this is a misapplication of Gibbs' non-extensive entropy quantity. If the gas particles are distinguishable, closing the door will not return the system to its original state – many of the particles will have switched containers. There is a freedom in what is defined as ordered, and it would be a mistake to conclude that the entropy had not increased. In particular, Gibbs' non-extensive entropy quantity for an ideal gas was not intended for varying numbers of particles.

The paradox is averted by invoking the indistinguishability (at least effective indistinguishability) of the particles in the volume. This results in the extensive Sackur–Tetrode equation for entropy, as derived next.

Calculating the entropy of ideal gas, and making it extensive

In classical mechanics, the state of an ideal gas of energy U, volume V and with N particles, each particle having mass m, is represented by specifying the momentum vector p and the position vector x for each particle. This can be thought of as specifying a point in a 6N-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles. The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy:

$$U = \frac{1}{2m}\sum_{i=1}^{N}\left(p_{ix}^{2} + p_{iy}^{2} + p_{iz}^{2}\right)$$

and be contained inside the volume V (say V is a cube of side X, so that V = X³):

$$0 \leq x_{ij} \leq X \quad \text{for } i = 1, \dots, N \text{ and } j = 1, 2, 3$$

The first constraint defines the surface of a 3N-dimensional hypersphere of radius (2mU)^(1/2) and the second is a 3N-dimensional hypercube of volume V^N. These combine to form a 6N-dimensional hypercylinder. Just as the area of the wall of a cylinder is the circumference of the base times the height, so the area φ of the wall of this hypercylinder is:

$$\phi(U, V, N) = V^{N}\left(\frac{2\pi^{3N/2}\,(2mU)^{\frac{3N-1}{2}}}{\Gamma(3N/2)}\right) \qquad (1)$$

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. In classical physics, the number of states is infinitely large, but according to quantum mechanics it is finite. Before the advent of quantum mechanics, this infinity was regularized by making phase space discrete. Phase space was divided up into blocks of volume h^(3N). The constant h thus appeared as a result of a mathematical trick and was thought to have no physical significance. However, using quantum mechanics one recovers the same formalism in the semi-classical limit, but now with h being the Planck constant. One can qualitatively see this from Heisenberg's uncertainty principle: a volume in the 6N-dimensional phase space smaller than h^(3N) cannot be specified.

To compute the number of states we must compute the volume in phase space in which the system can be found and divide that by h^(3N). This leads us to another problem: the volume seems to approach zero, as the region in phase space in which the system can be is an area of zero thickness. This problem is an artifact of having specified the energy U with infinite accuracy. In a generic system without symmetries, a full quantum treatment would yield a discrete non-degenerate set of energy eigenstates. An exact specification of the energy would then fix the precise state the system is in, so the number of states available to the system would be one, and the entropy would thus be zero.

When we specify the internal energy to be U, what we really mean is that the total energy of the gas lies somewhere in an interval of length δU around U. Here δU is taken to be very small; it turns out that the entropy does not depend strongly on the choice of δU for large N. This means that the above "area" φ must be extended to a shell of a thickness equal to an uncertainty in momentum δp = δ(√(2mU)) = √(m/2U) δU, so the entropy is given by:

$$S = k\ln\!\left(\frac{\phi\,\delta p}{h^{3N}}\right)$$

where the constant of proportionality is k, the Boltzmann constant. Using Stirling's approximation for the Gamma function which omits terms of less than order N, the entropy for large N becomes:

$$S = kN\ln\!\left[V\left(\frac{U}{N}\right)^{\frac{3}{2}}\right] + \frac{3}{2}kN\left(1 + \ln\frac{4\pi m}{3h^{2}}\right)$$
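The size of the terms dropped by Stirling's approximation can be checked numerically. A minimal sketch (Python, illustrative values only; `stirling` here keeps only the x ln x − x terms used above):

```python
import math

# Keep only the leading Stirling terms x*ln(x) - x for ln(x!),
# as in the entropy derivation above.
def stirling(x):
    return x * math.log(x) - x

N = 10**6                    # number of particles (illustrative)
x = 1.5 * N                  # argument of the factorial, 3N/2
exact = math.lgamma(x + 1)   # ln Gamma(x + 1) = ln(x!)
rel_err = abs(exact - stirling(x)) / exact
print(rel_err)               # neglected terms are O(ln N) out of O(N ln N)
```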

This quantity is not extensive as can be seen by considering two identical volumes with the same particle number and the same energy. Suppose the two volumes are separated by a barrier in the beginning. Removing or reinserting the wall is reversible, but the entropy increases when the barrier is removed by the amount

$$\delta S = k\left[2N\ln(2V) - N\ln V - N\ln V\right] = 2kN\ln 2 > 0$$

which is in contradiction to thermodynamics if the barrier is re-inserted. This is the Gibbs paradox.
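The jump of 2kN ln 2, and the fact that it comes entirely from the non-extensive form of the entropy, can be reproduced numerically. A sketch in Python, taking k = m = h = 1 and arbitrary illustrative values for U, V, N (a check, not part of Gibbs' argument):

```python
import math

# Non-extensive ideal-gas entropy discussed above,
# in units of k, with m = h = 1 (illustrative only).
def S_nonextensive(U, V, N):
    return N * math.log(V * (U / N) ** 1.5) \
        + 1.5 * N * (1 + math.log(4 * math.pi / 3))

U, V, N = 50.0, 1.0, 100
S_before = 2 * S_nonextensive(U, V, N)         # two identical sealed boxes
S_after = S_nonextensive(2 * U, 2 * V, 2 * N)  # barrier removed: one doubled box
delta_S = S_after - S_before
print(delta_S, 2 * N * math.log(2))  # both ≈ 138.63
```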

The paradox is resolved by postulating that the gas particles are in fact indistinguishable. This means that all states that differ only by a permutation of particles should be considered as the same state. For example, if we have a 2-particle gas and we specify AB as a state of the gas where the first particle (A) has momentum p1 and the second particle (B) has momentum p2, then this state as well as the BA state where the B particle has momentum p1 and the A particle has momentum p2 should be counted as the same state.

For an N-particle gas, there are N! states which are identical in this sense, if one assumes that each particle is in a different single particle state. One can safely make this assumption provided the gas isn't at an extremely high density. Under normal conditions, one can thus calculate the volume of phase space occupied by the gas, by dividing Equation 1 by N!. Using the Stirling approximation again for large N, ln(N!) ≈ N ln(N) − N, the entropy for large N is:

$$S = kN\ln\!\left[\left(\frac{V}{N}\right)\left(\frac{U}{N}\right)^{\frac{3}{2}}\right] + kN\left(\frac{5}{2} + \frac{3}{2}\ln\frac{4\pi m}{3h^{2}}\right)$$

which can be easily shown to be extensive. This is the Sackur–Tetrode equation.
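Extensivity of the Sackur–Tetrode expression is easy to verify numerically: scaling U, V and N by a common factor scales S by that factor. A sketch with k = m = h = 1 and illustrative values:

```python
import math

# Sackur-Tetrode entropy in units of k, with m = h = 1 (illustrative).
def S_sackur_tetrode(U, V, N):
    return N * math.log((V / N) * (U / N) ** 1.5) \
        + N * (2.5 + 1.5 * math.log(4 * math.pi / 3))

U, V, N = 50.0, 1.0, 100
for lam in (2, 3, 10):
    # S(lam*U, lam*V, lam*N) = lam * S(U, V, N): extensive scaling
    assert abs(S_sackur_tetrode(lam * U, lam * V, lam * N)
               - lam * S_sackur_tetrode(U, V, N)) < 1e-9
# Removing the barrier between two identical boxes now adds no entropy:
print(S_sackur_tetrode(2*U, 2*V, 2*N) - 2 * S_sackur_tetrode(U, V, N))  # ≈ 0
```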

The mixing paradox

The mixing paradox is closely related to the Gibbs paradox; indeed, the Gibbs paradox is a special case of the mixing paradox that contains all of its salient features. The difference is that the mixing paradox deals with arbitrary distinctions between the two gases, not just distinctions in particle ordering as Gibbs had considered. In this sense, it is a straightforward generalization of the argument laid out by Gibbs. Again take a box with a partition in it, with gas A on one side and gas B on the other, both gases at the same temperature and pressure. If gases A and B are different, there is an entropy that arises once they are mixed. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on the character of the gases; it depends only on the fact that the gases are different. The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas – a paradoxical discontinuity.

This "paradox" can be explained by carefully considering the definition of entropy. In particular, as concisely explained by Edwin Thompson Jaynes,[2] definitions of entropy are arbitrary.

As a central example in Jaynes' paper points out, one can develop a theory that treats two gases as similar even if those gases may in reality be distinguished through sufficiently detailed measurement. As long as we do not perform these detailed measurements, the theory will have no internal inconsistencies. (In other words, it does not matter that we call gases A and B by the same name if we have not yet discovered that they are distinct.) If our theory calls gases A and B the same, then entropy does not change when we mix them. If our theory calls gases A and B different, then entropy does increase when they are mixed. This insight suggests that the ideas of "thermodynamic state" and of "entropy" are somewhat subjective.

The differential increase in entropy (dS) as a result of mixing dissimilar gases, multiplied by the temperature (T), equals the minimum amount of work we must do to restore the gases to their original separated state. Suppose that two gases are different, but that we are unable to detect their differences. If these gases are in a box, segregated from one another by a partition, how much work does it take to restore the system's original state after we remove the partition and let the gases mix?

None – simply reinsert the partition. Even though the gases have mixed, there was never a detectable change of state in the system, because by hypothesis the gases are experimentally indistinguishable.

As soon as we can distinguish the difference between gases, the work necessary to recover the pre-mixing macroscopic configuration from the post-mixing state becomes nonzero. This amount of work does not depend on how different the gases are, but only on whether they are distinguishable.

This line of reasoning is particularly informative when considering the concepts of indistinguishable particles and correct Boltzmann counting. Boltzmann's original expression for the number of states available to a gas assumed that a state could be expressed in terms of a number of energy "sublevels", each of which contains a particular number of particles. While the particles in a given sublevel were considered indistinguishable from each other, particles in different sublevels were considered distinguishable from particles in any other sublevel. This amounts to saying that the exchange of two particles in two different sublevels will result in a detectably different "exchange macrostate" of the gas. For example, if we consider a simple gas with N particles, at sufficiently low density that it is practically certain that each sublevel contains either one particle or none (i.e. a Maxwell–Boltzmann gas), this means that a simple container of gas will be in one of N! detectably different "exchange macrostates", one for each possible particle exchange.
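The N! count of "exchange macrostates" can be made concrete with a toy enumeration for small N (the particle labels here are hypothetical; any distinct identities would do):

```python
import itertools
import math

# With n distinguishable particles in n singly-occupied sublevels,
# every permutation of particles over sublevels is a distinct
# "exchange macrostate" under Boltzmann's original counting.
n = 5
exchange_macrostates = set(itertools.permutations(range(n)))
print(len(exchange_macrostates))  # 120, i.e. 5!
assert len(exchange_macrostates) == math.factorial(n)
```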

Just as the mixing paradox begins with two detectably different containers, and the extra entropy that results upon mixing is proportional to the average amount of work needed to restore that initial state after mixing, so the extra entropy in Boltzmann's original derivation is proportional to the average amount of work required to restore the simple gas from some "exchange macrostate" to its original "exchange macrostate". If we assume that there is in fact no experimentally detectable difference in these "exchange macrostates" available, then using the entropy which results from assuming the particles are indistinguishable will yield a consistent theory. This is "correct Boltzmann counting".

It is often said that the resolution to the Gibbs paradox derives from the fact that, according to quantum theory, like particles are indistinguishable in principle. By Jaynes' reasoning, if the particles are experimentally indistinguishable for whatever reason, the Gibbs paradox is resolved; quantum mechanics only provides an assurance that, in the quantum realm, this indistinguishability holds as a matter of principle, rather than being due to an insufficiently refined experimental capability.

Non-extensive entropy of two ideal gases and how to fix it

In this section, we present in rough outline a purely classical derivation of the non-extensive entropy for an ideal gas considered by Gibbs before "correct counting" (indistinguishability of particles) is accounted for. This is followed by a brief discussion of two standard methods for making the entropy extensive. Finally, we present a third method, due to R. Swendsen, for an extensive (additive) result for the entropy of two systems if they are allowed to exchange particles with each other.[5][6]

Setup

We will present a simplified version of the calculation. It differs from the full calculation in three ways:

  1. The ideal gas consists of particles confined to one spatial dimension.
  2. We keep only the terms of order n ln n, dropping all terms of size n or less, where n is the number of particles. For our purposes, this is enough, because this is where the Gibbs paradox shows up and where it must be resolved. The neglected terms play a role when the number of particles is not very large, such as in computer simulation and nanotechnology. Also, they are needed in deriving the Sackur–Tetrode equation.
  3. The subdivision of phase space into units of the Planck constant (h) is omitted. Instead, the entropy is defined using an integral over the "accessible" portion of phase space. This serves to highlight the purely classical nature of the calculation.

We begin with a version of Boltzmann's entropy in which the integrand is all of accessible phase space:

$$S = k\ln\Omega, \qquad \Omega = \int\limits_{\text{accessible}} d^{n}x\, d^{n}v$$

The integral is restricted to the contour of available phase-space regions, subject to conservation of energy. In contrast to the one-dimensional line integrals encountered in elementary physics, the contour of constant energy possesses a vast number of dimensions. The justification for integrating over phase space using the canonical measure involves the assumption of equal probability. The assumption can be made by invoking the ergodic hypothesis as well as Liouville's theorem for Hamiltonian systems.

(The ergodic hypothesis underlies the ability of a physical system to reach thermal equilibrium, but this may not always hold for computer simulations (see the Fermi–Pasta–Ulam–Tsingou problem) or in certain real-world systems such as non-thermal plasmas.)

Liouville's theorem assumes a fixed number of dimensions that the system 'explores'. In calculations of entropy, the number of dimensions is proportional to the number of particles in the system, which forces phase space to abruptly change dimensionality when particles are added or subtracted. This may explain the difficulties in constructing a clear and simple derivation for the dependence of entropy on the number of particles.

For the ideal gas, the accessible phase space is an (n − 1)-sphere (also called a hypersphere) in the n-dimensional v space:

$$\sum_{i=1}^{n} v_i^{2} = R^{2}$$

To recover the paradoxical result that entropy is not extensive, we integrate over phase space for a gas of n monatomic particles confined to a single spatial dimension by 0 ≤ x ≤ ℓ. Since our only purpose is to illuminate a paradox, we simplify notation by taking the particle's mass and the Boltzmann constant equal to unity: m = k = 1. We represent points in phase space and its x and v parts by n and 2n dimensional vectors:

$$\xi = (x, v) \quad \text{where} \quad x = (x_1, \dots, x_n) \quad \text{and} \quad v = (v_1, \dots, v_n)$$

To calculate entropy, we use the fact that the (n − 1)-sphere, ∑ v_i² = R², has an (n − 1)-dimensional "hypersurface volume" of

$$A_{n-1}(R) = \frac{n\,\pi^{n/2}}{(n/2)!}\,R^{\,n-1}$$

For example, if n = 2, the 1-sphere is the circle v₁² + v₂² = R², a "hypersurface" in the plane. When the sphere is even-dimensional (n odd), it will be necessary to use the gamma function to give meaning to the factorial; see below.
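The hypersurface-volume formula can be sanity-checked against the familiar low-dimensional cases, with the gamma function standing in for (n/2)!. A sketch:

```python
import math

# (n-1)-dimensional "hypersurface volume" of the (n-1)-sphere of
# radius R; math.gamma(n/2 + 1) plays the role of (n/2)!.
def sphere_surface(n, R):
    return n * math.pi ** (n / 2) * R ** (n - 1) / math.gamma(n / 2 + 1)

R = 2.0
# n = 2: the 1-sphere is a circle; its "surface" is the circumference.
assert abs(sphere_surface(2, R) - 2 * math.pi * R) < 1e-9
# n = 3: the 2-sphere is an ordinary sphere of area 4*pi*R^2.
assert abs(sphere_surface(3, R) - 4 * math.pi * R ** 2) < 1e-9
print("ok")
```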

Gibbs paradox in a one-dimensional gas

Gibbs paradox arises when entropy is calculated using a 2n-dimensional phase space, where n is also the number of particles in the gas. These particles are spatially confined to the one-dimensional interval 0 ≤ x ≤ ℓ. The volume of the surface of fixed energy is

$$\Omega_{E,\ell} = \int d^{n}x \underbrace{\int d^{n}v}_{\tfrac{1}{2}\sum_i v_i^{2} = E}$$

The subscripts on Ω are used to define the 'state variables' and will be discussed later, when it is argued that the number of particles, n, lacks full status as a state variable in this calculation. The integral over configuration space is ℓⁿ. As indicated by the underbrace, the integral over velocity space is restricted to the "surface area" of the n − 1 dimensional hypersphere of radius √(2E), and is therefore equal to the "area" of that hypersurface. Thus

$$\Omega_{E,\ell} = \ell^{\,n}\,\frac{n\,\pi^{n/2}}{(n/2)!}\,(2E)^{\frac{n-1}{2}}$$

After approximating the factorial and dropping the small terms, we obtain

$$S = \ln\Omega_{E,\ell} \approx n\ln\ell + \frac{n}{2}\ln E - \frac{n}{2}\ln n + C = n\ln\frac{\ell}{n} + \frac{n}{2}\ln\frac{E}{n} + n\ln n + C$$

In the second expression, the term n ln n was subtracted and added, using the fact that ln(ℓ/n) = ln ℓ − ln n. This was done to highlight exactly how the "entropy" defined here fails to be an extensive property of matter. The first two terms are extensive: if the volume of the system doubles, but gets filled with the same density of particles with the same energy, then each of these terms doubles. But the third term, n ln n, is neither extensive nor intensive and is therefore wrong.

The arbitrary constant has been added because entropy can usually be viewed as being defined up to an arbitrary additive constant. This is especially necessary when entropy is defined as the logarithm of a phase space volume measured in units of momentum-position. Any change in how these units are defined will add or subtract a constant from the value of the entropy.

Two standard ways to make the classical entropy extensive

As discussed above, an extensive form of entropy is recovered if we divide the volume of phase space, Ω, by n!. An alternative approach is to argue that the dependence on particle number cannot be trusted, on the grounds that changing n also changes the dimensionality of phase space. Such changes in dimensionality lie outside the scope of Hamiltonian mechanics and Liouville's theorem. For that reason it is plausible to allow the arbitrary constant to be a function of n.[7] Defining the function to be C(n) = −n ln n, we have:

$$S' = n\ln\frac{\ell}{n} + \frac{n}{2}\ln\frac{E}{n}$$

which has extensive scaling:

$$S'(\lambda E, \lambda\ell, \lambda n) = \lambda\, S'(E, \ell, n)$$
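Both repairs can be checked numerically against the one-dimensional phase-space volume ℓⁿ times the hypersphere surface area. A sketch at unit density and fixed energy per particle (illustrative values; `math.lgamma` stands in for the factorials):

```python
import math

# ln of the 1-D gas phase-space volume: configuration part l^n times
# the surface of the (n-1)-sphere of radius sqrt(2E), with m = k = 1.
def log_omega(n, E, length):
    return (n * math.log(length) + math.log(n)
            + (n / 2) * math.log(math.pi)
            + ((n - 1) / 2) * math.log(2 * E)
            - math.lgamma(n / 2 + 1))

n, E, length = 10**5, 0.5e5, 1e5     # unit density, E/n = 1/2
S_raw = log_omega(n, E, length)               # no correction
S_fix = S_raw - math.lgamma(n + 1)            # divide Omega by n!
S_raw2 = log_omega(2 * n, 2 * E, 2 * length)
S_fix2 = S_raw2 - math.lgamma(2 * n + 1)
print(S_fix2 - 2 * S_fix)   # O(ln n): corrected entropy is extensive
print(S_raw2 - 2 * S_raw)   # ~ 2n ln 2: uncorrected entropy is not
```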

Swendsen's particle-exchange approach

Following Swendsen,[5][6] we allow two systems to exchange particles. This essentially 'makes room' in phase space for particles to enter or leave without requiring a change in the number of dimensions of phase space. The total number of particles is N = n_A + n_B:

  • n_A particles have coordinates 0 ≤ x ≤ ℓ_A.
    The total energy of these particles is E_A.
  • n_B particles have coordinates 0 ≤ x ≤ ℓ_B.
    The total energy of these particles is E_B.
  • The system is subject to the constraints n_A + n_B = N and E_A + E_B = E.

Taking the integral over phase space, we have:

$$\Omega = \binom{N}{n_A} \int\limits_{(?)} d^{n_A}x\, d^{n_A}v \int\limits_{(?)} d^{n_B}x\, d^{n_B}v = \binom{N}{n_A}\,\ell_A^{\,n_A}\,\frac{n_A\pi^{n_A/2}}{(n_A/2)!}(2E_A)^{\frac{n_A-1}{2}}\;\ell_B^{\,n_B}\,\frac{n_B\pi^{n_B/2}}{(n_B/2)!}(2E_B)^{\frac{n_B-1}{2}}$$

The question marks (?) serve as a reminder that we may not assume that the first n_A particles (i.e. 1 through n_A) are in system A while the other particles (n_A + 1 through N) are in system B. (This is further discussed in the next section.)

Taking the logarithm and keeping only the largest terms, we have:

$$S = \ln\Omega \approx n_A\ln\frac{\ell_A}{n_A} + \frac{n_A}{2}\ln\frac{E_A}{n_A} + n_B\ln\frac{\ell_B}{n_B} + \frac{n_B}{2}\ln\frac{E_B}{n_B} + N\ln N$$

This can be interpreted as the sum of the entropy of system A and system B, both extensive. And there is a term, N ln N, that is not extensive.

Visualizing the particle-exchange approach in three dimensions

 
A three-particle ideal gas partitioned into two parts

The correct (extensive) formulas for systems A and B were obtained because we included all the possible ways that the two systems could exchange particles. Combinations (i.e. N particles choose n_A) were used to ascertain the number of ways N particles can be divided into system A containing n_A particles and system B containing n_B particles. This counting is justified not on physical grounds, but on the need to integrate over phase space. As will be illustrated below, phase space contains not a single n_A-sphere and a single n_B-sphere, but instead

$$\binom{N}{n_A} = \frac{N!}{n_A!\, n_B!}$$

pairs of n-spheres, all situated in the same N-dimensional velocity space. The integral over accessible phase space must include all of these n-spheres, as can be seen in the figure, which shows the actual velocity phase space associated with a gas that consists of three particles. Moreover, this gas has been divided into two systems, A and B.

If we ignore the spatial variables, the phase space of a gas with three particles is three dimensional, which permits one to sketch the n-spheres over which the integral over phase space must be taken. If all three particles are together, the split between the two gases is 3|0. Accessible phase space is delimited by an ordinary sphere (2-sphere) with a radius that is either √(2E_A) or √(2E_B) (depending on which system has the particles).

If the split is 2|1, then phase space consists of circles and points. Each circle occupies two dimensions, and for each circle, two points lie on the third axis, equidistant from the center of the circle. In other words, if system A has 2 particles, accessible phase space consists of 3 pairs of n-spheres, each pair being a 1-sphere and a 0-sphere:

$$\left\{\, v_1^{2} + v_2^{2} = 2E_A,\quad v_3^{2} = 2E_B \,\right\}$$
$$\left\{\, v_1^{2} + v_3^{2} = 2E_A,\quad v_2^{2} = 2E_B \,\right\}$$
$$\left\{\, v_2^{2} + v_3^{2} = 2E_A,\quad v_1^{2} = 2E_B \,\right\}$$

Note that

$$\binom{3}{2} = 3$$
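The count of (1-sphere, 0-sphere) pairs for the 2|1 split is just "three choose two", which a toy enumeration makes explicit:

```python
import itertools
import math

# Choose which two of the three particles sit in system A; each choice
# yields one (circle, point-pair) = (1-sphere, 0-sphere) in velocity space.
splits = list(itertools.combinations((1, 2, 3), 2))
print(splits)  # [(1, 2), (1, 3), (2, 3)]
assert len(splits) == math.comb(3, 2) == 3
```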

References

  1. ^ a b Gibbs, J. Willard (1875–1878). On the Equilibrium of Heterogeneous Substances. Connecticut Acad. Sci. ISBN 0-8493-9685-9. Reprinted in Gibbs, J. Willard (October 1993). The Scientific Papers of J. Willard Gibbs (Vol. 1). Ox Bow Press. ISBN 0-918024-77-3. and in Gibbs, J. Willard (February 1994). The Scientific Papers of J. Willard Gibbs (Vol. 2). Ox Bow Press. ISBN 1-881987-06-X.
  2. ^ a b c Jaynes, Edwin T (1996). "The Gibbs Paradox" (PDF). Retrieved November 8, 2005.
  3. ^ Grad, Harold (1961). "The many faces of entropy". Communications on Pure and Applied Mathematics. 14 (3): 323–354. doi:10.1002/cpa.3160140312.
  4. ^ van Kampen, N. G. (1984). "The Gibbs Paradox". In W. E. Parry (ed.). Essays in Theoretical Physics in Honor of Dirk ter Haar. Oxford: Pergamon. ISBN 978-0080265230.
  5. ^ a b Swendsen, Robert (March 2006). "Statistical mechanics of colloids and Boltzmann's definition of entropy". American Journal of Physics. 74 (3): 187–190. Bibcode:2006AmJPh..74..187S. doi:10.1119/1.2174962.
  6. ^ a b Swendsen, Robert H. (June 2002). "Statistical Mechanics of Classical Systems with Distinguishable Particles". Journal of Statistical Physics. 107 (5/6): 1143–1166. Bibcode:2002JSP...107.1143S. doi:10.1023/A:1015161825292. S2CID 122463989.
  7. ^ Jaynes, E.T. (1992). The Gibbs Paradox in Maximum Entropy and Bayesian Methods (edited by C.R. Smith, G.J. Erickson, & P.O. Neudorfere) (PDF). Dordrecht, Holland: Kluwer Academic Publishers. pp. 1–22. In particular, Gibbs failed to point out that an 'integration constant' was not an arbitrary constant, but an arbitrary function. But this has, as we shall see, nontrivial physical consequences. What is remarkable is not that Gibbs should have failed to stress a fine mathematical point in almost the last words he wrote; but that for 80 years thereafter all textbook writers (except possibly Pauli) failed to see it.

Further reading

  • Chih-Yuan Tseng & Ariel Caticha (2001). "Yet another resolution of the Gibbs paradox: an information theory approach". In R. L. Fry (ed.). Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP Conference Proceedings. Vol. 617. p. 331. arXiv:cond-mat/0109324. doi:10.1063/1.1477057.
  • Dieks, Dennis (2011). "The Gibbs Paradox Revisited". In Dennis Dieks; Wenceslao J. Gonzalez; Stephan Hartmann; Thomas Uebel; Marcel Weber (eds.). Explanation, Prediction, and Confirmation. The Philosophy of Science in a European Perspective. pp. 367–377. arXiv:1003.0179. doi:10.1007/978-94-007-1180-8_25. ISBN 978-94-007-1179-2. S2CID 118395415.

External links

  • Gibbs paradox and its resolutions – varied collected papers

gibbs, paradox, confused, with, gibbs, phenomenon, statistical, mechanics, semi, classical, derivation, entropy, that, does, take, into, account, indistinguishability, particles, yields, expression, entropy, which, extensive, proportional, amount, substance, q. Not to be confused with Gibbs phenomenon In statistical mechanics a semi classical derivation of entropy that does not take into account the indistinguishability of particles yields an expression for entropy which is not extensive is not proportional to the amount of substance in question This leads to a paradox known as the Gibbs paradox after Josiah Willard Gibbs who proposed this thought experiment in 1874 1875 1 2 The paradox allows for the entropy of closed systems to decrease violating the second law of thermodynamics A related paradox is the mixing paradox If one takes the perspective that the definition of entropy must be changed so as to ignore particle permutation in the thermodynamic limit the paradox is averted Contents 1 Illustration of the problem 2 Calculating the entropy of ideal gas and making it extensive 3 The mixing paradox 4 Non extensive entropy of two ideal gases and how to fix it 4 1 Setup 4 2 Gibbs paradox in a one dimensional gas 4 3 Two standard ways to make the classical entropy extensive 4 4 Swendsen s particle exchange approach 4 5 Visualizing the particle exchange approach in three dimensions 5 References 6 Further reading 7 External linksIllustration of the problem editGibbs himself considered the following problem that arises if the ideal gas entropy is not extensive 1 Two identical containers of an ideal gas sit side by side The gas in container 1 is identical in every respect to the gas in container 2 i e in volume mass temperature pressure etc There is a certain entropy S associated with each container which depends on the volume of each container Now a door in the container wall is opened to allow the gas particles to mix between the containers No macroscopic changes 
occur as the system is in equilibrium The entropy of the gas in the two container system can be easily calculated but if the equation is not extensive the entropy would not be 2S In fact the non extensive entropy quantity defined and studied by Gibbs would predict additional entropy Closing the door then reduces the entropy again to S per box in supposed violation of the Second Law of Thermodynamics As understood by Gibbs 2 and reemphasized more recently 3 4 this is a misapplication of Gibbs non extensive entropy quantity If the gas particles are distinguishable closing the doors will not return the system to its original state many of the particles will have switched containers There is a freedom in what is defined as ordered and it would be a mistake to conclude the entropy had not increased In particular Gibbs non extensive entropy quantity for an ideal gas was not intended for varying numbers of particles The paradox is averted by concluding the indistinguishability at least effective indistinguishability of the particles in the volume This results in the extensive Sackur Tetrode equation for entropy as derived next Calculating the entropy of ideal gas and making it extensive editIn classical mechanics the state of an ideal gas of energy U volume V and with N particles each particle having mass m is represented by specifying the momentum vector p and the position vector x for each particle This can be thought of as specifying a point in a 6N dimensional phase space where each of the axes corresponds to one of the momentum or position coordinates of one of the particles The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy U 1 2 m i 1 N p i x 2 p i y 2 p i z 2 displaystyle U frac 1 2m sum i 1 N p ix 2 p iy 2 p iz 2 nbsp and be contained inside of the volume V let s say V is a cube of side X so that V X3 0 x i j X displaystyle 0 leq x ij leq X nbsp for i 1 N displaystyle i 1 N nbsp and 
j 1 2 3 displaystyle j 1 2 3 nbsp The first constraint defines the surface of a 3N dimensional hypersphere of radius 2mU 1 2 and the second is a 3N dimensional hypercube of volume VN These combine to form a 6N dimensional hypercylinder Just as the area of the wall of a cylinder is the circumference of the base times the height so the area f of the wall of this hypercylinder is ϕ U V N V N 2 p 3 N 2 2 m U 3 N 1 2 G 3 N 2 1 displaystyle phi U V N V N left frac 2 pi frac 3N 2 2mU frac 3N 1 2 Gamma 3N 2 right 1 nbsp The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints In classical physics the number of states is infinitely large but according to quantum mechanics it is finite Before the advent of quantum mechanics this infinity was regularized by making phase space discrete Phase space was divided up in blocks of volume h3N The constant h thus appeared as a result of a mathematical trick and thought to have no physical significance However using quantum mechanics one recovers the same formalism in the semi classical limit but now with h being the Planck constant One can qualitatively see this from Heisenberg s uncertainty principle a volume in N phase space smaller than h3N h is the Planck constant cannot be specified To compute the number of states we must compute the volume in phase space in which the system can be found and divide that by h3N This leads us to another problem The volume seems to approach zero as the region in phase space in which the system can be is an area of zero thickness This problem is an artifact of having specified the energy U with infinite accuracy In a generic system without symmetries a full quantum treatment would yield a discrete non degenerate set of energy eigenstates An exact specification of the energy would then fix the precise state the system is in so the number of states available to the system would be one the entropy would thus be zero When we specify 
the internal energy to be U what we really mean is that the total energy of the gas lies somewhere in an interval of length d U displaystyle delta U nbsp around U Here d U displaystyle delta U nbsp is taken to be very small it turns out that the entropy doesn t depend strongly on the choice of d U displaystyle delta U nbsp for large N This means that the above area f must be extended to a shell of a thickness equal to an uncertainty in momentum d p d 2 m U m 2 U d U displaystyle delta p delta left sqrt 2mU right sqrt frac m 2U delta U nbsp so the entropy is given by S k ln ϕ d p h 3 N displaystyle left right S k ln phi delta p h 3N nbsp where the constant of proportionality is k the Boltzmann constant Using Stirling s approximation for the Gamma function which omits terms of less than order N the entropy for large N becomes S k N ln V U N 3 2 3 2 k N 1 ln 4 p m 3 h 2 displaystyle S kN ln left V left frac U N right frac 3 2 right frac 3 2 kN left 1 ln frac 4 pi m 3h 2 right nbsp This quantity is not extensive as can be seen by considering two identical volumes with the same particle number and the same energy Suppose the two volumes are separated by a barrier in the beginning Removing or reinserting the wall is reversible but the entropy increases when the barrier is removed by the amount d S k 2 N ln 2 V N ln V N ln V 2 k N ln 2 gt 0 displaystyle delta S k left 2N ln 2V N ln V N ln V right 2kN ln 2 gt 0 nbsp which is in contradiction to thermodynamics if you re insert the barrier This is the Gibbs paradox The paradox is resolved by postulating that the gas particles are in fact indistinguishable This means that all states that differ only by a permutation of particles should be considered as the same state For example if we have a 2 particle gas and we specify AB as a state of the gas where the first particle A has momentum p1 and the second particle B has momentum p2 then this state as well as the BA state where the B particle has momentum p1 and the A particle 
has momentum p2 should be counted as the same state For an N particle gas there are N states which are identical in this sense if one assumes that each particle is in a different single particle state One can safely make this assumption provided the gas isn t at an extremely high density Under normal conditions one can thus calculate the volume of phase space occupied by the gas by dividing Equation 1 by N Using the Stirling approximation again for large N ln N N ln N N the entropy for large N is S k N ln V N U N 3 2 k N 5 2 3 2 ln 4 p m 3 h 2 displaystyle S kN ln left left frac V N right left frac U N right frac 3 2 right kN left frac 5 2 frac 3 2 ln frac 4 pi m 3h 2 right nbsp which can be easily shown to be extensive This is the Sackur Tetrode equation The mixing paradox editA closely related paradox to the Gibbs paradox is the mixing paradox The Gibbs paradox is a special case of the mixing paradox which contains all the salient features The difference is that the mixing paradox deals with arbitrary distinctions in the two gases not just distinctions in particle ordering as Gibbs had considered In this sense it is a straightforward generalization to the argument laid out by Gibbs Again take a box with a partition in it with gas A on one side gas B on the other side and both gases are at the same temperature and pressure If gas A and B are different gases there is an entropy that arises once the gases are mixed If the gases are the same no additional entropy is calculated The additional entropy from mixing does not depend on the character of the gases it only depends on the fact that the gases are different The two gases may be arbitrarily similar but the entropy from mixing does not disappear unless they are the same gas a paradoxical discontinuity This paradox can be explained by carefully considering the definition of entropy In particular as concisely explained by Edwin Thompson Jaynes 2 definitions of entropy are arbitrary As a central example in Jaynes 
As a central example, Jaynes' paper points out that one can develop a theory that treats two gases as similar even if those gases may in reality be distinguished through sufficiently detailed measurement. As long as we do not perform these detailed measurements, the theory will have no internal inconsistencies. In other words, it does not matter that we call gases A and B by the same name if we have not yet discovered that they are distinct. If our theory calls gases A and B the same, then entropy does not change when we mix them. If our theory calls gases A and B different, then entropy does increase when they are mixed. This insight suggests that the ideas of "thermodynamic state" and of "entropy" are somewhat subjective.

The differential increase in entropy ($dS$) as a result of mixing dissimilar gases, multiplied by the temperature ($T$), equals the minimum amount of work we must do to restore the gases to their original separated state. Suppose that two gases are different, but that we are unable to detect their differences. If these gases are in a box, segregated from one another by a partition, how much work does it take to restore the system's original state after we remove the partition and let the gases mix? None: simply reinsert the partition. Even though the gases have mixed, there was never a detectable change of state in the system, because by hypothesis the gases are experimentally indistinguishable.

As soon as we can distinguish the gases, the work necessary to recover the pre-mixing macroscopic configuration from the post-mixing state becomes nonzero. This amount of work does not depend on how different the gases are, but only on whether they are distinguishable.

This line of reasoning is particularly informative when considering the concepts of indistinguishable particles and correct Boltzmann counting. Boltzmann's original expression for the number of states available to a gas assumed that a state could be expressed in terms of a number of energy "sublevels", each of which contains a particular number of
particles. While the particles in a given sublevel were considered indistinguishable from each other, particles in different sublevels were considered distinguishable from particles in any other sublevel. This amounts to saying that the exchange of two particles in two different sublevels will result in a detectably different "exchange macrostate" of the gas. For example, if we consider a simple gas with $N$ particles, at sufficiently low density that it is practically certain that each sublevel contains either one particle or none (i.e. a Maxwell–Boltzmann gas), this means that a simple container of gas will be in one of $N!$ detectably different exchange macrostates, one for each possible particle exchange.

Just as the mixing paradox begins with two detectably different containers, and the extra entropy that results upon mixing is proportional to the average amount of work needed to restore that initial state after mixing, so the extra entropy in Boltzmann's original derivation is proportional to the average amount of work required to restore the simple gas from some exchange macrostate to its original exchange macrostate. If we assume that there is in fact no experimentally detectable difference between these exchange macrostates, then using the entropy which results from assuming the particles are indistinguishable will yield a consistent theory. This is "correct Boltzmann counting".

It is often said that the resolution of the Gibbs paradox derives from the fact that, according to quantum theory, like particles are indistinguishable in principle. By Jaynes' reasoning, if the particles are experimentally indistinguishable for whatever reason, the Gibbs paradox is resolved, and quantum mechanics only provides an assurance that, in the quantum realm, this indistinguishability holds as a matter of principle, rather than being due to an insufficiently refined experimental capability.

Non-extensive entropy of two ideal gases and how to fix it

In this section we present in rough outline a
purely classical derivation of the non-extensive entropy for an ideal gas considered by Gibbs before "correct counting" (indistinguishability of particles) is accounted for. This is followed by a brief discussion of two standard methods for making the entropy extensive. Finally, we present a third method, due to R. Swendsen, for an extensive (additive) result for the entropy of two systems if they are allowed to exchange particles with each other.[5][6]

Setup

We will present a simplified version of the calculation. It differs from the full calculation in three ways:

1. The ideal gas consists of particles confined to one spatial dimension.
2. We keep only the terms of order $n\log n$, dropping all terms of size $n$ or less, where $n$ is the number of particles. For our purposes, this is enough, because this is where the Gibbs paradox shows up and where it must be resolved. The neglected terms play a role when the number of particles is not very large, such as in computer simulation and nanotechnology. Also, they are needed in deriving the Sackur–Tetrode equation.
3. The subdivision of phase space into units of the Planck constant $h$ is omitted. Instead, the entropy is defined using an integral over the "accessible" portion of phase space. This serves to highlight the purely classical nature of the calculation.

We begin with a version of Boltzmann's entropy in which the integrand is all of accessible phase space:

$$S = k\ln\Omega = k\,\ln \oint\limits_{H(\vec p,\vec q)=E} d\vec p^{\,n}\, d\vec q^{\,n}.$$

The integral is restricted to a contour of available regions of phase space, subject to conservation of energy. In contrast to the one-dimensional line integrals encountered in elementary physics, the contour of constant energy possesses a vast number of dimensions. The justification for integrating over phase space using the canonical measure involves the assumption of equal probability. The assumption can be made by invoking the ergodic hypothesis as well as the
Liouville theorem of Hamiltonian systems. The ergodic hypothesis underlies the ability of a physical system to reach thermal equilibrium, but this may not always hold for computer simulations (see the Fermi–Pasta–Ulam–Tsingou problem) or in certain real-world systems such as non-thermal plasmas.

Liouville's theorem assumes a fixed number of dimensions that the system "explores". In calculations of entropy, the number of dimensions is proportional to the number of particles in the system, which forces phase space to abruptly change dimensionality when particles are added or subtracted. This may explain the difficulties in constructing a clear and simple derivation for the dependence of entropy on the number of particles.

For the ideal gas, the accessible phase space is an $(n-1)$-sphere (also called a hypersphere) in the $n$-dimensional $\vec v$ space:

$$E = \sum_{j=1}^n \frac{1}{2} m v_j^2.$$

To recover the paradoxical result that entropy is not extensive, we integrate over phase space for a gas of $n$ monatomic particles confined to a single spatial dimension by $0 < x < \ell$. Since our only purpose is to illuminate a paradox, we simplify notation by taking the particle's mass and the Boltzmann constant equal to unity: $m = k = 1$. We represent points in phase space, and its $x$ and $v$ parts, by $n$- and $2n$-dimensional vectors:

$$\vec\xi = (x_1,\dots,x_n,v_1,\dots,v_n) = (\vec x,\vec v),\qquad \vec x = (x_1,\dots,x_n),\qquad \vec v = (v_1,\dots,v_n).$$

To calculate entropy, we use the fact that the $(n-1)$-sphere, $\sum v_j^2 = R^2$, has an $(n-1)$-dimensional hypersurface volume of

$$\tilde A_n(R) = \frac{n\,\pi^{n/2}}{(n/2)!}\,R^{n-1}.$$

For example, if $n = 2$, the 1-sphere is the circle $\tilde A_2(R) = 2\pi R$, a hypersurface in the plane.
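This area formula can be checked numerically against familiar low-dimensional cases; for odd $n$ the factorial is interpreted through the gamma function, $(n/2)! = \Gamma(n/2+1)$. A minimal sketch:

```python
import math

def hypersphere_area(n, R):
    """Hypersurface volume of the (n-1)-sphere of radius R in n dimensions:
    A_n(R) = n * pi^(n/2) / (n/2)! * R^(n-1), with (n/2)! = gamma(n/2 + 1)."""
    return n * math.pi ** (n / 2) / math.gamma(n / 2 + 1) * R ** (n - 1)

R = 2.0
print(hypersphere_area(2, R))  # 2*pi*R: circumference of a circle
print(hypersphere_area(3, R))  # 4*pi*R^2: surface area of an ordinary sphere
```

For $n = 2$ this reproduces the circumference of a circle, and for $n = 3$ the area of an ordinary sphere, as claimed above.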
When the sphere is even-dimensional ($n$ odd), it is necessary to use the gamma function to give meaning to the factorial; see below.

Gibbs paradox in a one-dimensional gas

The Gibbs paradox arises when entropy is calculated using an $n$-dimensional phase space, where $n$ is also the number of particles in the gas. These particles are spatially confined to the one-dimensional interval of length $\ell$. The volume of the surface of fixed energy is

$$\Omega_{E,\ell} = \left(\int dx_1 \int dx_2 \cdots \int dx_n\right)\left(\,\underbrace{\oint dv_1 \oint dv_2 \cdots \oint dv_n}_{\Sigma v_i^2 = 2E}\,\right).$$

The subscripts on $\Omega$ are used to define the state variables; it will be argued later that the number of particles, $n$, lacks full status as a state variable in this calculation. The integral over configuration space is $\ell^n$. As indicated by the underbrace, the integral over velocity space is restricted to the surface area of the $(n-1)$-dimensional hypersphere of radius $\sqrt{2E}$, and is therefore equal to the "area" of that hypersurface. Thus

$$\Omega_{E,\ell} = \ell^n\,\frac{n\,\pi^{n/2}}{(n/2)!}\,(2E)^{\frac{n-1}{2}}.$$

The algebraic steps are as follows. Taking the logarithm,

$$\ln\Omega_{E,\ell} = \ln\!\left[\ell^n\, n\,\pi^{n/2}\,(2E)^{\frac{n-1}{2}}\right] - \ln\!\left(\frac{n}{2}\right)!.$$

Both terms on the right-hand side have dominant contributions. Using the Stirling approximation for large $M$,

$$\ln M! \approx M\ln M - M + \ln\sqrt{2\pi M},$$

we have

$$\ln\!\left[\ell^n\, n\,\pi^{n/2}\,(2E)^{\frac{n-1}{2}}\right] = \underbrace{\ln\!\left(\ell^n E^{n/2}\right)}_{\text{important}} + \underbrace{\ln\!\left(\frac{n\,(2\pi)^{n/2}}{\sqrt{2E}}\right)}_{\text{drop}}$$

and

$$\ln\!\left(\frac{n}{2}\right)! \approx \frac{n}{2}\ln\frac{n}{2} - \frac{n}{2} + \ln\sqrt{n\pi} = \underbrace{\frac{n}{2}\ln n}_{\text{keep}}\ \underbrace{-\ \frac{n}{2}\ln 2 - \frac{n}{2} + \ln\sqrt{n\pi}}_{\text{drop}}.$$

Terms are neglected when they vary less strongly with a parameter than the terms kept, and only terms that vary with the same parameter are compared. Entropy is defined with an arbitrary additive constant because the area in phase space depends on what units are used; for that reason, it does not matter whether entropy is large or small for a given value of $E$. What we seek instead is how entropy varies with $E$, i.e. $\partial S/\partial E$. An expression such as $E^{1/2}$ is much less important than an expression like $E^{n/2}$, and an expression like $\pi^n$ is much less important than one like $n^n$. (Note that the logarithm is not a strongly increasing function: the neglect of terms proportional to $n$, compared with terms proportional to $n\ln n$, is only justified if $n$ is extremely large.) Combining the important terms:

$$\ln\Omega_{E,\ell} \approx \ln\!\left(\ell^n E^{n/2}\right) - \frac{n}{2}\ln n = n\ln\ell + \frac{n}{2}\ln\frac{E}{n}.$$

After approximating the factorial and dropping the small terms, we obtain

$$\ln\Omega_{E,\ell} \approx n\ln\ell + n\ln\sqrt{\frac{E}{n}} + \text{const} = \underbrace{n\ln\frac{\ell}{n} + n\ln\sqrt{\frac{E}{n}}}_{\text{extensive}} + n\ln n + \text{const}.$$

In the second expression, the term $n\ln n$ was subtracted and added, using the fact that $\ln\ell = \ln n + \ln(\ell/n)$. This was done to highlight exactly how the entropy defined here fails to be an extensive property of matter.
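The bookkeeping above, namely that the dropped terms are of order $n$ while the kept terms are of order $n\ln n$, can be illustrated numerically. In the sketch below (with the assumed, arbitrary choice $\ell = E = 1$), the discrepancy per particle between the exact $\ln\Omega_{E,\ell}$ and the kept terms approaches a constant, while the kept terms themselves grow in magnitude like $\ln n$:

```python
import math
from math import lgamma, log, pi

def ln_omega_exact(n, ell, E):
    # ln of ell^n * n * pi^(n/2) / (n/2)! * (2E)^((n-1)/2), factorial via lgamma
    return n * log(ell) + log(n) + (n / 2) * log(pi) \
        + ((n - 1) / 2) * log(2 * E) - lgamma(n / 2 + 1)

def ln_omega_kept(n, ell, E):
    # only the "important" terms of order n log n: n ln(ell) + (n/2) ln(E/n)
    return n * log(ell) + (n / 2) * log(E / n)

for n in (10 ** 3, 10 ** 5, 10 ** 7):
    exact, kept = ln_omega_exact(n, 1.0, 1.0), ln_omega_kept(n, 1.0, 1.0)
    # per-particle: the difference settles to a constant, the kept part keeps growing
    print(n, (exact - kept) / n, kept / n)
```

The per-particle difference settles near a constant of order one, confirming that the neglected terms are proportional to $n$; the remark in the text that this neglect requires extremely large $n$ shows up as the slow, logarithmic growth of the kept terms.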
The first two terms are extensive: if the volume of the system doubles, but gets filled with the same density of particles with the same energy, then each of these terms doubles. But the third term, $n\ln n$, is neither extensive nor intensive, and is therefore wrong.

The arbitrary constant has been added because entropy can usually be viewed as being defined only up to an arbitrary additive constant. This is especially necessary when entropy is defined as the logarithm of a phase-space volume measured in units of momentum times position: any change in how these units are defined will add or subtract a constant from the value of the entropy.

Two standard ways to make the classical entropy extensive

As discussed above, an extensive form of entropy is recovered if we divide the volume of phase space, $\Omega_{E,\ell}$, by $n!$. An alternative approach is to argue that the dependence on particle number cannot be trusted, on the grounds that changing $n$ also changes the dimensionality of phase space. Such changes in dimensionality lie outside the scope of Hamiltonian mechanics and Liouville's theorem. For that reason, it is plausible to allow the arbitrary constant to be a function of $n$.[7] Defining the function to be $f(n) = -\frac{3}{2} n \ln n$, we have

$$S = \ln\Omega_{E,\ell} \approx n\ln\ell + n\ln\sqrt{E} + \text{const} = n\ln\ell + n\ln\sqrt{E} + f(n) = \ln\Omega_{E,\ell,n} \approx n\ln\frac{\ell}{n} + n\ln\sqrt{\frac{E}{n}} + \text{const},$$

which has the extensive scaling

$$S(\alpha E, \alpha\ell, \alpha n) = \alpha\, S(E, \ell, n).$$

Swendsen's particle-exchange approach

Following Swendsen,[5][6] we allow two systems to exchange particles. This essentially makes room in phase space for particles to enter or leave without requiring a change in the number of dimensions of phase space. The total number of particles is $N$.
Of these, $n_A$ particles have coordinates $0 < x < \ell_A$, and the total energy of these particles is $E_A$; the remaining $n_B$ particles have coordinates $0 < x < \ell_B$, with total energy $E_B$. The system is subject to the constraints $E_A + E_B = E$ and $n_A + n_B = N$.

Taking the integral over phase space, we have

$$\Omega_{E,\ell,N} = \underbrace{\int dx\cdots\int dx}_{n_A\ \text{terms}}\ \underbrace{\int dx\cdots\int dx}_{n_B\ \text{terms}}\ \underbrace{\oint dv_1\, dv_2\cdots dv_N}_{\Sigma v^2 = 2E_A\ \text{or}\ \Sigma v^2 = 2E_B} = \left(\ell_A\right)^{n_A}\left(\ell_B\right)^{n_B}\ \underbrace{\frac{N!}{n_A!\,n_B!}}_{\text{combination?}}\ \underbrace{\frac{n_A\,\pi^{n_A/2}}{(n_A/2)!}\left(2E_A\right)^{\frac{n_A-1}{2}}}_{n_A\text{-sphere?}}\ \underbrace{\frac{n_B\,\pi^{n_B/2}}{(n_B/2)!}\left(2E_B\right)^{\frac{n_B-1}{2}}}_{n_B\text{-sphere?}}.$$

The question marks serve as a reminder that we may not assume that the first $n_A$ particles (i.e. particles 1 through $n_A$) are in system A while the remaining particles are in system B; this is discussed further in the next section.

Taking the logarithm and keeping only the largest terms, we have

$$S = \ln\Omega_{E,\ell,N} \approx n_A\ln\!\left[\frac{\ell_A}{n_A}\sqrt{\frac{E_A}{n_A}}\right] + n_B\ln\!\left[\frac{\ell_B}{n_B}\sqrt{\frac{E_B}{n_B}}\right] + N\ln N + \text{const}.$$

This can be interpreted as the sum of the entropies of system A and system B, both extensive, plus a term $N\ln N$ that is not extensive.
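The combinatorial factor $N!/(n_A!\,n_B!)$ in the integral above counts the ways of choosing which of the $N$ particle labels belong to system A. A minimal sketch with $N = 3$ and $n_A = 2$:

```python
from itertools import combinations
from math import comb

velocities = ("v1", "v2", "v3")  # velocity coordinates of a 3-particle gas
n_A = 2

# Each choice of which two particles sit in system A selects a different
# region of velocity space over which the energy constraints apply.
splits = list(combinations(velocities, n_A))
print(splits)        # [('v1', 'v2'), ('v1', 'v3'), ('v2', 'v3')]
print(comb(3, n_A))  # 3 = 3!/(2! 1!)
```

All three choices must be included in the phase-space integral, which is the counting the visualization below makes concrete.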
Visualizing the particle exchange approach in three dimensions

[Figure: a three-particle ideal gas partitioned into two parts.]

The correct (extensive) formulas for systems A and B were obtained because we included all the possible ways that the two systems could exchange particles. The use of combinations (i.e. $N$ particles, choose $n_A$) served to count the number of ways that $N$ particles can be divided into system A containing $n_A$ particles and system B containing $n_B$ particles. This counting is justified not on physical grounds, but on the need to integrate over phase space. As will be illustrated below, phase space contains not a single $n_A$-sphere and a single $n_B$-sphere, but instead

$$\binom{N}{n_A} = \frac{N!}{n_A!\,n_B!}$$

pairs of $n$-spheres, all situated in the same $N$-dimensional velocity space. The integral over accessible phase space must include all of these pairs of $n$-spheres, as can be seen in the figure, which shows the actual velocity phase space associated with a gas that consists of three particles. Moreover, this gas has been divided into two systems, A and B.

If we ignore the spatial variables, the phase space of a gas with three particles is three-dimensional, which permits one to sketch the $n$-spheres over which the integral over phase space must be taken. If all three particles are together, the split between the two gases is 3|0. Accessible phase space is delimited by an ordinary sphere (2-sphere) with a radius that is either $\sqrt{2E_1}$ or $\sqrt{2E_2}$, depending on which system has the particles.

If the split is 2|1, then phase space consists of circles and points. Each circle occupies two dimensions, and for each circle, two points lie on the third axis, equidistant from the center of the circle. In other words, if system A has 2 particles, accessible phase space consists of 3 pairs of $n$-spheres, each pair being a 1-sphere and a 0-sphere:

$$v_1^2 + v_2^2 = 2E_A,\qquad v_3^2 = 2E_B$$
$$v_2^2 + v_3^2 = 2E_A,\qquad v_1^2 = 2E_B$$
$$v_3^2 + v_1^2 = 2E_A,\qquad v_2^2 = 2E_B$$

Note that

$$\binom{3}{2} = 3.$$

References

1. Gibbs, J. Willard (1875–1878). On the Equilibrium of Heterogeneous Substances. Connecticut Acad. Sci. ISBN 0-8493-9685-9. Reprinted in Gibbs, J. Willard (October 1993). The Scientific Papers of J. Willard Gibbs, Vol. 1. Ox Bow Press. ISBN 0-918024-77-3, and in Gibbs, J. Willard (February 1994). The Scientific Papers of J. Willard Gibbs, Vol. 2. Ox Bow Press. ISBN 1-881987-06-X.
2. Jaynes, Edwin T. (1996). "The Gibbs Paradox" (PDF). Retrieved November 8, 2005.
3. Grad, Harold (1961). "The many faces of entropy". Communications on Pure and Applied Mathematics 14 (3): 323–354. doi:10.1002/cpa.3160140312.
4. van Kampen, N. G. (1984). "The Gibbs Paradox". In W. E. Parry (ed.), Essays in Theoretical Physics in Honor of Dirk ter Haar. Oxford: Pergamon. ISBN 978-0080265230.
5. Swendsen, Robert (March 2006). "Statistical mechanics of colloids and Boltzmann's definition of entropy". American Journal of Physics 74 (3): 187–190. Bibcode:2006AmJPh..74..187S. doi:10.1119/1.2174962.
6. Swendsen, Robert H. (June 2002). "Statistical Mechanics of Classical Systems with Distinguishable Particles". Journal of Statistical Physics 107 (5–6): 1143–1166. Bibcode:2002JSP...107.1143S. doi:10.1023/A:1015161825292. S2CID 122463989.
7. Jaynes, E. T. (1992). "The Gibbs Paradox". In C. R. Smith, G. J. Erickson & P. O. Neudorfer (eds.), Maximum Entropy and Bayesian Methods (PDF). Dordrecht, Holland: Kluwer Academic Publishers. pp. 1–22. "In particular, Gibbs failed to point out that an 'integration constant' was not an arbitrary constant, but an arbitrary function. But this has, as we shall see, nontrivial physical consequences. What is remarkable is not that Gibbs should have failed to stress a fine mathematical point in almost the last words he wrote, but that for 80 years thereafter all textbook writers (except possibly Pauli) failed to see it."

Further reading

- Chih-Yuan Tseng & Ariel Caticha (2001). "Yet another resolution of the Gibbs paradox: an information theory approach". In R. L. Fry (ed.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP Conference Proceedings, Vol. 617, p. 331. arXiv:cond-mat/0109324. doi:10.1063/1.1477057.
- Dieks, Dennis (2011). "The Gibbs Paradox Revisited". In Dennis Dieks, Wenceslao J. Gonzalez, Stephan Hartmann, Thomas Uebel & Marcel Weber (eds.), Explanation, Prediction, and Confirmation: The Philosophy of Science in a European Perspective. pp. 367–377. arXiv:1003.0179. doi:10.1007/978-94-007-1180-8_25. ISBN 978-94-007-1179-2. S2CID 118395415.

External links

- "Gibbs paradox and its resolutions": varied collected papers.
