Maximum entropy thermodynamics

In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data (e.g., image reconstruction, signal processing, spectral analysis, and inverse problems). MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in Physical Review in 1957.[1][2]

Maximum Shannon entropy

Central to the MaxEnt thesis is the principle of maximum entropy. It takes as given a partly specified model and some specified data related to the model, and selects a preferred probability distribution to represent the model. The given data state "testable information"[3][4] about the probability distribution, for example particular expectation values, but are not in themselves sufficient to determine it uniquely. The principle states that one should prefer the distribution which maximizes the Shannon information entropy,

$$ S_\text{I} = - \sum_i p_i \ln p_i $$

This is known as the Gibbs algorithm, introduced by J. Willard Gibbs in 1878 to set up statistical ensembles to predict the properties of thermodynamic systems at equilibrium. It is the cornerstone of the statistical mechanical analysis of the thermodynamic properties of equilibrium systems (see partition function).
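
To make the optimization concrete, here is a minimal Python sketch of the Gibbs algorithm for a toy discrete system (the energy levels and target expectation value are illustrative assumptions, not taken from the article): the entropy-maximizing distribution subject to a fixed expectation value of the energy is the canonical one, p_i ∝ exp(−βE_i), so the computation reduces to solving for the Lagrange multiplier β.

    # Minimal sketch of the Gibbs algorithm on a toy system: maximize
    # S_I = -sum_i p_i ln p_i subject to a fixed expectation value <E>.
    # The maximizer is the canonical family p_i ~ exp(-beta * E_i);
    # we solve numerically for the Lagrange multiplier beta.
    import numpy as np
    from scipy.optimize import brentq

    E = np.array([0.0, 1.0, 2.0, 3.0])  # hypothetical energy levels
    E_target = 1.2                      # specified expectation value <E>

    def mean_energy(beta):
        w = np.exp(-beta * E)
        return (w @ E) / w.sum()

    beta = brentq(lambda b: mean_energy(b) - E_target, -50.0, 50.0)
    p = np.exp(-beta * E)
    p /= p.sum()
    S_I = -np.sum(p * np.log(p))        # the maximized Shannon entropy
    print(beta, p, S_I)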

A direct connection is thus made between the equilibrium thermodynamic entropy STh, a state function of pressure, volume, temperature, etc., and the information entropy for the predicted distribution with maximum uncertainty conditioned only on the expectation values of those variables:

$$ S_\text{Th}(P, V, T, \ldots)_\text{eqm} = k_\text{B} \, S_\text{I}(P, V, T, \ldots) $$

kB, the Boltzmann constant, has no fundamental physical significance here, but is necessary to retain consistency with the previous historical definition of entropy by Clausius (1865) (see Boltzmann constant).
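
As a numerical illustration of this conversion (the two-state example is an assumption chosen for concreteness): a system about which nothing is known beyond the existence of two equally likely states has S_I = ln 2, giving a thermodynamic entropy of k_B ln 2 ≈ 9.57 × 10⁻²⁴ J/K.

    # Illustrative conversion from information entropy (in nats) to
    # thermodynamic entropy (J/K) via the Boltzmann constant.
    import numpy as np

    k_B = 1.380649e-23              # J/K, exact since the 2019 SI redefinition
    p = np.array([0.5, 0.5])        # maximum-uncertainty two-state example
    S_I = -np.sum(p * np.log(p))    # = ln 2 nats
    S_Th = k_B * S_I                # ~ 9.57e-24 J/K
    print(S_Th)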

However, the MaxEnt school argue that the MaxEnt approach is a general technique of statistical inference, with applications far beyond this. It can therefore also be used to predict a distribution for "trajectories" Γ "over a period of time" by maximising:

$$ S_\text{I} = - \sum_\Gamma p_\Gamma \ln p_\Gamma $$

This "information entropy" does not necessarily have a simple correspondence with thermodynamic entropy. But it can be used to predict features of nonequilibrium thermodynamic systems as they evolve over time.

For non-equilibrium scenarios, in an approximation that assumes local thermodynamic equilibrium, the maximum entropy approach yields the Onsager reciprocal relations and the Green–Kubo relations directly. The approach also creates a theoretical framework for the study of some very special cases of far-from-equilibrium scenarios, making the derivation of the entropy production fluctuation theorem straightforward. For non-equilibrium processes, a general definition of entropy is lacking for microscopic statistical mechanical accounts, just as it is for macroscopic descriptions.

Technical note: For the reasons discussed in the article differential entropy, the simple definition of Shannon entropy ceases to be directly applicable for random variables with continuous probability distribution functions. Instead the appropriate quantity to maximize is the "relative information entropy",

$$ H_\text{c} = - \int p(x) \log \frac{p(x)}{m(x)} \, dx $$

Hc is the negative of the Kullback–Leibler divergence, or discrimination information, of m(x) from p(x), where m(x) is a prior invariant measure for the variable(s). The relative entropy Hc is never greater than zero, and can be thought of as (the negative of) the number of bits of uncertainty lost by fixing on p(x) rather than m(x). Unlike the Shannon entropy, the relative entropy Hc has the advantage of remaining finite and well-defined for continuous x, and invariant under 1-to-1 coordinate transformations. The two expressions coincide for discrete probability distributions, if one can make the assumption that m(xi) is uniform – i.e. the principle of equal a priori probability, which underlies statistical thermodynamics.
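
A small numerical check of these properties (the Gaussian densities and the grid are illustrative assumptions): discretizing the integral shows Hc ≤ 0 for p ≠ m, and Hc = 0 when p coincides with the prior measure m.

    # Relative information entropy H_c = -D_KL(p || m) on a grid,
    # illustrating H_c <= 0, with equality when p(x) = m(x).
    import numpy as np

    x = np.linspace(-5.0, 5.0, 2001)
    dx = x[1] - x[0]

    def gauss(mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    m = gauss(0.0, 1.0)                 # prior invariant measure m(x)
    p = gauss(0.5, 0.8)                 # candidate distribution p(x)

    H_c = -np.sum(p * np.log(p / m)) * dx        # strictly negative here
    H_c_same = -np.sum(m * np.log(m / m)) * dx   # zero when p = m
    print(H_c, H_c_same)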

Philosophical implications

Adherents to the MaxEnt viewpoint take a clear position on some of the conceptual/philosophical questions in thermodynamics. This position is sketched below.

The nature of the probabilities in statistical mechanics

Jaynes (1985,[5] 2003,[6] et passim) discussed the concept of probability. According to the MaxEnt viewpoint, the probabilities in statistical mechanics are determined jointly by two factors: by specified particular models for the underlying state space (e.g. Liouvillian phase space), and by specified particular partial descriptions of the system (the macroscopic description of the system used to constrain the MaxEnt probability assignment). The probabilities are objective in the sense that, given these inputs, a uniquely defined probability distribution will result, the same for every rational investigator, independent of the subjectivity or arbitrary opinion of particular persons. The probabilities are epistemic in the sense that they are defined in terms of specified data and derived from those data by definite and objective rules of inference, the same for every rational investigator.[7] Here the word epistemic, which refers to objective and impersonal scientific knowledge, the same for every rational investigator, is used in the sense that contrasts it with opiniative, which refers to the subjective or arbitrary beliefs of particular persons; this contrast was used by Plato and Aristotle, and remains valid today.

Jaynes also used the word 'subjective' in this context because others have used it in this context. He accepted that in a sense, a state of knowledge has a subjective aspect, simply because it refers to thought, which is a mental process. But he emphasized that the principle of maximum entropy refers only to thought which is rational and objective, independent of the personality of the thinker. In general, from a philosophical viewpoint, the words 'subjective' and 'objective' are not contradictory; often an entity has both subjective and objective aspects. Jaynes explicitly rejected the criticism of some writers that, just because one can say that thought has a subjective aspect, thought is automatically non-objective. He explicitly rejected subjectivity as a basis for scientific reasoning, the epistemology of science; he required that scientific reasoning have a fully and strictly objective basis.[8] Nevertheless, critics continue to attack Jaynes, alleging that his ideas are "subjective". One writer even goes so far as to label Jaynes' approach as "ultrasubjectivist",[9] and to mention "the panic that the term subjectivism created amongst physicists".[10]

The probabilities represent both the degree of knowledge and lack of information in the data and the model used in the analyst's macroscopic description of the system, and also what those data say about the nature of the underlying reality.

The fitness of the probabilities depends on whether the constraints of the specified macroscopic model are a sufficiently accurate and/or complete description of the system to capture all of the experimentally reproducible behavior. This cannot be guaranteed, a priori. For this reason MaxEnt proponents also call the method predictive statistical mechanics. The predictions can fail. But if they do, this is informative, because it signals the presence of new constraints needed to capture reproducible behavior in the system, which had not been taken into account.

Is entropy "real"?

The thermodynamic entropy (at equilibrium) is a function of the state variables of the model description. It is therefore as "real" as the other variables in the model description. If the model constraints in the probability assignment are a "good" description, containing all the information needed to predict reproducible experimental results, then that includes all of the results one could predict using the formulae involving entropy from classical thermodynamics. To that extent, the MaxEnt STh is as "real" as the entropy in classical thermodynamics.

Of course, in reality there is only one real state of the system. The entropy is not a direct function of that state. It is a function of the real state only through the (subjectively chosen) macroscopic model description.

Is ergodic theory relevant?

The Gibbsian ensemble idealizes the notion of repeating an experiment again and again on different systems, not again and again on the same system. So long-term time averages and the ergodic hypothesis, despite the intense interest in them in the first part of the twentieth century, are strictly speaking not relevant to the probability assignment for the state one might find the system in.

However, this changes if there is additional knowledge that the system is being prepared in a particular way some time before the measurement. One must then consider whether this gives further information which is still relevant at the time of measurement. The question of how 'rapidly mixing' different properties of the system are then becomes very much of interest. Information about some degrees of freedom of the combined system may become unusable very quickly; information about other properties of the system may go on being relevant for a considerable time.

If nothing else, the medium and long-run time correlation properties of the system are interesting subjects for experimentation in themselves. Failure to accurately predict them is a good indicator that relevant macroscopically determinable physics may be missing from the model.

The second law

According to Liouville's theorem for Hamiltonian dynamics, the hyper-volume of a cloud of points in phase space remains constant as the system evolves. Therefore, the information entropy must also remain constant, if we condition on the original information, and then follow each of those microstates forward in time:

$$ \Delta S_\text{I} = 0 $$

However, as time evolves, that initial information we had becomes less directly accessible. Instead of being easily summarizable in the macroscopic description of the system, it increasingly relates to very subtle correlations between the positions and momenta of individual molecules. (Compare to Boltzmann's H-theorem.) Equivalently, it means that the probability distribution for the whole system, in 6N-dimensional phase space, becomes increasingly irregular, spreading out into long thin fingers rather than the initial tightly defined volume of possibilities.

Classical thermodynamics is built on the assumption that entropy is a state function of the macroscopic variables—i.e., that none of the history of the system matters, so that it can all be ignored.

The extended, wispy, evolved probability distribution, which still has the initial Shannon entropy STh(1), should reproduce the expectation values of the observed macroscopic variables at time t2. However it will no longer necessarily be a maximum entropy distribution for that new macroscopic description. On the other hand, the new thermodynamic entropy STh(2) assuredly will measure the maximum entropy distribution, by construction. Therefore, we expect:

$$ S_\text{Th}(2) \geq S_\text{Th}(1) $$

At an abstract level, this result implies that some of the information we originally had about the system has become "no longer useful" at a macroscopic level. At the level of the 6N-dimensional probability distribution, this result represents coarse graining—i.e., information loss by smoothing out very fine-scale detail.
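
The two halves of this argument can be checked on a discrete toy state space (the distribution, the permutation standing in for the exact dynamics, and the cell structure are illustrative assumptions): a measure-preserving relabelling of states leaves S_I unchanged, while smoothing the distribution within coarse cells can only raise it.

    # Coarse-graining sketch on a 64-state toy space: a permutation of the
    # states (a stand-in for volume-preserving Liouville dynamics) leaves
    # the Shannon entropy unchanged; averaging within coarse cells and
    # spreading the result uniformly can only increase it.
    import numpy as np

    rng = np.random.default_rng(0)
    p = rng.dirichlet(np.ones(64))          # arbitrary fine-grained distribution

    def H(q):
        q = q[q > 0]
        return -np.sum(q * np.log(q))

    perm = rng.permutation(64)
    assert np.isclose(H(p[perm]), H(p))     # Delta S_I = 0 under relabelling

    cell_mean = p.reshape(8, 8).mean(axis=1)  # average within 8-state cells
    p_smooth = np.repeat(cell_mean, 8)        # uniform within each cell
    assert H(p_smooth) >= H(p)                # smoothing never lowers S_I
    print(H(p), H(p_smooth))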

Caveats with the argument

Some caveats should be considered with the above.

1. Like all statistical mechanical results according to the MaxEnt school, this increase in thermodynamic entropy is only a prediction. It assumes in particular that the initial macroscopic description contains all of the information relevant to predicting the later macroscopic state. This may not be the case, for example if the initial description fails to reflect some aspect of the preparation of the system which later becomes relevant. In that case the "failure" of a MaxEnt prediction tells us that there is something more which is relevant that we may have overlooked in the physics of the system.

It is also sometimes suggested that quantum measurement, especially in the decoherence interpretation, may give an apparently unexpected reduction in entropy per this argument, as it appears to involve macroscopic information becoming available which was previously inaccessible. (However, the entropy accounting of quantum measurement is tricky, because to get full decoherence one may be assuming an infinite environment, with an infinite entropy).

2. The argument so far has glossed over the question of fluctuations. It has also implicitly assumed that the uncertainty predicted at time t1 for the variables at time t2 will be much smaller than the measurement error. But if the measurements do meaningfully update our knowledge of the system, our uncertainty as to its state is reduced, giving a new SI(2) which is less than SI(1). (Note that if we allow ourselves the abilities of Laplace's demon, the consequences of this new information can also be mapped backwards, so our uncertainty about the dynamical state at time t1 is now also reduced from SI(1) to SI(2)).

We know that STh(2) ≥ SI(2); but we can no longer be certain that it is greater than STh(1) = SI(1). This leaves open the possibility of fluctuations in STh. The thermodynamic entropy may go "down" as well as up. A more sophisticated analysis is given by the entropy fluctuation theorem, which can be established as a consequence of the time-dependent MaxEnt picture.

3. As just indicated, the MaxEnt inference runs equally well in reverse. So given a particular final state, we can ask, what can we "retrodict" to improve our knowledge about earlier states? However the Second Law argument above also runs in reverse: given macroscopic information at time t2, we should expect it too to become less useful. The two procedures are time-symmetric. But now the information will become less and less useful at earlier and earlier times. (Compare with Loschmidt's paradox.) The MaxEnt inference would predict that the most probable origin of a currently low-entropy state would be as a spontaneous fluctuation from an earlier high entropy state. But this conflicts with what we know to have happened, namely that entropy has been increasing steadily, even back in the past.

The MaxEnt proponents' response to this would be that such a systematic failing in the prediction of a MaxEnt inference is a "good" thing.[11] It means that there is clear evidence that some important physical information has been missed in the specification of the problem. If it is correct that the dynamics "are" time-symmetric, it appears that we need to put in by hand a prior probability that initial configurations with a low thermodynamic entropy are more likely than initial configurations with a high thermodynamic entropy. This cannot be explained by the immediate dynamics. Quite possibly, it arises as a reflection of the evident time-asymmetric evolution of the universe on a cosmological scale (see arrow of time).

Criticisms

Maximum entropy thermodynamics faces some important opposition, in part because of the relative paucity of published results from the MaxEnt school, especially with regard to new testable predictions far from equilibrium.[12]

The theory has also been criticized on the grounds of internal consistency. For instance, Radu Balescu provides a strong criticism of the MaxEnt school and of Jaynes' work. Balescu states that the theory of Jaynes and coworkers is based on a non-transitive evolution law that produces ambiguous results. Although some difficulties of the theory can be cured, the theory "lacks a solid foundation" and "has not led to any new concrete result".[13]

Though the maximum entropy approach is based directly on informational entropy, it is applicable to physics only when there is a clear physical definition of entropy. There is no clear unique general physical definition of entropy for non-equilibrium systems, which are general physical systems considered during a process rather than thermodynamic systems in their own internal states of thermodynamic equilibrium.[14] It follows that the maximum entropy approach will not be applicable to non-equilibrium systems until a clear physical definition of entropy is found. This problem is related to the fact that heat may be transferred from a hotter to a colder physical system even when local thermodynamic equilibrium does not hold, so that neither system has a well-defined temperature. Classical entropy is defined for a system in its own internal state of thermodynamic equilibrium, which is defined by state variables with no non-zero fluxes, so that flux variables do not appear as state variables. But for a strongly non-equilibrium system, during a process, the state variables must include non-zero flux variables. Classical physical definitions of entropy do not cover this case, especially when the fluxes are large enough to destroy local thermodynamic equilibrium. In other words, for entropy of non-equilibrium systems in general, the definition will need at least to involve specification of the process, including non-zero fluxes, beyond the classical static thermodynamic state variables.

The 'entropy' that is maximized needs to be defined suitably for the problem at hand. If an inappropriate 'entropy' is maximized, a wrong result is likely. In principle, maximum entropy thermodynamics does not refer narrowly and only to classical thermodynamic entropy. It is about informational entropy applied to physics, explicitly depending on the data used to formulate the problem at hand. According to Attard, for physical problems analyzed by strongly non-equilibrium thermodynamics, several physically distinct kinds of entropy need to be considered, including what he calls second entropy. Attard writes: "Maximizing the second entropy over the microstates in the given initial macrostate gives the most likely target macrostate."[15] The physically defined second entropy can also be considered from an informational viewpoint.

See also

  • Edwin Thompson Jaynes
  • First law of thermodynamics
  • Second law of thermodynamics
  • Principle of maximum entropy
  • Principle of Minimum Discrimination Information
  • Kullback–Leibler divergence
  • Quantum relative entropy
  • Information theory and measure theory
  • Entropy power inequality

References

  1. ^ Jaynes, E.T. (1957). "Information theory and statistical mechanics" (PDF). Physical Review. 106 (4): 620–630. Bibcode:1957PhRv..106..620J. doi:10.1103/PhysRev.106.620. S2CID 17870175.
  2. ^ — (1957). "Information theory and statistical mechanics II" (PDF). Physical Review. 108 (2): 171–190. Bibcode:1957PhRv..108..171J. doi:10.1103/PhysRev.108.171.
  3. ^ Jaynes, E.T. (1968), p. 229.
  4. ^ Jaynes, E.T. (1979), pp. 30, 31, 40.
  5. ^ Jaynes, E.T. (1985).
  6. ^ Jaynes, E.T. (2003).
  7. ^ Jaynes, E.T. (1979), p. 28.
  8. ^ Jaynes, E.T. (1968), p. 228.
  9. ^ Guttmann, Y.M. (1999), pp. 28, 36, 38, 57, 61.
  10. ^ Guttmann, Y.M. (1999), p. 29.
  11. ^ Jaynes, E.T. (1979).
  12. ^ Kleidon, A., Lorenz, R.D. (2005).
  13. ^ Balescu, R. (1997).
  14. ^ Lieb, E.H., Yngvason, J. (2003). The entropy of classical thermodynamics, Chapter 8 of Greven, A., Keller, G., Warnecke (editors) (2003). Entropy, Princeton University Press, Princeton NJ, ISBN 0-691-11338-6, page 190.
  15. ^ Attard, P. (2012). Non-Equilibrium Thermodynamics and Statistical Mechanics: Foundations and Applications, Oxford University Press, Oxford UK, ISBN 978-0-19-966276-0, p. 161.

Bibliography of cited references

  • Balescu, Radu (1997). Statistical Dynamics: Matter out of equilibrium. London: Imperial College Press. Bibcode:1997sdmo.book.....B.
  • Jaynes, E.T. (September 1968). "Prior Probabilities" (PDF). IEEE Transactions on Systems Science and Cybernetics. SSC–4 (3): 227–241. doi:10.1109/TSSC.1968.300117.
  • Guttmann, Y.M. (1999). The Concept of Probability in Statistical Physics, Cambridge University Press, Cambridge UK, ISBN 978-0-521-62128-1.
  • Jaynes, E.T. (1979). "Where do we stand on maximum entropy?" (PDF). In Levine, R.; Tribus M. (eds.). The Maximum Entropy Formalism. MIT Press. ISBN 978-0-262-12080-7.
  • Jaynes, E.T. (1985). "Some random observations". Synthese. 63: 115–138. doi:10.1007/BF00485957. S2CID 46975520.
  • Jaynes, E.T. (2003). Bretthorst, G.L. (ed.). Probability Theory: The Logic of Science. Cambridge: Cambridge University Press. ISBN 978-0-521-59271-0.
  • Kleidon, Axel; Lorenz, Ralph D. (2005). Non-equilibrium thermodynamics and the production of entropy: life, earth, and beyond. Springer. pp. 42–. ISBN 978-3-540-22495-2.

Further reading

  • Bajkova, A.T. (1992). "The generalization of maximum entropy method for reconstruction of complex functions". Astronomical and Astrophysical Transactions. 1 (4): 313–320. Bibcode:1992A&AT....1..313B. doi:10.1080/10556799208230532.
  • Caticha, Ariel (2012). Entropic Inference and the Foundations of Physics (PDF).
  • Dewar, R.C. (2003). "Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states". J. Phys. A: Math. Gen. 36 (3): 631–41. arXiv:cond-mat/0005382. Bibcode:2003JPhA...36..631D. doi:10.1088/0305-4470/36/3/303. S2CID 44217479.
  • — (2005). "Maximum entropy production and the fluctuation theorem". J. Phys. A: Math. Gen. 38 (21): L371–81. Bibcode:2005JPhA...38L.371D. doi:10.1088/0305-4470/38/21/L01.
  • Grinstein, G.; Linsker, R. (2007). "Comments on a derivation and application of the 'maximum entropy production' principle". J. Phys. A: Math. Theor. 40 (31): 9717–20. Bibcode:2007JPhA...40.9717G. doi:10.1088/1751-8113/40/31/N01. S2CID 123641741. Shows invalidity of Dewar's derivations (a) of maximum entropy production (MaxEP) from fluctuation theorem for far-from-equilibrium systems, and (b) of a claimed link between MaxEP and self-organized criticality.
  • Grandy, W. T., 1987. Foundations of Statistical Mechanics. Vol 1: Equilibrium Theory; Vol. 2: Nonequilibrium Phenomena. Dordrecht: D. Reidel. Vol. 1: ISBN 90-277-2489-X. Vol. 2: ISBN 90-277-2649-3.
  • — (2004). "Three papers in nonequilibrium statistical mechanics". Found. Phys. 34 (1): 21–57. arXiv:cond-mat/0303291. Bibcode:2004FoPh...34...21G. doi:10.1023/B:FOOP.0000012008.36856.c1. S2CID 18573684.
  • Gull, S.F. (1991). "Some misconceptions about entropy". In Buck, B.; Macaulay, V.A. (eds.). Maximum Entropy in Action. Oxford University Press. ISBN 978-0-19-853963-6.
  • Extensive archive of further papers by E.T. Jaynes on probability and physics. Many are collected in Rosenkrantz, R.D., ed. (1983). E.T. Jaynes — Papers on probability, statistics and statistical physics. Dordrecht, Netherlands: D. Reidel. ISBN 978-90-277-1448-0.
  • Lorenz, R. (2003). "Full steam ahead — probably" (PDF). Science. 299 (5608): 837–8. doi:10.1126/science.1081280. PMID 12574610. S2CID 118583810.
  • Rau, Jochen (1998). "Statistical Mechanics in a Nutshell". arXiv:physics/9805024.
