Maxwell–Boltzmann statistics

In statistical mechanics, Maxwell–Boltzmann statistics describes the distribution of classical material particles over various energy states in thermal equilibrium. It is applicable when the temperature is high enough or the particle density is low enough to render quantum effects negligible.

Maxwell–Boltzmann statistics can be used to derive the Maxwell–Boltzmann distribution of particle speeds in an ideal gas. Shown: distribution of speeds for 10⁶ oxygen molecules at −100, 20, and 600 °C.

The expected number of particles with energy $\varepsilon_i$ for Maxwell–Boltzmann statistics is

$$\langle N_i \rangle = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT}} = \frac{N}{Z}\,g_i e^{-\varepsilon_i/kT}$$

where:

  • $\varepsilon_i$ is the energy of the i-th energy level,
  • $\langle N_i \rangle$ is the average number of particles in the set of states with energy $\varepsilon_i$,
  • $g_i$ is the degeneracy of energy level i, that is, the number of states with energy $\varepsilon_i$ which may nevertheless be distinguished from each other by some other means,[nb 1]
  • μ is the chemical potential,
  • k is the Boltzmann constant,
  • T is absolute temperature,
  • N is the total number of particles: $N = \sum_i N_i$,
  • Z is the partition function: $Z = \sum_i g_i e^{-\varepsilon_i/kT}$,
  • e is Euler's number.

Equivalently, the number of particles is sometimes expressed as

$$\langle N_i \rangle = \frac{1}{e^{(\varepsilon_i - \mu)/kT}} = \frac{N}{Z}\,e^{-\varepsilon_i/kT}$$

where the index i now specifies a particular state rather than the set of all states with energy $\varepsilon_i$, and $Z = \sum_i e^{-\varepsilon_i/kT}$.
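
As a concrete illustration of these formulas, the following minimal Python sketch (the three-level scheme, degeneracies, particle number, and temperature are assumed toy values) evaluates $\langle N_i \rangle = (N/Z)\,g_i e^{-\varepsilon_i/kT}$ and checks that the occupations sum to N:

```python
import numpy as np

# Toy example with assumed values: three energy levels.
energies = np.array([0.0, 1.0, 2.0])      # epsilon_i
degeneracies = np.array([1.0, 2.0, 1.0])  # g_i
N = 1000.0                                # total number of particles
kT = 1.5                                  # temperature in the same energy units

# Partition function: Z = sum_i g_i exp(-epsilon_i / kT)
Z = np.sum(degeneracies * np.exp(-energies / kT))

# Expected occupations: <N_i> = (N / Z) g_i exp(-epsilon_i / kT)
occupations = N / Z * degeneracies * np.exp(-energies / kT)

print("Z =", Z)
print("<N_i> =", occupations)
print("sum   =", occupations.sum())  # recovers N, as required by N = sum_i N_i
```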

History

Maxwell–Boltzmann statistics grew out of the Maxwell–Boltzmann distribution, most likely as a distillation of the underlying technique. The distribution was first derived by Maxwell in 1860 on heuristic grounds. Boltzmann later, in the 1870s, carried out significant investigations into the physical origins of this distribution. The distribution can be derived on the grounds that it maximizes the entropy of the system.

Applicability

 
Figure: Comparison of average occupancy of the ground state for three statistics.

Maxwell–Boltzmann statistics is used to derive the Maxwell–Boltzmann distribution of an ideal gas. However, it can also be used to extend that distribution to particles with a different energy–momentum relation, such as relativistic particles (resulting in the Maxwell–Jüttner distribution), and to spaces of dimension other than three.

Maxwell–Boltzmann statistics is often described as the statistics of "distinguishable" classical particles. In other words, the configuration of particle A in state 1 and particle B in state 2 is different from the case in which particle B is in state 1 and particle A is in state 2. This assumption leads to the proper (Boltzmann) statistics of particles in the energy states, but yields non-physical results for the entropy, as embodied in the Gibbs paradox.

At the same time, there are no real particles that have the characteristics required by Maxwell–Boltzmann statistics. Indeed, the Gibbs paradox is resolved if we treat all particles of a certain type (e.g., electrons, protons, photons, etc.) as principally indistinguishable. Once this assumption is made, the particle statistics change. The change in entropy in the entropy-of-mixing example may be viewed as an example of a non-extensive entropy resulting from the distinguishability of the two types of particles being mixed.

Quantum particles are either bosons (following instead Bose–Einstein statistics) or fermions (subject to the Pauli exclusion principle, following instead Fermi–Dirac statistics). Both of these quantum statistics approach the Maxwell–Boltzmann statistics in the limit of high temperature and low particle density.
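
This convergence can be checked numerically. The short Python sketch below (the grid of $(\varepsilon-\mu)/kT$ values is an arbitrary choice for illustration) compares the mean occupancy per state under the three statistics and shows that they agree once the occupancy is small:

```python
import numpy as np

# Assumed grid of x = (epsilon - mu) / kT values, chosen for illustration.
x = np.array([0.5, 2.0, 5.0, 10.0])

n_mb = np.exp(-x)               # Maxwell-Boltzmann mean occupancy per state
n_be = 1.0 / (np.exp(x) - 1.0)  # Bose-Einstein mean occupancy per state
n_fd = 1.0 / (np.exp(x) + 1.0)  # Fermi-Dirac mean occupancy per state

for xi, mb, be, fd in zip(x, n_mb, n_be, n_fd):
    print(f"(eps-mu)/kT = {xi:5.1f}   MB = {mb:.4e}   BE = {be:.4e}   FD = {fd:.4e}")
# As x grows the occupancies become small and all three formulas converge to exp(-x).
```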

Derivations

Maxwell–Boltzmann statistics can be derived in various statistical mechanical thermodynamic ensembles:[1]

  • the grand canonical ensemble, exactly;
  • the canonical ensemble, exactly;
  • the microcanonical ensemble, but only in the thermodynamic limit.

In each case it is necessary to assume that the particles are non-interacting, and that multiple particles can occupy the same state and do so independently.

Derivation from microcanonical ensemble

Suppose we have a container with a huge number of very small particles all with identical physical characteristics (such as mass, charge, etc.). Let's refer to this as the system. Assume that though the particles have identical properties, they are distinguishable. For example, we might identify each particle by continually observing their trajectories, or by placing a marking on each one, e.g., drawing a different number on each one as is done with lottery balls.

The particles are moving inside that container in all directions with great speed. Because the particles are speeding around, they possess some energy. The Maxwell–Boltzmann distribution is a mathematical function that describes how many particles in the container have a certain energy. More precisely, the Maxwell–Boltzmann distribution gives the non-normalized probability (meaning that the probabilities do not add up to 1) that the state corresponding to a particular energy is occupied.

In general, there may be many particles with the same amount of energy $\varepsilon$. Let the number of particles with the same energy $\varepsilon_1$ be $N_1$, the number of particles possessing another energy $\varepsilon_2$ be $N_2$, and so forth for all the possible energies $\{\varepsilon_i \mid i = 1, 2, 3, \ldots\}$. To describe this situation, we say that $N_i$ is the occupation number of the energy level $i$. If we know all the occupation numbers $\{N_i \mid i = 1, 2, 3, \ldots\}$, then we know the total energy of the system. However, because we can distinguish between which particles are occupying each energy level, the set of occupation numbers $\{N_i \mid i = 1, 2, 3, \ldots\}$ does not completely describe the state of the system. To completely describe the state of the system, or the microstate, we must specify exactly which particles are in each energy level. Thus when we count the number of possible states of the system, we must count each and every microstate, and not just the possible sets of occupation numbers.

To begin with, assume that there is only one state at each energy level $i$ (there is no degeneracy). What follows next is a bit of combinatorial thinking which has little to do with accurately describing the reservoir of particles. For instance, let's say there is a total of $k$ boxes labelled $a, b, \ldots, k$. With the concept of combination, we could calculate how many ways there are to arrange $N$ balls into the set of boxes, where the order of balls within each box isn't tracked. First, we select $N_a$ balls from a total of $N$ balls to place into box $a$, and continue to select for each box from the remaining balls, ensuring that every ball is placed in one of the boxes. The total number of ways that the balls can be arranged is

$$W = \frac{N!}{N_a!\,(N-N_a)!} \times \frac{(N-N_a)!}{N_b!\,(N-N_a-N_b)!} \times \frac{(N-N_a-N_b)!}{N_c!\,(N-N_a-N_b-N_c)!} \times \cdots \times \frac{(N-\cdots-N_\ell)!}{N_k!\,(N-\cdots-N_\ell-N_k)!} = \frac{N!}{N_a!\,N_b!\,N_c!\cdots N_k!\,(N-N_a-\cdots-N_\ell-N_k)!}$$

As every ball has been placed into a box, $(N - N_a - N_b - \cdots - N_k)! = 0! = 1$, and we simplify the expression as

$$W = N! \prod_{\ell=a,b,\ldots,k} \frac{1}{N_\ell!}$$

This is just the multinomial coefficient, the number of ways of arranging N items into k boxes, the $\ell$-th box holding $N_\ell$ items, ignoring the permutation of items in each box.
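
The counting argument can be sanity-checked by brute force for a small case. A minimal Python sketch, with an assumed toy example of N = 4 labelled balls distributed as 2, 1, 1 over three boxes:

```python
from math import factorial
from itertools import product

# Assumed toy case: N = 4 labelled balls with target occupation numbers for three boxes.
occupations = [2, 1, 1]          # N_a, N_b, N_c
N = sum(occupations)

# Multinomial coefficient: N! / (N_a! N_b! N_c!)
W_formula = factorial(N)
for n in occupations:
    W_formula //= factorial(n)

# Brute force: assign each labelled ball to a box and count the assignments
# that realize exactly the target occupation numbers.
W_brute = sum(
    1
    for assignment in product(range(len(occupations)), repeat=N)
    if [assignment.count(box) for box in range(len(occupations))] == occupations
)

print(W_formula, W_brute)  # both give 12 for this toy case
```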

Now, consider the case where there is more than one way to put $N_i$ particles in the box $i$ (i.e. taking the degeneracy problem into consideration). If the $i$-th box has a "degeneracy" of $g_i$, that is, it has $g_i$ "sub-boxes" ($g_i$ boxes with the same energy $\varepsilon_i$; these states/boxes with the same energy are called degenerate states), such that any way of filling the $i$-th box where the number in the sub-boxes is changed is a distinct way of filling the box, then the number of ways of filling the $i$-th box must be increased by the number of ways of distributing the $N_i$ objects in the $g_i$ "sub-boxes". The number of ways of placing $N_i$ distinguishable objects in $g_i$ "sub-boxes" is $g_i^{N_i}$ (the first object can go into any of the $g_i$ boxes, the second object can also go into any of the $g_i$ boxes, and so on). Thus the number of ways $W$ that a total of $N$ particles can be classified into energy levels according to their energies, with each level $i$ having $g_i$ distinct states such that the $i$-th level accommodates $N_i$ particles, is:

$$W = N! \prod_i \frac{g_i^{N_i}}{N_i!}$$

This is the form for W first derived by Boltzmann. Boltzmann's fundamental equation $S = k\,\ln W$ relates the thermodynamic entropy S to the number of microstates W, where k is the Boltzmann constant. It was pointed out by Gibbs however, that the above expression for W does not yield an extensive entropy, and is therefore faulty. This problem is known as the Gibbs paradox. The problem is that the particles considered by the above equation are not indistinguishable. In other words, for two particles (A and B) in two energy sublevels the population represented by [A,B] is considered distinct from the population [B,A] while for indistinguishable particles, they are not. If we carry out the argument for indistinguishable particles, we are led to the Bose–Einstein expression for W:

$$W = \prod_i \frac{(N_i + g_i - 1)!}{N_i!\,(g_i - 1)!}$$

The Maxwell–Boltzmann distribution follows from this Bose–Einstein distribution for temperatures well above absolute zero, implying that $g_i \gg 1$. The Maxwell–Boltzmann distribution also requires low density, implying that $g_i \gg N_i$. Under these conditions, we may use Stirling's approximation for the factorial:

$$N! \approx N^N e^{-N}$$

to write:

$$W \approx \prod_i \frac{(N_i+g_i)^{N_i+g_i}}{N_i^{N_i}\,g_i^{g_i}} \approx \prod_i \frac{g_i^{N_i}\left(1+N_i/g_i\right)^{g_i}}{N_i^{N_i}}$$

Using the fact that $\left(1+N_i/g_i\right)^{g_i} \approx e^{N_i}$ for $g_i \gg N_i$ we can again use Stirling's approximation to write:

$$W \approx \prod_i \frac{g_i^{N_i}}{N_i!}$$

This is essentially a division by N! of Boltzmann's original expression for W, and this correction is referred to as correct Boltzmann counting.
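
The dilute-limit approximation can be checked numerically: for $g_i \gg N_i$ the Bose–Einstein count per level, $(N_i+g_i-1)!/\big(N_i!\,(g_i-1)!\big)$, approaches the corrected Boltzmann count $g_i^{N_i}/N_i!$. A minimal Python sketch with assumed example values:

```python
from math import comb, factorial

# Assumed example: a small, fixed occupation N_i and an increasingly large degeneracy g_i.
N_i = 5
for g_i in (10, 100, 1_000, 10_000):
    w_bose = comb(N_i + g_i - 1, N_i)     # (N_i + g_i - 1)! / (N_i! (g_i - 1)!)
    w_boltz = g_i**N_i / factorial(N_i)   # g_i^N_i / N_i!  (corrected Boltzmann counting)
    print(f"g_i = {g_i:6d}   Bose-Einstein / Boltzmann = {w_bose / w_boltz:.4f}")
# The ratio tends to 1 as g_i >> N_i, the dilute regime where Maxwell-Boltzmann statistics holds.
```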

We wish to find the $N_i$ for which the function $W$ is maximized, while considering the constraint that there is a fixed number of particles $\left(N = \sum N_i\right)$ and a fixed energy $\left(E = \sum N_i \varepsilon_i\right)$ in the container. The maxima of $W$ and $\ln W$ are achieved by the same values of $N_i$ and, since it is easier to accomplish mathematically, we will maximize the latter function instead. We constrain our solution using Lagrange multipliers forming the function:

$$f(N_1, N_2, \ldots, N_n) = \ln W + \alpha\left(N - \sum N_i\right) + \beta\left(E - \sum N_i \varepsilon_i\right)$$

$$\ln W = \ln\left(\prod_{i=1}^{n} \frac{g_i^{N_i}}{N_i!}\right) \approx \sum_{i=1}^{n}\left(N_i \ln g_i - N_i \ln N_i + N_i\right)$$

Finally

$$f(N_1, N_2, \ldots, N_n) = \alpha N + \beta E + \sum_{i=1}^{n}\left(N_i \ln g_i - N_i \ln N_i + N_i - (\alpha + \beta\varepsilon_i) N_i\right)$$

In order to maximize the expression above we apply Fermat's theorem (stationary points), according to which local extrema, if they exist, must be at critical points, where the partial derivatives vanish:

$$\frac{\partial f}{\partial N_i} = \ln g_i - \ln N_i - (\alpha + \beta\varepsilon_i) = 0$$

By solving the equations above ($i = 1, \ldots, n$) we arrive at an expression for $N_i$:

$$N_i = \frac{g_i}{e^{\alpha + \beta\varepsilon_i}}$$

Substituting this expression for $N_i$ into the equation for $\ln W$ and assuming that $N \gg 1$ yields:

$$\ln W = (\alpha + 1) N + \beta E$$

or, rearranging:

$$E = \frac{\ln W}{\beta} - \frac{N}{\beta} - \frac{\alpha N}{\beta}$$

Boltzmann realized that this is just an expression of the Euler-integrated fundamental equation of thermodynamics. Identifying E as the internal energy, the Euler-integrated fundamental equation states that:

$$E = TS - PV + \mu N$$

where T is the temperature, P is pressure, V is volume, and μ is the chemical potential. Boltzmann's famous equation $S = k\,\ln W$ is the realization that the entropy is proportional to $\ln W$, with the constant of proportionality being the Boltzmann constant. Using the ideal gas equation of state ($PV = NkT$), it follows immediately that $\beta = 1/kT$ and $\alpha = -\mu/kT$, so that the populations may now be written:

$$N_i = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT}}$$

Note that the above formula is sometimes written:

$$N_i = \frac{g_i}{e^{\varepsilon_i/kT}/z}$$

where $z = \exp(\mu/kT)$ is the absolute activity.
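
The constrained maximization above can also be cross-checked numerically. The Python sketch below is only an illustrative check under assumed values: it uses scipy to maximize the Stirling form of ln W subject to fixed N and E (with E generated from an assumed β so the constraints are feasible) and compares the result with the analytic Boltzmann populations:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed toy level scheme (energies, degeneracies) and particle number.
eps = np.array([0.0, 1.0, 2.0, 3.0])
g = np.array([1.0, 2.0, 2.0, 1.0])
N = 1000.0

# Build a feasible target energy E from an assumed beta = 1, then "forget" beta.
weights = g * np.exp(-1.0 * eps)
N_analytic = N * weights / weights.sum()
E = float(np.sum(N_analytic * eps))

def neg_lnW(n):
    # Stirling form used in the text: ln W ~ sum_i (N_i ln g_i - N_i ln N_i + N_i)
    return -np.sum(n * np.log(g) - n * np.log(n) + n)

constraints = [
    {"type": "eq", "fun": lambda n: np.sum(n) - N},        # fixed particle number
    {"type": "eq", "fun": lambda n: np.sum(n * eps) - E},  # fixed total energy
]
result = minimize(neg_lnW, x0=np.full(eps.size, N / eps.size),
                  method="SLSQP", constraints=constraints,
                  bounds=[(1e-6, None)] * eps.size)

print("numerical maximum :", np.round(result.x, 2))
print("Boltzmann form    :", np.round(N_analytic, 2))  # the two should agree closely
```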

Alternatively, we may use the fact that

$$\sum_i N_i = N$$

to obtain the population numbers as

$$N_i = N\,\frac{g_i e^{-\varepsilon_i/kT}}{Z}$$

where Z is the partition function defined by:

$$Z = \sum_i g_i e^{-\varepsilon_i/kT}$$

In an approximation where $\varepsilon_i$ is considered to be a continuous variable, the Thomas–Fermi approximation yields a continuous degeneracy $g$ proportional to $\sqrt{\varepsilon}$, so that:

$$\frac{\sqrt{\varepsilon}\,e^{-\varepsilon/kT}}{\int_0^\infty \sqrt{\varepsilon}\,e^{-\varepsilon/kT}\,d\varepsilon} = \frac{\sqrt{\varepsilon}\,e^{-\varepsilon/kT}}{\frac{\sqrt{\pi}}{2}(kT)^{3/2}} = \frac{2\sqrt{\varepsilon}\,e^{-\varepsilon/kT}}{\sqrt{\pi (kT)^3}}$$

which is just the Maxwell–Boltzmann distribution for the energy.
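
As a quick numerical check, this energy distribution should integrate to 1 and have mean energy (3/2)kT. A minimal Python sketch with an assumed value of kT:

```python
import numpy as np
from scipy.integrate import quad

kT = 2.0  # assumed temperature in energy units

def mb_energy_pdf(eps):
    # 2 sqrt(eps) exp(-eps/kT) / sqrt(pi (kT)^3)
    return 2.0 * np.sqrt(eps) * np.exp(-eps / kT) / np.sqrt(np.pi * kT**3)

norm, _ = quad(mb_energy_pdf, 0.0, np.inf)
mean_energy, _ = quad(lambda e: e * mb_energy_pdf(e), 0.0, np.inf)

print("normalization:", norm)                        # ~ 1
print("mean energy  :", mean_energy, " vs (3/2)kT =", 1.5 * kT)
```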

Derivation from canonical ensemble

In the above discussion, the Boltzmann distribution function was obtained via directly analysing the multiplicities of a system. Alternatively, one can make use of the canonical ensemble. In a canonical ensemble, a system is in thermal contact with a reservoir. While energy is free to flow between the system and the reservoir, the reservoir is thought to have infinitely large heat capacity so as to maintain a constant temperature, T, for the combined system.

In the present context, our system is assumed to have the energy levels $\varepsilon_i$ with degeneracies $g_i$. As before, we would like to calculate the probability that our system has energy $\varepsilon_i$.

If our system is in state $s_1$, then there would be a corresponding number of microstates available to the reservoir. Call this number $\Omega_R(s_1)$. By assumption, the combined system (of the system we are interested in and the reservoir) is isolated, so all microstates are equally probable. Therefore, for instance, if $\Omega_R(s_1) = 2\,\Omega_R(s_2)$, we can conclude that our system is twice as likely to be in state $s_1$ than $s_2$. In general, if $P(s_i)$ is the probability that our system is in state $s_i$,

$$\frac{P(s_1)}{P(s_2)} = \frac{\Omega_R(s_1)}{\Omega_R(s_2)}$$

Since the entropy of the reservoir is $S_R = k \ln \Omega_R$, the above becomes

$$\frac{P(s_1)}{P(s_2)} = \frac{e^{S_R(s_1)/k}}{e^{S_R(s_2)/k}} = e^{(S_R(s_1) - S_R(s_2))/k}$$

Next we recall the thermodynamic identity (from the first law of thermodynamics):

$$dS_R = \frac{1}{T}\left(dU_R + P\,dV_R - \mu\,dN_R\right)$$

In a canonical ensemble, there is no exchange of particles, so the $dN_R$ term is zero. Similarly, $dV_R = 0$. This gives

$$S_R(s_1) - S_R(s_2) = \frac{1}{T}\left(U_R(s_1) - U_R(s_2)\right) = -\frac{1}{T}\left(E(s_1) - E(s_2)\right)$$

where $U_R(s_i)$ and $E(s_i)$ denote the energies of the reservoir and the system at $s_i$, respectively. For the second equality we have used the conservation of energy. Substituting into the first equation relating $P(s_1)$, $P(s_2)$:

$$\frac{P(s_1)}{P(s_2)} = \frac{e^{-E(s_1)/kT}}{e^{-E(s_2)/kT}}$$

which implies, for any state s of the system,

$$P(s) = \frac{1}{Z} e^{-E(s)/kT}$$

where Z is an appropriately chosen "constant" to make total probability 1. (Z is constant provided that the temperature T is invariant.)

$$Z = \sum_s e^{-E(s)/kT}$$

where the index s runs through all microstates of the system. Z is sometimes called the Boltzmann sum over states (or "Zustandssumme" in the original German). If we index the summation via the energy eigenvalues instead of all possible states, degeneracy must be taken into account. The probability of our system having energy $\varepsilon_i$ is simply the sum of the probabilities of all corresponding microstates:

$$P(\varepsilon_i) = \frac{1}{Z}\,g_i e^{-\varepsilon_i/kT}$$

where, with obvious modification,

$$Z = \sum_j g_j e^{-\varepsilon_j/kT}$$

this is the same result as before.
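
The reservoir argument can be made concrete with a toy model. In the Python sketch below the reservoir is modelled as an Einstein solid (an assumption made purely for illustration; the numbers are likewise arbitrary), and the ratio of reservoir multiplicities for the system holding one quantum versus none is compared with the Boltzmann factor $e^{-\varepsilon/kT}$, with the temperature extracted from the reservoir's entropy:

```python
from math import exp, lgamma

# Assumed toy reservoir: an Einstein solid with M oscillators sharing q energy quanta,
# whose multiplicity is Omega(q, M) = C(q + M - 1, q).
M = 10_000      # reservoir oscillators
q = 50_000      # quanta held by the reservoir when the system is in its ground state
eps = 1.0       # energy of one quantum (k = 1 in these units)

def ln_omega(q, M):
    # ln C(q + M - 1, q), evaluated via log-gamma to avoid huge integers
    return lgamma(q + M) - lgamma(q + 1) - lgamma(M)

# Reservoir temperature from its entropy: 1/kT = d(ln Omega)/dU ~ [ln Omega(q+1) - ln Omega(q)] / eps
kT = eps / (ln_omega(q + 1, M) - ln_omega(q, M))

# If the system absorbs one quantum, the reservoir is left with q - 1 quanta.
multiplicity_ratio = exp(ln_omega(q - 1, M) - ln_omega(q, M))
boltzmann_factor = exp(-eps / kT)

print("reservoir multiplicity ratio :", multiplicity_ratio)
print("Boltzmann factor exp(-e/kT)  :", boltzmann_factor)
```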

Comments on this derivation:

  • Notice that in this formulation, the initial assumption "... suppose the system has total N particles..." is dispensed with. Indeed, the number of particles possessed by the system plays no role in arriving at the distribution. Rather, how many particles would occupy states with energy $\varepsilon_i$ follows as an easy consequence.
  • What has been presented above is essentially a derivation of the canonical partition function. As one can see by comparing the definitions, the Boltzmann sum over states is equal to the canonical partition function.
  • Exactly the same approach can be used to derive Fermi–Dirac and Bose–Einstein statistics. However, there one would replace the canonical ensemble with the grand canonical ensemble, since there is exchange of particles between the system and the reservoir. Also, the system one considers in those cases is a single particle state, not a particle. (In the above discussion, we could have assumed our system to be a single atom.)

See also

  • Bose–Einstein statistics
  • Fermi–Dirac statistics
  • Boltzmann factor

Notes

  1. ^ For example, two simple point particles may have the same energy, but different momentum vectors. They may be distinguished from each other on this basis, and the degeneracy will be the number of possible ways that they can be so distinguished.

References

  1. ^ Tolman, R. C. (1938). The Principles of Statistical Mechanics. Dover Publications. ISBN 9780486638966.

Bibliography

  • Carter, Ashley H., "Classical and Statistical Thermodynamics", Prentice–Hall, Inc., 2001, New Jersey.
  • Raj Pathria, "Statistical Mechanics", Butterworth–Heinemann, 1996.
