
Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate is a function assigning an entropy to a stochastic process.

For a strongly stationary process, the conditional entropy of the latest random variable eventually tends towards this rate value.

Definition

A process $X$ with a countable index gives rise to the sequence of its joint entropies $H_n(X_1, X_2, \dots, X_n)$. If the limit exists, the entropy rate is defined as

$$H(X) := \lim_{n \to \infty} \tfrac{1}{n} H_n.$$

Note that given any sequence $(a_n)_n$ with $a_0 = 0$ and letting $\Delta a_k := a_k - a_{k-1}$, by telescoping one has $a_n = \sum_{k=1}^n \Delta a_k$. The entropy rate thus computes the mean of the first $n$ such entropy changes, with $n$ going to infinity. The behaviour of joint entropies from one index to the next also appears explicitly in some characterizations of entropy.
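To make the definition concrete, here is a small brute-force sketch. The two-state transition matrix and uniform initial distribution below are illustrative assumptions, not taken from the article. The code enumerates all length-$n$ state sequences of a Markov chain, computes the joint entropy $H_n$ directly from the sequence probabilities, and prints $\tfrac{1}{n} H_n$, which settles toward the chain's entropy rate (≈ 0.569 bits for these particular numbers):

```python
import math
from itertools import product

# Illustrative 2-state Markov chain (hypothetical parameters).
P = [[0.9, 0.1],
     [0.4, 0.6]]        # row-stochastic transition matrix
pi0 = [0.5, 0.5]        # initial distribution

def joint_entropy(n):
    """H_n = H(X_1, ..., X_n), by enumerating all 2^n state sequences."""
    h = 0.0
    for seq in product(range(2), repeat=n):
        p = pi0[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a][b]
        if p > 0:
            h -= p * math.log2(p)
    return h

# (1/n) H_n approaches the entropy rate as n grows.
for n in (1, 4, 8, 12):
    print(n, joint_entropy(n) / n)
```

Enumeration is exponential in $n$, so this only works for tiny chains; it is meant to illustrate the definition, not to be an estimator.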

Discussion

While $X$ may be understood as a sequence of random variables, the entropy rate $H(X)$ represents the average entropy change per random variable, in the long term.

It can be thought of as a general property of stochastic sources; this is the subject of the asymptotic equipartition property.

For strongly stationary processes

A stochastic process also gives rise to a sequence of conditional entropies, comprising more and more random variables. For strongly stationary stochastic processes, the entropy rate equals the limit of that sequence:

$$H(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).$$

The quantity given by the limit on the right is also denoted $H'(X)$, a notation motivated by the fact that this is again a rate associated with the process, in the above sense.

For Markov chains

Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution.

For example, consider a Markov chain defined on a countable number of states. Given its right stochastic transition matrix $P_{ij}$ and an entropy

$$h_i := -\sum_j P_{ij} \log P_{ij}$$

associated with each state, one finds

$$H(X) = \sum_i \mu_i h_i,$$

where $\mu_i$ is the asymptotic distribution of the chain.

In particular, it follows that the entropy rate of an i.i.d. stochastic process is the same as the entropy of any individual member in the process.
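The formula above translates directly into code. A minimal sketch (the transition matrix, stationary distribution, and i.i.d. marginal below are illustrative assumptions): it computes $H(X) = \sum_i \mu_i h_i$, and then checks the i.i.d. special case, where every row of the transition matrix equals the marginal and the rate reduces to the single-symbol entropy.

```python
import math

def entropy_rate(P, mu):
    """H(X) = sum_i mu_i * h_i, with h_i = -sum_j P[i][j] * log2 P[i][j]."""
    rate = 0.0
    for i, row in enumerate(P):
        h_i = -sum(p * math.log2(p) for p in row if p > 0)
        rate += mu[i] * h_i
    return rate

# Illustrative 2-state chain; mu is its stationary distribution (mu P = mu,
# since 0.8 * 0.9 + 0.2 * 0.4 = 0.8).
P = [[0.9, 0.1], [0.4, 0.6]]
mu = [0.8, 0.2]
print(entropy_rate(P, mu))

# i.i.d. case: each row is the marginal q, so the rate equals H(q).
q = [0.25, 0.75]
P_iid = [q, q]
print(entropy_rate(P_iid, q))
```

For an irreducible, aperiodic, positive recurrent chain, $\mu$ can be obtained by solving $\mu P = \mu$ with $\sum_i \mu_i = 1$; here it is supplied by hand for brevity.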

For hidden Markov models

The entropy rate of hidden Markov models (HMM) has no known closed-form solution. However, it has known upper and lower bounds. Let the underlying Markov chain $X_1^\infty$ be stationary, and let $Y_1^\infty$ be the observable states. Then we have

$$H(Y_n \mid X_1, Y_1^{n-1}) \leq H(Y) \leq H(Y_n \mid Y_1^{n-1}),$$

and in the limit $n \to \infty$, both sides converge to the middle.[1]
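These bounds can be evaluated numerically for a small HMM. The sketch below uses a hypothetical two-state chain, emission matrix, and stationary start (none of these parameters come from the article). It computes the upper bound as $H(Y_n \mid Y_1^{n-1}) = H(Y_1^n) - H(Y_1^{n-1})$ and the lower bound analogously with $X_1$ included in the joint, using brute-force enumeration with the forward recursion:

```python
import math
from itertools import product

# Hypothetical 2-state HMM (illustrative parameters).
P  = [[0.9, 0.1], [0.4, 0.6]]     # hidden-chain transitions
B  = [[0.95, 0.05], [0.1, 0.9]]   # emission probabilities B[state][symbol]
mu = [0.8, 0.2]                   # stationary distribution of P (mu P = mu)

def seq_prob(y, x1=None):
    """P(Y_1..n = y), optionally jointly with the first hidden state X_1 = x1."""
    alpha = [(mu[i] if x1 is None or i == x1 else 0.0) * B[i][y[0]]
             for i in range(2)]
    for sym in y[1:]:
        alpha = [sum(alpha[i] * P[i][j] for i in range(2)) * B[j][sym]
                 for j in range(2)]
    return sum(alpha)

def block_entropy(n, with_x1=False):
    """H(Y_1^n), or H(X_1, Y_1^n), by enumerating all observation sequences."""
    h = 0.0
    firsts = range(2) if with_x1 else [None]
    for x1 in firsts:
        for y in product(range(2), repeat=n):
            p = seq_prob(y, x1)
            if p > 0:
                h -= p * math.log2(p)
    return h

n = 8
# Chain rule: H(Y_n | Y_1^{n-1}) = H(Y_1^n) - H(Y_1^{n-1}), and similarly
# with X_1 adjoined for the lower bound.
upper = block_entropy(n) - block_entropy(n - 1)
lower = block_entropy(n, True) - block_entropy(n - 1, True)
print(lower, upper)
```

Increasing $n$ tightens the sandwich around the HMM's entropy rate, at exponential cost; practical estimators replace the enumeration with sampling.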

Applications

The entropy rate may be used to estimate the complexity of stochastic processes. It is used in diverse applications, ranging from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.[2]

See also

  • Information source (mathematics)
  • Markov information source
  • Asymptotic equipartition property
  • Maximal entropy random walk (chosen to maximize entropy rate)

References

  1. ^ Cover, Thomas M.; Thomas, Joy A. (2006). "4.5. Functions of Markov chains". Elements of information theory (2nd ed.). Hoboken, N.J: Wiley-Interscience. ISBN 978-0-471-24195-9.
  2. ^ Einicke, G. A. (2018). "Maximum-Entropy Rate Selection of Features for Classifying Changes in Knee and Ankle Dynamics During Running". IEEE Journal of Biomedical and Health Informatics. 28 (4): 1097–1103. doi:10.1109/JBHI.2017.2711487. PMID 29969403. S2CID 49555941.
  • Cover, T.; Thomas, J. (1991). Elements of Information Theory. John Wiley and Sons. ISBN 0-471-06259-6.
