
Markov chains on a measurable state space

A Markov chain on a measurable state space is a discrete-time homogeneous Markov chain with a measurable space as state space.

History

The definition of Markov chains has evolved during the 20th century. In 1953 the term "Markov chain" was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob[1] or Chung.[2] Since the late 20th century it has become more popular to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.[3][4][5]

Definition

Denote with $(E, \Sigma)$ a measurable space and with $p$ a Markov kernel with source and target $(E, \Sigma)$. A stochastic process $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ is called a time-homogeneous Markov chain with Markov kernel $p$ and start distribution $\mu$ if

$$\mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \cdots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \cdots p(y_0, dy_1) \, \mu(dy_0)$$

is satisfied for any $n \in \mathbb{N}$ and $A_0, \dots, A_n \in \Sigma$. One can construct for any Markov kernel and any probability measure an associated Markov chain.[4]
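As an illustrative sketch (not part of the article), on a finite state space the Markov kernel reduces to a row-stochastic matrix $P$ with $P[i, j] = p(i, \{j\})$, and a chain with start distribution $\mu$ can be sampled by drawing $X_0 \sim \mu$ and then $X_{k+1} \sim p(X_k, \cdot)$. All names and numbers below are hypothetical choices for the sketch.

```python
import numpy as np

# Finite state space E = {0, 1, 2}; the kernel p is the row-stochastic
# matrix P, with P[i, j] = p(i, {j}). Values here are illustrative only.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
mu = np.array([1.0, 0.0, 0.0])  # start distribution: Dirac measure in state 0

def sample_chain(P, mu, n, rng):
    """Draw X_0 ~ mu, then X_{k+1} ~ p(X_k, .) for k = 0, ..., n-1."""
    x = rng.choice(len(mu), p=mu)
    path = [x]
    for _ in range(n):
        x = rng.choice(P.shape[1], p=P[x])
        path.append(x)
    return path

rng = np.random.default_rng(0)
path = sample_chain(P, mu, 10, rng)  # one realization of X_0, ..., X_10
```

Since $\mu$ is a Dirac measure here, every sampled path starts in state 0, matching $\mathbb{P}_x[X_0 = x] = 1$ below.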

Remark about Markov kernel integration

For any measure $\mu \colon \Sigma \to [0, \infty]$ we denote for a $\mu$-integrable function $f \colon E \to \mathbb{R} \cup \{\infty, -\infty\}$ the Lebesgue integral as $\int_E f(x) \, \mu(dx)$. For the measure $\nu_x \colon \Sigma \to [0, \infty]$ defined by $\nu_x(A) := p(x, A)$ we use the following notation:

$$\int_E f(y) \, p(x, dy) := \int_E f(y) \, \nu_x(dy)$$
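As a sketch of this notation (assumptions ours, not from the article): on a countable state space the integral $\int_E f(y) \, p(x, dy)$ is simply the weighted sum $\sum_y f(y) \, p(x, \{y\})$.

```python
# Hypothetical kernel on E = {0, 1}: P[x][y] = p(x, {y}), and a test
# function f given pointwise. Both are illustrative values only.
P = {0: {0: 0.5, 1: 0.5},
     1: {0: 0.25, 1: 0.75}}
f = {0: 1.0, 1: 3.0}

def kernel_integral(P, f, x):
    """Compute the integral of f against p(x, .) as the sum over y of f(y) * p(x, {y})."""
    return sum(f[y] * prob for y, prob in P[x].items())

value = kernel_integral(P, f, 0)  # 1.0 * 0.5 + 3.0 * 0.5 = 2.0
```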

Basic properties

Starting in a single point

If $\mu$ is a Dirac measure in $x$, we denote for a Markov kernel $p$ with starting distribution $\mu$ the associated Markov chain as $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P}_x)$ and the expectation value

$$\mathbb{E}_x[X] = \int_\Omega X(\omega) \, \mathbb{P}_x(d\omega)$$

for a $\mathbb{P}_x$-integrable function $X$. By definition, we then have $\mathbb{P}_x[X_0 = x] = 1$.

We have for any measurable function $f \colon E \to [0, \infty]$ the following relation:[4]

$$\int_E f(y) \, p(x, dy) = \mathbb{E}_x[f(X_1)]$$
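The identity above can be checked numerically in a finite-state sketch (values hypothetical): the left side is an exact weighted sum, while the right side is approximated by a Monte Carlo average over draws of $X_1 \sim p(x, \cdot)$.

```python
import numpy as np

# Illustrative two-state kernel and function; P[i, j] = p(i, {j}).
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])
f = np.array([1.0, 3.0])
x = 0

exact = P[x] @ f  # left side: integral of f against p(x, .), here a dot product

rng = np.random.default_rng(0)
draws = rng.choice(2, size=100_000, p=P[x])  # samples of X_1 under P_x
mc = f[draws].mean()                         # right side: Monte Carlo estimate of E_x[f(X_1)]
```

With these values `exact` equals 2.0, and `mc` converges to it as the sample size grows.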

Family of Markov kernels

For a Markov kernel $p$ with starting distribution $\mu$ one can introduce a family of Markov kernels $(p_n)_{n \in \mathbb{N}}$ by

$$p_{n+1}(x, A) := \int_E p_n(y, A) \, p(x, dy)$$

for $n \in \mathbb{N}$, $n \geq 1$, and $p_1 := p$. For the associated Markov chain $(X_n)_{n \in \mathbb{N}}$ according to $p$ and $\mu$ one obtains

$$\mathbb{P}[X_0 \in A, X_n \in B] = \int_A p_n(x, B) \, \mu(dx).$$
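In a finite-state sketch (kernel values ours, purely illustrative), the recursion $p_{n+1}(x, A) = \int_E p_n(y, A) \, p(x, dy)$ becomes matrix multiplication, so $p_n$ corresponds to the matrix power $P^n$:

```python
import numpy as np

# Hypothetical two-state kernel; P[i, j] = p(i, {j}).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def p_n(P, n):
    """n-step kernel via the recursion p_1 = P, p_{n+1} = P @ p_n."""
    result = P.copy()
    for _ in range(n - 1):
        result = P @ result
    return result

P3 = p_n(P, 3)
# The recursion agrees with the direct matrix power P^3:
assert np.allclose(P3, np.linalg.matrix_power(P, 3))
```

Each $p_n$ is again a Markov kernel, so every row of `P3` still sums to 1.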

Stationary measure

A probability measure $\mu$ is called a stationary measure of a Markov kernel $p$ if

$$\int_A \mu(dx) = \int_E p(x, A) \, \mu(dx)$$

holds for any $A \in \Sigma$. If $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ denotes the Markov chain according to a Markov kernel $p$ with stationary measure $\mu$, and the distribution of $X_0$ is $\mu$, then all $X_n$ have the same probability distribution, namely:

$$\mathbb{P}[X_n \in A] = \mu(A)$$

for any $A \in \Sigma$.
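In the finite-state sketch (values hypothetical), stationarity $\mu(A) = \int_E p(x, A) \, \mu(dx)$ reads $\mu P = \mu$ with $\mu$ a row vector, and then $\mu P^n = \mu$ for every $n$, matching $\mathbb{P}[X_n \in A] = \mu(A)$:

```python
import numpy as np

# Hypothetical two-state kernel with a known stationary measure.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu = np.array([0.8, 0.2])  # candidate stationary measure

# Stationarity check: mu P = mu  (componentwise: 0.8*0.9 + 0.2*0.4 = 0.8, etc.)
assert np.allclose(mu @ P, mu)

# Consequently the distribution of X_n stays mu for every n:
dist_5 = mu @ np.linalg.matrix_power(P, 5)
```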

Reversibility

A Markov kernel $p$ is called reversible according to a probability measure $\mu$ if

$$\int_A p(x, B) \, \mu(dx) = \int_B p(x, A) \, \mu(dx)$$

holds for any $A, B \in \Sigma$. Setting $A = E$ shows that if $p$ is reversible according to $\mu$, then $\mu$ must be a stationary measure of $p$.
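On a finite state space (sketch values ours), taking $A$ and $B$ to be singletons reduces reversibility to the detailed balance condition $\mu(i) \, p(i, \{j\}) = \mu(j) \, p(j, \{i\})$ for all $i, j$:

```python
import numpy as np

# Hypothetical two-state kernel and measure; mu here is also stationary for P.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu = np.array([0.8, 0.2])

def is_reversible(P, mu, tol=1e-12):
    """Check detailed balance: mu[i] * P[i, j] == mu[j] * P[j, i] for all i, j."""
    flow = mu[:, None] * P  # flow[i, j] = mu(i) * p(i, {j})
    return np.allclose(flow, flow.T, atol=tol)
```

Here `is_reversible(P, mu)` holds since $0.8 \cdot 0.1 = 0.2 \cdot 0.4$; a kernel that merely has $\mu$ as stationary measure need not pass this symmetry check.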

See also

Harris chain
Subshift of finite type

References

  1. ^ Joseph L. Doob: Stochastic Processes. New York: John Wiley & Sons, 1953.
  2. ^ Kai L. Chung: Markov Chains with Stationary Transition Probabilities. Second edition. Berlin: Springer-Verlag, 1974.
  3. ^ Sean Meyn and Richard L. Tweedie: Markov Chains and Stochastic Stability. 2nd edition, 2009.
  4. ^ a b c Daniel Revuz: Markov Chains. 2nd edition, 1984.
  5. ^ Rick Durrett: Probability: Theory and Examples. Fourth edition, 2005.
