Continuous mapping theorem

Not to be confused with the contraction mapping theorem.

In probability theory, the continuous mapping theorem states that continuous functions preserve limits even if their arguments are sequences of random variables. A continuous function, in Heine’s definition, is a function that maps convergent sequences into convergent sequences: if xn → x then g(xn) → g(x). The continuous mapping theorem states that this remains true if the deterministic sequence {xn} is replaced with a sequence of random variables {Xn}, and the standard notion of convergence of real numbers “→” is replaced with one of the types of convergence of random variables.
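
As a concrete illustration (a minimal simulation sketch, not taken from the cited sources; the uniform population, sample size, and evaluation points below are arbitrary illustrative choices), the central limit theorem makes the standardized sample mean Zn converge in distribution to N(0, 1), and the continuous map g(x) = x² then forces Zn² to converge in distribution to a chi-squared variable with one degree of freedom:

    # Minimal simulation sketch: the CLT gives Z_n -> N(0, 1) in distribution, and the
    # continuous mapping theorem then gives g(Z_n) = Z_n**2 -> chi-squared(1).
    # All numerical choices (Uniform(0, 1) data, n, reps, grid points) are illustrative.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, reps = 200, 100_000
    mu, sigma = 0.5, np.sqrt(1 / 12)                        # mean and sd of Uniform(0, 1)

    samples = rng.random((reps, n))                         # reps independent samples of size n
    z = np.sqrt(n) * (samples.mean(axis=1) - mu) / sigma    # approximately N(0, 1) for large n
    g_z = z ** 2                                            # apply the continuous map g(x) = x**2

    # The empirical CDF of g(Z_n) should be close to the chi-squared(1) CDF.
    for t in (0.5, 1.0, 2.0, 4.0):
        print(f"t={t}: empirical {np.mean(g_z <= t):.4f}, chi2(1) {stats.chi2.cdf(t, df=1):.4f}")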

This theorem was first proved by Henry Mann and Abraham Wald in 1943,[1] and it is therefore sometimes called the Mann–Wald theorem.[2] Meanwhile, Denis Sargan refers to it as the general transformation theorem.[3]

Statement

Let {Xn}, X be random elements defined on a metric space S. Suppose a function g: S → S′ (where S′ is another metric space) has the set of discontinuity points Dg such that Pr[X ∈ Dg] = 0. Then[4][5]

    X_n \xrightarrow{\text{d}} X \quad\Rightarrow\quad g(X_n) \xrightarrow{\text{d}} g(X)
    X_n \xrightarrow{\text{p}} X \quad\Rightarrow\quad g(X_n) \xrightarrow{\text{p}} g(X)
    X_n \xrightarrow{\text{a.s.}} X \quad\Rightarrow\quad g(X_n) \xrightarrow{\text{a.s.}} g(X)

where the superscripts "d", "p", and "a.s." denote convergence in distribution, convergence in probability, and almost sure convergence, respectively.
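
For example (an illustrative case under the stated hypotheses, not drawn from the cited references): take S = S′ = ℝ, let X be a standard normal random variable, and let g(x) = 1/x for x ≠ 0, with g(0) defined arbitrarily. The only discontinuity point is 0 and Pr[X = 0] = 0, so the condition on Dg holds and the theorem gives

    D_g = \{0\}, \qquad \Pr[X \in D_g] = 0, \qquad X_n \xrightarrow{\text{d}} X \ \Rightarrow\ \frac{1}{X_n} \xrightarrow{\text{d}} \frac{1}{X}.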

Proof

This proof has been adapted from (van der Vaart 1998, Theorem 2.3).

Spaces S and S′ are equipped with certain metrics. For simplicity we will denote both of these metrics using the |x − y| notation, even though the metrics may be arbitrary and not necessarily Euclidean.

Convergence in distribution

We will need a particular statement from the portmanteau theorem: convergence in distribution Xn →d X is equivalent to

    \operatorname{E} f(X_n) \to \operatorname{E} f(X)

for every bounded continuous functional f.

So it suffices to prove that E f(g(Xn)) → E f(g(X)) for every bounded continuous functional f. Note that F = f ∘ g is itself a bounded continuous functional. And so the claim follows from the statement above.
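
Written out as a single chain (a compact restatement of the argument above, under the same hypotheses), the two displays combine to

    \operatorname{E} f\big(g(X_n)\big) = \operatorname{E} (f \circ g)(X_n) \ \longrightarrow\ \operatorname{E} (f \circ g)(X) = \operatorname{E} f\big(g(X)\big),

and since f was an arbitrary bounded continuous functional, the portmanteau criterion yields g(Xn) →d g(X).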

Convergence in probability

Fix an arbitrary ε > 0. Then for any δ > 0 consider the set Bδ defined as

    B_\delta = \big\{ x \in S \mid x \notin D_g,\ \exists y \in S:\ |x - y| < \delta,\ |g(x) - g(y)| > \varepsilon \big\}

This is the set of continuity points x of the function g(·) for which it is possible to find, within the δ-neighborhood of x, a point which maps outside the ε-neighborhood of g(x). By definition of continuity, this set shrinks as δ goes to zero, so that limδ→0 Bδ = ∅.

Now suppose that |g(X) − g(Xn)| > ε. This implies that at least one of the following is true: either |X − Xn| ≥ δ, or X ∈ Dg, or X ∈ Bδ. In terms of probabilities this can be written as

    \Pr\big(|g(X_n) - g(X)| > \varepsilon\big) \ \leq\ \Pr\big(|X_n - X| \geq \delta\big) + \Pr(X \in B_\delta) + \Pr(X \in D_g)

On the right-hand side, the first term converges to zero as n → ∞ for any fixed δ, by the definition of convergence in probability of the sequence {Xn}. The second term converges to zero as δ → 0, since the set Bδ shrinks to an empty set. And the last term is identically equal to zero by assumption of the theorem. Therefore, the conclusion is that

    \lim_{n\to\infty} \Pr\big(|g(X_n) - g(X)| > \varepsilon\big) = 0

which means that g(Xn) converges to g(X) in probability.
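
The two-step limiting argument can be written compactly (a restatement of the step above, not an additional result): for every fixed δ > 0,

    \limsup_{n\to\infty} \Pr\big(|g(X_n) - g(X)| > \varepsilon\big) \ \leq\ \Pr(X \in B_\delta) + \Pr(X \in D_g),

and the right-hand side tends to 0 as δ → 0; since ε > 0 was arbitrary, this is exactly the definition of convergence in probability.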

Almost sure convergence

By definition of the continuity of the function g(·),

    \lim_{n\to\infty} X_n(\omega) = X(\omega) \quad\Rightarrow\quad \lim_{n\to\infty} g\big(X_n(\omega)\big) = g\big(X(\omega)\big)

at each point X(ω) where g(·) is continuous. Therefore,

    \begin{aligned}
    \Pr\Big(\lim_{n\to\infty} g(X_n) = g(X)\Big) &\geq \Pr\Big(\lim_{n\to\infty} g(X_n) = g(X),\ X \notin D_g\Big) \\
    &\geq \Pr\Big(\lim_{n\to\infty} X_n = X,\ X \notin D_g\Big) = 1
    \end{aligned}

because the intersection of two almost sure events is almost sure.

By definition, we conclude that g(Xn) converges to g(X) almost surely.
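
Equivalently, in terms of events (a restatement of the display above): the inclusion

    \{\omega : X_n(\omega) \to X(\omega)\} \cap \{\omega : X(\omega) \notin D_g\} \ \subseteq\ \{\omega : g(X_n(\omega)) \to g(X(\omega))\}

holds, and both events on the left-hand side have probability 1 by assumption, so their intersection does as well.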

See also

Slutsky's theorem
Portmanteau theorem
Pushforward measure

References

  1. ^ Mann, H. B.; Wald, A. (1943). "On Stochastic Limit and Order Relationships". Annals of Mathematical Statistics. 14 (3): 217–226. doi:10.1214/aoms/1177731415. JSTOR 2235800.
  2. ^ Amemiya, Takeshi (1985). Advanced Econometrics. Cambridge, MA: Harvard University Press. p. 88. ISBN 0-674-00560-0.
  3. ^ Sargan, Denis (1988). Lectures on Advanced Econometric Theory. Oxford: Basil Blackwell. pp. 4–8. ISBN 0-631-14956-2.
  4. ^ Billingsley, Patrick (1969). Convergence of Probability Measures. John Wiley & Sons. p. 31 (Corollary 1). ISBN 0-471-07242-7.
  5. ^ van der Vaart, A. W. (1998). Asymptotic Statistics. New York: Cambridge University Press. p. 7 (Theorem 2.3). ISBN 0-521-49603-9.
