
Minimum-variance unbiased estimator

In statistics, a minimum-variance unbiased estimator (MVUE) or uniformly minimum-variance unbiased estimator (UMVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter.

For practical statistics problems, it is important to determine the MVUE if one exists, since less-than-optimal procedures would naturally be avoided, other things being equal. This has led to substantial development of statistical theory related to the problem of optimal estimation.

While combining the constraint of unbiasedness with the desirability metric of least variance leads to good results in most practical settings—making MVUE a natural starting point for a broad range of analyses—a targeted specification may perform better for a given problem; thus, MVUE is not always the best stopping point.

Definition

Consider estimation of $g(\theta)$ based on data $X_1, X_2, \ldots, X_n$ i.i.d. from some member of a family of densities $p_\theta,\ \theta \in \Omega$, where $\Omega$ is the parameter space. An unbiased estimator $\delta(X_1, X_2, \ldots, X_n)$ of $g(\theta)$ is UMVUE if $\forall \theta \in \Omega$,

$$\operatorname{var}\bigl(\delta(X_1, X_2, \ldots, X_n)\bigr) \leq \operatorname{var}\bigl(\tilde{\delta}(X_1, X_2, \ldots, X_n)\bigr)$$

for any other unbiased estimator $\tilde{\delta}$.

If an unbiased estimator of $g(\theta)$ exists, then one can prove there is an essentially unique MVUE.[1] Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family $p_\theta,\ \theta \in \Omega$ and conditioning any unbiased estimator on it.

Further, by the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete, sufficient statistic is the UMVUE estimator.

Put formally, suppose $\delta(X_1, X_2, \ldots, X_n)$ is unbiased for $g(\theta)$, and that $T$ is a complete sufficient statistic for the family of densities. Then

$$\eta(X_1, X_2, \ldots, X_n) = \operatorname{E}\bigl(\delta(X_1, X_2, \ldots, X_n) \mid T\bigr)$$

is the MVUE for $g(\theta)$.
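To make the conditioning step concrete, the following is a minimal simulation sketch; the Bernoulli setup, sample size, and parameter values are illustrative assumptions, not taken from this article. The crude unbiased estimator $X_1$ of $p$ is conditioned on the complete sufficient statistic $T = \sum_i X_i$, which yields the sample mean: the mean is preserved while the variance shrinks.

```python
import numpy as np

# Minimal sketch (illustrative, not from the article): Bernoulli(p) data.
# The crude estimator delta = X_1 is unbiased for p; conditioning it on the
# complete sufficient statistic T = sum(X_i) gives E[X_1 | T] = T/n,
# the sample mean, which has the same mean but much smaller variance.
rng = np.random.default_rng(0)
p, n, reps = 0.3, 20, 200_000          # assumed illustrative values

X = rng.binomial(1, p, size=(reps, n))
crude = X[:, 0]                        # delta(X) = X_1
conditioned = X.mean(axis=1)           # E[delta | T] = T / n

print("mean (crude, conditioned):", crude.mean(), conditioned.mean())  # both ~ p
print("var  (crude, conditioned):", crude.var(), conditioned.var())    # ~ p(1-p) vs ~ p(1-p)/n
```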

A Bayesian analog is a Bayes estimator, particularly with minimum mean square error (MMSE).

Estimator selection

An efficient estimator need not exist, but if it does and if it is unbiased, it is the MVUE. Since the mean squared error (MSE) of an estimator δ is

$$\operatorname{MSE}(\delta) = \operatorname{var}(\delta) + \bigl[\operatorname{bias}(\delta)\bigr]^2,$$

the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE because they have a smaller variance than does any unbiased estimator; see estimator bias.
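As a rough numerical illustration of this decomposition (a sketch with assumed normal data and illustrative parameter values, not an example from this article), the biased variance estimator with divisor $n$ can beat the unbiased one with divisor $n-1$ in MSE:

```python
import numpy as np

# Minimal sketch (assumed normal data and illustrative values, not from the
# article): MSE(delta) = var(delta) + bias(delta)^2, and a biased estimator
# (sample variance with divisor n) can have lower MSE than the unbiased one
# (divisor n - 1).
rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 10, 200_000

X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = X.var(axis=1, ddof=1)   # divisor n - 1, unbiased
s2_biased = X.var(axis=1, ddof=0)     # divisor n, biased downward

for name, est in [("unbiased (n-1)", s2_unbiased), ("biased (n)   ", s2_biased)]:
    bias = est.mean() - sigma2
    mse = np.mean((est - sigma2) ** 2)
    print(f"{name}: var + bias^2 = {est.var() + bias**2:.4f}, MSE = {mse:.4f}")
```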

Example

Consider the data to be a single observation from an absolutely continuous distribution on $\mathbb{R}$ with density

$$p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}},$$

and we wish to find the UMVU estimator of

$$g(\theta) = \frac{1}{\theta^2}.$$

First we recognize that the density can be written as

$$\frac{e^{-x}}{1 + e^{-x}} \exp\bigl(-\theta \log(1 + e^{-x}) + \log(\theta)\bigr),$$

which is an exponential family with sufficient statistic $T = \log(1 + e^{-x})$. In fact this is a full-rank exponential family, and therefore $T$ is complete sufficient. See exponential family for a derivation which shows that

$$\operatorname{E}(T) = \frac{1}{\theta}, \qquad \operatorname{var}(T) = \frac{1}{\theta^2}.$$

Therefore, since $\operatorname{E}(T^2) = \operatorname{var}(T) + \bigl(\operatorname{E}(T)\bigr)^2$,

$$\operatorname{E}(T^2) = \frac{2}{\theta^2}.$$

Here we use the Lehmann–Scheffé theorem to get the MVUE. Clearly $\delta(X) = \frac{T^2}{2}$ is unbiased and $T = \log(1 + e^{-x})$ is complete sufficient, thus the UMVU estimator is

$$\eta(X) = \operatorname{E}\bigl(\delta(X) \mid T\bigr) = \operatorname{E}\!\left(\left.\frac{T^2}{2}\,\right|\, T\right) = \frac{T^2}{2} = \frac{\bigl(\log(1 + e^{-X})\bigr)^2}{2}.$$

This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU, as the Lehmann–Scheffé theorem states.
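A small simulation can sanity-check this result. The sketch below assumes an illustrative value of $\theta$; the inverse-CDF sampler uses $F(x) = (1 + e^{-x})^{-\theta}$, a step derived here from the density above rather than stated in the article. It draws single observations, forms $T^2/2$, and compares its average to $g(\theta) = 1/\theta^2$:

```python
import numpy as np

# Minimal simulation sketch of the example above (theta value assumed for
# illustration). The CDF implied by the density is F(x) = (1 + e^{-x})^{-theta}
# (derived here, not stated in the article), so X can be drawn by inverting F.
# We then check that T^2 / 2 averages to g(theta) = 1 / theta^2.
rng = np.random.default_rng(2)
theta, reps = 2.5, 1_000_000

U = rng.uniform(size=reps)
X = -np.log(U ** (-1.0 / theta) - 1.0)   # inverse-CDF sample of X
T = np.log1p(np.exp(-X))                 # sufficient statistic T = log(1 + e^{-X})

print("empirical mean of T^2 / 2  :", np.mean(T ** 2 / 2))
print("target g(theta) = 1/theta^2:", 1.0 / theta ** 2)
```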

Other examples

For a normal distribution with unknown mean and variance, the sample mean and (unbiased) sample variance are the MVUEs for the population mean and population variance. However, the sample standard deviation is not unbiased for the population standard deviation; see unbiased estimation of standard deviation. Further, for other distributions the sample mean and sample variance are not in general MVUEs; for a uniform distribution with unknown upper and lower bounds, the mid-range is the MVUE for the population mean.

If $k$ exemplars are chosen (without replacement) from a discrete uniform distribution over the set $\{1, 2, \ldots, N\}$ with unknown upper bound $N$, the MVUE for $N$ is

$$\frac{k + 1}{k} m - 1,$$

where $m$ is the sample maximum. This is a scaled and shifted (so unbiased) transform of the sample maximum, which is a sufficient and complete statistic. See German tank problem for details.
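A quick simulation sketch (with illustrative values of $N$, $k$, and the replication count assumed here) can check that this estimator is indeed unbiased for $N$:

```python
import numpy as np

# Minimal sketch (N, k, and replication count are assumed illustrative values)
# checking that (k + 1)/k * m - 1 is unbiased for N when k serials are drawn
# without replacement from {1, ..., N}.
rng = np.random.default_rng(3)
N, k, reps = 1000, 5, 100_000

estimates = np.empty(reps)
for i in range(reps):
    sample = rng.choice(np.arange(1, N + 1), size=k, replace=False)
    m = sample.max()                       # sample maximum, complete sufficient
    estimates[i] = (k + 1) / k * m - 1     # scaled and shifted to remove bias

print("mean of estimates:", estimates.mean())   # ~ N
print("true N           :", N)
```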

See also

  • Cramér–Rao bound
  • Best linear unbiased estimator (BLUE)
  • Bias–variance tradeoff
  • Lehmann–Scheffé theorem
  • U-statistic

Bayesian analogs

  • Bayes estimator
  • Minimum mean square error (MMSE)

References

  1. ^ Lee, A. J. (1990). U-Statistics: Theory and Practice. New York: M. Dekker. ISBN 0824782534. OCLC 21523971.
  • Keener, Robert W. (2006). Statistical Theory: Notes for a Course in Theoretical Statistics. Springer. pp. 47–48, 57–58.
  • Keener, Robert W. (2010). Theoretical Statistics: Topics for a Core Course. New York: Springer. doi:10.1007/978-0-387-93839-4.
  • Voinov, V. G.; Nikulin, M. S. (1993). Unbiased Estimators and Their Applications, Vol. 1: Univariate Case. Kluwer Academic Publishers. 521 pp.
