
Median

In statistics and probability theory, the median is the value separating the higher half from the lower half of a data sample, a population, or a probability distribution. For a data set, it may be thought of as "the middle" value. The basic feature of the median in describing data, compared to the mean (often simply described as the "average"), is that it is not skewed by a small proportion of extremely large or small values, and therefore provides a better representation of the center. Median income, for example, may be a better way to describe the center of the income distribution, because increases in the largest incomes alone have no effect on the median. For this reason, the median is of central importance in robust statistics.

Finding the median in sets of data with an odd and even number of values.

Finite data set of numbers

The median of a finite list of numbers is the "middle" number, when those numbers are listed in order from smallest to greatest.

If the data set has an odd number of observations, the middle one is selected. For example, the following list of seven numbers,

1, 3, 3, 6, 7, 8, 9

has a median of 6, which is the fourth value.

If the data set has an even number of observations, there is no distinct middle value and the median is usually defined to be the arithmetic mean of the two middle values.[1][2] For example, this data set of 8 numbers

1, 2, 3, 4, 5, 6, 8, 9

has a median value of 4.5, that is (4 + 5) / 2 = 4.5. (In more technical terms, this interprets the median as the fully trimmed mid-range.)

In general, with this convention, the median can be defined as follows: For a data set x of n elements, ordered from smallest to greatest,

if n is odd, med(x) = x_{(n+1)/2}
if n is even, med(x) = ( x_{(n/2)} + x_{(n/2+1)} ) / 2
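As an illustration of this convention, the following short Python sketch (not part of the original article; the function name median is ours) sorts the list and applies the odd/even rule above. Python's standard statistics.median uses the same convention.

def median(values):
    # Middle element if the count is odd, mean of the two middle elements if even.
    s = sorted(values)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

print(median([1, 3, 3, 6, 7, 8, 9]))     # 6
print(median([1, 2, 3, 4, 5, 6, 8, 9]))  # 4.5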
Comparison of common averages of the values [ 1, 2, 2, 3, 4, 7, 9 ]
Type             Description                                                               Example                          Result
Midrange         Midway point between the minimum and the maximum of a data set           1, 2, 2, 3, 4, 7, 9              5
Arithmetic mean  Sum of values of a data set divided by number of values: x̄ = (Σ x_i)/n   (1 + 2 + 2 + 3 + 4 + 7 + 9) / 7  4
Median           Middle value separating the greater and lesser halves of a data set      1, 2, 2, 3, 4, 7, 9              3
Mode             Most frequent value in a data set                                         1, 2, 2, 3, 4, 7, 9              2
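The four summaries in the table can be checked with the Python standard library (a sketch for illustration only):

from statistics import mean, median, mode

data = [1, 2, 2, 3, 4, 7, 9]
print((min(data) + max(data)) / 2)  # midrange: 5.0
print(mean(data))                   # arithmetic mean: 4
print(median(data))                 # median: 3
print(mode(data))                   # mode: 2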

Formal definition and notation

Formally, a median of a population is any value such that at least half of the population is less than or equal to the proposed median and at least half is greater than or equal to the proposed median. As seen above, medians may not be unique. If each of these two sets contains strictly more than half the population, then some of the population is exactly equal to the median, and the median is unique.

The median is well-defined for any ordered (one-dimensional) data and is independent of any distance metric. The median can thus be applied to school classes which are ranked but not numerical (e.g. working out a median grade when student test scores are graded from F to A), although the result might be halfway between classes if there is an even number of classes. (For odd number classes, one specific class is determined as the median.)

A geometric median, on the other hand, is defined in any number of dimensions. A related concept, in which the outcome is forced to correspond to a member of the sample, is the medoid.

There is no widely accepted standard notation for the median, but some authors represent the median of a variable x as med(x), as x̃,[3] as μ_{1/2},[1] or as M.[3][4] In any of these cases, the use of these or other symbols for the median needs to be explicitly defined when they are introduced.

The median is a special case of other ways of summarizing the typical values associated with a statistical distribution: it is the 2nd quartile, 5th decile, and 50th percentile.

Uses

The median can be used as a measure of location when one attaches reduced importance to extreme values, typically because a distribution is skewed, extreme values are not known, or outliers are untrustworthy, i.e., may be measurement/transcription errors.

For example, consider the multiset

1, 2, 2, 2, 3, 14.

The median is 2 in this case, as is the mode, and it might be seen as a better indication of the center than the arithmetic mean of 4, which is larger than all but one of the values. However, the widely cited empirical relationship that the mean is shifted "further into the tail" of a distribution than the median is not generally true. At most, one can say that the two statistics cannot be "too far" apart; see § Inequality relating means and medians below.[5]

As a median is based on the middle data in a set, it is not necessary to know the value of extreme results in order to calculate it. For example, in a psychology test investigating the time needed to solve a problem, if a small number of people failed to solve the problem at all in the given time a median can still be calculated.[6]

Because the median is simple to understand and easy to calculate, while also a robust approximation to the mean, the median is a popular summary statistic in descriptive statistics. In this context, there are several choices for a measure of variability: the range, the interquartile range, the mean absolute deviation, and the median absolute deviation.

For practical purposes, different measures of location and dispersion are often compared on the basis of how well the corresponding population values can be estimated from a sample of data. The median, estimated using the sample median, has good properties in this regard. While it is not usually optimal if a given population distribution is assumed, its properties are always reasonably good. For example, a comparison of the efficiency of candidate estimators shows that the sample mean is more statistically efficient when—and only when—data is uncontaminated by data from heavy-tailed distributions or from mixtures of distributions.[citation needed] Even then, the median has a 64% efficiency compared to the minimum-variance mean (for large normal samples), which is to say the variance of the median will be about 57% greater than the variance of the mean.[7][8]

Probability distributions

 
Geometric visualization of the mode, median and mean of an arbitrary probability density function.[9]

For any real-valued probability distribution with cumulative distribution function F, a median is defined as any real number m that satisfies the inequalities

∫_{−∞}^{m} dF(x) ≥ 1/2   and   ∫_{m}^{∞} dF(x) ≥ 1/2

An equivalent phrasing uses a random variable X distributed according to F:

P(X ≤ m) ≥ 1/2   and   P(X ≥ m) ≥ 1/2

Note that this definition does not require X to have an absolutely continuous distribution (which has a probability density function f), nor does it require a discrete one. In the former case, the inequalities can be upgraded to equality: a median satisfies

P(X ≤ m) = ∫_{−∞}^{m} f(x) dx = 1/2 = ∫_{m}^{∞} f(x) dx = P(X ≥ m)

Any probability distribution on the real number set ℝ has at least one median, but in pathological cases there may be more than one median: if F is constant 1/2 on an interval (so that f = 0 there), then any value of that interval is a median.

Medians of particular distributions

The medians of certain types of distributions can be easily calculated from their parameters; furthermore, they exist even for some distributions lacking a well-defined mean, such as the Cauchy distribution:

  • The median of a symmetric unimodal distribution coincides with the mode.
  • The median of a symmetric distribution which possesses a mean μ also takes the value μ.
  • The median of a normal distribution with mean μ and variance σ^2 is μ. In fact, for a normal distribution, mean = median = mode.
  • The median of a uniform distribution on the interval [a, b] is (a + b) / 2, which is also the mean.
  • The median of a Cauchy distribution with location parameter x_0 and scale parameter γ is x_0, the location parameter.
  • The median of a power law distribution x^−a, with exponent a > 1, is 2^{1/(a − 1)} x_min, where x_min is the minimum value for which the power law holds.[10]
  • The median of an exponential distribution with rate parameter λ is the natural logarithm of 2 divided by the rate parameter: ln(2)/λ.
  • The median of a Weibull distribution with shape parameter k and scale parameter λ is λ (ln 2)^{1/k}.
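For an absolutely continuous distribution the defining condition reduces to F(m) = 1/2, so a median can also be found numerically by inverting the CDF. The following bisection sketch (an illustration, not from the article; the function median_from_cdf and its bracketing interval are assumptions) recovers the exponential median ln(2)/λ listed above:

import math

def median_from_cdf(F, lo, hi, tol=1e-12):
    # Bisection for m with F(m) = 1/2, assuming F is continuous and increasing
    # on [lo, hi] with F(lo) < 1/2 < F(hi).
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if F(mid) < 0.5:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam = 2.0
F_exp = lambda x: 1.0 - math.exp(-lam * x)
print(median_from_cdf(F_exp, 0.0, 100.0))  # ~0.34657
print(math.log(2) / lam)                   # 0.34657..., i.e. ln(2)/lambda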

Properties

Optimality property

The mean absolute error of a real variable c with respect to the random variable X is

E( |X − c| )

Provided that the probability distribution of X is such that the above expectation exists, then m is a median of X if and only if m is a minimizer of the mean absolute error with respect to X.[11] In particular, if m is a sample median, then it minimizes the arithmetic mean of the absolute deviations.[12] Note, however, that in cases where the sample contains an even number of elements, this minimizer is not unique.

More generally, a median is defined as a minimum of

E( |X − c| − |X| )

as discussed below in the section on multivariate medians (specifically, the spatial median).

This optimization-based definition of the median is useful in statistical data-analysis, for example, in k-medians clustering.
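As a quick numerical illustration of the optimality property (a sketch, not from the article), the mean absolute deviation of the multiset from § Uses is evaluated over a grid of candidate centers; it is smallest at the median 2, not at the mean 4:

data = [1, 2, 2, 2, 3, 14]

def mean_abs_dev(c, xs):
    # Arithmetic mean of absolute deviations of xs from c.
    return sum(abs(x - c) for x in xs) / len(xs)

candidates = [i / 10 for i in range(0, 151)]
best = min(candidates, key=lambda c: mean_abs_dev(c, data))
print(best)                   # 2.0, the median
print(mean_abs_dev(2, data))  # 2.333...
print(mean_abs_dev(4, data))  # 3.333... (the mean is a worse center in this sense)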

Inequality relating means and medians

 
Comparison of mean, median and mode of two log-normal distributions with different skewness

If the distribution has finite variance, then the distance between the median X̃ and the mean X̄ is bounded by one standard deviation.

This bound was proved by Book and Sher in 1979 for discrete samples,[13] and more generally by Page and Murty in 1982.[14] In a comment on a subsequent proof by O'Cinneide,[15] Mallows in 1991 presented a compact proof that uses Jensen's inequality twice,[16] as follows. Using |·| for the absolute value, we have

|μ − m| = |E(X − m)| ≤ E(|X − m|) ≤ E(|X − μ|) ≤ √( E[(X − μ)^2] ) = σ

The first and third inequalities come from Jensen's inequality applied to the absolute-value function and the square function, which are each convex. The second inequality comes from the fact that a median minimizes the absolute deviation function a ↦ E(|X − a|).

Mallows's proof can be generalized to obtain a multivariate version of the inequality[17] simply by replacing the absolute value with a norm:

‖μ − m‖ ≤ √( E[‖X − μ‖^2] ) = √( trace(var(X)) )

where m is a spatial median, that is, a minimizer of the function a ↦ E(‖X − a‖). The spatial median is unique when the data set's dimension is two or more.[18][19]

An alternative proof uses the one-sided Chebyshev inequality; it appears in an inequality on location and scale parameters. This formula also follows directly from Cantelli's inequality.[20]

Unimodal distributions

For the case of unimodal distributions, one can achieve a sharper bound on the distance between the median and the mean:

|X̃ − X̄| ≤ (3/5)^{1/2} σ ≈ 0.7746 σ.[21]

A similar relation holds between the median and the mode:

|X̃ − mode| ≤ 3^{1/2} σ ≈ 1.732 σ

Jensen's inequality for medians

Jensen's inequality states that for any random variable X with a finite expectation E[X] and for any convex function f

f( E[X] ) ≤ E[ f(X) ]

This inequality generalizes to the median as well. We say a function f: ℝ → ℝ is a C function if, for any t,

f^{−1}( (−∞, t] ) = { x ∈ ℝ : f(x) ≤ t }

is a closed interval (allowing the degenerate cases of a single point or an empty set). Every convex function is a C function, but the reverse does not hold. If f is a C function, then

f( med[X] ) ≤ med[ f(X) ]

If the medians are not unique, the statement holds for the corresponding suprema.[22]

Medians for samples

Efficient computation of the sample median

Even though comparison-sorting n items requires Ω(n log n) operations, selection algorithms can compute the kth-smallest of n items with only Θ(n) operations. This includes the median, which is the n/2th order statistic (or for an even number of samples, the arithmetic mean of the two middle order statistics).[23]
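As a sketch of the selection idea (not from the cited sources), the following quickselect routine finds the kth smallest element in expected linear time; a guaranteed worst-case Θ(n) bound requires a more careful pivot choice such as median of medians.

import random

def quickselect(xs, k):
    # k-th smallest element of xs, 0-based, expected O(n) time.
    xs = list(xs)
    while True:
        pivot = random.choice(xs)
        lows   = [x for x in xs if x < pivot]
        pivots = [x for x in xs if x == pivot]
        highs  = [x for x in xs if x > pivot]
        if k < len(lows):
            xs = lows
        elif k < len(lows) + len(pivots):
            return pivot
        else:
            k -= len(lows) + len(pivots)
            xs = highs

data = [7, 1, 3, 9, 8, 3, 6]
print(quickselect(data, len(data) // 2))  # 6, the median of these 7 values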

Selection algorithms still have the downside of requiring Ω(n) memory, that is, they need to have the full sample (or a linear-sized portion of it) in memory. Because this, as well as the linear time requirement, can be prohibitive, several estimation procedures for the median have been developed. A simple one is the median of three rule, which estimates the median as the median of a three-element subsample; this is commonly used as a subroutine in the quicksort sorting algorithm, which uses an estimate of its input's median. A more robust estimator is Tukey's ninther, which is the median of three rule applied with limited recursion:[24] if A is the sample laid out as an array, and

med3(A) = med(A[1], A[n/2], A[n]),

then

ninther(A) = med3(med3(A[1 ... n/3]), med3(A[n/3 ... 2n/3]), med3(A[2n/3 ... n]))
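A direct Python transcription of these two rules might look as follows (a sketch with 0-based indexing; slicing the array into consecutive thirds is an assumption about how the thirds are taken):

def med3(a, b, c):
    # Median of three values.
    return sorted([a, b, c])[1]

def median_of_three(A):
    # Median-of-three rule: first, middle and last element of the array.
    return med3(A[0], A[len(A) // 2], A[-1])

def ninther(A):
    # Tukey's ninther: median-of-three applied to the three thirds of A.
    n = len(A)
    return med3(median_of_three(A[: n // 3]),
                median_of_three(A[n // 3 : 2 * n // 3]),
                median_of_three(A[2 * n // 3 :]))

print(ninther(list(range(27))))  # 13, the true median of 0..26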

The remedian is an estimator for the median that requires linear time but sub-linear memory, operating in a single pass over the sample.[25]

Sampling distribution

The distributions of both the sample mean and the sample median were determined by Laplace.[26] The distribution of the sample median from a population with a density function f(x) is asymptotically normal with mean μ and variance[27]

1 / (4n f(m)^2)

where m is the median of f(x) and n is the sample size:

Sample median ~ N( μ = m, σ^2 = 1 / (4n f(m)^2) )


A modern proof follows below. Laplace's result is now understood as a special case of the asymptotic distribution of arbitrary quantiles.

For normal samples, the density is f(m) = 1/√(2πσ^2), thus for large samples the variance of the median equals (π/2)·(σ^2/n).[7] (See also section #Efficiency below.)
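The variance formula 1/(4n f(m)^2) is easy to check by simulation. A sketch (not from the article): for an exponential distribution with rate 1, the median is ln 2 and the density there is f(ln 2) = 1/2, so the predicted variance of the sample median is 1/n.

import random
import statistics

random.seed(0)
n, reps = 501, 5000
meds = [statistics.median(random.expovariate(1.0) for _ in range(n))
        for _ in range(reps)]
print(statistics.variance(meds))  # close to the predicted 1/n
print(1 / n)                      # 0.001996...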

Derivation of the asymptotic distribution

We take the sample size to be an odd number N = 2n + 1 and assume our variable continuous; the formula for the case of discrete variables is given below in § Empirical local density. The sample can be summarized as "below median", "at median", and "above median", which corresponds to a trinomial distribution with probabilities F(v), f(v) and 1 − F(v). For a continuous variable, the probability of multiple sample values being exactly equal to the median is 0, so one can calculate the density of the sample median at the point v directly from the trinomial distribution:

Pr(med = v) dv = [ (2n + 1)! / (n! n!) ] F(v)^n (1 − F(v))^n f(v) dv.

Now we introduce the beta function. For integer arguments α and β, this can be expressed as B(α, β) = (α − 1)! (β − 1)! / (α + β − 1)!. Also, recall that f(v) dv = dF(v). Using these relationships and setting both α and β equal to n + 1 allows the last expression to be written as

[ F(v)^n (1 − F(v))^n / B(n + 1, n + 1) ] dF(v)

Hence the density function of the median is a symmetric beta distribution pushed forward by F. Its mean, as we would expect, is 0.5 and its variance is 1 / (4(N + 2)). By the chain rule, the corresponding variance of the sample median is

1 / ( 4(N + 2) f(m)^2 ).

The additional 2 is negligible in the limit.

Empirical local density

In practice, the functions   and   above are often not known or assumed. However, they can be estimated from an observed frequency distribution. In this section, we give an example. Consider the following table, representing a sample of 3,800 (discrete-valued) observations:

v 0 0.5 1 1.5 2 2.5 3 3.5 4 4.5 5
f(v) 0.000 0.008 0.010 0.013 0.083 0.108 0.328 0.220 0.202 0.023 0.005
F(v) 0.000 0.008 0.018 0.031 0.114 0.222 0.550 0.770 0.972 0.995 1.000

Because the observations are discrete-valued, constructing the exact distribution of the median is not an immediate translation of the above expression for Pr(med = v); one may (and typically does) have multiple instances of the median in one's sample. So we must sum over all these possibilities:

Pr(med = v) = Σ_{i=0}^{n} Σ_{k=0}^{n} [ N! / (i! k! (N − i − k)!) ] F(v⁻)^i (1 − F(v))^k f(v)^{N − i − k}, where F(v⁻) denotes the cumulative proportion strictly below v.

Here, i is the number of points strictly less than the median and k the number strictly greater.
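A sketch of this summation in Python (an illustration, not from the article; the names are ours), applied to the frequency table above for a small odd sample size N = 2n + 1:

from math import factorial

f_vals = [0.000, 0.008, 0.010, 0.013, 0.083, 0.108, 0.328, 0.220, 0.202, 0.023, 0.005]

def median_pmf(N):
    # Exact probability that the sample median equals each tabulated value,
    # for an odd sample size N = 2n + 1 drawn from the table above.
    n = (N - 1) // 2
    pmf, F_below = [], 0.0
    for f_v in f_vals:
        p_below, p_at, p_above = F_below, f_v, 1.0 - F_below - f_v
        total = 0.0
        for i in range(n + 1):        # observations strictly below the value
            for k in range(n + 1):    # observations strictly above the value
                at = N - i - k        # observations equal to the value (at least 1)
                coef = factorial(N) // (factorial(i) * factorial(k) * factorial(at))
                total += coef * p_below**i * p_above**k * p_at**at
        pmf.append(total)
        F_below += f_v
    return pmf

probs = median_pmf(3)
print([round(p, 3) for p in probs])  # distribution of the median of a sample of 3
print(round(sum(probs), 6))          # 1.0, up to rounding of the table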

Using these preliminaries, it is possible to investigate the effect of sample size on the standard errors of the mean and median. The observed mean is 3.16, the observed raw median is 3 and the observed interpolated median is 3.174. The following table gives some comparison statistics.

Sample size                                            3      9      15     21
Expected value of median                               3.198  3.191  3.174  3.161
Standard error of median (above formula)               0.482  0.305  0.257  0.239
Standard error of median (asymptotic approximation)    0.879  0.508  0.393  0.332
Standard error of mean                                 0.421  0.243  0.188  0.159

The expected value of the median falls slightly as sample size increases while, as would be expected, the standard errors of both the median and the mean are proportionate to the inverse square root of the sample size. The asymptotic approximation errs on the side of caution by overestimating the standard error.

Estimation of variance from sample data

The value of (2f(x))^{−2}, the asymptotic variance of n^{1/2}(ν̂ − ν) where ν is the population median and ν̂ the sample median, has been studied by several authors. The standard "delete one" jackknife method produces inconsistent results.[28] An alternative, the "delete k" method, where k grows with the sample size, has been shown to be asymptotically consistent.[29] This method may be computationally expensive for large data sets. A bootstrap estimate is known to be consistent,[30] but converges very slowly (order of n^{−1/4}).[31] Other methods have been proposed but their behavior may differ between large and small samples.[32]

Efficiency

The efficiency of the sample median, measured as the ratio of the variance of the mean to the variance of the median, depends on the sample size and on the underlying population distribution. For a sample of size N = 2n + 1 from the normal distribution, the efficiency for large N is

(2/π) · (N + 2) / N

The efficiency tends to 2/π ≈ 0.637 as N tends to infinity.

In other words, the relative variance of the median will be π/2 ≈ 1.57, or 57% greater than the variance of the mean – the relative standard error of the median will be (π/2)^{1/2} ≈ 1.25, or 25% greater than the standard error of the mean, σ/√n (see also section #Sampling distribution above).[33]
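A short simulation sketch (not from the cited sources) reproduces the limiting efficiency 2/π ≈ 0.637 for normal data:

import random
import statistics

random.seed(1)
n, reps = 101, 20000
means, medians = [], []
for _ in range(reps):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(statistics.fmean(xs))
    medians.append(statistics.median(xs))

print(statistics.variance(means) / statistics.variance(medians))
# roughly 0.65 for n = 101; the formula's prediction (2/pi)(N + 2)/N is about 0.649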

Other estimators

For univariate distributions that are symmetric about one median, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population median.[34]

If data is represented by a statistical model specifying a particular family of probability distributions, then estimates of the median can be obtained by fitting that family of probability distributions to the data and calculating the theoretical median of the fitted distribution.[citation needed] Pareto interpolation is an application of this when the population is assumed to have a Pareto distribution.

Multivariate median

Previously, this article discussed the univariate median, when the sample or population was one-dimensional. When the dimension is two or higher, there are multiple concepts that extend the definition of the univariate median; each such multivariate median agrees with the univariate median when the dimension is exactly one.[34][35][36][37]

Marginal median

The marginal median is defined for vectors defined with respect to a fixed set of coordinates. A marginal median is defined to be the vector whose components are univariate medians. The marginal median is easy to compute, and its properties were studied by Puri and Sen.[34][38]

Geometric median

The geometric median of a discrete set of sample points x_1, …, x_N in a Euclidean space is the[a] point minimizing the sum of distances to the sample points.

arg min_{y ∈ ℝ^d} Σ_{i=1}^{N} ‖x_i − y‖_2

In contrast to the marginal median, the geometric median is equivariant with respect to Euclidean similarity transformations such as translations and rotations.
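The geometric median has no closed form in general, but it can be approximated with Weiszfeld's iteratively re-weighted averaging. A minimal sketch (our own illustration; it ignores the corner case of an iterate landing exactly on a sample point):

import math

def geometric_median(points, iters=200):
    # Weiszfeld iteration: average of the points weighted by the inverse of
    # their distance to the current estimate, starting from the centroid.
    d = len(points[0])
    y = [sum(p[j] for p in points) / len(points) for j in range(d)]
    for _ in range(iters):
        w = [1.0 / max(math.dist(p, y), 1e-12) for p in points]
        total = sum(w)
        y = [sum(wi * p[j] for wi, p in zip(w, points)) / total for j in range(d)]
    return y

print(geometric_median([(0, 0), (0, 0), (0, 12)]))  # near (0, 0); the centroid is (0, 4)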

Median in all directions

If the marginal medians for all coordinate systems coincide, then their common location may be termed the "median in all directions".[40] This concept is relevant to voting theory on account of the median voter theorem. When it exists, the median in all directions coincides with the geometric median (at least for discrete distributions).

Centerpoint

In statistics and computational geometry, the notion of centerpoint is a generalization of the median to data in higher-dimensional Euclidean space. Given a set of points in d-dimensional space, a centerpoint of the set is a point such that any hyperplane that goes through that point divides the set of points in two roughly equal subsets: the smaller part should have at least a 1/(d + 1) fraction of the points. Like the median, a centerpoint need not be one of the data points. Every non-empty set of points (with no duplicates) has at least one centerpoint.

Other median-related concepts

Interpolated median

When dealing with a discrete variable, it is sometimes useful to regard the observed values as being midpoints of underlying continuous intervals. An example of this is a Likert scale, on which opinions or preferences are expressed on a scale with a set number of possible responses. If the scale consists of the positive integers, an observation of 3 might be regarded as representing the interval from 2.50 to 3.50. It is possible to estimate the median of the underlying variable. If, say, 22% of the observations are of value 2 or below and 55.0% are of 3 or below (so 33% have the value 3), then the median m is 3, since the median is the smallest value of x for which F(x) is greater than a half. But the interpolated median is somewhere between 2.50 and 3.50. First we add half of the interval width w to the median to get the upper bound of the median interval. Then we subtract that proportion of the interval width which equals the proportion of the 33% which lies above the 50% mark. In other words, we split up the interval width pro rata to the numbers of observations. In this case, the 33% is split into 28% below the median and 5% above it, so we subtract 5/33 of the interval width from the upper bound of 3.50 to give an interpolated median of 3.35. More formally, if the values f(x) and F(x) are known, the interpolated median can be calculated from

m_int = m + w · [ 1/2 − (F(m) − 1/2) / f(m) ]

Alternatively, if in an observed sample there are a scores above the median category, b scores in it and c scores below it, then the interpolated median is given by

m_int = m + (w/2) · (a − c) / b
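A sketch of the first formula applied to the worked example above (the names are ours, not the article's):

def interpolated_median(m, F_m, f_m, w=1.0):
    # m: raw median category, F_m: cumulative proportion at m,
    # f_m: proportion in category m, w: width of the underlying interval.
    return m + w * (0.5 - (F_m - 0.5) / f_m)

print(interpolated_median(3, 0.55, 0.33))  # 3.348..., the 3.35 of the example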

Pseudo-median

For univariate distributions that are symmetric about one median, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population median; for non-symmetric distributions, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population pseudo-median, which is the median of a symmetrized distribution and which is close to the population median.[41] The Hodges–Lehmann estimator has been generalized to multivariate distributions.[42]

Variants of regression

The Theil–Sen estimator is a method for robust linear regression based on finding medians of slopes.[43]
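A compact sketch of the idea (an illustration, not the cited reference's code): the slope estimate is the median of all pairwise slopes, and an intercept can be taken as the median of the residuals y − slope·x.

from itertools import combinations
from statistics import median

def theil_sen(xs, ys):
    # Median of pairwise slopes, with a median-based intercept.
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i, j in combinations(range(len(xs)), 2)
              if xs[i] != xs[j]]
    slope = median(slopes)
    intercept = median(y - slope * x for x, y in zip(xs, ys))
    return slope, intercept

xs = [0, 1, 2, 3, 4, 5]
ys = [0.1, 1.9, 4.2, 6.0, 8.1, 50.0]  # the last point is a gross outlier
print(theil_sen(xs, ys))              # slope stays close to 2 despite the outlier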

Median filter

The median filter is an important tool of image processing that can effectively remove salt-and-pepper noise from grayscale images.
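A one-dimensional sketch of the idea (an illustration only; real image pipelines use two-dimensional windows, e.g. SciPy's median filter): each sample is replaced by the median of a small window around it, which removes isolated spikes while preserving edges better than a moving average would.

from statistics import median

def median_filter_1d(signal, width=3):
    # Replace each value with the median of a window centred on it;
    # windows are clipped at the ends of the signal.
    half = width // 2
    return [median(signal[max(0, i - half): i + half + 1])
            for i in range(len(signal))]

noisy = [1, 1, 9, 1, 1, 2, 2, 0, 2, 2]  # two salt-and-pepper spikes (9 and 0)
print(median_filter_1d(noisy))          # [1, 1, 1, 1, 1, 2, 2, 2, 2, 2]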

Cluster analysis

In cluster analysis, the k-medians clustering algorithm provides a way of defining clusters, in which the criterion used in k-means clustering, minimizing the sum of squared distances from each point to its cluster mean, is replaced by minimizing the sum of distances from each point to its cluster median.

Median–median line

This is a method of robust regression. The idea dates back to Wald in 1940 who suggested dividing a set of bivariate data into two halves depending on the value of the independent parameter  : a left half with values less than the median and a right half with values greater than the median.[44] He suggested taking the means of the dependent   and independent   variables of the left and the right halves and estimating the slope of the line joining these two points. The line could then be adjusted to fit the majority of the points in the data set.

Nair and Shrivastava in 1942 suggested a similar idea but instead advocated dividing the sample into three equal parts before calculating the means of the subsamples.[45] Brown and Mood in 1951 proposed the idea of using the medians of two subsamples rather than the means.[46] Tukey combined these ideas and recommended dividing the sample into three equal size subsamples and estimating the line based on the medians of the subsamples.[47]

Median-unbiased estimators

Any mean-unbiased estimator minimizes the risk (expected loss) with respect to the squared-error loss function, as observed by Gauss. A median-unbiased estimator minimizes the risk with respect to the absolute-deviation loss function, as observed by Laplace. Other loss functions are used in statistical theory, particularly in robust statistics.

The theory of median-unbiased estimators was revived by George W. Brown in 1947:[48]

An estimate of a one-dimensional parameter θ will be said to be median-unbiased if, for fixed θ, the median of the distribution of the estimate is at the value θ; i.e., the estimate underestimates just as often as it overestimates. This requirement seems for most purposes to accomplish as much as the mean-unbiased requirement and has the additional property that it is invariant under one-to-one transformation.

— page 584

Further properties of median-unbiased estimators have been reported.[49][50][51][52] Median-unbiased estimators are invariant under one-to-one transformations.

There are methods of constructing median-unbiased estimators that are optimal (in a sense analogous to the minimum-variance property for mean-unbiased estimators). Such constructions exist for probability distributions having monotone likelihood-functions.[53][54] One such procedure is an analogue of the Rao–Blackwell procedure for mean-unbiased estimators: The procedure holds for a smaller class of probability distributions than does the Rao—Blackwell procedure but for a larger class of loss functions.[55]

History

Scientific researchers in the ancient near east appear not to have used summary statistics at all, instead choosing values that offered maximal consistency with a broader theory that integrated a wide variety of phenomena.[56] Within the Mediterranean (and, later, European) scholarly community, statistics like the mean are fundamentally a medieval and early modern development. (The history of the median outside Europe and its predecessors remains relatively unstudied.)

The idea of the median appeared in the 6th century in the Talmud, in order to fairly analyze divergent appraisals.[57][58] However, the concept did not spread to the broader scientific community.

Instead, the closest ancestor of the modern median is the mid-range, invented by Al-Biruni.[59]: 31 [60] Transmission of his work to later scholars is unclear. He applied his technique to assaying currency metals, but, after he published his work, most assayers still adopted the most unfavorable value from their results, lest they appear to cheat.[59]: 35–8 [61] However, increased navigation at sea during the Age of Discovery meant that ships' navigators increasingly had to attempt to determine latitude in unfavorable weather against hostile shores, leading to renewed interest in summary statistics. Whether rediscovered or independently invented, the mid-range is recommended to nautical navigators in Harriot's "Instructions for Raleigh's Voyage to Guiana, 1595".[59]: 45–8

The idea of the median may have first appeared in Edward Wright's 1599 book Certaine Errors in Navigation on a section about compass navigation.[62] Wright was reluctant to discard measured values, and may have felt that the median — incorporating a greater proportion of the dataset than the mid-range — was more likely to be correct. However, Wright did not give examples of his technique's use, making it hard to verify that he described the modern notion of median.[56][60][b] The median (in the context of probability) certainly appeared in the correspondence of Christiaan Huygens, but as an example of a statistic that was inappropriate for actuarial practice.[56]

The earliest recommendation of the median dates to 1757, when Roger Joseph Boscovich developed a regression method based on the L1 norm and therefore implicitly on the median.[56][63] In 1774, Laplace made this desire explicit: he suggested the median be used as the standard estimator of the value of a posterior PDF. The specific criterion was to minimize the expected magnitude of the error, E(|α − α*|), where α* is the estimate and α is the true value. To this end, Laplace determined the distributions of both the sample mean and the sample median in the early 1800s.[26][64] However, a decade later, Gauss and Legendre developed the least squares method, which minimizes the squared error (α − α*)^2 to obtain the mean. Within the context of regression, Gauss and Legendre's innovation offers vastly easier computation. Consequently, Laplace's proposal was generally rejected until the rise of computing devices 150 years later (and is still a relatively uncommon algorithm).[65]

Antoine Augustin Cournot in 1843 was the first[66] to use the term median (valeur médiane) for the value that divides a probability distribution into two equal halves. Gustav Theodor Fechner used the median (Centralwerth) in sociological and psychological phenomena.[67] It had earlier been used only in astronomy and related fields. Fechner popularized the median in the formal analysis of data, although it had been used previously by Laplace,[67] and the median appeared in a textbook by F. Y. Edgeworth.[68] Francis Galton used the English term median in 1881,[69][70] having earlier used the terms middle-most value in 1869, and the medium in 1880.[71][72]

Statisticians encouraged the use of medians intensely throughout the 19th century for their intuitive clarity and ease of manual computation. However, the notion of median does not lend itself to the theory of higher moments as well as the arithmetic mean does, and is much harder to compute by computer. As a result, the median was steadily supplanted as a notion of generic average by the arithmetic mean during the 20th century.[56][60]

See also

  • Absolute deviation – Difference between a variable's observed value and a reference value
  • Bias of an estimator – Difference between an estimator's expected value and a parameter's true value
  • Central tendency – Statistical value representing the center or average of a distribution
  • Concentration of measure – Statistical parameter for Lipschitz functions
  • Median graph – Graph with a median for each three vertices
  • Median of medians – Algorithm to calculate an approximate median in linear time
  • Median search – Method for finding the kth smallest value
  • Median slope – Statistical method for fitting a line
  • Median voter theory – Theory in political science
  • Medoid – Representative object of a data set or cluster whose sum of dissimilarities to all other objects in the cluster is minimal; a generalization of the median in higher dimensions

Notes

  1. ^ The geometric median is unique unless the sample is collinear.[39]
  2. ^ Subsequent scholars appear to concur with Eisenhart that Boroughs' 1580 figures, while suggestive of the median, in fact describe an arithmetic mean;[59]: 62–3  Boroughs is mentioned in no other work.

References

  1. ^ a b Weisstein, Eric W. "Statistical Median". MathWorld.
  2. ^ Simon, Laura J.; "Descriptive statistics", archived 2010-07-30 at the Wayback Machine, Statistical Education Resource Kit, Pennsylvania State Department of Statistics
  3. ^ a b Derek Bissell (1994). Statistical Methods for Spc and Tqm. CRC Press. pp. 26–. ISBN 978-0-412-39440-9. Retrieved 25 February 2013.
  4. ^ David J. Sheskin (27 August 2003). Handbook of Parametric and Nonparametric Statistical Procedures (Third ed.). CRC Press. p. 7. ISBN 978-1-4200-3626-8. Retrieved 25 February 2013.
  5. ^ Paul T. von Hippel (2005). . Journal of Statistics Education. 13 (2). Archived from the original on 2008-10-14. Retrieved 2015-06-18.
  6. ^ Robson, Colin (1994). Experiment, Design and Statistics in Psychology. Penguin. pp. 42–45. ISBN 0-14-017648-9.
  7. ^ a b Williams, D. (2001). Weighing the Odds. Cambridge University Press. p. 165. ISBN 052100618X.
  8. ^ Maindonald, John; Braun, W. John (2010-05-06). Data Analysis and Graphics Using R: An Example-Based Approach. Cambridge University Press. p. 104. ISBN 978-1-139-48667-5.
  9. ^ . Archived from the original on 8 April 2015. Retrieved 16 March 2015.
  10. ^ Newman, M. E. J. (2005). "Power laws, Pareto distributions and Zipf's law". Contemporary Physics. 46 (5): 323–351. arXiv:cond-mat/0412004. Bibcode:2005ConPh..46..323N. doi:10.1080/00107510500052444. S2CID 2871747.
  11. ^ Stroock, Daniel (2011). Probability Theory. Cambridge University Press. pp. 43. ISBN 978-0-521-13250-3.
  12. ^ DeGroot, Morris H. (1970). Optimal Statistical Decisions. McGraw-Hill Book Co., New York-London-Sydney. p. 232. ISBN 9780471680291. MR 0356303.
  13. ^ Stephen A. Book; Lawrence Sher (1979). "How close are the mean and the median?". The Two-Year College Mathematics Journal. 10 (3): 202–204. doi:10.2307/3026748. JSTOR 3026748. Retrieved 12 March 2022.
  14. ^ Warren Page; Vedula N. Murty (1982). "Nearness Relations Among Measures of Central Tendency and Dispersion: Part 1". The Two-Year College Mathematics Journal. 13 (5): 315–327. doi:10.1080/00494925.1982.11972639 (inactive 1 August 2023). Retrieved 12 March 2022.
  15. ^ O'Cinneide, Colm Art (1990). "The mean is within one standard deviation of any median". The American Statistician. 44 (4): 292–293. doi:10.1080/00031305.1990.10475743. Retrieved 12 March 2022.
  16. ^ Mallows, Colin (August 1991). "Another comment on O'Cinneide". The American Statistician. 45 (3): 257. doi:10.1080/00031305.1991.10475815.
  17. ^ Piché, Robert (2012). Random Vectors and Random Sequences. Lambert Academic Publishing. ISBN 978-3659211966.
  18. ^ Kemperman, Johannes H. B. (1987). Dodge, Yadolah (ed.). "The median of a finite measure on a Banach space: Statistical data analysis based on the L1-norm and related methods". Papers from the First International Conference Held at Neuchâtel, August 31–September 4, 1987. Amsterdam: North-Holland Publishing Co.: 217–230. MR 0949228.
  19. ^ Milasevic, Philip; Ducharme, Gilles R. (1987). "Uniqueness of the spatial median". Annals of Statistics. 15 (3): 1332–1333. doi:10.1214/aos/1176350511. MR 0902264.
  20. ^ K.Van Steen Notes on probability and statistics
  21. ^ Basu, S.; Dasgupta, A. (1997). "The Mean, Median, and Mode of Unimodal Distributions:A Characterization". Theory of Probability and Its Applications. 41 (2): 210–223. doi:10.1137/S0040585X97975447. S2CID 54593178.
  22. ^ Merkle, M. (2005). "Jensen's inequality for medians". Statistics & Probability Letters. 71 (3): 277–281. doi:10.1016/j.spl.2004.11.010.
  23. ^ Alfred V. Aho and John E. Hopcroft and Jeffrey D. Ullman (1974). The Design and Analysis of Computer Algorithms. Reading/MA: Addison-Wesley. ISBN 0-201-00029-6. Here: Section 3.6 "Order Statistics", p.97-99, in particular Algorithm 3.6 and Theorem 3.9.
  24. ^ Bentley, Jon L.; McIlroy, M. Douglas (1993). "Engineering a sort function". Software: Practice and Experience. 23 (11): 1249–1265. doi:10.1002/spe.4380231105. S2CID 8822797.
  25. ^ Rousseeuw, Peter J.; Bassett, Gilbert W. Jr. (1990). "The remedian: a robust averaging method for large data sets" (PDF). J. Amer. Statist. Assoc. 85 (409): 97–104. doi:10.1080/01621459.1990.10475311.
  26. ^ a b Stigler, Stephen (December 1973). "Studies in the History of Probability and Statistics. XXXII: Laplace, Fisher and the Discovery of the Concept of Sufficiency". Biometrika. 60 (3): 439–445. doi:10.1093/biomet/60.3.439. JSTOR 2334992. MR 0326872.
  27. ^ Rider, Paul R. (1960). "Variance of the median of small samples from several special populations". J. Amer. Statist. Assoc. 55 (289): 148–150. doi:10.1080/01621459.1960.10482056.
  28. ^ Efron, B. (1982). The Jackknife, the Bootstrap and other Resampling Plans. Philadelphia: SIAM. ISBN 0898711797.
  29. ^ Shao, J.; Wu, C. F. (1989). "A General Theory for Jackknife Variance Estimation". Ann. Stat. 17 (3): 1176–1197. doi:10.1214/aos/1176347263. JSTOR 2241717.
  30. ^ Efron, B. (1979). "Bootstrap Methods: Another Look at the Jackknife". Ann. Stat. 7 (1): 1–26. doi:10.1214/aos/1176344552. JSTOR 2958830.
  31. ^ Hall, P.; Martin, M. A. (1988). "Exact Convergence Rate of Bootstrap Quantile Variance Estimator". Probab Theory Related Fields. 80 (2): 261–268. doi:10.1007/BF00356105. S2CID 119701556.
  32. ^ Jiménez-Gamero, M. D.; Munoz-García, J.; Pino-Mejías, R. (2004). "Reduced bootstrap for the median". Statistica Sinica. 14 (4): 1179–1198.
  33. ^ Maindonald, John; John Braun, W. (2010-05-06). Data Analysis and Graphics Using R: An Example-Based Approach. Cambridge University Press. ISBN 9781139486675.
  34. ^ a b c Hettmansperger, Thomas P.; McKean, Joseph W. (1998). Robust nonparametric statistical methods. Kendall's Library of Statistics. Vol. 5. London: Edward Arnold. ISBN 0-340-54937-8. MR 1604954.
  35. ^ Small, Christopher G. "A survey of multidimensional medians." International Statistical Review/Revue Internationale de Statistique (1990): 263–277. doi:10.2307/1403809 JSTOR 1403809
  36. ^ Niinimaa, A., and H. Oja. "Multivariate median." Encyclopedia of statistical sciences (1999).
  37. ^ Mosler, Karl. Multivariate Dispersion, Central Regions, and Depth: The Lift Zonoid Approach. Vol. 165. Springer Science & Business Media, 2012.
  38. ^ Puri, Madan L.; Sen, Pranab K.; Nonparametric Methods in Multivariate Analysis, John Wiley & Sons, New York, NY, 1971. (Reprinted by Krieger Publishing)
  39. ^ Vardi, Yehuda; Zhang, Cun-Hui (2000). "The multivariate L1-median and associated data depth". Proceedings of the National Academy of Sciences of the United States of America. 97 (4): 1423–1426 (electronic). Bibcode:2000PNAS...97.1423V. doi:10.1073/pnas.97.4.1423. MR 1740461. PMC 26449. PMID 10677477.
  40. ^ Davis, Otto A.; DeGroot, Morris H.; Hinich, Melvin J. (January 1972). "Social Preference Orderings and Majority Rule" (PDF). Econometrica. 40 (1): 147–157. doi:10.2307/1909727. JSTOR 1909727. The authors, working in a topic in which uniqueness is assumed, actually use the expression "unique median in all directions".
  41. ^ Pratt, William K.; Cooper, Ted J.; Kabir, Ihtisham (1985-07-11). Corbett, Francis J (ed.). "Pseudomedian Filter". Architectures and Algorithms for Digital Image Processing II. 0534: 34. Bibcode:1985SPIE..534...34P. doi:10.1117/12.946562. S2CID 173183609.
  42. ^ Oja, Hannu (2010). Multivariate nonparametric methods with R: An approach based on spatial signs and ranks. Lecture Notes in Statistics. Vol. 199. New York, NY: Springer. pp. xiv+232. doi:10.1007/978-1-4419-0468-3. ISBN 978-1-4419-0467-6. MR 2598854.
  43. ^ Wilcox, Rand R. (2001), "Theil–Sen estimator", Fundamentals of Modern Statistical Methods: Substantially Improving Power and Accuracy, Springer-Verlag, pp. 207–210, ISBN 978-0-387-95157-7.
  44. ^ Wald, A. (1940). "The Fitting of Straight Lines if Both Variables are Subject to Error" (PDF). Annals of Mathematical Statistics. 11 (3): 282–300. doi:10.1214/aoms/1177731868. JSTOR 2235677.
  45. ^ Nair, K. R.; Shrivastava, M. P. (1942). "On a Simple Method of Curve Fitting". Sankhyā: The Indian Journal of Statistics. 6 (2): 121–132. JSTOR 25047749.
  46. ^ Brown, G. W.; Mood, A. M. (1951). "On Median Tests for Linear Hypotheses". Proc Second Berkeley Symposium on Mathematical Statistics and Probability. Berkeley, CA: University of California Press. pp. 159–166. Zbl 0045.08606.
  47. ^ Tukey, J. W. (1977). Exploratory Data Analysis. Reading, MA: Addison-Wesley. ISBN 0201076160.
  48. ^ Brown, George W. (1947). "On Small-Sample Estimation". Annals of Mathematical Statistics. 18 (4): 582–585. doi:10.1214/aoms/1177730349. JSTOR 2236236.
  49. ^ Lehmann, Erich L. (1951). "A General Concept of Unbiasedness". Annals of Mathematical Statistics. 22 (4): 587–592. doi:10.1214/aoms/1177729549. JSTOR 2236928.
  50. ^ Birnbaum, Allan (1961). "A Unified Theory of Estimation, I". Annals of Mathematical Statistics. 32 (1): 112–135. doi:10.1214/aoms/1177705145. JSTOR 2237612.
  51. ^ van der Vaart, H. Robert (1961). "Some Extensions of the Idea of Bias". Annals of Mathematical Statistics. 32 (2): 436–447. doi:10.1214/aoms/1177705051. JSTOR 2237754. MR 0125674.
  52. ^ Pfanzagl, Johann; with the assistance of R. Hamböker (1994). Parametric Statistical Theory. Walter de Gruyter. ISBN 3-11-013863-8. MR 1291393.
  53. ^ Pfanzagl, Johann. "On optimal median unbiased estimators in the presence of nuisance parameters." The Annals of Statistics (1979): 187–193.
  54. ^ Brown, L. D.; Cohen, Arthur; Strawderman, W. E. (1976). "A Complete Class Theorem for Strict Monotone Likelihood Ratio With Applications". Ann. Statist. 4 (4): 712–722. doi:10.1214/aos/1176343543.
  55. ^ Page; Brown, L. D.; Cohen, Arthur; Strawderman, W. E. (1976). "A Complete Class Theorem for Strict Monotone Likelihood Ratio With Applications". Ann. Statist. 4 (4): 712–722. doi:10.1214/aos/1176343543.
  56. ^ a b c d e Bakker, Arthur; Gravemeijer, Koeno P. E. (2006-06-01). "An Historical Phenomenology of Mean and Median". Educational Studies in Mathematics. 62 (2): 149–168. doi:10.1007/s10649-006-7099-8. ISSN 1573-0816. S2CID 143708116.
  57. ^ Adler, Dan (31 December 2014). . Jewish American and Israeli Issues. Archived from the original on 6 December 2015. Retrieved 22 February 2020.
  58. ^ Modern Economic Theory in the Talmud by Yisrael Aumann
  59. ^ a b c d Eisenhart, Churchill (24 August 1971). The Development of the Concept of the Best Mean of a Set of Measurements from Antiquity to the Present Day (PDF) (Speech). 131st Annual Meeting of the American Statistical Association. Colorado State University.
  60. ^ a b c "How the Average Triumphed Over the Median". Priceonomics. 5 April 2016. Retrieved 2020-02-23.
  61. ^ Sangster, Alan (March 2021). "The Life and Works of Luca Pacioli (1446/7–1517), Humanist Educator". Abacus. 57 (1): 126–152. doi:10.1111/abac.12218. hdl:2164/16100. ISSN 0001-3072. S2CID 233917744.
  62. ^ Wright, Edward; Parsons, E. J. S.; Morris, W. F. (1939). "Edward Wright and His Work". Imago Mundi. 3: 61–71. doi:10.1080/03085693908591862. ISSN 0308-5694. JSTOR 1149920.
  63. ^ Stigler, S. M. (1986). The History of Statistics: The Measurement of Uncertainty Before 1900. Harvard University Press. ISBN 0674403401.
  64. ^ Laplace PS de (1818) Deuxième supplément à la Théorie Analytique des Probabilités, Paris, Courcier
  65. ^ Jaynes, E.T. (2007). Probability theory : the logic of science (5. print. ed.). Cambridge [u.a.]: Cambridge Univ. Press. p. 172. ISBN 978-0-521-59271-0.
  66. ^ Howarth, Richard (2017). Dictionary of Mathematical Geosciences: With Historical Notes. Springer. p. 374.
  67. ^ a b Keynes, J.M. (1921) A Treatise on Probability. Pt II Ch XVII §5 (p 201) (2006 reprint, Cosimo Classics, ISBN 9781596055308 : multiple other reprints)
  68. ^ Stigler, Stephen M. (2002). Statistics on the Table: The History of Statistical Concepts and Methods. Harvard University Press. pp. 105–7. ISBN 978-0-674-00979-0.
  69. ^ Galton F (1881) "Report of the Anthropometric Committee" pp 245–260. Report of the 51st Meeting of the British Association for the Advancement of Science
  70. ^ David, H. A. (1995). "First (?) Occurrence of Common Terms in Mathematical Statistics". The American Statistician. 49 (2): 121–133. doi:10.2307/2684625. ISSN 0003-1305. JSTOR 2684625.
  71. ^ encyclopediaofmath.org
  72. ^ personal.psu.edu

External links

  • "Median (in statistics)", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
  • Median as a weighted arithmetic mean of all Sample Observations
  • On-line calculator
  • Calculating the median
  • A problem involving the mean, the median, and the mode.
  • Weisstein, Eric W. "Statistical Median". MathWorld.
  • Python script for Median computations and income inequality metrics
  • Fast Computation of the Median by Successive Binning
  • 'Mean, median, mode and skewness', A tutorial devised for first-year psychology students at Oxford University, based on a worked example.
  • The Complex SAT Math Problem Even the College Board Got Wrong: Andrew Daniels in Popular Mechanics

This article incorporates material from Median of a distribution on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.

median, this, article, about, statistical, concept, other, uses, disambiguation, statistics, probability, theory, median, value, separating, higher, half, from, lower, half, data, sample, population, probability, distribution, data, thought, middle, value, bas. This article is about the statistical concept For other uses see Median disambiguation In statistics and probability theory the median is the value separating the higher half from the lower half of a data sample a population or a probability distribution For a data set it may be thought of as the middle value The basic feature of the median in describing data compared to the mean often simply described as the average is that it is not skewed by a small proportion of extremely large or small values and therefore provides a better representation of the center Median income for example may be a better way to describe center of the income distribution because increases in the largest incomes alone have no effect on median For this reason the median is of central importance in robust statistics Finding the median in sets of data with an odd and even number of values Contents 1 Finite data set of numbers 2 Formal definition and notation 3 Uses 4 Probability distributions 4 1 Medians of particular distributions 5 Properties 5 1 Optimality property 5 2 Inequality relating means and medians 5 2 1 Unimodal distributions 6 Jensen s inequality for medians 7 Medians for samples 7 1 Efficient computation of the sample median 7 2 Sampling distribution 7 2 1 Derivation of the asymptotic distribution 7 2 1 1 Empirical local density 7 3 Estimation of variance from sample data 7 4 Efficiency 7 5 Other estimators 8 Multivariate median 8 1 Marginal median 8 2 Geometric median 8 3 Median in all directions 8 4 Centerpoint 9 Other median related concepts 9 1 Interpolated median 9 2 Pseudo median 9 3 Variants of regression 9 4 Median filter 9 5 Cluster analysis 9 6 Median median line 10 Median unbiased estimators 11 History 12 See also 13 Notes 14 References 15 External linksFinite data set of numbers editThe median of a finite list of numbers is the middle number when those numbers are listed in order from smallest to greatest If the data set has an odd number of observations the middle one is selected For example the following list of seven numbers 1 3 3 6 7 8 9has the median of 6 which is the fourth value If the data set has an even number of observations there is no distinct middle value and the median is usually defined to be the arithmetic mean of the two middle values 1 2 For example this data set of 8 numbers 1 2 3 4 5 6 8 9has a median value of 4 5 that is 4 5 2 displaystyle 4 5 2 nbsp In more technical terms this interprets the median as the fully trimmed mid range In general with this convention the median can be defined as follows For a data set x displaystyle x nbsp of n displaystyle n nbsp elements ordered from smallest to greatest if n displaystyle n nbsp is odd med x x n 1 2 displaystyle operatorname med x x n 1 2 nbsp if n displaystyle n nbsp is even med x x n 2 x n 2 1 2 displaystyle operatorname med x frac x n 2 x n 2 1 2 nbsp Comparison of common averages of values 1 2 2 3 4 7 9 Type Description Example ResultMidrange Midway point between the minimum and the maximum of a data set 1 2 2 3 4 7 9 5Arithmetic mean Sum of values of a data set divided by number of values x 1 n i 1 n x i textstyle bar x frac 1 n sum i 1 n x i nbsp 1 2 2 3 4 7 9 7 4Median Middle value separating the greater and lesser halves of a data set 1 2 2 3 4 7 9 3Mode 
Most frequent value in a data set 1 2 2 3 4 7 9 2Formal definition and notation editFormally a median of a population is any value such that at least half of the population is less than or equal to the proposed median and at least half is greater than or equal to the proposed median As seen above medians may not be unique If each set contains more than half the population then some of the population is exactly equal to the unique median The median is well defined for any ordered one dimensional data and is independent of any distance metric The median can thus be applied to school classes which are ranked but not numerical e g working out a median grade when student test scores are graded from F to A although the result might be halfway between classes if there is an even number of classes For odd number classes one specific class is determined as the median A geometric median on the other hand is defined in any number of dimensions A related concept in which the outcome is forced to correspond to a member of the sample is the medoid There is no widely accepted standard notation for the median but some authors represent the median of a variable x as med x x 3 as m1 2 1 or as M 3 4 In any of these cases the use of these or other symbols for the median needs to be explicitly defined when they are introduced The median is a special case of other ways of summarizing the typical values associated with a statistical distribution it is the 2nd quartile 5th decile and 50th percentile Uses editThe median can be used as a measure of location when one attaches reduced importance to extreme values typically because a distribution is skewed extreme values are not known or outliers are untrustworthy i e may be measurement transcription errors For example consider the multiset 1 2 2 2 3 14 The median is 2 in this case as is the mode and it might be seen as a better indication of the center than the arithmetic mean of 4 which is larger than all but one of the values However the widely cited empirical relationship that the mean is shifted further into the tail of a distribution than the median is not generally true At most one can say that the two statistics cannot be too far apart see Inequality relating means and medians below 5 As a median is based on the middle data in a set it is not necessary to know the value of extreme results in order to calculate it For example in a psychology test investigating the time needed to solve a problem if a small number of people failed to solve the problem at all in the given time a median can still be calculated 6 Because the median is simple to understand and easy to calculate while also a robust approximation to the mean the median is a popular summary statistic in descriptive statistics In this context there are several choices for a measure of variability the range the interquartile range the mean absolute deviation and the median absolute deviation For practical purposes different measures of location and dispersion are often compared on the basis of how well the corresponding population values can be estimated from a sample of data The median estimated using the sample median has good properties in this regard While it is not usually optimal if a given population distribution is assumed its properties are always reasonably good For example a comparison of the efficiency of candidate estimators shows that the sample mean is more statistically efficient when and only when data is uncontaminated by data from heavy tailed distributions or from mixtures of 
distributions citation needed Even then the median has a 64 efficiency compared to the minimum variance mean for large normal samples which is to say the variance of the median will be 50 greater than the variance of the mean 7 8 Probability distributions edit nbsp Geometric visualization of the mode median and mean of an arbitrary probability density function 9 For any real valued probability distribution with cumulative distribution function F a median is defined as any real number m that satisfies the inequalities m d F x 1 2 and m d F x 1 2 displaystyle int infty m dF x geq frac 1 2 text and int m infty dF x geq frac 1 2 nbsp An equivalent phrasing uses a random variable X distributed according to F P X m 1 2 and P X m 1 2 displaystyle operatorname P X leq m geq frac 1 2 text and operatorname P X geq m geq frac 1 2 nbsp Note that this definition does not require X to have an absolutely continuous distribution which has a probability density function f nor does it require a discrete one In the former case the inequalities can be upgraded to equality a median satisfiesP X m m f x d x 1 2 m f x d x P X m displaystyle operatorname P X leq m int infty m f x dx frac 1 2 int m infty f x dx operatorname P X geq m nbsp Any probability distribution on the real number set R displaystyle mathbb R nbsp has at least one median but in pathological cases there may be more than one median if F is constant 1 2 on an interval so that f 0 there then any value of that interval is a median Medians of particular distributions edit The medians of certain types of distributions can be easily calculated from their parameters furthermore they exist even for some distributions lacking a well defined mean such as the Cauchy distribution The median of a symmetric unimodal distribution coincides with the mode The median of a symmetric distribution which possesses a mean m also takes the value m The median of a normal distribution with mean m and variance s2 is m In fact for a normal distribution mean median mode The median of a uniform distribution in the interval a b is a b 2 which is also the mean The median of a Cauchy distribution with location parameter x0 and scale parameter y is x0 the location parameter The median of a power law distribution x a with exponent a gt 1 is 21 a 1 xmin where xmin is the minimum value for which the power law holds 10 The median of an exponential distribution with rate parameter l is the natural logarithm of 2 divided by the rate parameter l 1ln 2 The median of a Weibull distribution with shape parameter k and scale parameter l is l ln 2 1 k Properties editOptimality property edit The mean absolute error of a real variable c with respect to the random variable X is E X c displaystyle E left X c right nbsp Provided that the probability distribution of X is such that the above expectation exists then m is a median of X if and only if m is a minimizer of the mean absolute error with respect to X 11 In particular if m is a sample median then it minimizes the arithmetic mean of the absolute deviations 12 Note however that in cases where the sample contains an even number of elements this minimizer is not unique More generally a median is defined as a minimum of E X c X displaystyle E X c X nbsp as discussed below in the section on multivariate medians specifically the spatial median This optimization based definition of the median is useful in statistical data analysis for example in k medians clustering Inequality relating means and medians edit nbsp Comparison of mean median and mode 
of two log normal distributions with different skewnessIf the distribution has finite variance then the distance between the median X displaystyle tilde X nbsp and the mean X displaystyle bar X nbsp is bounded by one standard deviation This bound was proved by Book and Sher in 1979 for discrete samples 13 and more generally by Page and Murty in 1982 14 In a comment on a subsequent proof by O Cinneide 15 Mallows in 1991 presented a compact proof that uses Jensen s inequality twice 16 as follows Using for the absolute value we have m m E X m E X m E X m E X m 2 s displaystyle begin aligned mu m operatorname E X m amp leq operatorname E X m amp leq operatorname E X mu amp leq sqrt operatorname E left X mu 2 right sigma end aligned nbsp The first and third inequalities come from Jensen s inequality applied to the absolute value function and the square function which are each convex The second inequality comes from the fact that a median minimizes the absolute deviation function a E X a displaystyle a mapsto operatorname E X a nbsp Mallows s proof can be generalized to obtain a multivariate version of the inequality 17 simply by replacing the absolute value with a norm m m E X m 2 trace var X displaystyle mu m leq sqrt operatorname E left X mu 2 right sqrt operatorname trace left operatorname var X right nbsp where m is a spatial median that is a minimizer of the function a E X a displaystyle a mapsto operatorname E X a nbsp The spatial median is unique when the data set s dimension is two or more 18 19 An alternative proof uses the one sided Chebyshev inequality it appears in an inequality on location and scale parameters This formula also follows directly from Cantelli s inequality 20 Unimodal distributions edit For the case of unimodal distributions one can achieve a sharper bound on the distance between the median and the mean X X 3 5 1 2 s 0 7746 s displaystyle left tilde X bar X right leq left frac 3 5 right frac 1 2 sigma approx 0 7746 sigma nbsp 21 A similar relation holds between the median and the mode X m o d e 3 1 2 s 1 732 s displaystyle left tilde X mathrm mode right leq 3 frac 1 2 sigma approx 1 732 sigma nbsp Jensen s inequality for medians editJensen s inequality states that for any random variable X with a finite expectation E X and for any convex function f f E x E f x displaystyle f E x leq E f x nbsp This inequality generalizes to the median as well We say a function f R R is a C function if for any t f 1 t x R f x t displaystyle f 1 left infty t right x in mathbb R mid f x leq t nbsp is a closed interval allowing the degenerate cases of a single point or an empty set Every convex function is a C function but the reverse does not hold If f is a C function then f med X med f X displaystyle f operatorname med X leq operatorname med f X nbsp If the medians are not unique the statement holds for the corresponding suprema 22 Medians for samples editThis section discusses the theory of estimating a population median from a sample To calculate the median of a sample by hand see Finite data set of numbers above Efficient computation of the sample median edit Even though comparison sorting n items requires W n log n operations selection algorithms can compute the k th smallest of n items with only 8 n operations This includes the median which is the n 2 th order statistic or for an even number of samples the arithmetic mean of the two middle order statistics 23 Selection algorithms still have the downside of requiring W n memory that is they need to have the full sample or a linear 
Because this memory requirement, as well as the linear running time, can be prohibitive, several estimation procedures for the median have been developed. A simple one is the median-of-three rule, which estimates the median as the median of a three-element subsample; this is commonly used as a subroutine in the quicksort sorting algorithm, which uses an estimate of its input's median. A more robust estimator is Tukey's ninther, which is the median-of-three rule applied with limited recursion:[24] if $A$ is the sample laid out as an array, and

med3($A$) = med($A$[1], $A$[$n/2$], $A$[$n$]),

then

ninther($A$) = med3(med3($A$[1 ... $\tfrac{1}{3}n$]), med3($A$[$\tfrac{1}{3}n$ ... $\tfrac{2}{3}n$]), med3($A$[$\tfrac{2}{3}n$ ... $n$])).

The remedian is an estimator for the median that requires linear time but sub-linear memory, operating in a single pass over the sample.[25]

Sampling distribution

The distributions of both the sample mean and the sample median were determined by Laplace.[26] The distribution of the sample median from a population with a density function $f(x)$ is asymptotically normal with mean $\mu$ and variance[27]

$$\frac{1}{4 n f(m)^2},$$

where $m$ is the median of $f(x)$ and $n$ is the sample size:

$$\text{Sample median} \sim \mathcal{N}\left(\mu = m,\ \sigma^2 = \frac{1}{4 n f(m)^2}\right).$$

A modern proof follows below. Laplace's result is now understood as a special case of the asymptotic distribution of arbitrary quantiles.

For normal samples, the density is $f(m) = 1/\sqrt{2\pi\sigma^2}$; thus for large samples the variance of the median equals $\frac{\pi}{2}\cdot\frac{\sigma^2}{n}$.[7] (See also section § Efficiency below.)
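Before the formal derivation below, a short simulation can illustrate this result. The sketch assumes NumPy is available; the sample size, replication count, and seed are arbitrary choices for this example. For standard normal data, the empirical variance of the sample median should be close to $\pi\sigma^2/(2n)$.

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps, sigma = 501, 10_000, 1.0          # odd sample size, number of replications

    # Draw many normal samples and record each sample's median.
    medians = np.median(rng.normal(0.0, sigma, size=(reps, n)), axis=1)

    empirical_var = medians.var()
    asymptotic_var = np.pi * sigma**2 / (2 * n)   # (pi/2) * sigma^2 / n, as quoted above

    print(f"empirical variance of the median : {empirical_var:.6f}")
    print(f"asymptotic approximation         : {asymptotic_var:.6f}")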
Derivation of the asymptotic distribution

We take the sample size to be an odd number $N = 2n + 1$ and assume our variable continuous; the formula for the case of discrete variables is given below in § Empirical local density. The sample can be summarized as "below median", "at median", and "above median", which corresponds to a trinomial distribution with probabilities $F(v)$, $f(v)$ and $1 - F(v)$. For a continuous variable, the probability of multiple sample values being exactly equal to the median is 0, so one can calculate the density at the point $v$ directly from the trinomial distribution:

$$\Pr[\operatorname{med} = v]\,dv = \frac{(2n+1)!}{n!\,n!} F(v)^n \left(1 - F(v)\right)^n f(v)\,dv.$$

Now we introduce the beta function. For integer arguments $\alpha$ and $\beta$, this can be expressed as $\mathrm{B}(\alpha, \beta) = \frac{(\alpha - 1)!\,(\beta - 1)!}{(\alpha + \beta - 1)!}$. Also recall that $f(v)\,dv = dF(v)$. Using these relationships, and setting both $\alpha$ and $\beta$ equal to $n + 1$, allows the last expression to be written as

$$\frac{F(v)^n \left(1 - F(v)\right)^n}{\mathrm{B}(n+1,\,n+1)}\,dF(v).$$

Hence the density function of the median is a symmetric beta distribution pushed forward by $F$. Its mean, as we would expect, is 0.5, and its variance is $\frac{1}{4(N+2)}$. By the chain rule, the corresponding variance of the sample median is

$$\frac{1}{4(N+2)\,f(m)^2}.$$

The additional 2 is negligible in the limit.

Empirical local density

In practice, the functions $f$ and $F$ above are often not known or assumed. However, they can be estimated from an observed frequency distribution. In this section, we give an example. Consider the following table, representing a sample of 3,800 (discrete-valued) observations:

v      0      0.5    1      1.5    2      2.5    3      3.5    4      4.5    5
f(v)   0.000  0.008  0.010  0.013  0.083  0.108  0.328  0.220  0.202  0.023  0.005
F(v)   0.000  0.008  0.018  0.031  0.114  0.222  0.550  0.770  0.972  0.995  1.000

Because the observations are discrete-valued, constructing the exact distribution of the median is not an immediate translation of the above expression for $\Pr[\operatorname{med} = v]$; one may (and typically does) have multiple instances of the median in one's sample. So we must sum over all these possibilities:

$$\Pr[\operatorname{med} = v] = \sum_{i=0}^{n} \sum_{k=0}^{n} \frac{N!}{i!\,(N - i - k)!\,k!}\,\bigl(F(v) - f(v)\bigr)^{i}\,\bigl(1 - F(v)\bigr)^{k}\, f(v)^{\,N - i - k}.$$

Here, $i$ is the number of points strictly less than the median and $k$ the number strictly greater, so that $F(v) - f(v)$ is the probability of a draw falling strictly below $v$.

Using these preliminaries, it is possible to investigate the effect of sample size on the standard errors of the mean and median. The observed mean is 3.16, the observed raw median is 3, and the observed interpolated median is 3.174. The following table gives some comparison statistics.

Statistic (by sample size)                              3       9       15      21
Expected value of median                                3.198   3.191   3.174   3.161
Standard error of median (above formula)                0.482   0.305   0.257   0.239
Standard error of median (asymptotic approximation)     0.879   0.508   0.393   0.332
Standard error of mean                                  0.421   0.243   0.188   0.159

The expected value of the median falls slightly as sample size increases, while, as would be expected, the standard errors of both the median and the mean are proportionate to the inverse square root of the sample size. The asymptotic approximation errs on the side of caution by overestimating the standard error.
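As a sketch of how the summation formula can be used, the following code tabulates the exact distribution of the raw median of an odd-sized sample drawn from the discrete distribution f(v) in the example table, then reports the expected value and standard error of the median. It should reproduce, up to rounding, the "above formula" rows of the comparison table; the function name and layout are choices made for this illustration.

    from math import factorial

    # f(v) from the example table above (support 0, 0.5, ..., 5).
    support = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
    f = [0.000, 0.008, 0.010, 0.013, 0.083, 0.108, 0.328, 0.220, 0.202, 0.023, 0.005]

    def median_pmf(support, f, N):
        """Exact pmf of the sample median for an odd sample size N = 2n + 1.

        The median equals v exactly when at most n points fall strictly below v
        and at most n points fall strictly above v (the trinomial counting argument).
        """
        n = (N - 1) // 2
        cdf = [sum(f[:j + 1]) for j in range(len(f))]
        pmf = []
        for j, v in enumerate(support):
            p_less, p_eq = cdf[j] - f[j], f[j]
            p_greater = 1.0 - cdf[j]
            total = 0.0
            for i in range(n + 1):           # number of points strictly below the median
                for k in range(n + 1):       # number of points strictly above the median
                    coeff = factorial(N) // (factorial(i) * factorial(k) * factorial(N - i - k))
                    total += coeff * p_less**i * p_greater**k * p_eq**(N - i - k)
            pmf.append(total)
        return pmf

    for N in (3, 9, 15, 21):
        pmf = median_pmf(support, f, N)
        mean = sum(p * v for p, v in zip(pmf, support))
        var = sum(p * (v - mean) ** 2 for p, v in zip(pmf, support))
        print(f"N={N:2d}  E[median]={mean:.3f}  SE[median]={var ** 0.5:.3f}")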
Estimation of variance from sample data

The value of $(2 f(\nu))^{-2}$, the asymptotic variance of $n^{1/2}(\hat{\nu} - \nu)$, where $\nu$ is the population median and $\hat{\nu}$ the sample median, has been studied by several authors. The standard "delete one" jackknife method produces inconsistent results.[28] An alternative, the "delete k" method, where $k$ grows with the sample size, has been shown to be asymptotically consistent.[29] This method may be computationally expensive for large data sets. A bootstrap estimate is known to be consistent,[30] but converges very slowly (on the order of $n^{-1/4}$).[31] Other methods have been proposed, but their behavior may differ between large and small samples.[32]

Efficiency

The efficiency of the sample median, measured as the ratio of the variance of the mean to the variance of the median, depends on the sample size and on the underlying population distribution. For a sample of size $N = 2n + 1$ from the normal distribution, the efficiency for large $N$ is

$$\frac{2}{\pi} \cdot \frac{N + 2}{N}.$$

The efficiency tends to $\frac{2}{\pi}$ as $N$ tends to infinity. In other words, the relative variance of the median will be $\pi/2 \approx 1.57$, or 57% greater than the variance of the mean; the relative standard error of the median will be $(\pi/2)^{1/2} \approx 1.25$, or 25% greater than the standard error of the mean, $\sigma/\sqrt{n}$ (see also section § Sampling distribution above).[33]

Other estimators

For univariate distributions that are symmetric about one median, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population median.[34]

If data are represented by a statistical model specifying a particular family of probability distributions, then estimates of the median can be obtained by fitting that family of probability distributions to the data and calculating the theoretical median of the fitted distribution. Pareto interpolation is an application of this when the population is assumed to have a Pareto distribution.

Multivariate median

Previously, this article discussed the univariate median, when the sample or population had one dimension. When the dimension is two or higher, there are multiple concepts that extend the definition of the univariate median; each such multivariate median agrees with the univariate median when the dimension is exactly one.[34][35][36][37]

Marginal median

The marginal median is defined for vectors defined with respect to a fixed set of coordinates. A marginal median is defined to be the vector whose components are univariate medians. The marginal median is easy to compute, and its properties were studied by Puri and Sen.[34][38]

Geometric median

The geometric median of a discrete set of sample points $x_1, \ldots, x_N$ in a Euclidean space is the[a] point minimizing the sum of distances to the sample points:

$$\hat{\mu} = \underset{\mu \in \mathbb{R}^m}{\operatorname{arg\,min}} \sum_{n=1}^{N} \left\| \mu - x_n \right\|_2.$$

In contrast to the marginal median, the geometric median is equivariant with respect to Euclidean similarity transformations such as translations and rotations.
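The geometric median generally has no closed form, so in practice it is computed iteratively. The sketch below uses Weiszfeld's algorithm, a standard approach that is not described in this article: each step re-weights the sample points by the inverse of their distance to the current estimate. The tolerance, iteration cap, and the crude guard against division by zero are simplifications for illustration.

    import numpy as np

    def geometric_median(points, tol=1e-9, max_iter=1000):
        """Approximate the point minimizing the sum of Euclidean distances to the samples."""
        x = np.asarray(points, dtype=float)
        mu = x.mean(axis=0)                        # start from the centroid
        for _ in range(max_iter):
            d = np.linalg.norm(x - mu, axis=1)
            d = np.where(d < 1e-12, 1e-12, d)      # avoid division by zero at a sample point
            w = 1.0 / d
            new_mu = (w[:, None] * x).sum(axis=0) / w.sum()
            if np.linalg.norm(new_mu - mu) < tol:
                return new_mu
            mu = new_mu
        return mu

    pts = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]]
    print(geometric_median(pts))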
Median in all directions

If the marginal medians for all coordinate systems coincide, then their common location may be termed the "median in all directions".[40] This concept is relevant to voting theory on account of the median voter theorem. When it exists, the median in all directions coincides with the geometric median (at least for discrete distributions).

Centerpoint

In statistics and computational geometry, the notion of centerpoint is a generalization of the median to data in higher-dimensional Euclidean space. Given a set of points in d-dimensional space, a centerpoint of the set is a point such that any hyperplane that goes through that point divides the set of points in two roughly equal subsets: the smaller part should have at least a 1/(d + 1) fraction of the points. Like the median, a centerpoint need not be one of the data points. Every non-empty set of points (with no duplicates) has at least one centerpoint.

Other median-related concepts

Interpolated median

When dealing with a discrete variable, it is sometimes useful to regard the observed values as being midpoints of underlying continuous intervals. An example of this is a Likert scale, on which opinions or preferences are expressed on a scale with a set number of possible responses. If the scale consists of the positive integers, an observation of 3 might be regarded as representing the interval from 2.50 to 3.50. It is possible to estimate the median of the underlying variable. If, say, 22% of the observations are of value 2 or below and 55.0% are of 3 or below (so 33% have the value 3), then the median $m$ is 3, since the median is the smallest value of $x$ for which $F(x)$ is greater than a half. But the interpolated median is somewhere between 2.50 and 3.50. First we add half of the interval width $w$ to the median to get the upper bound of the median interval. Then we subtract that proportion of the interval width which equals the proportion of the 33% which lies above the 50% mark. In other words, we split up the interval width pro rata to the numbers of observations. In this case, the 33% is split into 28% below the median and 5% above it, so we subtract 5/33 of the interval width from the upper bound of 3.50 to give an interpolated median of 3.35. More formally, if the values $f(x)$ are known, the interpolated median can be calculated from

$$m_{\text{int}} = m + w \left[\frac{1}{2} - \frac{F(m) - \frac{1}{2}}{f(m)}\right].$$

Alternatively, if in an observed sample there are $k$ scores above the median category, $j$ scores in it, and $i$ scores below it, then the interpolated median is given by

$$m_{\text{int}} = m + \frac{w}{2} \cdot \frac{k - i}{j}.$$
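As a concrete check of the first interpolated-median formula above, the small sketch below evaluates it for the worked example in the text (median category 3, F(m) = 0.55, f(m) = 0.33, interval width 1) and returns approximately 3.35. The function name and argument order are arbitrary choices for this example.

    def interpolated_median(m, F_m, f_m, w=1.0):
        """Interpolated median: m_int = m + w * (1/2 - (F(m) - 1/2) / f(m)).

        m   : raw median category
        F_m : cumulative proportion at or below m
        f_m : proportion exactly in category m
        w   : width of the underlying interval represented by each category
        """
        return m + w * (0.5 - (F_m - 0.5) / f_m)

    # Worked example from the text: 22% below category 3, 33% in it, so F(3) = 0.55, f(3) = 0.33.
    print(round(interpolated_median(3, 0.55, 0.33), 2))  # about 3.35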
Pseudo-median

Main article: Pseudomedian

For univariate distributions that are symmetric about one median, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population median; for non-symmetric distributions, the Hodges–Lehmann estimator is a robust and highly efficient estimator of the population pseudo-median, which is the median of a symmetrized distribution and which is close to the population median.[41] The Hodges–Lehmann estimator has been generalized to multivariate distributions.[42]

Variants of regression

The Theil–Sen estimator is a method for robust linear regression based on finding medians of slopes.[43]

Median filter

The median filter is an important tool of image processing that can effectively remove salt-and-pepper noise from grayscale images.

Cluster analysis

Main article: k-medians clustering

In cluster analysis, the k-medians clustering algorithm provides a way of defining clusters, in which the criterion of minimizing the distance from each data point to its cluster mean, used in k-means clustering, is replaced by minimizing the distance from each data point to its cluster median.

Median–median line

This is a method of robust regression. The idea dates back to Wald in 1940, who suggested dividing a set of bivariate data into two halves depending on the value of the independent parameter $x$: a left half with values less than the median and a right half with values greater than the median.[44] He suggested taking the means of the dependent ($y$) and independent ($x$) variables of the left and the right halves and estimating the slope of the line joining these two points. The line could then be adjusted to fit the majority of the points in the data set.

Nair and Shrivastava in 1942 suggested a similar idea but instead advocated dividing the sample into three equal parts before calculating the means of the subsamples.[45] Brown and Mood in 1951 proposed the idea of using the medians of two subsamples rather than the means.[46] Tukey combined these ideas and recommended dividing the sample into three equal-size subsamples and estimating the line based on the medians of the subsamples.[47]

Median-unbiased estimators

Main article: Bias of an estimator § Median-unbiased estimators

Any mean-unbiased estimator minimizes the risk (expected loss) with respect to the squared-error loss function, as observed by Gauss. A median-unbiased estimator minimizes the risk with respect to the absolute-deviation loss function, as observed by Laplace. Other loss functions are used in statistical theory, particularly in robust statistics.

The theory of median-unbiased estimators was revived by George W. Brown in 1947:[48]

"An estimate of a one-dimensional parameter θ will be said to be median-unbiased if, for fixed θ, the median of the distribution of the estimate is at the value θ; i.e., the estimate underestimates just as often as it overestimates. This requirement seems for most purposes to accomplish as much as the mean-unbiased requirement and has the additional property that it is invariant under one-to-one transformation." (p. 584)

Further properties of median-unbiased estimators have been reported.[49][50][51][52] Median-unbiased estimators are invariant under one-to-one transformations. There are methods of constructing median-unbiased estimators that are optimal in a sense analogous to the minimum-variance property for mean-unbiased estimators; such constructions exist for probability distributions having monotone likelihood functions.[53][54] One such procedure is an analogue of the Rao–Blackwell procedure for mean-unbiased estimators: it holds for a smaller class of probability distributions than does the Rao–Blackwell procedure, but for a larger class of loss functions.[55]

History

Scientific researchers in the ancient Near East appear not to have used summary statistics at all, instead choosing values that offered maximal consistency with a broader theory that integrated a wide variety of phenomena.[56] Within the Mediterranean (and, later, European) scholarly community, statistics like the mean are fundamentally a medieval and early modern development. The history of the median outside Europe and its predecessors remains relatively unstudied.

The idea of the median appeared in the 6th century in the Talmud, in order to fairly analyze divergent appraisals.[57][58] However, the concept did not spread to the broader scientific community. Instead, the closest ancestor of the modern median is the mid-range, invented by Al-Biruni.[59]: 31 [60] Transmission of his work to later scholars is unclear. He applied his technique to assaying currency metals, but, after he published his work, most assayers still adopted the most unfavorable value from their results, lest they appear to cheat.[59]: 35-8 [61]

However, increased navigation at sea during the Age of Discovery meant that ship's navigators increasingly had to attempt to determine latitude in unfavorable weather against hostile shores, leading to renewed interest in summary statistics. Whether rediscovered or independently invented, the mid-range is recommended to nautical navigators in Harriot's "Instructions for Raleigh's Voyage to Guiana, 1595".[59]: 45-8

The idea of the median may have first appeared in Edward Wright's 1599 book Certaine Errors in Navigation, in a section about compass navigation.[62] Wright was reluctant to discard measured values, and may have felt that the median, incorporating a greater proportion of the dataset than the mid-range, was more likely to be correct. However, Wright did not give examples of his technique's use, making it hard to verify that he described the modern notion of median.[56][60][b] The median (in the context of probability) certainly appeared in the correspondence of Christiaan Huygens, but as an example of a statistic that was inappropriate for actuarial practice.[56]
The earliest recommendation of the median dates to 1757, when Roger Joseph Boscovich developed a regression method based on the L1 norm and therefore implicitly on the median.[56][63] In 1774, Laplace made this idea explicit: he suggested that the median be used as the standard estimator of the value of a posterior PDF. The specific criterion was to minimize the expected magnitude of the error, $|\alpha - \alpha^{*}|$, where $\alpha^{*}$ is the estimate and $\alpha$ is the true value. To this end, Laplace determined the distributions of both the sample mean and the sample median in the early 1800s.[26][64] However, a decade later, Gauss and Legendre developed the least squares method, which minimizes $(\alpha - \alpha^{*})^{2}$ to obtain the mean. Within the context of regression, Gauss and Legendre's innovation offers vastly easier computation. Consequently, Laplace's proposal was generally rejected until the rise of computing devices 150 years later (and it is still a relatively uncommon algorithm).[65]

Antoine Augustin Cournot in 1843 was the first[66] to use the term median (valeur médiane) for the value that divides a probability distribution into two equal halves. Gustav Theodor Fechner used the median (Centralwerth) in sociological and psychological phenomena;[67] it had earlier been used only in astronomy and related fields. Gustav Fechner popularized the median into the formal analysis of data, although it had been used previously by Laplace,[67] and the median appeared in a textbook by F. Y. Edgeworth.[68] Francis Galton used the English term median in 1881,[69][70] having earlier used the terms middle-most value in 1869 and the medium in 1880.[71][72]

Statisticians encouraged the use of medians intensely throughout the 19th century for its intuitive clarity and ease of manual computation. However, the notion of median does not lend itself to the theory of higher moments as well as the arithmetic mean does, and it is much harder to compute by computer. As a result, the median was steadily supplanted as a notion of generic average by the arithmetic mean during the 20th century.[56][60]

See also

Absolute deviation: difference between a variable's observed value and a reference value
Bias of an estimator: difference between an estimator's expected value and the true value of a parameter
Central tendency: statistical value representing the center or average of a distribution
Concentration of measure: statistical parameter for Lipschitz functions
Median graph: graph with a median for each three vertices
Median of medians: algorithm to calculate an approximate median in linear time
Median search: method for finding the kth smallest value
Median slope: statistical method for fitting a line
Median voter theory: theory in political science
Medoid: representative object of a data set, or of a cluster within a data set, whose sum of dissimilarities to all the objects in the cluster is minimal

Notes

a. The geometric median is unique unless the sample is collinear.[39]
b. Subsequent scholars appear to concur with Eisenhart that Boroughs' 1580 figures, while suggestive of the median, in fact describe an arithmetic mean.[59]: 62-3  Boroughs is mentioned in no other work.
References

1. Weisstein, Eric W. "Statistical Median". MathWorld.
2. Simon, Laura J. "Descriptive statistics". Statistical Education Resource Kit, Pennsylvania State Department of Statistics. Archived 2010-07-30 at the Wayback Machine.
3. Bissell, Derek (1994). Statistical Methods for SPC and TQM. CRC Press. p. 26. ISBN 978-0-412-39440-9. Retrieved 25 February 2013.
4. Sheskin, David J. (27 August 2003). Handbook of Parametric and Nonparametric Statistical Procedures (Third ed.). CRC Press. p. 7. ISBN 978-1-4200-3626-8. Retrieved 25 February 2013.
5. von Hippel, Paul T. (2005). "Mean, Median, and Skew: Correcting a Textbook Rule". Journal of Statistics Education. 13 (2). Archived from the original on 2008-10-14. Retrieved 2015-06-18.
6. Robson, Colin (1994). Experiment, Design and Statistics in Psychology. Penguin. pp. 42-45. ISBN 0-14-017648-9.
7. Williams, D. (2001). Weighing the Odds. Cambridge University Press. p. 165. ISBN 052100618X.
8. Maindonald, John; Braun, W. John (2010-05-06). Data Analysis and Graphics Using R: An Example-Based Approach. Cambridge University Press. p. 104. ISBN 978-1-139-48667-5.
9. "AP Statistics Review: Density Curves and the Normal Distributions". Archived from the original on 8 April 2015. Retrieved 16 March 2015.
10. Newman, M. E. J. (2005). "Power laws, Pareto distributions and Zipf's law". Contemporary Physics. 46 (5): 323-351. arXiv:cond-mat/0412004. Bibcode:2005ConPh..46..323N. doi:10.1080/00107510500052444. S2CID 2871747.
11. Stroock, Daniel (2011). Probability Theory. Cambridge University Press. p. 43. ISBN 978-0-521-13250-3.
12. DeGroot, Morris H. (1970). Optimal Statistical Decisions. New York-London-Sydney: McGraw-Hill Book Co. p. 232. ISBN 9780471680291. MR 0356303.
13. Book, Stephen A.; Sher, Lawrence (1979). "How close are the mean and the median?". The Two-Year College Mathematics Journal. 10 (3): 202-204. doi:10.2307/3026748. JSTOR 3026748. Retrieved 12 March 2022.
14. Page, Warren; Murty, Vedula N. (1982). "Nearness Relations Among Measures of Central Tendency and Dispersion: Part 1". The Two-Year College Mathematics Journal. 13 (5): 315-327. doi:10.1080/00494925.1982.11972639 (inactive 1 August 2023). Retrieved 12 March 2022.
15. O'Cinneide, Colm Art (1990). "The mean is within one standard deviation of any median". The American Statistician. 44 (4): 292-293. doi:10.1080/00031305.1990.10475743. Retrieved 12 March 2022.
16. Mallows, Colin (August 1991). "Another comment on O'Cinneide". The American Statistician. 45 (3): 257. doi:10.1080/00031305.1991.10475815.
17. Piche, Robert (2012). Random Vectors and Random Sequences. Lambert Academic Publishing. ISBN 978-3659211966.
18. Kemperman, Johannes H. B. (1987). Dodge, Yadolah (ed.). "The median of a finite measure on a Banach space". Statistical Data Analysis Based on the L1-Norm and Related Methods: Papers from the First International Conference held at Neuchatel, August 31-September 4, 1987. Amsterdam: North-Holland Publishing Co. pp. 217-230. MR 0949228.
19. Milasevic, Philip; Ducharme, Gilles R. (1987). "Uniqueness of the spatial median". Annals of Statistics. 15 (3): 1332-1333. doi:10.1214/aos/1176350511. MR 0902264.
20. Van Steen, K. Notes on probability and statistics.
21. Basu, S.; Dasgupta, A. (1997). "The Mean, Median, and Mode of Unimodal Distributions: A Characterization". Theory of Probability and Its Applications. 41 (2): 210-223. doi:10.1137/S0040585X97975447. S2CID 54593178.
22. Merkle, M. (2005). "Jensen's inequality for medians". Statistics & Probability Letters. 71 (3): 277-281. doi:10.1016/j.spl.2004.11.010.
23. Aho, Alfred V.; Hopcroft, John E.; Ullman, Jeffrey D. (1974). The Design and Analysis of Computer Algorithms. Reading, MA: Addison-Wesley. ISBN 0-201-00029-6. Here: Section 3.6 "Order Statistics", pp. 97-99, in particular Algorithm 3.6 and Theorem 3.9.
24. Bentley, Jon L.; McIlroy, M. Douglas (1993). "Engineering a sort function". Software: Practice and Experience. 23 (11): 1249-1265. doi:10.1002/spe.4380231105. S2CID 8822797.
25. Rousseeuw, Peter J.; Bassett, Gilbert W. Jr. (1990). "The remedian: a robust averaging method for large data sets" (PDF). Journal of the American Statistical Association. 85 (409): 97-104. doi:10.1080/01621459.1990.10475311.
26. Stigler, Stephen (December 1973). "Studies in the History of Probability and Statistics. XXXII: Laplace, Fisher and the Discovery of the Concept of Sufficiency". Biometrika. 60 (3): 439-445. doi:10.1093/biomet/60.3.439. JSTOR 2334992. MR 0326872.
27. Rider, Paul R. (1960). "Variance of the median of small samples from several special populations". Journal of the American Statistical Association. 55 (289): 148-150. doi:10.1080/01621459.1960.10482056.
28. Efron, B. (1982). The Jackknife, the Bootstrap and other Resampling Plans. Philadelphia: SIAM. ISBN 0898711797.
29. Shao, J.; Wu, C. F. (1989). "A General Theory for Jackknife Variance Estimation". Annals of Statistics. 17 (3): 1176-1197. doi:10.1214/aos/1176347263. JSTOR 2241717.
30. Efron, B. (1979). "Bootstrap Methods: Another Look at the Jackknife". Annals of Statistics. 7 (1): 1-26. doi:10.1214/aos/1176344552. JSTOR 2958830.
31. Hall, P.; Martin, M. A. (1988). "Exact Convergence Rate of Bootstrap Quantile Variance Estimator". Probability Theory and Related Fields. 80 (2): 261-268. doi:10.1007/BF00356105. S2CID 119701556.
32. Jimenez-Gamero, M. D.; Munoz-Garcia, J.; Pino-Mejias, R. (2004). "Reduced bootstrap for the median". Statistica Sinica. 14 (4): 1179-1198.
33. Maindonald, John; Braun, W. John (2010-05-06). Data Analysis and Graphics Using R: An Example-Based Approach. Cambridge University Press. ISBN 9781139486675.
34. Hettmansperger, Thomas P.; McKean, Joseph W. (1998). Robust Nonparametric Statistical Methods. Kendall's Library of Statistics. Vol. 5. London: Edward Arnold. ISBN 0-340-54937-8. MR 1604954.
35. Small, Christopher G. (1990). "A survey of multidimensional medians". International Statistical Review / Revue Internationale de Statistique: 263-277. doi:10.2307/1403809. JSTOR 1403809.
36. Niinimaa, A.; Oja, H. (1999). "Multivariate median". Encyclopedia of Statistical Sciences.
37. Mosler, Karl (2012). Multivariate Dispersion, Central Regions, and Depth: The Lift Zonoid Approach. Vol. 165. Springer Science & Business Media.
38. Puri, Madan L.; Sen, Pranab K. (1971). Nonparametric Methods in Multivariate Analysis. New York, NY: John Wiley & Sons. Reprinted by Krieger Publishing.
39. Vardi, Yehuda; Zhang, Cun-Hui (2000). "The multivariate L1-median and associated data depth". Proceedings of the National Academy of Sciences of the United States of America. 97 (4): 1423-1426 (electronic). Bibcode:2000PNAS...97.1423V. doi:10.1073/pnas.97.4.1423. MR 1740461. PMC 26449. PMID 10677477.
40. Davis, Otto A.; DeGroot, Morris H.; Hinich, Melvin J. (January 1972). "Social Preference Orderings and Majority Rule" (PDF). Econometrica. 40 (1): 147-157. doi:10.2307/1909727. JSTOR 1909727. The authors, working in a topic in which uniqueness is assumed, actually use the expression "unique median in all directions".
41. Pratt, William K.; Cooper, Ted J.; Kabir, Ihtisham (1985-07-11). Corbett, Francis J. (ed.). "Pseudomedian Filter". Architectures and Algorithms for Digital Image Processing II. 0534: 34. Bibcode:1985SPIE..534...34P. doi:10.1117/12.946562. S2CID 173183609.
42. Oja, Hannu (2010). Multivariate Nonparametric Methods with R: An Approach Based on Spatial Signs and Ranks. Lecture Notes in Statistics. Vol. 199. New York, NY: Springer. pp. xiv+232. doi:10.1007/978-1-4419-0468-3. ISBN 978-1-4419-0467-6. MR 2598854.
43. Wilcox, Rand R. (2001). "Theil-Sen estimator". Fundamentals of Modern Statistical Methods: Substantially Improving Power and Accuracy. Springer-Verlag. pp. 207-210. ISBN 978-0-387-95157-7.
44. Wald, A. (1940). "The Fitting of Straight Lines if Both Variables are Subject to Error" (PDF). Annals of Mathematical Statistics. 11 (3): 282-300. doi:10.1214/aoms/1177731868. JSTOR 2235677.
45. Nair, K. R.; Shrivastava, M. P. (1942). "On a Simple Method of Curve Fitting". Sankhya: The Indian Journal of Statistics. 6 (2): 121-132. JSTOR 25047749.
46. Brown, G. W.; Mood, A. M. (1951). "On Median Tests for Linear Hypotheses". Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability. Berkeley, CA: University of California Press. pp. 159-166. Zbl 0045.08606.
47. Tukey, J. W. (1977). Exploratory Data Analysis. Reading, MA: Addison-Wesley. ISBN 0201076160.
48. Brown, George W. (1947). "On Small-Sample Estimation". Annals of Mathematical Statistics. 18 (4): 582-585. doi:10.1214/aoms/1177730349. JSTOR 2236236.
49. Lehmann, Erich L. (1951). "A General Concept of Unbiasedness". Annals of Mathematical Statistics. 22 (4): 587-592. doi:10.1214/aoms/1177729549. JSTOR 2236928.
50. Birnbaum, Allan (1961). "A Unified Theory of Estimation, I". Annals of Mathematical Statistics. 32 (1): 112-135. doi:10.1214/aoms/1177705145. JSTOR 2237612.
51. van der Vaart, H. Robert (1961). "Some Extensions of the Idea of Bias". Annals of Mathematical Statistics. 32 (2): 436-447. doi:10.1214/aoms/1177705051. JSTOR 2237754. MR 0125674.
52. Pfanzagl, Johann, with the assistance of R. Hamboker (1994). Parametric Statistical Theory. Walter de Gruyter. ISBN 3-11-013863-8. MR 1291393.
53. Pfanzagl, Johann (1979). "On optimal median unbiased estimators in the presence of nuisance parameters". The Annals of Statistics: 187-193.
54. Brown, L. D.; Cohen, Arthur; Strawderman, W. E. (1976). "A Complete Class Theorem for Strict Monotone Likelihood Ratio With Applications". Annals of Statistics. 4 (4): 712-722. doi:10.1214/aos/1176343543.
55. Brown, L. D.; Cohen, Arthur; Strawderman, W. E. (1976). "A Complete Class Theorem for Strict Monotone Likelihood Ratio With Applications". Annals of Statistics. 4 (4): 712-722. doi:10.1214/aos/1176343543.
56. Bakker, Arthur; Gravemeijer, Koeno P. E. (2006-06-01). "An Historical Phenomenology of Mean and Median". Educational Studies in Mathematics. 62 (2): 149-168. doi:10.1007/s10649-006-7099-8. ISSN 1573-0816. S2CID 143708116.
57. Adler, Dan (31 December 2014). "Talmud and Modern Economics". Jewish American and Israeli Issues. Archived from the original on 6 December 2015. Retrieved 22 February 2020.
58. "Modern Economic Theory in the Talmud", by Yisrael Aumann.
59. Eisenhart, Churchill (24 August 1971). The Development of the Concept of the Best Mean of a Set of Measurements from Antiquity to the Present Day (PDF) (Speech). 131st Annual Meeting of the American Statistical Association. Colorado State University.
60. "How the Average Triumphed Over the Median". Priceonomics. 5 April 2016. Retrieved 2020-02-23.
61. Sangster, Alan (March 2021). "The Life and Works of Luca Pacioli (1446/7-1517), Humanist Educator". Abacus. 57 (1): 126-152. doi:10.1111/abac.12218. hdl:2164/16100. ISSN 0001-3072. S2CID 233917744.
62. Wright, Edward; Parsons, E. J. S.; Morris, W. F. (1939). "Edward Wright and His Work". Imago Mundi. 3: 61-71. doi:10.1080/03085693908591862. ISSN 0308-5694. JSTOR 1149920.
63. Stigler, S. M. (1986). The History of Statistics: The Measurement of Uncertainty Before 1900. Harvard University Press. ISBN 0674403401.
64. Laplace, P. S. de (1818). Deuxieme supplement a la Theorie Analytique des Probabilites. Paris: Courcier.
65. Jaynes, E. T. (2007). Probability Theory: The Logic of Science (5th printing ed.). Cambridge: Cambridge University Press. p. 172. ISBN 978-0-521-59271-0.
66. Howarth, Richard (2017). Dictionary of Mathematical Geosciences: With Historical Notes. Springer. p. 374.
67. Keynes, J. M. (1921). A Treatise on Probability. Pt II, Ch. XVII, §5, p. 201. 2006 reprint: Cosimo Classics, ISBN 9781596055308; multiple other reprints.
68. Stigler, Stephen M. (2002). Statistics on the Table: The History of Statistical Concepts and Methods. Harvard University Press. pp. 105-7. ISBN 978-0-674-00979-0.
69. Galton, F. (1881). "Report of the Anthropometric Committee". Report of the 51st Meeting of the British Association for the Advancement of Science. pp. 245-260.
70. David, H. A. (1995). "First Occurrence of Common Terms in Mathematical Statistics". The American Statistician. 49 (2): 121-133. doi:10.2307/2684625. ISSN 0003-1305. JSTOR 2684625.
71. encyclopediaofmath.org
72. personal.psu.edu

External links

"Median (in statistics)". Encyclopedia of Mathematics. EMS Press. 2001 [1994].
Median as a weighted arithmetic mean of all sample observations.
On-line calculator.
Calculating the median.
A problem involving the mean, the median, and the mode.
Weisstein, Eric W. "Statistical Median". MathWorld.
Python script for median computations and income inequality metrics.
Fast computation of the median by successive binning.
"Mean, median, mode and skewness": a tutorial devised for first-year psychology students at Oxford University, based on a worked example.
"The Complex SAT Math Problem Even the College Board Got Wrong": Andrew Daniels in Popular Mechanics.

This article incorporates material from "Median of a distribution" on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.
