
Distribution of the product of two random variables

A product distribution is a probability distribution constructed as the distribution of the product of random variables having two other known distributions. Given two statistically independent random variables X and Y, the distribution of the random variable Z that is formed as the product is a product distribution.

Algebra of random variables

The product is one type of algebra for random variables: Related to the product distribution are the ratio distribution, sum distribution (see List of convolutions of probability distributions) and difference distribution. More generally, one may talk of combinations of sums, differences, products and ratios.

Many of these distributions are described in Melvin D. Springer's book from 1979 The Algebra of Random Variables.[1]

Derivation for independent random variables

If $X$ and $Y$ are two independent, continuous random variables, described by probability density functions $f_X$ and $f_Y$, then the probability density function of $Z = XY$ is[2]

$$f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z/x)\, \frac{1}{|x|}\, dx$$

Proof

We first write the cumulative distribution function of $Z$, starting with its definition:

$$\begin{aligned} F_Z(z) &\,\overset{\text{def}}{=}\, \mathbb{P}(Z \le z) = \mathbb{P}(XY \le z) \\ &= \mathbb{P}(XY \le z, X \ge 0) + \mathbb{P}(XY \le z, X \le 0) \\ &= \mathbb{P}(Y \le z/X, X \ge 0) + \mathbb{P}(Y \ge z/X, X \le 0) \\ &= \int_0^\infty f_X(x) \int_{-\infty}^{z/x} f_Y(y)\, dy\, dx + \int_{-\infty}^0 f_X(x) \int_{z/x}^{\infty} f_Y(y)\, dy\, dx \end{aligned}$$

We find the desired probability density function by taking the derivative of both sides with respect to $z$. Since on the right-hand side, $z$ appears only in the integration limits, the derivative is easily performed using the fundamental theorem of calculus and the chain rule. (Note the negative sign that is needed when the variable occurs in the lower limit of the integration.)

$$\begin{aligned} f_Z(z) &= \int_0^\infty f_X(x)\, f_Y(z/x)\, \frac{1}{x}\, dx - \int_{-\infty}^0 f_X(x)\, f_Y(z/x)\, \frac{1}{x}\, dx \\ &= \int_{-\infty}^{\infty} f_X(x)\, f_Y(z/x)\, \frac{1}{|x|}\, dx \end{aligned}$$

where the absolute value is used to conveniently combine the two terms.[3]
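As an illustrative sketch (not part of the original derivation), the product-density formula can be checked numerically: for $X, Y$ uniform on (0,1) the integral reduces to $-\ln z$, and a simple midpoint-rule quadrature of the general formula reproduces it.

```python
import math

def product_pdf(z, f_x, f_y, lo=1e-6, hi=1.0, n=10_000):
    """Midpoint-rule evaluation of f_Z(z) = integral of f_X(x) f_Y(z/x) / |x| dx,
    assuming the support of X lies in [lo, hi] with lo > 0."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        total += f_x(x) * f_y(z / x) / abs(x)
    return total * h

# For X, Y ~ Uniform(0,1) the closed form is f_Z(z) = -ln z on (0,1].
uniform = lambda t: 1.0 if 0.0 <= t <= 1.0 else 0.0
z = 0.3
print(product_pdf(z, uniform, uniform), -math.log(z))  # both near 1.204
```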

Alternate proof

A faster, more compact proof begins with the same step of writing the cumulative distribution of $Z$, starting with its definition:

$$\begin{aligned} F_Z(z) &\,\overset{\text{def}}{=}\, \mathbb{P}(Z \le z) = \mathbb{P}(XY \le z) \\ &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_X(x)\, f_Y(y)\, u(z - xy)\, dy\, dx \end{aligned}$$

where $u(\cdot)$ is the Heaviside step function and serves to limit the region of integration to values of $x$ and $y$ satisfying $xy \le z$.

We find the desired probability density function by taking the derivative of both sides with respect to $z$.

$$\begin{aligned} f_Z(z) &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_X(x)\, f_Y(y)\, \delta(z - xy)\, dy\, dx \\ &= \int_{-\infty}^{\infty} f_X(x)\, f_Y(z/x)\, \frac{1}{|x|}\, dx \end{aligned}$$

where we utilize the translation and scaling properties of the Dirac delta function $\delta$.

A more intuitive description of the procedure is illustrated in the figure below. The joint pdf $f_X(x) f_Y(y)$ exists in the $x$–$y$ plane, and an arc of constant $z$ value is shown as the shaded line. To find the marginal probability $f_Z(z)$ on this arc, integrate over increments of area $dx\,dy\, f(x, y)$ on this contour.

Diagram to illustrate the product distribution of two variables.

Starting with $y = \frac{z}{x}$, we have $dy = -\frac{z}{x^2}\, dx = -\frac{y}{x}\, dx$. So the probability increment is $\delta p = f(x,y)\, dx\, |dy| = f_X(x)\, f_Y(z/x)\, \frac{y}{|x|}\, dx\, dx$. Since $z = yx$ implies $dz = y\, dx$, we can relate the probability increment to the $z$-increment, namely $\delta p = f_X(x)\, f_Y(z/x)\, \frac{1}{|x|}\, dx\, dz$. Then integration over $x$ yields $f_Z(z) = \int f_X(x)\, f_Y(z/x)\, \frac{1}{|x|}\, dx$.

A Bayesian interpretation

Let $X \sim f_X(x)$ be a random sample drawn from probability distribution $f_X(x)$. Scaling $X$ by $\theta$ generates a sample from the scaled distribution $\theta X \sim \frac{1}{|\theta|} f_X\!\left(\frac{x}{\theta}\right)$, which can be written as a conditional distribution $g_X(x \mid \theta) = \frac{1}{|\theta|} f_X\!\left(\frac{x}{\theta}\right)$.

Letting $\theta$ be a random variable with pdf $f_\theta(\theta)$, the distribution of the scaled sample becomes $f_{X\theta}(x) = g_X(x \mid \theta)\, f_\theta(\theta)$, and integrating out $\theta$ we get $h_X(x) = \int_{-\infty}^{\infty} g_X(x \mid \theta)\, f_\theta(\theta)\, d\theta$, so $\theta X$ is drawn from this distribution: $\theta X \sim h_X(x)$. However, substituting the definition of $g$ we also have $h_X(x) = \int_{-\infty}^{\infty} \frac{1}{|\theta|} f_X\!\left(\frac{x}{\theta}\right) f_\theta(\theta)\, d\theta$, which has the same form as the product distribution above. Thus the Bayesian posterior distribution $h_X(x)$ is the distribution of the product of the two independent random samples $\theta$ and $X$.

For the case of one variable being discrete, let $\theta$ have probability $P_i$ at levels $\theta_i$ with $\sum_i P_i = 1$. The conditional density is $f_X(x \mid \theta_i) = \frac{1}{|\theta_i|} f_X\!\left(\frac{x}{\theta_i}\right)$. Therefore $f_{X\theta}(x) = \sum_i \frac{P_i}{|\theta_i|} f_X\!\left(\frac{x}{\theta_i}\right)$.

Expectation of product of random variables

When two random variables are statistically independent, the expectation of their product is the product of their expectations. This can be proved from the law of total expectation:

$$\operatorname{E}[XY] = \operatorname{E}\big[\operatorname{E}[XY \mid Y]\big]$$

In the inner expression, $Y$ is a constant. Hence:

$$\operatorname{E}[XY \mid Y] = Y \cdot \operatorname{E}[X \mid Y]$$
$$\operatorname{E}[XY] = \operatorname{E}\big[Y \cdot \operatorname{E}[X \mid Y]\big]$$

This is true even if $X$ and $Y$ are statistically dependent, in which case $\operatorname{E}[X \mid Y]$ is a function of $Y$. In the special case in which $X$ and $Y$ are statistically independent, it is a constant independent of $Y$. Hence:

$$\operatorname{E}[XY] = \operatorname{E}\big[Y \cdot \operatorname{E}[X]\big]$$
$$\operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]$$
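A minimal Monte Carlo check of this identity (an illustrative sketch; the two distributions chosen here are arbitrary):

```python
import random

random.seed(42)
N = 200_000
xs = [random.gauss(2.0, 1.0) for _ in range(N)]    # E[X] = 2
ys = [random.expovariate(0.5) for _ in range(N)]   # E[Y] = 1/0.5 = 2
e_xy = sum(x * y for x, y in zip(xs, ys)) / N
e_x, e_y = sum(xs) / N, sum(ys) / N
print(e_xy, e_x * e_y)  # both near 4 for independent samples
```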

Variance of the product of independent random variables

Let $X, Y$ be uncorrelated random variables with means $\mu_X, \mu_Y$ and variances $\sigma_X^2, \sigma_Y^2$. If, additionally, the random variables $X^2$ and $Y^2$ are uncorrelated, then the variance of the product $XY$ is

$$\operatorname{Var}(XY) = (\sigma_X^2 + \mu_X^2)(\sigma_Y^2 + \mu_Y^2) - \mu_X^2 \mu_Y^2$$

In the case of the product of more than two variables, if $X_1, \dots, X_n,\; n > 2$ are statistically independent then[4] the variance of their product is

$$\operatorname{Var}(X_1 X_2 \cdots X_n) = \prod_{i=1}^n (\sigma_i^2 + \mu_i^2) - \prod_{i=1}^n \mu_i^2$$
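The two-variable variance formula can be verified by simulation; a sketch with arbitrarily chosen means and variances:

```python
import random

random.seed(1)
N = 400_000
mx, sx = 3.0, 0.5   # mean and standard deviation of X
my, sy = 1.0, 2.0   # mean and standard deviation of Y
prods = [random.gauss(mx, sx) * random.gauss(my, sy) for _ in range(N)]
mean = sum(prods) / N
var_mc = sum((p - mean) ** 2 for p in prods) / N
var_formula = (sx**2 + mx**2) * (sy**2 + my**2) - mx**2 * my**2
print(var_mc, var_formula)  # both near 37.25
```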

Characteristic function of product of random variables

Assume $X, Y$ are independent random variables. The characteristic function of $X$ is $\varphi_X(t)$, and the distribution of $Y$ is known. Then from the law of total expectation, we have[5]

$$\begin{aligned} \varphi_Z(t) &= \operatorname{E}\big[e^{itXY}\big] \\ &= \operatorname{E}\big[\operatorname{E}[e^{itXY} \mid Y]\big] \\ &= \operatorname{E}\big[\varphi_X(tY)\big] \end{aligned}$$

If the characteristic functions and distributions of both $X$ and $Y$ are known, then alternatively, $\varphi_Z(t) = \operatorname{E}[\varphi_Y(tX)]$ also holds.

Mellin transform

The Mellin transform of a distribution $f(x)$ with support only on $x \ge 0$ and having a random sample $X$ is

$$\mathcal{M}\{f(x)\} = \varphi(s) = \int_0^\infty x^{s-1} f(x)\, dx = \operatorname{E}\big[X^{s-1}\big]$$

The inverse transform is

$$\mathcal{M}^{-1}\{\varphi(s)\} = f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} x^{-s} \varphi(s)\, ds$$

If $X$ and $Y$ are two independent random samples from different distributions, then the Mellin transform of their product is equal to the product of their Mellin transforms:

$$\mathcal{M}_{XY}(s) = \mathcal{M}_X(s)\, \mathcal{M}_Y(s)$$

If $s$ is restricted to integer values, a simpler result is

$$\operatorname{E}\big[(XY)^n\big] = \operatorname{E}\big[X^n\big]\, \operatorname{E}\big[Y^n\big]$$

Thus the moments of the random product $XY$ are the products of the corresponding moments of $X$ and $Y$, and this extends to non-integer moments, for example

$$\operatorname{E}\big[(XY)^{1/p}\big] = \operatorname{E}\big[X^{1/p}\big]\, \operatorname{E}\big[Y^{1/p}\big]$$

The pdf of a function can be reconstructed from its moments using the saddlepoint approximation method.

A further result is that for independent $X, Y$,

$$\operatorname{E}\big[X^p Y^q\big] = \operatorname{E}\big[X^p\big]\, \operatorname{E}\big[Y^q\big]$$
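A quick numerical illustration of this moment factorization (a sketch; the distributions and exponents are chosen arbitrarily):

```python
import random

random.seed(3)
N = 200_000
xs = [random.random() for _ in range(N)]          # Uniform(0,1): E[X^2] = 1/3
ys = [random.expovariate(1.0) for _ in range(N)]  # Exponential(1): E[Y^2] = 2
mc = sum((x * y) ** 2 for x, y in zip(xs, ys)) / N
print(mc)  # near E[X^2] E[Y^2] = 2/3
```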

Gamma distribution example. To illustrate how the product of moments yields a much simpler result than finding the moments of the distribution of the product, let $X, Y$ be sampled from two Gamma distributions, $f_{\mathrm{Gamma}}(x; \theta, 1) = \Gamma(\theta)^{-1} x^{\theta - 1} e^{-x}$, with parameters $\theta = \alpha, \beta$, whose moments are

$$\operatorname{E}\big[X^p\big] = \int_0^\infty x^p \Gamma(x; \theta)\, dx = \frac{\Gamma(\theta + p)}{\Gamma(\theta)}$$

Multiplying the corresponding moments gives the Mellin transform result

$$\operatorname{E}\big[(XY)^p\big] = \operatorname{E}\big[X^p\big]\, \operatorname{E}\big[Y^p\big] = \frac{\Gamma(\alpha + p)}{\Gamma(\alpha)}\, \frac{\Gamma(\beta + p)}{\Gamma(\beta)}$$

Independently, it is known that the product of two independent Gamma-distributed samples ($\sim$Gamma($\alpha$,1) and Gamma($\beta$,1)) has a K-distribution:

$$f(z; \alpha, \beta) = \frac{2}{\Gamma(\alpha)\Gamma(\beta)}\, z^{\frac{\alpha+\beta}{2} - 1}\, K_{\alpha-\beta}\big(2\sqrt{z}\big), \qquad z \ge 0$$

To find the moments of this, make the change of variable $y = 2\sqrt{z}$, simplifying similar integrals to:

$$\int_0^\infty z^p K_\nu\big(2\sqrt{z}\big)\, dz = 2^{-2p-1} \int_0^\infty y^{2p+1} K_\nu(y)\, dy$$

thus

$$2\int_0^\infty z^{\frac{\alpha+\beta}{2} + p - 1} K_{\alpha-\beta}\big(2\sqrt{z}\big)\, dz = 2^{-(\alpha+\beta) - 2p + 2} \int_0^\infty y^{\alpha+\beta+2p-1} K_{\alpha-\beta}(y)\, dy$$

The definite integral

$$\int_0^\infty y^\mu K_\nu(y)\, dy = 2^{\mu-1}\, \Gamma\!\left(\frac{1+\mu+\nu}{2}\right) \Gamma\!\left(\frac{1+\mu-\nu}{2}\right)$$

is well documented, and we have finally

$$\begin{aligned} \operatorname{E}\big[Z^p\big] &= \frac{2^{-(\alpha+\beta)-2p+2}}{\Gamma(\alpha)\,\Gamma(\beta)} \cdot 2^{\alpha+\beta+2p-2}\, \Gamma\!\left(\frac{\alpha+\beta+2p+\alpha-\beta}{2}\right) \Gamma\!\left(\frac{\alpha+\beta+2p-\alpha+\beta}{2}\right) \\ &= \frac{\Gamma(\alpha+p)\,\Gamma(\beta+p)}{\Gamma(\alpha)\,\Gamma(\beta)} \end{aligned}$$

which agrees with the moment product result above.

If $X, Y$ are drawn independently from Gamma distributions with shape parameters $\alpha, \beta$ then

$$\operatorname{E}\big[X^p Y^q\big] = \operatorname{E}\big[X^p\big]\, \operatorname{E}\big[Y^q\big] = \frac{\Gamma(\alpha + p)}{\Gamma(\alpha)}\, \frac{\Gamma(\beta + q)}{\Gamma(\beta)}$$

This type of result is universally true, since for bivariate independent variables $f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$, thus

$$\begin{aligned} \operatorname{E}\big[X^p Y^q\big] &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^p y^q f_{X,Y}(x, y)\, dy\, dx \\ &= \int_{-\infty}^{\infty} x^p \Big[\int_{-\infty}^{\infty} y^q f_Y(y)\, dy\Big] f_X(x)\, dx \\ &= \int_{-\infty}^{\infty} x^p f_X(x)\, dx \int_{-\infty}^{\infty} y^q f_Y(y)\, dy \\ &= \operatorname{E}\big[X^p\big]\, \operatorname{E}\big[Y^q\big] \end{aligned}$$

or equivalently, it is clear that $X^p$ and $Y^q$ are independent variables.
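The Gamma moment identities above can be checked by simulation; a sketch (the parameter values are arbitrary, and `random.gammavariate` uses the shape/scale convention):

```python
import math
import random

random.seed(7)
N = 300_000
alpha, beta, p, q = 2.0, 3.0, 0.5, 2.0
xs = [random.gammavariate(alpha, 1.0) for _ in range(N)]
ys = [random.gammavariate(beta, 1.0) for _ in range(N)]
mc = sum(x**p * y**q for x, y in zip(xs, ys)) / N
exact = (math.gamma(alpha + p) / math.gamma(alpha)) * (math.gamma(beta + q) / math.gamma(beta))
print(mc, exact)  # both near 15.95
```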

Special cases

Lognormal distributions

The distribution of the product of two random variables which have lognormal distributions is again lognormal. This is itself a special case of a more general set of results where the logarithm of the product can be written as the sum of the logarithms. Thus, in cases where a simple result can be found in the list of convolutions of probability distributions, where the distributions to be convolved are those of the logarithms of the components of the product, the result might be transformed to provide the distribution of the product. However this approach is only useful where the logarithms of the components of the product are in some standard families of distributions.

Uniformly distributed independent random variables

Let $Z$ be the product of two independent variables, $Z = X_1 X_2$, each uniformly distributed on the interval [0,1], possibly the outcome of a copula transformation. As noted in "Lognormal distributions" above, PDF convolution operations in the log domain correspond to the product of sample values in the original domain. Thus, making the transformation $u = \ln(x)$, such that $p_U(u)\, |du| = p_X(x)\, |dx|$, each variate is distributed independently on $u$ as

$$p_U(u) = \frac{p_X(x)}{|du/dx|} = \frac{1}{x^{-1}} = e^u, \qquad -\infty < u \le 0$$

and the convolution of the two distributions is the autoconvolution

$$c(y) = \int_{u=y}^{0} e^u\, e^{y-u}\, du = -y\, e^y, \qquad -\infty < y \le 0$$

Next retransform the variable to $z = e^y$, yielding the distribution

$$c_2(z) = \frac{c_Y(y)}{|dz/dy|} = \frac{-y\, e^y}{e^y} = -\ln(z) \quad \text{on the interval } (0, 1]$$
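The $-\ln z$ density is easy to confirm empirically; the following sketch compares a windowed density estimate with the closed form:

```python
import math
import random

random.seed(11)
N = 200_000
zs = [random.random() * random.random() for _ in range(N)]
z0, h = 0.25, 0.01  # estimate the density in a window around z0
emp = sum(1 for z in zs if abs(z - z0) < h) / (N * 2 * h)
print(emp, -math.log(z0))  # both near 1.386
```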

For the product of multiple ($n > 2$) independent samples the characteristic function route is favorable. If we define $\tilde{u} = -u$, then each $\tilde{u}$ is distributed as a Gamma distribution of shape 1 and scale factor 1 (a standard exponential, $p(\tilde{u}) = e^{-\tilde{u}}$), whose known CF is $(1 - it)^{-1}$. Note that $|d\tilde{u}| = |du|$, so the Jacobian of the transformation is unity. With $\tilde{y} = -y$, the autoconvolution above becomes $c(\tilde{y}) = \tilde{y}\, e^{-\tilde{y}}$, a Gamma distribution of shape 2.

The convolution of $n$ independent samples from $\tilde{U}$ therefore has CF $(1 - it)^{-n}$, which is known to be the CF of a Gamma distribution of shape $n$:

$$c_n(\tilde{y}) = \Gamma(n)^{-1}\, \tilde{y}^{\,n-1}\, e^{-\tilde{y}}$$

Making the inverse transformation $z = e^y = e^{-\tilde{y}}$ we get the PDF of the product of the $n$ samples:

$$f_n(z) = \frac{c_n(\tilde{y})}{|dz/d\tilde{y}|} = \Gamma(n)^{-1} \big(-\ln z\big)^{n-1}\, \frac{e^{-\tilde{y}}}{e^{-\tilde{y}}} = \frac{\big(-\ln z\big)^{n-1}}{(n-1)!}, \qquad 0 < z \le 1$$
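A sketch verifying the $n$-sample formula empirically (here $n = 4$; `math.prod` requires Python 3.8+):

```python
import math
import random

random.seed(5)
n, N = 4, 300_000
zs = [math.prod(random.random() for _ in range(n)) for _ in range(N)]
z0, h = 0.01, 0.002  # windowed density estimate around z0
emp = sum(1 for z in zs if abs(z - z0) < h) / (N * 2 * h)
exact = (-math.log(z0)) ** (n - 1) / math.factorial(n - 1)
print(emp, exact)  # exact is about 16.28
```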

The following, more conventional, derivation from Stackexchange[6] is consistent with this result. First of all, letting $Z_2 = X_1 X_2$, its CDF is

$$\begin{aligned} F_{Z_2}(z) = \Pr\big[Z_2 \le z\big] &= \int_{x=0}^{1} \Pr\Big[X_2 \le \frac{z}{x}\Big]\, f_{X_1}(x)\, dx \\ &= \int_{x=0}^{z} 1\, dx + \int_{x=z}^{1} \frac{z}{x}\, dx \\ &= z - z\ln z, \qquad 0 < z \le 1 \end{aligned}$$

The density of $Z_2$ is then $f_{Z_2}(z) = -\ln(z)$.

Multiplying by a third independent sample gives distribution function

$$\begin{aligned} F_{Z_3}(z) &= \int_{x=0}^{z} (-\ln x)\, dx + \int_{x=z}^{1} \frac{z}{x}\, (-\ln x)\, dx \\ &= z - z\ln z + \frac{z}{2}\big(\ln z\big)^2, \qquad 0 < z \le 1 \end{aligned}$$

Taking the derivative yields $f_{Z_3}(z) = \frac{1}{2}\big(\ln z\big)^2, \quad 0 < z \le 1$.

The author of the note conjectures that, in general,

$$f_{Z_n}(z) = \frac{\big(-\ln z\big)^{n-1}}{(n-1)!}, \qquad 0 < z \le 1$$

The geometry of the product distribution of two random variables in the unit square.

The figure illustrates the nature of the integrals above. The shaded area within the unit square and below the curve $xy = z$ represents the CDF of $z$. This divides into two parts. The first is for $0 < x < z$, where the increment of area in the vertical slot is just equal to $dx$. The second part lies below the $xy = z$ curve, has $y$-height $z/x$, and incremental area $dx\, (z/x)$.

Independent central-normal distributions

The product of two independent Normal samples has a density given by a modified Bessel function. Let $x_1, x_2$ be samples from a Normal(0,1) distribution and $z = x_1 x_2$. Then

$$p_Z(z) = \frac{K_0(|z|)}{\pi}, \qquad -\infty < z < \infty$$


The variance of this distribution could be determined, in principle, by a definite integral from Gradshteyn and Ryzhik,[7]

$$\int_0^\infty x^\mu K_\nu(ax)\, dx = 2^{\mu-1} a^{-\mu-1}\, \Gamma\!\left(\frac{1+\mu+\nu}{2}\right) \Gamma\!\left(\frac{1+\mu-\nu}{2}\right), \qquad a > 0, \; \mu + 1 \pm \nu > 0$$

thus

$$\operatorname{Var}(z) = \int_{-\infty}^{\infty} \frac{z^2\, K_0(|z|)}{\pi}\, dz = \frac{4}{\pi}\, \Gamma^2\!\left(\frac{3}{2}\right) = 1$$

A much simpler result, stated in a section above, is that the variance of the product of zero-mean independent samples is equal to the product of their variances. Since the variance of each Normal sample is one, the variance of the product is also one.
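A simulation sketch confirming that the product of two standard Normal samples has mean 0 and variance 1:

```python
import random

random.seed(9)
N = 500_000
prods = [random.gauss(0.0, 1.0) * random.gauss(0.0, 1.0) for _ in range(N)]
mean = sum(prods) / N
var = sum((p - mean) ** 2 for p in prods) / N
print(mean, var)  # near 0 and near 1
```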

Correlated central-normal distributions

The product of correlated Normal samples was recently addressed by Nadarajah and Pogány.[8] Let $X, Y$ be zero-mean, unit-variance, normally distributed variates with correlation coefficient $\rho$, and let $Z = XY$.

Then

$$f_Z(z) = \frac{1}{\pi\sqrt{1-\rho^2}}\, \exp\!\left(\frac{\rho z}{1-\rho^2}\right) K_0\!\left(\frac{|z|}{1-\rho^2}\right)$$

Mean and variance: For the mean we have $\operatorname{E}[Z] = \rho$ from the definition of the correlation coefficient. The variance can be found by transforming from two unit-variance, zero-mean, uncorrelated variables $U, V$. Let

$$X = U, \qquad Y = \rho U + \sqrt{1 - \rho^2}\, V$$

Then $X, Y$ are unit-variance variables with correlation coefficient $\rho$ and

$$\begin{aligned} \operatorname{E}\big[X^2 Y^2\big] &= \operatorname{E}\Big[U^2 \big(\rho U + \sqrt{1-\rho^2}\, V\big)^2\Big] \\ &= \operatorname{E}\Big[\rho^2 U^4 + 2\rho\sqrt{1-\rho^2}\, U^3 V + (1-\rho^2)\, U^2 V^2\Big] \end{aligned}$$

Removing odd-power terms, whose expectations are obviously zero, we get

$$\operatorname{E}\big[X^2 Y^2\big] = \rho^2 \operatorname{E}\big[U^4\big] + (1-\rho^2)\, \operatorname{E}\big[U^2\big] \operatorname{E}\big[V^2\big] = 3\rho^2 + (1-\rho^2) = 1 + 2\rho^2$$

Since $\operatorname{E}[XY] = \rho$, we have

$$\operatorname{Var}(XY) = \operatorname{E}\big[X^2 Y^2\big] - \big(\operatorname{E}[XY]\big)^2 = 1 + 2\rho^2 - \rho^2 = 1 + \rho^2$$

High correlation asymptote. In the highly correlated case, $\rho \to 1$, the product converges on the square of one sample. In this case the $K_0$ asymptote is $K_0(x) \approx \sqrt{\tfrac{\pi}{2x}}\, e^{-x}$ as $x \to \infty$, and

$$f_Z(z) \to \frac{1}{\sqrt{2\pi z}}\, e^{-z/2}, \qquad z > 0$$

which is a Chi-squared distribution with one degree of freedom.
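The mean $\rho$ and variance $1 + \rho^2$ of the correlated product can be checked by simulating $X, Y$ through the $U, V$ construction above (a sketch, with $\rho = 0.6$ chosen arbitrarily):

```python
import math
import random

random.seed(13)
N = 400_000
rho = 0.6
prods = []
for _ in range(N):
    u, v = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    x = u                                     # Var(X) = 1
    y = rho * u + math.sqrt(1 - rho**2) * v   # Var(Y) = 1, corr(X, Y) = rho
    prods.append(x * y)
mean = sum(prods) / N
var = sum((p - mean) ** 2 for p in prods) / N
print(mean, var)  # near rho = 0.6 and 1 + rho**2 = 1.36
```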

Multiple correlated samples. Nadarajah et al. further show that if $Z_1, \dots, Z_n$ are $n$ iid random variables sampled from $f_Z(z)$ above and $\bar{Z} = \tfrac{1}{n}\sum_{i=1}^n Z_i$ is their mean, then

 

where $W$ is the Whittaker function.

Using the identity (see, for example, eqn (13.13.9) of the DLMF compilation[9]), this expression can be somewhat simplified to

 

The pdf gives the distribution of a sample covariance. The approximate distribution of a correlation coefficient can be found via the Fisher transformation.

Multiple non-central correlated samples. The distribution of the product of correlated non-central normal samples was derived by Cui et al.[10] and takes the form of an infinite series of modified Bessel functions of the first kind.

Moments of product of correlated central normal samples

For a central normal distribution N(0,1) the moments are

$$\operatorname{E}\big[X^p\big] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^p\, e^{-x^2/2}\, dx = \begin{cases} 0 & \text{if } p \text{ is odd,} \\ (p-1)!! & \text{if } p \text{ is even,} \end{cases}$$

where $n!!$ denotes the double factorial.

If   are central correlated variables, the simplest bivariate case of the multivariate normal moment problem described by Kan,[11] then

 

where

  is the correlation coefficient and  


Correlated non-central normal distributions

The distribution of the product of non-central correlated normal samples was derived by Cui et al.[10] and takes the form of an infinite series.

These product distributions are somewhat comparable to the Wishart distribution. The latter is the joint distribution of the four elements (actually only three independent elements) of a sample covariance matrix. If $x_t, y_t$ are samples from a bivariate time series then $W = \sum_{t=1}^{K} \begin{pmatrix} x_t \\ y_t \end{pmatrix} \begin{pmatrix} x_t \\ y_t \end{pmatrix}^T$ is a Wishart matrix with $K$ degrees of freedom. The product distributions above are the unconditional distribution of the aggregate of $K > 1$ samples of $x_t y_t$.

Independent complex-valued central-normal distributions

Let $u_1, v_1, u_2, v_2$ be independent samples from a Normal(0,1) distribution. Setting $z_1 = u_1 + i v_1$ and $z_2 = u_2 + i v_2$, then $z_1, z_2$ are independent zero-mean complex normal samples with circular symmetry. Their complex variances are $\operatorname{Var}(z_i) = \operatorname{E}\big[|z_i|^2\big] = 2$.

The density functions of

$$r_i \equiv |z_i| = \big(u_i^2 + v_i^2\big)^{1/2}, \qquad i = 1, 2$$

are Rayleigh distributions defined as:

$$f_r(r) = r\, e^{-r^2/2}, \qquad r \ge 0$$

The variable $y_i \equiv r_i^2 = |z_i|^2$ is clearly Chi-squared with two degrees of freedom and has PDF

$$f_{y_i}(y_i) = \tfrac{1}{2}\, e^{-y_i/2}, \qquad y_i \ge 0$$

Wells et al.[12] show that the density function of $s \equiv |z_1 z_2|$ is

$$f_s(s) = s\, K_0(s), \qquad s \ge 0$$

and the cumulative distribution function of $s$ is

$$P(a) = \Pr\big[s \le a\big] = \int_0^a s\, K_0(s)\, ds = 1 - a\, K_1(a)$$

Thus the polar representation of the product of two uncorrelated complex Gaussian samples is

$$z_1 z_2 = s\, e^{i\theta}, \qquad f_s(s) = s\, K_0(s), \quad \theta \sim \text{Uniform}[0, 2\pi)$$

The first and second moments of this distribution can be found from the Gradshteyn–Ryzhik integral quoted in the Normal distributions section above:

$$m_1 = \int_0^\infty s^2\, K_0(s)\, ds = 2\, \Gamma^2\!\big(\tfrac{3}{2}\big) = \frac{\pi}{2}$$
$$m_2 = \int_0^\infty s^3\, K_0(s)\, ds = 4\, \Gamma^2(2) = 4$$

Thus its variance is $\operatorname{Var}(s) = m_2 - m_1^2 = 4 - \frac{\pi^2}{4}$.

Further, the density of $y \equiv s^2 = |z_1|^2 |z_2|^2 = y_1 y_2$ corresponds to the product of two independent Chi-square samples $y_i$, each with two DoF. Writing these as scaled Gamma distributions, $y_i = 2 g_i$ with $g_i \sim \mathrm{Gamma}(1, 1)$, then, from the Gamma products below, the density of the product is

$$f_y(y) = \tfrac{1}{2}\, K_0\!\big(\sqrt{y}\big), \qquad y \ge 0$$

Independent complex-valued noncentral normal distributions

The product of non-central independent complex Gaussians is described by O’Donoughue and Moura[13] and forms a double infinite series of modified Bessel functions of the first and second types.

Gamma distributions

The product of two independent Gamma samples, $z = x_1 x_2$ with $x_i \sim \mathrm{Gamma}(\alpha_i, \theta_i)$, follows[14]

$$f_Z(z) = \frac{2}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)\, (\theta_1 \theta_2)^{\frac{\alpha_1+\alpha_2}{2}}}\; z^{\frac{\alpha_1+\alpha_2}{2} - 1}\; K_{\alpha_1-\alpha_2}\!\left(2\sqrt{\frac{z}{\theta_1 \theta_2}}\right), \qquad z \ge 0$$

Beta distributions

Nagar et al.[15] define a correlated bivariate beta distribution

 

where

 

Then the pdf of Z = XY is given by

 

where $_2F_1$ is the Gauss hypergeometric function defined by the Euler integral

$$_2F_1(a, b; c; z) = \frac{\Gamma(c)}{\Gamma(b)\,\Gamma(c - b)} \int_0^1 v^{b-1} (1 - v)^{c-b-1} (1 - zv)^{-a}\, dv$$

Note that multivariate distributions are not generally unique, apart from the Gaussian case, and there may be alternatives.

Uniform and gamma distributions

The distribution of the product of a random variable having a uniform distribution on (0,1) with a random variable having a gamma distribution with shape parameter equal to 2, is an exponential distribution.[16] A more general case of this concerns the distribution of the product of a random variable having a beta distribution with a random variable having a gamma distribution: for some cases where the parameters of the two component distributions are related in a certain way, the result is again a gamma distribution but with a changed shape parameter.[16]
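The Uniform × Gamma(2) result can be verified by simulation; the sketch below checks the first two moments against Exponential(1) (mean 1, variance 1):

```python
import random

random.seed(21)
N = 300_000
zs = [random.random() * random.gammavariate(2.0, 1.0) for _ in range(N)]
mean = sum(zs) / N
var = sum((z - mean) ** 2 for z in zs) / N
print(mean, var)  # Exponential(1): mean near 1, variance near 1
```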

The K-distribution is an example of a non-standard distribution that can be defined as a product distribution (where both components have a gamma distribution).

Gamma and Pareto distributions

The product of n Gamma and m Pareto independent samples was derived by Nadarajah.[17]

Notes

  1. ^ Springer, Melvin Dale (1979). The Algebra of Random Variables. Wiley. ISBN 978-0-471-01406-5. Retrieved 24 September 2012.
  2. ^ Rohatgi, V. K. (1976). An Introduction to Probability Theory and Mathematical Statistics. Wiley Series in Probability and Statistics. New York: Wiley. doi:10.1002/9781118165676. ISBN 978-0-19-853185-2.
  3. ^ Grimmett, G. R.; Stirzaker, D.R. (2001). Probability and Random Processes. Oxford: Oxford University Press. ISBN 978-0-19-857222-0. Retrieved 4 October 2015.
  4. ^ Sarwate, Dilip (March 9, 2013). "Variance of product of multiple random variables". Stack Exchange.
  5. ^ "How to find characteristic function of product of random variables". Stack Exchange. January 3, 2013.
  6. ^ heropup (1 February 2014). "product distribution of two uniform distribution, what about 3 or more". Stack Exchange.
  7. ^ Gradsheyn, I S; Ryzhik, I M (1980). Tables of Integrals, Series and Products. Academic Press. pp. section 6.561.
  8. ^ Nadarajah, Saralees; Pogány, Tibor (2015). "On the distribution of the product of correlated normal random variables". Comptes Rendus de l'Académie des Sciences, Série I. 354 (2): 201–204. doi:10.1016/j.crma.2015.10.019.
  9. ^ Equ(13.18.9). "Digital Library of Mathematical Functions". NIST: National Institute of Standards and Technology.
  10. ^ a b Cui, Guolong (2016). "Exact Distribution for the Product of Two Correlated Gaussian Random Variables". IEEE Signal Processing Letters. 23 (11): 1662–1666. Bibcode:2016ISPL...23.1662C. doi:10.1109/LSP.2016.2614539.
  11. ^ Kan, Raymond (2008). "From moments of sum to moments of product". Journal of Multivariate Analysis. 99 (3): 542–554. doi:10.1016/j.jmva.2007.01.013.
  12. ^ Wells, R T; Anderson, R L; Cell, J W (1962). "The Distribution of the Product of Two Central or Non-Central Chi-Square Variates". The Annals of Mathematical Statistics. 33 (3): 1016–1020. doi:10.1214/aoms/1177704469.
  13. ^ O’Donoughue, N; Moura, J M F (March 2012). "On the Product of Independent Complex Gaussians". IEEE Transactions on Signal Processing. 60 (3): 1050–1063. Bibcode:2012ITSP...60.1050O. doi:10.1109/TSP.2011.2177264.
  14. ^ Wolfies (August 2017). "PDF of the product of two independent Gamma random variables". stackexchange.
  15. ^ Nagar, D K; Orozco-Castañeda, J M; Gupta, A K (2009). "Product and quotient of correlated beta variables". Applied Mathematics Letters. 22: 105–109. doi:10.1016/j.aml.2008.02.014.
  16. ^ a b Johnson, Norman L.; Kotz, Samuel; Balakrishnan, N. (1995). Continuous Univariate Distributions Volume 2, Second edition. Wiley. p. 306. ISBN 978-0-471-58494-0. Retrieved 24 September 2012.
  17. ^ Nadarajah, Saralees (June 2011). "Exact distribution of the product of n gamma and m Pareto random variables". Journal of Computational and Applied Mathematics. 235 (15): 4496–4512. doi:10.1016/j.cam.2011.04.018.


distribution, product, random, variables, product, distribution, probability, distribution, constructed, distribution, product, random, variables, having, other, known, distributions, given, statistically, independent, random, variables, distribution, random, . A product distribution is a probability distribution constructed as the distribution of the product of random variables having two other known distributions Given two statistically independent random variables X and Y the distribution of the random variable Z that is formed as the product Z X Y displaystyle Z XY is a product distribution Contents 1 Algebra of random variables 2 Derivation for independent random variables 2 1 Proof 2 2 Alternate proof 2 3 A Bayesian interpretation 3 Expectation of product of random variables 4 Variance of the product of independent random variables 5 Characteristic function of product of random variables 6 Mellin transform 7 Special cases 7 1 Lognormal distributions 7 2 Uniformly distributed independent random variables 7 3 Independent central normal distributions 7 4 Correlated central normal distributions 7 5 Correlated non central normal distributions 7 6 Independent complex valued central normal distributions 7 7 Independent complex valued noncentral normal distributions 7 8 Gamma distributions 7 9 Beta distributions 7 10 Uniform and gamma distributions 7 11 Gamma and Pareto distributions 8 See also 9 Notes 10 ReferencesAlgebra of random variables EditMain article Algebra of random variables The product is one type of algebra for random variables Related to the product distribution are the ratio distribution sum distribution see List of convolutions of probability distributions and difference distribution More generally one may talk of combinations of sums differences products and ratios Many of these distributions are described in Melvin D Springer s book from 1979 The Algebra of Random Variables 1 Derivation for independent random variables EditIf X displaystyle X and Y 
displaystyle Y are two independent continuous random variables described by probability density functions f X displaystyle f X and f Y displaystyle f Y then the probability density function of Z X Y displaystyle Z XY is 2 f Z z f X x f Y z x 1 x d x displaystyle f Z z int infty infty f X x f Y z x frac 1 x dx Proof Edit We first write the cumulative distribution function of Z displaystyle Z starting with its definition F Z z def P Z z P X Y z P X Y z X 0 P X Y z X 0 P Y z X X 0 P Y z X X 0 0 f X x z x f Y y d y d x 0 f X x z x f Y y d y d x displaystyle begin aligned F Z z amp stackrel text def mathbb P Z leq z amp mathbb P XY leq z amp mathbb P XY leq z X geq 0 mathbb P XY leq z X leq 0 amp mathbb P Y leq z X X geq 0 mathbb P Y geq z X X leq 0 amp int 0 infty f X x int infty z x f Y y dy dx int infty 0 f X x int z x infty f Y y dy dx end aligned We find the desired probability density function by taking the derivative of both sides with respect to z displaystyle z Since on the right hand side z displaystyle z appears only in the integration limits the derivative is easily performed using the fundamental theorem of calculus and the chain rule Note the negative sign that is needed when the variable occurs in the lower limit of the integration f Z z 0 f X x f Y z x 1 x d x 0 f X x f Y z x 1 x d x 0 f X x f Y z x 1 x d x 0 f X x f Y z x 1 x d x f X x f Y z x 1 x d x displaystyle begin aligned f Z z amp int 0 infty f X x f Y z x frac 1 x dx int infty 0 f X x f Y z x frac 1 x dx amp int 0 infty f X x f Y z x frac 1 x dx int infty 0 f X x f Y z x frac 1 x dx amp int infty infty f X x f Y z x frac 1 x dx end aligned where the absolute value is used to conveniently combine the two terms 3 Alternate proof Edit A faster more compact proof begins with the same step of writing the cumulative distribution of Z displaystyle Z starting with its definition F Z z d e f P Z z P X Y z f X x f Y y u z x y d y d x displaystyle begin aligned F Z z amp overset underset mathrm def mathbb 
P Z leq z amp mathbb P XY leq z amp int infty infty int infty infty f X x f Y y u z xy dy dx end aligned where u displaystyle u cdot is the Heaviside step function and serves to limit the region of integration to values of x displaystyle x and y displaystyle y satisfying x y z displaystyle xy leq z We find the desired probability density function by taking the derivative of both sides with respect to z displaystyle z f Z z f X x f Y y d z x y d y d x f X x f Y z x d z x y d y d x f X x f Y z x 1 x d x displaystyle begin aligned f Z z amp int infty infty int infty infty f X x f Y y delta z xy dy dx amp int infty infty f X x f Y z x left int infty infty delta z xy dy right dx amp int infty infty f X x f Y z x frac 1 x dx end aligned where we utilize the translation and scaling properties of the Dirac delta function d displaystyle delta A more intuitive description of the procedure is illustrated in the figure below The joint pdf f X x f Y y displaystyle f X x f Y y exists in the x displaystyle x y displaystyle y plane and an arc of constant z displaystyle z value is shown as the shaded line To find the marginal probability f Z z displaystyle f Z z on this arc integrate over increments of area d x d y f x y displaystyle dx dy f x y on this contour Diagram to illustrate the product distribution of two variables Starting with y z x displaystyle y frac z x we have d y z x 2 d x y x d x displaystyle dy frac z x 2 dx frac y x dx So the probability increment is d p f x y d x d y f X x f Y z x y x d x d x displaystyle delta p f x y dx dy f X x f Y z x frac y x dx dx Since z y x displaystyle z yx implies d z y d x displaystyle dz y dx we can relate the probability increment to the z displaystyle z increment namely d p f X x f Y z x 1 x d x d z displaystyle delta p f X x f Y z x frac 1 x dx dz Then integration over x displaystyle x yields f Z z f X x f Y z x 1 x d x displaystyle f Z z int f X x f Y z x frac 1 x dx A Bayesian interpretation Edit Let X f x displaystyle X sim f x 
be a random sample drawn from probability distribution f x x displaystyle f x x Scaling X displaystyle X by 8 displaystyle theta generates a sample from scaled distribution 8 X 1 8 f X x 8 displaystyle theta X sim frac 1 theta f X left frac x theta right which can be written as a conditional distribution g x x 8 1 8 f x x 8 displaystyle g x x theta frac 1 theta f x left frac x theta right Letting 8 displaystyle theta be a random variable with pdf f 8 8 displaystyle f theta theta the distribution of the scaled sample becomes f X 8 x g X x 8 f 8 8 displaystyle f X theta x g X x mid theta f theta theta and integrating out 8 displaystyle theta we get h x x g X x 8 f 8 8 d 8 displaystyle h x x int infty infty g X x theta f theta theta d theta so 8 X displaystyle theta X is drawn from this distribution 8 X h X x displaystyle theta X sim h X x However substituting the definition of g displaystyle g we also have h X x 1 8 f x x 8 f 8 8 d 8 displaystyle h X x int infty infty frac 1 theta f x left frac x theta right f theta theta d theta which has the same form as the product distribution above Thus the Bayesian posterior distribution h X x displaystyle h X x is the distribution of the product of the two independent random samples 8 displaystyle theta and X displaystyle X For the case of one variable being discrete let 8 displaystyle theta have probability P i displaystyle P i at levels 8 i displaystyle theta i with i P i 1 displaystyle sum i P i 1 The conditional density is f X x 8 i 1 8 i f x x 8 i displaystyle f X x mid theta i frac 1 theta i f x left frac x theta i right Therefore f X 8 x P i 8 i f X x 8 i displaystyle f X theta x sum frac P i theta i f X left frac x theta i right Expectation of product of random variables EditWhen two random variables are statistically independent the expectation of their product is the product of their expectations This can be proved from the law of total expectation E X Y E E X Y Y displaystyle operatorname E XY operatorname E 
operatorname E XY mid Y In the inner expression Y is a constant Hence E X Y Y Y E X Y displaystyle operatorname E XY mid Y Y cdot operatorname E X mid Y E X Y E Y E X Y displaystyle operatorname E XY operatorname E Y cdot operatorname E X mid Y This is true even if X and Y are statistically dependent in which case E X Y displaystyle operatorname E X mid Y is a function of Y In the special case in which X and Y are statistically independent it is a constant independent of Y Hence E X Y E Y E X displaystyle operatorname E XY operatorname E Y cdot operatorname E X E X Y E X E Y displaystyle operatorname E XY operatorname E X cdot operatorname E Y Variance of the product of independent random variables EditLet X Y displaystyle X Y be uncorrelated random variables with means m X m Y displaystyle mu X mu Y and variances s X 2 s Y 2 displaystyle sigma X 2 sigma Y 2 If additionally the random variables X 2 displaystyle X 2 and Y 2 displaystyle Y 2 are uncorrelated then the variance of the product XY is Var X Y s X 2 m X 2 s Y 2 m Y 2 m X 2 m Y 2 displaystyle operatorname Var XY sigma X 2 mu X 2 sigma Y 2 mu Y 2 mu X 2 mu Y 2 In the case of the product of more than two variables if X 1 X n n gt 2 displaystyle X 1 cdots X n n gt 2 are statistically independent then 4 the variance of their product is Var X 1 X 2 X n i 1 n s i 2 m i 2 i 1 n m i 2 displaystyle operatorname Var X 1 X 2 cdots X n prod i 1 n sigma i 2 mu i 2 prod i 1 n mu i 2 Characteristic function of product of random variables EditAssume X Y are independent random variables The characteristic function of X is f X t displaystyle varphi X t and the distribution of Y is known Then from the law of total expectation we have 5 f Z t E e i t X Y E E e i t X Y Y E f X t Y displaystyle begin aligned varphi Z t amp operatorname E e itXY amp operatorname E operatorname E e itXY mid Y amp operatorname E varphi X tY end aligned If the characteristic functions and distributions of both X and Y are known then alternatively f 
Z t E f Y t X displaystyle varphi Z t operatorname E varphi Y tX also holds Mellin transform EditThe Mellin transform of a distribution f x displaystyle f x with support only on x 0 displaystyle x geq 0 and having a random sample X displaystyle X is M f x f s 0 x s 1 f x d x E X s 1 displaystyle mathcal M f x varphi s int 0 infty x s 1 f x dx operatorname E X s 1 The inverse transform is M 1 f s f x 1 2 p i c i c i x s f s d s displaystyle mathcal M 1 varphi s f x frac 1 2 pi i int c i infty c i infty x s varphi s ds if X and Y displaystyle X text and Y are two independent random samples from different distributions then the Mellin transform of their product is equal to the product of their Mellin transforms M X Y s M X s M Y s displaystyle mathcal M XY s mathcal M X s mathcal M Y s If s is restricted to integer values a simpler result is E X Y n E X n E Y n displaystyle operatorname E XY n operatorname E X n operatorname E Y n Thus the moments of the random product X Y displaystyle XY are the product of the corresponding moments of X and Y displaystyle X text and Y and this extends to non integer moments for example E X Y 1 p E X 1 p E Y 1 p displaystyle operatorname E XY 1 p operatorname E X 1 p operatorname E Y 1 p The pdf of a function can be reconstructed from its moments using the saddlepoint approximation method A further result is that for independent X Y E X p Y q E X p E Y q displaystyle operatorname E X p Y q operatorname E X p operatorname E Y q Gamma distribution example To illustrate how the product of moments yields a much simpler result than finding the moments of the distribution of the product let X Y displaystyle X Y be sampled from two Gamma distributions f G a m m a x 8 1 G 8 1 x 8 1 e x displaystyle f Gamma x theta 1 Gamma theta 1 x theta 1 e x with parameters 8 a b displaystyle theta alpha beta whose moments are E X p 0 x p G x 8 d x G 8 p G 8 displaystyle operatorname E X p int 0 infty x p Gamma x theta dx frac Gamma theta p Gamma theta 
Multiplying the corresponding moments gives the Mellin transform result E X Y p E X p E Y p G a p G a G b p G b displaystyle operatorname E XY p operatorname E X p operatorname E Y p frac Gamma alpha p Gamma alpha frac Gamma beta p Gamma beta Independently it is known that the product of two independent Gamma distributed samples Gamma a 1 and Gamma b 1 has a K distribution f z a b 2 G a 1 G b 1 z a b 2 1 K a b 2 z 1 a b f K z a b 1 a b z 0 displaystyle f z alpha beta 2 Gamma alpha 1 Gamma beta 1 z frac alpha beta 2 1 K alpha beta 2 sqrt z frac 1 alpha beta f K left frac z alpha beta 1 alpha beta right z geq 0 To find the moments of this make the change of variable y 2 z displaystyle y 2 sqrt z simplifying similar integrals to 0 z p K n 2 z d z 2 2 p 1 0 y 2 p 1 K n y d y displaystyle int 0 infty z p K nu 2 sqrt z dz 2 2p 1 int 0 infty y 2p 1 K nu y dy thus 2 0 z a b 2 1 K a b 2 z d z 2 a b 2 p 1 0 y a b 2 p 1 K a b y d y displaystyle 2 int 0 infty z frac alpha beta 2 1 K alpha beta 2 sqrt z dz 2 alpha beta 2p 1 int 0 infty y alpha beta 2p 1 K alpha beta y dy The definite integral 0 y m K n y d y 2 m 1 G 1 m n 2 G 1 m n 2 displaystyle int 0 infty y mu K nu y dy 2 mu 1 Gamma left frac 1 mu nu 2 right Gamma left frac 1 mu nu 2 right is well documented and we have finallyE Z p 2 a b 2 p 1 2 a b 2 p 1 G a G b G a b 2 p a b 2 G a b 2 p a b 2 G a p G b p G a G b displaystyle begin aligned E Z p amp frac 2 alpha beta 2p 1 2 alpha beta 2p 1 Gamma alpha Gamma beta Gamma left frac alpha beta 2p alpha beta 2 right Gamma left frac alpha beta 2p alpha beta 2 right amp frac Gamma alpha p Gamma beta p Gamma alpha Gamma beta end aligned which after some difficulty has agreed with the moment product result above If X Y are drawn independently from Gamma distributions with shape parameters a b displaystyle alpha beta then E X p Y q E X p E Y q G a p G a G b q G b displaystyle operatorname E X p Y q operatorname E X p operatorname E Y q frac Gamma alpha p Gamma alpha frac Gamma beta q 
Gamma beta This type of result is universally true since for bivariate independent variables f X Y x y f X x f Y y displaystyle f X Y x y f X x f Y y thus E X p Y q x y x p y q f X Y x y d y d x x x p y y q f Y y d y f X x d x x x p f X x d x y y q f Y y d y E X p E Y q displaystyle begin aligned operatorname E X p Y q amp int x infty infty int y infty infty x p y q f X Y x y dy dx amp int x infty infty x p Big int y infty infty y q f Y y dy Big f X x dx amp int x infty infty x p f X x dx int y infty infty y q f Y y dy amp operatorname E X p operatorname E Y q end aligned or equivalently it is clear that X p and Y q displaystyle X p text and Y q are independent variables Special cases EditLognormal distributions Edit The distribution of the product of two random variables which have lognormal distributions is again lognormal This is itself a special case of a more general set of results where the logarithm of the product can be written as the sum of the logarithms Thus in cases where a simple result can be found in the list of convolutions of probability distributions where the distributions to be convolved are those of the logarithms of the components of the product the result might be transformed to provide the distribution of the product However this approach is only useful where the logarithms of the components of the product are in some standard families of distributions Uniformly distributed independent random variables Edit Let Z displaystyle Z be the product of two independent variables Z X 1 X 2 displaystyle Z X 1 X 2 each uniformly distributed on the interval 0 1 possibly the outcome of a copula transformation As noted in Lognormal Distributions above PDF convolution operations in the Log domain correspond to the product of sample values in the original domain Thus making the transformation u ln x displaystyle u ln x such that p U u d u p X x d x displaystyle p U u du p X x dx each variate is distributed independently on u as p U u p X x d u d x 1 x 1 e u 
and the convolution of the two distributions is the autoconvolution

$$c(y) = \int_{u=y}^{0} e^{u}\,e^{y-u}\,du = \int_{u=y}^{0} e^{y}\,du = -y\,e^{y}, \quad -\infty < y \le 0.$$

Next, retransform the variable to $z = e^{y}$, yielding the distribution

$$c_2(z) = \frac{c_Y(y)}{|dz/dy|} = \frac{-y\,e^{y}}{e^{y}} = -y = \ln\left(\frac{1}{z}\right)$$

on the interval $[0,1]$.

For the product of multiple ($>2$) independent samples the characteristic function route is favorable. If we define $\tilde{y} = -y$ and $\tilde{u} = -u$, then each $\tilde{u}$ follows a Gamma distribution of shape 1 and scale factor 1 (a unit exponential), whose known CF is $(1-it)^{-1}$; the autoconvolution $c(\tilde{y}) = \tilde{y}\,e^{-\tilde{y}}$ above is then a Gamma distribution of shape 2. Note that $|d\tilde{y}| = |dy|$, so the Jacobian of the transformation is unity.

The convolution of $n$ independent samples $\tilde{u}_i$ therefore has CF $(1-it)^{-n}$, which is known to be the CF of a Gamma distribution of shape $n$:

$$c_n(\tilde{y}) = \Gamma(n)^{-1}\,\tilde{y}^{\,n-1}\,e^{-\tilde{y}} = \Gamma(n)^{-1}\,(-y)^{n-1}\,e^{y}.$$

Making the inverse transformation $z = e^{y}$, we get the PDF of the product of the $n$ samples:

$$f_n(z) = \frac{c_n(\tilde{y})}{|dz/dy|} = \Gamma(n)^{-1}\,\frac{(-\log z)^{n-1}\,e^{y}}{e^{y}} = \frac{(-\log z)^{n-1}}{(n-1)!}, \quad 0 < z \le 1.$$

The following, more conventional, derivation from Stackexchange[6] is consistent with this result. First of all, letting $Z_2 = X_1 X_2$, its CDF is

$$\begin{aligned} F_{Z_2}(z) = \Pr\big[Z_2 \le z\big] &= \int_{x=0}^{1} \Pr\Big[X_2 \le \frac{z}{x}\Big] f_{X_1}(x)\,dx \\ &= \int_{x=0}^{z} 1\,dx + \int_{x=z}^{1} \frac{z}{x}\,dx \\ &= z - z\log z, \quad 0 < z \le 1. \end{aligned}$$

The density of $Z_2$ is then $f(z_2) = -\log(z_2)$.
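As a quick numerical sanity check (an addition, not part of the original article), the two-sample CDF $F_{Z_2}(z) = z - z\log z$ and the expectation factorization $\operatorname{E}[Z_2] = \operatorname{E}[X_1]\operatorname{E}[X_2] = 1/4$ can both be confirmed by simulation; this sketch assumes numpy is available.

```python
import numpy as np

# Product of two independent Uniform(0,1) samples.
rng = np.random.default_rng(42)
n = 200_000
z = rng.random(n) * rng.random(n)

# Independence gives E[Z] = E[X1] E[X2] = 1/4.
assert abs(z.mean() - 0.25) < 6e-3

# Empirical CDF against F(z) = z - z*log(z).
for q in (0.1, 0.3, 0.5, 0.7):
    assert abs((z <= q).mean() - (q - q * np.log(q))) < 6e-3
```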
Multiplying by a third independent sample gives the distribution function

$$\begin{aligned} F_{Z_3}(z) = \Pr\big[Z_3 \le z\big] &= \int_{x=0}^{1} \Pr\Big[X_3 \le \frac{z}{x}\Big] f_{Z_2}(x)\,dx \\ &= -\int_{x=0}^{z} \log(x)\,dx - \int_{x=z}^{1} \frac{z}{x}\log(x)\,dx \\ &= z\big(1 - \log z\big) + \frac{1}{2}\,z\log^{2} z. \end{aligned}$$

Taking the derivative yields

$$f_{Z_3}(z) = \frac{1}{2}\log^{2}(z), \quad 0 < z \le 1.$$

The author of the note conjectures that, in general,

$$f_{Z_n}(z) = \frac{(-\log z)^{n-1}}{(n-1)!}, \quad 0 < z \le 1.$$

[Figure: The geometry of the product distribution of two random variables in the unit square.]

The figure illustrates the nature of the integrals above. The shaded area within the unit square and below the line $z = xy$ represents the CDF of $z$. This divides into two parts. The first is for $0 < x < z$, where the increment of area in the vertical slot is just equal to $dx$. The second part lies below the $xy$ line, has $y$-height $z/x$, and incremental area $dx\,z/x$.

Independent central normal distributions

The product of two independent normal samples follows a modified Bessel function. Let $x, y$ be samples from a Normal(0,1) distribution and $z = xy$. Then

$$p_Z(z) = \frac{K_0(|z|)}{\pi}, \quad -\infty < z < \infty.$$

The variance of this distribution could be determined, in principle, by a definite integral from Gradshteyn and Ryzhik:[7]

$$\int_0^{\infty} x^{\mu} K_{\nu}(ax)\,dx = 2^{\mu-1} a^{-\mu-1}\,\Gamma\Big(\frac{1+\mu+\nu}{2}\Big)\Gamma\Big(\frac{1+\mu-\nu}{2}\Big), \quad a > 0, \; \nu + 1 \pm \mu > 0,$$

thus

$$\int_{-\infty}^{\infty} \frac{z^{2} K_0(|z|)}{\pi}\,dz = \frac{4}{\pi}\,\Gamma^{2}\Big(\frac{3}{2}\Big) = 1.$$

A much simpler result, stated in a section above, is that the variance of the product of zero-mean independent samples is equal to the product of their variances. Since the variance of each normal sample is one, the variance of the product is also one.
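As an added numerical cross-check (not from the article), scipy can confirm that $K_0(|z|)/\pi$ is a normalized density with unit variance; the integral is split at zero because $K_0$ has a logarithmic singularity there.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import k0

# Density of z = x*y for independent N(0,1) samples: p(z) = K_0(|z|)/pi.
pdf = lambda z: k0(abs(z)) / np.pi

# Integrate over (0, inf) and double, exploiting symmetry.
norm = 2 * quad(pdf, 0, np.inf)[0]
variance = 2 * quad(lambda z: z**2 * pdf(z), 0, np.inf)[0]

assert abs(norm - 1) < 1e-6      # density integrates to one
assert abs(variance - 1) < 1e-6  # variance of the product is one
```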
Correlated central normal distributions

The case of the product of correlated normal samples was recently addressed by Nadarajah and Pogány.[8] Let $X, Y$ be zero-mean, unit-variance, normally distributed variates with correlation coefficient $\rho$, and let $Z = XY$. Then

$$f_Z(z) = \frac{1}{\pi\sqrt{1-\rho^{2}}} \exp\left(\frac{\rho z}{1-\rho^{2}}\right) K_0\left(\frac{|z|}{1-\rho^{2}}\right).$$

Mean and variance: For the mean we have $\operatorname{E}[Z] = \rho$ from the definition of the correlation coefficient. The variance can be found by transforming from two unit-variance, zero-mean, uncorrelated variables $U, V$. Let

$$X = U, \qquad Y = \rho U + \sqrt{1-\rho^{2}}\,V.$$

Then $X, Y$ are unit-variance variables with correlation coefficient $\rho$, and

$$(XY)^{2} = U^{2}\bigg(\rho U + \sqrt{1-\rho^{2}}\,V\bigg)^{2} = U^{2}\bigg(\rho^{2} U^{2} + 2\rho\sqrt{1-\rho^{2}}\,UV + (1-\rho^{2})V^{2}\bigg).$$

Removing odd-power terms, whose expectations are obviously zero, we get

$$\operatorname{E}\big[(XY)^{2}\big] = \rho^{2}\operatorname{E}[U^{4}] + (1-\rho^{2})\operatorname{E}[U^{2}]\operatorname{E}[V^{2}] = 3\rho^{2} + (1-\rho^{2}) = 1 + 2\rho^{2}.$$

Since $(\operatorname{E}[Z])^{2} = \rho^{2}$, we have

$$\operatorname{Var}(Z) = \operatorname{E}[Z^{2}] - (\operatorname{E}[Z])^{2} = 1 + 2\rho^{2} - \rho^{2} = 1 + \rho^{2}.$$

High correlation asymptote: In the highly correlated case, $\rho \rightarrow 1$, the product converges on the square of one sample. In this case the $K_0$ asymptote is

$$K_0(x) \rightarrow \sqrt{\tfrac{\pi}{2x}}\,e^{-x} \text{ in the limit as } x = \frac{|z|}{1-\rho^{2}} \rightarrow \infty,$$

and

$$\begin{aligned} p(z) &\rightarrow \frac{1}{\pi\sqrt{1-\rho^{2}}} \exp\left(\frac{\rho z}{1-\rho^{2}}\right)\sqrt{\frac{\pi\,(1-\rho^{2})}{2|z|}}\,\exp\left(-\frac{|z|}{1-\rho^{2}}\right) \\ &= \frac{1}{\sqrt{2\pi z}}\exp\Bigg(\frac{-|z| + \rho z}{(1-\rho)(1+\rho)}\Bigg) \\ &= \frac{1}{\sqrt{2\pi z}}\exp\Bigg(\frac{-z}{1+\rho}\Bigg), \quad z > 0 \\ &\rightarrow \frac{1}{\Gamma(\tfrac{1}{2})\sqrt{2z}}\,e^{-\tfrac{z}{2}} \text{ as } \rho \rightarrow 1, \end{aligned}$$

which is a Chi-squared distribution with one degree of freedom.
Multiple correlated samples: Nadarajah et al. further show that if $Z_1, Z_2, \dots, Z_n$ are $n$ iid random variables sampled from $f_Z(z)$ and $\bar{Z} = \tfrac{1}{n}\sum Z_i$ is their mean, then

$$f_{\bar{Z}}(z) = \frac{n^{n/2}\,2^{-n/2}}{\Gamma(\tfrac{n}{2})}\,|z|^{\tfrac{n}{2}-1}\exp\left(\frac{\beta-\gamma}{2}\,z\right) W_{0,\tfrac{1-n}{2}}\big((\beta+\gamma)\,|z|\big), \quad -\infty < z < \infty,$$

where $W$ is the Whittaker function, while $\beta = \frac{n}{1-\rho}$ and $\gamma = \frac{n}{1+\rho}$.

Using the identity

$$W_{0,\nu}(x) = \sqrt{\frac{x}{\pi}}\,K_{\nu}(x/2), \quad x \ge 0$$

(see, for example, the DLMF compilation, eqn. 13.18.9[9]), this expression can be somewhat simplified to

$$f_{\bar{Z}}(z) = \frac{n^{n/2}\,2^{-n/2}}{\Gamma(\tfrac{n}{2})}\,|z|^{\tfrac{n}{2}-1}\exp\left(\frac{\beta-\gamma}{2}\,z\right)\sqrt{\frac{(\beta+\gamma)\,|z|}{\pi}}\;K_{\tfrac{1-n}{2}}\left(\frac{\beta+\gamma}{2}\,|z|\right), \quad -\infty < z < \infty.$$

The pdf gives the distribution of a sample covariance. The approximate distribution of a correlation coefficient can be found via the Fisher transformation.

Multiple non-central correlated samples: The distribution of the product of correlated non-central normal samples was derived by Cui et al.[10] and takes the form of an infinite series of modified Bessel functions of the first kind.

Moments of product of correlated central normal samples

For a central normal distribution N(0,$\sigma^2$) the moments are

$$\operatorname{E}[X^p] = \frac{1}{\sigma\sqrt{2\pi}}\int_{-\infty}^{\infty} x^{p}\exp\left(-\tfrac{x^{2}}{2\sigma^{2}}\right) dx = \begin{cases} 0 & \text{if } p \text{ is odd,} \\ \sigma^{p}\,(p-1)!! & \text{if } p \text{ is even,} \end{cases}$$

where $n!!$ denotes the double factorial.
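The double-factorial moment formula can be verified by direct quadrature (an added check, not from the article; scipy assumed):

```python
import numpy as np
from scipy.integrate import quad

def normal_moment(p, sigma):
    """E[X^p] for X ~ N(0, sigma^2), by direct integration."""
    f = lambda x: x**p * np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return quad(f, -np.inf, np.inf)[0]

def double_factorial(n):
    return 1 if n <= 0 else n * double_factorial(n - 2)

sigma = 1.5
for p in range(1, 7):
    # Odd moments vanish; even moments are sigma^p (p-1)!!.
    expected = 0.0 if p % 2 else sigma**p * double_factorial(p - 1)
    assert abs(normal_moment(p, sigma) - expected) < 1e-6 * (1 + abs(expected))
```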
If $X, Y \sim \text{Norm}(0,1)$ are central correlated variables — the simplest bivariate case of the multivariate normal moment problem described by Kan[11] — then

$$\operatorname{E}[X^{p} Y^{q}] = \begin{cases} 0 & \text{if } p+q \text{ is odd,} \\ \dfrac{p!\,q!}{2^{\tfrac{p+q}{2}}} \displaystyle\sum_{k=0}^{t} \frac{(2\rho)^{2k}}{\big(\tfrac{p}{2}-k\big)!\,\big(\tfrac{q}{2}-k\big)!\,(2k)!} & \text{if } p \text{ and } q \text{ are even,} \\ \dfrac{p!\,q!}{2^{\tfrac{p+q}{2}}} \displaystyle\sum_{k=0}^{t} \frac{(2\rho)^{2k+1}}{\big(\tfrac{p-1}{2}-k\big)!\,\big(\tfrac{q-1}{2}-k\big)!\,(2k+1)!} & \text{if } p \text{ and } q \text{ are odd,} \end{cases}$$

where $\rho$ is the correlation coefficient and $t = \lfloor \min(p,q)/2 \rfloor$.

Correlated non-central normal distributions

The distribution of the product of non-central correlated normal samples was derived by Cui et al.[10] and takes the form of an infinite series.

These product distributions are somewhat comparable to the Wishart distribution. The latter is the joint distribution of the four elements (actually only three independent elements) of a sample covariance matrix. If $x_t, y_t$ are samples from a bivariate time series, then $W = \sum_{t=1}^{K} \dbinom{x_t}{y_t}\dbinom{x_t}{y_t}^{T}$ is a Wishart matrix with $K$ degrees of freedom. The product distributions above are the unconditional distribution of the aggregate of $K > 1$ samples of $W_{2,1}$.

Independent complex-valued central normal distributions

Let $u_1, v_1, u_2, v_2$ be independent samples from a normal(0,1) distribution.
Setting $z_1 = u_1 + i v_1$ and $z_2 = u_2 + i v_2$, then $z_1, z_2$ are independent zero-mean complex normal samples with circular symmetry. Their complex variances are $\operatorname{Var}(z_i) = 2$.

The density functions of $r_i \equiv |z_i| = (u_i^{2} + v_i^{2})^{1/2}, \; i = 1,2$, are Rayleigh distributions defined as

$$f_r(r_i) = r_i\,e^{-r_i^{2}/2} \text{ of mean } \sqrt{\tfrac{\pi}{2}} \text{ and variance } \frac{4-\pi}{2}.$$

The variable $y_i \equiv r_i^{2}$ is clearly Chi-squared with two degrees of freedom and has PDF

$$f_{y_i}(y_i) = \tfrac{1}{2}\,e^{-y_i/2} \text{ of mean value } 2.$$

Wells et al.[12] show that the density function of $s \equiv |z_1 z_2|$ is

$$f_s(s) = s\,K_0(s), \quad s \ge 0,$$

and the cumulative distribution function of $s$ is

$$P(a) = \Pr[s \le a] = \int_{s=0}^{a} s\,K_0(s)\,ds = 1 - a\,K_1(a).$$

Thus the polar representation of the product of two uncorrelated complex Gaussian samples is

$$f_{s,\theta}(s,\theta) = f_s(s)\,p_{\theta}(\theta), \text{ where } p_{\theta} \text{ is uniform on } [0, 2\pi].$$

The first and second moments of this distribution can be found from the integral in "Independent central normal distributions" above:

$$m_1 = \int_0^{\infty} s^{2}\,K_0(s)\,ds = 2\,\Gamma^{2}\big(\tfrac{3}{2}\big) = 2\big(\tfrac{\sqrt{\pi}}{2}\big)^{2} = \frac{\pi}{2},$$

$$m_2 = \int_0^{\infty} s^{3}\,K_0(s)\,ds = 2^{2}\,\Gamma^{2}\big(\tfrac{4}{2}\big) = 4.$$

Thus its variance is

$$\operatorname{Var}(s) = m_2 - m_1^{2} = 4 - \frac{\pi^{2}}{4}.$$

Further, the density of $z \equiv s^{2} = |r_1 r_2|^{2} = r_1^{2}\,r_2^{2} = y_1 y_2$ corresponds to the product of two independent Chi-squared samples $y_i$, each with two DoF. Writing these as scaled Gamma distributions $f_y(y_i) = \tfrac{1}{\theta\,\Gamma(1)}\,e^{-y_i/\theta}$ with $\theta = 2$, then, from the Gamma products below, the density of the product is

$$f_Z(z) = \tfrac{1}{2}\,K_0(\sqrt{z}) \text{ with expectation } \operatorname{E}[z] = 4.$$
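The closed-form CDF $1 - aK_1(a)$ and the moments $m_1 = \pi/2$, $m_2 = 4$ can be confirmed by quadrature against $f_s(s) = sK_0(s)$ (an added sketch assuming scipy):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import k0, k1

# Density of s = |z1 z2| for two independent circular complex normals.
f = lambda s: s * k0(s)

# CDF should match 1 - a*K_1(a).
for a in (0.5, 1.0, 3.0):
    assert abs(quad(f, 0, a)[0] - (1 - a * k1(a))) < 1e-7

# First and second moments: pi/2 and 4, hence Var(s) = 4 - pi^2/4.
m1 = quad(lambda s: s * f(s), 0, np.inf)[0]
m2 = quad(lambda s: s**2 * f(s), 0, np.inf)[0]
assert abs(m1 - np.pi / 2) < 1e-6
assert abs(m2 - 4) < 1e-6
```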
Independent complex-valued noncentral normal distributions

The product of non-central independent complex Gaussians is described by O'Donoughue and Moura[13] and forms a double infinite series of modified Bessel functions of the first and second types.

Gamma distributions

The product of two independent Gamma samples, $z = x_1 x_2$, defining

$$\Gamma(x; k_i, \theta_i) = \frac{x^{k_i - 1}\,e^{-x/\theta_i}}{\Gamma(k_i)\,\theta_i^{k_i}},$$

follows[14]

$$\begin{aligned} p_Z(z) &= \frac{2}{\Gamma(k_1)\Gamma(k_2)}\,\frac{z^{\tfrac{k_1+k_2}{2}-1}}{(\theta_1\theta_2)^{\tfrac{k_1+k_2}{2}}}\;K_{k_1-k_2}\left(2\sqrt{\frac{z}{\theta_1\theta_2}}\right) \\ &= \frac{2}{\Gamma(k_1)\Gamma(k_2)}\,\frac{y^{\tfrac{k_1+k_2}{2}-1}}{\theta_1\theta_2}\;K_{k_1-k_2}\left(2\sqrt{y}\right), \text{ where } y = \frac{z}{\theta_1\theta_2}. \end{aligned}$$

Beta distributions

Nagar et al.[15] define a correlated bivariate beta distribution

$$f(x,y) = \frac{x^{a-1}\,y^{b-1}\,(1-x)^{b+c-1}\,(1-y)^{a+c-1}}{B(a,b,c)\,(1-xy)^{a+b+c}}, \quad 0 < x, y < 1,$$

where

$$B(a,b,c) = \frac{\Gamma(a)\,\Gamma(b)\,\Gamma(c)}{\Gamma(a+b+c)}.$$

Then the pdf of $Z = XY$ is given by

$$f_Z(z) = \frac{B(a+c,\,b+c)\,z^{a-1}\,(1-z)^{c-1}}{B(a,b,c)}\;{}_2F_1(a+c,\,a+c;\,a+b+2c;\,1-z), \quad 0 < z < 1,$$

where ${}_2F_1$ is the Gauss hypergeometric function defined by the Euler integral

$${}_2F_1(a,b;c;z) = \frac{\Gamma(c)}{\Gamma(a)\,\Gamma(c-a)}\int_0^1 v^{a-1}\,(1-v)^{c-a-1}\,(1-vz)^{-b}\,dv.$$
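As an added spot-check (not from the article), the product density of Nagar et al. can be verified to integrate to one using scipy's `hyp2f1`; the parameter sets below are illustrative choices.

```python
from scipy.integrate import quad
from scipy.special import beta as beta2, gamma, hyp2f1

def beta3(a, b, c):
    # Three-parameter beta function of Nagar et al.
    return gamma(a) * gamma(b) * gamma(c) / gamma(a + b + c)

def f_Z(z, a, b, c):
    # pdf of Z = XY under the correlated bivariate beta distribution.
    return (beta2(a + c, b + c) / beta3(a, b, c)
            * z**(a - 1) * (1 - z)**(c - 1)
            * hyp2f1(a + c, a + c, a + b + 2 * c, 1 - z))

# The product density should integrate to one over (0, 1).
for a, b, c in [(1.0, 2.0, 1.0), (2.0, 3.0, 1.5)]:
    total = quad(f_Z, 0, 1, args=(a, b, c))[0]
    assert abs(total - 1) < 1e-6
```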
Note that multivariate distributions are not generally unique, apart from the Gaussian case, and there may be alternatives.

Uniform and gamma distributions

The distribution of the product of a random variable having a uniform distribution on (0,1) with a random variable having a gamma distribution with shape parameter equal to 2 is an exponential distribution.[16] A more general case of this concerns the distribution of the product of a random variable having a beta distribution with a random variable having a gamma distribution: for some cases where the parameters of the two component distributions are related in a certain way, the result is again a gamma distribution but with a changed shape parameter.[16]

The K-distribution is an example of a non-standard distribution that can be defined as a product distribution (where both components have a gamma distribution).

Gamma and Pareto distributions

The product of n Gamma and m Pareto independent samples was derived by Nadarajah.[17]

See also

Algebra of random variables
Sum of independent random variables

Notes

1. Springer, Melvin Dale (1979). The Algebra of Random Variables. Wiley. ISBN 978-0-471-01406-5. Retrieved 24 September 2012.
2. Rohatgi, V. K. (1976). An Introduction to Probability Theory and Mathematical Statistics. Wiley Series in Probability and Statistics. New York: Wiley. doi:10.1002/9781118165676. ISBN 978-0-19-853185-2.
3. Grimmett, G. R.; Stirzaker, D. R. (2001). Probability and Random Processes. Oxford: Oxford University Press. ISBN 978-0-19-857222-0. Retrieved 4 October 2015.
4. Sarwate, Dilip (March 9, 2013). "Variance of product of multiple random variables". Stack Exchange.
5. "How to find characteristic function of product of random variables". Stack Exchange. January 3, 2013.
6. heropup (1 February 2014). "product distribution of two uniform distribution, what about 3 or more". Stack Exchange.
7. Gradshteyn, I. S.; Ryzhik, I. M. (1980). Tables of Integrals, Series and Products. Academic Press. Section 6.561.
8. Nadarajah, Saralees; Pogány, Tibor (2015). "On the distribution of the product of correlated normal random variables". Comptes Rendus de l'Académie des Sciences, Série I. 354 (2): 201–204. doi:10.1016/j.crma.2015.10.019.
9. "Equation 13.18.9". Digital Library of Mathematical Functions. NIST (National Institute of Standards and Technology).
10. Cui, Guolong (2016). "Exact Distribution for the Product of Two Correlated Gaussian Random Variables". IEEE Signal Processing Letters. 23 (11): 1662–1666. Bibcode:2016ISPL...23.1662C. doi:10.1109/LSP.2016.2614539.
11. Kan, Raymond (2008). "From moments of sum to moments of product". Journal of Multivariate Analysis. 99 (3): 542–554. doi:10.1016/j.jmva.2007.01.013.
12. Wells, R. T.; Anderson, R. L.; Cell, J. W. (1962). "The Distribution of the Product of Two Central or Non-Central Chi-Square Variates". The Annals of Mathematical Statistics. 33 (3): 1016–1020. doi:10.1214/aoms/1177704469.
13. O'Donoughue, N.; Moura, J. M. F. (March 2012). "On the Product of Independent Complex Gaussians". IEEE Transactions on Signal Processing. 60 (3): 1050–1063. Bibcode:2012ITSP...60.1050O. doi:10.1109/TSP.2011.2177264.
14. Wolfies (August 2017). "PDF of the product of two independent Gamma random variables". Stack Exchange.
15. Nagar, D. K.; Orozco-Castañeda, J. M.; Gupta, A. K. (2009). "Product and quotient of correlated beta variables". Applied Mathematics Letters. 22: 105–109. doi:10.1016/j.aml.2008.02.014.
16. Johnson, Norman L.; Kotz, Samuel; Balakrishnan, N. (1995). Continuous Univariate Distributions, Volume 2 (Second ed.). Wiley. p. 306. ISBN 978-0-471-58494-0. Retrieved 24 September 2012.
17. Nadarajah, Saralees (June 2011). "Exact distribution of the product of n gamma and m Pareto random variables". Journal of Computational and Applied Mathematics. 235 (15): 4496–4512. doi:10.1016/j.cam.2011.04.018.