
Joint probability distribution

Given two random variables that are defined on the same probability space,[1] the joint probability distribution is the corresponding probability distribution on all possible pairs of outputs. The joint distribution can just as well be considered for any given number of random variables. The joint distribution encodes the marginal distributions, i.e. the distributions of each of the individual random variables. It also encodes the conditional probability distributions, which deal with how the outputs of one random variable are distributed when given information on the outputs of the other random variable(s).

Many sample observations (black) are shown from a joint probability distribution. The marginal densities are shown as well (in blue and in red).

In the formal mathematical setup of measure theory, the joint distribution is given by the pushforward measure, by the map obtained by pairing together the given random variables, of the sample space's probability measure.

In the case of real-valued random variables, the joint distribution, as a particular multivariate distribution, may be expressed by a multivariate cumulative distribution function, or by a multivariate probability density function together with a multivariate probability mass function. In the special case of continuous random variables, it is sufficient to consider probability density functions, and in the case of discrete random variables, it is sufficient to consider probability mass functions.

Examples

Draws from an urn

Each of two urns contains twice as many red balls as blue balls, and no others, and one ball is randomly selected from each urn, with the two draws independent of each other. Let A and B be discrete random variables associated with the outcomes of the draw from the first urn and second urn respectively. The probability of drawing a red ball from either of the urns is 2/3, and the probability of drawing a blue ball is 1/3. The joint probability distribution is presented in the following table:

         A=Red             A=Blue            P(B)
B=Red    (2/3)(2/3)=4/9    (1/3)(2/3)=2/9    4/9+2/9=2/3
B=Blue   (2/3)(1/3)=2/9    (1/3)(1/3)=1/9    2/9+1/9=1/3
P(A)     4/9+2/9=2/3       2/9+1/9=1/3

Each of the four inner cells shows the probability of a particular combination of results from the two draws; these probabilities are the joint distribution. In any one cell the probability of a particular combination occurring is (since the draws are independent) the product of the probability of the specified result for A and the probability of the specified result for B. The probabilities in these four cells sum to 1, as with all probability distributions.

Moreover, the final row and the final column give the marginal probability distribution for A and the marginal probability distribution for B respectively. For example, for A the first of these cells gives the sum of the probabilities for A being red, regardless of which possibility for B in the column above the cell occurs, as 2/3. Thus the marginal probability distribution for A gives A's probabilities unconditional on B, in a margin of the table.
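
As an informal check of the table above, the following Python sketch (not part of the original article) rebuilds the joint distribution from the single-draw probabilities and recovers the marginals by summing rows and columns; exact fractions are used to avoid rounding.

```python
from fractions import Fraction

# Probabilities for a single draw from either urn (from the example above).
p_draw = {"Red": Fraction(2, 3), "Blue": Fraction(1, 3)}

# Joint distribution of (A, B): the draws are independent, so each cell is a product.
joint = {(a, b): p_draw[a] * p_draw[b] for a in p_draw for b in p_draw}

# Marginals recovered by summing over the other variable.
p_A = {a: sum(joint[(a, b)] for b in p_draw) for a in p_draw}
p_B = {b: sum(joint[(a, b)] for a in p_draw) for b in p_draw}

print(joint)                      # {('Red', 'Red'): 4/9, ('Red', 'Blue'): 2/9, ...}
print(p_A, p_B)                   # both equal {'Red': 2/3, 'Blue': 1/3}
print(sum(joint.values()) == 1)   # True: the four cells sum to 1
```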

Coin flips

Consider the flip of two fair coins; let A and B be discrete random variables associated with the outcomes of the first and second coin flips respectively. Each coin flip is a Bernoulli trial and has a Bernoulli distribution. If a coin displays "heads" then the associated random variable takes the value 1, and it takes the value 0 otherwise. The probability of each of these outcomes is 1/2, so the marginal (unconditional) density functions are

P(A) = \tfrac{1}{2} \quad \text{for} \quad A \in \{0, 1\}
P(B) = \tfrac{1}{2} \quad \text{for} \quad B \in \{0, 1\}

The joint probability mass function of A and B defines probabilities for each pair of outcomes. All possible outcomes are

(A=0, B=0), \quad (A=0, B=1), \quad (A=1, B=0), \quad (A=1, B=1)

Since each outcome is equally likely, the joint probability mass function becomes

P(A, B) = \tfrac{1}{4} \quad \text{for} \quad A, B \in \{0, 1\}

Since the coin flips are independent, the joint probability mass function is the product of the marginals:

P(A, B) = P(A)\, P(B) \quad \text{for} \quad A, B \in \{0, 1\}
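
A short Python sketch (illustrative only) that enumerates the four outcomes, recovers the marginals, and confirms the factorization P(A, B) = P(A) P(B) stated above:

```python
from itertools import product
from fractions import Fraction

# Enumerate the four equally likely outcomes of two fair, independent coin flips.
outcomes = list(product([0, 1], repeat=2))          # (A, B) pairs
joint = {ab: Fraction(1, 4) for ab in outcomes}     # each pair has probability 1/4

# Marginals: P(A=a) and P(B=b), each equal to 1/2.
p_A = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
p_B = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}

# Independence: every joint cell equals the product of the marginals.
assert all(joint[(a, b)] == p_A[a] * p_B[b] for a, b in outcomes)
```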

Rolling a die

Consider the roll of a fair die and let A = 1 if the number is even (i.e. 2, 4, or 6) and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (i.e. 2, 3, or 5) and B = 0 otherwise.

    1   2   3   4   5   6
A   0   1   0   1   0   1
B   0   1   1   0   1   0

Then, the joint distribution of A and B, expressed as a probability mass function, is

\mathrm{P}(A=0, B=0) = P\{1\} = \tfrac{1}{6}, \qquad \mathrm{P}(A=1, B=0) = P\{4, 6\} = \tfrac{2}{6},
\mathrm{P}(A=0, B=1) = P\{3, 5\} = \tfrac{2}{6}, \qquad \mathrm{P}(A=1, B=1) = P\{2\} = \tfrac{1}{6}.

These probabilities necessarily sum to 1, since the probability of some combination of A and B occurring is 1.
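
The same joint distribution can be tabulated programmatically; the following illustrative Python snippet sums the 1/6 probability of each face into the corresponding (A, B) cell:

```python
from fractions import Fraction

die = range(1, 7)
A = {n: int(n % 2 == 0) for n in die}          # 1 if the roll is even
B = {n: int(n in (2, 3, 5)) for n in die}      # 1 if the roll is prime

# Joint pmf of (A, B), built by accumulating the 1/6 probability of each face.
joint = {}
for n in die:
    key = (A[n], B[n])
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 6)

print(joint)                 # {(0, 0): 1/6, (1, 1): 1/6, (0, 1): 1/3, (1, 0): 1/3}
print(sum(joint.values()))   # 1
```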

Marginal probability distribution

If more than one random variable is defined in a random experiment, it is important to distinguish between the joint probability distribution of X and Y and the probability distribution of each variable individually. The individual probability distribution of a random variable is referred to as its marginal probability distribution. In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables.

If the joint probability density function of the random variables X and Y is f_{X,Y}(x, y), the marginal probability density functions of X and Y, which define the marginal distributions, are given by:

f_X(x) = \int f_{X,Y}(x, y)\, dy
f_Y(y) = \int f_{X,Y}(x, y)\, dx

where the first integral is over all points in the range of (X,Y) for which X=x and the second integral is over all points in the range of (X,Y) for which Y=y.[2]
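
As a numerical illustration of marginalization (using a hypothetical joint density that is not taken from the article, namely two independent Exponential(1) variables), the integral over y can be approximated with scipy:

```python
import numpy as np
from scipy import integrate

# A hypothetical joint density: two independent Exponential(1) variables,
# f(x, y) = exp(-x - y) for x, y >= 0.
def f_xy(x, y):
    return np.exp(-x - y)

# Marginal density of X at x: integrate the joint density over all y.
def f_x(x):
    value, _ = integrate.quad(lambda y: f_xy(x, y), 0, np.inf)
    return value

print(f_x(1.0))          # ~0.3679, matching the Exponential(1) density e^{-1}
print(np.exp(-1.0))
```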

Joint cumulative distribution function

For a pair of random variables (X, Y), the joint cumulative distribution function (CDF) F_{X,Y} is given by[3]: p. 89 

F_{X,Y}(x, y) = \operatorname{P}(X \le x,\, Y \le y)

(Eq.1)

where the right-hand side represents the probability that the random variable X takes on a value less than or equal to x and that Y takes on a value less than or equal to y.

For N random variables X_1, \ldots, X_N, the joint CDF F_{X_1, \ldots, X_N} is given by

F_{X_1, \ldots, X_N}(x_1, \ldots, x_N) = \operatorname{P}(X_1 \le x_1, \ldots, X_N \le x_N)

(Eq.2)

Interpreting the N random variables as a random vector \mathbf{X} = (X_1, \ldots, X_N)^T yields a shorter notation:

F_{\mathbf{X}}(\mathbf{x}) = \operatorname{P}(X_1 \le x_1, \ldots, X_N \le x_N)
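
A Monte Carlo sketch of the joint CDF; the choice of two independent standard normal variables is an illustrative assumption, not part of the article:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical example: X and Y are independent standard normals.
X = rng.standard_normal(n)
Y = rng.standard_normal(n)

x0, y0 = 0.5, -0.3
# Empirical joint CDF: fraction of samples with X <= x0 and Y <= y0.
F_hat = np.mean((X <= x0) & (Y <= y0))

# For independent variables the joint CDF factors into the marginal CDFs.
F_exact = norm.cdf(x0) * norm.cdf(y0)
print(F_hat, F_exact)    # the two values should agree to about 2-3 decimal places
```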

Joint density function or mass function

Discrete case

The joint probability mass function of two discrete random variables X, Y is:

p_{X,Y}(x, y) = \mathrm{P}(X = x \ \mathrm{and}\ Y = y)

(Eq.3)

or written in terms of conditional distributions

p_{X,Y}(x, y) = \mathrm{P}(Y = y \mid X = x) \cdot \mathrm{P}(X = x) = \mathrm{P}(X = x \mid Y = y) \cdot \mathrm{P}(Y = y)

where \mathrm{P}(Y = y \mid X = x) is the probability of Y = y given that X = x.

The generalization of the preceding two-variable case is the joint probability distribution of n discrete random variables X_1, X_2, \dots, X_n, which is:

p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \mathrm{P}(X_1 = x_1 \text{ and } \dots \text{ and } X_n = x_n)

(Eq.4)

or equivalently

p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \mathrm{P}(X_1 = x_1) \cdot \mathrm{P}(X_2 = x_2 \mid X_1 = x_1) \cdot \mathrm{P}(X_3 = x_3 \mid X_1 = x_1, X_2 = x_2) \cdots \mathrm{P}(X_n = x_n \mid X_1 = x_1, X_2 = x_2, \dots, X_{n-1} = x_{n-1}).

This identity is known as the chain rule of probability.

Since these are probabilities, in the two-variable case

\sum_i \sum_j \mathrm{P}(X = x_i \ \mathrm{and}\ Y = y_j) = 1,

which generalizes for n discrete random variables X_1, X_2, \dots, X_n to

\sum_i \sum_j \cdots \sum_k \mathrm{P}(X_1 = x_{1i}, X_2 = x_{2j}, \dots, X_n = x_{nk}) = 1.
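
The chain rule and the normalization condition can be checked numerically on an arbitrary made-up joint table; the three binary variables below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary joint pmf over three binary variables (made up for illustration).
p = rng.random((2, 2, 2))
p /= p.sum()                                      # normalize so probabilities sum to 1

# Chain rule: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2)
p1 = p.sum(axis=(1, 2))                           # P(X1)
p2_given_1 = p.sum(axis=2) / p1[:, None]          # P(X2 | X1)
p3_given_12 = p / p.sum(axis=2, keepdims=True)    # P(X3 | X1, X2)

reconstructed = p1[:, None, None] * p2_given_1[:, :, None] * p3_given_12
assert np.allclose(reconstructed, p)
print(p.sum())   # 1.0, as required of a probability distribution
```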

Continuous case

The joint probability density function f_{X,Y}(x, y) for two continuous random variables is defined as the derivative of the joint cumulative distribution function (see Eq.1):

f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}

(Eq.5)

This is equal to:

f_{X,Y}(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x) = f_{X \mid Y}(x \mid y)\, f_Y(y)

where f_{Y \mid X}(y \mid x) and f_{X \mid Y}(x \mid y) are the conditional distributions of Y given X = x and of X given Y = y respectively, and f_X(x) and f_Y(y) are the marginal distributions for X and Y respectively.

The definition extends naturally to more than two random variables:

f_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \frac{\partial^n F_{X_1, \ldots, X_n}(x_1, \ldots, x_n)}{\partial x_1 \cdots \partial x_n}

(Eq.6)

Again, since these are probability distributions, one has

\int_x \int_y f_{X,Y}(x, y)\, dy\, dx = 1

respectively

\int_{x_1} \cdots \int_{x_n} f_{X_1, \ldots, X_n}(x_1, \ldots, x_n)\, dx_n \cdots dx_1 = 1
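
As a numerical sanity check (assuming, purely for illustration, a standard bivariate normal density with correlation 0.5, which is not an example from the article), the joint density integrates to 1:

```python
import numpy as np
from scipy import integrate

rho = 0.5   # illustrative correlation; any value in (-1, 1) works

# Standard bivariate normal density with correlation rho.
def f_xy(x, y):
    norm_const = 1.0 / (2 * np.pi * np.sqrt(1 - rho**2))
    quad_form = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
    return norm_const * np.exp(-0.5 * quad_form)

# Integrate over a box wide enough to capture essentially all of the mass.
total, _ = integrate.dblquad(f_xy, -10, 10, lambda x: -10, lambda x: 10)
print(total)   # ~1.0
```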

Mixed case

The "mixed joint density" may be defined where one or more random variables are continuous and the other random variables are discrete. With one variable of each type

 

One example of a situation in which one may wish to find the cumulative distribution of one random variable which is continuous and another random variable which is discrete arises when one wishes to use a logistic regression in predicting the probability of a binary outcome Y conditional on the value of a continuously distributed outcome  . One must use the "mixed" joint density when finding the cumulative distribution of this binary outcome because the input variables   were initially defined in such a way that one could not collectively assign it either a probability density function or a probability mass function. Formally,   is the probability density function of   with respect to the product measure on the respective supports of   and  . Either of these two decompositions can then be used to recover the joint cumulative distribution function:

 

The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
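
A sketch of such a mixed joint density, assuming for illustration that X is standard normal and that the binary Y follows a logistic model P(Y = 1 | X = x) = 1/(1 + e^{-x}); these modelling choices are hypothetical, not from the article:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Illustrative assumptions: X ~ N(0, 1) and a logistic link for the binary Y.
def p_y_given_x(y, x):
    p1 = 1.0 / (1.0 + np.exp(-x))      # P(Y = 1 | X = x)
    return p1 if y == 1 else 1.0 - p1

# Mixed joint density: f(x, y) = P(Y = y | X = x) * f_X(x).
def f_xy(x, y):
    return p_y_given_x(y, x) * norm.pdf(x)

# Summing over the discrete y and integrating over the continuous x gives 1.
total = sum(integrate.quad(lambda x: f_xy(x, y), -np.inf, np.inf)[0] for y in (0, 1))
print(total)   # ~1.0
```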

Additional properties

Joint distribution for independent variables

In general two random variables X and Y are independent if and only if the joint cumulative distribution function satisfies

F_{X,Y}(x, y) = F_X(x) \cdot F_Y(y)

Two discrete random variables X and Y are independent if and only if the joint probability mass function satisfies

P(X = x \ \text{and}\ Y = y) = P(X = x) \cdot P(Y = y)

for all x and y.

As the number of independent random events grows, the associated joint probability decreases rapidly to zero, according to a negative exponential law.

Similarly, two absolutely continuous random variables are independent if and only if

f_{X,Y}(x, y) = f_X(x) \cdot f_Y(y)

for all x and y. This means that acquiring any information about the value of one or more of the random variables leads to a conditional distribution of any other variable that is identical to its unconditional (marginal) distribution; thus no variable provides any information about any other variable.
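
Revisiting the die example above, a quick check shows that A (roll is even) and B (roll is prime) are not independent, because a joint cell fails to factor into the product of its marginals; this is only an illustrative verification:

```python
from fractions import Fraction

# Joint pmf of A (roll is even) and B (roll is prime) from the die example.
joint = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 3),
         (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 6)}

p_A = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
p_B = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}

# Independence would require every cell to equal the product of the marginals; it fails here.
print(all(joint[(a, b)] == p_A[a] * p_B[b] for a in (0, 1) for b in (0, 1)))  # False
print(joint[(1, 1)], p_A[1] * p_B[1])   # 1/6 vs 1/4
```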

Joint distribution for conditionally dependent variables

If a subset A of the variables X_1, \ldots, X_n is conditionally dependent given another subset B of these variables, then the probability mass function of the joint distribution is \mathrm{P}(X_1, \ldots, X_n), which is equal to P(B) \cdot P(A \mid B). Therefore, it can be efficiently represented by the lower-dimensional probability distributions P(B) and P(A \mid B). Such conditional independence relations can be represented with a Bayesian network or copula functions.
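
A minimal sketch of this factorization on a made-up two-variable table: the joint is split into P(B) and P(A | B) and then reconstructed exactly (the numbers are arbitrary, chosen only for illustration):

```python
import numpy as np

# A made-up joint pmf over A (rows) and B (columns).
joint = np.array([[0.10, 0.25],
                  [0.30, 0.35]])

p_B = joint.sum(axis=0)            # P(B): marginal over A
p_A_given_B = joint / p_B          # P(A | B): each column normalized

# The joint is recovered exactly from the two lower-dimensional factors.
assert np.allclose(p_A_given_B * p_B, joint)
print(p_B, p_A_given_B, sep="\n")
```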

Covariance

When two or more random variables are defined on a probability space, it is useful to describe how they vary together; that is, it is useful to measure the relationship between the variables. A common measure of the relationship between two random variables is the covariance. Covariance is a measure of the linear relationship between the random variables. If the relationship between the random variables is nonlinear, the covariance might not be sensitive to the relationship; that is, it may fail to capture the dependence between the two variables.

The covariance between the random variables X and Y, denoted cov(X, Y), is:

\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - \mu_X \mu_Y[4]

Correlation

There is another measure of the relationship between two random variables that is often easier to interpret than the covariance.

The correlation just scales the covariance by the product of the standard deviation of each variable. Consequently, the correlation is a dimensionless quantity that can be used to compare the linear relationships between pairs of variables in different units. If the points in the joint probability distribution of X and Y that receive positive probability tend to fall along a line of positive (or negative) slope, ρXY is near +1 (or −1). If ρXY equals +1 or −1, it can be shown that the points in the joint probability distribution that receive positive probability fall exactly along a straight line. Two random variables with nonzero correlation are said to be correlated. Similar to covariance, the correlation is a measure of the linear relationship between random variables.

The correlation between the random variables X and Y, denoted \rho_{XY}, is

\rho_{XY} = \frac{\operatorname{cov}(X, Y)}{\sqrt{V(X)\, V(Y)}} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}
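
Both quantities can be computed directly from a joint table. The sketch below reuses the die example's joint distribution of A (even) and B (prime); using that particular table here is an illustrative choice, not part of this section:

```python
from fractions import Fraction
from math import sqrt

# Joint pmf of A (roll is even) and B (roll is prime) from the die example.
joint = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 3),
         (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 6)}

E_A  = sum(a * p for (a, b), p in joint.items())
E_B  = sum(b * p for (a, b), p in joint.items())
E_AB = sum(a * b * p for (a, b), p in joint.items())
var_A = sum((a - E_A) ** 2 * p for (a, b), p in joint.items())
var_B = sum((b - E_B) ** 2 * p for (a, b), p in joint.items())

cov_AB = E_AB - E_A * E_B                    # cov(A, B) = E(AB) - E(A)E(B)
rho_AB = cov_AB / sqrt(var_A * var_B)        # correlation removes the units

print(cov_AB)   # -1/12: evenness and primality are negatively related
print(rho_AB)   # -0.333..., i.e. -1/3
```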

Important named distributions

Named joint distributions that arise frequently in statistics include the multivariate normal distribution, the multivariate stable distribution, the multinomial distribution, the negative multinomial distribution, the multivariate hypergeometric distribution, and the elliptical distribution.

See also

Bayesian programming
Chow–Liu tree
Conditional probability
Copula (probability theory)
Disintegration theorem
Multivariate statistics
Statistical interference
Pairwise independent distribution

References

  1. ^ Feller, William (1957). An Introduction to Probability Theory and Its Applications. Vol. 1 (3rd ed.). pp. 217–218. ISBN 978-0471257080.
  2. ^ Montgomery, Douglas C.; Runger, George C. (2013). Applied Statistics and Probability for Engineers (6th ed.). Hoboken, NJ. ISBN 978-1-118-53971-2. OCLC 861273897.
  3. ^ Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
  4. ^ Montgomery, Douglas C.; Runger, George C. (2013). Applied Statistics and Probability for Engineers (6th ed.). Hoboken, NJ. ISBN 978-1-118-53971-2. OCLC 861273897.

External links

"Joint distribution", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
"Multi-dimensional distribution", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
Dekking, Michel (2005). A Modern Introduction to Probability and Statistics: Understanding Why and How. London: Springer. ISBN 978-1-85233-896-1. OCLC 262680588.
"Joint continuous density function" at PlanetMath
MathWorld: Joint Distribution Function
