bivariate distribution

When, as a result of an experiment, values for two random variables X and Y are obtained, it is said that there is a bivariate distribution. In the case of a sample, with n pairs of values (x1, y1), (x2, y2),…, (xn, yn) being obtained, the methods of correlation and regression may be appropriate. Associated with the experiment is a bivariate probability distribution. In the case when X and Y are discrete random variables, the distribution is specified by the joint distribution giving P(X=xj & Y=yk) for all values of j and k.
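
As an illustration (not part of the original entry), the following Python sketch stores a discrete joint distribution as a table of probabilities P(X=xj & Y=yk); the numerical values are invented purely for the example.

```python
import numpy as np

# Illustrative joint probabilities P(X = x_j & Y = y_k); the values are
# invented for this sketch and are not taken from the entry.
x_values = np.array([0, 1, 2])        # possible values of X (indexed by j)
y_values = np.array([0, 1])           # possible values of Y (indexed by k)
joint = np.array([[0.10, 0.20],
                  [0.30, 0.15],
                  [0.05, 0.20]])      # joint[j, k] = P(X = x_j & Y = y_k)

assert np.isclose(joint.sum(), 1.0)   # the probabilities must sum to 1

print(joint[1, 0])                    # P(X = 1 & Y = 0) = 0.30
```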

The marginal distribution of X is given by P(X=xj) = Σk P(X=xj & Y=yk), and the marginal distribution of Y is given by P(Y=yk) = Σj P(X=xj & Y=yk). If X and Y are independent random variables then P(X=xj & Y=yk) = P(X=xj)P(Y=yk) for all j and k. The expected values and variances of X and Y are given in the usual way from these marginal distributions. For example, E(X) = Σj xj P(X=xj). The conditional distribution of X, given that Y=yk, is given by P(X=xj|Y=yk) = P(X=xj & Y=yk)/P(Y=yk), and the conditional expectation of X, given that Y=yk, written as E(X|Y=yk), is defined in the usual way as E(X|Y=yk) = Σj xj P(X=xj|Y=yk). The conditional distribution of Y and the conditional expectation of Y, given that X=xj, are defined similarly.
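
Continuing the same invented joint table, this sketch computes the marginal distributions, checks independence, and evaluates E(X), E(Y) and E(X|Y=yk) as defined above; names such as p_x and e_x_given_y are chosen for the example only.

```python
import numpy as np

x_values = np.array([0, 1, 2])
y_values = np.array([0, 1])
joint = np.array([[0.10, 0.20],
                  [0.30, 0.15],
                  [0.05, 0.20]])      # joint[j, k] = P(X = x_j & Y = y_k)

# Marginal distributions: sum the joint probabilities over the other variable.
p_x = joint.sum(axis=1)               # P(X = x_j) = sum over k of joint[j, k]
p_y = joint.sum(axis=0)               # P(Y = y_k) = sum over j of joint[j, k]

# Independence would require joint[j, k] = P(X = x_j) P(Y = y_k) for all j, k.
independent = np.allclose(joint, np.outer(p_x, p_y))

# Expected values from the marginals, e.g. E(X) = sum over j of x_j P(X = x_j).
e_x = np.sum(x_values * p_x)
e_y = np.sum(y_values * p_y)

# Conditional distribution of X given Y = y_k (here k = 0) and E(X | Y = y_k).
k = 0
p_x_given_y = joint[:, k] / p_y[k]    # P(X = x_j | Y = y_k)
e_x_given_y = np.sum(x_values * p_x_given_y)

print(p_x, p_y, independent, e_x, e_y, e_x_given_y)
```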

In the case when X and Y are continuous random variables, the distribution is specified by the joint probability density function f(x, y) with the property that, if R is any region of the (x, y) plane, then P((X, Y) ∈ R) = ∫∫R f(x, y) dx dy. The marginal distribution of X then has probability density function (pdf) fX(x) = ∫ f(x, y) dy, and the marginal distribution of Y has pdf fY(y) = ∫ f(x, y) dx. If the two random variables are independent of one another then f(x, y) is the product of the pdfs of the two marginal distributions. The expected values and variances of X and Y are given in the usual way from these marginal distributions.
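
For the continuous case, a minimal numerical sketch (assuming the illustrative pdf f(x, y) = x + y on the unit square, which integrates to 1) evaluates P((X, Y) ∈ R) for a rectangular region R and the marginal pdf of X by numerical integration with SciPy.

```python
from scipy import integrate

# Illustrative joint pdf: f(x, y) = x + y on the unit square (it integrates to 1).
def f(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

# P((X, Y) in R) is the double integral of f over R; here R = [0, 0.5] x [0, 0.5].
# dblquad integrates its first argument (y) innermost and its second (x) outermost.
prob_R, _ = integrate.dblquad(lambda y, x: f(x, y), 0, 0.5, 0, 0.5)

# Marginal pdf of X at a point x: integrate f(x, y) over all y.
def f_X(x):
    value, _ = integrate.quad(lambda y: f(x, y), 0, 1)
    return value                      # equals x + 1/2 for this particular f

print(prob_R, f_X(0.3))               # 0.125 and 0.8
```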

The probability density function for the conditional distribution of X, given that Y=y, is f(x, y)/fY(y), and the conditional expectation of X, given that Y=y, is E(X|Y=y) = ∫ x f(x, y)/fY(y) dx. See also multivariate distribution.
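
Using the same illustrative pdf as in the previous sketch, the conditional pdf f(x, y)/fY(y) and the conditional expectation E(X|Y=y) can be evaluated numerically; the function names below are chosen for the example only.

```python
from scipy import integrate

# Same illustrative joint pdf as above: f(x, y) = x + y on the unit square.
def f(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_Y(y):
    value, _ = integrate.quad(lambda x: f(x, y), 0, 1)
    return value                      # marginal pdf of Y; equals y + 1/2 here

def f_X_given_Y(x, y):
    return f(x, y) / f_Y(y)           # conditional pdf of X given Y = y

def E_X_given_Y(y):
    value, _ = integrate.quad(lambda x: x * f_X_given_Y(x, y), 0, 1)
    return value                      # E(X | Y = y) = integral of x f(x|y) dx

print(E_X_given_Y(0.5))               # (1/3 + 1/4) / 1 = 0.5833...
```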

Subjects: Probability and Statistics.

