Multivariate normal distribution

In probability theory and statistics, a multivariate normal distribution, also sometimes called a multivariate Gaussian distribution (in honor of Carl Friedrich Gauss, who was not the first to write about the normal distribution), is a generalization of the one-dimensional normal distribution to higher dimensions.

General case

A random vector <math>X = [X_1, \cdots, X_N]</math> follows a multivariate normal distribution, also sometimes called a multivariate Gaussian distribution, if it satisfies the following equivalent conditions:

  • there is a random vector <math>Z = [Z_1, \cdots, Z_M]</math>, whose components are independent standard normal random variables, a vector <math>\mu = [\mu_1, \cdots, \mu_N]</math> and an <math>N \times M</math> matrix <math>A</math> such that <math>X = A Z + \mu</math>.
  • there is a vector <math>\mu</math> and a symmetric, positive semi-definite matrix <math>\Gamma</math> such that the characteristic function of <math>X</math> is

<math>
\phi_X(u) = \exp\left( i \mu^\top u - \frac{1}{2} u^\top \Gamma u \right).
</math>

The following is not quite equivalent to the conditions above, since it fails to allow for a singular matrix as the variance:

  • there is a vector <math>\mu = [\mu_1, \cdots, \mu_N]</math> and a symmetric, positive definite covariance matrix <math>\Sigma</math> (an <math>N \times N</math> matrix) such that <math>X</math> has density

<math>
f_X(x_1, \cdots, x_N) = \frac{1}{(2\pi)^{N/2} \left|\Sigma\right|^{1/2}} \exp\left( -\frac{1}{2} (x - \mu)^\top \Sigma^{-1} (x - \mu) \right)
</math>

where <math>\left| \Sigma \right|</math> is the determinant of <math>\Sigma</math>. Note how the equation above reduces to that of the univariate normal distribution if <math>\Sigma</math> is a scalar (i.e., a real number).
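As a concrete check, the density above can be evaluated directly. The following is a minimal NumPy sketch; the mean and covariance are arbitrary illustrative values, not taken from the text:

    import numpy as np

    def mvn_pdf(x, mu, Sigma):
        # Density of N(mu, Sigma) at x, following the formula above
        N = len(mu)
        diff = x - mu
        norm_const = (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(Sigma))
        quad_form = diff @ np.linalg.solve(Sigma, diff)  # (x-mu)^T Sigma^{-1} (x-mu)
        return np.exp(-0.5 * quad_form) / norm_const

    mu = np.array([0.0, 1.0])
    Sigma = np.array([[2.0, 0.3],
                      [0.3, 1.0]])
    print(mvn_pdf(np.array([0.5, 0.5]), mu, Sigma))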

The vector <math>\mu</math> in these conditions is the expected value of <math>X</math> and the matrix <math>\Sigma = A A^\top</math> is the covariance matrix of the components <math>X_i</math>.

It is important to realize that the covariance matrix must be allowed to be singular. That case arises frequently in statistics; for example, in the distribution of the vector of residuals in ordinary linear regression problems. Note also that the <math>X_i</math> are in general not independent; they can be seen as the result of applying the linear transformation <math>A</math> to a collection of independent Gaussian variables <math>Z</math>.

The multivariate normal can be written in the following notation:

<math>X \sim N(\mu, \Sigma)</math>

or, to make it explicit that <math>X</math> is <math>N</math>-dimensional,

<math>X \sim N_N(\mu, \Sigma).</math>

Cumulative distribution function

The cumulative distribution function (cdf) <math>F(x)</math> is defined as the probability that all values in a random vector <math>X</math> are less than or equal to the corresponding values in vector <math>x</math>. Though there is no closed form for <math>F(x)</math>, there are a number of algorithms that estimate it numerically. For one such example, see [1] (http://alex.strashny.org/2005/04/multivariate_normal_cumulative_distribution_function_cdf_in_matlab.html) (includes MATLAB code).
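As one such numerical option (an aside, not the method cited above), SciPy exposes a cdf routine for the multivariate normal; a minimal sketch with illustrative values:

    import numpy as np
    from scipy.stats import multivariate_normal

    mu = np.array([0.0, 0.0])
    Sigma = np.array([[1.0, 0.5],
                      [0.5, 1.0]])

    # P(X_1 <= 1, X_2 <= 1), estimated numerically
    print(multivariate_normal(mean=mu, cov=Sigma).cdf(np.array([1.0, 1.0])))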

A counterexample

The fact that two random variables X and Y are normally distributed does not imply that the pair (X, Y) has a bivariate normal distribution. A simple example is one in which Y = X if |X| > 1 and Y = −X if |X| < 1.

If X and Y are normally distributed and independent, then they are "jointly normally distributed", i.e., the pair (X, Y) does have a bivariate normal distribution. There are of course also many bivariate normal distributions in which the components are correlated.
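A quick simulation (an illustration, not from the original text) makes the counterexample concrete: the marginal of Y is standard normal, yet X + Y has an atom at 0, which no jointly normal pair could produce unless X + Y were constant.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal(100_000)
    Y = np.where(np.abs(X) > 1, X, -X)   # the construction above

    # The marginal of Y looks standard normal...
    print(Y.mean(), Y.std())             # approx 0 and 1

    # ...but X + Y equals 0 exactly whenever |X| < 1:
    print(np.mean(X + Y == 0))           # approx P(|X| < 1), about 0.68
    # If (X, Y) were bivariate normal, X + Y would be normal and
    # could not have an atom at 0 without being identically 0.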

Bivariate case

In the 2-dimensional nonsingular case, the probability density function is

<math>
f(x,y) = \frac{1}{2 \pi \sigma_x \sigma_y \sqrt{1-\rho^2}} \exp\left( -\frac{1}{2(1-\rho^2)} \left( \frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2} - \frac{2 \rho x y}{\sigma_x \sigma_y} \right) \right)
</math>

where <math>\rho</math> is the correlation between <math>X</math> and <math>Y</math>. Here both means are taken to be zero; in general, replace <math>x</math> by <math>x - \mu_x</math> and <math>y</math> by <math>y - \mu_y</math>.
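The bivariate formula is just the general density with N = 2. A small check (a sketch with made-up values) confirms the two agree:

    import numpy as np
    from scipy.stats import multivariate_normal

    sx, sy, rho = 1.5, 0.8, 0.4
    x, y = 0.7, -0.2

    # Bivariate formula (zero means)
    z = x**2 / sx**2 + y**2 / sy**2 - 2 * rho * x * y / (sx * sy)
    f_biv = np.exp(-z / (2 * (1 - rho**2))) / (2 * np.pi * sx * sy * np.sqrt(1 - rho**2))

    # General density with the matching covariance matrix
    Sigma = np.array([[sx**2, rho * sx * sy],
                      [rho * sx * sy, sy**2]])
    f_gen = multivariate_normal(mean=[0.0, 0.0], cov=Sigma).pdf([x, y])

    print(np.isclose(f_biv, f_gen))  # True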

Linear transformation

If <math>Y = B X</math> is a linear transformation of <math>X</math>, where <math>B</math> is an <math>m \times N</math> matrix, then <math>Y</math> has a multivariate normal distribution with expected value <math>B \mu</math> and variance <math>B \Sigma B^\top</math> (i.e., <math>Y \sim N \left(B \mu, B \Sigma B^\top\right)</math>).

Corollary: any subset of the <math>X_i</math> has a marginal distribution that is also multivariate normal. To see this, consider the following example: to extract the subset <math>(X_1, X_2, X_4)^\top</math>, use

<math>
B = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 & \ldots & 0 \\
0 & 1 & 0 & 0 & 0 & \ldots & 0 \\
0 & 0 & 0 & 1 & 0 & \ldots & 0
\end{bmatrix}
</math>

which extracts the desired elements directly.
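A brief simulation (illustrative values, with N = 4) shows the transformed sample matching <math>N(B\mu, B\Sigma B^\top)</math>:

    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.array([1.0, -2.0, 0.5, 3.0])
    M = rng.standard_normal((4, 4))
    Sigma = M @ M.T                      # a valid covariance matrix

    # Selector matrix for (X_1, X_2, X_4), as in the example above
    B = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1]], dtype=float)

    X = rng.multivariate_normal(mu, Sigma, size=200_000)
    Y = X @ B.T                          # apply Y = B X to each sample row

    print(np.allclose(Y.mean(axis=0), B @ mu, atol=0.05))        # mean close to B mu
    print(np.allclose(np.cov(Y.T), B @ Sigma @ B.T, atol=0.1))   # cov close to B Sigma B^T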

Correlations and independence

In general, random variables may be uncorrelated but highly dependent. But if a random vector has a multivariate normal distribution then any two or more of its components that are uncorrelated are independent. This implies that any two or more of its components that are pairwise independent are independent, since pairwise independence implies the components are uncorrelated.

But it is not true that two random variables that are (separately, marginally) normally distributed and uncorrelated are independent. Two random variables that are normally distributed may fail to be jointly normally distributed, i.e., the vector whose components they are may fail to have a multivariate normal distribution. For an example of two normally distributed random variables that are uncorrelated but not independent, see normally distributed and uncorrelated does not imply independent.

Conditional distributions

If <math>\mu</math> and <math>\Sigma</math> are partitioned as follows

<math>
\mu = \begin{bmatrix} \mu_1 \\ \mu_2 \end{bmatrix}
</math> with sizes <math>\begin{bmatrix} q \times 1 \\ (N-q) \times 1 \end{bmatrix}</math>

<math>
\Sigma = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{bmatrix}
</math> with sizes <math>\begin{bmatrix} q \times q & q \times (N-q) \\ (N-q) \times q & (N-q) \times (N-q) \end{bmatrix}</math>

then the distribution of <math>X_1</math> conditional on <math>X_2 = a</math> is multivariate normal, <math>X_1 | X_2 = a \sim N(\bar{\mu}, \overline{\Sigma})</math>, where

<math>
\bar{\mu} = \mu_1 + \Sigma_{12} \Sigma_{22}^{-1} \left( a - \mu_2 \right)
</math>

and covariance matrix

<math>
\overline{\Sigma} = \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21}.
</math>

This matrix is the Schur complement of <math>\Sigma_{22}</math> in <math>\Sigma</math>.

Note that knowing that <math>X_2 = a</math> alters the variance; perhaps more surprisingly, the mean is shifted by <math>\Sigma_{12} \Sigma_{22}^{-1} \left(a - \mu_2 \right)</math>. Compare this with the situation of not knowing the value of <math>a</math>, in which case <math>X_1</math> would have distribution <math>N_q \left(\mu_1, \Sigma_{11} \right)</math>.

The matrix <math>\Sigma_{12} \Sigma_{22}^{-1}</math> is known as the matrix of regression coefficients.
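The conditional mean and covariance are straightforward to compute. A minimal NumPy sketch, with an illustrative 3-dimensional example partitioned at q = 1:

    import numpy as np

    mu = np.array([1.0, 2.0, 3.0])
    Sigma = np.array([[2.0, 0.5, 0.3],
                      [0.5, 1.0, 0.2],
                      [0.3, 0.2, 1.5]])
    q = 1
    S11, S12 = Sigma[:q, :q], Sigma[:q, q:]
    S21, S22 = Sigma[q:, :q], Sigma[q:, q:]

    a = np.array([1.5, 2.5])               # observed value of X_2
    reg = S12 @ np.linalg.inv(S22)         # matrix of regression coefficients
    mu_bar = mu[:q] + reg @ (a - mu[q:])   # conditional mean
    Sigma_bar = S11 - reg @ S21            # Schur complement of Sigma_22

    print(mu_bar, Sigma_bar)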

Fisher information matrix

The Fisher information matrix (FIM) for a normal distribution takes a special form. The <math>(m,n)</math> element of the FIM for <math>X \sim N(\mu(\theta), \Sigma(\theta))</math> is

<math>
\mathcal{I}_{m,n} = \frac{\partial \mu}{\partial \theta_m} \Sigma^{-1} \frac{\partial \mu^\top}{\partial \theta_n} + \frac{1}{2} \mathrm{tr}\left( \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_m} \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_n} \right)
</math>

where

  • <math>
\frac{\partial \mu}{\partial \theta_m} = \begin{bmatrix}
\frac{\partial \mu_1}{\partial \theta_m} &
\frac{\partial \mu_2}{\partial \theta_m} &
\cdots &
\frac{\partial \mu_N}{\partial \theta_m}
\end{bmatrix}
</math>

  • <math>
\frac{\partial \mu^\top}{\partial \theta_m} = \left( \frac{\partial \mu}{\partial \theta_m} \right)^\top = \begin{bmatrix}
\frac{\partial \mu_1}{\partial \theta_m} \\
\frac{\partial \mu_2}{\partial \theta_m} \\
\vdots \\
\frac{\partial \mu_N}{\partial \theta_m}
\end{bmatrix}
</math>

  • <math>
\frac{\partial \Sigma}{\partial \theta_m} = \begin{bmatrix}
\frac{\partial \Sigma_{1,1}}{\partial \theta_m} & \frac{\partial \Sigma_{1,2}}{\partial \theta_m} & \cdots & \frac{\partial \Sigma_{1,N}}{\partial \theta_m} \\
\frac{\partial \Sigma_{2,1}}{\partial \theta_m} & \frac{\partial \Sigma_{2,2}}{\partial \theta_m} & \cdots & \frac{\partial \Sigma_{2,N}}{\partial \theta_m} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial \Sigma_{N,1}}{\partial \theta_m} & \frac{\partial \Sigma_{N,2}}{\partial \theta_m} & \cdots & \frac{\partial \Sigma_{N,N}}{\partial \theta_m}
\end{bmatrix}
</math>

  • <math>\mathrm{tr}</math> is the trace function
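As an illustration of the formula (a sketch, not a library routine), the FIM can be approximated by taking finite-difference derivatives of a user-supplied parametrization. The one-dimensional parametrization at the end is a made-up example:

    import numpy as np

    def fim(mu_fn, Sigma_fn, theta, h=1e-6):
        # FIM for X ~ N(mu(theta), Sigma(theta)) via central differences
        theta = np.asarray(theta, dtype=float)
        K = len(theta)
        Sigma_inv = np.linalg.inv(Sigma_fn(theta))

        def d(f, m):
            # derivative of f with respect to theta_m
            e = np.zeros(K)
            e[m] = h
            return (f(theta + e) - f(theta - e)) / (2 * h)

        I = np.zeros((K, K))
        for m in range(K):
            for n in range(K):
                dmu_m, dmu_n = d(mu_fn, m), d(mu_fn, n)
                dS_m, dS_n = d(Sigma_fn, m), d(Sigma_fn, n)
                I[m, n] = (dmu_m @ Sigma_inv @ dmu_n
                           + 0.5 * np.trace(Sigma_inv @ dS_m @ Sigma_inv @ dS_n))
        return I

    # Hypothetical parametrization: theta = (mean, log-variance) of a 1-D normal
    mu_fn = lambda th: np.array([th[0]])
    Sigma_fn = lambda th: np.array([[np.exp(th[1])]])
    print(fim(mu_fn, Sigma_fn, [0.0, 0.0]))   # approx [[1, 0], [0, 0.5]]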

Estimation of parameters

The derivation of the maximum-likelihood estimator of the covariance matrix of a multivariate normal distribution is perhaps surprisingly subtle and elegant. See estimation of covariance matrices.

In short, the pdf is

<math>f(x)=(2 \pi)^{-N/2} \det(\Sigma)^{-1/2} \exp\left({-1 \over 2} (x-\mu)^\top \Sigma^{-1} (x-\mu)\right)</math>

and the ML estimator of the covariance matrix is

<math>\hat{\Sigma} = {1 \over n}\sum_{i=1}^n (X_i-\overline{X})(X_i-\overline{X})^\top</math>

which is simply the sample covariance matrix (with divisor <math>n</math>; the unbiased estimator divides by <math>n-1</math> instead).
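A minimal sketch of the estimator on simulated data (the true parameters are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    true_mu = np.array([1.0, -1.0])
    true_Sigma = np.array([[1.0, 0.6],
                           [0.6, 2.0]])
    X = rng.multivariate_normal(true_mu, true_Sigma, size=50_000)

    X_bar = X.mean(axis=0)                       # sample mean
    centered = X - X_bar
    Sigma_hat = centered.T @ centered / len(X)   # ML estimate (divisor n)

    print(Sigma_hat)   # close to true_Sigma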

Generating values drawn from the distribution

To generate values from a multivariate normal distribution given μ and A such that X = AZ + μ as detailed above, simply generate a suitable vector of independent standard normal values Z, using, for example, the Box–Muller transform, and apply the foregoing equation.

Given only the covariance matrix Σ, one can generate a suitable A (one satisfying A A^T = Σ) using the Cholesky decomposition.
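A minimal sampling sketch using the Cholesky factor (mean and covariance are illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    mu = np.array([0.0, 5.0])
    Sigma = np.array([[4.0, 1.2],
                      [1.2, 1.0]])

    A = np.linalg.cholesky(Sigma)          # lower-triangular A with A A^T = Sigma
    Z = rng.standard_normal((2, 10_000))   # independent standard normals
    X = A @ Z + mu[:, None]                # X = A Z + mu, one draw per column

    print(X.mean(axis=1))   # close to mu
    print(np.cov(X))        # close to Sigma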
