Cumulant

Cumulants of probability distributions

In probability theory and statistics, the cumulants κn of the probability distribution of a random variable X are given by

<math>E\left(e^{tX}\right)=\exp\left(\sum_{n=1}^\infty\kappa_n t^n/n!\right).\,</math>

In other words, κn/n! is the nth coefficient in the power series representation of the logarithm of the moment-generating function. The logarithm of the moment-generating function is therefore called the cumulant-generating function.

If some of the moments of the probability distribution of the random variable X are infinite, one must take t to be purely imaginary; that is, one works with the logarithm of the characteristic function instead of the moment-generating function.

The "problem of cumulants" attempts to recover a probability distribution from its sequence of cumulants. In some cases no solution exists; in some cases a unique solution exists; in some cases more than one solution exists.

Some properties of cumulants

Invariance and equivariance

The first cumulant is shift-equivariant; all of the others are shift-invariant. To state this less tersely, denote by κn(X) the nth cumulant of the probability distribution of the random variable X. The statement is that if c is constant then κ1(X + c) = κ1(X) + c and κn(X + c) = κn(X) for n ≥ 2, i.e., c is added to the first cumulant, but all higher cumulants are unchanged.

Homogeneity

The nth cumulant is homogeneous of degree n, i.e. if c is any constant, then

<math>\kappa_n(cX)=c^n\kappa_n(X).</math>

Additivity

If X and Y are independent random variables then κn(X + Y) = κn(X) + κn(Y).
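These properties can be checked numerically. The sketch below (assuming numpy and scipy are available) uses scipy's k-statistics, which are unbiased sample estimates of cumulants, to illustrate homogeneity and additivity of the third cumulant on large independent samples.

 import numpy as np
 from scipy.stats import kstat   # kstat(data, n) estimates the nth cumulant, n <= 4
 
 rng = np.random.default_rng(0)
 x = rng.exponential(scale=2.0, size=200_000)
 y = rng.gamma(shape=3.0, scale=1.0, size=200_000)
 
 # homogeneity: kappa_3(3X) = 3^3 * kappa_3(X)
 print(kstat(3 * x, n=3), 27 * kstat(x, n=3))
 
 # additivity for independent X and Y: kappa_3(X + Y) = kappa_3(X) + kappa_3(Y)
 print(kstat(x + y, n=3), kstat(x, n=3) + kstat(y, n=3))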

Cumulants and moments

The cumulants are related to the moments by the following recursion formula:

<math>\kappa_n=\mu'_n-\sum_{k=1}^{n-1}{n-1 \choose k-1}\kappa_k \mu'_{n-k}.</math>

The nth moment μ′n is an nth-degree polynomial in the first n cumulants, thus:

<math>\mu'_1=\kappa_1\,</math>
<math>\mu'_2=\kappa_2+\kappa_1^2\,</math>
<math>\mu'_3=\kappa_3+3\kappa_2\kappa_1+\kappa_1^3\,</math>
<math>\mu'_4=\kappa_4+4\kappa_3\kappa_1+3\kappa_2^2+6\kappa_2\kappa_1^2+\kappa_1^4\,</math>
<math>\mu'_5=\kappa_5+5\kappa_4\kappa_1+10\kappa_3\kappa_2+10\kappa_3\kappa_1^2+15\kappa_2^2\kappa_1+10\kappa_2\kappa_1^3+\kappa_1^5\,</math>

<math>\mu'_6=\kappa_6+6\kappa_5\kappa_1+15\kappa_4\kappa_2+15\kappa_4\kappa_1^2+10\kappa_3^2+60\kappa_3\kappa_2\kappa_1+20\kappa_3\kappa_1^3+15\kappa_2^3+45\kappa_2^2\kappa_1^2+15\kappa_2\kappa_1^4+\kappa_1^6\,</math>

The "prime" distinguishes the moments μ′n from the central moments μn. To express the central moments as functions of the cumulants, just drop from these polynomials all terms in which κ1 appears as a factor.

The coefficients are precisely those that occur in Faà di Bruno's formula.
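A minimal sketch of the recursion above in Python (the function names are just for illustration): it converts raw moments to cumulants and back, and checks the round trip on a Poisson distribution with mean 2, all of whose cumulants equal 2.

 from math import comb
 
 def moments_to_cumulants(mu):
     """mu[k] is the kth raw moment, with mu[0] = 1; returns kappa[0..n]."""
     n = len(mu) - 1
     kappa = [0.0] * (n + 1)
     for m in range(1, n + 1):
         kappa[m] = mu[m] - sum(comb(m - 1, k - 1) * kappa[k] * mu[m - k]
                                for k in range(1, m))
     return kappa
 
 def cumulants_to_moments(kappa):
     """Inverse direction: mu'_m = sum_k C(m-1, k-1) kappa_k mu'_{m-k}."""
     n = len(kappa) - 1
     mu = [1.0] + [0.0] * n
     for m in range(1, n + 1):
         mu[m] = sum(comb(m - 1, k - 1) * kappa[k] * mu[m - k]
                     for k in range(1, m + 1))
     return mu
 
 mu = cumulants_to_moments([0.0] + [2.0] * 6)   # Poisson(2): every cumulant is 2
 print(mu)                                      # [1.0, 2.0, 6.0, 22.0, 94.0, 454.0, 2430.0]
 print(moments_to_cumulants(mu))                # recovers [0.0, 2.0, 2.0, 2.0, 2.0, 2.0, 2.0]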

Cumulants and set-partitions

These polynomials have a remarkable combinatorial interpretation: the coefficients count certain partitions of sets. A general form of these polynomials is

<math>\mu'_n=\sum_{\pi}\prod_{B\in\pi}\kappa_{\left|B\right|}</math>

where

  • π runs through the list of all partitions of a set of size n;
  • "B ∈ π" means B is one of the "blocks" into which the set is partitioned; and
  • |B| is the size of the set B.

Thus each monomial is a constant times a product of cumulants in which the sum of the indices is n (e.g., in the term κ3 κ2² κ1, the sum of the indices is 3 + 2 + 2 + 1 = 8; this term appears in the polynomial that expresses the 8th moment as a function of the first eight cumulants). Each term therefore corresponds to a partition of the integer n, and its coefficient is the number of partitions of a set of n members that collapse to that partition of the integer n when the members of the set become indistinguishable.
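As a sketch of this interpretation (plain Python, no dependencies): enumerate all partitions of a 6-element set, group them by the multiset of their block sizes, and recover the coefficients of μ′6 shown above.

 from collections import Counter
 
 def set_partitions(elements):
     """Yield all partitions of a list, each as a list of blocks."""
     if not elements:
         yield []
         return
     first, rest = elements[0], elements[1:]
     for partition in set_partitions(rest):
         # put `first` into each existing block ...
         for i in range(len(partition)):
             yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]
         # ... or into a block of its own
         yield [[first]] + partition
 
 n = 6
 profile_counts = Counter(
     tuple(sorted((len(block) for block in p), reverse=True))
     for p in set_partitions(list(range(n)))
 )
 print(profile_counts[(4, 2)])        # 15, the coefficient of kappa_4 * kappa_2
 print(profile_counts[(3, 2, 1)])     # 60, the coefficient of kappa_3 * kappa_2 * kappa_1
 print(sum(profile_counts.values()))  # 203, the total number of partitions (6th Bell number)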

Cumulants of particular probability distributions

For example, the cumulants of the Bernoulli distribution with success probability p (the distribution of a random variable that equals 1 with probability p and 0 with probability 1 - p) satisfy

<math>\kappa_1=p,\,</math>
<math>\kappa_{n+1}=p(1-p){d\kappa_n \over dp}.\,</math>

In particular, κ2 = p(1 - p) is the variance, and κ3 = p(1 - p)(1 - 2p) is the third central moment.
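A sketch with sympy: iterate the recursion symbolically and compare against cumulants read directly from the Bernoulli cumulant-generating function log(1 - p + pe^t).

 import sympy as sp
 
 p, t = sp.symbols('p t')
 
 # cumulants from the recursion kappa_{n+1} = p(1-p) d kappa_n / dp
 kappa = [p]
 for _ in range(4):
     kappa.append(sp.expand(p * (1 - p) * sp.diff(kappa[-1], p)))
 
 # cumulants read off from the cumulant-generating function
 K = sp.log(1 - p + p * sp.exp(t))
 series = sp.series(K, t, 0, 6).removeO()
 direct = [sp.expand(series.coeff(t, n) * sp.factorial(n)) for n in range(1, 6)]
 
 print(kappa[1])   # p - p**2, the Bernoulli variance p(1 - p)
 print(all(sp.simplify(a - b) == 0 for a, b in zip(kappa, direct)))   # True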

A distribution with arbitrary given cumulants κn can be approximated through the Gram-Charlier or Edgeworth series.

Joint cumulants

The joint cumulant of several random variables X1, ..., Xn is

<math>\kappa(X_1,\dots,X_n)=\sum_\pi\left(\left|\pi\right|-1\right)!(-1)^{\left|\pi\right|-1}\prod_{B\in\pi}E\left(\prod_{i\in B}X_i\right)</math>

where π runs through the list of all partitions of { 1, ..., n }, B runs through the list of all blocks of the partition π, and |π| is the number of blocks in π. For example,

<math>\kappa(X,Y,Z)=E(XYZ)-E(XY)E(Z)-E(XZ)E(Y)-E(YZ)E(X)+2E(X)E(Y)E(Z).\,</math>

The joint cumulant of just one random variable is its expected value, and that of two random variables is their covariance. If some of the random variables are independent of all of the others, then the joint cumulant is zero. If all n random variables are the same, then the joint cumulant is the nth ordinary cumulant.
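The following sketch (assuming numpy is available) estimates joint cumulants from sample data via the partition formula above and checks that the two-variable case reproduces the sample covariance; it reuses the set-partition generator from the earlier sketch so that the block stays self-contained.

 import numpy as np
 from math import factorial
 
 def set_partitions(elements):
     """Yield all partitions of a list, each as a list of blocks."""
     if not elements:
         yield []
         return
     first, rest = elements[0], elements[1:]
     for partition in set_partitions(rest):
         for i in range(len(partition)):
             yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]
         yield [[first]] + partition
 
 def joint_cumulant(*samples):
     """kappa(X_1, ..., X_n) estimated from aligned sample arrays."""
     total = 0.0
     for partition in set_partitions(list(range(len(samples)))):
         b = len(partition)                       # number of blocks |pi|
         term = (-1.0) ** (b - 1) * factorial(b - 1)
         for block in partition:
             term *= np.mean(np.prod([samples[i] for i in block], axis=0))
         total += term
     return total
 
 rng = np.random.default_rng(1)
 x = rng.normal(size=200_000)
 y = 0.5 * x + rng.normal(size=200_000)
 print(joint_cumulant(x, y), np.cov(x, y, bias=True)[0, 1])   # both close to 0.5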

The combinatorial meaning of the expression of moments in terms of cumulants is easier to understand than that of cumulants in terms of moments:

<math>E(X_1\cdots X_n)=\sum_\pi\prod_{B\in\pi}\kappa(X_i : i \in B).</math>

For example:

<math>E(XYZ)=\kappa(X,Y,Z)+\kappa(X,Y)\kappa(Z)+\kappa(X,Z)\kappa(Y)+\kappa(Y,Z)\kappa(X)+\kappa(X)\kappa(Y)\kappa(Z).\,</math>

Conditional cumulants and the law of total cumulance

Main article: law of total cumulance

The law of total expectation and the law of total variance generalize naturally to conditional cumulants. The case n = 3, expressed in the language of (central) moments rather than that of cumulants, says

<math>\mu_3(X)=E(\mu_3(X\mid Y))+\mu_3(E(X\mid Y))+3\,\operatorname{cov}(E(X\mid Y),\operatorname{var}(X\mid Y)).</math>
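For comparison, the cases n = 1 and n = 2 of the same generalization are the familiar laws of total expectation and total variance:

<math>E(X)=E(E(X\mid Y)),\qquad \operatorname{var}(X)=E(\operatorname{var}(X\mid Y))+\operatorname{var}(E(X\mid Y)).</math>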

The general result stated below first appeared in 1969 in "The Calculation of Cumulants via Conditioning" by David R. Brillinger, in volume 21 of the Annals of the Institute of Statistical Mathematics, pages 215–218.

In general, we have

<math>\kappa(X_1,\dots,X_n)=\sum_\pi \kappa(\kappa(X_{\pi_1}\mid Y),\dots,\kappa(X_{\pi_b}\mid Y))</math>

where

  • the sum is over all partitions π of the set { 1, ..., n } of indices, and
  • π1, ..., πb are all of the "blocks" of the partition π; the expression κ(Xπk ∣ Y) denotes the conditional joint cumulant, given Y, of the random variables whose indices are in that block of the partition.

History

Cumulants were first introduced by the Danish astronomer, actuary, mathematician, and statistician Thorvald N. Thiele (1838–1910) in 1889. Thiele called them half-invariants. They were first called cumulants in a 1931 paper, "The derivation of the pattern formulae of two-way partitions from those of simpler patterns", Proceedings of the London Mathematical Society, Series 2, v. 33, pp. 195–208, by the great statistical geneticist Sir Ronald Fisher and the statistician John Wishart, eponym of the Wishart distribution. The historian Stephen Stigler has said that the name cumulant was suggested to Fisher in a letter from Harold Hotelling. In another paper, published in 1929, Fisher had called them cumulative moment functions.

Formal cumulants

More generally, the cumulants of a sequence { mn : n = 1, 2, 3, ... }, not necessarily the moments of any probability distribution, are given by

<math>1+\sum_{n=1}^\infty m_n t^n/n!=\exp\left(\sum_{n=1}^\infty\kappa_n t^n/n!\right)</math>

where the values of κn for n = 1, 2, 3, ... are found formally, i.e., by algebra alone, in disregard of questions of whether any series converges. All of the difficulties of the "problem of cumulants" are absent when one works formally. The simplest example is that the second cumulant of a probability distribution must always be nonnegative, and is zero only if all of the higher cumulants are zero. Formal cumulants are subject to no such constraints.

One well-known example

In combinatorics, the nth Bell number is the number of partitions of a set of size n. All of the cumulants of the sequence of Bell numbers are equal to 1. The Bell numbers are the moments of the Poisson distribution with expected value 1.
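A quick formal check with sympy: the exponential generating function of the Bell numbers is exp(e^t - 1), so the logarithm appearing in the definition of formal cumulants is e^t - 1, all of whose coefficients of t^n/n! equal 1.

 import sympy as sp
 
 t = sp.symbols('t')
 N = 8
 
 # build 1 + sum_{n>=1} B_n t^n / n! from the Bell numbers
 egf = 1 + sum(sp.bell(n) * t**n / sp.factorial(n) for n in range(1, N))
 
 # formal cumulants: n! times the coefficients of t^n in log(egf)
 logseries = sp.series(sp.log(egf), t, 0, N).removeO()
 print([logseries.coeff(t, n) * sp.factorial(n) for n in range(1, N)])   # all 1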

Cumulants of a polynomial sequence of binomial type

For any sequence { κn : n = 1, 2, 3, ... } of scalars in a field of characteristic zero, regarded as formal cumulants, there is a corresponding sequence { μ′n : n = 1, 2, 3, ... } of formal moments, given by the polynomials above. From those polynomials, construct a polynomial sequence in the following way. Out of the polynomial

<math>\begin{matrix}\mu'_6= & \kappa_6+6\kappa_5\kappa_1+15\kappa_4\kappa_2+15\kappa_4\kappa_1^2+10\kappa_3^2+60\kappa_3\kappa_2\kappa_1 \\ \\ & +20\kappa_3\kappa_1^3+15\kappa_2^3+45\kappa_2^2\kappa_1^2+15\kappa_2\kappa_1^4+\kappa_1^6\end{matrix}</math>

make a new polynomial in these plus one additional variable x:

<math>\begin{matrix}p_6(x)= & (\kappa_6)\,x+(6\kappa_5\kappa_1+15\kappa_4\kappa_2+10\kappa_3^2)\,x^2+(15\kappa_4\kappa_1^2+60\kappa_3\kappa_2\kappa_1+15\kappa_2^3)\,x^3 \\ \\ & +(20\kappa_3\kappa_1^3+45\kappa_2^2\kappa_1^2)\,x^4+(15\kappa_2\kappa_1^4)\,x^5+(\kappa_1^6)\,x^6\end{matrix}</math>

... and generalize the pattern. The pattern is that the numbers of blocks in the aforementioned partitions are the exponents on x. Each coefficient is a polynomial in the cumulants; these are the Bell polynomials, named after Eric Temple Bell.

This sequence of polynomials is of binomial type. In fact, there are no others: every polynomial sequence of binomial type arises in this way, and is completely determined by its sequence of formal cumulants.
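A sketch using sympy's incomplete Bell polynomials (bell(n, k, symbols)): build p_n(x) from symbolic formal cumulants and verify the binomial-type identity p_n(x + y) = Σk C(n, k) p_k(x) p_{n-k}(y) for a small n.

 import sympy as sp
 
 N = 5
 x, y = sp.symbols('x y')
 kappa = sp.symbols('kappa1:%d' % (N + 1))    # kappa1, ..., kappaN
 
 def p(n, var):
     """p_n(var) = sum_k B_{n,k}(kappa_1, ..., kappa_{n-k+1}) * var^k, with p_0 = 1."""
     if n == 0:
         return sp.Integer(1)
     return sum(sp.bell(n, k, kappa[:n - k + 1]) * var**k for k in range(1, n + 1))
 
 lhs = sp.expand(p(N, x + y))
 rhs = sp.expand(sum(sp.binomial(N, k) * p(k, x) * p(N - k, y) for k in range(N + 1)))
 print(sp.simplify(lhs - rhs) == 0)           # True: the sequence is of binomial type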

Free cumulants

In the identity

<math>E(X_1\cdots X_n)=\sum_\pi\prod_{B\in\pi}\kappa(X_i : i\in B)</math>

one sums over all partitions of the set { 1, ..., n }. If instead one sums only over the noncrossing partitions, one gets the "free cumulants" rather than the conventional cumulants treated above. These play a central role in free probability theory. In that theory, rather than considering independence of random variables, defined in terms of Cartesian products of algebras of random variables, one considers instead "freeness" of random variables, defined in terms of free products of algebras rather than Cartesian products.

The ordinary cumulants of degree higher than 2 of the normal distribution are zero. The free cumulants of degree higher than 2 of the Wigner semicircle distribution are zero. This is one respect in which the role of the Wigner distribution in free probability theory is analogous to that of the normal distribution in conventional probability theory.
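As a sketch of the free-cumulant analogue (plain Python): because the only nonzero free cumulant of the standard Wigner semicircle distribution is κ2 = 1, its 2m-th moment equals the number of noncrossing pair partitions of { 1, ..., 2m }, which the code below confirms is the m-th Catalan number.

 from math import comb
 
 def pairings(points):
     """Yield all pair partitions of a list of points."""
     if not points:
         yield []
         return
     first = points[0]
     for i in range(1, len(points)):
         pair = (first, points[i])
         rest = points[1:i] + points[i + 1:]
         for p in pairings(rest):
             yield [pair] + p
 
 def noncrossing(pairs):
     """True if no two pairs (a, b) and (c, d) interleave as a < c < b < d."""
     for (a, b) in pairs:
         for (c, d) in pairs:
             if a < c < b < d:
                 return False
     return True
 
 for m in range(1, 6):
     count = sum(1 for p in pairings(list(range(2 * m))) if noncrossing(p))
     catalan = comb(2 * m, m) // (m + 1)
     print(m, count, catalan)     # the two counts agree for every m

This is the free-probability counterpart of the fact that the 2m-th moment of the standard normal distribution, (2m - 1)!!, counts all pair partitions of { 1, ..., 2m }, crossing or not.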
