White noise

This article is about white noise as a scientific concept. For the 1985 novel by Don DeLillo, see White Noise (novel). For the 2005 movie by Geoffrey Sax, see White Noise (movie). For the electronic music band, see White Noise (band). White Noise is also used as a term for a certain type of music promoting racism and white supremacy.


[Image: Whitenoise.png – Four thousandths of a second of white noise]

White noise is a random signal (or process) with a flat power spectral density. In other words, the signal's power spectral density has equal power in any frequency band of a given bandwidth, regardless of the band's centre frequency.

An infinite-bandwidth white noise signal is purely a theoretical construct: because such a signal has power at all frequencies, its total power is infinite. In practice, a signal is considered "white" if it has a flat spectrum over a defined frequency band.
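
As a rough illustration of the definition, the following sketch (assuming NumPy and SciPy are available; the sample rate and segment length are arbitrary choices) generates a finite stretch of discrete-time white noise and estimates its power spectral density, which should be approximately flat apart from the statistical fluctuation of the estimator.

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(0)
    fs = 48_000                       # sample rate in Hz (illustrative choice)
    x = rng.standard_normal(10 * fs)  # 10 seconds of discrete-time white noise

    # Estimate the power spectral density with Welch's method.
    f, Pxx = signal.welch(x, fs=fs, nperseg=4096)

    # The estimate should be roughly constant over the band (0, fs/2);
    # a small relative spread indicates an approximately flat spectrum.
    print(Pxx.std() / Pxx.mean())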

Statistical properties

[Image: White-noise.png – An example realization of a white noise process]

The term white noise is also commonly applied to a noise signal in the spatial domain whose autocorrelation is zero at all nonzero lags over the relevant space dimensions. The signal is then "white" in the spatial frequency domain (this is equally true for signals in the angular frequency domain, e.g. the distribution of a signal across all angles in the night sky). The image above displays a finite-length, discrete-time realization of a white noise process generated on a computer.

Being uncorrelated in time does not, however, restrict the values a signal can take. Any distribution of values is possible (although it must have zero DC component). For example, a binary signal which can only take on the values 1 or 0 will be white if the sequence of zeros and ones is statistically uncorrelated. Noise having a continuous distribution, such as a normal distribution, can of course be white.

It is often incorrectly assumed that Gaussian noise (see normal distribution) is necessarily white noise. However, neither property implies the other. Thus, the two words "Gaussian" and "white" are often both specified in mathematical models of systems. Gaussian white noise is a good approximation of many real-world situations and generates mathematically tractable models. These models are used so frequently that the term additive white Gaussian noise has a standard abbreviation: AWGN. Gaussian white noise has the useful statistical property that its values are independent (see Statistical independence).
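
For instance, a minimal AWGN sketch (assuming NumPy; the function name awgn and the noise level below are illustrative, not a standard API) simply adds independent Gaussian samples to a clean signal:

    import numpy as np

    rng = np.random.default_rng(1)

    def awgn(x, noise_std):
        """Add white Gaussian noise to a signal (a minimal AWGN channel sketch)."""
        return x + noise_std * rng.standard_normal(x.shape)

    # Example: a clean sinusoid corrupted by additive white Gaussian noise.
    t = np.linspace(0.0, 1.0, 1000)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = awgn(clean, noise_std=0.3)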


White noise is the generalized mean-square derivative of the Wiener process or Brownian motion.
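
To sketch the connection, integrate a white noise process <math>w(t)</math> with autocorrelation <math>\sigma^2 \delta(t_1 - t_2)</math> (as defined in the mathematical section below) to form <math>B(t) = \int_0^t w(s)\, ds</math>. Then, formally,

<math>\mathbb{E}\{B(t_1) B(t_2)\} = \int_0^{t_1} \int_0^{t_2} \sigma^2 \delta(s_1 - s_2) \, ds_2 \, ds_1 = \sigma^2 \min(t_1, t_2)</math>

which is the covariance function of a Wiener process, so differentiating the Wiener process (in the generalized, mean-square sense) recovers white noise.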

Colors of noise

There are also other "colors" of noise, the most commonly used being pink and brown.

Applications

One use for white noise is in the field of architectural acoustics: a constant low level of noise is generated and provided as background sound in interior spaces in order to mask distracting, undesirable noises such as conversations.

White noise has also been used in electronic music, where it is used either directly or as an input to a filter to create other types of noise signal.
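
A minimal sketch of this filtering idea (assuming NumPy and SciPy; the filter order and cutoff frequency are arbitrary choices): white noise passed through a low-pass filter yields a darker, "colored" noise.

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(2)
    fs = 44_100
    white = rng.standard_normal(2 * fs)                # 2 seconds of white noise

    # Low-pass filtering the white noise produces a darker, colored noise.
    b, a = signal.butter(2, 1000, btype="low", fs=fs)  # 1 kHz cutoff (illustrative)
    colored = signal.lfilter(b, a, white)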

To set up the EQ for a concert or other performance in a venue, white noise is sent through the PA system and monitored from various points in the venue. This lets the engineer hear whether the building's acoustics naturally boost or cut any frequencies, and compensate with the mixer.

White noise is used as the basis of some random number generators.

Mathematical definition

White random vector

A random vector <math>\mathbf{w}</math> is a white random vector if and only if its mean vector and autocorrelation matrix are the following:

<math>\mu_w = \mathbb{E}\{ \mathbf{w} \} = 0</math>
<math>R_{ww} = \mathbb{E}\{ \mathbf{w} \mathbf{w}^T\} = \sigma^2 \mathbf{I}</math>

I.e., it is a zero mean random vector, and its autocorrelation matrix is a multiple of the identity matrix. When the autocorrelation matrix is a multiple of the identity, we say that it has spherical correlation.
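
The definition can be checked empirically with a quick sketch (assuming NumPy; the dimension, number of trials, and value of σ are arbitrary): the sample mean of many independent white vectors approaches zero and the sample autocorrelation matrix approaches <math>\sigma^2 \mathbf{I}</math>.

    import numpy as np

    rng = np.random.default_rng(3)
    n, trials, sigma = 4, 100_000, 2.0

    # Draw many independent realizations of a white random vector w of length n.
    W = sigma * rng.standard_normal((trials, n))

    mean_vec = W.mean(axis=0)        # close to the zero vector
    R_ww = (W.T @ W) / trials        # sample autocorrelation matrix
    print(np.round(mean_vec, 2))
    print(np.round(R_ww, 2))         # approximately sigma**2 * I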

White random process (white noise)

A continuous-time random process <math>w(t)</math>, where <math>t \in \mathbb{R}</math>, is a white noise process if and only if its mean function and autocorrelation function satisfy the following:

<math>\mu_w(t) = \mathbb{E}\{ w(t)\} = 0</math>
<math>R_{ww}(t_1, t_2) = \mathbb{E}\{ w(t_1) w(t_2)\} = \sigma^2 \delta(t_1 - t_2)</math>

I.e., it is a zero mean process for all time and has infinite power at zero time shift since its autocorrelation function is the Dirac delta function.

The above autocorrelation function implies the following power spectral density.

<math>S_{ww}(\omega) = \sigma^2 \,\!</math>

since the Fourier transform of the delta function is equal to 1. Since this power spectral density is the same at all frequencies, we call it white as an analogy to the frequency spectrum of white light.
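
In discrete time the Dirac delta becomes a Kronecker delta, so the sample autocorrelation of a white sequence is roughly <math>\sigma^2</math> at lag zero and near zero at all other lags. A short sketch (assuming NumPy; σ and the sequence length are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    sigma, N = 1.5, 200_000
    w = sigma * rng.standard_normal(N)

    # Sample autocorrelation at a few lags: close to sigma**2 at lag 0,
    # near zero elsewhere, mirroring R_ww(t1, t2) = sigma**2 * delta(t1 - t2).
    for lag in range(4):
        r = np.mean(w[: N - lag] * w[lag:])
        print(lag, round(r, 3))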

Random vector transformations

Two theoretical applications using a white random vector are the simulation and whitening of another arbitrary random vector. To simulate an arbitrary random vector, we transform a white random vector with a carefully chosen matrix. We choose the transformation matrix so that the mean and covariance matrix of the transformed white random vector matches the mean and covariance matrix of the arbitrary random vector that we are simulating. To whiten an arbitrary random vector, we transform it by a different carefully chosen matrix so that the output random vector is a white random vector.

These two ideas are crucial in applications such as channel estimation and channel equalization in communications and audio. These concepts are also used in data compression.

Simulating a random vector

Suppose that a random vector <math>\mathbf{x}</math> has covariance matrix <math>K_{xx}</math>. Since this matrix is Hermitian symmetric and positive semidefinite, by the spectral theorem from linear algebra, we can diagonalize or factor the matrix in the following way.

<math>\,\! K_{xx} = E \Lambda E^T</math>

where <math>E</math> is the orthogonal matrix of eigenvectors and <math>\Lambda</math> is the diagonal matrix of eigenvalues.

We can simulate the 1st and 2nd moment properties of this random vector <math>\mathbf{x}</math> with mean <math>\mathbf{\mu}</math> and covariance matrix <math>K_{xx}</math> via the following transformation of a white vector <math>\mathbf{w}</math>:

<math> \mathbf{x} = H \, \mathbf{w} + \mu</math>

where

<math> \,\!H = E \Lambda^{1/2}</math>

Thus, the output of this transformation has expectation

<math> \mathbb{E} \{\mathbf{x}\} = H \, \mathbb{E} \{\mathbf{w}\} + \mu = \mu</math>

and covariance matrix

<math> \mathbb{E} \{(\mathbf{x} - \mu) (\mathbf{x} - \mu)^T\} = H \, \mathbb{E} \{\mathbf{w} \mathbf{w}^T\} \, H^T = H \, H^T = E \Lambda^{1/2} \Lambda^{1/2} E^T = K_{xx}</math>
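
A numerical sketch of this simulation procedure (assuming NumPy; the covariance matrix and mean below are arbitrary example values): factor <math>K_{xx} = E \Lambda E^T</math>, build <math>H = E \Lambda^{1/2}</math>, transform white vectors, and check the sample mean and covariance.

    import numpy as np

    rng = np.random.default_rng(5)

    # Target covariance K_xx and mean (arbitrary example values).
    K_xx = np.array([[4.0, 1.5], [1.5, 3.0]])
    mu = np.array([1.0, -2.0])

    # Factor K_xx = E Lambda E^T and build H = E Lambda^{1/2}.
    eigvals, E = np.linalg.eigh(K_xx)
    H = E @ np.diag(np.sqrt(eigvals))

    # Transform many white vectors x = H w + mu, then check the sample statistics.
    W = rng.standard_normal((100_000, 2))
    X = W @ H.T + mu
    print(np.round(X.mean(axis=0), 2))           # approximately mu
    print(np.round(np.cov(X, rowvar=False), 2))  # approximately K_xx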

Whitening a random vector

The method for whitening a vector <math>\mathbf{x}</math> with mean <math>\mathbf{\mu}</math> and covariance matrix <math>K_{xx}</math> is to perform the following calculation:

<math>\mathbf{w} = \Lambda^{-1/2}\, E^T \, ( \mathbf{x} - \mathbf{\mu} )</math>

Thus, the output of this transformation has expectation

<math> \mathbb{E} \{\mathbf{w}\} = \Lambda^{-1/2}\, E^T \, ( \mathbb{E} \{\mathbf{x} \} - \mathbf{\mu} ) = \Lambda^{-1/2}\, E^T \, (\mu - \mu) = 0</math>

and covariance matrix

<math> \mathbb{E} \{\mathbf{w} \mathbf{w}^T\} = \mathbb{E} \{ \Lambda^{-1/2}\, E^T \, ( \mathbf{x} - \mathbf{\mu} )( \mathbf{x} - \mathbf{\mu} )^T E \, \Lambda^{-1/2}\, \}</math>
<math> = \Lambda^{-1/2}\, E^T \, \mathbb{E} \{( \mathbf{x} - \mathbf{\mu} )( \mathbf{x} - \mathbf{\mu} )^T\} E \, \Lambda^{-1/2}\,</math>
<math> = \Lambda^{-1/2}\, E^T \, K_{xx} E \, \Lambda^{-1/2}</math>

By diagonalizing <math>K_{xx}</math>, we get the following:

<math> \Lambda^{-1/2}\, E^T \, E \Lambda E^T E \, \Lambda^{-1/2} = \Lambda^{-1/2}\, \Lambda \, \Lambda^{-1/2} = I</math>

Thus, with the above transformation, we can whiten the random vector to have zero mean and the identity covariance matrix.
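
A matching sketch of the whitening step (assuming NumPy; the correlated data are generated with the same example covariance as above, and the whitening uses sample statistics in place of the true <math>\mu</math> and <math>K_{xx}</math>):

    import numpy as np

    rng = np.random.default_rng(6)

    # Correlated data with example covariance K_xx and mean mu.
    K_xx = np.array([[4.0, 1.5], [1.5, 3.0]])
    mu = np.array([1.0, -2.0])
    X = rng.standard_normal((100_000, 2)) @ np.linalg.cholesky(K_xx).T + mu

    # Whitening transform w = Lambda^{-1/2} E^T (x - mu), using sample statistics.
    mu_hat = X.mean(axis=0)
    eigvals, E = np.linalg.eigh(np.cov(X, rowvar=False))
    W = (X - mu_hat) @ E @ np.diag(1.0 / np.sqrt(eigvals))

    print(np.round(W.mean(axis=0), 2))           # approximately zero
    print(np.round(np.cov(W, rowvar=False), 2))  # approximately the identity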

Random signal transformations

We can extend the same two concepts of simulating and whitening to the case of continuous time random signals or processes. For simulating, we create a filter into which we feed a white noise signal. We choose the filter so that the output signal simulates the 1st and 2nd moments of any arbitrary random process. For whitening, we feed any arbitrary random signal into a specially chosen filter so that the output of the filter is a white noise signal.

Simulating a continuous-time random signal

[Image: Simulation-filter.png – White noise fed into a linear, time-invariant filter to simulate the 1st and 2nd moments of an arbitrary random process]

We can simulate any wide-sense stationary, continuous-time random process <math>x(t) : t \in \mathbb{R}\,\!</math> with constant mean <math>\mu</math> and covariance function

<math>K_x(\tau) = \mathbb{E} \left\{ (x(t_1) - \mu) (x(t_2) - \mu)^{*} \right\} \mbox{ where } \tau = t_1 - t_2</math>

and power spectral density

<math>S_x(\omega) = \int_{-\infty}^{\infty} K_x(\tau) \, e^{-j \omega \tau} \, d\tau</math>

We can simulate this signal using frequency domain techniques.

Because <math>K_x(\tau)</math> is Hermitian symmetric and positive semi-definite, it follows that <math>S_x(\omega)</math> is real and can be factored as

<math>S_x(\omega) = | H(\omega) |^2 = H(\omega) \, H^{*} (\omega)</math>

if and only if <math>S_x(\omega)</math> satisfies the Paley-Wiener criterion:

<math> \int_{-\infty}^{\infty} \frac{\log (S_x(\omega))}{1 + \omega^2} \, d \omega < \infty</math>

If <math>S_x(\omega)</math> is a rational function, we can then factor it into pole-zero form as

<math>S_x(\omega) = \frac{\Pi_{k=1}^{N} (c_k - j \omega)(c^{*}_k + j \omega)}{\Pi_{k=1}^{D} (d_k - j \omega)(d^{*}_k + j \omega)}</math>

Choosing a minimum phase <math>H(\omega)</math> so that its poles and zeros lie inside the left half s-plane, we can then simulate <math>x(t)</math> with <math>H(\omega)</math> as the transfer function of the filter.

We can simulate <math>x(t)</math> by constructing the following linear, time-invariant filter

<math>\hat{x}(t) = \mathcal{F}^{-1} \left\{ H(\omega) \right\} * w(t) + \mu</math>

where <math>w(t)</math> is a continuous-time, white-noise signal with the following 1st and 2nd moment properties:

<math> \mathbb{E}\{w(t)\} = 0</math>
<math> \mathbb{E}\{w(t_1)w^{*}(t_2)\} = K_w(t_1, t_2) = \delta(t_1 - t_2)</math>

Thus, the resultant signal <math>\hat{x}(t)</math> has the same 2nd moment properties as the desired signal <math>x(t)</math>.
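
A discrete-time analogue of this procedure can be sketched numerically (assuming NumPy and SciPy; the shaping filter <math>H(z) = 1/(1 - 0.9 z^{-1})</math> is an arbitrary stable, minimum-phase choice standing in for the factored <math>H(\omega)</math>):

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(7)
    w = rng.standard_normal(200_000)   # discrete-time white noise, sigma = 1

    # Pass white noise through the stable, minimum-phase filter H(z) = 1 / (1 - 0.9 z^{-1}).
    b, a = [1.0], [1.0, -0.9]
    x_hat = signal.lfilter(b, a, w)

    # The output PSD should track |H(e^{j omega})|**2 up to estimator variance.
    f, Pxx = signal.welch(x_hat, nperseg=4096)     # f in cycles/sample
    _, H = signal.freqz(b, a, worN=2 * np.pi * f)  # evaluate H at the same frequencies
    # Compare Pxx with np.abs(H)**2 (scaled by the white-noise PSD level).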

Whitening a continuous-time random signal

[Image: Whitening-filter.png – An arbitrary random process x(t) fed into a linear, time-invariant filter that whitens x(t) to create white noise at the output]

Suppose we have a wide-sense stationary, continuous-time random process <math>x(t) : t \in \mathbb{R}\,\!</math> defined with the same mean <math>\mu</math>, covariance function <math>K_x(\tau)</math>, and power spectral density <math>S_x(\omega)</math> as above.

We can whiten this signal using frequency domain techniques. We factor the power spectral density <math>S_x(\omega)</math> as described above.

Choosing the minimum phase <math>H(\omega)</math> so that its poles and zeros lie inside the left half s-plane, we can then whiten <math>x(t)</math> with the following inverse filter

<math>H_{inv}(\omega) = \frac{1}{H(\omega)}</math>

We choose the minimum phase filter so that the resulting inverse filter is stable. Additionally, we must be sure that <math>S_x(\omega)</math> is strictly positive for all <math>\omega \in \mathbb{R}</math> so that <math>H_{inv}(\omega)</math> does not have any singularities.

The final form of the whitening procedure is as follows:

<math>w(t) = \mathcal{F}^{-1} \left\{ H_{inv}(\omega) \right\} * (x(t) - \mu)</math>

so that <math>w(t)</math> is a white noise random process with zero mean and constant, unit power spectral density

<math>S_{w}(\omega) = \mathcal{F} \left\{ \mathbb{E} \{ w(t_1) w(t_2) \} \right\} = H_{inv}(\omega) S_x(\omega) H^{*}_{inv}(\omega) = \frac{S_x(\omega)}{S_x(\omega)} = 1</math>

Note that this power spectral density corresponds to a delta function for the covariance function of <math>w(t)</math>:

<math>K_w(\tau) = \,\!\delta (\tau)</math>
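
Again in discrete time, a minimal sketch of this whitening step (assuming NumPy and SciPy; the colored signal is produced with the same illustrative filter <math>H(z) = 1/(1 - 0.9 z^{-1})</math> as above, so its inverse <math>1 - 0.9 z^{-1}</math> is a stable FIR filter):

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(8)

    # Colored signal: white noise shaped by the known minimum-phase filter H(z).
    b, a = [1.0], [1.0, -0.9]
    x = signal.lfilter(b, a, rng.standard_normal(200_000))

    # Whitening: apply the inverse filter 1/H(z) = 1 - 0.9 z^{-1} (stable because H
    # is minimum phase), which flattens the spectrum back out.
    w_hat = signal.lfilter(a, b, x)

    f, Pww = signal.welch(w_hat, nperseg=4096)
    print(Pww.std() / Pww.mean())   # small relative spread: roughly flat spectrum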
