
THE CENTRAL-LIMIT THEOREM

The central-limit theorem is perhaps the most important theorem in probability and statistics. A derivation of the theorem explains why the gaussian probability function is so frequently encountered in nature, not just in physics but also in the biological and social sciences. No experimental scientist should be unaware of the basic ideas behind this theorem. Although the result is deep and is even today a topic of active research, we can get to the basic idea quite easily.

One way to obtain random integers from a known probability function is to write integers on slips of paper and place them in a hat. Draw one slip at a time, and after each drawing replace the slip in the hat. The probability of drawing the integer i is given by the ratio $a_i$ of the number of slips containing the integer i to the total number of slips. Obviously the sum $\sum_i a_i$ must be unity. Another way to get random integers is to throw one of a pair of dice. Then all $a_i$ equal zero except $a_1 = a_2 = a_3 = a_4 = a_5 = a_6 = {1 \over 6}$. The probability that the integer i will occur on the first drawing and the integer j on the second drawing is $a_i a_j$. If you draw two slips or throw a pair of dice, then the probability that the sum of i and j equals k is readily seen to be
\begin{displaymath}
c_k = \sum_i a_i a_{k-i}
\end{displaymath} (53)
Since this equation is a convolution, we may look into the meaning of the Z transform  
\begin{displaymath}
A(Z) = \cdots + a_{-1} Z^{-1} + a_0 + a_1 Z + a_2 Z^2 + \cdots
\end{displaymath} (54)
In terms of Z transforms the probability that i plus j equals k is simply the coefficient of Zk in  
\begin{displaymath}
C(Z) = A(Z)\, A(Z)
\end{displaymath} (55)
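Equation (55) can be checked numerically. The following sketch (mine, not from the original text; it uses numpy and assumes a fair die) computes the coefficients $c_k$ of $C(Z) = A(Z)A(Z)$ by discrete convolution:

\begin{verbatim}
import numpy as np

# a[i]: probability of the integer i for one fair die (faces 1..6)
a = np.zeros(7)
a[1:7] = 1.0 / 6.0

# c_k = sum_i a_i a_{k-i}: polynomial multiplication is convolution
c = np.convolve(a, a)

for k in range(2, 13):
    print(k, c[k])   # peaks at k = 7 with probability 6/36
\end{verbatim}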
Obviously, if we add n of the random numbers, the probability that the sum of them equals k is given by the coefficient of Zk in  
\begin{displaymath}
G(Z) = [A(Z)]^n
\end{displaymath} (56)
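The coefficients of $[A(Z)]^n$ can be produced by convolving the $a_i$ with themselves n times. A sketch (again mine; the fair die and n = 20 are arbitrary choices):

\begin{verbatim}
import numpy as np

a = np.ones(6) / 6.0        # fair die: a_1 = ... = a_6 = 1/6
g = np.array([1.0])         # coefficients of [A(Z)]^0 = 1

n = 20
for _ in range(n):          # raising A(Z) to the nth power
    g = np.convolve(g, a)   # one polynomial multiplication per throw

# g[k] is the probability that the sum of n throws equals k + n
print(g.argmax() + n)       # most likely sum, near the mean 3.5 * n = 70
\end{verbatim}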
The central-limit theorem of probability says that as n goes to infinity the polynomial G(Z) approaches a special form, almost regardless of the specific polynomial A(Z). The special form is such that a graph of the coefficients of G(Z) comes closer and closer to fitting under the envelope of the bell-shaped gaussian function. Let us see why this happens. Our development will lack mathematical rigor because the theorem is not always true: there are pathological A functions which do not result in G tending to gaussian. Although some of these pathological functions do turn up in applications, we will not take the time here to look at such instances.

Consider the size of A(Z) for real $\omega$. If $\omega = 0$, the sum of the terms of A(Z) may be visualized in the complex plane as a sum of vectors $a_k e^{i\omega k}$ all pointing in the positive real direction. If $\omega \neq 0$, the vectors point in different directions. This is shown in Figure 9.

 
Figure 9: The complex numbers $a_k e^{i\omega k}$ added together.

In raising $A(e^{i\omega})$ to the nth power, the values of $\omega$ of greatest concern are those near $\omega = 0$ where A is largest, because in any region where A is small, $A^n$ will be extremely small. Near $\omega = 0$, or Z = 1, we may expand A(Z) in a power series in $\omega$:
\begin{displaymath}
A(e^{i\omega}) = \left. A \right\vert _0
 + \left. {\partial A \over \partial \omega} \right\vert _0 \omega
 + \left. {\partial^2 A \over \partial \omega^2} \right\vert _0 {\omega^2 \over 2!}
 + \cdots
\end{displaymath} (57)
Note that the coefficients of this power series are proportional to the moments $m_i$ of the probability function; that is,
\begin{eqnarray}
A(e^{i\omega}) &=& \sum_k a_k e^{ik\omega} \\
A(1) &=& \sum_k a_k \;=\; 1 \;=\; m_0 \\
{\partial A \over \partial \omega} &=& i \sum_k k\, a_k\, e^{ik\omega} \\
\left. {\partial A \over \partial \omega} \right\vert _0 &=& i \sum_k k\, a_k \;=\; i\, m_1 \\
\left. {\partial^2 A \over \partial \omega^2} \right\vert _0 &=& -\sum_k k^2 a_k \;=\; -m_2
\end{eqnarray} (58)-(62)
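Equations (58)-(62) can be verified numerically: centered finite differences of $A(e^{i\omega})$ at $\omega = 0$ should reproduce $i m_1$ and $-m_2$. A sketch under my own assumptions (a fair die; step size h = 1e-5):

\begin{verbatim}
import numpy as np

k = np.arange(1, 7)           # faces of a fair die
a = np.ones(6) / 6.0

def A(w):                     # A(e^{iw}) = sum_k a_k e^{ikw}
    return np.sum(a * np.exp(1j * k * w))

h = 1e-5
dA  = (A(h) - A(-h)) / (2 * h)            # ~ i m_1 = 3.5i
d2A = (A(h) - 2*A(0) + A(-h)) / h**2      # ~ -m_2 = -91/6

print(dA, 1j * np.sum(k * a))
print(d2A, -np.sum(k**2 * a))
\end{verbatim}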
When we raise A(Z) to the nth power, we will make the conjecture that only the first three terms of the power series expansion given above are important. (This assumption clearly fails if any of the moments of the probability function are infinite.) Thus, we are saying that as far as G is concerned, the only important things about A are its mean value $m = m_1$ and its second moment $m_2$. If this is really so, we may calculate G by replacing A with any function B having the same mean and the same second moment as A. We may use the simplest function we can find. A good choice is the so-called binomial probability function given by
\begin{eqnarray}
B &=& {Z^m (Z^{\sigma} + Z^{-\sigma}) \over 2} \\
 &=& {e^{i(m + \sigma)\omega} + e^{i(m - \sigma)\omega} \over 2}
\end{eqnarray} (63)-(64)
Let us verify its first moment
\begin{eqnarray}
{\partial B \over \partial \omega} &=& {i \left[ (m + \sigma)\, e^{i(m + \sigma)\omega} + (m - \sigma)\, e^{i(m - \sigma)\omega} \right] \over 2} \\
\left. {\partial B \over \partial \omega} \right\vert _0 &=& i\, m
\end{eqnarray} (65)-(66)
Now let us verify its second moment
\begin{eqnarray}
\left. {\partial^2 B \over \partial \omega^2} \right\vert _0 &=& -\,{(m + \sigma)^2 + (m - \sigma)^2 \over 2} \\
 &=& -(m^2 + \sigma^2)
\end{eqnarray} (67)-(68)
Hence, $\sigma$ should be chosen so that  
\begin{displaymath}
m_2 = m^2 + \sigma^2
\end{displaymath} (69)
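For a concrete case (my worked example, not in the original), take a fair die: $m = 3.5$ and $m_2 = 91/6$, so
\begin{displaymath}
\sigma^2 = m_2 - m^2 = {91 \over 6} - {49 \over 4} = {35 \over 12} \approx 2.92,
\qquad \sigma \approx 1.71 .
\end{displaymath}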
Of course, we cannot expect that m and $\sigma$ will necessarily turn out to be integers; therefore (63) will not necessarily be a Z transform in the usual sense. It does not really matter; we simply interpret (63) as saying:

1. The probability of drawing the number $m + \sigma$ is one-half.
2. The probability of drawing $m - \sigma$ is one-half.
3. The probability of drawing any other number is zero.
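As a quick numerical check (my sketch, not in the original), the two-point function just described does carry the same mean and second moment as a fair die:

\begin{verbatim}
import numpy as np

m = 3.5                                    # mean of a fair die
sigma = np.sqrt(91/6 - m**2)               # from (69): sigma^2 = m_2 - m^2

points = np.array([m + sigma, m - sigma])  # each drawn with probability 1/2
print(points.mean())                       # 3.5     = m_1
print((points**2).mean())                  # 15.1667 = m_2 = 91/6
\end{verbatim}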

Now, raising $(Z^{\sigma} + Z^{-\sigma})$ to the nth power gives a series in powers of $Z^{\sigma}$ whose coefficients are symmetrically distributed about $Z^0$ and whose magnitudes are given by the binomial coefficients. A sketch of the coefficients of $[B(Z)]^n$ is given in Figure 10.

 
Figure 10: Coefficients of the binomial raised to the nth power.

We will now see how, for large n, the binomial coefficients asymptotically approach a gaussian. Approaching this limit is a bit tricky. Obviously, the spread of the sum of n random integers grows as $\sqrt{n}$. Likewise, the coefficients of powers of Z in $({1 \over 2} + {1 \over 2} Z)^n$ individually get smaller while the number of coefficients gets larger. We recall that in time-series analysis we used the substitution $Z = e^{i\omega \, \Delta t}$. We commonly chose $\Delta t = 1$, which meant that data points were given at integral points on the time axis. In the present probability application of Z transforms, the choice $\Delta t = 1$ arises from our original statement that the numbers chosen randomly from the slips of paper were integers. Now we wish to add n of these random numbers together, so it makes sense to rescale the integers to be integers divided by $\sqrt{n}$. Then we can make the substitution $Z = e^{i\omega \, \Delta t} = e^{i\omega/\sqrt{n}}$. The coefficient of $Z^k$ now refers to the probability of drawing the number $k/\sqrt{n}$. Raising $(Z^{\sigma} + Z^{-\sigma})/2$ to the nth power to find the probability distribution for the sum of n random numbers, we obtain
\begin{eqnarray*}
[B(Z)]^n &=& \left( {Z^{\sigma} + Z^{-\sigma} \over 2} \right)^n \\
 &=& \left( {e^{i\sigma\omega/\sqrt{n}} + e^{-i\sigma\omega/\sqrt{n}} \over 2} \right)^n \\
 &=& \left( \cos {\sigma \omega \over \sqrt{n}} \right)^n
\end{eqnarray*}
Using the first two terms of the series expansion of the cosine, we have

\begin{displaymath}
[B(Z)]^n \approx \left( 1 - {\sigma^2 \omega^2 \over 2n} \right)^n
\end{displaymath}

Using the well-known fact that $(1 + x/n)^n \rightarrow e^x$, we have for large n  
\begin{displaymath}
[B(Z)]^n \approx e^{-\sigma^2 \omega^2/2}
\end{displaymath} (70)
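The quality of this limit can be watched numerically. In the sketch below (the value of $\sigma$ and the values of n are my choices), the gap between $(\cos (\sigma\omega/\sqrt{n}))^n$ and the gaussian shrinks as n grows:

\begin{verbatim}
import numpy as np

sigma = 1.708                 # the fair-die value from (69)
w = np.linspace(-2.0, 2.0, 9)

for n in (10, 100, 1000):
    exact = np.cos(sigma * w / np.sqrt(n)) ** n   # [B(Z)]^n on the unit circle
    limit = np.exp(-sigma**2 * w**2 / 2)          # the gaussian limit (70)
    print(n, np.max(np.abs(exact - limit)))       # decreases with n
\end{verbatim}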

The probability p(t) that the number t will result from the sum is now found by inverse Fourier transformation of (70). The Fourier transform of the gaussian (70) may be looked up in a table of integrals; it is found to be the gaussian

\begin{displaymath}
p(t) = {1 \over \sigma \sqrt{2\pi}}\, e^{-t^2/2\sigma^2}
\end{displaymath}
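Finally, the whole argument can be simulated (a sketch of mine; the seed, n, and the number of trials are arbitrary): sums of n dice, centered and rescaled by $\sqrt{n}$, histogram to the gaussian p(t) with the single-die $\sigma$:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 200000

x = rng.integers(1, 7, size=(trials, n))     # n dice per trial
t = (x.sum(axis=1) - 3.5 * n) / np.sqrt(n)   # center, rescale by sqrt(n)

sigma = np.sqrt(91/6 - 3.5**2)               # ~1.708 for one die
hist, edges = np.histogram(t, bins=50, density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
p = np.exp(-mid**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

print(np.max(np.abs(hist - p)))              # small misfit to the gaussian
\end{verbatim}

The shrinking misfit is the central-limit theorem seen empirically.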

