**Characteristic functions of random variables**

The characteristic function of a random variable completely determines the probability distribution of that variable. For a real-valued random variable $X$, it is given by
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right].$$
If $X$ has a probability density function $f_X(x)$, then we have
$$\varphi_X(t)=\int_{-\infty}^{\infty}e^{itx}f_X(x)\,dx.$$

Every distribution on the real line has a characteristic function, and each characteristic function corresponds to one, and only one, probability distribution.

(Note the similarity between the characteristic function and the Fourier transform: up to sign convention, $\varphi_X$ is the Fourier transform of the density $f_X$.)
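As a quick numerical sketch of the definition above, we can estimate $\varphi_X(t)=\mathrm{E}[e^{itX}]$ by averaging $e^{itx}$ over random samples, and compare against a known closed form. The choice of the exponential distribution $\mathrm{Exp}(1)$, whose characteristic function is $1/(1-it)$, and the sample size here are just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=200_000)   # samples from Exp(1)

t = 1.5
phi_mc = np.mean(np.exp(1j * t * x))           # Monte Carlo estimate of E[e^{itX}]
phi_exact = 1.0 / (1.0 - 1j * t)               # known characteristic function of Exp(1)

print(abs(phi_mc - phi_exact))                 # small sampling error
```

The Monte Carlo average converges to the exact value at the usual $1/\sqrt{N}$ rate, since $|e^{itX}|=1$.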

An important theorem involving the characteristic function is the Lévy continuity theorem. This basically states that for an infinite sequence of random variables $X_1, X_2, X_3, \ldots$ with corresponding characteristic functions $\varphi_1, \varphi_2, \varphi_3, \ldots$, if the sequence of characteristic functions converges (pointwise) to a function $\varphi$ which is continuous at $t=0$, then the sequence of random variables converges (in distribution) to the random variable $X$ whose characteristic function is $\varphi$.

Characteristic functions have many uses with regard to independent random variables. First, consider independent random variables $X$ and $Y$. Then the characteristic function of $X+Y$ is:
$$\varphi_{X+Y}(t)=\mathrm{E}\left[e^{it(X+Y)}\right]=\mathrm{E}\left[e^{itX}e^{itY}\right].$$

As $X$ and $Y$ are independent, $\mathrm{E}\left[e^{itX}e^{itY}\right]=\mathrm{E}\left[e^{itX}\right]\mathrm{E}\left[e^{itY}\right]$, and so
$$\varphi_{X+Y}(t)=\varphi_X(t)\,\varphi_Y(t).$$
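We can check this factorization empirically. The particular distributions here (a standard normal and an independent uniform on $[-1,1]$) are an arbitrary choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(0.0, 1.0, n)      # X: standard normal (arbitrary choice)
y = rng.uniform(-1.0, 1.0, n)    # Y: uniform on [-1, 1], independent of X

t = 2.0
phi_sum = np.mean(np.exp(1j * t * (x + y)))                           # estimate of φ_{X+Y}(t)
phi_prod = np.mean(np.exp(1j * t * x)) * np.mean(np.exp(1j * t * y))  # φ_X(t)·φ_Y(t)

print(abs(phi_sum - phi_prod))   # agree up to sampling error
```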

Similarly, for $n$ independent random variables $X_1, \ldots, X_n$ and $n$ constants $a_1, \ldots, a_n$, if $S=\sum_{k=1}^{n}a_kX_k$, then the characteristic function of $S$ is
$$\varphi_S(t)=\prod_{k=1}^{n}\varphi_{X_k}(a_kt).$$

For example, if the $X_k$ all have the same distribution, with common characteristic function $\varphi_X$, and the constants are all $a_k=\frac{1}{n}$, then $S$ is the sample mean $\bar{X}=\frac{1}{n}\sum_{k=1}^{n}X_k$, and so
$$\varphi_{\bar{X}}(t)=\left[\varphi_X\!\left(\tfrac{t}{n}\right)\right]^n.$$
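This sample-mean formula can also be checked numerically. Again using $\mathrm{Exp}(1)$ with $\varphi_X(t)=1/(1-it)$ as an illustrative choice, the characteristic function of the mean of $n$ samples should match $[\varphi_X(t/n)]^n$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 50, 100_000
xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)  # many sample means of Exp(1)

t = 2.0
phi_mc = np.mean(np.exp(1j * t * xbar))         # Monte Carlo estimate of φ_X̄(t)
phi_formula = (1.0 / (1.0 - 1j * t / n)) ** n   # [φ_X(t/n)]^n for Exp(1)

print(abs(phi_mc - phi_formula))
```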

By expanding the exponential in its Maclaurin series in the definition of the characteristic function, we see that
$$\varphi_X(t)=\mathrm{E}\left[\sum_{n=0}^{\infty}\frac{(itX)^n}{n!}\right]=\sum_{n=0}^{\infty}\frac{(it)^n\,\mathrm{E}[X^n]}{n!},$$
so that
$$\mathrm{E}[X^n]=(-i)^n\varphi_X^{(n)}(0),$$
where the $(n)$ exponent means the $n$th derivative with respect to $t$. Thus we can find the moments of $X$ from the derivatives of its characteristic function at zero.
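As a symbolic sanity check of the moment formula, we can differentiate a known characteristic function at zero. Using $\mathrm{Exp}(1)$, with $\varphi(t)=1/(1-it)$, the first moment should be $1$ and the second $2$ (so the variance is $1$):

```python
import sympy as sp

t = sp.symbols('t', real=True)
phi = 1 / (1 - sp.I * t)   # characteristic function of Exp(1)

# E[X^n] = φ^{(n)}(0) / i^n  (equivalently (-i)^n φ^{(n)}(0))
m1 = sp.diff(phi, t, 1).subs(t, 0) / sp.I**1   # first moment, E[X]
m2 = sp.diff(phi, t, 2).subs(t, 0) / sp.I**2   # second moment, E[X^2]
var = sp.simplify(m2 - m1**2)                  # variance

print(m1, m2, var)
```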

Thus, if we have a random variable $X$ with mean $\mu$ and variance $\sigma^2$, then by Taylor's theorem,
$$\varphi_X(t)=1+i\mu t-\frac{\sigma^2+\mu^2}{2}t^2+o(t^2)$$
as $t\to0$. Thus if we have any random variable $Y$ with zero mean and unit variance, then
$$\varphi_Y(t)=1-\frac{t^2}{2}+o(t^2)$$
as $t\to0$. Note that for any random variable $X$ with mean $\mu$ and variance $\sigma^2$, $\frac{X-\mu}{\sigma}$ is just such a variable.

For $n$ observations of a random variable $X$, named $X_1, X_2, X_3, \ldots, X_n$, the standardized value of each of these is $Y_k=\frac{X_k-\mu}{\sigma}$.

The standardized mean is given by
$$Z_n=\frac{\bar{X}-\mu}{\sigma/\sqrt{n}}=\frac{1}{\sqrt{n}}\sum_{k=1}^{n}\frac{X_k-\mu}{\sigma}=\frac{1}{\sqrt{n}}\sum_{k=1}^{n}Y_k.$$

Thus
$$\varphi_{Z_n}(t)=\left[\varphi_Y\!\left(\tfrac{t}{\sqrt{n}}\right)\right]^n=\left[1-\frac{t^2}{2n}+o\!\left(\tfrac{t^2}{n}\right)\right]^n.$$

Now, recall that $\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=e^x$. Thus, as $n$ increases without bound, the distribution of $Z_n$ approaches that with the characteristic function $e^{-t^2/2}$, via the Lévy continuity theorem.
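The limit above is easy to watch numerically: dropping the $o(t^2/n)$ term, $\left[1-\frac{t^2}{2n}\right]^n$ closes in on $e^{-t^2/2}$ as $n$ grows. The value $t=1$ here is just a sample point:

```python
import numpy as np

t = 1.0
target = np.exp(-t**2 / 2)                 # e^{-t²/2}, the limiting value
for n in (10, 100, 10_000):
    approx = (1 - t**2 / (2 * n)) ** n     # [1 - t²/(2n)]^n
    print(n, approx, abs(approx - target))  # gap shrinks as n grows
```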

Now, the characteristic function of the normal (Gaussian) distribution with mean $\mu$ and variance $\sigma^2$ is given by
$$\varphi(t)=e^{i\mu t-\frac{\sigma^2t^2}{2}}.$$
Thus we see that the standard normal distribution, the normal distribution with $\mu=0$ and $\sigma^2=1$, has the desired characteristic function $e^{-t^2/2}$.

This result, that for sufficiently large $n$, the standardized mean of $n$ samples of any random variable whose first and second moments (and thus mean and variance) exist is approximately standard normal, is called the "Central Limit Theorem," and is one of the fundamental theorems of probability theory. Another, the law of large numbers, can also be proved using characteristic functions.
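The central limit theorem can be seen directly by simulation. Here we draw many standardized sample means of an $\mathrm{Exp}(1)$ variable (which has $\mu=\sigma=1$) and check that about 95% of them fall within $\pm1.96$, as they would for a standard normal; the distribution and sample sizes are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 500, 10_000
samples = rng.exponential(1.0, size=(reps, n))   # Exp(1): mean 1, variance 1

# standardized sample means: (X̄ - μ)/(σ/√n) with μ = σ = 1
z = (samples.mean(axis=1) - 1.0) * np.sqrt(n)

frac = np.mean(np.abs(z) < 1.96)   # ≈ 0.95 if z is approximately N(0, 1)
print(frac)
```

Even though the exponential distribution is strongly skewed, the standardized means are already close to normal at this $n$.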

Tags: Math, Monday Math
