## Monday Math 7

Characteristic functions of random variables

The characteristic function of a random variable completely determines the probability distribution of that variable. On the real numbers, it is given for a random variable X as
$$\varphi_X(t) = \operatorname{E}\left[e^{itX}\right].$$
If X has a probability density function $f_X$, then we have
$$\varphi_X(t) = \int_{-\infty}^{\infty} e^{itx} f_X(x)\,dx.$$
Every distribution on the real line has a characteristic function, and each characteristic function corresponds to one, and only one, probability distribution.
(Note the similarity between the characteristic function and the Fourier transform of the density.)
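As a quick numerical check (a sketch, not from the original post), we can approximate the expectation $\operatorname{E}[e^{itX}]$ by a sample average and compare it against the known characteristic function $e^{-t^2/2}$ of the standard normal distribution:

```python
import cmath
import math
import random

def empirical_cf(samples, t):
    """Approximate the characteristic function E[e^{itX}] by a sample average."""
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # draws from N(0,1)

t = 1.0
approx = empirical_cf(samples, t)
exact = math.exp(-t * t / 2)  # characteristic function of N(0,1) is e^{-t^2/2}
print(abs(approx - exact))    # small; shrinks like 1/sqrt(N)
```

Any characteristic function satisfies $\varphi_X(0)=1$, which the sample average reproduces exactly.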

An important theorem involving the characteristic function is the Lévy continuity theorem. This basically states that for an infinite sequence of random variables $X_1, X_2, \ldots$ with corresponding characteristic functions $\varphi_1, \varphi_2, \ldots$, if the sequence of characteristic functions converges (pointwise) to a function $\varphi$ which is continuous at t=0, then the sequence of random variables converges (in distribution) to the random variable X whose characteristic function is $\varphi$.

Characteristic functions have many uses with regard to independent random variables. First, consider independent random variables X and Y. Then the characteristic function of X+Y is:
$$\varphi_{X+Y}(t) = \operatorname{E}\left[e^{it(X+Y)}\right] = \operatorname{E}\left[e^{itX}e^{itY}\right].$$
As X and Y are independent, $\operatorname{E}\left[e^{itX}e^{itY}\right] = \operatorname{E}\left[e^{itX}\right]\operatorname{E}\left[e^{itY}\right]$, and so
$$\varphi_{X+Y}(t) = \varphi_X(t)\,\varphi_Y(t).$$
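This product rule is easy to verify by simulation (a sketch with arbitrarily chosen distributions, not from the original post): the empirical characteristic function of X+Y should match the product of the empirical characteristic functions of X and Y when the two are drawn independently:

```python
import cmath
import random

def empirical_cf(samples, t):
    """Approximate E[e^{itX}] by a sample average."""
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(0)
n = 100_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]  # X ~ Uniform(0,1)
ys = [random.expovariate(1.0) for _ in range(n)]   # Y ~ Exponential(1), independent of X

t = 0.7
lhs = empirical_cf([x + y for x, y in zip(xs, ys)], t)  # φ_{X+Y}(t)
rhs = empirical_cf(xs, t) * empirical_cf(ys, t)         # φ_X(t) φ_Y(t)
print(abs(lhs - rhs))  # close to zero for independent X and Y
```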
Similarly, for n independent random variables $X_1, \ldots, X_n$ and n constants $a_1, \ldots, a_n$, then for $S = \sum_{i=1}^{n} a_i X_i$, the characteristic function of S is $\varphi_S(t) = \prod_{i=1}^{n} \varphi_{X_i}(a_i t)$.
For example, if the $X_i$ all have the same distribution, with common characteristic function $\varphi$, and the constants are all $a_i = \frac{1}{n}$, then S is the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$, and so
$$\varphi_{\bar{X}}(t) = \left[\varphi\!\left(\tfrac{t}{n}\right)\right]^n.$$
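For a concrete check of the sample-mean formula (a sketch, not from the original post), take $X_i \sim$ Exponential(1), whose characteristic function $\varphi(t) = 1/(1-it)$ is known exactly; the mean of n draws should then have characteristic function $[\varphi(t/n)]^n$:

```python
import cmath
import random

random.seed(0)
n, reps = 10, 50_000

def mean_of_exponentials():
    """Sample mean of n independent Exponential(1) draws."""
    return sum(random.expovariate(1.0) for _ in range(n)) / n

t = 1.0
means = [mean_of_exponentials() for _ in range(reps)]
empirical = sum(cmath.exp(1j * t * m) for m in means) / reps

phi = lambda s: 1 / (1 - 1j * s)  # characteristic function of Exponential(1)
predicted = phi(t / n) ** n       # [φ(t/n)]^n for the sample mean
print(abs(empirical - predicted)) # small sampling error
```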

By expanding the exponential in its Maclaurin series in the definition of the characteristic function, we see that
$$\varphi_X(t) = \operatorname{E}\left[e^{itX}\right] = \sum_{n=0}^{\infty} \frac{(it)^n \operatorname{E}[X^n]}{n!},$$
so that
$$\operatorname{E}[X^n] = \frac{\varphi_X^{(n)}(0)}{i^n},$$
where the (n) exponent means the nth derivative with respect to t. Thus we can find the moments of X by differentiating the characteristic function at t=0.
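For example (a sketch, not from the original post), take a Bernoulli(p) variable, whose characteristic function $\varphi(t) = 1 - p + pe^{it}$ is known in closed form; finite differences at t=0 recover the first two moments, $\operatorname{E}[X] = p$ and $\operatorname{E}[X^2] = p$:

```python
import cmath

p = 0.3
phi = lambda t: (1 - p) + p * cmath.exp(1j * t)  # char. function of Bernoulli(p)

h = 1e-5  # finite-difference step
d1 = (phi(h) - phi(-h)) / (2 * h)            # ≈ φ'(0)  = i   E[X]
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2  # ≈ φ''(0) = i^2 E[X^2]

m1 = (d1 / 1j).real     # E[X]   = p
m2 = (d2 / 1j**2).real  # E[X^2] = p, since X^2 = X for a Bernoulli variable
print(m1, m2, m2 - m1**2)  # mean, second moment, and variance p(1-p)
```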

Thus, if we have a random variable X with mean μ and variance σ2, then by Taylor’s theorem,
$$\varphi_X(t) = 1 + i\mu t - \frac{(\sigma^2 + \mu^2)t^2}{2} + o(t^2)$$
as $t \to 0$. Thus if we have any random variable Y with zero mean and unit variance, then
$$\varphi_Y(t) = 1 - \frac{t^2}{2} + o(t^2)$$
as $t \to 0$. Note that for any random variable X with mean μ and variance σ2, $\frac{X-\mu}{\sigma}$ is just such a variable.
For n observations of a random variable X, named X1, X2, X3, …,Xn, the standardized value of each of these is $Z_i = \frac{X_i - \mu}{\sigma}$.
The standardized mean is given by
$$Z_n = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \frac{X_i - \mu}{\sigma} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} Z_i.$$
Thus
$$\varphi_{Z_n}(t) = \left[\varphi_Z\!\left(\tfrac{t}{\sqrt{n}}\right)\right]^n = \left[1 - \frac{t^2}{2n} + o\!\left(\tfrac{t^2}{n}\right)\right]^n.$$

Now, recall that $\lim_{n\to\infty}\left(1 + \frac{x}{n}\right)^n = e^x$. Thus, as n increases without bound,
$$\varphi_{Z_n}(t) \to e^{-t^2/2},$$
and so the distribution of Zn approaches that with the characteristic function $e^{-t^2/2}$, via the Lévy continuity theorem.
Now, the characteristic function of the normal (Gaussian) distribution of mean μ and variance σ2 is given by
$$\varphi(t) = e^{i\mu t - \frac{\sigma^2 t^2}{2}}.$$
Thus we see that the standard normal distribution, the normal distribution with μ=0 and σ2=1, has the desired characteristic function $e^{-t^2/2}$.
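Putting it all together (a simulation sketch, not from the original post): the standardized mean of Uniform(0,1) draws should have an empirical characteristic function close to $e^{-t^2/2}$ even for modest n:

```python
import cmath
import math
import random

random.seed(1)
n, reps = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)  # mean and std. dev. of Uniform(0,1)

def standardized_mean():
    """One draw of Z_n = (X̄ - μ) / (σ/√n) for n Uniform(0,1) samples."""
    xbar = sum(random.random() for _ in range(n)) / n
    return (xbar - mu) / (sigma / math.sqrt(n))

zs = [standardized_mean() for _ in range(reps)]
t = 1.0
phi_hat = sum(cmath.exp(1j * t * z) for z in zs) / reps
print(abs(phi_hat - math.exp(-t * t / 2)))  # small: Z_n is nearly N(0,1)
```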
This result, that for sufficiently large n, the mean of n samples of any random variable for which the first and second moments (and thus mean and variance) exist approaches a normal distribution, is called the “Central Limit Theorem,” and is one of the fundamental theorems of probability theory. Another, the law of large numbers, can also be proved using characteristic functions.