Saturday, December 29, 2018

Levy’s equivalence theorem

We often say that almost sure convergence, convergence in probability, and convergence in law must not be mixed up: the first implies the second, which implies the third, and the converse implications fail in general. But in some situations the three notions coincide. One famous instance is Lévy's equivalence theorem: let $(X_m)_{m \geq 1}$ be a sequence of independent random variables and set $S_n = \sum_{m=1}^n X_m$; then convergence in law of $S_n$ implies its convergence almost surely!
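To build intuition before the proof, here is a small simulation (my own toy example, not part of the argument): for independent Rademacher signs $\varepsilon_m = \pm 1$, the series $\sum_m \varepsilon_m / m$ satisfies the hypotheses of the theorem, and a sample path of its partial sums visibly stabilizes. The function name `partial_sums` and all parameters are choices made for this sketch.

```python
import random

# Illustration (not part of the proof): for independent signs eps_m = +/-1,
# the series sum_m eps_m / m converges almost surely; Levy's theorem says
# this is equivalent to its convergence in law. We watch one sample path
# of the partial sums S_n stabilize.

def partial_sums(n_terms, seed=0):
    rng = random.Random(seed)
    s, path = 0.0, []
    for m in range(1, n_terms + 1):
        eps = 1 if rng.random() < 0.5 else -1  # Rademacher sign
        s += eps / m
        path.append(s)
    return path

path = partial_sums(100_000)
# Oscillation of the tail of the path: small if the series is converging.
tail = path[50_000:]
print(max(tail) - min(tail))
```

The tail oscillation is tiny because the remaining variance $\sum_{m > 50000} 1/m^2$ is of order $10^{-5}$.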

This seems a little incredible, and such an amazing theorem does not often appear in general textbooks. Maybe the reason is that this simply stated theorem requires a somewhat technical proof. Here I give a personal argument that uses several big theorems.

The key idea is to use Kolmogorov's three-series theorem, which gives an equivalent criterion for the almost sure convergence of $S_n$. Fix $A > 0$ and define $Y_m = X_m \mathbf{1}_{|X_m| \leq A}$; the three conditions are
$$
\sum_{m=1}^{\infty} \mathbb{P}[|X_m|>A] < \infty,  \quad \sum_{m=1}^{\infty}\mathbb{E}[Y_m]\text{ converges }, \quad \sum_{m=1}^{\infty}\mathbf{Var}[Y_m] < \infty.
$$
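As a concrete sanity check (my own toy example, with the bound $\mathbf{Var}[Y_m] \leq \mathbb{E}[X_m^2]$ used for simplicity), take $X_m = Z_m/m$ with $Z_m$ i.i.d. standard normal and truncation level $A = 1$: all three series are finite.

```python
import math

# Toy check of the three-series conditions for X_m = Z_m / m,
# Z_m i.i.d. standard normal, truncation level A = 1 (our own example).
A = 1.0
N = 10_000  # cut off the infinite sums for the numerical check

# (1) sum of P(|X_m| > A): here P(|Z| > m*A) = erfc(m*A / sqrt(2)).
s1 = sum(math.erfc(m * A / math.sqrt(2.0)) for m in range(1, N))

# (2) E[Y_m] = 0 for every m by symmetry of the truncated normal law,
# so the series of expectations converges trivially.
s2 = 0.0

# (3) Var[Y_m] <= E[X_m^2] = 1/m^2, a summable upper bound.
s3 = sum(1.0 / m**2 for m in range(1, N))

print(s1, s2, s3)  # all three quantities stay bounded
```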

For the first condition, we use the Skorokhod representation theorem to transfer the convergence in law of $S_n$ to almost sure convergence on another probability space. There the increments must tend to $0$, so the events $\{|X_m| > A\}$ occur only finitely often; since the $X_m$ are independent, the second Borel–Cantelli lemma then forces $\sum_{m=1}^{\infty} \mathbb{P}[|X_m| > A] < \infty$.

For the third condition, we argue by contradiction using the Lindeberg–Feller central limit theorem. Since the $Y_m$ are uniformly bounded by $A$, the Lindeberg condition holds automatically as soon as the variances sum to infinity; so if $\sum_{m=1}^{\infty}\mathbf{Var}[Y_m] = \infty$, then
$$
\frac{\sum_{m=1}^{n} \left(Y_m - \mathbb{E}[Y_m]\right)}{\sqrt{\sum_{m=1}^{n}\mathbf{Var}[Y_m]}} \Rightarrow \mathcal{N}(0,1),
$$
while we also know that $\sum_{m=1}^{n} Y_m$ converges in law to some random variable, because by the first condition $Y_m = X_m$ for all but finitely many $m$ almost surely. Dividing a sequence that converges in law by the normalization $\sqrt{\sum_{m=1}^{n}\mathbf{Var}[Y_m]} \to \infty$ gives $0$ in probability, so the deterministic sequence $-\sum_{m=1}^{n}\mathbb{E}[Y_m]\big/\sqrt{\sum_{m=1}^{n}\mathbf{Var}[Y_m]}$ would have to converge in law to $\mathcal{N}(0,1)$, which is impossible for a deterministic sequence. This is the desired contradiction.
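The normalization effect invoked above can be seen numerically (my own illustration, with simulation sizes chosen arbitrarily): for $X_m = \pm 1$, whose variances sum to infinity, $S_n$ itself is not tight, yet $S_n/\sqrt{n}$ has approximately unit variance, as the Lindeberg–Feller theorem predicts.

```python
import random

# Our own illustration: X_m = +/-1 fair signs. The variance sum is n, so
# the three-series test fails and S_n diverges; but S_n / sqrt(n) => N(0,1).
rng = random.Random(1)
n, reps = 2_500, 1_000

samples = []
for _ in range(reps):
    s = sum(1 if rng.random() < 0.5 else -1 for _ in range(n))
    samples.append(s / n**0.5)

mean = sum(samples) / reps
var = sum((x - mean) ** 2 for x in samples) / reps
print(mean, var)  # roughly 0 and 1
```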

Finally, to prove that $\sum_{m=1}^{\infty}\mathbb{E}[Y_m]$ converges, we again argue by contradiction, going back to the very definition of convergence in law by testing against continuous functions. Since $Y_m = X_m$ for all but finitely many $m$ almost surely, $\sum_{m=1}^{\infty} X_m$ and $\sum_{m=1}^{\infty} Y_m$ have the same convergence behavior. If the drift $\sum_{m=1}^{n}\mathbb{E}[Y_m]$ were unbounded while the variance series is finite, the mass of the partial sums would escape to infinity along a subsequence: we would get only vague convergence to the zero measure, contradicting convergence in law to a probability measure.
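The escape-of-mass step can be made explicit as follows (a sketch assuming conditions 1 and 3 already hold; the notation $W_n$, $b_n$ is introduced here for convenience):

```latex
% Write W_n for the centered sum and b_n for the drift:
%   W_n = \sum_{m=1}^n (Y_m - \mathbb{E}[Y_m]),   b_n = \sum_{m=1}^n \mathbb{E}[Y_m].
\[
  \sum_{m=1}^{n} Y_m = W_n + b_n,
  \qquad
  W_n \xrightarrow{\text{a.s.}} W
  \quad \text{since } \sum_{m=1}^{\infty} \mathbf{Var}[Y_m] < \infty .
\]
% If (b_n) were unbounded, then |b_{n_k}| \to \infty along a subsequence,
% and for every continuous compactly supported test function f,
\[
  \mathbb{E}\Big[ f\Big( \sum_{m=1}^{n_k} Y_m \Big) \Big]
  = \mathbb{E}\big[ f(W_{n_k} + b_{n_k}) \big] \longrightarrow 0 ,
\]
% so the laws converge only vaguely to the zero measure: the mass escapes
% to infinity, and the sums cannot converge in law to a probability measure.
```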

So we verify the three conditions, and Kolmogorov's three-series theorem concludes that $S_n$ converges almost surely, which proves the theorem.