Almost Sure Convergence

Let $(\Omega, \mathcal{F}, P)$ be a probability space, let $(X_n)_{n \in \mathbb{N}}$ be a sequence of random variables, and let $X$ be another random variable. The sequence is said to converge almost surely to $X$, denoted by $X_n \xrightarrow{\text{a.s.}} X$, if:

$$P \left( \left\{ \omega \in \Omega : \lim_{n \to \infty} X_n(\omega) = X(\omega) \right\} \right) = 1$$
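To make the definition concrete, here is a small numerical sketch of a classic example (an assumption for illustration, not from the text above): on $([0,1], \mathcal{B}, \text{Lebesgue})$, the sequence $X_n(\omega) = \omega^n$ converges to $0$ for every $\omega \in [0,1)$, hence almost surely, since the exceptional set $\{1\}$ has probability zero.

```python
import random

# Illustrative example (assumption): X_n(omega) = omega**n on ([0,1], Lebesgue).
# X_n(omega) -> 0 for every omega in [0,1); the exceptional set {1} is null,
# so X_n converges to 0 almost surely.
random.seed(0)

def X_n(omega, n):
    return omega ** n

samples = [random.random() for _ in range(10_000)]  # draws of omega ~ Uniform[0,1]

# For large n, X_n(omega) is already within epsilon of 0 for most omega,
# and the fraction tends to 1 as n grows.
n, eps = 1000, 0.01
fraction_close = sum(abs(X_n(w, n)) < eps for w in samples) / len(samples)
print(fraction_close)  # close to 1 for large n
```

Increasing `n` pushes `fraction_close` arbitrarily near 1, mirroring the pointwise convergence on $[0,1)$.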

Equivalently, this means that the probability of the set of outcomes for which $X_n(\omega)$ does not converge to $X(\omega)$ is zero:

$$P \left( \bigcup_{k=1}^\infty \bigcap_{n=k}^\infty \left\{ \omega \in \Omega : |X_n(\omega) - X(\omega)| < \epsilon \right\} \right) = 1, \quad \text{for all } \epsilon > 0$$

>Or, in terms of the lim sup of the set of deviations:
>$$P \left( \limsup_{n \to \infty} \{ \omega \in \Omega : |X_n(\omega) - X(\omega)| \geq \epsilon \} \right) = 0, \quad \text{for all } \epsilon > 0$$
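The event-based criterion can be checked by simulation. The sketch below (an assumed fair-coin example, not from the text) uses the strong law of large numbers: the sample mean $X_n$ of i.i.d. Bernoulli$(1/2)$ flips converges almost surely to $1/2$, so for each $\epsilon$ almost every sample path lies in $\bigcap_{n \geq k} \{ |X_n - 1/2| < \epsilon \}$ for some $k$.

```python
import random

# Monte Carlo sketch (assumption: fair-coin SLLN example).
# For each simulated path we test the tail event from the criterion above:
# does sup_{n >= k} |X_n - 1/2| stay below epsilon?
random.seed(1)

def tail_stays_close(num_flips, k, eps):
    """Simulate one path of sample means and check the tail from index k on."""
    total = 0
    means = []
    for n in range(1, num_flips + 1):
        total += random.randint(0, 1)  # one fair coin flip
        means.append(total / n)        # X_n = running sample mean
    return all(abs(m - 0.5) < eps for m in means[k - 1:])

paths = 500
good = sum(tail_stays_close(num_flips=5000, k=1000, eps=0.05) for _ in range(paths))
print(good / paths)  # fraction of paths whose tail stays within eps; near 1
```

The fraction approaches 1 as `k` grows, which is exactly what probability 1 for the union over $k$ of the tail intersections asserts.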