The Analysis of Data, volume 1

Modes of Convergence

8.1. Modes of Stochastic Convergence

We consider in this chapter several important limit theorems. We start by exploring the different modes of convergence, and then move on to the law of large numbers and the central limit theorem. We emphasize the multivariate case of random vectors with $d>1$, but for the sake of intuition it is useful to keep the univariate case in mind.

We list below the three major modes of convergence associated with random vectors.

Definition 8.1.1. Let $\bb{X}^{(n)}, n\in\mathbb{N}$ be a sequence of random vectors and $\bb{X}$ be a random vector.

  1. We say that $\bb{X}^{(n)}$ converges to $\bb X$ with probability 1, denoted $\bb{X}^{(n)}\tooas \bb{X}$, if $P(\lim_{n\to\infty} \bb{X}^{(n)} = \bb{X})=1$.
  2. We say that $\bb{X}^{(n)}$ converges to $\bb X$ in probability, denoted $\bb{X}^{(n)}\toop \bb{X}$, if for all $\epsilon>0$ we have $\lim_{n\to\infty} P(\|\bb{X}^{(n)} - \bb{X}\| > \epsilon) = 0$.
  3. We say that $\bb{X}^{(n)}$ converges to $\bb X$ in distribution, denoted $\bb{X}^{(n)}\tood \bb{X}$, if $\lim_{n\to\infty} F_{\bb{X}^{(n)}}(\bb x) = F_{\bb X}(\bb x)$ at all points $\bb x$ at which $F_{\bb X}$ is continuous.

We make the following comments.

  1. In the definitions above, the limit RV $\bb X$ may be deterministic, in other words $\bb X=\bb c \in\R^d$ with probability 1. In this case we use notations such as $X^{(n)}\tooas c$ in the one dimensional case or ${\bb X}^{(n)}\tooas \bb c$ in higher dimensions.
  2. There is a fundamental difference between convergence in distribution and the other two types of convergence. Convergence in distribution merely implies that the distribution of $\bb{X}^{(n)}$ is similar to that of $\bb{X}$ for large $n$. Specifically, it does not say anything about $\bb{X}^{(n)}$ and $\bb{X}$ taking on similar values with high probability. Convergence in probability and convergence with probability 1 imply that for large $n$, the values of $\bb{X}^{(n)}$ and $\bb{X}$ are similar (see the following example).
  3. The following section shows that convergence with probability one implies convergence in probability, which in turn implies convergence in distribution. The converse is not true in general.
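Convergence in probability can be illustrated by simulation. The sketch below (assuming NumPy; the construction $X^{(n)} = c + Z/n$ with $Z$ uniform is a hypothetical example, not taken from the text) estimates $P(|X^{(n)} - c| > \epsilon)$ for increasing $n$ and shows it shrinking to zero:

```python
import numpy as np

rng = np.random.default_rng(0)
c, eps = 0.0, 0.1
probs = []
for n in [1, 10, 100]:
    # X^(n) = c + Z/n with Z ~ Uniform(0, 1): deviations from c shrink like 1/n
    x_n = c + rng.uniform(0, 1, size=100_000) / n
    # Monte Carlo estimate of P(|X^(n) - c| > eps)
    probs.append(np.mean(np.abs(x_n - c) > eps))
print(probs)  # roughly [0.9, 0.0, 0.0]
```

For $n=1$ the exact probability is $P(Z > 0.1) = 0.9$, while for $n \geq 10$ the deviation $Z/n$ can never exceed $\epsilon = 0.1$, so the estimated probabilities drop to zero, as convergence in probability requires.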

Example 8.1.1. If ${X}$ and ${X}^{(n)}, n\in\mathbb{N}$ are independent uniform RVs in $[a,b]$, we have ${X}^{(n)}\tood {X}$ since all of the RVs have the same distribution. But we certainly do not have convergence in probability or with probability 1, since the RVs are independent and typically take on substantially different values.
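A minimal simulation of this example (assuming NumPy, with $[a,b]=[0,1]$): the empirical CDFs of $X^{(n)}$ and $X$ agree, yet $|X^{(n)} - X|$ remains large with high probability for every $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, m = 0.0, 1.0, 100_000
x = rng.uniform(a, b, size=m)    # samples of the limit RV X
x_n = rng.uniform(a, b, size=m)  # samples of X^(n), independent of X

# Convergence in distribution: empirical CDFs evaluated at t agree.
t = 0.3
cdf_x, cdf_xn = np.mean(x <= t), np.mean(x_n <= t)
print(cdf_x, cdf_xn)  # both near 0.3

# But no convergence in probability: for independent Uniform(0,1) RVs,
# P(|X^(n) - X| > 0.1) = 1 - (2 * 0.1 - 0.1**2) = 0.81, which does not vanish.
gap_prob = np.mean(np.abs(x_n - x) > 0.1)
print(gap_prob)  # near 0.81
```

The first print confirms the distributions match; the second shows the probability of a substantial gap stays bounded away from zero, so $X^{(n)} \tood X$ holds while convergence in probability fails.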