6.3. Moments

In the case of random vectors, the expectation is a vector and the variance is a matrix. In the case of random processes, the expectation and variance become functions.

Definition 6.3.1. The expectation of a random process $\mathcal{X}=\{X_t:t\in J\}$ is the function $m:J\to\R$ defined by $m(t)=\E(X_t)$.
Definition 6.3.2. The variance of a random process $\mathcal{X}=\{X_t:t\in J\}$ is the function $v:J\to \R$ defined by $v(t)=\Var(X_t)$.
Definition 6.3.3. The autocorrelation function of a random process $\mathcal{X}=\{X_t:t\in J\}$ is the function $R:J\times J\to\R$ defined by $R(t_1,t_2)=\E(X_{t_1}X_{t_2})$.
Definition 6.3.4. The auto-covariance function of a random process $\mathcal{X}=\{X_t:t\in J\}$ is the function $C:J\times J\to\R$ defined by \begin{align*} C(t_1,t_2)&=\E((X_{t_1}-m(t_1))(X_{t_2}-m(t_2)))= R(t_1,t_2)-m(t_1)m(t_2) \end{align*} where the second equality follows from the properties of expectation in Section 4.6.
Definition 6.3.5. The correlation-coefficient function of a random process $\mathcal{X}=\{X_t:t\in J\}$ is the function $\rho:J\times J\to\R$ defined by \begin{align*} \rho(t_1,t_2)&=\frac{C(t_1,t_2)}{\sqrt{v(t_1)v(t_2)}}. \end{align*}
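To make these definitions concrete, the following sketch estimates $m$, $v$, $R$, $C$, and $\rho$ by Monte Carlo simulation for a discrete-time Gaussian random walk (the choice of process, sample sizes, and variable names are illustrative and not from the text). For this process $m(t)=0$, $v(t)=t$, and $C(t_1,t_2)=\min(t_1,t_2)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n_paths realizations of the Gaussian random walk X_t = Z_1 + ... + Z_t
# on t = 1, ..., T, with Z_t iid N(0, 1); each row of `paths` is one sample path.
n_paths, T = 100_000, 20
paths = rng.standard_normal((n_paths, T)).cumsum(axis=1)

m_hat = paths.mean(axis=0)                          # m(t) = E(X_t), here 0
v_hat = paths.var(axis=0)                           # v(t) = Var(X_t), here t
R_hat = paths.T @ paths / n_paths                   # R(t1, t2) = E(X_{t1} X_{t2})
C_hat = R_hat - np.outer(m_hat, m_hat)              # C(t1, t2) = R(t1, t2) - m(t1) m(t2)
rho_hat = C_hat / np.sqrt(np.outer(v_hat, v_hat))   # correlation-coefficient function

ts = np.arange(1, T + 1)
print(np.abs(C_hat - np.minimum.outer(ts, ts)).max())  # small: C(t1, t2) = min(t1, t2)
```

The same estimators (sample mean, sample variance, and sample second moments across paths) apply to any process that can be simulated on a grid of times.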
Example 6.3.1. For the random process $X_t=Y\cos(2\pi t)$ in Example 6.2.1, \begin{align*} m(t)&=\E(Y\cos(2\pi t))=\E(Y)\cos(2\pi t)\\ R(t_1,t_2)&=\E(Y\cos(2\pi t_1)Y\cos(2\pi t_2))=\E(Y^2)\cos(2\pi t_1)\cos(2\pi t_2)\\ C(t_1,t_2)&=R(t_1,t_2)-m(t_1)m(t_2) =(\E(Y^2)-(\E(Y))^2)\cos(2\pi t_1)\cos(2\pi t_2). \end{align*}
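A quick numerical check of these formulas (a minimal sketch; the choice $Y\sim N(2,3^2)$ and the time grid are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# X_t = Y cos(2 pi t) on a grid of times, with Y ~ N(2, 3^2) chosen arbitrarily.
ts = np.linspace(0.0, 1.0, 21)
Y = rng.normal(2.0, 3.0, size=(200_000, 1))
paths = Y * np.cos(2 * np.pi * ts)                 # each row is one sample path

m_hat = paths.mean(axis=0)                         # estimate of m(t)
C_hat = np.cov(paths, rowvar=False)                # estimate of C(t1, t2)

cos_t = np.cos(2 * np.pi * ts)
print(np.abs(m_hat - 2.0 * cos_t).max())                   # m(t) = E(Y) cos(2 pi t)
print(np.abs(C_hat - 9.0 * np.outer(cos_t, cos_t)).max())  # C = Var(Y) cos(2 pi t1) cos(2 pi t2)
```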
Example 6.3.2. For the iid RP (see Definition 6.2.2), \begin{align*} m(t)&=\E(X_t)=\mu\\ C(t_1,t_2) &= \E((X_{t_1}-\mu)(X_{t_2}-\mu))= \begin{cases} 0 & t_1\neq t_2\\ \Var(X_{t_1})=\sigma^2 & t_1=t_2 \end{cases}\\ R(t_1,t_2)&=C(t_1,t_2)+ m(t_1)m(t_2)=\delta_{t_1,t_2}\sigma^2 + \mu^2 \end{align*} where $\mu,\sigma^2$ are the expectation and variance associated with the cdf $F$, and $\delta_{ij}=1$ if $i=j$ and 0 otherwise. The case $t_1\neq t_2$ follows since independence implies that the expectation factors: $\E((X_{t_1}-\mu)(X_{t_2}-\mu))=\E(X_{t_1}-\mu)\,\E(X_{t_2}-\mu)=0$.
Example 6.3.3. For the iid RP with $F=\text{Ber}(\theta)$ (see Chapter 3), we have \begin{align*} m(t)&=\theta\\ v(t) &= \theta(1-\theta)\\ C(t_1,t_2)&=\delta_{t_1,t_2}\theta(1-\theta),\\ R(t_1,t_2) &=\delta_{t_1,t_2}\theta(1-\theta)+\theta^2. \end{align*}
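The following sketch verifies these formulas by simulation (an illustration only; $\theta=0.3$ and the grid sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

theta, n_paths, T = 0.3, 200_000, 10
X = rng.binomial(1, theta, size=(n_paths, T)).astype(float)  # iid Ber(theta) process

m_hat = X.mean(axis=0)             # ~ theta at every t
v_hat = X.var(axis=0)              # ~ theta (1 - theta) at every t
C_hat = np.cov(X, rowvar=False)    # ~ theta (1 - theta) on the diagonal, 0 elsewhere
R_hat = X.T @ X / n_paths          # ~ delta_{t1,t2} theta (1 - theta) + theta^2

print(np.abs(C_hat - theta * (1 - theta) * np.eye(T)).max())
print(np.abs(R_hat - theta * (1 - theta) * np.eye(T) - theta**2).max())
```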

Recall that given a random vector $\bb X$ we can define a new random vector $\bb Y$ that is a function of it. The same construction applies to random processes.

Example 6.3.4. Consider the iid process $\mathcal{X}=\{X_t:t\in J\}$ with $F=\text{Ber}(\theta)$ and define the iid process $\mathcal{Y}=\{Y_t:t\in J\}$, $Y_t=2X_t-1$. Each RV $Y_t$ takes on the value 1 with probability $\theta$ and the value $-1$ with probability $1-\theta$, resulting in \begin{align*} m(t) &= \E(2X_t-1)=2\theta-1\\ \Var(Y_t)&=\Var(2X_t-1)=4\Var(X_t)=4\theta(1-\theta)\\ C(t_1,t_2)&=\delta_{t_1,t_2}4\theta(1-\theta)\\ R(t_1,t_2)&=\delta_{t_1,t_2}4\theta(1-\theta)+(2\theta-1)^2. \end{align*}
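The transformed process can be checked the same way as the previous example (again a sketch with an arbitrary $\theta$):

```python
import numpy as np

rng = np.random.default_rng(3)

theta, n_paths, T = 0.3, 200_000, 10
X = rng.binomial(1, theta, size=(n_paths, T)).astype(float)
Y = 2 * X - 1                      # Y_t takes the values +1 and -1

m_hat = Y.mean(axis=0)             # ~ 2 theta - 1
C_hat = np.cov(Y, rowvar=False)    # ~ 4 theta (1 - theta) on the diagonal, 0 elsewhere

print(np.abs(m_hat - (2 * theta - 1)).max())
print(np.abs(C_hat - 4 * theta * (1 - theta) * np.eye(T)).max())
```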
Definition 6.3.6. Two processes $\mathcal{X},\mathcal{Y}$ are independent if for all $k,l\in\mathbb{N}$ and for all $t_1,\ldots,t_k$ and $t_1',\ldots,t_l'$ \begin{multline*} F_{X_{t_1},\ldots,X_{t_k},Y_{t_1'},\ldots,Y_{t_l'}}(r_1,\ldots,r_k,s_1,\ldots,s_l) = F_{X_{t_1},\ldots,X_{t_k}}(r_1,\ldots,r_k) F_{Y_{t_1'},\ldots,Y_{t_l'}}(s_1,\ldots,s_l). \end{multline*}
Definition 6.3.7. The cross-correlation of the processes $\mathcal{X},\mathcal{Y}$ is $R_{\mathcal{X},\mathcal{Y}}(t_1,t_2)=\E(X_{t_1}Y_{t_2})$. If it is always zero, the two processes are orthogonal. The cross-covariance of the two processes is \[C_{\mathcal{X},\mathcal{Y}}(t_1,t_2)=\E((X_{t_1}-m_\mathcal{X}(t_1))(Y_{t_2}-m_\mathcal{Y}(t_2))).\] If it is always zero, the two processes are uncorrelated.
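As an illustration, the sketch below estimates the cross-covariance of two independent iid processes (close to zero everywhere, so the pair is uncorrelated) and of a dependent pair (all process choices here are mine, for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

n_paths, T = 200_000, 8
X = rng.standard_normal((n_paths, T))              # one iid N(0, 1) process
Y = rng.standard_normal((n_paths, T))              # an independent iid process
Z = X + 0.5 * rng.standard_normal((n_paths, T))    # a process dependent on X

def cross_cov(A, B):
    # C_{X,Y}(t1, t2) = E[(X_{t1} - m_X(t1)) (Y_{t2} - m_Y(t2))]
    return (A - A.mean(axis=0)).T @ (B - B.mean(axis=0)) / len(A)

print(np.abs(cross_cov(X, Y)).max())       # ~ 0 everywhere: X, Y uncorrelated
print(np.diag(cross_cov(X, Z)).round(2))   # ~ 1 on the diagonal: X, Z are not
```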
Definition 6.3.8. An RP $\mathcal{X}$ with $J=\R$ or $J=\mathbb{N}$ is wide sense stationary (WSS) if its mean function $m(t)$ is constant and its autocorrelation $R(t,s)$ depends on $t,s$ only through the difference $s-t$; in other words, $R(t,s)=R(t+\tau,s+\tau)$ for all $\tau$. In this case, we can characterize the autocorrelation function using a single-argument function $R:\R\to\R$: \[R(\tau)\defeq R(t,t+\tau).\]
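As a concrete example (my choice, not from the text), the moving-average process $X_t=Z_t+0.5\,Z_{t-1}$ with $Z_t$ iid $N(0,1)$ is WSS: $m(t)=0$, $R(\tau)=1.25$ for $\tau=0$, $R(\tau)=0.5$ for $|\tau|=1$, and $R(\tau)=0$ otherwise. The sketch below estimates $R(t,t+\tau)$ at several $t$ and confirms that each diagonal of the estimated autocorrelation matrix is approximately constant, as WSS requires.

```python
import numpy as np

rng = np.random.default_rng(5)

# Moving-average process X_t = Z_t + 0.5 Z_{t-1} with Z_t iid N(0, 1): WSS with
# m(t) = 0, R(0) = 1.25, R(tau) = 0.5 for |tau| = 1, and 0 for |tau| > 1.
n_paths, T = 200_000, 12
Z = rng.standard_normal((n_paths, T + 1))
X = Z[:, 1:] + 0.5 * Z[:, :-1]

R_hat = X.T @ X / n_paths          # estimate of R(t, s) on the time grid
for tau in range(3):
    # each diagonal of R_hat is (approximately) constant in t
    print(tau, np.diag(R_hat, k=tau)[:4].round(2))
```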

In the case of WSS processes, the $R(\cdot)$ function satisfies the following properties; a numerical check appears after the list.

  1. We have $R(0) = \E(X_t^2) \geq 0$ for all $t$, implying that the second moment function is non-negative and constant in $t$.
  2. The function $R(\cdot)$ is even: \[R(\tau) = \E(X_{t}X_{t+\tau}) = \E(X_{t+\tau}X_t)=R(-\tau).\]
  3. Using the Cauchy-Schwarz inequality (Proposition B.4.1) for the inner product $g(h_1,h_2)=\E(h_1h_2)$, we have \[(R(\tau))^2 = (\E(X_{t+\tau}X_t))^2\leq \E(X_{t+\tau}^2)\,\E(X_t^2) = (R(0))^2,\] implying that $|R(\tau)|\leq R(0)$, so $R(\cdot)$ attains its maximum at $\tau=0$.
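The following sketch checks the three properties numerically for the same moving-average process used above (again an illustrative choice of process):

```python
import numpy as np

rng = np.random.default_rng(6)

# The same WSS moving-average process: X_t = Z_t + 0.5 Z_{t-1}, Z_t iid N(0, 1).
n_paths, T = 200_000, 12
Z = rng.standard_normal((n_paths, T + 1))
X = Z[:, 1:] + 0.5 * Z[:, :-1]

def R(tau):
    # Estimate R(tau) = E(X_t X_{t+tau}), averaging over paths and valid t.
    a, b = max(0, -tau), min(T, T - tau)
    return (X[:, a:b] * X[:, a + tau:b + tau]).mean()

print(R(0) >= 0)                                        # property 1: R(0) >= 0
print(abs(R(2) - R(-2)) < 1e-12)                        # property 2: R is even
print(all(abs(R(tau)) <= R(0) for tau in range(1, 6)))  # property 3: maximum at tau = 0
```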