Probability
The Analysis of Data, volume 1
Random Vectors: Moment Generating Function
$
\def\P{\mathsf{\sf P}}
\def\E{\mathsf{\sf E}}
\def\Var{\mathsf{\sf Var}}
\def\Cov{\mathsf{\sf Cov}}
\def\std{\mathsf{\sf std}}
\def\Cor{\mathsf{\sf Cor}}
\def\R{\mathbb{R}}
\def\c{\,|\,}
\def\bb{\boldsymbol}
\def\diag{\mathsf{\sf diag}}
\def\defeq{\stackrel{\tiny\text{def}}{=}}
$
4.8. Moment Generating Function
Proposition 4.8.1.
If $X_1,\ldots,X_n$ are independent RVs with mgfs $m_{X_1},\ldots,m_{X_n}$, then the mgf of their sum is
\[ m_{\sum_{i=1}^n X_i} (t) = \prod_{i=1}^n m_{X_i}(t).\]
Proof.
\begin{align*}
m_{\sum_{i=1}^n X_i} (t)
&= \E\left(\exp\left( t\sum_{i=1}^n X_i\right)\right) \\
&= \E\left(\prod_{i=1}^n \exp(tX_i)\right) \\
&=\prod_{i=1}^n \E(\exp(tX_i))\\
&=\prod_{i=1}^n m_{X_i}(t).
\end{align*}
The third equality above follows from the independence of $X_1,\ldots,X_n$, which implies the independence of $\exp(tX_1),\ldots,\exp(tX_n)$.
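For example, if $X_1,\ldots,X_n$ are independent RVs and each $X_i$ has a Poisson distribution with parameter $\lambda_i$, then $m_{X_i}(t)=\exp(\lambda_i(e^t-1))$, and by the proposition above
\[ m_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n \exp\left(\lambda_i(e^t-1)\right) = \exp\left(\left(\sum_{i=1}^n \lambda_i\right)(e^t-1)\right),\]
which is the mgf of a Poisson RV with parameter $\sum_{i=1}^n \lambda_i$. Since the mgf characterizes the distribution (Proposition 2.4.2), the sum of independent Poisson RVs is itself a Poisson RV whose parameter is the sum of the individual parameters.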
Definition 4.8.1.
We define the moment generating function of an $n$-dimensional random vector $\bb{X}$ as follows:
\[ m_{\bb{X}}:\mathbb{R}^n\to \mathbb{R}, \qquad m_{\bb{X}}(\bb{t}) = \E(\exp(\bb{t}^{\top}\bb{X})). \]
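For example, if the components $X_1,\ldots,X_n$ of $\bb{X}$ are independent RVs, the mgf of the random vector factors into a product of the one-dimensional mgfs of its components:
\[ m_{\bb{X}}(\bb{t}) = \E\left(\prod_{i=1}^n \exp(t_iX_i)\right) = \prod_{i=1}^n \E(\exp(t_iX_i)) = \prod_{i=1}^n m_{X_i}(t_i). \]
Setting $t_1=\cdots=t_n=t$ recovers Proposition 4.8.1, since $m_{\bb{X}}(t,\ldots,t)=\E(\exp(t\sum_{i=1}^n X_i))=m_{\sum_{i=1}^n X_i}(t)$.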
As in the one-dimensional case, the mgf uniquely characterizes the distribution of the random vector.
Proposition 4.8.2.
Suppose that $\bb{X}$, $\bb{Y}$ are two random vectors whose mgfs $m_{\bb{X}}(\bb{t})$, $m_{\bb{Y}}(\bb{t})$ exist for all $\bb{t}\in B_{\epsilon}(\bb 0)$ for some $\epsilon>0$.
If $m_{\bb{X}}(\bb{t})=m_{\bb{Y}}(\bb{t})$ for all $\bb{t}\in B_{\epsilon}(\bb 0)$, then the cdfs of $\bb X$ and $\bb Y$ are identical (and consequently so are the pdfs or pmfs).
Proof.
The proof is similar to the proof of the one-dimensional case (Proposition 2.4.2).
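For example, combining Proposition 4.8.2 with the factorization above shows that independence can be read off the mgf: assuming the mgfs exist for all $\bb t\in B_{\epsilon}(\bb 0)$, the components of $\bb X$ are independent if and only if
\[ m_{\bb{X}}(\bb{t}) = \prod_{i=1}^n m_{X_i}(t_i) \qquad \text{for all } \bb{t}\in B_{\epsilon}(\bb 0). \]
Indeed, if the mgf factors in this way, then $\bb X$ has the same mgf as a random vector with independent components whose marginal distributions agree with those of $X_1,\ldots,X_n$, and by Proposition 4.8.2 the two random vectors have the same distribution.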