## Equivalent Definitions of Multivariate Normal Distribution

In the following, we give three definitions of the multivariate normal distribution and show that they are equivalent to each other.
The explicit and transparent definition.\\
**Definition 1** A random vector $X$ has a non-degenerate (multivariate) normal distribution if it has a joint density function of the form
$$
f_X(x)=\frac{1}{\sqrt{(2\pi)^n \det(Q)}}\,\exp\Big\{-\frac{1}{2} (x-\mu)^t Q^{-1} (x-\mu)\Big\}
$$
for some real vector $\mu$ and some positive definite matrix $Q$.
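As a sanity check, when $n=1$ and $Q=\sigma^2>0$, the density above reduces to the familiar one-dimensional normal density:
$$
f_X(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\Big\{-\frac{(x-\mu)^2}{2\sigma^2}\Big\}.
$$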
The constructive definition.\\
**Definition 2** A random vector $X$ has a (multivariate) normal distribution
if it can be expressed in the form
$$
X = UY + \mu,
$$
for some matrix $U$ and some real vector $\mu$, where $Y$ is a random vector
whose components are independent $N(0, 1)$ random variables.
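The constructive definition can be checked empirically. In the following sketch, the matrix $U$ and vector $\mu$ are arbitrary illustrative choices (not values from the text); we sample $Y$ with independent $N(0,1)$ components and verify that $X = UY + \mu$ has mean close to $\mu$ and covariance close to $UU'$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical choices of U and mu; Definition 2 allows any matrix and vector.
U = np.array([[2.0, 0.0],
              [1.0, 1.0]])
mu = np.array([1.0, -1.0])

# Y: rows of independent N(0, 1) components.
n_samples = 200_000
Y = rng.standard_normal((n_samples, 2))

# Construct X = U Y + mu (row-wise, hence the transpose).
X = Y @ U.T + mu

# The sample mean should be close to mu, the sample covariance close to U U'.
print(np.round(X.mean(axis=0), 2))
print(np.round(np.cov(X, rowvar=False), 2))
```

Here $UU' = \begin{pmatrix}4&2\\2&2\end{pmatrix}$, so the empirical covariance should be near that matrix.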
The elegant definition.\\
**Definition 3** A random vector $X$ has a (multivariate) normal distribution
if for every real vector $a$, the random variable $a' X$ is normal.
//Proofs of the equivalence// First, by the definition of the density function and a change of variables, we see that **Definition 2** (with $U$ non-singular) is equivalent to **Definition 1**.
It is also easy to see that **Definition 2** implies **Definition 3**, so it remains to show that **Definition 3** implies **Definition 2**.
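To spell out why **Definition 2** implies **Definition 3**: if $X = UY + \mu$, then for every real vector $a$,
$$
a'X = (U'a)'Y + a'\mu,
$$
which is a linear combination of independent $N(0,1)$ random variables plus a constant, hence normal.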
Let $E(X_i)=\mu_i$ and $\text{Cov}(X_i,X_j)=B_{ij}$; then by direct computation, we find that
$$
(1)\hspace{30pt}E(a'X)=a'\mu,\hspace{10pt} \text{Cov}(a'X)=\text{Cov}(a'X, a'X)=a'Ba,
$$
where $\mu=(\mu_1,\cdots,\mu_n)'$ and $B=(B_{ij})$ is a symmetric matrix. Since $\text{Cov}(a'X)$ is always nonnegative, we see from $(1)$ that $B$ is positive semidefinite; assuming $X$ is non-degenerate, $B$ is in fact positive definite. Now write $B=U^{-1}(U^{-1})'$, where $U$ is a non-singular matrix (for instance, the inverse of the Cholesky factor of $B$), and let $Y = U(X-\mu)$; then $E(Y)=0$ and
$$
(2)\hspace{20pt}\text{Cov}(Y_i,Y_j)=\text{Cov}(U_{il}(X_l-\mu_l), U_{jk}(X_k-\mu_k))=\text{Cov}(U_{il}X_l, U_{jk}X_k)=U_{il}B_{lk}U_{jk}=\delta_{ij},
$$
where summation over $l,k$ is implicit. From (2), we conclude that $Y$ is a random vector whose components are $N(0,1)$ random variables with pairwise covariance $0$; moreover, since $a'Y=(U'a)'(X-\mu)$, the random variable $a'Y$ is also normal for every $a$. We need the following result to complete the proof.
**Lemma 1** If $Y_1, Y_2$ are both $N(0,1)$ random variables with covariance $0$, and $a_1Y_1+a_2Y_2$ is normal for all real $a_1,a_2$, then $Y_1$ and $Y_2$ are independent.
By **Lemma 1**, we can conclude that $Y$ is a random vector
whose components are independent $N(0, 1)$ random variables. $\square$
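The whitening step in the proof can also be checked numerically. In this sketch, the positive definite matrix $B$ and the vector $\mu$ are arbitrary illustrative choices; $U$ is taken as the inverse of the Cholesky factor of $B$, so that $B=U^{-1}(U^{-1})'$, and we verify that $Y=U(X-\mu)$ has covariance close to the identity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical positive definite covariance B and mean mu for the check.
B = np.array([[2.0, 0.5],
              [0.5, 1.0]])
mu = np.array([3.0, -2.0])

# Factor B = L L' (Cholesky); then U = L^{-1} satisfies B = U^{-1} (U^{-1})'.
L = np.linalg.cholesky(B)
U = np.linalg.inv(L)

# Sample X with covariance B, then whiten: Y = U (X - mu), applied row-wise.
X = rng.multivariate_normal(mu, B, size=200_000)
Y = (X - mu) @ U.T

# Cov(Y) should be close to the identity matrix, as computed in (2).
print(np.round(np.cov(Y, rowvar=False), 2))
```

Exactly, $UBU' = L^{-1}(LL')(L^{-1})' = I$, so only sampling error separates the printed matrix from the identity.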
**Remark 1** The proof of **Lemma 1** can be carried out using characteristic functions.
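A sketch of that argument: since $t_1Y_1+t_2Y_2$ is normal with mean $0$ and variance $t_1^2+t_2^2$ (the covariance term vanishes), the joint characteristic function factors:
$$
\varphi_{Y_1,Y_2}(t_1,t_2)=E\,e^{i(t_1Y_1+t_2Y_2)}=e^{-\frac{1}{2}(t_1^2+t_2^2)}=\varphi_{Y_1}(t_1)\,\varphi_{Y_2}(t_2),
$$
which is exactly the characteristic-function criterion for independence. The same computation with $n$ components yields the joint independence needed in the proof above.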
=== References
#Fundamentals of Probability - MIT OpenCourseWare, [[http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-436j-fundamentals-of-probability-fall-2008/lecture-notes/MIT6_436JF08_lec15.pdf|MULTIVARIATE NORMAL DISTRIBUTIONS]]
#Jin-Ting Zhang, Lecture Notes, [[http://www.stat.nus.edu.sg/~zhangjt/teaching/ST4233/Lecture/ch3.pdf|Linear Models - Multivariate Normal Distribution]]