
Expected Value


This section covers the formulas for calculating the expectation of a random variable, the variance of a random variable, and the covariance between two random variables. The expectation of a random variable, if it exists, is the mean of that random variable. The formulas in this section are especially helpful for courses concerning finance and investing.

1. General Case

\[E(X)=\mu_x \]


\[Var(X)=E \left [ (X-\mu_x)^2 \right ]=\sigma^2_x \]


\[Cov(X,Y)=E \left [ (X-\mu_x)(Y-\mu_y) \right ]=\sigma_{xy} \]

2. Continuous Case

\[E(X) = \int_{-\infty}^{\infty}{xf(x)dx}=\mu \]


\[Var(X) = \int_{-\infty}^{\infty}{(x-\mu)^2f(x)dx} = \int_{-\infty}^{\infty}{x^2f(x)dx}-\mu^2=\sigma^2 \]


\[Cov(X,Y)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}{(x-\mu_x)(y-\mu_y)f(x,y)dxdy}=\sigma_{xy}\]
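The continuous formulas can be sketched numerically. The block below is a minimal check for one familiar density, the exponential with rate $\lambda=2$ (not from the text; chosen because its mean $1/\lambda$ and variance $1/\lambda^2$ are known), approximating the improper integrals with a midpoint Riemann sum.

```python
import math

# Exponential density with rate lam = 2, so E(X) = 0.5 and Var(X) = 0.25.
# The improper integral over [0, infinity) is approximated by a midpoint
# Riemann sum on [0, 20]; the tail beyond 20 is negligible for this density.
lam = 2.0

def f(x):
    """Exponential density f(x) = lam * exp(-lam * x) for x >= 0."""
    return lam * math.exp(-lam * x)

dx = 1e-4
grid = [(i + 0.5) * dx for i in range(int(20 / dx))]

mean = sum(x * f(x) * dx for x in grid)                 # E(X) = integral of x f(x) dx
var = sum((x - mean) ** 2 * f(x) * dx for x in grid)    # Var(X) = integral of (x - mu)^2 f(x) dx
print(mean, var)
```

The same pattern works for any density with a thin enough tail; only the truncation point and step size need adjusting.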

3. Discrete Case

The following sections cover only the discrete case, to build a deeper understanding of the formulas presented below. For discrete random variables, summation is required to solve for the expectation, variance, and covariance; for continuous random variables, integration is required, which is beyond the scope of the worked examples in this text.

\begin{equation}
\label{eofx}
E(X)=\sum \pi_ix_i=\mu_x
\end{equation}

\[Var(X)=\sum \pi_i[(x_i-\mu_x)^2]=\sum \pi_ix^2_i-\left(\sum \pi_ix_i\right)^2= \sigma^2_x\]


\[Cov(X,Y)=\sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)]=\sum \pi_ix_iy_i-\left( \sum \pi_ix_i \right)\left( \sum \pi_iy_i \right) \]


\[Cov(X,Y)=\sigma_{xy} \]

where $\pi_i$ is the probability of outcome $i$ and $x_i$ is the outcome.
For example, $i$ could index the state of the economy: recession, stable economy, or expanding economy. Then $\pi_i$ would be the given probability of each state and $x_i$ the profit or loss of a mutual fund in that state. Note: $\sum_{i} \pi_i=1$.
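The discrete formulas can be checked directly in Python. The sketch below uses a hypothetical three-state economy (recession, stable, expanding); all probabilities and returns are invented for illustration.

```python
# Hypothetical three-state economy: probabilities and fund returns invented.
pi = [0.2, 0.5, 0.3]     # P(recession), P(stable), P(expanding)
x  = [-8.0, 3.0, 10.0]   # fund X's return in each state
y  = [-4.0, 2.0, 7.0]    # fund Y's return in each state

mu_x = sum(p * xi for p, xi in zip(pi, x))   # E(X)
mu_y = sum(p * yi for p, yi in zip(pi, y))   # E(Y)

# Var(X) via the definition, and via the shortcut E(X^2) - mu_x^2.
var_x = sum(p * (xi - mu_x) ** 2 for p, xi in zip(pi, x))
var_x_alt = sum(p * xi ** 2 for p, xi in zip(pi, x)) - mu_x ** 2

# Cov(X, Y) via the definition.
cov_xy = sum(p * (xi - mu_x) * (yi - mu_y) for p, xi, yi in zip(pi, x, y))

print(mu_x, var_x, var_x_alt, cov_xy)
```

Both variance expressions agree, which is the discrete analogue of the two equivalent forms shown in the continuous case.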

4. Expectation and Mean

Let $c$ be a constant, for example let $c=5$.

\[E[X]=\sum \pi_i(x_i)=\mu_x\]


\[E[cX]=\sum \pi_i(cx_i)=c\sum \pi_i(x_i)=c\mu_x\]


\[E[5X]=\sum \pi_i(5(x_i))=5\sum \pi_i(x_i)=5\mu_x\]


\[E[X+c]=\sum \pi_i(x_i+c)=\sum \pi_i(x_i)+\sum \pi_i(c)=\mu_x+c\sum \pi_i=\mu_x+c\]


\[E[X+5]=\sum \pi_i(x_i+5)=\sum \pi_i(x_i)+\sum \pi_i(5)=\mu_x+5\sum \pi_i=\mu_x+5\]


If a term in a summation over $i$ does not change with $i$, for example a constant $c$ such as $5$, it can be moved outside of the summation, as was done above.
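These two properties, $E[cX]=c\mu_x$ and $E[X+c]=\mu_x+c$, can be verified numerically. The distribution below is invented for illustration.

```python
# Check E[cX] = c * mu_x and E[X + c] = mu_x + c on a made-up distribution.
pi = [0.2, 0.5, 0.3]
x  = [-8.0, 3.0, 10.0]
c  = 5

def E(vals):
    """Discrete expectation: sum of pi_i * v_i."""
    return sum(p * v for p, v in zip(pi, vals))

mu_x      = E(x)
e_scaled  = E([c * xi for xi in x])    # E[cX]
e_shifted = E([xi + c for xi in x])    # E[X + c]
print(mu_x, e_scaled, e_shifted)
```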

5. Expectation and Variance

Let $c$ be a constant, for example let $c=5$.

\[Var(X)=\sum \pi_i[(x_i-\mu_x)^2]=\sigma^2_x\]


\begin{equation*}
Var(cX)=\sum \pi_i[(cx_i-c\mu_x)^2]=\sum \pi_i[c(x_i-\mu_x)]^2=c^2\sum \pi_i[(x_i-\mu_x)^2]=c^2\sigma^2_x
\end{equation*}
\begin{eqnarray*}
Var(5X)&=&\sum \pi_i[(5x_i-5\mu_x)^2]=\sum \pi_i[5(x_i-\mu_x)]^2 \\
&=&5^2\sum \pi_i[(x_i-\mu_x)^2]=5^2\sigma^2_x=25\sigma^2_x
\end{eqnarray*}

\begin{eqnarray*}
Var(X+c)&=&\sum \pi_i[((x_i+c)-(\mu_x+c))^2]=\sum \pi_i[(x_i+c-\mu_x-c)^2] \\
&=&\sum \pi_i[((x_i)-\mu_x)^2]=\sigma^2_x
\end{eqnarray*}
\begin{eqnarray*}
Var(X+5)&=&\sum \pi_i[((x_i+5)-(\mu_x+5))^2]=\sum \pi_i[(x_i+5-\mu_x-5)^2] \\
&=&\sum \pi_i[((x_i)-\mu_x)^2]=\sigma^2_x
\end{eqnarray*}
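The scaling and shifting rules for the variance can also be confirmed numerically, again on an invented distribution.

```python
# Check Var(cX) = c^2 * Var(X) and Var(X + c) = Var(X) on made-up data.
pi = [0.2, 0.5, 0.3]
x  = [-8.0, 3.0, 10.0]
c  = 5

def var(vals):
    """Discrete variance: sum of pi_i * (v_i - mean)^2."""
    mean = sum(p * v for p, v in zip(pi, vals))
    return sum(p * (v - mean) ** 2 for p, v in zip(pi, vals))

var_x       = var(x)
var_scaled  = var([c * xi for xi in x])    # Var(cX)
var_shifted = var([xi + c for xi in x])    # Var(X + c)
print(var_x, var_scaled, var_shifted)
```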

6. Expectation and Covariance

Let $c_1$ be a constant, for example let $c_1=5$ and $c_2$ be another constant, for example let $c_2=7$.

\[Cov(X,Y)=\sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)] \]


\begin{eqnarray*}
Cov(c_1X,c_2Y)&=&\sum \pi_i[(c_1x_i-c_1\mu_x)(c_2y_i-c_2\mu_y)] \\
&=&c_1 c_2\sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)]=c_1 c_2Cov(X,Y)
\end{eqnarray*}
\begin{eqnarray*}
Cov(5X,7Y)&=&\sum \pi_i[(5x_i-5\mu_x)(7y_i-7\mu_y)] \\
&=&5 \cdot 7\sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)]=35 Cov(X,Y)
\end{eqnarray*}

\begin{eqnarray*}
Cov(X+c_1,Y+c_2)&=&\sum \pi_i[\left( (x_i+c_1)-(\mu_x+c_1) \right) \left( (y_i+c_2)-(\mu_y+c_2) \right) ] \\
&=& \sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)] =Cov(X,Y)
\end{eqnarray*}

\begin{eqnarray*}
Cov(X+5,Y+7)&=&\sum \pi_i[\left( (x_i+5)-(\mu_x+5) \right) \left( (y_i+7)-(\mu_y+7) \right) ] \\
&=&\sum \pi_i[(x_i-\mu_x)(y_i-\mu_y)] =Cov(X,Y)
\end{eqnarray*}

The added constants cancel each other out, just as in the variance formula.
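The same kind of numerical check works for the covariance rules; probabilities and returns below are invented.

```python
# Check Cov(c1*X, c2*Y) = c1 * c2 * Cov(X, Y) and that adding constants
# leaves the covariance unchanged. Distribution values are made up.
pi = [0.2, 0.5, 0.3]
x  = [-8.0, 3.0, 10.0]
y  = [-4.0, 2.0, 7.0]
c1, c2 = 5, 7

def cov(xs, ys):
    """Discrete covariance: sum of pi_i * (x_i - mu_x) * (y_i - mu_y)."""
    mx = sum(p * v for p, v in zip(pi, xs))
    my = sum(p * v for p, v in zip(pi, ys))
    return sum(p * (a - mx) * (b - my) for p, a, b in zip(pi, xs, ys))

cov_xy      = cov(x, y)
cov_scaled  = cov([c1 * a for a in x], [c2 * b for b in y])   # Cov(c1 X, c2 Y)
cov_shifted = cov([a + c1 for a in x], [b + c2 for b in y])   # Cov(X+c1, Y+c2)
print(cov_xy, cov_scaled, cov_shifted)
```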

7. Expectation and Variance of a Weighted Sum

Let $Z=X+Y$ so:

\[E[Z]=\mu_z\]


by definition, but it can be broken into its components $X$ and $Y$.

\[E[Z]=E[X+Y]=\sum \pi_i(x_i+y_i)=\sum \pi_i(x_i)+\sum \pi_i(y_i)=\mu_x+\mu_y \]

Recall:

\[ (a+b)^2=a(a+b)+b(a+b)=a^2+2ab+b^2\]


We will need this to understand $Var(c_1X+c_2Y)$.
Again let $Z=X+Y$:

\[Var(Z)=E[(Z-\mu_z)^2] =\sigma^2_z\]


but often you can only solve for it through the components $X$ and $Y$.

\[Var(Z)=E[(Z-\mu_z)^2]=E[((X+Y)-(\mu_x+\mu_y))^2]= \]


Think of $(X+Y)$ as $a$ and $(\mu_x+\mu_y)$ as $b$.

\[ E[(X+Y)^2-2(X+Y)(\mu_x+\mu_y)+(\mu_x+\mu_y)^2]=\]


Again using $(a+b)^2=a^2+2ab+b^2$

\[E[X^2+2XY+Y^2-2X\mu_x-2X\mu_y-2Y\mu_x-2Y\mu_y+\mu_x^2+2\mu_x\mu_y+\mu_y^2]=\]


Yes, it is a big mess, but the terms can be regrouped to obtain the following:
\begin{eqnarray*}
&=&E \left[ (X^2-2X\mu_x+\mu_x^2)+(Y^2-2Y\mu_y+\mu_y^2)+2(XY-X\mu_y-Y\mu_x+\mu_x\mu_y) \right ] \\
&=&E \left [ (X-\mu_x)^2 \quad + \quad (Y-\mu_y)^2 \quad + \quad 2(X-\mu_x)(Y-\mu_y) \right ] \\
&=&E[(X-\mu_x)^2] \quad + \quad E[(Y-\mu_y)^2] \quad + \quad 2E[(X-\mu_x)(Y-\mu_y)] \\
&=&Var(X)+Var(Y)+2Cov(X,Y)=Var(Z)
\end{eqnarray*}

From these concepts:
\begin{eqnarray*}
Var(c_1X+c_2Y)&=&Var(c_1X)+Var(c_2Y)+2Cov(c_1X,c_2Y) \\
&=&c_1^2Var(X)+c_2^2Var(Y)+2c_1c_2Cov(X,Y)
\end{eqnarray*}

If we add a constant $c_3$, the variance does not change, as the next formula shows:
\begin{eqnarray*}
Var(c_1X+c_2Y+c_3)&=&Var(c_1X)+Var(c_2Y)+2Cov(c_1X,c_2Y) \\
&=&c_1^2Var(X)+c_2^2Var(Y)+2c_1c_2Cov(X,Y)
\end{eqnarray*}
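The weighted-sum formula, including the fact that $c_3$ drops out, can be verified numerically on an invented joint distribution where $X$ and $Y$ share the same outcomes $i$.

```python
# Check Var(c1*X + c2*Y + c3) = c1^2*Var(X) + c2^2*Var(Y) + 2*c1*c2*Cov(X,Y),
# and that the constant c3 does not affect the variance. Values are made up.
pi = [0.2, 0.5, 0.3]
x  = [-8.0, 3.0, 10.0]
y  = [-4.0, 2.0, 7.0]
c1, c2, c3 = 5, 7, 11

# Outcomes of the weighted sum Z = c1*X + c2*Y + c3, state by state.
w = [c1 * a + c2 * b + c3 for a, b in zip(x, y)]

def mean(vals):
    return sum(p * v for p, v in zip(pi, vals))

def var(vals):
    m = mean(vals)
    return sum(p * (v - m) ** 2 for p, v in zip(pi, vals))

cov_xy = sum(p * (a - mean(x)) * (b - mean(y)) for p, a, b in zip(pi, x, y))
lhs = var(w)
rhs = c1 ** 2 * var(x) + c2 ** 2 * var(y) + 2 * c1 * c2 * cov_xy
print(lhs, rhs)
```

Setting $c_1=c_2=1$ and $c_3=0$ recovers the plain $Var(X+Y)$ decomposition derived above.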

Note: $Cov(X,X)=Var(X)$, and the correlation is $\rho=\frac{Cov(X,Y)}{\sigma_x \sigma_y}$.

8. Expectation and Variance of an Average of Independent and Identically Distributed Random Variables

Let $X_1, X_2, \ldots, X_n$ be $n$ independent and identically distributed (i.i.d.) random variables with a mean of $\mu$ and a variance of $\sigma^2$. Let the average of the i.i.d. random variables be denoted

\[ \bar{X}=\frac{1}{n}\sum_{i=1}^{n}X_i. \]


Then the expectation of the average is
\begin{equation}
\label{eofxbar}
E[\bar{X}]=E\left [ \frac{1}{n}\sum_{i=1}^{n}X_i \right ]=\frac{1}{n}\sum_{i=1}^{n}E[X_i]=\frac{1}{n}n\mu=\mu,
\end{equation}
and its variance is
\begin{equation}
\label{vofxbar}
Var(\bar{X})=Var\left ( \frac{1}{n}\sum_{i=1}^{n}X_i \right )
=\frac{1}{n^2}\sum_{i=1}^{n}Var(X_i)=\frac{1}{n^2}n\sigma^2=\frac{\sigma^2}{n}.
\end{equation}
Note: Since the random variables are independent the $Cov(X_i,X_j)=0$ for $i \neq j$.
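These two results can be checked exactly for a small $n$ by enumerating every $n$-tuple of outcomes; independence means the joint probability of a tuple is the product of the marginal probabilities. The distribution below is invented for illustration.

```python
from itertools import product

# Exact check of E[Xbar] = mu and Var(Xbar) = sigma^2 / n for n = 3
# i.i.d. copies of a small made-up discrete distribution.
pi = [0.2, 0.5, 0.3]
x  = [-8.0, 3.0, 10.0]
n  = 3

mu     = sum(p * xi for p, xi in zip(pi, x))
sigma2 = sum(p * (xi - mu) ** 2 for p, xi in zip(pi, x))

# Enumerate all n-tuples; by independence the joint probability is
# the product of the marginal probabilities.
e_bar = 0.0
e_bar2 = 0.0
for combo in product(list(zip(pi, x)), repeat=n):
    prob = 1.0
    xbar = 0.0
    for p, xi in combo:
        prob *= p
        xbar += xi / n
    e_bar  += prob * xbar
    e_bar2 += prob * xbar ** 2

var_bar = e_bar2 - e_bar ** 2   # Var(Xbar) = E[Xbar^2] - E[Xbar]^2
print(e_bar, var_bar)
```

The variance of the average shrinks by a factor of $n$, which is the key fact behind averaging i.i.d. observations.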
