# Variance and standard deviation for joint random variables

The variance is the second central moment: the second moment minus the square of the first moment. For a function $g(X,Y)$ of two joint random variables it equals:

$$Var[g(X,Y)] = E[g(X,Y)^2] - (E[g(X,Y)])^2$$

The standard deviation of joint random variables is simply the square root of the variance. This is:

$$\sigma_{g(X,Y)} = \sqrt{E[g(X,Y)^2] - (E[g(X,Y)])^2}$$
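As a concrete illustration, the variance and standard deviation of $g(X,Y)$ can be computed directly from a joint pmf table. The pmf values and the choice $g(x,y) = x + y$ below are hypothetical, just to make the sketch runnable:

```python
# Hypothetical joint pmf f(x, y) for X, Y in {0, 1, 2}; probabilities sum to 1.
f = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.05,
    (1, 0): 0.10, (1, 1): 0.30, (1, 2): 0.10,
    (2, 0): 0.05, (2, 1): 0.10, (2, 2): 0.15,
}

def g(x, y):
    # Example function of the joint variables: their sum (an assumption).
    return x + y

# First moment E[g(X,Y)] and second moment E[g(X,Y)^2].
e_g = sum(g(x, y) * p for (x, y), p in f.items())
e_g2 = sum(g(x, y) ** 2 * p for (x, y), p in f.items())

var_g = e_g2 - e_g ** 2   # Var[g(X,Y)] = E[g^2] - (E[g])^2
sd_g = var_g ** 0.5       # standard deviation is the square root of the variance
```

With these numbers, $E[g(X,Y)] = 2.15$, $E[g(X,Y)^2] = 5.95$, so the variance is $5.95 - 2.15^2 = 1.3275$.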

# Variance and standard deviation for marginal random variables

The variance of a marginal random variable is calculated the same way as we’ve seen so far, and the same way as in the univariate case; the expectations are simply taken with respect to the marginal pmf $f_X(x)$. The standard deviation is, again, the square root of the variance.

$$Var[X] = E[X^2] - (E[X])^2 = \sum_{x} x^2 f_X(x) - \left(\sum_{x} x f_X(x)\right)^2$$
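The marginal pmf is obtained by summing the joint pmf over the other variable, after which the variance is the usual univariate calculation. A minimal sketch, reusing the same hypothetical joint pmf as above:

```python
# Same hypothetical joint pmf as before (values are assumptions for illustration).
f = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.05,
    (1, 0): 0.10, (1, 1): 0.30, (1, 2): 0.10,
    (2, 0): 0.05, (2, 1): 0.10, (2, 2): 0.15,
}

# Marginal pmf of X: f_X(x) = sum over y of f(x, y).
f_x = {}
for (x, y), p in f.items():
    f_x[x] = f_x.get(x, 0.0) + p

# Var[X] = E[X^2] - (E[X])^2, with expectations under the marginal pmf.
e_x = sum(x * p for x, p in f_x.items())
e_x2 = sum(x ** 2 * p for x, p in f_x.items())
var_x = e_x2 - e_x ** 2
sd_x = var_x ** 0.5
```

Here $f_X(0) = 0.20$, $f_X(1) = 0.50$, $f_X(2) = 0.30$, giving $E[X] = 1.1$, $E[X^2] = 1.7$, and so $Var[X] = 0.49$ with standard deviation $0.7$.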

# Variance and standard deviation for conditional random variables

The conditional variance of $Y$ given $X = x$ is found with:

$$\sigma_{Y|x}^2 = E\left[(Y - E[Y|x])^2 \mid x\right] = \sum_{y}(y - E[Y|x])^2 \, h(y|x)$$

This simplifies to:

$$\sigma_{Y|x}^2 = E[Y^2|x] - (E[Y|x])^2$$

And the standard deviation is the square root of this value.
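The conditional pmf $h(y|x) = f(x,y)/f_X(x)$ can be built from the joint table for a fixed $x$, and the shortcut formula applied directly. A sketch conditioning on the hypothetical value $x = 1$ from the same assumed joint pmf:

```python
# Same hypothetical joint pmf as in the earlier examples.
f = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.05,
    (1, 0): 0.10, (1, 1): 0.30, (1, 2): 0.10,
    (2, 0): 0.05, (2, 1): 0.10, (2, 2): 0.15,
}

x0 = 1  # condition on X = 1 (an arbitrary choice for illustration)

# f_X(x0), then the conditional pmf h(y | x0) = f(x0, y) / f_X(x0).
f_x0 = sum(p for (x, y), p in f.items() if x == x0)
h = {y: p / f_x0 for (x, y), p in f.items() if x == x0}

# Conditional variance via the shortcut: E[Y^2 | x] - (E[Y | x])^2.
e_y = sum(y * p for y, p in h.items())
e_y2 = sum(y ** 2 * p for y, p in h.items())
var_y_given_x = e_y2 - e_y ** 2
sd_y_given_x = var_y_given_x ** 0.5
```

With these numbers, $h(0|1) = 0.2$, $h(1|1) = 0.6$, $h(2|1) = 0.2$, so $E[Y|1] = 1.0$, $E[Y^2|1] = 1.4$, and $\sigma_{Y|1}^2 = 0.4$.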

**Learning Outcome**

**Topic 3.e: Multivariate Random Variables – Calculate variance, standard deviation for conditional and marginal probability distributions.**