Calculate variance, standard deviation for conditional and marginal probability distributions

Variance and standard deviation for joint random variables

The variance of a function \(g(X, Y)\) of two jointly distributed random variables is its second central moment, obtained from the first two moments as:

$$Var\left[g(X,Y)\right] = E\left[g(X,Y)^2\right] - \left(E\left[g(X,Y)\right]\right)^2$$

The standard deviation is simply the square root of this variance:

$$\sigma_{g(X,Y)} = \sqrt{E\left[g(X,Y)^2\right] - \left(E\left[g(X,Y)\right]\right)^2}$$
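
As an illustration, here is a minimal Python sketch that computes \(Var[g(X,Y)]\) and the corresponding standard deviation directly from the definition above. The joint pmf values and the choice \(g(x, y) = xy\) are hypothetical, used only for demonstration.

```python
import math

# Hypothetical joint pmf f(x, y) for a small discrete example
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def g(x, y):
    # Example choice of g; any function of (X, Y) works
    return x * y

# First and second moments of g(X, Y), as probability-weighted sums
mean_g = sum(g(x, y) * p for (x, y), p in joint_pmf.items())
mean_g2 = sum(g(x, y) ** 2 * p for (x, y), p in joint_pmf.items())

var_g = mean_g2 - mean_g ** 2   # Var[g(X,Y)] = E[g(X,Y)^2] - (E[g(X,Y)])^2
std_g = math.sqrt(var_g)        # standard deviation

print(var_g, std_g)             # 0.24, 0.489...
```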

Variance and standard deviation for marginal random variables

To determine the variance and standard deviation of each random variable that forms part of a multivariate distribution, we first determine its marginal distribution function. Once this has been established, the variance and standard deviation are computed just as in the univariate case, with the standard deviation being the square root of the variance. For example, suppose we have a random variable \(X\) with marginal probability function \(f_X(x)\). Then:

$$Var[X] = E\left[X^2\right] - \left(E[X]\right)^2 = \sum_{x} x^2 f_X(x) - \left(\sum_{x} x f_X(x)\right)^2$$
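
Continuing with the same hypothetical joint pmf used above, the sketch below sums out \(Y\) to obtain the marginal probability function \(f_X(x)\) and then applies the univariate formula.

```python
import math
from collections import defaultdict

# Same hypothetical joint pmf as in the previous sketch
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal pmf of X: f_X(x) = sum over y of f(x, y)
marginal_x = defaultdict(float)
for (x, y), p in joint_pmf.items():
    marginal_x[x] += p

mean_x = sum(x * p for x, p in marginal_x.items())        # E[X]
mean_x2 = sum(x ** 2 * p for x, p in marginal_x.items())  # E[X^2]

var_x = mean_x2 - mean_x ** 2   # Var[X]
std_x = math.sqrt(var_x)        # standard deviation of X

print(dict(marginal_x), var_x, std_x)   # {0: 0.3, 1: 0.7}, 0.21, 0.458...
```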

Variance and standard deviation for conditional random variables

The conditional variance of \(Y\) given \(X = x\) is found as follows:

$$\sigma_{Y|x}^2 = E\left[\left(Y - E[Y|x]\right)^2 \middle| x\right] = \sum_{y}\left(y - E[Y|x]\right)^2 h(y|x)$$

which simplifies to:

$$\sigma_{Y|x}^2 = E\left[Y^2|x\right] - \left(E[Y|x]\right)^2$$

The standard deviation is the square root of this value.
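
The sketch below, again using the hypothetical joint pmf from the earlier examples, builds the conditional pmf \(h(y|x) = f(x,y)/f_X(x)\) for a fixed value of \(x\) and applies the simplified formula.

```python
import math

# Same hypothetical joint pmf as before
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

x_fixed = 1  # condition on X = 1 (hypothetical choice)

# Marginal probability f_X(x_fixed)
f_x = sum(p for (x, y), p in joint_pmf.items() if x == x_fixed)

# Conditional pmf h(y | x) = f(x, y) / f_X(x)
h_given_x = {y: p / f_x for (x, y), p in joint_pmf.items() if x == x_fixed}

mean_y = sum(y * p for y, p in h_given_x.items())        # E[Y | x]
mean_y2 = sum(y ** 2 * p for y, p in h_given_x.items())  # E[Y^2 | x]

var_y_given_x = mean_y2 - mean_y ** 2    # sigma^2_{Y|x}
std_y_given_x = math.sqrt(var_y_given_x)

print(var_y_given_x, std_y_given_x)      # 0.2449..., 0.4949...
```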

 

Learning Outcome

Topic 3.e: Multivariate Random Variables – Calculate variance, standard deviation for conditional and marginal probability distributions.

