Calculate moments for joint, conditional, and marginal random variables

Moments of a Probability Mass Function

The n-th moment about the origin of a random variable is the expected value of its n-th power. The moments about the origin are \(E(X), E(X^2), E(X^3), E(X^4), \ldots\)

For the most part, however, we are going to be looking at moments about the mean, also called central moments. The n-th central moment of a random variable \(X\) is the expected value of the n-th power of the deviation of \(X\) from its expected value.

  • First moment about the origin: the mean, \(E(X)\) (note that the first central moment, \(E[X-E(X)]\), is always zero)
  • Second central moment: the variance, \(E\left[(X-E(X))^2\right]\)

Moments about the mean describe the shape of the probability function of a random variable.
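As a quick illustration, here is a minimal Python sketch of both kinds of moments for a discrete random variable (the pmf below is example data, not from a particular exercise):

```python
# Raw (about-the-origin) and central (about-the-mean) moments of a
# discrete random variable, computed directly from its pmf.

def raw_moment(values, probs, n):
    """n-th moment about the origin: E[X^n] = sum of x^n * p(x)."""
    return sum(x**n * p for x, p in zip(values, probs))

def central_moment(values, probs, n):
    """n-th central moment: E[(X - E[X])^n]."""
    mean = raw_moment(values, probs, 1)
    return sum((x - mean)**n * p for x, p in zip(values, probs))

values = [0, 1, 2]            # support of X (example data)
probs = [0.4, 0.3, 0.3]       # p(x); must sum to 1

print(raw_moment(values, probs, 1))      # E[X] = 0.9
print(central_moment(values, probs, 1))  # first central moment ~ 0
print(central_moment(values, probs, 2))  # variance ~ 0.69
```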

Properties of Expectation

Recall that the expected value of a random variable \(X\) is defined by

$$ E[X] = \sum_{x} {xp(x)} $$

where \(X\) is a discrete random variable with probability mass function \(p(x)\), and by

$$ E[X] = \int_{-\infty}^{\infty} xf(x)dx $$

when \(X\) is a continuous random variable with probability density function \(f(x)\). Since \(E[X]\) is a weighted average of the possible values of \(X\), it follows that if \(X\) must lie between \(a\) and \(b\), then so must its expected value, i.e.,

If

$$ P(a\leq X \leq b) = 1 $$

Then

$$ a \leq E[X] \leq b $$

Expectations of a Sum

By the linearity of expectation,

\(E[ag(X) + bh(Y)] = aE[g(X)] + bE[h(Y)]\)

where a and b are constants.

What this equation tells us is that handling the expected value of a linear combination of functions is no more difficult than handling the expected values of the individual functions.

This property also extends to situations where we have more than two variables. Expected values can easily be found from marginal distributions.

Example: Expectation of a sum

You have been given the following joint pmf. Verify that \(E[{ X }^{ 2 }+3Y]=E[{ X }^{ 2 }]+E[3Y]\)

$$ \begin{array}{c|c|c|c|c} {\begin{matrix} X \\ \huge{\diagdown} \\ Y \end{matrix}} & {0} & {1} &{2} \\ \hline {1} & {0.1} & {0.1} & {0} \\ \hline {2} & {0.1} & {0.1} & {0.2} \\ \hline {3} & {0.2} & {0.1} & {0.1} \end{array} $$

Solution

Reading values from the table, we have:

$$\begin{align} E[{ X }^{ 2 }+3Y] & = 0.1({ 0 }^{ 2 }+3\times 1)+0.1({ 0 }^{ 2 }+3\times 2)+0.2({ 0 }^{ 2 }+3\times 3)\\ & \quad +0.1({ 1 }^{ 2 }+3\times 1)+0.1({ 1 }^{ 2 }+3\times 2)+0.1({ 1 }^{ 2 }+3\times 3)\\ & \quad +0({ 2 }^{ 2 }+3\times 1)+0.2({ 2 }^{ 2 }+3\times 2)+0.1({ 2 }^{ 2 }+3\times 3)\\ & = 8.1 \end{align}$$

Looking at the terms on the right side above,

\(E[{ X }^{ 2 }]={ 0 }^{ 2 }\times 0.4+{ 1 }^{ 2 }\times 0.3+{ 2 }^{ 2 }\times 0.3=1.5\)

\(E[3Y]=(3\times 1)0.2+(3\times 2)0.4+(3\times 3)0.4=6.6\)

Thus, \(E[{ X }^{ 2 }]+E[3Y]=1.5+6.6=8.1\), and the result has been verified.
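This check is easy to reproduce in code. A short Python sketch (the pmf dictionary simply transcribes the table, with columns \(x = 0, 1, 2\) and rows \(y = 1, 2, 3\)):

```python
# Verify E[X^2 + 3Y] = E[X^2] + E[3Y] for the joint pmf in the table.
# Keys are (x, y) pairs; x = 0, 1, 2 are the columns, y = 1, 2, 3 the rows.
pmf = {
    (0, 1): 0.1, (1, 1): 0.1, (2, 1): 0.0,
    (0, 2): 0.1, (1, 2): 0.1, (2, 2): 0.2,
    (0, 3): 0.2, (1, 3): 0.1, (2, 3): 0.1,
}

lhs = sum((x**2 + 3*y) * p for (x, y), p in pmf.items())
e_x2 = sum(x**2 * p for (x, y), p in pmf.items())
e_3y = sum(3 * y * p for (x, y), p in pmf.items())

print(lhs, e_x2 + e_3y)   # both ~ 8.1 (up to floating-point rounding)
```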

Moments of Joint Random Variables

Discrete Case

Let \(X,Y\) be a pair of joint random variables with joint probability function \(f(x,y)\) on the space \(S\). The mathematical expectation (or expected value) of a function \(g(X,Y)\) of these variables is defined as:

$$ E[g(X,Y)] = \sum_{(x,y) \in S} g(x,y)f(x,y) $$

Taking \(g(X,Y)=X\) (or \(g(X,Y)=Y\)) gives the first moment, i.e., the mean, of the corresponding variable.

The second central moment of \(g(X,Y)\), its variance, follows from the first two moments:

$$ \operatorname{Var}[g(X,Y)]= E\left[g(X,Y)^2\right] - \left(E[g(X,Y)]\right)^2 $$

Example 1: Moments of Joint Random Variables

Let \(X\) and \(Y\) have the following pmf:

$$ f(x,y) = \frac{x^2 + 3y}{96} \qquad x = 1,2,3,4;\quad y=1,2. $$

Find the expected value of \(g(X,Y) = XY\).

Solution

The possible values \((x,y)\) for this distribution are:

$$(1,1),(1,2),(2,1),(2,2),(3,1),(3,2),(4,1),(4,2)$$

Then we proceed to calculate,

$$\begin{align} E[XY] & = \sum_{(x,y) \in S} g(x,y) f(x,y)\\ & = \sum_{(x,y) \in S} (xy) \frac{x^2 + 3y}{96}\\ & = (1)\frac{1+3}{96} + (2)\frac{1+6}{96} + (2)\frac{4+3}{96} + (4)\frac{4+6}{96} \\ & \quad + (3)\frac{9+3}{96} + (6)\frac{9+6}{96} + (4)\frac{16+3}{96} + (8)\frac{16+6}{96}\\ & = \frac{4}{96} + \frac{14}{96} + \frac{14}{96} + \frac{40}{96} + \frac{36}{96} + \frac{90}{96} + \frac{76}{96} +\frac{176}{96}\\ & = \frac{450}{96}=\frac{75}{16}=4.6875 \end{align}$$

This process is quite similar to calculating the mean of any mass function, univariate or multivariate.
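The double sum is easy to mechanize. Here is a minimal Python sketch of Example 1, using exact fractions (the helper names are ours, for illustration):

```python
from fractions import Fraction

# Joint pmf from Example 1: f(x, y) = (x^2 + 3y)/96 on the listed support.
def f(x, y):
    return Fraction(x**2 + 3*y, 96)

support = [(x, y) for x in (1, 2, 3, 4) for y in (1, 2)]
assert sum(f(x, y) for x, y in support) == 1   # f is a valid pmf

# E[g(X, Y)] with g(x, y) = x*y: sum g(x, y) * f(x, y) over the support.
e_xy = sum(x * y * f(x, y) for x, y in support)
print(e_xy)   # 75/16 = 4.6875
```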

If \(X\) and \(Y\) are independent, then:

  • The expectation of the product of \(X\) and \(Y\) is the product of the individual expectations: \(E(XY) = E(X)E(Y)\). This product formula holds for any expectation of a function of \(X\) times a function of \(Y\): \(E[g(X)h(Y)] = E[g(X)]E[h(Y)]\) (see the sketch after this list)
  • The product formula holds for probabilities of the form P(some condition on X, some condition on Y) (where the comma denotes "and"): for instance, \(P(X\le 3, Y \le 4)=P(X\le 3)P(Y\le 4)\)
  • The variance of the sum of \(X\) and \(Y\) is the sum of the individual variances: \(Var(X + Y) = Var(X) + Var(Y)\)
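For the product rule in the first bullet, here is a small Python sketch with an independent pair built as a product of two made-up marginals:

```python
from fractions import Fraction

# Build an independent joint pmf as a product of marginals (example data),
# then check the product formula E(XY) = E(X)E(Y).
px = {1: Fraction(1, 4), 2: Fraction(3, 4)}   # marginal pmf of X
py = {0: Fraction(1, 2), 3: Fraction(1, 2)}   # marginal pmf of Y

e_x = sum(x * p for x, p in px.items())                      # 7/4
e_y = sum(y * p for y, p in py.items())                      # 3/2
e_xy = sum(x * y * px[x] * py[y] for x in px for y in py)    # 21/8

assert e_xy == e_x * e_y   # holds because X and Y are independent
```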

Example 2: Moments of Joint Random Variables

Let X and Y have the following pmf:

$$ f\left( x, y\right)=\frac{ x^2+3 y}{96}\ \ \ \ \ \ \ x=1,2,3,4\ \ \ \ y=1,2\ $$

Find the expected values and variances of \(X\) and \(Y\).

Solution

To find the expected value of \(X\), we first need the marginal probability mass function of \(X\), which is given by:

$$\begin{align} f_{X}\left( x\right) &=\sum_{ y}{ f\left( x, y\right)}= P\left( X= x\right),\ x\in S_{X} \\  f_{X}\left( x\right) & =\frac{ x^2+3\left(1\right)}{96}+\frac{ x^2+3(2)}{96} \\ &=\frac{ x^2+3}{96}+\frac{ x^2+6}{96} \\ &=\frac{ x^2+ x^2+3+6 }{96} \\ \therefore\ f_X( x) & =\frac{2 x^2+9}{96} \end{align}$$

Therefore,

$$ \begin{align*} E\left( X\right) & =\sum_{ x=1}^{ n}{ {xf}_{ X}\left(x\right)} \\ & =\sum_{ x=1}^{4}{ {xf}_{ X}( x)} \\ & =\sum_{ x=1}^{4}{ x\frac{2 x^2+9}{96}} \\ & =\left(1\right)\frac{2\left(1\right)^2+9}{96}+\left(2\right)\frac{2\left(2\right)^2+9}{96}+\left(3\right)\frac{2\left(3\right)^2+9}{96}+\left(4\right)\frac{2\left(4\right)^2+9}{96} \\ & =\left(1\right)\frac{11}{96}+\left(2\right)\frac{17}{96}+\left(3\right)\frac{27}{96}+\left(4\right)\frac{41}{96}=\frac{145}{48}\approx 3.02 \end{align*} $$

Similarly, to find the expected value of \(Y\), we need the marginal probability mass function of \(Y\), which is given by:

$$ \begin{align*} f_{Y}\left( y\right) & =\sum_{ x}{ f\left( x, y\right)}= P\left( Y= y\right),\ \ \ y\in S_{Y} \\ & =\ \frac{1+3 y}{96}+\frac{4+3 y}{96}+\frac{9+3 y}{96}+\frac{16+3 y}{96} \\ & =\frac{12 y+30}{96} \\ \end{align*} $$

Therefore,

$$ \begin{align*} E\left( Y\right) & =\sum_{ y=1}^{ n}{ {yf}_{Y}\left( y\right)} \\ & =\sum_{ y=1}^{2}{ {yf}_{Y}\left( y\right)} \\ & =\sum_{ y=1}^{2}{ y\frac{12 y+30}{96}} \\ & =\left(1\right)\frac{12\left(1\right)+30}{96}+\left(2\right)\frac{12\left(2\right)+30}{96} \\ &=\left(1\right)\frac{42}{96}+\left(2\right)\frac{54}{96}=\frac{25}{16}=1.5625 \end{align*} $$

We can also find the variance of each variable. For \(X\), we know that:

$$ \text{Variance}= V\left( X\right)= E\left( X^2\right)-\left[ E\left( X\right)\right]^2 $$

Therefore, you need to find \( E( X^2)\) and \( E( X)\).

Continuing with the example above,

$$ \begin{align*} {Var}\left( X\right) & =\sum_{ x=1}^{4}{ x^2 f_{X}\left( x\right)}-\left[ E\left( X\right)\right]^2\\ &=\sum_{ x=1}^{4}{ x^2\frac{2 x^2+9}{96}}-\left(\frac{145}{48}\right)^2 \\ & =\left(1\right)^2\frac{11}{96}+\left(2\right)^2\frac{17}{96}+\left(3\right)^2\frac{27}{96}+\left(4\right)^2\frac{41}{96}-\left(\frac{145}{48}\right)^2=\frac{163}{16}-\left(\frac{145}{48}\right)^2\approx 1.062 \end{align*} $$

Similarly, for \(Y\):

$$ \begin{align*} {Var}\left( Y\right) & =\sum_{ y=1}^{2}{ y^2{ f}_{Y}\left( y\right)}-\left[ E\left( Y\right)\right]^2 \\ & =\sum_{ y=1}^{2}{ y^2\frac{12 y+30}{96}}-\left(\frac{25}{16}\right)^2 \\ & =\left(1\right)^2\frac{42}{96}+\left(2\right)^2\frac{54}{96}-\frac{625}{256}=\frac{43}{16}-\frac{625}{256}=\frac{63}{256} \end{align*} $$
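As a cross-check on Example 2, the following Python sketch rebuilds both marginals from the joint pmf and recomputes the four moments exactly:

```python
from fractions import Fraction

# Joint pmf from Example 2: f(x, y) = (x^2 + 3y)/96.
def f(x, y):
    return Fraction(x**2 + 3*y, 96)

xs, ys = (1, 2, 3, 4), (1, 2)

fx = {x: sum(f(x, y) for y in ys) for x in xs}   # f_X(x) = (2x^2 + 9)/96
fy = {y: sum(f(x, y) for x in xs) for y in ys}   # f_Y(y) = (12y + 30)/96

e_x = sum(x * p for x, p in fx.items())
var_x = sum(x**2 * p for x, p in fx.items()) - e_x**2
e_y = sum(y * p for y, p in fy.items())
var_y = sum(y**2 * p for y, p in fy.items()) - e_y**2

print(e_x, float(var_x))   # 145/48 and ~1.062
print(e_y, var_y)          # 25/16 and 63/256
```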

Continuous Case

Example 3: Moments of Joint Random Variables

Let \(X\) and \(Y\) have the following pdf:

$$ f(x,y) = \begin{cases} 2(x+y), & 0 < x \le y \le 1\\ 0, &\text{otherwise}\\ \end{cases} $$

Find the expected value and the variance of \(X\).

Solution

To find the mean/expectation of X, we need to find the marginal distribution of X. We know that:

$$f_X\left(x\right)=\int_{-\infty}^{\infty}{f\left(x,y\right)dy}, \ \ x\in S_X$$

Then

$$\begin{align} f_X\left(x\right) &=\int_{x}^{1}{\left(2x+2y\right)\ dy}=\left[2xy+y^2\right]_x^1\\ &=\left(2x+1\right)-\left(2x^2+x^2\right) =1+2x-3x^2,\qquad 0<x\le 1\end{align}$$

Hence we know that:

$$E\left[X\right]=\int_{-\infty}^{\infty}{xf_X\left(x\right)\ dx}$$

Then,

$$\begin{align} E\left(X\right)&=\int_{0}^{1}{x\left(1+2x-3x^2\right)\  dx}\\ &=\left[\frac{x^2}{2}+\frac{2x^3}{3}-\frac{3x^4}{4}\right]_0^1\\ &=\frac{1}{2}+\frac{2}{3}-\frac{3}{4}=\frac{5}{12} \end{align}$$

$$\therefore E\left(X\right)=\frac{5}{12} $$

For variance of X, we know that,

$$Var(X)=E\left(X^2\right)-\left[E\left(X\right)\right]^2$$

Now,

$$E\left(X^2\right)=\int_{0}^{1}{x^2\left(1+2x-3x^2\right)\ dx} =\frac{1}{3}+\frac{1}{2}-\frac{3}{5}=\frac{7}{30}$$

Thus,

$$Var(X) = E\left(X^2\right)-\left[E\left(X\right)\right]^2=\frac{7}{30}-\left(\frac{5}{12}\right)^2=\frac{43}{720}\approx 0.0597$$
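These integrals can also be checked numerically, for example with scipy (a sketch assuming the density above; note that dblquad treats its first lambda argument as the inner variable):

```python
from scipy.integrate import dblquad

# f(x, y) = 2(x + y) on the triangle 0 < x <= y <= 1.
f = lambda x, y: 2 * (x + y)

# Inner variable x runs from 0 to y; outer variable y runs from 0 to 1.
e_x, _ = dblquad(lambda x, y: x * f(x, y), 0, 1, 0, lambda y: y)
e_x2, _ = dblquad(lambda x, y: x**2 * f(x, y), 0, 1, 0, lambda y: y)

print(e_x)             # ~0.4167 = 5/12
print(e_x2 - e_x**2)   # ~0.0597 = 43/720
```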

Moments for Conditional Random Variables

Let \(X\) and \(Y\) be random variables with a joint probability function \(f(x,y)\) and marginal functions \(f_x(x)\) and \(f_y(y)\)

Discrete Case

The conditional pmf of \(Y\) given that \(X=x\) is defined by:

$$h(y|x)=\cfrac { f(x,y) }{ { f }_{ X }(x) }\quad\quad\quad\quad\quad\text{provided that}\quad f_X(x)>0$$

The conditional mean of \(Y\), given that \(X=x\), is defined as:

$$ \mu_{Y|x} = E[Y|x] = \sum_{y} y h(y|x), $$

and the conditional variance of \(Y\), given that \(X=x\), is defined as:

$$ \sigma^2_{Y|x} = E\left\{(Y - E[Y|x])^2 \middle| x \right\} = \sum_{y}(y-E[Y|x])^2 h(y|x) $$

This is simplified to:

$$ \sigma^2_{Y|x} = E[Y^2|x] - (E[Y|x])^2 $$

Similarly, the conditional pmf of \(X\) given that \(Y=y\) is:

$$g\left(x\middle| y\right)=\frac{f\left(x,y\right)}{f_Y\left(y\right)}\quad\quad\quad\quad\quad\text{provided that}\quad f_Y\left(y\right)>0$$

So that:

$$\mu_{X|y}=E[X|y]=\sum_{x} x\,g(x|y)$$

And

$$\sigma_{X|y}^2=E\left[X^2\middle| y\right]-\left(E[X|y]\right)^2$$

Example: Conditional Moments in the Discrete Case

The joint probability mass function of variables X and Y is given by:

$$f(x,y) = \frac{x^2 +3y}{96},\ x=1,2,3,4;\ y=1,2$$

Calculate:

a). E(X|Y=1)

b). E(Y|X=3)

c). V(X|Y=1)

Solution 

From the joint function, we can get the following marginal pmfs:

$$f_X\left(x\right)=\frac{2x^2+9}{96}\ \ \text{and} \ f_Y\left(y\right)=\frac{12y+30}{96}$$

We can also find the conditional probability mass functions:

$$g\left(x\middle| y\right)=\frac{x^2+3y}{12y+30}\ \text{and}\ h\left(y\middle| x\right)=\frac{x^2+3y}{2x^2+9}$$

So,

a). Finding \(E(X|Y=1)\):

$$\begin{align} E\left(X\middle| Y=1\right)&=\sum_{x=1}^{4}{xg\left(x\middle| y=1\right)}\\&=\sum_{x=1}^{4}{x\frac{x^2+3\left(1\right)}{12\left(1\right)+30}} \\ &=\frac{65}{21}\approx 3.10\end{align}$$

b). Finding \(E(Y|X=3)\):

$$\begin{align} E\left(Y\middle| X=3\right)&=\sum_{y=1}^{2}{yh(y|x=3)}\\ &=\sum_{y=1}^{2}{y\ \frac{3^2+3y}{2\left(3\right)^2+9}}\\&=\left(1\right)\frac{3^2+3\left(1\right)}{2\left(3\right)^2+9}+\left(2\right)\frac{3^2+3\left(2\right)}{2\left(3\right)^2+9}\\&=\frac{12}{27}+\frac{30}{27}=\frac{14}{9}\approx 1.56\end{align}$$

c). Finding \(Var(X|Y=1)\)

Using the result from (a), we have:

$$\begin{align}V\left(X\middle| Y=1\right)&=E\left[\left(X-E\left(X\middle| Y=1\right)\right)^2\middle| Y=1\right]\\ & =\sum_{x=1}^{4}{\left(x-E\left(X\middle| Y=1\right)\right)^2g(x|y=1)}\\ &=\sum_{x=1}^{4}{\left(x-\frac{65}{21}\right)^2\frac{x^2+3\left(1\right)}{12\left(1\right)+30}}\\ &\approx 0.99 \end{align}$$
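All three answers can be verified numerically. A minimal Python sketch using exact fractions (the helpers g and h are ours, built from the definitions above):

```python
from fractions import Fraction

def f(x, y):                 # joint pmf: (x^2 + 3y)/96
    return Fraction(x**2 + 3*y, 96)

xs, ys = (1, 2, 3, 4), (1, 2)

def g(x, y):                 # conditional pmf of X given Y = y
    return f(x, y) / sum(f(xx, y) for xx in xs)

def h(y, x):                 # conditional pmf of Y given X = x
    return f(x, y) / sum(f(x, yy) for yy in ys)

e_x_y1 = sum(x * g(x, 1) for x in xs)                    # 65/21 ~ 3.10
e_y_x3 = sum(y * h(y, 3) for y in ys)                    # 14/9  ~ 1.56
var_x_y1 = sum(x**2 * g(x, 1) for x in xs) - e_x_y1**2   # 437/441 ~ 0.99

print(e_x_y1, e_y_x3, var_x_y1)
```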

Continuous Case

When \(X\) and \(Y\) are continuous random variables, the conditional pdf, mean, and variance are given as follows:

Conditional pdf:

$$ g(x|y)=\cfrac { f(x,y) }{ { f }_{ Y}(y) } \quad\quad\quad\quad\quad\text{provided that}\quad f_Y(y)>0$$

Conversely,

$$h\left(y\middle| x\right)=\frac{f\left(x,y\right)}{f_X\left(x\right)}\ \text{provided\ that}\ f_X\left(x\right) > 0$$

Conditional mean:

$$E(Y|x)=\int _{ -\infty  }^{ \infty  }{ yh(y|x)\,dy } $$

Also,

$$E\left(X\middle| y\right)=\int_{-\infty}^{\infty}{x\,g\left(x\middle| y\right)\,dx}$$

Conditional variance:

$$\begin{align} \text{Var}(Y|x)&=E\left\{ [Y-E(Y|x)]^{ 2 }\middle| x\right\} \\ &=\int _{ -\infty  }^{ \infty  }{ [y-E(Y|x)]^{ 2 }h(y|x)\,dy } \\ &=E[{ Y }^{ 2 }|x]-{ \left[E(Y|x)\right] }^{ 2 }\end{align}$$

Using the same logic,

$$Var\left(Y\middle| X\right)=E\left[Y^2\middle| X\right]-\left[E\left(Y\middle| X\right)\right]^2$$

Example:  Conditional Moments in the Continuous Case

Let

$$ f(x,y) = \frac{4}{3}(1-xy) \qquad 0\leq x\leq1,\quad 0\leq y\leq1 $$

Find:

a). g(x|y)

b). E(X|Y=1)

Solution

a): Conditional function \(g(x|y)\)

We know that:

$$\begin{align} g\left(x\middle| y\right)&=\frac{f\left(x,y\right)}{f_Y\left(y\right)}\\ &=\frac{\frac{4}{3}\left(1-xy\right)}{\int_{0}^{1}{\frac{4}{3}\left(1-xy\right)dx}}\\&=\frac{\frac{4}{3}\left(1-xy\right)}{\frac{4}{3}\left[x-\frac{x^2y}{2}\right]_{x=0}^{x=1}}\\&=\frac{1-xy}{1-\frac{y}{2}},\qquad 0\le x\le 1\end{align}$$

b): Conditional mean of \(X\) given \(Y=1\), \(E(X|Y=1)\)

We know that:

$$\begin{align} E\left(X\middle| y\right)&=\int_{0}^{1}{xg\left(x\middle| y\right)dx}\\ &=\int_{0}^{1}{x\frac{1-xy}{1-\frac{y}{2}}dx}\\ &=\frac{1}{1-\frac{y}{2}}\left[\frac{x^2}{2}-\frac{x^3y}{3}\right]_{x=0}^{x=1}\\ &=\frac{1}{1-\frac{y}{2}}\left(\frac{1}{2}-\frac{y}{3}\right)\end{align}$$

Therefore:

$$E\left(X\middle| Y=1\right)=\frac{1}{1-\frac{1}{2}}\left(\frac{1}{2}-\frac{1}{3}\right)=\frac{1}{3}$$

To find the variance in the continuous case, we would integrate over the same region with \(x^2\) in place of \(x\) to obtain \(E\left(X^2\middle| y\right)\), and then subtract \(\left[E\left(X\middle| y\right)\right]^2\).
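Part (b) is also easy to confirm numerically with scipy's quad (a sketch; the helper g just implements the conditional pdf defined above):

```python
from scipy.integrate import quad

# Joint pdf f(x, y) = (4/3)(1 - xy) on the unit square.
f = lambda x, y: 4.0 / 3.0 * (1.0 - x * y)

def g(x, y):
    """Conditional pdf of X given Y = y: f(x, y) / f_Y(y)."""
    f_y, _ = quad(lambda xx: f(xx, y), 0.0, 1.0)   # marginal of Y at y
    return f(x, y) / f_y

e_x_given_y1, _ = quad(lambda x: x * g(x, 1.0), 0.0, 1.0)
print(e_x_given_y1)   # ~0.3333 = 1/3
```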

Concept Reminders

We compute and define conditional expectations, variances, etc., as usual, but with conditional distributions in place of ordinary distributions:

In the discrete case,

\(E(X|y)=E(X|Y=y)=\sum _{ x }{ xg(x|y) } \)

\(E({ X }^{ 2 }|y)=E({ X }^{ 2 }|Y=y)=\sum _{ x }{ { x }^{ 2 }g(x|y) } \)

\(Var(X|y)=Var(X|Y=y)=E({ X }^{ 2 }|y)-[E(X|y)]^{ 2 }\)

and in general,

\(E(X|\text{condition})=\sum _{ x }{ xP(X=x|\text{condition}) } \)

In the continuous case,

\(E(X|y)=E(X|Y=y)=\int _{ -\infty }^{ \infty }{ xg(x|y)\,dx } \)

\(E({ X }^{ 2 }|y)=E({ X }^{ 2 }|Y=y)=\int _{ -\infty }^{ \infty }{ { x }^{ 2 }g(x|y)\,dx } \)

\(Var(X|y)=Var(X|Y=y)=E({ X }^{ 2 }|y)-[E(X|y)]^{ 2 }\)

and in general,

\(E(X|\text{condition})=\int _{ -\infty  }^{ \infty  }{ xf(x|\text{condition})\,dx } \)

where \(f(x|\text{condition})\) is the conditional density of \(X\).

The conditional density (pdf or pmf) of \(X\) given that \(Y = y\) is given by:

\(g(x|y)=\cfrac { f(x,y) }{ { f }_{ Y }(y) } \)

The conditional density of \(Y\) given that \(X=x\) is given by:

\(h(y|x)=\cfrac { f(x,y) }{ { f }_{ X }(x) } \)

Learning Outcome

Topic 3.c: Multivariate Random Variables – Calculate moments for joint, conditional, and marginal random variables.
