# Calculate joint moments, such as the covariance and the correlation coefficient

Recall that we have looked at the joint pmf of two discrete or continuous random variables $$X$$ and $$Y$$. The variables are considered independent if:

$$P\left(X=x,\ Y=y\right)=P\left(X=x\right)P\left(Y=y\right),\ \ \text{for all x,y (discrete case)}$$

And

$$f_{XY}\left(x,\ y\right)=f_X\left(x\right)f_Y\left(y\right),\ \ \text{for all x,y (continuous case)}$$

Intuitively, two random variables are independent if the realization of one does not affect the probability distribution of the other. However, there are situations where the random variables $$X$$ and $$Y$$ are dependent.

If $$X$$ and $$Y$$ are two non-independent (dependent) variables, we would want to establish how one varies with respect to the other. If $$X$$ increases, for example, does $$Y$$ tend to increase or decrease? And if so, how strong is the dependence between the two? Two measures that can help us answer these questions are covariance and correlation coefficient.

## Covariance

Covariance is a measure of the directional relationship between two dependent random variables. The covariance $${Cov}[{{X}},{{Y}}]$$ of two random variables $$X$$ and $$Y$$ is defined by:

$$Cov\left[X,Y\right]=E[(X-E\left[X\right])(Y-E[Y])]$$

This simplifies to:

$$Cov\left[X,Y\right]=E\left[XY\right]-E[X]E[Y]$$

If you look at the covariance definition, there are some similarities between covariance and variance in the univariate case:

$$Var\left(X\right)=E\left[\left(X-E\left(X\right)\right)^2\right]=E\left(X^2\right)-E^2(X)$$

Note: The units of $${Cov}[{{X}},{{Y}}]$$ are the product of those of $${{X}}$$ and $${{Y}}$$. So, for example, if $$X$$ is a time in hours and $$Y$$ is a sum of money in dollars, then $$Cov[X,Y]$$ is in dollar-hours.

Note also that $$Cov\left[X,X\right]=Var\left[X\right]$$.
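These identities are easy to spot-check numerically. The sketch below (in Python, with a made-up joint pmf chosen purely for illustration) computes the covariance both from the definition and from the shortcut formula, and confirms that $$Cov[X,X]=Var[X]$$:

```python
# Spot-check Cov[X,Y] = E[XY] - E[X]E[Y] and Cov[X,X] = Var[X]
# on a small hypothetical joint pmf (the table is made up for illustration).
joint = {  # {(x, y): P(X=x, Y=y)}
    (0, 1): 0.3, (1, 1): 0.1,
    (1, 2): 0.2, (2, 2): 0.4,
}

ex  = sum(x * p for (x, y), p in joint.items())      # E[X]
ey  = sum(y * p for (x, y), p in joint.items())      # E[Y]
exy = sum(x * y * p for (x, y), p in joint.items())  # E[XY]

# Definition: E[(X - E[X])(Y - E[Y])]
cov_def = sum((x - ex) * (y - ey) * p for (x, y), p in joint.items())
# Shortcut: E[XY] - E[X]E[Y]
cov_short = exy - ex * ey

ex2 = sum(x * x * p for (x, y), p in joint.items())
var_x = ex2 - ex ** 2                                            # Var[X]
cov_xx = sum((x - ex) ** 2 * p for (x, y), p in joint.items())   # Cov[X,X]

assert abs(cov_def - cov_short) < 1e-12
assert abs(var_x - cov_xx) < 1e-12
```

Both routes give the same covariance, and computing $$Cov[X,X]$$ reproduces the variance exactly.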

It is rather convenient that the mean and variance of any variable can be computed from either the joint pmf (or pdf) or the marginal pmf (or pdf) of the same variable. For example, in the discrete case for X,

\begin{align*} \mu_X=E\left(X\right)&=\sum_{x}\sum_{y} xf\left(x,y\right)\\ &=\sum_{x}{x\left[\sum_{y} f\left(x,y\right)\right]=\sum_{x}{xf_X\left(x\right)}}\ \end{align*}

However, to compute the covariance itself, the joint pmf (or pdf) is required. A further useful property is that covariance is additive across sums of random variables:

$$Cov\left(\sum_{i=1}^{n}{X_i},\ \sum_{j=1}^{m}{Y_j}\right)=\sum_{i=1}^{n}\sum_{j=1}^{m}{Cov\left(X_i,Y_j\right)}$$

#### Properties of Covariance

Let $$X$$, $$Y$$, and $$Z$$ be random variables and let $$a$$, $$b$$, and $$c$$ be constants. Then, the following properties should hold true:

1. $$Cov\left(X,Y\right)=Cov(Y,X)$$
2. $$Cov \left(X,X\right)=Var\left(X\right)$$
3. $$Cov\left(aX,bY\right)=abCov(X,Y)$$
4. $$Cov\left[aX+b,cY+d\right]=ac\,Cov\left[X,Y\right]$$
5. $$Cov\left[X,Y+Z\right]=Cov\left[X,Y\right]+Cov\left[X,Z\right]$$
6. If $$X$$ and $$Y$$ are independent, $$Cov\left[X,Y\right]=0$$
7. $$Cov(X,c) = E\left[(X-E(X))(c-c)\right] = E(0)=0$$
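A short Python sketch can spot-check several of these properties at once; the joint pmf below is hypothetical, chosen only so the covariance is nonzero:

```python
# Numerically verify properties 1, 4, and 7 on a hypothetical joint pmf.
joint = {(0, 1): 0.3, (1, 1): 0.1, (1, 2): 0.2, (2, 2): 0.4}

def cov(pairs):
    """Covariance of two variables given as {(u, v): probability}."""
    eu = sum(u * p for (u, v), p in pairs.items())
    ev = sum(v * p for (u, v), p in pairs.items())
    return sum((u - eu) * (v - ev) * p for (u, v), p in pairs.items())

base = cov(joint)

# Property 4 (which contains property 3 as the b = d = 0 case):
# Cov[aX + b, cY + d] = ac Cov[X, Y]
a, b, c, d = 3, 5, -2, 7
shifted = {(a * x + b, c * y + d): p for (x, y), p in joint.items()}
assert abs(cov(shifted) - a * c * base) < 1e-12

# Property 1 (symmetry): Cov(X, Y) = Cov(Y, X)
swapped = {(y, x): p for (x, y), p in joint.items()}
assert abs(cov(swapped) - base) < 1e-12

# Property 7: Cov(X, c) = 0 for a constant c
# (accumulate probabilities, since several (x, y) pairs collapse to one (x, 9))
const = {}
for (x, y), p in joint.items():
    const[(x, 9)] = const.get((x, 9), 0.0) + p
assert abs(cov(const)) < 1e-12
```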

The covariance between $$X$$ and $$Y$$ is a measure of the strength of the “linear association” or “linear relationship” between the variables. The covariance can have a positive or a negative sign depending on the relationship between the two variables. A positive covariance means there is a positive association between the random variables $$X$$ and $$Y$$, while a negative covariance implies a negative association exists between the variables $$X$$ and $$Y$$.

However, one major drawback of covariance is that its value depends on the units of measurement of the variables. This is corrected by computing the correlation coefficient, a dimensionless (unitless) quantity.

## Correlation Coefficient

The correlation coefficient, usually written as $$Corr(X,Y)$$ or $$\rho(X,Y)$$, of two random variables $$X$$ and $$Y$$ is defined as:

$$Corr\left(X,Y\right)=\rho\left(X,Y\right)=\frac{Cov(X,Y)}{\sqrt{Var\left(X\right)Var\left(Y\right)}}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}$$

The correlation coefficient takes a value in the range $$-1\le\rho\le1$$. It reflects the degree of association between the two variables. It is also important to note the following:

1. If $$X$$ and $$Y$$ are independent, $$corr\left(X,Y\right)=0$$; and
2. If $$Y=mX+c$$ for some constants $$m\neq0$$ and c, then $$corr \left(X,Y\right)=1$$ if $$m>0$$, and $$corr \left(X,Y\right)=-1$$ if $$m<0$$.

Note: The correlation coefficient is a measure of the degree of linearity between $$X$$ and $$Y$$. A value of $$\rho \text{ near } +1 \text{ or } -{{1}}$$ indicates a high degree of linearity between $$X$$ and $$Y$$, whereas a value near 0 indicates that such linearity is absent. A positive value of $$\rho$$ indicates that $$Y$$ tends to increase when $$X$$ does, whereas a negative value indicates that $$Y$$ tends to decrease when $$X$$ increases. If $$\rho={0}$$, then X and Y are said to be uncorrelated.
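The linearity note above can also be checked directly: if $$Y=mX+c$$ exactly, the correlation comes out as $$\pm1$$ regardless of the pmf of $$X$$. A minimal Python sketch, with a made-up marginal pmf for $$X$$:

```python
# If Y = mX + c exactly, Corr(X, Y) is +1 for m > 0 and -1 for m < 0.
import math

px = {0: 0.2, 1: 0.5, 2: 0.3}   # marginal pmf of X (hypothetical)

def corr_with_linear(m, c):
    """Corr(X, mX + c) computed from the pmf of X alone."""
    ex  = sum(x * p for x, p in px.items())
    ey  = sum((m * x + c) * p for x, p in px.items())
    exy = sum(x * (m * x + c) * p for x, p in px.items())
    vx  = sum((x - ex) ** 2 * p for x, p in px.items())
    vy  = sum((m * x + c - ey) ** 2 * p for x, p in px.items())
    return (exy - ex * ey) / math.sqrt(vx * vy)

assert abs(corr_with_linear(2, 3) - 1.0) < 1e-9   # m > 0  ->  rho = +1
assert abs(corr_with_linear(-4, 1) + 1.0) < 1e-9  # m < 0  ->  rho = -1
```

The intercept $$c$$ drops out entirely, and only the sign of $$m$$ matters.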

## Covariance and Correlation for Discrete Distributions

If $$X$$ and $$Y$$ are discrete random variables, we generally:

1. Find $$E(X)$$ and $$E(Y)$$ from the marginal pmfs:
$$E\left(X\right)=\sum_{\forall\ x}{x\cdot P(X=x)}$$
and
$$E\left(Y\right)=\sum_{\forall\ y}{y\cdot P(Y=y)}$$
2. Find $$E(XY)$$ by applying the double sum
$$E\left(XY\right)=\sum_{all\ x} \sum_{all\ y}xy\ [P(X=x,Y=y)]$$
3. Calculate $$Cov(X,Y)$$ and $$Corr(X,Y)$$ using the formulas:
$$Cov\left(X,Y\right)=E\left(XY\right)-E(X)E(Y)$$
And
$$\rho\left(X,Y\right)=\frac{Cov(X,Y)}{\sqrt{Var\left(X\right)Var\left(Y\right)}}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}$$

#### Example: Covariance and Correlation Coefficient (Discrete Random Variables) #1

Calculate the covariance of the random variables $$X$$ and $$Y$$ given the following joint pmf:

$$\begin{array}{c|c|c|c|c} {\begin{matrix} X \\ \huge{\diagdown} \\ Y \end{matrix}} & {0} & {1} & {2} \\ \hline {1} & {0.1} & {0.1} & {0} \\ \hline {2} & {0.1} & {0.1} & {0.2} \\ \hline {3} & {0.2} & {0.1} & {0.1} \end{array}$$

Solution

We will use the formula $$Cov\ \left(X,Y\right)=E\left[XY\right]-E\left[X\right]E\left[Y\right]$$

Using the data from the table above:

\begin{align*} E\left(XY\right)&=\sum_{all\ x}\sum_{all\ y}xy [P(X=x,Y=y)] \\ &=\left[0\times1\right]\times0.1+\left[1\times1\right]\times0.1+\ldots+2\times3\times0.1=2 \end{align*}

The (marginal) probability mass function of $$X$$ is:

$$\begin{array}{c|c|c|c} \text{X} & {0} & {1} & {2} \\ \hline {{P}({X}={x})} & {0.4} & {0.3} & {0.3} \end{array}$$

Thus,

$$E\left(X\right)=0\times0.4+1\times0.3+2\times0.3=0.9$$

The (marginal) probability mass function of $$Y$$ is:

$$\begin{array}{c|c|c|c} \text{Y} & {1} & {2} & {3}\\ \hline {{P}({Y}={y})} & {0.2} & {0.4} & {0.4} \end{array}$$

Thus,

$$E\left(Y\right)=1\times0.2+2\times0.4+3\times0.4=2.2$$

Hence,

$$Cov\left(X,Y\right)=2-0.9\times2.2=0.02$$

To find the correlation coefficient using the respective marginal distributions, we can calculate the $$Var(X)$$ and $$Var(Y)$$. We know that:

\begin{align*} Var\left(X\right)&=E\left(X^2\right)-\left[E\left(X\right)\right]^2\\ &=\left[0^2\times0.4+1^2\times0.3+2^2\times0.3\right]-{0.9}^2\\ &=0.69 \end{align*}

Similarly,

\begin{align*} Var\left(Y\right)&=E\left(Y^2\right)-\left[E\left(Y\right)\right]^2 \\ &=\left[1^2\times0.2+2^2\times0.4+3^2\times0.4\right]-{2.2}^2\\ &=0.56 \end{align*}

Therefore,

\begin{align*} Corr\left(X,Y\right)&=\frac{cov\left(X,Y\right)}{\sqrt{var\left(X\right)var\left(Y\right)}}\\ &=\frac{0.02}{\sqrt{0.69\times0.56}}\approx0.03 \end{align*}
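As a sanity check, the whole of Example 1 can be recomputed in a few lines of Python directly from the joint pmf table:

```python
# Re-run Example 1's arithmetic from the joint pmf table above.
import math

joint = {  # {(x, y): P(X=x, Y=y)}, transcribed from the table
    (0, 1): 0.1, (1, 1): 0.1, (2, 1): 0.0,
    (0, 2): 0.1, (1, 2): 0.1, (2, 2): 0.2,
    (0, 3): 0.2, (1, 3): 0.1, (2, 3): 0.1,
}

ex  = sum(x * p for (x, y), p in joint.items())              # E(X)  = 0.9
ey  = sum(y * p for (x, y), p in joint.items())              # E(Y)  = 2.2
exy = sum(x * y * p for (x, y), p in joint.items())          # E(XY) = 2
vx  = sum(x * x * p for (x, y), p in joint.items()) - ex**2  # Var(X) = 0.69
vy  = sum(y * y * p for (x, y), p in joint.items()) - ey**2  # Var(Y) = 0.56

cov  = exy - ex * ey                 # 0.02
corr = cov / math.sqrt(vx * vy)      # about 0.032
```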

#### Example: Covariance and Correlation Coefficient (Discrete Random Variables) #2

Let $$X$$ and $$Y$$ have the following joint pmf:

$$f\left(x,y\right)=\frac{1}{33}\left(x+2y\right)\ \ \ \ \ \ \ x=1,2\ \ \ \ y=1,2,3.$$

Compute $$Corr\left(X,Y\right)$$.

Solution

First, we need:

\begin{align*} E\left(XY\right)&=\sum_{all\ x}\sum_{all\ y}{xy\ f\left(x,y\right)}\\ &=\sum_{x=1}^{2}\sum_{y=1}^{3}{xy\frac{x+2y}{33}}\\ &=\left(1\right)\left(1\right)\frac{\left(1\right)+2\left(1\right)}{33}+\left(1\right)\left(2\right)\frac{\left(1\right)+2\left(2\right)}{33}+\left(1\right)\left(3\right)\frac{\left(1\right)+2\left(3\right)}{33}\\ &+\left(2\right)\left(1\right)\frac{\left(2\right)+2\left(1\right)}{33}+\left(2\right)\left(2\right)\frac{\left(2\right)+2\left(2\right)}{33}+\left(2\right)\left(3\right)\frac{\left(2\right)+2\left(3\right)}{33}\\ &=\left(1\right)\frac{3}{33}+\left(2\right)\frac{5}{33}+\left(3\right)\frac{7}{33}+\left(2\right)\frac{4}{33}+\left(4\right)\frac{6}{33}+\left(6\right)\frac{8}{33}\\ &=\frac{38}{11} \end{align*}

Also, we need the variances $$Var(X)$$ and $$Var(Y)$$. As such, we need to find the marginal probability mass functions for $$X$$ and $$Y$$. We know that:

\begin{align*} f_X\left(x\right)&=\sum_{all\ y}{f\left(x,y\right)}=P\left(X=x\right),\ \ x\in S_x\\ &=\sum_{y=1}^{3}{\frac{1}{33}\left(x+2y\right)}\\ &=\frac{x+2\left(1\right)}{33}+\frac{x+2\left(2\right)}{33}+\frac{x+2\left(3\right)}{33}\\ &=\frac{3x+12}{33}\\ E\left(X\right)&=\sum_{all\ x}{xf_X\left(x\right)}\\ &=\sum_{x=1}^{2}{x\ \frac{3x+12}{33}}\\ &=\left(1\right)\frac{3\left(1\right)+12}{33}+\left(2\right)\frac{3\left(2\right)+12}{33}=\frac{51}{33}=\frac{17}{11} \end{align*}

Also, we know that:

$$Var\left(X\right)=E\left(X^2\right)-\left[E\left(X\right)\right]^2$$

Now,

\begin{align*} E\left(X^2\right)&=\sum_{all\ x}{x^2f_X\left(x\right)}\\ &=\sum_{x=1}^{2}{x^2\frac{3x+12}{33}}\\ &=\left(1\right)^2\frac{3\left(1\right)+12}{33}+\left(2\right)^2\frac{3\left(2\right)+12}{33}=\frac{87}{33}=\frac{29}{11} \end{align*}

Thus,

\begin{align*} Var\left(X\right)&=E\left(X^2\right)-\left[E\left(X\right)\right]^2\\ &=\frac{29}{11}-\left(\frac{17}{11}\right)^2=\frac{30}{121}\ \end{align*}

Similarly, the marginal probability mass function for $$Y$$ is given by:

\begin{align*} f_Y\left(y\right)&=\sum_{all\ x}{f\left(x,y\right)}=P\left(Y=y\right),\ \ y\in S_y\\ &=\sum_{x=1}^{2}{\frac{1}{33}\left(x+2y\right)}\\ &=\frac{\left(1\right)+2y}{33}+\frac{\left(2\right)+2y}{33}\\ &=\frac{4y+3}{33} \end{align*}

The mean and the variance of $$Y$$ can be calculated as follows:

\begin{align*} E\left(Y\right)&=\sum_{all\ y}{yf_Y\left(y\right)}\\ &=\sum_{y=1}^{3}{y\frac{4y+3}{33}}\\ &=\left(1\right)\frac{4\left(1\right)+3}{33}+\left(2\right)\frac{4\left(2\right)+3}{33}+\left(3\right)\frac{4\left(3\right)+3}{33}\\ &=\frac{7}{33}+\frac{22}{33}+\frac{45}{33}=\frac{74}{33}\end{align*}

And,

\begin{align*} E\left(Y^2\right)&=\sum_{y=1}^{3}{y^2\frac{4y+3}{33}}\\ &=\left(1\right)^2\frac{7}{33}\ +\left(2\right)^2\frac{11}{33}+\left(3\right)^2\frac{15}{33} \\ &=\frac{7}{33}+\frac{44}{33}+\frac{135}{33}\\ &=\frac{186}{33}\ \end{align*}

Thus,

\begin{align*} Var\left(Y\right)&=E\left(Y^2\right)-\left[E\left(Y\right)\right]^2\\ &=\frac{186}{33}-\left(\frac{74}{33}\right)^2=\frac{6138}{1089}-\frac{5476}{1089}\\ &=\frac{662}{1089} \end{align*}

The covariance of $$X$$ and $$Y$$ is:

$$Cov\left(X,Y\right)=E\left(XY\right)-E(X)E(Y)$$

Now,

$$Cov\left(X,Y\right) =\frac{38}{11}-\frac{17}{11}\times\frac{74}{33}=\frac{1254}{363}-\frac{1258}{363}=-\frac{4}{363}$$

Hence,

\begin{align*} Corr\left(X,Y\right)=\rho\left(X,Y\right)&=\frac{Cov\left(X,Y\right)}{\sqrt{Var\left(X\right)Var\left(Y\right)}}\\ &=\frac{-\frac{4}{363}}{\sqrt{\frac{662}{1089}\cdot\frac{30}{121}}}\approx-0.0284 \end{align*}

Note that $$f\left(x,y\right)\neq f_X\left(x\right)f_Y\left(y\right)$$, and thus $$X$$ and $$Y$$ are dependent.
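Because every quantity in Example 2 is a ratio of small integers, the computation can be repeated exactly with Python's `fractions` module:

```python
# Recompute Example 2 exactly: f(x, y) = (x + 2y)/33, x in {1,2}, y in {1,2,3}.
from fractions import Fraction as F

joint = {(x, y): F(x + 2 * y, 33) for x in (1, 2) for y in (1, 2, 3)}
assert sum(joint.values()) == 1        # a valid pmf

ex  = sum(x * p for (x, y), p in joint.items())              # 17/11
ey  = sum(y * p for (x, y), p in joint.items())              # 74/33
exy = sum(x * y * p for (x, y), p in joint.items())          # 38/11
vx  = sum(x * x * p for (x, y), p in joint.items()) - ex**2  # 30/121
vy  = sum(y * y * p for (x, y), p in joint.items()) - ey**2  # 662/1089

cov = exy - ex * ey                        # -4/363
rho = float(cov) / float(vx * vy) ** 0.5   # about -0.0284
```

Working in exact fractions avoids the rounding that makes hand calculations like this easy to get wrong.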

#### Example: Covariance and Correlation Coefficient (Discrete Case) #3

Determine the covariance and correlation coefficient given the following joint probability mass function:

$$f\left(x,y\right)=c\left(x^2+3y\right)\ \ \ \ \ \ x=1,2,3,4,\ \ \ y=1,2$$

Solution

First, we need to find the value of $$c$$ and then proceed to extract the marginal functions. We know that:

$$\sum_{x}\sum_{y}{P(X=x,\ Y=y)}=1$$

\begin{align*} \Rightarrow c\left(1^2+3\left(1\right)\right)+c\left(1^2+3\left(2\right)\right)+\ldots+c\left(4^2+3\left(2\right)\right)&=1\\ 4c+7c+7c+10c+12c+15c+19c+22c&=1\\ 96c&=1\\ \therefore c&=\frac{1}{96} \end{align*}

Using the above results, marginal functions are:

$$f_X\left(x\right)=\frac{2x^2+9}{96} \text { and } f_Y\left(y\right)=\frac{12y+30}{96}$$

Let’s now calculate the means of $$X$$ and $$Y$$:

\begin{align*} E\left(X\right)&=\sum_{x=1}^{4}{xf_X\left(x\right)}\\ &=\sum_{x=1}^{4}{x\frac{2x^2+9}{96}}\\ &=\left(1\right)\frac{11}{96}+\left(2\right)\frac{17}{96}+\left(3\right)\frac{27}{96}+\left(4\right)\frac{41}{96}\ \\ &=\frac{11}{96}+\frac{34}{96}+\frac{81}{96}+\frac{164}{96}\\ &=\frac{145}{48}\ \end{align*}

And,

\begin{align*} \sigma_X^2&=Var\left(X\right)=\sum_{x=1}^{4}{x^2f_X\left(x\right)-\left[E\left(X\right)\right]^2}\\ &=\sum_{x=1}^{4}{x^2\frac{2x^2+9}{96}}-\left(\frac{145}{48}\right)^2\\ &=\left(1\right)^2\frac{11}{96}+\left(2\right)^2\frac{17}{96}+\left(3\right)^2\frac{27}{96}+\left(4\right)^2\frac{41}{96}-\left(\frac{145}{48}\right)^2\\ &=\frac{163}{16}-\left(\frac{145}{48}\right)^2=1.062\ \end{align*}

Similarly for $$Y$$:

\begin{align*} \mu_Y&=E\left(Y\right)=\sum_{y=1}^{2}{yf_Y\left(y\right)}\\ &=\sum_{y=1}^{2}{y\frac{12y+30}{96}=\left(1\right)\frac{42}{96}+\left(2\right)\frac{54}{96}\ }\\ &=\frac{42}{96}+\frac{108}{96}\\ &=\frac{25}{16}\ \end{align*}

And,

\begin{align*} \sigma_Y^2&=\sum_{y=1}^{2}{y^2f_Y\left(y\right)}-\left[\mu_Y\right]^2\\ &=\sum_{y=1}^{2}{y^2\frac{12y+30}{96}}-\left(\frac{25}{16}\right)^2\\ &=\left(1\right)^2\frac{42}{96}+\left(2\right)^2\frac{54}{96}-\left(\frac{25}{16}\right)^2\\ &=\frac{42}{96}+\frac{216}{96}-\frac{625}{256}=\frac{43}{16}-\frac{625}{256}\\ &=\frac{63}{256} \end{align*}

We can now calculate $$Cov\left(X,Y\right)$$ and $$Corr(X,Y)$$.

$$Cov\left(X,Y\right)=E\left(XY\right)-E(X)E(Y)$$

Now,

\begin{align*} E\left(XY\right)&=\sum_{x=1}^{4}\sum_{y=1}^{2}{xy\frac{x^2+3y}{96}}\\ &=\left(1\right)\left(1\right)\frac{4}{96}+\left(1\right)\left(2\right)\frac{7}{96}+\left(2\right)\left(1\right)\frac{7}{96}+\left(2\right)\left(2\right)\frac{10}{96}+\left(3\right)\left(1\right)\frac{12}{96}\\ &+\left(3\right)\left(2\right)\frac{15}{96}+\left(4\right)\left(1\right)\frac{19}{96}+\left(4\right)\left(2\right)\frac{22}{96}\\ &=\frac{75}{16} \end{align*}

Therefore,

\begin{align*} Cov\left(X,Y\right)&=\frac{75}{16}-\left(\frac{145}{48}\right)\left(\frac{25}{16}\right)\\ &=\frac{75}{16}-\frac{3625}{768}\\ &=-\frac{25}{768} \end{align*}

And lastly,

\begin{align*} \rho\left(X,Y\right)&=\frac{Cov\left(X,Y\right)}{\sqrt{\sigma_X^2\sigma_Y^2}}\\ &=-\frac{\frac{25}{768}}{\sqrt{1.062\bullet\left(\frac{63}{256}\right)}}\\ &=-0.0636\ \end{align*}
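Example 3 can likewise be verified exactly with Python's `fractions` module:

```python
# Recompute Example 3 exactly: f(x, y) = (x^2 + 3y)/96, x in 1..4, y in {1,2}.
from fractions import Fraction as F

joint = {(x, y): F(x * x + 3 * y, 96) for x in (1, 2, 3, 4) for y in (1, 2)}
assert sum(joint.values()) == 1        # confirms c = 1/96

ex  = sum(x * p for (x, y), p in joint.items())              # 145/48
ey  = sum(y * p for (x, y), p in joint.items())              # 25/16
exy = sum(x * y * p for (x, y), p in joint.items())          # 75/16
vx  = sum(x * x * p for (x, y), p in joint.items()) - ex**2  # about 1.062
vy  = sum(y * y * p for (x, y), p in joint.items()) - ey**2  # 63/256

cov = exy - ex * ey                        # -25/768
rho = float(cov) / float(vx * vy) ** 0.5   # about -0.0637
```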

## Covariance and Correlation Coefficient for Continuous Random Variables

If $$X$$ and $$Y$$ are continuous random variables, we generally:

1. Find $$E(X)$$ and $$E(Y)$$ using the marginal pdfs:
$$E\left(X\right)=\int_{x}{x\cdot f_X(x)dx}$$
and
$$E\left(Y\right)=\int_{y}{y\cdot f_Y(y)dy}$$
2. Find $$E(XY)$$ by applying the iterated integral
$$E\left(XY\right)=\int_{x}\int_{y}{xy\cdot f_{XY}(x,y)\,dy\,dx}$$
3. Calculate $$Cov(X,Y)$$ and $$Corr(X,Y)$$ using the formulas:
$$Cov\left(X,Y\right)=E\left(XY\right)-E(X)E(Y)$$
and
$$\rho\left(X,Y\right)=\frac{Cov(X,Y)}{\sqrt{Var\left(X\right)Var\left(Y\right)}}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}$$

#### Example: Covariance and Correlation Coefficient (Continuous Case) #1

Let

$$f\left( x,y \right) =\begin{cases} \begin{matrix} \frac { 2 }{ 3 } \left( 2x+y \right) , & 0 < x < 1,0 < y < 1 \end{matrix} \\ \begin{matrix} 0, & \text{ otherwise } \end{matrix} \end{cases}$$

Find $$Cov\left(X,Y\right)$$ and $$Corr\left(X,Y\right)$$.

Solution

First, we compute the marginal pdf of $$X$$ given by:

\begin{align*} f_X\left(x\right)&=\int_{Y}\ f\left(x,y\right)dy\\ &=\frac{2}{3}\int_{0}^{1}\left(2x+y\right)dy\\ &=\frac{2}{3}\left[2xy+\frac{y^2}{2}\right]_0^1\ \\ &=\frac{2}{3}\left(2x+\frac{1}{2}\right)\ \end{align*}

We need:

\begin{align*} E\left(X\right)&=\int_{x}{x\cdot f_X\left(x\right)}dx\\ &=\frac{2}{3}\int_{0}^{1}{x\left(2x+\frac{1}{2}\right)dx}=\frac{2}{3}\left[\frac{2x^3}{3}+\frac{x^2}{4}\right]_0^1\\ &=\frac{2}{3}\left(\frac{2}{3}+\frac{1}{4}\right)\\ &=\frac{11}{18} \end{align*}

Also, we need:

\begin{align*} E\left(X^2\right)&=\int_{x}{x^2\cdot f_X\left(x\right)}dx\\ &=\frac{2}{3}\int_{0}^{1}{x^2\left(2x+\frac{1}{2}\right)dx}=\frac{2}{3}\left[\frac{x^4}{2}+\frac{x^3}{6}\right]_0^1=\frac{2}{3}\left(\frac{1}{2}+\frac{1}{6}\right)\\ &=\frac{4}{9} \end{align*}

Thus,

\begin{align*} Var\left(X\right)&=E\left(X^2\right)-\left[E\left(X\right)\right]^2\\ &=\frac{4}{9}-\frac{121}{324}=\frac{23}{324}\ \end{align*}

Let us compute the marginal pdf for $$Y$$, given by:

\begin{align*} f_Y\left(y\right)&=\int_{x}\ f\left(x,y\right)dx\\ &=\frac{2}{3}\int_{0}^{1}{\left(2x+y\right)dx=\frac{2}{3}\left[x^2+xy\right]_0^1=\frac{2}{3}(1+y)} \end{align*}

We need:

\begin{align*} E\left(Y\right)&=\int_{y}{y\cdot f_Y\left(y\right)}dy\\ &=\frac{2}{3}\int_{0}^{1}{y\left(1+y\right)dy}=\frac{2}{3}\left[\frac{y^2}{2}+\frac{y^3}{3}\right]_0^1\\ &=\frac{2}{3}\left(\frac{1}{2}+\frac{1}{3}\right)=\frac{5}{9} \end{align*}

Also, we need:

\begin{align*} E\left(Y^2\right)&=\int_{y}{y^2\cdot f_Y\left(y\right)}dy\\ &=\frac{2}{3}\int_{0}^{1}{y^2\left(1+y\right)dy}=\frac{2}{3}\left[\frac{y^3}{3}+\frac{y^4}{4}\right]_0^1\\ &=\frac{2}{3}\left(\frac{1}{3}+\frac{1}{4}\right)=\frac{7}{18} \end{align*}

And,

\begin{align*} Var\left(Y\right)&=E\left(Y^2\right)-\left[E\left(Y\right)\right]^2\\ &=\frac{7}{18}-\left(\frac{5}{9}\right)^2=\frac{13}{162} \end{align*}

We also need $$E\left(XY\right)$$, where:

\begin{align*} E\left(XY\right)&=\int\int{xy\ f\left(x,y\right)\ dydx}\\ &=\frac{2}{3}\int_{0}^{1}\int_{0}^{1}xy\left(2x+y\right)dxdy\\ &=\frac{2}{3}\int_{0}^{1}\int_{0}^{1}\left(2x^2y+xy^2\right)dxdy\\ &=\frac{2}{3}\int_{0}^{1}\left[\frac{2x^3y}{3}+\frac{x^2y^2}{2}\right]_0^1dy\\ &=\frac{2}{3}\int_{0}^{1}{\left[\frac{2y}{3}+\frac{y^2}{2}\right]dy}=\frac{2}{3}\left[\frac{y^2}{3}+\frac{y^3}{6}\right]_0^1\\ &=\frac{2}{3}\left(\frac{1}{3}+\frac{1}{6}\right)=\frac{1}{3} \end{align*}

At this point, we can calculate the covariance for this function:

\begin{align*} Cov\left(X,Y\right)&=E\left[XY\right]-E\left[X\right]E\left[Y\right]\\ &=\frac{1}{3}-\frac{11}{18}\times\frac{5}{9}=-\frac{1}{162} \end{align*}

And lastly,

\begin{align*} \rho&=\frac{cov\left(X,Y\right)}{\sqrt{Var\left(X\right)Var\left(Y\right)}}\\ &=\frac{-\frac{1}{162}}{\sqrt{\frac{23}{324}\times\frac{13}{162}}}=-0.082 \end{align*}
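As a cross-check of the continuous example, the integrals can be approximated numerically with a simple midpoint-rule double sum over the unit square (plain Python, no external libraries):

```python
# Midpoint-rule approximation of the moments of f(x, y) = (2/3)(2x + y)
# on the unit square, cross-checking the exact results above.
import math

def f(x, y):
    return (2.0 / 3.0) * (2.0 * x + y)   # joint pdf for 0 < x, y < 1

n = 400                                  # grid resolution per axis
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]  # cell midpoints

ex = ey = exy = ex2 = ey2 = 0.0
for x in pts:
    for y in pts:
        w = f(x, y) * h * h              # probability mass of the cell
        ex  += x * w
        ey  += y * w
        exy += x * y * w
        ex2 += x * x * w
        ey2 += y * y * w

cov = exy - ex * ey                                   # about -1/162
rho = cov / math.sqrt((ex2 - ex**2) * (ey2 - ey**2))  # about -0.082
```

The grid estimates agree with the exact values $$E(X)=\tfrac{11}{18}$$, $$Cov(X,Y)=-\tfrac{1}{162}$$, and $$\rho\approx-0.082$$ to several decimal places.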

## Learning Outcome

Topic 3.f: Multivariate Random Variables – Calculate joint moments, such as the covariance and the correlation coefficient.
