
Explain and apply joint moment generating functions

We can derive the moments of most distributions directly from their probability functions, integrating or summing as necessary. However, moment generating functions offer a relatively simpler approach to obtaining moments.

Univariate Random Variables

In the univariate case, the moment generating function, \(M_X(t)\), of a random variable X is given by:

$$ M_X(t)=E\left[e^{tX}\right] $$

for all values of \(t\) for which the expectation exists.

Moment generating functions can be defined for both discrete and continuous random variables. For discrete random variables, the moment generating function is defined as:

$$M_X(t)=E\left[e^{tX}\right]=\sum_{x}{e^{tx}P(X=x)}$$

and for the continuous random variables, the moment generating function is given by:

$$M_X(t)=E\left[e^{tX}\right]=\int_{x}{e^{tx}f_X(x)\,dx}$$

If \(Y=aX+b\) for constants \(a\) and \(b\), then it can be shown that:

$$M_Y\left(t\right)=e^{bt}M_X(at)$$

That is:

$$M_Y\left(t\right)=E\left[e^{tY}\right]=E\left[e^{t\left(aX+b\right)}\right]=e^{bt}E\left[e^{atX}\right]=e^{bt}M_X(at)$$
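As a quick numerical sketch of this property (the die, the constants \(a=2\), \(b=3\), and the function names below are illustrative assumptions, not from the text), we can verify \(M_{aX+b}(t)=e^{bt}M_X(at)\) for a fair die:

```python
import math

# Illustrative check of M_{aX+b}(t) = e^{bt} M_X(at).
# X is uniform on {1, ..., 6} (a fair die); a = 2 and b = 3 are arbitrary.

def mgf_die(t):
    """M_X(t) = E[e^{tX}] for X uniform on {1, ..., 6}."""
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

def mgf_linear_direct(t, a=2, b=3):
    """Direct computation of M_Y(t) = E[e^{t(aX + b)}]."""
    return sum(math.exp(t * (a * x + b)) for x in range(1, 7)) / 6

t = 0.1
direct = mgf_linear_direct(t)
formula = math.exp(3 * t) * mgf_die(2 * t)
assert abs(direct - formula) < 1e-12
```

Both sides agree to machine precision because the expectation is a finite sum here.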

Multivariate Random Variables

Moment generating functions can be extended to multivariate (two or more) random variables, where we use the same underlying concepts. Once the moment generating function is established, we can determine the mean, variance, and other moments.

For \(n\) random variables \(X_1,\ldots,\ X_n\), the joint moment generating function \(M(t_1,t_2,\ldots,\ t_n)\) is defined for all real values of \(t_1,t_2,\ldots,\ t_n\) as:

$$M\left(t_1,t_2,\ldots,\ t_n\right)=E\left[e^{t_1X_1+t_2X_2+\ldots+t_nX_n}\right]$$

An individual moment generating function can be obtained from \(M(t_1,t_2,\ldots,\ t_n)\) by setting all the \(t_j\)'s except \(t_i\) equal to 0. That is,

$$M_{X_i}\left(t\right)=E\left[e^{tX_i}\right]=M(0,\ldots,0,t,0,\ldots,\ 0),$$

where \(t\) appears in the \(i\)th position.

Moreover, if \(X_1,\ldots,\ X_n\) are \(n\) independent random variables and \(Y=X_1+\ldots+X_n\), it can be proved that:

$$M_Y\left(t\right)=M_{X_1+\ldots+X_n}\left(t\right)=\prod_{i=1}^{n}{M_{X_i}(t)}=M_{X_1}\left(t\right)\bullet M_{X_2}\left(t\right)\bullet\ldots\bullet M_{X_n}(t)$$

Moment Generating Function for Identically Distributed Random Variables

If \(X_1,X_2,\ldots,\ X_n\) are \(n\) independent, identically distributed random variables with common moment generating function \(M_X(t)\), and \(Y=X_1+\ldots+X_n\), then it can be shown that:

$$M_Y\left(t\right)=\left[M_X(t)\right]^n$$
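The product rule for independent variables and its iid special case can both be spot-checked numerically. A minimal sketch, assuming three fair dice (all names below are illustrative):

```python
import math
from itertools import product

def mgf_die(t):
    """Common mgf M_X(t) of a single fair die."""
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

def mgf_sum_direct(t, n=3):
    """Brute-force E[e^{t(X_1 + ... + X_n)}] over all 6**n equally likely outcomes."""
    return sum(
        math.exp(t * sum(outcome)) for outcome in product(range(1, 7), repeat=n)
    ) / 6 ** n

t = 0.2
# M_Y(t) computed directly agrees with [M_X(t)]^n:
assert abs(mgf_sum_direct(t) - mgf_die(t) ** 3) < 1e-9
```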

Example: Moment Generating Function for iid Random Variables

Let \(X_1\) and \(X_2\) be independent identically distributed random variables with the following respective pdfs:

$$f\left(x_1\right)={2e}^{-2x_1},\ x_1\geq 0$$

And

$$f\left(x_2\right)={2e}^{-2x_2},\ x_2\geq 0$$

Now, if \(Y=X_1+X_2\), find \(M_Y(t)\).

Solution

Since \(X_1\) and \(X_2\) are independent and identically distributed random variables, the joint density is given by:

$$f\left(x_1,x_2\right)=f\left(x_1\right)\bullet f\left(x_2\right)$$

Thus, the moment generating function is given by:

$$\begin{align}M_Y(t)&=\int_{0}^{\infty}\int_{0}^{\infty}{e^{t(x_1+x_2)}f\left(x_1\right)f\left(x_2\right)dx_1dx_2}\\ &=\int_{0}^{\infty}{{2e}^{(t-2)x_1}dx_1}\int_{0}^{\infty}{{2e}^{\left(t-2\right)x_2}dx_2}\\ &=\frac{2}{t-2}\left[e^{\left(t-2\right)x_1}\right]_0^\infty\cdot\frac{2}{t-2}\left[e^{\left(t-2\right)x_2}\right]_0^\infty\\ &=\frac{2}{t-2}\left(0-1\right)\cdot\frac{2}{t-2}\left(0-1\right)\\ &=\left(\frac{2}{2-t}\right)^2,\quad t<2\end{align}$$

since \(e^{(t-2)x}\rightarrow 0\) as \(x\rightarrow\infty\) whenever \(t<2\).
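A Monte Carlo sanity check of this result, as a sketch under the assumption that \(X_1\) and \(X_2\) are exponential with rate 2 (the seed and sample size below are arbitrary choices):

```python
import math
import random

# Y = X1 + X2 with X1, X2 ~ Exp(rate=2) independent; we expect
# M_Y(t) = (2 / (2 - t))^2 for t < 2.
random.seed(42)
t = 0.5
n = 200_000

estimate = sum(
    math.exp(t * (random.expovariate(2) + random.expovariate(2)))
    for _ in range(n)
) / n

exact = (2 / (2 - t)) ** 2  # = (4/3)^2
assert abs(estimate - exact) < 0.05
```

Note that `random.expovariate` takes the rate parameter, so `expovariate(2)` draws from the density \(2e^{-2x}\).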

Example: Joint Moment Generating Function for Uniformly Distributed Random Variables

Let \(X_1\) and \(X_2\) be independent random variables, each with a uniform distribution on \(\{1,2,3,4,5,6\}\). Let \(Y = X_1 + X_2\). For example, \(Y\) could equal the sum when two fair dice are rolled. The mgf of \(Y\) is:

$$ M_Y(t) = E\big(e^{tY}\big) = E\big[e^{t(X_1+X_2)}\big] = E\big(e^{tX_1}e^{tX_2}\big) $$

The independence of \(X_1\) and \(X_2\) implies that:

$$ M_Y(t) = E\big(e^{tX_1}\big)E\big(e^{tX_2}\big) $$

Note that \(X_1\) and \(X_2\) have the same pmf:

$$ f(x_1)=f(x_2) = \frac{1}{6}, \qquad x_1, x_2 = 1,2,3,4,5,6, $$

and thus the same mgf. That is,

$$M_{X_1}\left(t\right)=M_{X_2}\left(t\right)=M_X\left(t\right)=\frac{1}{6}e^t+\frac{1}{6}e^{2t}+\frac{1}{6}e^{3t}+\frac{1}{6}e^{4t}+\frac{1}{6}e^{5t}+\frac{1}{6}e^{6t}$$


It then follows that \(M_Y(t) = [M_X(t)]^2\) equals:

$$ \frac{1}{36}e^{2t} + \frac{2}{36}e^{3t} + \frac{3}{36}e^{4t} + \frac{4}{36}e^{5t} +\frac{5}{36}e^{6t} + \frac{6}{36}e^{7t} + \frac{5}{36}e^{8t} + \frac{4}{36}e^{9t} + \frac{3}{36}e^{10t} + \frac{2}{36}e^{11t} + \frac{1}{36}e^{12t}. $$

Note that the coefficient of \(e^{bt}\) is equal to the probability \(P(Y = b)\); that is, the probability that the sum of the dice equals \(b\), for \(b = 2, 3, \ldots, 12\). As observed, we can find the distribution of \(Y\) from its moment generating function alone.
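These coefficients can be generated mechanically: multiplying the two mgfs as polynomials in \(e^t\) is the same as convolving the two pmfs. A short sketch (exact fractions are kept with `fractions.Fraction`; the variable names are illustrative):

```python
from fractions import Fraction

# pmf of one fair die; convolving it with itself gives the pmf of Y = X1 + X2,
# i.e. the coefficients of e^{bt} in M_Y(t).
pmf_die = {x: Fraction(1, 6) for x in range(1, 7)}

pmf_sum = {}
for x1, p1 in pmf_die.items():
    for x2, p2 in pmf_die.items():
        pmf_sum[x1 + x2] = pmf_sum.get(x1 + x2, Fraction(0)) + p1 * p2

assert pmf_sum[2] == Fraction(1, 36)   # coefficient of e^{2t}
assert pmf_sum[7] == Fraction(6, 36)   # coefficient of e^{7t}
assert sum(pmf_sum.values()) == 1      # probabilities sum to one
```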

This leads us to the following theorem:

If \(X_1,X_2,\cdots,X_n\) are \(n\) independent random variables with respective moment generating functions \({ M }_{ { X }_{ i } }(t)=E({ e }^{ t{ X }_{ i } })\) for \(i = 1, 2, \cdots ,n\), then the moment-generating function of the linear combination:

$$ Y=\sum _{ i=1 }^{ n }{ { a }_{ i }{ X }_{ i } } $$

is

$$ M_Y(t)= \prod_{i=1}^{n}M_{X_i}(a_it) $$

This theorem tells us that if we know the moment generating function of each of the independent variables in a linear combination, then the moment generating function of the combination is simply the product of the individual moment generating functions, each evaluated at \(a_it\).

The above theorem proves the following to be true:

If \(X_1,X_2,\cdots,X_n\) are observations of a random sample from a distribution with moment-generating function \(M(t)\), where \(-h < t < h\), then:

  1. The moment-generating function of \(Y = \sum_{i=1}^{n}X_i\) is:
    $$ M_Y(t) = \prod_{i=1}^{n}M(t) = [M(t)]^n$$
  2. The moment-generating function of \(\bar{X} = \sum_{i=1}^{n}(1/n)X_i\) is:
    $$ M_{\bar{X}}(t) = \prod_{i=1}^{n}M\bigg(\frac{t}{n}\bigg) = \bigg[M\bigg(\frac{t}{n}\bigg)\bigg]^n $$
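Result 2 can be spot-checked numerically. A minimal sketch for \(n = 2\) fair dice (the function names are illustrative assumptions):

```python
import math
from itertools import product

def mgf_die(t):
    """Common mgf M(t) of a single fair die."""
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

def mgf_mean_direct(t, n=2):
    """Direct E[e^{t * Xbar}] over all 6**n equally likely outcomes."""
    return sum(
        math.exp(t * sum(outcome) / n) for outcome in product(range(1, 7), repeat=n)
    ) / 6 ** n

t = 0.3
# M_Xbar(t) agrees with [M(t/n)]^n:
assert abs(mgf_mean_direct(t) - mgf_die(t / 2) ** 2) < 1e-12
```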

Example: Joint Moment Generating Function for Bernoulli Variables

Let \(X_1,X_2,\cdots,X_n\) denote the outcomes of \(n\) Bernoulli trials, each with probability of success \(p\) and probability of failure \(q = 1-p\). The moment generating function of each of these Bernoulli trials is:

$$M(t) = q + pe^t, \quad -\infty < t < \infty $$

If

$$Y=\sum_{i=1}^{n}X_i$$

then

$$M_Y(t)=\prod_{i=1}^{n}(q+pe^t)= (q+pe^t)^n, \quad -\infty < t < \infty$$

Thus, we see that \(Y\) has a binomial distribution, \(b(n,p)\).
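A sketch confirming this identification numerically, using the illustrative values \(n = 10\) and \(p = 0.3\) (not from the text):

```python
import math

# Compare the product of Bernoulli mgfs, (q + p e^t)^n, with E[e^{tY}]
# computed directly from the binomial pmf b(n, p).
n, p = 10, 0.3
q = 1 - p
t = 0.4

mgf_product = (q + p * math.exp(t)) ** n
mgf_binomial = sum(
    math.comb(n, y) * p ** y * q ** (n - y) * math.exp(t * y)
    for y in range(n + 1)
)
assert abs(mgf_product - mgf_binomial) < 1e-10
```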

Example: Joint Moment Generating Function for Binomial Variables

Let \(X\) and \(Y\) be independent binomial random variables with parameters \((n,p)\) and \((m,p)\), respectively.

Find the pmf of \(X + Y\).

Solution

$$\begin{align} { M }_{ X+Y }(t) & ={ M }_{ X }(t){ M }_{ Y }(t) \\ & ={ (p{ e }^{ t }+1-p) }^{ n }{ (p{ e }^{ t }+1-p) }^{ m } \\ & ={ (p{ e }^{ t }+1-p) }^{ n+m } \end{align} $$

This is the mgf of a binomial random variable with parameters \((n+m,\ p)\). Hence \(X+Y\sim b(n+m,\ p)\), and its pmf is:

$$P(X+Y=k)=\binom{n+m}{k}p^k(1-p)^{n+m-k},\quad k=0,1,\ldots,n+m$$

To help you understand the example above, the following is the moment generating function of a binomial variable \(X\) with parameters \(n\) and \(p\):

$$ { M }_{ X }(t)=E({ e }^{ tX })=\sum _{ x=0 }^{ n }{ { e }^{ tx }\left( \begin{matrix} n \\ x \end{matrix} \right) { p }^{ x }{ (1-p) }^{ n-x } } $$

$$ =\sum _{ x=0 }^{ n }{ \left( \begin{matrix} n \\ x \end{matrix} \right) { (p{ e }^{ t }) }^{ x }{ (1-p) }^{ n-x } } ={ (p{ e }^{ t }+1-p) }^{ n } $$

It is easy to see why this particular result makes sense. If we toss a coin 5 times in the morning and count the number of heads, the number would be distributed as \(Bin(5, \cfrac { 1 }{ 2 } )\). If we toss the coin a further 10 times in the evening, the number of heads will be distributed as \(Bin(10, \cfrac { 1 }{ 2 })\). Adding the totals together is obviously the same as the \(Bin(15, \cfrac { 1 }{ 2 } )\) distribution that we would expect for the whole day.
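The coin-tossing intuition can also be verified by convolving the two binomial pmfs directly. A sketch assuming \(Bin(5, 1/2)\) and \(Bin(10, 1/2)\), as in the example (the helper name is illustrative):

```python
import math

def binom_pmf(k, n, p):
    """pmf of a binomial b(n, p) random variable at k."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

n, m, p = 5, 10, 0.5
for s in range(n + m + 1):
    # P(X + Y = s) by convolution of Bin(n, p) and Bin(m, p) ...
    conv = sum(
        binom_pmf(k, n, p) * binom_pmf(s - k, m, p)
        for k in range(max(0, s - m), min(n, s) + 1)
    )
    # ... matches the Bin(n + m, p) pmf term by term.
    assert abs(conv - binom_pmf(s, n + m, p)) < 1e-12
```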

Learning Outcome

Topic 3.d: Multivariate Random Variables – Explain and apply joint moment generating functions.
