Explain and apply joint moment generating functions

The moment-generating function

Our knowledge of the moment-generating function extends to the multivariate case, where the same observations made for the univariate moment-generating function still apply: we can calculate the mean and other moments of a distribution, and even recover the distribution itself, knowing only its moment-generating function.

If \(Y = u(X_1,X_2,\dots,X_n)\), then we can find \(E(Y)\) by evaluating \(E[u(X_1,X_2,\dots,X_n)]\). Similarly, we can find \(E[e^{tY}]\) by evaluating \(E[e^{tu(X_1,X_2,\dots,X_n)}]\).

Let’s see this in a quick example:

Let \(X_1\) and \(X_2\) be independent random variables, each uniformly distributed on \(\{1,2,3,4,5,6\}\), and let \(Y = X_1 + X_2\). For example, \(Y\) could equal the sum when two fair dice are rolled. The mgf of \(Y\) is

$$ M_Y(t) = E\big(e^{tY}\big) = E\big[e^{t(X_1+X_2)}\big] = E\big(e^{tX_1}e^{tX_2}\big). $$

The independence of \(X_1\) and \(X_2\) implies that

$$ M_Y(t) = E\big(e^{tX_1}\big)E\big(e^{tX_2}\big). $$

In this example, \(X_1\) and \(X_2\) have the same pmf,

$$ f(x) = \frac{1}{6}, \qquad x=1,2,3,4,5,6, $$

and thus the same mgf,

$$ M_X(t) = \frac{1}{6}e^{t} + \frac{1}{6}e^{2t} + \frac{1}{6}e^{3t} + \frac{1}{6}e^{4t} + \frac{1}{6}e^{5t} + \frac{1}{6}e^{6t}. $$

It then follows that \(M_Y(t) = [M_X(t)]^2\) equals

$$ \frac{1}{36}e^{2t} + \frac{2}{36}e^{3t} + \frac{3}{36}e^{4t} + \frac{4}{36}e^{5t} +\frac{5}{36}e^{6t} + \frac{6}{36}e^{7t} + \frac{5}{36}e^{8t} + \frac{4}{36}e^{9t} + \frac{3}{36}e^{10t} + \frac{2}{36}e^{11t} + \frac{1}{36}e^{12t}. $$

Note that the coefficient of \(e^{bt}\) is equal to the probability \(P(Y = b)\); that is, the coefficient of \(e^{2t}\) is the probability that the sum of the dice equals 2, and so on. These probabilities agree with those found earlier in this text. This shows that we can find the distribution of \(Y\) just by determining its moment-generating function: each fraction is the probability of one point \(b\) in the distribution of \(Y\).
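As a computational aside (a minimal Python sketch, not part of the original text), the same coefficients can be produced by multiplying the two mgf polynomials, adding exponents just as \(e^{tx_1}e^{tx_2} = e^{t(x_1+x_2)}\) suggests:

```python
from fractions import Fraction

# pmf of one fair die: f(x) = 1/6 for x = 1, ..., 6
die = {x: Fraction(1, 6) for x in range(1, 7)}

# Multiplying M_X(t) by itself adds exponents of e^t, so the
# coefficient of e^{bt} in [M_X(t)]^2 accumulates P(X1 + X2 = b).
pmf_Y = {}
for x1, p1 in die.items():
    for x2, p2 in die.items():
        pmf_Y[x1 + x2] = pmf_Y.get(x1 + x2, Fraction(0)) + p1 * p2

for b in sorted(pmf_Y):
    print(f"P(Y = {b}) = {pmf_Y[b]}")  # e.g. P(Y = 7) = 1/6
```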

This leads us to the following theorem:

If \(X_1,X_2,\cdots,X_n\) are independent random variables with respective moment-generating functions \(M_{X_i}(t),\ i=1,2,\cdots,n,\) where \(-h_i < t < h_i\) for positive numbers \(h_i,\ i=1,2,\cdots,n,\) then the moment-generating function of \(Y=\sum_{i=1}^{n}a_iX_i\) is

$$ M_Y(t)= \prod_{i=1}^{n}M_{X_i}(a_it),\quad \text{where } -h_i < a_it < h_i,\ i=1,2,\cdots,n. $$

This theorem says that if we know the moment-generating function of each of the independent variables making up the linear combination, then the moment-generating function of that combination is simply the product of the individual moment-generating functions, each evaluated at \(a_it\).
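The proof is essentially one line: by independence, the expectation of a product factors into the product of expectations,

$$ M_Y(t) = E\Big[e^{t\sum_{i=1}^{n}a_iX_i}\Big] = E\Big[\prod_{i=1}^{n}e^{a_itX_i}\Big] = \prod_{i=1}^{n}E\big[e^{a_itX_i}\big] = \prod_{i=1}^{n}M_{X_i}(a_it). $$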

This theorem then allows us to infer the following corollary:

If \(X_1,X_2,\cdots,X_n\) are observations of a random sample from a distribution with moment-generating function \(M(t)\), where \(-h < t < h\), then

  1. the moment-generating function of \(Y = \sum_{i=1}^{n}X_i\) is

    $$ M_Y(t) = \prod_{i=1}^{n}M(t) = [M(t)]^n,\qquad -h < t < h;$$

  2. the moment-generating function of \(\bar{X} = \sum_{i=1}^{n}(1/n)X_i\) is

    $$ M_{\bar{X}}(t) = \prod_{i=1}^{n}M\bigg(\frac{t}{n}\bigg) = \bigg[M\bigg(\frac{t}{n}\bigg)\bigg]^n , \quad -h < \frac{t}{n} < h.$$
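Both parts are special cases of the theorem: part 1 takes \(a_i = 1\) for every \(i\), and part 2 takes \(a_i = 1/n\), since

$$ M_{\bar{X}}(t) = E\Big[e^{t\sum_{i=1}^{n}(1/n)X_i}\Big] = \prod_{i=1}^{n}M\bigg(\frac{t}{n}\bigg). $$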

To illustrate these results, we use distributions that are already known to the reader:

Let \(X_1,X_2,\cdots,X_n\) denote the outcomes of \(n\) Bernoulli trials, each with probability of success \(p\). The moment-generating function of each \(X_i\) is

$$M(t) = q + pe^t, \quad -\infty < t < \infty,$$

where \(q = 1 - p\).

If

$$Y=\sum_{i=1}^{n}X_i,$$

then

$$M_Y(t)=\prod_{i=1}^{n}(q+pe^t)= (q+pe^t)^n, \quad -\infty < t < \infty.$$

Thus, we see that \(Y\) has a binomial distribution, \(b(n,p)\), since \((q+pe^t)^n\) is the moment-generating function of the \(b(n,p)\) distribution.
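As a quick numerical check (a sketch with illustrative values of \(n\), \(p\), and \(t\) chosen arbitrarily, not from the text), the product formula can be compared against the mgf computed directly from the binomial pmf:

```python
from math import comb, exp

n, p, t = 10, 0.3, 0.7  # illustrative values, chosen arbitrarily
q = 1 - p

# mgf from the product formula of the theorem
lhs = (q + p * exp(t)) ** n

# mgf computed directly from the b(n, p) pmf:
#   sum_k C(n, k) p^k q^(n-k) e^{tk}
rhs = sum(comb(n, k) * p**k * q**(n - k) * exp(t * k) for k in range(n + 1))

print(lhs, rhs)  # both print the same value, consistent with Y ~ b(n, p)
```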

We can carry out the same process with a Poisson distribution:

Let \(Y=\sum_{i=1}^{n}X_i\) as in the last example, except that now each \(X_i\) has a Poisson distribution with mean \(\lambda\), so that \(M(t)=e^{\lambda(e^t-1)}\). With this we get

$$M_Y(t)= \prod_{i=1}^{n}e^{\lambda(e^t-1)} = \big(e^{\lambda(e^t-1)}\big)^n = e^{n\lambda(e^t-1)},$$

which is the moment-generating function of a Poisson distribution with mean \(n\lambda\); hence \(Y\) is Poisson with mean \(n\lambda\).
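A Monte Carlo check of this result is straightforward; the sketch below (with illustrative values of \(n\), \(\lambda\), and \(t\), not from the text) compares the empirical mgf of \(Y\) with the closed form \(e^{n\lambda(e^t-1)}\):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, t = 5, 2.0, 0.3  # illustrative values, chosen arbitrarily

# Simulate Y = X_1 + ... + X_n for many replications
Y = rng.poisson(lam, size=(200_000, n)).sum(axis=1)

# Empirical E[e^{tY}] versus the closed-form mgf e^{n*lam*(e^t - 1)}
print(np.exp(t * Y).mean())               # Monte Carlo estimate
print(np.exp(n * lam * (np.exp(t) - 1)))  # exact value, ~33.1
```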

 

Learning Outcome

Topic 3.d: Multivariate Random Variables – Explain and apply joint moment generating functions.

