We can derive the moments of most distributions directly by integrating or summing over the probability function as necessary. However, moment generating functions offer a relatively simpler route to the same moments.
In the univariate case, the moment generating function, \(M_X(t)\), of a random variable X is given by:
$$M_X(t)=E\left[e^{tX}\right]$$
for all values of \(t\) for which the expectation exists.
Moment generating functions can be defined for both discrete and continuous random variables. For discrete random variables, the moment generating function is defined as:
$$M_X(t)=E\left[e^{tX}\right]=\sum_{x}{e^{tx}P(X=x)}$$
and for continuous random variables, the moment generating function is given by:
$$M_X(t)=E\left[e^{tX}\right]=\int_{-\infty}^{\infty}{e^{tx}f_X(x)\,dx}$$
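As a quick illustration, the following Python (sympy) sketch computes an MGF directly from each definition. The fair-die pmf and the rate-2 exponential density used here are illustrative choices, not part of the reading.

```python
# A minimal sketch: computing an MGF directly from the definition,
# once for a discrete and once for a continuous example.
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Discrete: X uniform on {1, ..., 6}  ->  M_X(t) = sum_x e^{tx} P(X = x)
M_discrete = sum(sp.exp(t * k) * sp.Rational(1, 6) for k in range(1, 7))
print(M_discrete)

# Continuous: f_X(x) = 2 e^{-2x}, x >= 0, so e^{tx} f_X(x) = 2 e^{(t-2)x}.
# The integral converges for t < 2, so write t - 2 = -u with u > 0.
u = sp.symbols('u', positive=True)
M_continuous = sp.integrate(2 * sp.exp(-u * x), (x, 0, sp.oo))
print(M_continuous.subs(u, 2 - t))      # 2/(2 - t), valid for t < 2
```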
If \(Y=aX+b\), then it can be shown that:
$$M_Y\left(t\right)=e^{bt}M_X(at)$$
That is:
$$M_Y\left(t\right)=E\left[e^{tY}\right]=E\left[e^{t\left(aX+b\right)}\right]=e^{bt}E\left[e^{atX}\right]=e^{bt}M_X(at)$$
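A small symbolic check of this relationship is sketched below; the fair-die distribution and the constants \(a=3\), \(b=1\) are arbitrary illustrative choices. Writing \(z\) for \(e^t\) reduces the check to simple polynomial algebra.

```python
# Sketch: verify M_{aX+b}(t) = e^{bt} M_X(at) for a fair six-sided die,
# with illustrative constants a = 3, b = 1.  We write z for e^t, so e^{ct}
# becomes z**c and the identity becomes a polynomial identity in z.
import sympy as sp

z = sp.symbols('z')            # z stands for e^t
a, b = 3, 1                    # arbitrary illustrative constants

def M_X(zz):
    # MGF of a fair die with e^t replaced by zz
    return sum(sp.Rational(1, 6) * zz**k for k in range(1, 7))

# Left-hand side: M_{aX+b}(t) = E[e^{t(aX+b)}]  ->  sum of z^(a*k + b) / 6
M_Y = sum(sp.Rational(1, 6) * z**(a * k + b) for k in range(1, 7))

# Right-hand side: e^{bt} M_X(at)  ->  z^b times M_X with e^t replaced by z^a
rhs = z**b * M_X(z**a)

print(sp.expand(M_Y - rhs) == 0)   # True
```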
Moment generating functions can be extended to the multivariate case (two or more random variables) using the same underlying concepts. Once the joint moment generating function is established, we can determine the mean, variance, and other moments.
For \(n\) random variables \(X_1,\ldots,X_n\), the joint moment generating function \(M(t_1,t_2,\ldots,t_n)\) is defined for all real values of \(t_1,t_2,\ldots,t_n\) as:
$$M\left(t_1,t_2,\ldots,\ t_n\right)=E\left[e^{t_1X_1+t_2X_2+\ldots+t_nX_n}\right]$$
The moment generating function of an individual variable can be obtained from \(M(t_1,t_2,\ldots,t_n)\) by setting all but one of the \(t_i\)'s equal to 0. That is,
$$M_{X_i}\left(t\right)=E\left[e^{tX_i}\right]=M(0,\ldots,0,t,0,\ldots,0)$$
where \(t\) appears in the \(i\)th position.
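The sketch below illustrates this for two independent fair dice (an illustrative choice, not part of the reading): the joint MGF is built from the definition, and setting \(t_2=0\) recovers the marginal MGF of \(X_1\).

```python
# Sketch: recover a marginal MGF from a joint MGF by setting the other t_i's
# to zero, using two independent fair dice as an illustrative example.
import sympy as sp

t1, t2 = sp.symbols('t1 t2', real=True)

def M_die(s):
    # MGF of a single fair six-sided die evaluated at s
    return sum(sp.Rational(1, 6) * sp.exp(k * s) for k in range(1, 7))

# Joint MGF from the definition, summing over all 36 equally likely outcomes
M_joint = sum(sp.Rational(1, 36) * sp.exp(t1 * i + t2 * j)
              for i in range(1, 7) for j in range(1, 7))

# Setting t2 = 0 leaves the marginal MGF of X1
print(sp.simplify(M_joint.subs(t2, 0) - M_die(t1)) == 0)   # True
```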
Moreover, if \(X_1,\ldots,X_n\) are \(n\) independent random variables and \(Y=X_1+\ldots+X_n\), then it can be proved that:
$$M_Y\left(t\right)=M_{X_1+\ldots+X_n}\left(t\right)=\prod_{i=1}^{n}{M_{X_i}(t)}=M_{X_1}\left(t\right)\bullet M_{X_2}\left(t\right)\bullet\ldots\bullet M_{X_n}(t)$$
If, in addition, \(X_1,X_2,\ldots,X_n\) are independent and identically distributed, each with moment generating function \(M_X(t)\), then it can be shown that:
$$M_Y\left(t\right)=\left[M_X(t)\right]^n$$
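The following Monte Carlo sketch (not a proof) illustrates the product rule numerically for a sum of three independent exponential random variables; the rates chosen are arbitrary, and the exponential MGF \(r/(r-t)\) is used as a known benchmark.

```python
# Monte Carlo sketch of M_Y(t) = prod_i M_{X_i}(t) for a sum of independent
# random variables.  The rates below are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
t = 0.3
rates = [1.0, 2.0, 4.0]                                  # X_i ~ Exponential(rate_i)
samples = [rng.exponential(1 / r, size=1_000_000) for r in rates]

lhs = np.mean(np.exp(t * sum(samples)))                  # empirical M_Y(t), Y = X1 + X2 + X3
rhs = np.prod([r / (r - t) for r in rates])              # product of the exact MGFs r/(r - t)
print(lhs, rhs)                                          # the two values should be close
```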
Let \(X_1\) and \(X_2\) be independent identically distributed random variables with the following respective pdfs:
$$f\left(x_1\right)=2e^{-2x_1},\quad x_1\geq 0$$
and
$$f\left(x_2\right)=2e^{-2x_2},\quad x_2\geq 0$$
Now, if \(Y=X_1+X_2\), find \(M_Y(t)\).
Solution
Since \(X_1\) and \(X_2\) are independent, their joint pdf is the product of the marginal pdfs:
$$f\left(x_1,x_2\right)=f\left(x_1\right)\bullet f\left(x_2\right)$$
Thus, the moment generating function is given by:
$$\begin{align}M_Y(t)&=\int_{0}^{\infty}\int_{0}^{\infty}{e^{t(x_1+x_2)}f\left(x_1,x_2\right)dx_1dx_2}=\int_{0}^{\infty}\int_{0}^{\infty}{e^{t(x_1+x_2)}f\left(x_1\right)\bullet f\left(x_2\right)dx_1dx_2}\\ &=\int_{0}^{\infty}{{2e}^{(t-2)x_1}dx_1}\int_{0}^{\infty}{{2e}^{\left(t-2\right)x_2}dx_2}\\ &=\frac{2}{t-2}\left[e^{\left(t-2\right)x_1}\right]_0^\infty\bullet\frac{2}{t-2}\left[e^{\left(t-2\right)x_2}\right]_0^\infty \\ &=\left(\frac{2}{2-t}\right)\bullet\left(\frac{2}{2-t}\right)\\ &=\left(\frac{2}{2-t}\right)^2,\qquad t<2\end{align}$$
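The following sympy sketch reproduces the two single integrals above symbolically; since convergence requires \(t<2\), we write \(t-2=-u\) with \(u>0\) and substitute back at the end.

```python
# Sketch: symbolic check that M_Y(t) = (2/(2 - t))^2 for t < 2.
import sympy as sp

t, x1, x2 = sp.symbols('t x1 x2', real=True)
u = sp.symbols('u', positive=True)          # u = 2 - t > 0

# Each factor above is  ∫_0^∞ 2 e^{(t-2)x} dx,  with t - 2 = -u
I1 = sp.integrate(2 * sp.exp(-u * x1), (x1, 0, sp.oo))
I2 = sp.integrate(2 * sp.exp(-u * x2), (x2, 0, sp.oo))

M_Y = (I1 * I2).subs(u, 2 - t)
print(sp.simplify(M_Y - (2 / (2 - t))**2) == 0)   # True
```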
Let \(X_1\) and \(X_2\) be independent random variables, each uniformly distributed on \(\{1,2,3,4,5,6\}\), and let \(Y = X_1 + X_2\). For example, \(Y\) could be the sum obtained when two fair dice are rolled. The mgf of \(Y\) is:
$$ M_Y(t) = E\big(e^{tY}\big) = E\big[e^{t(X_1+X_2)}\big] = E\big(e^{tX_1}e^{tX_2}\big) $$
The independence of \(X_1\) and \(X_2\) implies that:
$$ M_Y(t) = E\big(e^{tX_1}\big)E\big(e^{tX_2}\big) $$
Note that \(X_1\) and \(X_2\) have the same pmf:
$$ f(x_1)=f(x_2) = \frac{1}{6}, \qquad x_1,x_2\in\{1,2,3,4,5,6\}, $$
and thus the same mgf. That is,
$$M_{X_1}\left(t\right)=M_{X_2}\left(t\right)=M_X(t)=\frac{1}{6}e^t+\frac{1}{6}e^{2t}+\frac{1}{6}e^{3t}+\frac{1}{6}e^{4t}+\frac{1}{6}e^{5t}+\frac{1}{6}e^{6t}$$
It then follows that \(M_Y(t) = [M_X(t)]^2\) equals:
$$ \frac{1}{36}e^{2t} + \frac{2}{36}e^{3t} + \frac{3}{36}e^{4t} + \frac{4}{36}e^{5t} +\frac{5}{36}e^{6t} + \frac{6}{36}e^{7t} + \frac{5}{36}e^{8t} + \frac{4}{36}e^{9t} + \frac{3}{36}e^{10t} + \frac{2}{36}e^{11t} + \frac{1}{36}e^{12t}. $$
Note that the coefficient of \(e^{bt}\) equals the probability \(P(Y = b)\), that is, the probability that the sum of the dice equals \(b\) for \(b=2,3,\ldots,12\). As observed, we can recover the entire distribution of \(Y\) simply by determining its moment generating function.
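The sketch below carries out this bookkeeping in sympy: writing \(z\) for \(e^t\), the coefficient of \(z^b\) in \([M_X(t)]^2\) is read off as \(P(Y=b)\).

```python
# Sketch: read the dice-sum probabilities off the expansion of [M_X(t)]^2.
# Writing z for e^t turns the MGF into a polynomial in z, and the coefficient
# of z^b is P(Y = b).
import sympy as sp

z = sp.symbols('z')                                        # z stands for e^t

M_X = sum(sp.Rational(1, 6) * z**k for k in range(1, 7))   # MGF of one die
M_Y = sp.expand(M_X**2)                                    # [M_X(t)]^2

for b in range(2, 13):
    print(f"P(Y = {b}) = {M_Y.coeff(z, b)}")               # e.g. P(Y = 7) = 1/6
```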
This leads us to the following theorem:
If \(X_1,X_2,\cdots,X_n\) are \(n\) independent random variables with respective moment generating functions \({ M }_{ { X }_{ i } }(t)=E({ e }^{ t{ X }_{ i } })\) for \(i = 1, 2, \cdots ,n\), then the moment-generating function of the linear combination:
$$ Y=\sum _{ i=1 }^{ n }{ { a }_{ i }{ X }_{ i } } $$
is
$$ M_Y(t)= \prod_{i=1}^{n}M_{X_i}(a_it) $$
This theorem says that if we know the moment generating function of each of a set of independent random variables, then the moment generating function of any linear combination of those variables is simply the product of the individual moment generating functions, each evaluated at \(a_it\).
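As a concrete check, the sketch below verifies the theorem for an illustrative linear combination \(Y=2X_1+3X_2\) of two independent fair dice (the coefficients 2 and 3 are arbitrary choices), again writing \(z\) for \(e^t\).

```python
# Sketch: verify M_Y(t) = prod_i M_{X_i}(a_i t) for Y = 2*X1 + 3*X2, where
# X1, X2 are independent fair dice and the coefficients are illustrative.
import sympy as sp

z = sp.symbols('z')            # z stands for e^t
a1, a2 = 2, 3                  # arbitrary illustrative coefficients

def M_die_at(a):
    # MGF of a fair die evaluated at a*t, written as a polynomial in z = e^t
    return sum(sp.Rational(1, 6) * z**(a * k) for k in range(1, 7))

# Left-hand side: M_Y(t) from the definition, over all 36 equally likely outcomes
M_Y = sum(sp.Rational(1, 36) * z**(a1 * i + a2 * j)
          for i in range(1, 7) for j in range(1, 7))

print(sp.expand(M_Y - M_die_at(a1) * M_die_at(a2)) == 0)   # True
```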
The above theorem proves the following to be true:
If \(X_1,X_2,\cdots,X_n\) are observations of a random sample from a distribution with moment-generating function \(M(t)\), where \(-h < t < h\), then the moment-generating function of \(Y=\sum_{i=1}^{n}X_i\) is:
$$M_Y(t)=\prod_{i=1}^{n}M(t)=\left[M(t)\right]^n, \quad -h < t < h$$
Let \(X_1,X_2,\cdots,X_n\) denote the outcomes of \(n\) Bernoulli trials, each with probability of success \(p\) and probability of failure \(q=1-p\). The moment generating function of each \(X_i\) is:
$$M(t) = q + pe^t, \quad -\infty < t < \infty $$
If
$$Y=\sum_{i=1}^{n}X_i$$
then
$$M_Y(t)=\prod_{i=1}^{n}(q+pe^t)= (q+pe^t)^n, \quad -\infty < t < \infty$$
Thus, we see that \(Y\) has a binomial distribution, \(b(n,p)\), since \((q+pe^t)^n\) is the moment generating function of a \(b(n,p)\) random variable.
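The sketch below confirms, for an illustrative \(n=4\) and a symbolic \(p\), that \((q+pe^t)^n\) agrees with the MGF computed directly from the binomial pmf (again with \(z\) standing for \(e^t\)).

```python
# Sketch: check that (q + p e^t)^n equals the MGF computed from the b(n, p)
# pmf, here with an illustrative n = 4 and symbolic p (and q = 1 - p).
import sympy as sp

z, p = sp.symbols('z p')       # z stands for e^t
n = 4                          # illustrative choice
q = 1 - p

lhs = (q + p * z)**n                                        # (q + p e^t)^n

rhs = sum(z**k * sp.binomial(n, k) * p**k * q**(n - k)      # sum_k e^{tk} P(Y = k)
          for k in range(n + 1))

print(sp.expand(lhs - rhs) == 0)   # True
```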
Let \(X\) and \(Y\) be independent binomial random variables with parameters \((n,p)\) and \((m,p)\), respectively.
Find the pmf of \(X + Y\).
Solution
$$\begin{align} { M }_{ X+Y }(t) & ={ M }_{ X }(t){ M }_{ Y }(t) \\ & ={ (p{ e }^{ t }+1-p) }^{ n }{ (p{ e }^{ t }+1-p) }^{ m } \\ & ={ (p{ e }^{ t }+1-p) }^{ n+m } \end{align} $$
This is the moment generating function of a binomial random variable with parameters \(n+m\) and \(p\). By the uniqueness of moment generating functions, \(X+Y\sim b(n+m,p)\), and so its pmf is:
$$P(X+Y=k)=\binom{n+m}{k}p^k(1-p)^{n+m-k},\quad k=0,1,\ldots,n+m$$
To help you understand the example above, the following is the moment generating function of a binomial variable \(X\) with parameters \(n\) and \(p\):
$$ { M }_{ X }(t)=E({ e }^{ tX })=\sum _{ x=0 }^{ n }{ { e }^{ tx }\left( \begin{matrix} n \\ x \end{matrix} \right) { p }^{ x }{ (1-p) }^{ n-x } } $$
$$ =\sum _{ x=0 }^{ n }{ \left( \begin{matrix} n \\ x \end{matrix} \right) { (p{ e }^{ t }) }^{ x }{ (1-p) }^{ n-x } } ={ (p{ e }^{ t }+1-p) }^{ n } $$
It is easy to see why this particular result makes sense. If we toss a coin 5 times in the morning and count the number of heads, the number would be distributed as \(Bin(5, \cfrac { 1 }{ 2 } )\). If we toss the coin a further 10 times in the evening, the number of heads will be distributed as \(Bin(10, \cfrac { 1 }{ 2 })\). Adding the totals together is obviously the same as the \(Bin(15, \cfrac { 1 }{ 2 } )\) distribution that we would expect for the whole day.
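The following sketch checks the coin-toss story directly: convolving the \(Bin(5,\frac{1}{2})\) and \(Bin(10,\frac{1}{2})\) pmfs reproduces the \(Bin(15,\frac{1}{2})\) pmf exactly, just as the MGF argument predicts.

```python
# Sketch: the morning + evening coin-toss example.  Convolving the Bin(5, 1/2)
# and Bin(10, 1/2) pmfs gives exactly the Bin(15, 1/2) pmf.
import sympy as sp

def binom_pmf(n, p):
    # exact pmf of a b(n, p) random variable, as a list indexed by k
    return [sp.binomial(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

p = sp.Rational(1, 2)
morning, evening = binom_pmf(5, p), binom_pmf(10, p)

# pmf of the daily total, by convolution of the two independent pmfs
total = [sum(morning[i] * evening[k - i]
             for i in range(len(morning)) if 0 <= k - i < len(evening))
         for k in range(5 + 10 + 1)]

print(total == binom_pmf(15, p))   # True
```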
Learning Outcome
Topic 3.d: Multivariate Random Variables – Explain and apply joint moment generating functions.