Explain and apply the concepts of random variables
The probability generating function of a discrete random variable is a power series representation of the random variable's probability mass function, as shown in the formula below:
$$ \begin{align*} G(n) &= P(X=0)\cdot n^0 + P(X=1)\cdot n^1 + P(X=2)\cdot n^2 \\ &\quad + P(X=3)\cdot n^3 + P(X=4)\cdot n^4 + \cdots \\ &= \sum_{i=0}^{\infty} P(X=i)\, n^i = E\left(n^X\right) \end{align*} $$
Note: \( G(1) = P(X=0) + P(X=1) + P(X=2) + P(X=3) + P(X=4) + \cdots = 1 \), since the probabilities must sum to 1.
Given the experiment of rolling a single fair die, find the probability generating function.
Solution
We know that:
$$ \begin{align*} G(n) &= E\left(n^X\right) \\ &= 0\times n^0+\left(\frac{1}{6}\right)n^1+\left(\frac{1}{6}\right)n^2+\left(\frac{1}{6}\right)n^3+\left(\frac{1}{6}\right)n^4+\left(\frac{1}{6}\right)n^5+\left(\frac{1}{6}\right)n^6 \end{align*} $$
It can be useful to express \(E(X)\) and \(Var(X)\) in terms of the probability generating function, as shown below:
$$ \begin{align*} E\left( X \right) &=G^{ \prime }\left( 1 \right) \\ \text{Var}\left(\text{X}\right)&=\text{G}^{\prime \prime}\left(1\right)+\text{G}^{\prime}\left(1\right)-\left[\text{G}^{\prime}\left(1\right)\right]^2 \\ \end{align*} $$
Given the experiment of rolling a single die, find \(E(X)\) and \(Var(X)\) using the probability generating function.
Solution
$$ \begin{align*} G(n) &= 0\cdot n^{0}+\left(\frac{1}{6}\right)n^{1}+\left(\frac{1}{6}\right)n^{2}+\left(\frac{1}{6}\right)n^{3}+\left(\frac{1}{6}\right)n^{4}+\left(\frac{1}{6}\right)n^{5}+\left(\frac{1}{6}\right)n^{6} \\ G^{\prime}(n) &= \left(\frac{1}{6}\right)+2\left(\frac{1}{6}\right)n+3\left(\frac{1}{6}\right)n^{2}+4\left(\frac{1}{6}\right)n^{3}+5\left(\frac{1}{6}\right)n^{4}+6\left(\frac{1}{6}\right)n^{5} \\ G^{\prime}(1) &= \left(\frac{1}{6}\right)\left(1+2+3+4+5+6\right)=3.5 \\ G^{\prime\prime}(n) &= 2\left(\frac{1}{6}\right)+3\cdot 2\left(\frac{1}{6}\right)n+4\cdot 3\left(\frac{1}{6}\right)n^{2}+5\cdot 4\left(\frac{1}{6}\right)n^{3}+6\cdot 5\left(\frac{1}{6}\right)n^{4} \\ G^{\prime\prime}(1) &= \left(\frac{1}{6}\right)\left(2+6+12+20+30\right)=11.667 \\ \text{Var}(X) &= G^{\prime\prime}(1)+G^{\prime}(1)-\left[G^{\prime}(1)\right]^{2}=11.667+3.5-3.5^{2}=2.92 \end{align*} $$
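As a quick check, the short sketch below (a minimal Python example, assuming the sympy library is available) builds this PGF symbolically, differentiates it, and reproduces \(E(X)=3.5\) and \(Var(X)\approx 2.92\).

```python
import sympy as sp

n = sp.symbols('n')

# PGF of a fair die: G(n) = (1/6)(n + n^2 + ... + n^6)
G = sum(sp.Rational(1, 6) * n**i for i in range(1, 7))

G1 = sp.diff(G, n)     # G'(n)
G2 = sp.diff(G, n, 2)  # G''(n)

mean = G1.subs(n, 1)                  # E(X) = G'(1)
var = G2.subs(n, 1) + mean - mean**2  # Var(X) = G''(1) + G'(1) - [G'(1)]^2

print(mean)             # 7/2 = 3.5
print(var, float(var))  # 35/12, approximately 2.9167
```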
A moment generating function \(M(t)\) of a random variable \(X\) is defined for all real values of \(t\) by:
$$ M(t) = E\left(e^{tX}\right) = \begin{cases} \sum_{x} e^{tx}\, p(x), & \text{if } X \text{ is discrete with mass function } p(x) \\ \int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx, & \text{if } X \text{ is continuous with density function } f(x) \end{cases} $$
Given the experiment of rolling a single die, find the moment generating function.
Solution
We know that:
\( M(t) = e^{t}P(X=1) + e^{2t}P(X=2) + e^{3t}P(X=3) + e^{4t}P(X=4) + e^{5t}P(X=5) + e^{6t}P(X=6) \)
\( M\left( t \right) =e^{ t }\left( \frac { 1 }{ 6 } \right) +e^{ 2t }\left( \frac { 1 }{ 6 } \right) +e^{ 3t }\left( \frac { 1 }{ 6 } \right) +e^{ 4t }\left( \frac { 1 }{ 6 } \right) +e^{ 5t }\left( \frac { 1 }{ 6 } \right) +e^{ 6t }\left( \frac { 1 }{ 6 } \right) \)
As with the probability generating function, it can also be useful to use the moment generating function to calculate \(E(X)\) and \(Var(X)\) as shown in the formulas below:
$$ \begin{align*} & E(X) = \mu = M^{\prime}(0) \\ & E\left(X^2\right) = M^{\prime\prime}(0) \\ & \text{Var}(X) = \sigma^2 = M^{\prime\prime}(0) - \left[M^{\prime}(0)\right]^2 \end{align*} $$
Consider the moment generating function above. We wish to calculate \(\text{E}\left(\text{X}\right)\) and \(\text{Var}\left(\text{X}\right)\).
We know that:
$$ E(X)=\mu=M^{\prime}(0) $$
The moment generating function is given by:
$$ \text{M}\left(\text{t}\right)=\text{e}^\text{t}\left(\frac{1}{6}\right)+\text{e}^{2\text{t}}\left(\frac{1}{6}\right)+\text{e}^{3\text{t}}\left(\frac{1}{6}\right)+\text{e}^{4\text{t}}\left(\frac{1}{6}\right)+\text{e}^{5\text{t}}\left(\frac{1}{6}\right)+\text{e}^{6\text{t}}\left(\frac{1}{6}\right) $$
Taking the first derivative, we have:
$$ \text{M}^\prime\left(\text{t}\right)=\left(\frac{1}{6}\right)\text{e}^\text{t}+\left(\frac{2}{6}\right)\text{e}^{2\text{t}}+\left(\frac{3}{6}\right)\text{e}^{3\text{t}}+\left(\frac{4}{6}\right)\text{e}^{4\text{t}}+\left(\frac{5}{6}\right)\text{e}^{5\text{t}}+\left(\frac{6}{6}\right)\text{e}^{6\text{t}} $$
Substituting \(t=0\) in the above equation, we have:
$$ \text{M}^\prime\left(0\right)=\left(\frac{1}{6}\right)+\left(\frac{2}{6}\right)+\left(\frac{3}{6}\right)+\left(\frac{4}{6}\right)+\left(\frac{5}{6}\right)+\left(\frac{6}{6}\right)=3.5 $$
For the variance, we know that:
$$ \text{Var}\left(\text{X}\right)=\sigma^2=\text{E}\left(\text{X}^2\right)-\left[\text{E}\left(\text{X}\right)\right]^2=\text{M}^{\prime\prime}\left(0\right)-\left[\text{M}^\prime\left(0\right)\right]^2 $$
Now taking the second derivative of the moment generating function, we have:
$$ \begin{align*} M^{\prime\prime}(t) &= \left(\frac{1}{6}\right)e^{t}+\left(\frac{4}{6}\right)e^{2t}+\left(\frac{9}{6}\right)e^{3t}+\left(\frac{16}{6}\right)e^{4t}+\left(\frac{25}{6}\right)e^{5t}+\left(\frac{36}{6}\right)e^{6t} \\ \Rightarrow M^{\prime\prime}(0) &= \left(\frac{1}{6}\right)+\left(\frac{4}{6}\right)+\left(\frac{9}{6}\right)+\left(\frac{16}{6}\right)+\left(\frac{25}{6}\right)+\left(\frac{36}{6}\right)=15.167 \\ \text{Var}(X) &= \sigma^2 = M^{\prime\prime}(0)-\left[M^{\prime}(0)\right]^2 = 15.167-3.5^2 = 2.92 \end{align*} $$
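The same results can be verified symbolically with a minimal sympy sketch (assuming sympy is available):

```python
import sympy as sp

t = sp.symbols('t')

# MGF of a fair die: M(t) = (1/6)(e^t + e^{2t} + ... + e^{6t})
M = sum(sp.Rational(1, 6) * sp.exp(i * t) for i in range(1, 7))

mean = sp.diff(M, t).subs(t, 0)       # M'(0)  = E(X)   = 7/2
second = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E(X^2) = 91/6, approximately 15.167
var = second - mean**2                # 35/12, approximately 2.92

print(mean, second, var)
```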
Suppose that the discrete random variable \(X\) has a distribution:
$$ f(x) = \begin{cases} \frac{1}{2}, & x=1 \\ \frac{3}{8}, & x=2 \\ \frac{1}{8}, & x=3 \end{cases} $$
a. Find the moment generating function \(M(t)\).
Solution
We know that:$$ \begin{align*} M(t) &= E\left(e^{tX}\right) = \sum_{x} e^{tx} f(x) \\ &= e^{t}f(1) + e^{2t}f(2) + e^{3t}f(3) \\ &= \left(\frac{1}{2}\right)e^{t}+\left(\frac{3}{8}\right)e^{2t}+\left(\frac{1}{8}\right)e^{3t} \\ \therefore M(t) &= \frac{1}{2}e^{t}+\frac{3}{8}e^{2t}+\frac{1}{8}e^{3t} \end{align*} $$
b. Use the moment generating function to find the mean of X.
Solution
The mean of \(X\) is \( E(X) = M^{\prime}(0) \). We are given that:$$ \begin{align*} M(t) &= \frac{1}{2}e^{t}+\frac{3}{8}e^{2t}+\frac{1}{8}e^{3t} \\ \Rightarrow M^{\prime}(t) &= \frac{1}{2}e^{t}+\frac{2\times 3}{8}e^{2t}+\frac{3\times 1}{8}e^{3t} \\ &= \frac{1}{2}e^{t}+\frac{6}{8}e^{2t}+\frac{3}{8}e^{3t} \\ M^{\prime}(0) &= \frac{1}{2}e^{0}+\frac{6}{8}e^{0}+\frac{3}{8}e^{0}=\frac{1}{2}+\frac{6}{8}+\frac{3}{8}=\frac{13}{8} \end{align*} $$Thus, the mean of \(X\) is \( M^{\prime}(0) = \frac{13}{8} \).
c. Use the moment generating function to find the variance of X.
Solution
We know that:$$ \text{Var}(X) = E\left(X^2\right)-\left[E(X)\right]^2 = M^{\prime\prime}(0)-\left[M^{\prime}(0)\right]^2 $$From (b), we have:$$ \begin{align*} M^{\prime}(t) &= \frac{1}{2}e^{t}+\frac{6}{8}e^{2t}+\frac{3}{8}e^{3t} \\ \Rightarrow M^{\prime\prime}(t) &= \frac{1}{2}e^{t}+\frac{2\times 6}{8}e^{2t}+\frac{3\times 3}{8}e^{3t} = \frac{1}{2}e^{t}+\frac{12}{8}e^{2t}+\frac{9}{8}e^{3t} \\ \therefore M^{\prime\prime}(0) &= \frac{1}{2}+\frac{12}{8}+\frac{9}{8} = \frac{25}{8} \\ \text{Var}(X) &= M^{\prime\prime}(0)-\left[M^{\prime}(0)\right]^2 \\ &= \frac{25}{8}-\left(\frac{13}{8}\right)^2 = \frac{25}{8}-\frac{169}{64} = \frac{31}{64} \end{align*} $$
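As a check on parts (a) to (c), here is a minimal sympy sketch (sympy assumed available) that rebuilds \(M(t)\) from the PMF and differentiates it:

```python
import sympy as sp

t = sp.symbols('t')

# PMF: f(1) = 1/2, f(2) = 3/8, f(3) = 1/8
pmf = {1: sp.Rational(1, 2), 2: sp.Rational(3, 8), 3: sp.Rational(1, 8)}

# M(t) = sum_x e^{tx} f(x)
M = sum(prob * sp.exp(x * t) for x, prob in pmf.items())

mean = sp.diff(M, t).subs(t, 0)              # M'(0) = 13/8
var = sp.diff(M, t, 2).subs(t, 0) - mean**2  # M''(0) - [M'(0)]^2 = 31/64

print(mean, var)
```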
Given the following probability density function of a continuous random variable:
$$ f(x) = \begin{cases} 0.2e^{-0.2x}, & 0 \le x < \infty \\ 0, & \text{otherwise} \end{cases} $$
Find the moment generating function.
Solution
For a continuous distribution,
$$ \text{M}\left(t\right)=\ \int_{-\infty}^{\infty}{\text{e}^{\text{tx}}\text{f}\left(\text{x}\right)\text{dx}} $$
So, in this case:
$$ \begin{align*} M(t) &= \int_{0}^{\infty} e^{tx}\times\left(0.2e^{-0.2x}\right) dx \\ &= \int_{0}^{\infty} 0.2e^{x(t-0.2)}\,dx = \left[\frac{0.2e^{x(t-0.2)}}{t-0.2}\right]_{x=0}^{x=\infty} = -\frac{0.2}{t-0.2} = \frac{0.2}{0.2-t}, \quad \text{for } t<0.2 \end{align*} $$
Note that the integral converges only for \(t<0.2\), which makes the evaluated term at \(x=\infty\) vanish.
As with the moment generating function of the discrete distribution, we can use the moment generating function of a continuous distribution to calculate \(E(X)\) and \(Var(X)\) using the formulae below:
$$ \begin{align*} E \left(X \right) & = \mu=M^{\prime}\left(0 \right) \\ E \left(X^2 \right) & =M^{\prime\prime} \left(0 \right) \\ Var \left(X \right) & = {\sigma}^2= M^{\prime\prime} \left(0 \right)-\left[M^{\prime} \left(0\right) \right]^2 \\ \end{align*} $$
Given the moment generating function shown below, calculate \(E(X)\) and \(Var(X)\).
$$ M(t)=\frac{0.2}{0.2-t} $$
Solution
Note that we can write:
$$ M(t)=\frac{0.2}{0.2-t}=-0.2\left(t-0.2\right)^{-1} $$
So that:
$$ \text{M}^\prime\left(\text{t}\right)=0.2\left(\text{t}-0.2\right)^{-2} $$
We know that:
$$ E(X)=M^{\prime}(0)=0.2\left(-0.2\right)^{-2}=\frac{1}{0.2}=5 $$
Also,
$$ \begin{align*} M^{\prime\prime}(t) &= -2\times 0.2\left(t-0.2\right)^{-3} = -0.4\left(t-0.2\right)^{-3} \\ \Rightarrow E\left(X^2\right) = M^{\prime\prime}(0) &= -0.4\left(-0.2\right)^{-3} = 50 \\ \therefore \text{Var}(X) = \sigma^2 &= M^{\prime\prime}(0)-\left[M^{\prime}(0)\right]^2 = 50-5^2 = 25 \end{align*} $$
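The same example can be checked end to end with sympy (assumed available); the sketch below evaluates the defining integral and the moments directly:

```python
import sympy as sp

x, t = sp.symbols('x t')
lam = sp.Rational(1, 5)  # the rate 0.2

# M(t) = integral_0^oo e^{tx} * 0.2 e^{-0.2x} dx; conds='none' drops the
# convergence condition (t < 0.2) from the returned result
M = sp.integrate(sp.exp(t * x) * lam * sp.exp(-lam * x), (x, 0, sp.oo),
                 conds='none')

mean = sp.diff(M, t).subs(t, 0)              # M'(0) = 5
var = sp.diff(M, t, 2).subs(t, 0) - mean**2  # M''(0) - 5^2 = 25

print(sp.simplify(M), mean, var)
```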
The moment generating function of a binomially distributed \(X\) provides an alternative way of determining its mean and variance.
Suppose we perform \(n\) independent Bernoulli trials, each with probability of success \(p\) and probability of failure \(1-p\). If \(X\) counts the number of successes, its probability mass function (PMF) is:
$$ \text{f}\left(\text{x}\right)=\binom{\text{n}}{\text{x}}\text{p}^\text{x}\left(1-\text{p}\right)^{\text{n}-\text{x}} $$
where \(\binom{n}{x}\), sometimes written \(C(n,x)\), denotes the number of combinations of \(n\) elements taken \(x\) at a time, and \(x\) can take on the values \(0,1,2,\ldots,n\).
Using the PMF, we can obtain the moment generating function of \(X\):
$$ \text{M}\left(\text{t}\right)=\sum_{\text{x}=0}^{\text{n}}{\text{e}^{\text{tx}}\binom{\text{n}}{\text{x}}\text{p}^\text{x}\left(1-\text{p}\right)^{\text{n}-\text{x}}} $$
Combining the terms that involve the exponent \(x\):
$$ \text{M}\left(\text{t}\right)=\sum_{\text{x}=0}^{\text{n}}{\left(\text{pe}^\text{t}\right)^\text{x}\binom{\text{n}}{\text{x}}\ \left(1-\text{p}\right)^{\text{n}-\text{x}}} $$
By the binomial theorem, the expression above simplifies to:
$$ M \left(t \right)={\left(p{e}^{t}+1-p \right)}^n $$
In order to calculate the mean and variance, we need to find both \(M^{\prime}(0)\) and \(M^{\prime\prime}(0)\).
First, calculate the derivatives, then evaluate each of them at \(t=0\). The first derivative of the moment generating function is:
$$ \text{M}^\prime\left(\text{t}\right)=\text{n}\left(\text{pe}^\text{t}+1-\text{p}\right)^{\text{n}-1}\text{pe}^\text{t} \\ \Rightarrow \text{M}^\prime\left(0\right)=\text{n}\left(\text{p}+1-\text{p}\right)^{\text{n}-1}\text{p}=\text{np} $$
If we differentiate the second time, we get:
$$ \text{M}^{\prime\prime}\left(\text{t}\right)=\text{n}\left(\text{n}-1\right)\left(\text{pe}^t+1-\text{p}\right)^{\text{n}-2}\left(\text{pe}^\text{t}\right)^2+\text{n}\left(\text{pe}^\text{t}+1-\text{p}\right)^{\text{n}-1}\text{pe}^\text{t} $$
So that:
$$ \text{E}\left(\text{X}^2\right)=\text{M}^{\prime\prime}\left(0\right)=\text{n}\left(\text{n}-1\right)\text{p}^2+\text{np} $$
We know that:
$$ \text{Var}\left(\text{X}\right)=\text{E}\left(\text{X}^2\right)-\left[\text{E}\left(\text{X}\right)\right]^2=\text{n}\left(\text{n}-1\right)\text{p}^2+\text{np}-\text{n}^2\text{p}^2=\text{np}(1-\text{p}) $$
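A short symbolic check of these binomial results, as a sympy sketch (assuming the library is available):

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
n = sp.symbols('n', positive=True, integer=True)

# Binomial MGF: M(t) = (p e^t + 1 - p)^n
M = (p * sp.exp(t) + 1 - p)**n

mean = sp.simplify(sp.diff(M, t).subs(t, 0))       # n*p
second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # n*(n-1)*p^2 + n*p
var = sp.factor(second - mean**2)                  # equals n*p*(1 - p)

print(mean, second, var)
```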
The moment generating function of a negative binomial distribution is given by:
$$ M\left( t \right) ={ \left[ \frac { p{ e }^{ t } }{ 1-\left( 1-p \right) { e }^{ t } } \right] }^{ r } $$
This is derived as follows, where \(X\) counts the number of the trial on which the \(r\)th success occurs:
$$ \begin{align*} \text{M}\left(\text{t}\right)=\text{E}\left[\text{e}^{\text{tX}}\right]&=\sum_{\text{k}=\text{r}}^{\infty}{\text{e}^{\text{tk}}\binom{\text{k}-1}{\text{r}-1}\left(1-\text{p}\right)^{\text{k}-\text{r}}\text{p}^\text{r}} \\ &=\sum_{\text{k}=\text{r}}^{\infty}{\text{e}^{\text{tk}}\binom{\text{k}-1}{\text{r}-1}\left(1-\text{p}\right)^{\text{k}-\text{r}}\text{p}^\text{r}\times\frac{\text{e}^{\text{tr}}}{\text{e}^{\text{tr}}}} \\ &=\sum_{\text{k}=\text{r}}^{\infty}{\text{e}^{\text{t}\left(\text{k}-\text{r}\right)}\binom{\text{k}-1}{\text{r}-1}\left(1-\text{p}\right)^{\text{k}-\text{r}}{(\text{e}}^\text{t}{\text{p})}^\text{r}} \\ &=\left(\text{e}^\text{t} \text{p}\right)^\text{r}\sum_{\text{k}=\text{r}}^{\infty}{\binom{\text{k}-1}{\text{r}-1}{(\text{e}}^\text{t}{\left(1-\text{p}\right))}^{\text{k}-\text{r}}} \\ \end{align*} $$
Setting \(j=k-r\),
$$ \begin{align*} M(t) &= \left(pe^{t}\right)^{r}\sum_{j=0}^{\infty}\binom{j+r-1}{r-1}\left(e^{t}\left(1-p\right)\right)^{j} \\ &= \frac{\left(pe^{t}\right)^{r}}{\left(1-\left(1-p\right)e^{t}\right)^{r}} = \left[\frac{pe^{t}}{1-\left(1-p\right)e^{t}}\right]^{r} \end{align*} $$
where the sum collapses via the negative binomial series \( \sum_{j=0}^{\infty}\binom{j+r-1}{r-1}z^{j}=\left(1-z\right)^{-r} \) for \(|z|<1\).
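As a sanity check on this closed form, differentiating it at \(t=0\) should return the standard negative binomial mean \(r/p\); a minimal sympy sketch (sympy assumed available):

```python
import sympy as sp

t = sp.symbols('t')
r, p = sp.symbols('r p', positive=True)

# Negative binomial MGF, with X the trial on which the r-th success occurs
M = (p * sp.exp(t) / (1 - (1 - p) * sp.exp(t)))**r

mean = sp.simplify(sp.diff(M, t).subs(t, 0))
print(mean)  # expected: r/p
```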
The moment generating function of the geometric distribution, when \(X\) counts the number of the trial on which the first success occurs, is given by:
$$ M\left( t \right) =\frac { p{ e }^{ t } }{ 1-\left( 1-p \right) { e }^{ t } } $$
The moment generating function for \( X\sim \text{geometric}(p) \), where \(X\) instead counts the number of failures before the first success (so \(x=0,1,2,\ldots\)), is derived as:
$$ \begin{align*} M(t) &= E\left[e^{tX}\right] \\ &= \sum_{x=0}^{\infty} e^{tx}\, p\left(1-p\right)^{x} \\ &= p\sum_{x=0}^{\infty}\left(\left(1-p\right)e^{t}\right)^{x} \quad \text{(i)} \end{align*} $$
where the geometric series in step (i) converges only if:
$$ \left(1-p\right)e^{t}<1 $$
That is, only if:
$$ \text{e}^\text{t}<\frac{1}{(1-\text{p})} $$
Taking the natural log of both sides, the condition becomes \( t<-\ln(1-p) \). Summing the geometric series then gives:
$$ M(t)=\frac{p}{1-\left(1-p\right)e^{t}} $$
Multiplying this by \(e^{t}\) recovers the trial-count form shown above.
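Because the two conventions are easy to confuse, the sketch below (sympy assumed available) differentiates both forms at \(t=0\) to display their respective means:

```python
import sympy as sp

t = sp.symbols('t')
p = sp.symbols('p', positive=True)

# X = number of the trial on which the first success occurs (x = 1, 2, ...)
M_trials = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))
# X = number of failures before the first success (x = 0, 1, 2, ...)
M_failures = p / (1 - (1 - p) * sp.exp(t))

print(sp.simplify(sp.diff(M_trials, t).subs(t, 0)))    # 1/p
print(sp.simplify(sp.diff(M_failures, t).subs(t, 0)))  # (1 - p)/p
```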
Let \(X\) be a discrete random variable with a Poisson distribution with parameter \(\lambda\), for some real \(\lambda>0\).
Then the moment generating function \({\text{M} }_{\text{X}}\) of X is given by:
$$ \text{M}\left(\text{t}\right)=\text{e}^{\lambda\left(\text{e}^\text{t}-1\right)} $$
From the definition of the Poisson distribution, \(X\) has PMF:
$$ \text{P}\left(\text{X}=\text{n}\right)=\frac{\lambda^\text{n} \text{e}^{-\lambda}}{\text{n}!} $$
From the moment generating function definition:
$$ \text{M}_\text{X}\left(\text{t}\right)=\text{E}\left(\text{e}^{\text{tX}}\right)=\sum_{\text{n}=0}^{\infty}{\text{P}\left(\text{X}=\text{n}\right)\text{e}^{\text{tn}}} $$
Thus,
$$ \text{M}_\text{X}\left(\text{t}\right)=\sum_{\text{n}=0}^{\infty}\frac{\lambda^\text{n} \text{e}^{-\lambda}}{\text{n}!}\text{e}^{\text{tn}}\\ =\text{e}^ {-\lambda}\sum_{\text{n}=0}^{\infty}\frac{\left(\lambda \text{e}^\text{t}\right)^\text{n}}{\text{n}!} $$
By the power series expansion for the exponential function:
$$ M_X(t) = e^{-\lambda}\sum_{n=0}^{\infty}\frac{\left(\lambda e^{t}\right)^{n}}{n!} = e^{-\lambda}e^{\lambda e^{t}} = e^{\lambda\left(e^{t}-1\right)} $$
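As a check, sympy (assumed available) can evaluate the defining series directly in a minimal sketch:

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)
n = sp.symbols('n', integer=True, nonnegative=True)

# M(t) = sum_{n=0}^{oo} e^{tn} * lambda^n e^{-lambda} / n!
M = sp.summation(sp.exp(t * n) * lam**n * sp.exp(-lam) / sp.factorial(n),
                 (n, 0, sp.oo))
print(sp.simplify(M))  # expected: exp(lambda*(exp(t) - 1))
```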
Let \( X\sim U[a, b] \) for some \( a, b \in \mathbb{R} \) with \( a<b \), where \(U\) denotes the continuous uniform distribution.
Then the moment generating function of \(X\), \({\text{M}}_{\text{X}}\), is given by:
$$ {\text{M}}_{\text{X}}\left(\text{t}\right) = \begin{cases} \frac { \text{e}^{ \text{tb} }-\text{e}^{ \text{ta} } }{ \text{t}(\text{b}-\text{a}) } & \text{t}\neq 0 \\ 1 & \text{t}= 0 \\ \end{cases} $$
By the definition of the continuous uniform distribution, \(X\) has probability density function (PDF):
$$ \text f_{\text X}\left(\text x\right)= \begin{cases} \frac { 1 }{ \text{b}-\text{a} } & \text{a}\le \text{x}\le \text{b} \\ 0 & \text{otherwise} \\ \end{cases} $$
From the definition of the moment generating function:
$$ M_X(t)=E\left[e^{tX}\right]=\int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx $$
First, consider the case \( t\neq 0 \).
Thus,
$$ \begin{align*} M_X(t) &= \int_{-\infty}^{a} 0\cdot e^{tx}\,dx + \int_{a}^{b}\frac{e^{tx}}{b-a}\,dx + \int_{b}^{\infty} 0\cdot e^{tx}\,dx \\ &= \left[\frac{e^{tx}}{t\left(b-a\right)}\right]_{a}^{b} \quad \text{using the antiderivative of } e^{tx} \\ \Rightarrow M_X(t) &= \frac{e^{tb}-e^{ta}}{t\left(b-a\right)} \end{align*} $$
Note: For \(t=0\), we have \( M_X(0)=E\left[e^{0\cdot X}\right]=E\left[1\right]=1 \).
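A minimal sympy sketch (assuming sympy is available) that evaluates the defining integral; sympy returns a Piecewise result that covers the \(t=0\) branch automatically:

```python
import sympy as sp

x, t, a, b = sp.symbols('x t a b')

# M(t) = integral_a^b e^{tx} / (b - a) dx
M = sp.integrate(sp.exp(t * x) / (b - a), (x, a, b))
print(sp.simplify(M))
# Piecewise result: (exp(b*t) - exp(a*t)) / (t*(b - a)) for t != 0,
# and (b - a)/(b - a) = 1 on the t = 0 branch
```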
If \(X\) has an exponential distribution with rate parameter \(\lambda\), then:
$$ M(t)=\frac{\lambda}{\lambda-t}, \quad \text{for } t<\lambda $$
This is derived as follows:
$$ \begin{align*} M(t) &= E\left(e^{tX}\right) \\ &= \int_{0}^{\infty} e^{tx}\,\lambda e^{-\lambda x}\,dx \\ &= \lambda\int_{0}^{\infty} e^{-\left(\lambda-t\right)x}\,dx \\ &= \frac{\lambda}{\lambda-t}, \quad \text{for } t<\lambda \end{align*} $$
Note that:
$$ \begin{align*} \text{M}^\prime\left(\text{t}\right)& =\frac{\lambda}{\left(\lambda-\text{t}\right)^2} \\ \Rightarrow \text{M}^\prime\left(0\right) & =\frac{1}{\lambda} \\ \therefore \text{E}\left(\text{X}\right) & =\frac{1}{\lambda} \end{align*} $$
Also,
$$ \begin{align*} \text{M}^{\prime\prime}\left(\text{t}\right)&=\frac{2\lambda}{\left(\lambda-t\right)^3}\\ \Rightarrow\ \text{M}^{\prime\prime}\left(0\right) & =\frac{2}{\lambda^2} \\ \therefore\ \text{Var}\left(\text{X}\right)& =\frac{2}{\lambda^2}-\left(\frac{1}{\lambda}\right)^2=\frac{1}{\lambda^2} \end{align*} $$
If \(X\) has a gamma distribution over the interval \( [0, \infty) \) with parameters \(k\) and \(\lambda\), then:
$$ M(t)=\left[\frac{\lambda}{\lambda-t}\right]^{k}, \quad \text{for } t<\lambda $$
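A quick symbolic check (sympy assumed available): differentiating this MGF at \(t=0\) yields \(k/\lambda\) and \(k/\lambda^{2}\), the standard gamma mean and variance, and setting \(k=1\) recovers the exponential results above.

```python
import sympy as sp

t = sp.symbols('t')
k, lam = sp.symbols('k lambda', positive=True)

# Gamma MGF: M(t) = (lambda / (lambda - t))^k, valid for t < lambda
M = (lam / (lam - t))**k

mean = sp.simplify(sp.diff(M, t).subs(t, 0))              # k/lambda
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # k/lambda^2
print(mean, var)
```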
Let \( X\sim N(\mu, \sigma^2) \); then the moment generating function is:
$$ \text{M}\left(\text{t}\right)=\text{e}^{\mu \text{t}+\frac{\left(\sigma^2\text{t}^2\right)}{2}} $$
Setting \(\mu=0\) and \(\sigma^2=1\), the MGF of a standard normal variable is:
$$ \text{M}\left(\text{t}\right)=\text{e}^\frac{\text{t}^2}{2} $$
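Differentiating this MGF at \(t=0\) recovers \(\mu\) and \(\sigma^{2}\), as the minimal sympy sketch below shows (sympy assumed available):

```python
import sympy as sp

t = sp.symbols('t')
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# Normal MGF: M(t) = exp(mu*t + sigma^2 t^2 / 2)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

mean = sp.diff(M, t).subs(t, 0)                           # mu
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # sigma^2
print(mean, var)
```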
Learning Outcome
Topic 2.e: Univariate Random Variables – Define probability generating functions and moment generating functions and use them to calculate probabilities and moments.