Define probability generating functions and moment generating functions

The probability generating function of a discrete random variable is a power-series representation of the random variable's probability mass function, as shown in the formula below:

\(G\left( n \right) =E\left( n^{ X } \right) =P\left( X=0 \right) \ast n^{ 0 }+P\left( X=1 \right) \ast n^{ 1 }+P\left( X=2 \right) \ast n^{ 2 }+P\left( X=3 \right) \ast n^{ 3 }+P\left( X=4 \right) \ast n^{ 4 }+\cdots \)

Note: \( G\left( 1 \right) =P\left( X=0 \right) +P\left( X=1 \right) +P\left( X=2 \right) +P\left( X=3 \right) +\cdots =1 \), since the probabilities must sum to 1.

Example

Given the experiment of rolling a single die, find the probability generating function.

$$ G\left( n \right) =0\ast n^{ 0 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 1 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 2 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 3 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 4 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 5 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 6 } $$
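To make the definition concrete, the PGF of the die can be coded directly. The sketch below is illustrative Python (the `pmf` dictionary and `G` are our own names, not part of the text); it uses exact fractions and confirms the note above that \(G(1)=1\).

```python
from fractions import Fraction

# P(X = k) = 1/6 for k = 1..6 (and 0 for k = 0, matching the 0*n^0 term).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def G(n):
    """Probability generating function: G(n) = sum over k of P(X = k) * n**k."""
    return sum(p * n**k for k, p in pmf.items())

print(G(1))  # 1  (the probabilities sum to 1)
```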

It can be useful to express \(E(X)\) and \(Var(X)\) in terms of the probability generating function, evaluated at 1, as shown below:

$$ \begin{align*}
& E\left( X \right) =G^{ \prime }\left( 1 \right) \\
& Var\left( X \right) ={ G }^{ \prime \prime }\left( 1 \right) +{ G }^{ \prime }\left( 1 \right) -\left[ { G }^{ \prime }\left( 1 \right) \right] ^{ 2 } \\
\end{align*}
$$

Example

Given the experiment of rolling a single die, find \(E(X)\) and \(Var(X)\) using the probability generating function.

$$ \begin{align*}
& G\left( n \right) =0\ast n^{ 0 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 1 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 2 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 3 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 4 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 5 }+\left( \frac { 1 }{ 6 } \right) \ast n^{ 6 } \\
& G^{ \prime }\left( n \right) =\left( \frac { 1 }{ 6 } \right) +2\ast \left( \frac { 1 }{ 6 } \right) \ast n+3\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 2 }+4\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 3 }+5\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 4 }+6\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 5 } \\
& G^{ \prime }\left( 1 \right) =\left( \frac { 1 }{ 6 } \right) +2\ast \left( \frac { 1 }{ 6 } \right) \ast 1+3\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 2 }+4\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 3 }+5\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 4 }+6\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 5 }=3.5 \\
& G^{ \prime \prime }\left( n \right) =2\ast \left( \frac { 1 }{ 6 } \right) +3\ast 2\ast \left( \frac { 1 }{ 6 } \right) \ast n+4\ast 3\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 2 }+5\ast 4\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 3 }+6\ast 5\ast \left( \frac { 1 }{ 6 } \right) \ast n^{ 4 } \\
& G^{ \prime \prime }\left( 1 \right) =2\ast \left( \frac { 1 }{ 6 } \right) +3\ast 2\ast \left( \frac { 1 }{ 6 } \right) \ast 1+4\ast 3\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 2 }+5\ast 4\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 3 }+6\ast 5\ast \left( \frac { 1 }{ 6 } \right) \ast 1^{ 4 } = 11.667 \\
& Var\left( X \right) =G^{ \prime \prime }\left( 1 \right) +G^{ \prime }\left( 1 \right) -\left[ G^{ \prime }\left( 1 \right) \right] ^{ 2 }=11.667+3.5-3.5^{ 2 }=2.92 \\
\end{align*}
$$
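The same derivative calculations can be checked with a short Python sketch (the names are our own). Working with exact fractions shows the variance is exactly \(35/12 \approx 2.92\).

```python
from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}  # fair die

# G'(n) = sum of k*p_k*n^(k-1) and G''(n) = sum of k(k-1)*p_k*n^(k-2);
# at n = 1 the powers of n vanish, leaving finite sums.
G1 = sum(k * p for k, p in pmf.items())            # G'(1) = E(X)
G2 = sum(k * (k - 1) * p for k, p in pmf.items())  # G''(1)

mean = G1
var = G2 + G1 - G1**2
print(mean, var)  # 7/2 35/12
```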

The moment generating function is closely related to the probability generating function: substituting \( n={ e }^{ t } \) gives \( M\left( t \right) =G\left( { e }^{ t } \right) \).

The moment generating function of a discrete random variable is defined as:

$$ M\left( t \right) =E\left( e^{ tX } \right) =\sum _{ x }{ e^{ tx }f\left( x \right) } $$

Example

Given the experiment of rolling a single die, find the moment generating function.

\(M\left( t \right) =e^{ 1t }P\left( X=1 \right) +e^{ 2t }P\left( X=2 \right) +e^{ 3t }P\left( X=3 \right) +e^{ 4t }P\left( X=4 \right) +e^{ 5t }P\left( X=5 \right) +e^{ 6t }P\left( X=6 \right) \)

\( M\left( t \right) =e^{ t }\left( \frac { 1 }{ 6 } \right) +e^{ 2t }\left( \frac { 1 }{ 6 } \right) +e^{ 3t }\left( \frac { 1 }{ 6 } \right) +e^{ 4t }\left( \frac { 1 }{ 6 } \right) +e^{ 5t }\left( \frac { 1 }{ 6 } \right) +e^{ 6t }\left( \frac { 1 }{ 6 } \right) \)

As with the probability generating function, the moment generating function can be used to calculate \(E(X)\) and \(Var(X)\), as shown in the formulas below:

$$ \begin{align*}
& E \left(X \right)= \mu =M^{\prime} \left(0 \right) \\
& E \left(X^2 \right)=M^{\prime\prime}\left(0 \right) \\
& Var \left(X \right)= {\sigma}^2= M^{\prime\prime} \left(0 \right)-\left[M^{\prime} \left(0 \right) \right]^2 \\
\end{align*}
$$

Example

Given the experiment of rolling a single die, find \(E(X)\) and \(Var(X)\) using the moment generating function.

$$ \begin{align*}
& E \left(X \right)= \mu=M^{\prime}\left(0\right) \\
& M\left( t \right) =e^{ t }\left( \frac { 1 }{ 6 } \right) +e^{ 2t }\left( \frac { 1 }{ 6 } \right) +e^{ 3t }\left( \frac { 1 }{ 6 } \right) +e^{ 4t }\left( \frac { 1 }{ 6 } \right) +e^{ 5t }\left( \frac { 1 }{ 6 } \right) +e^{ 6t }\left( \frac { 1 }{ 6 } \right) \\
& { M }^{ \prime }\left( t \right) =\left( \frac { 1 }{ 6 } \right) e^{ t }+\left( \frac { 2 }{ 6 } \right) e^{ 2t }+\left( \frac { 3 }{ 6 } \right) e^{ 3t }+\left( \frac { 4 }{ 6 } \right) e^{ 4t }+\left( \frac { 5 }{ 6 } \right) e^{ 5t }+\left( \frac { 6 }{ 6 } \right) e^{ 6t } \\
& { M }^{ \prime }\left( 0 \right) =\left( \frac { 1 }{ 6 } \right) +\left( \frac { 2 }{ 6 } \right) +\left( \frac { 3 }{ 6 } \right) +\left( \frac { 4 }{ 6 } \right) +\left( \frac { 5 }{ 6 } \right) +\left( \frac { 6 }{ 6 } \right) =3.5 \\ \\
& Var\left( X \right) =\sigma ^{ 2 }=M^{ \prime\prime }\left( 0 \right) -\left[ M^{ \prime }\left( 0 \right) \right] ^{ 2 } \\
& { M }^{ \prime\prime }\left( t \right) =\left( \frac { 1 }{ 6 } \right) e^{ t }+\left( \frac { 4 }{ 6 } \right) e^{ 2t }+\left( \frac { 9 }{ 6 } \right) e^{ 3t }+\left( \frac { 16 }{ 6 } \right) e^{ 4t }+\left( \frac { 25 }{ 6 } \right) e^{ 5t }+\left( \frac { 36 }{ 6 } \right) e^{ 6t } \\
& { M }^{ \prime\prime }\left( 0 \right) =\left( \frac { 1 }{ 6 } \right) +\left( \frac { 4 }{ 6 } \right) +\left( \frac { 9 }{ 6 } \right) +\left( \frac { 16 }{ 6 } \right) +\left( \frac { 25 }{ 6 } \right) +\left( \frac { 36 }{ 6 } \right) =15.167 \\
& Var\left( X \right) =\sigma ^{ 2 }=M^{ \prime\prime }\left( 0 \right) -\left[ M^{ \prime }\left( 0 \right) \right] ^{ 2 }=15.167-3.5^{ 2 }=2.92 \\
\end{align*}
$$
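The derivative evaluations above reduce to finite sums, which a Python sketch (names are our own) can verify exactly: \(M'(0)=7/2\) and \(M''(0)=91/6\), giving the same variance of \(35/12\).

```python
from fractions import Fraction

# For the die, M(t) = sum of e^{kt}/6.  Each derivative in t multiplies the
# k-th term by k, and e^{k*0} = 1, so M'(0) and M''(0) are finite sums.
M1 = sum(Fraction(k, 6) for k in range(1, 7))      # M'(0) = E(X) = 7/2
M2 = sum(Fraction(k * k, 6) for k in range(1, 7))  # M''(0) = E(X^2) = 91/6

var = M2 - M1**2
print(M1, var)  # 7/2 35/12
```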

The moment generating function of a continuous random variable is defined analogously, with the sum replaced by an integral:
$$ M\left( t \right) =\int _{ -\infty }^{ \infty }{ { e }^{ tx }f\left( x \right)dx } $$

Example

Given the following probability density function of a continuous random variable:

$$ f\left( x \right) =\begin{cases} .2{ e }^{ -.2x }, & 0\le x<\infty \\ 0, & \text{otherwise} \end{cases} $$

Find the moment generating function.

$$ \begin{align*}
& M\left( t \right) =\int _{ -\infty }^{ \infty }{ { e }^{ tx }f\left( x \right) dx } =\int _{ 0 }^{ \infty }{ { e }^{ tx }\ast \left( .2{ e }^{ -.2x } \right) \ast dx } \\
& M\left( t \right) =\int _{ 0 }^{ \infty }{ .2{ e }^{ x\left( t-.2 \right) }dx } ={ \left[ \frac { .2{ e }^{ x\left( t-.2 \right) } }{ t-.2 } \right] }_{ x=0 }^{ x\rightarrow \infty }=-\frac { .2 }{ t-.2 }, \quad t<.2 \\
\end{align*}
$$
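The closed form can be spot-checked by numerical integration. The sketch below (plain Python; `mgf_numeric` is our own helper, and the truncation point and step count are arbitrary choices) approximates the integral at a sample value \(t = .1 < .2\), where the closed form gives \(-.2/(.1-.2)=2\).

```python
import math

lam = 0.2  # rate of the density f(x) = .2 e^{-.2x}

def mgf_numeric(t, upper=300.0, steps=20_000):
    """Trapezoidal estimate of the integral of e^{tx} * lam*e^{-lam*x}
    over [0, upper], which approximates M(t) when t < lam."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * lam * math.exp((t - lam) * x)
    return total * h

t = 0.1
closed_form = -lam / (t - lam)  # = .2/.1 = 2 at t = .1
print(mgf_numeric(t), closed_form)  # both ≈ 2.0
```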

As with the moment generating function of a discrete distribution, we can use the moment generating function of a continuous distribution to calculate \(E(X)\) and \(Var(X)\) using the formulas below:

$$ \begin{align*}
& E \left(X \right)= \mu=M^{\prime}\left(0 \right) \\
& E \left(X^2 \right)=M^{\prime\prime} \left(0 \right) \\
& Var \left(X \right)= {\sigma}^2= M^{\prime\prime} \left(0 \right)-\left[M^{\prime} \left(0\right) \right]^2 \\
\end{align*}
$$

Example

Given the moment generating function shown below, calculate \(E(X)\) and \(Var(X)\).

$$ \begin{align*}
& M \left(t \right)=-{\frac {.2}{t-.2}}=-.2\left(t-.2 \right)^{-1} \\
& M^{\prime} \left(t \right)= .2 \left(t-.2 \right)^{-2} \\
& E \left(X \right)=M^{\prime} \left(0 \right)=.2\left(-.2 \right)^{-2}={\frac {.2}{.04}}=5 \\
& M^{\prime\prime} (t)= -2 \ast .2 \left(t-.2 \right)^{-3}=-.4\left(t-.2\right)^{-3} \\
& M^{\prime\prime} \left(0 \right)= -.4\left(-.2 \right)^{-3}=50 \\
& Var \left(X \right)= {\sigma}^{2}= M^{\prime\prime} \left(0 \right)-\left[M^{\prime} \left(0 \right) \right]^2=50-{5}^{2}=25 \\
\end{align*}
$$
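These derivative values can be reproduced exactly with rational arithmetic (the sketch below uses our own names, with \(\lambda = 1/5 = .2\)); it confirms the exponential pattern \(E(X)=1/\lambda\) and \(Var(X)=1/\lambda^2\).

```python
from fractions import Fraction

lam = Fraction(1, 5)  # λ = .2

# M(t) = λ/(λ - t), so M'(t) = λ/(λ - t)^2 and M''(t) = 2λ/(λ - t)^3.
M1 = lam / lam**2      # M'(0)  = 1/λ
M2 = 2 * lam / lam**3  # M''(0) = 2/λ²

mean = M1
var = M2 - M1**2
print(mean, var)  # 5 25
```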

Moment Generating Functions of Common Distributions

Binomial

$$ M \left(t \right)={\left(p{e}^{t}+1-p \right)}^n $$

Negative Binomial

$$ M\left( t \right) ={ \left[ \frac { p{ e }^{ t } }{ 1-\left( 1-p \right) { e }^{ t } } \right] }^{ r } $$

Geometric

$$ M\left( t \right) =\frac { p{ e }^{ t } }{ 1-\left( 1-p \right) { e }^{ t } } $$

Poisson

$$ M\left( t \right) =\exp\left\{ \lambda \left( { e }^{ t }-1 \right) \right\} $$

Uniform

$$ M\left( t \right)={\frac {\left (e^{tb}-e^{ta} \right)}{t \left(b-a \right) }} $$

Exponential

$$ M\left( t \right)={\frac {\lambda}{\lambda-t}} $$

Gamma

$$ M\left( t \right)={{\left[\frac {\lambda}{\lambda-t}\right]}^s} $$

Normal

$$ M\left( t \right) =\exp\left( \mu t+\frac { { \sigma }^{ 2 }{ t }^{ 2 } }{ 2 } \right) $$
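Any entry in the table can be sanity-checked by differentiating numerically at \(t=0\). The sketch below (our own helper names; \(\lambda = 3\) is an arbitrary sample value) applies central differences to the Poisson MGF from the table and recovers the familiar Poisson mean and variance, both equal to \(\lambda\).

```python
import math

lam = 3.0  # sample Poisson rate

def M(t):
    # Poisson MGF from the table: M(t) = exp{λ(e^t − 1)}.
    return math.exp(lam * (math.exp(t) - 1.0))

h = 1e-5
M1 = (M(h) - M(-h)) / (2 * h)             # central difference ≈ M'(0) = E(X)
M2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2   # second difference ≈ M''(0) = E(X²)

print(M1, M2 - M1**2)  # both ≈ 3.0 (mean and variance of Poisson(3))
```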


Learning Outcome

Topic 2.e: Univariate Random Variables – Define probability generating functions and moment generating functions and use them to calculate probabilities and moments.

