Probability Generating Functions and Moment Generating Functions
Given that \(X\) and \(Y\) are independent continuous random variables, the probability density function of \(X+Y\), evaluated at \(a\), is given by the equation below:
$$ { f }_{ X+Y }\left( a \right) =\int _{ -\infty }^{ \infty }{ { f }_{ X }\left( a-y \right) } { f }_{ Y }\left( y \right) dy $$
The cumulative distribution function of \(X+Y\), called the convolution of the distributions of \(X\) and \(Y\), is given by the equation below:
$$ { F }_{ X+Y }\left( a \right) =\int _{ -\infty }^{ \infty }{ { F }_{ X }\left( a-y \right) } { f }_{ Y }\left( y \right) dy $$
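Before moving on, the density convolution formula above can be sanity-checked numerically. The sketch below is purely illustrative and not part of the example that follows: the choice of two standard normal densities and the integration limits are our own assumptions. It compares the numerical convolution with the known \(N(0,2)\) density of the sum:

```python
import math
from scipy.integrate import quad

def phi(u):
    # Standard normal density (an arbitrary illustrative choice).
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

def f_sum(a):
    # f_{X+Y}(a) = integral of f_X(a - y) f_Y(y) dy over all y.
    val, _ = quad(lambda y: phi(a - y) * phi(y), -10, 10)
    return val

# If X and Y are standard normal, X + Y ~ N(0, 2).
for a in (-1.0, 0.0, 2.0):
    exact = math.exp(-a * a / 4) / math.sqrt(4 * math.pi)
    assert abs(f_sum(a) - exact) < 1e-6
```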
Given two independent uniform random variables with the probability density functions shown below, find the probability density function of \(X+Y\).
$$ f\left( x \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le x\le 2 \\ 0 & \text{otherwise} \end{cases} \quad f\left( y \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le y\le 2 \\ 0 & \text{otherwise} \end{cases}$$
Solution
We know that:
$$ { f }_{ X+Y }\left( a \right) =\int _{ 0 }^{ 2 }{ { f }_{ X }\left( a-y \right) } \left( \cfrac{ 1 }{ 2 } \right) dy $$
since \(f_Y(y)=\frac{1}{2}\) only for \(0\le y\le 2\). Similarly, \(f_X(a-y)=\frac{1}{2}\) only when \(0\le a-y\le 2\). Therefore, the integrand equals:
$$ \left( \cfrac{ 1 }{ 2 }\right) \left( \cfrac{ 1 }{ 2 } \right) =\cfrac{1}{4} $$
wherever both densities are nonzero, and is zero elsewhere.
We now need to find the interval of values taken by \(a=X+Y\). Since \(X\) varies over the interval \([0,2]\) and \(Y\) varies over the interval \([0,2]\), the sum \(a=X+Y\) must vary over the interval \([0,4]\). However, the limits of integration take different forms on the two sub-intervals \([0,2)\) and \((2,4]\).
Considering the first interval, \([0,2)\), we integrate from \(y=0\) to \(y=a\): the lower limit is \(0\) because \(Y\ge 0\), and the upper limit is \(a\) because \(f_X(a-y)\) is nonzero only when \(a-y\ge 0\), i.e., \(y\le a\). Thus, for this case:
$$ { f }_{ X+Y }\left( a \right) =\int _{ 0 }^{ a }{ \left( \frac { 1 }{ 2 } \right) } \left( \cfrac{ 1 }{ 2 } \right) dy=\cfrac { 1 }{ 4 } a \quad \text{for } 0 \le a < 2 $$
For the interval \((2,4]\), we integrate from \(y=a-2\) to \(y=2\): the upper limit is \(2\) because \(Y\le 2\), and the lower limit is \(a-2\) because \(f_X(a-y)\) is nonzero only when \(a-y\le 2\), i.e., \(y\ge a-2\). Thus:
$$ \begin{align}{ f }_{ X+Y }\left( a \right) &=\int _{ a-2 }^{ 2 }{ \left( \frac { 1 }{ 2 } \right) } \left( \frac { 1 }{ 2 } \right) dy\\ &={ \left[ \frac { 1 }{ 4 } y \right] }_{ y=a-2 }^{ y=2 }\\&=\left( \frac { 1 }{ 4 } \right) \left( 2-\left( a-2 \right) \right)\\& =\left( \frac { 1 }{ 4 } \right) \left( 4-a \right) \text{ for } 2 < a < 4 \end{align}$$
Therefore:
$${ f }_{ X+Y }\left( a \right) =\begin{cases} \frac { 1 }{ 4 } a & 0\le a\le 2 \\ \left( \frac { 1 }{ 4 } \right) \left( 4-a \right) & 2\le a\le 4 \\ 0 & \text{otherwise} \end{cases}$$
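As an optional check, this triangular density can be verified by simulation. The sketch below is a minimal illustration; the sample size, seed, and window half-width are arbitrary choices. It estimates the density of \(X+Y\) from uniform draws and compares it with the piecewise formula:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
s = rng.uniform(0, 2, n) + rng.uniform(0, 2, n)  # draws of X + Y

def f(a):
    # The piecewise density derived above.
    if 0 <= a <= 2:
        return a / 4
    if 2 < a <= 4:
        return (4 - a) / 4
    return 0.0

h = 0.05  # half-width of the window used to estimate the density
for point in (0.5, 1.0, 2.5, 3.5):
    est = np.mean(np.abs(s - point) < h) / (2 * h)
    assert abs(est - f(point)) < 0.01
```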
If \(X\) and \(Y\) are independent random variables, then the following are true:
$$ P \left(X=x \text{ and } Y=y \right) = P\left(X=x \right) \bullet P\left(Y=y \right)=f \left(x \right)\bullet f \left(y \right) $$
Given the experiment of rolling two dice simultaneously, find the probability of rolling a 6 on both dice.
$$ P \left(X=6 \text{ and } Y=6 \right) = P \left(X=6 \right) \bullet P\left(Y=6 \right) = \cfrac {1}{6} \bullet \cfrac {1}{6} = \cfrac {1}{36} $$
It is also true that for two independent random variables:
$$ E \left(XY \right)=E \left(X \right) \bullet E\left(Y \right) $$
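Both independence facts can be checked by simulating the two-dice experiment. The sketch below is illustrative (sample size and seed are arbitrary); it verifies that \(P(X=6 \text{ and } Y=6)\approx \frac{1}{36}\) and that \(E(XY)\approx E(X)\bullet E(Y)=3.5\times3.5=12.25\):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.integers(1, 7, n)  # first die
y = rng.integers(1, 7, n)  # second die

# P(X = 6 and Y = 6) should be close to 1/36.
assert abs(np.mean((x == 6) & (y == 6)) - 1 / 36) < 0.001

# E(XY) should be close to E(X) * E(Y) = 3.5 * 3.5 = 12.25.
assert abs(np.mean(x * y) - 12.25) < 0.05
```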
Given two independent uniform random variables with the probability density functions shown below, find \(E(X+Y)\) and \(Var (X+Y)\).
$$ f\left( x \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le x\le 2 \\ 0 & \text{otherwise} \end{cases} \quad f\left( y \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le y\le 2 \\ 0 & \text{otherwise} \end{cases}$$
Solution
We know that for two independent random variables:
$$
{E}\left( {X}+ {Y}\right)= {E}\left( {X}\right)+ {E}\left( {Y}\right)
$$
Now, using the fact that the mean of a uniform random variable on \([a,\ b]\) is \(\frac{a+b}{2}\):
$$
E\left(X\right)=\frac{2+0}{2}=1
$$
And
$$
{E}\left( {Y}\right)=\frac{2+0}{2}=1 \\
\Rightarrow\ {E}\left( {X}+ {Y}\right)=1+1=2
$$
For the variance, we know that:
$$
Var\left(X+Y\right)=Var\left(X\right)+Var\left(Y\right)
$$
Using the fact that the variance of a uniform random variable on \([a,\ b]\) is \(\frac{(b-a)^2}{12}\), we have:
$$
{Var}\left( {X}\right)=\frac{\left(2-0\right)^2}{12}=\frac{1}{3}
$$
And
$$ \begin{align*}
{Var}\left(Y\right) & =\frac{\left(2-0\right)^2}{12}=\frac{1}{3} \\
\Rightarrow\ {Var}\left( {X}+ {Y}\right) & =\frac{1}{3}+\frac{1}{3}=\frac{2}{3}
\end{align*} $$
We can also use the probability density function of \(X+Y\) that was derived above to verify this solution:
$$ \begin{align*}
f_{X+Y}\left(x\right)&=\begin{cases} \frac { 1 }{ 4 } x, & 0\le x\le 2 \\ \left( \frac { 1 }{ 4 } \right) \left( 4-x \right), & 2\le x\le 4 \\ 0 & \text{otherwise} \end{cases} \\
E\left(X+Y\right)&=\int_{0}^{2}{\left(\frac{1}{4}\right)x \cdot x\ dx}+\int_{2}^{4}{\left(\frac{1}{4}\right)\left(4-x\right)x\ dx} \\
&=\left[\frac{1}{12}x^3\right]_{x=0}^{x=2}+\left[\left(\frac{1}{2}\right)x^2-\frac{1}{12}x^3\right]_{x=2}^{x=4}=\frac{2}{3}+\frac{4}{3}=\frac{6}{3}=2
\end{align*} $$
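If you prefer a symbolic check, the same integrals can be evaluated with sympy. The second-moment integral below is our own addition, used to confirm the variance as well as the mean:

```python
import sympy as sp

x = sp.symbols('x')
# First and second moments of X + Y from the piecewise density above.
m1 = sp.integrate(x * x / 4, (x, 0, 2)) + sp.integrate(x * (4 - x) / 4, (x, 2, 4))
m2 = sp.integrate(x**2 * x / 4, (x, 0, 2)) + sp.integrate(x**2 * (4 - x) / 4, (x, 2, 4))
print(m1)           # 2
print(m2 - m1**2)   # 2/3, agreeing with Var(X) + Var(Y)
```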
It is also true that for two independent random variables,
$$
M_{X+Y}\left(t\right)=M_X\left(t\right)\times M_Y\left(t\right)
$$
Given the following moment generating functions of independent random variables, \(X\) and \(Y\), find the moment generating function of \(X+Y\).
$$ { M }_{ X }\left( t \right) =\exp\left\{ 0.2\left( { e }^{ t }-1 \right) \right\} $$
And,
$$ { M }_{ Y }\left( t \right) =\exp\left\{ 0.3\left( { e }^{ t }-1 \right) \right\} $$
Solution
We know that for two independent random variables:
$$
M_{X+Y}\left(t\right)=M_X\left(t\right)\times M_Y\left(t\right)
$$
Thus, in this case:
$$ \begin{align*}
M_{X+Y}\left(t\right)&=e^{0.2\left(e^t-1\right)}\times e^{0.3\left(e^t-1\right)} \\
&=e^{0.2e^t-0.2+0.3e^t-0.3} \\
&=e^{0.5e^t-0.5}=e^{0.5\left(e^t-1\right)}
\end{align*} $$
Note that this is the MGF of a Poisson random variable with parameter \(\lambda=0.5\).
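A quick numerical check of this product (the evaluation points below are arbitrary) confirms the simplification:

```python
import math

def m_x(t):
    return math.exp(0.2 * (math.exp(t) - 1))

def m_y(t):
    return math.exp(0.3 * (math.exp(t) - 1))

def m_xy(t):
    return math.exp(0.5 * (math.exp(t) - 1))

# The product of the individual MGFs should equal the derived MGF.
for t in (-1.0, 0.0, 0.5, 1.0):
    assert math.isclose(m_x(t) * m_y(t), m_xy(t), rel_tol=1e-12)
```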
If \(X\) and \(Y\) are independent Poisson random variables with parameters \(\lambda_x\) and \(\lambda_y\) respectively, then \(X+Y\) has a Poisson distribution with parameter \(\lambda=\lambda_x+\lambda_y\).
Prove that the sum of two independent Poisson random variables also follows a Poisson distribution.
Solution
The probability generating function (PGF) of a discrete random variable \(X\) is given by:
$$ G\left(t\right)=\sum_{\text{all } x}{P\left(X=x\right)t^x} $$
Consider \( X \sim Po(\lambda) \), where \( P\left(X=x\right)=\frac{\lambda^x e^{-\lambda}}{x!} \). Then:
$$ \begin{align*}
G_X\left(t\right)&=E\left(t^X\right) \\
&=\sum_{x=0}^{\infty}{t^x\ \frac{\lambda^x e^{-\lambda}}{x!}} \\
&=e^{-\lambda}\sum_{x=0}^{\infty}\frac{\left(t\lambda\right)^x}{x!} \\
&=e^{-\lambda}e^{t\lambda} \\
&=e^{-\lambda+t\lambda} \\
G_X\left(t\right)&=e^{-\lambda\left(1-t\right)}
\end{align*} $$
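For what it's worth, sympy can reproduce the series step of this derivation directly; the sketch below is just a machine check of the summation:

```python
import sympy as sp

x = sp.symbols('x', integer=True, nonnegative=True)
t, lam = sp.symbols('t lambda', positive=True)

# G_X(t) = e^{-lambda} * sum over x of (t*lambda)^x / x!
G = sp.exp(-lam) * sp.summation((t * lam)**x / sp.factorial(x), (x, 0, sp.oo))
print(sp.simplify(G))  # exp(lambda*(t - 1)), i.e. e^{-lambda(1 - t)}
```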
Similarly, let \( Y \sim Po(\mu) \), so that:
$$ G_Y\left(t\right)=e^{-\mu\left(1-t\right)} $$
Thus, since \(X\) and \(Y\) are independent,
$$ \begin{align*}
G_{X+Y}\left(t\right)&=G_X\left(t\right)\times G_Y\left(t\right) \\
&=e^{-\lambda\left(1-t\right)}e^{-\mu\left(1-t\right)} \\
&=e^{-(\lambda+\mu)\left(1-t\right)}
\end{align*} $$
Hence \( X+Y\sim Po(\lambda+\mu) \).
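This result can also be verified by simulation. The sketch below (the choices \(\lambda=2\), \(\mu=3\), the seed, and the sample size are arbitrary) compares the empirical PMF of the sum with the \(Po(5)\) PMF:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
n = 1_000_000
s = rng.poisson(2.0, n) + rng.poisson(3.0, n)  # lambda = 2, mu = 3

# The empirical PMF of the sum should match the Poisson(5) PMF.
for k in range(8):
    assert abs(np.mean(s == k) - poisson.pmf(k, 5.0)) < 0.002
```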
Let \(X\), \(Y\), and \(Z\) be independent Poisson random variables with \( {E}\left( {X}\right)=1,\ {E}\left( {Y}\right)=2 \text{ and } {E}\left( {Z}\right)=3\).
Find \({P}( {X}+ {Y}+ {Z}\le2) \).
Solution
Fact 1:
The Poisson PMF is given by:
$$ \begin{align*}
&P\left(X=x\right)=\frac{e^{-\lambda}\lambda^x}{x!},\ \ \ x=0,1,2,3,\ldots \\
&E\left(X\right)=\lambda
\end{align*} $$
Fact 2:
If \( X\sim \text{Poisson}(\lambda_1) \)
and \( Y\sim \text{Poisson}\left(\lambda_2\right) \), with \(X\) and \(Y\) independent,
then \( X+Y\sim \text{Poisson}(\lambda_1+\lambda_2) \).
So,
$$ \begin{align*}
&X+Y+Z \sim \text{Poisson}\left(\lambda_1+\lambda_2+\lambda_3\right), \quad \lambda=1+2+3=6 \\
&P\left(X+Y+Z\le2\right) \\
&=P\left(X+Y+Z=0\right)+P\left(X+Y+Z=1\right)+P\left(X+Y+Z=2\right) \\
&=\frac{e^{-6}6^0}{0!}+\frac{e^{-6}6^1}{1!}+\frac{e^{-6}6^2}{2!} \\
&=e^{-6}(1+6+18) \\
&=25e^{-6}=0.061968\ldots\approx0.062
\end{align*} $$
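The hand calculation can be confirmed against scipy's Poisson CDF:

```python
import math
from scipy.stats import poisson

# P(X + Y + Z <= 2) where X + Y + Z ~ Poisson(6).
print(poisson.cdf(2, 6))   # 0.06196880...
print(25 * math.exp(-6))   # the hand calculation above; same value
```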
If \(X\) and \(Y\) are independent normally distributed random variables with parameters \(\mu_x,\ \sigma_x^2\) and \(\mu_y,\ \sigma_y^2\) respectively, then \(X+Y\) is normally distributed with parameters \(\mu=\mu_x+\mu_y\) and \(\sigma^2=\sigma_x^2+\sigma_y^2\).
Suppose that \(X\) is a normal random variable with a given mean and variance so that:
$$ f_X\left(x\right)=\frac{1}{\sqrt{2\pi\sigma_x^2}}e^{-\frac{\left(x-\mu_x\right)^2}{2\sigma_x^2}} $$
Similarly, \(Y\) is normal with a given mean and variance so that:
$$ f_Y\left(y\right)=\frac{1}{\sqrt{2\pi\sigma_y^2}}e^{-\frac{\left(y-\mu_y\right)^2}{2\sigma_y^2}} $$
Assume that \(X\) and \(Y\) are independent. We wish to find the sum of the two normal random variables by deriving the PDF of \(Z\):
$$ {Z}= {X}+ {Y} $$
We apply the convolution formula, where \( X\sim N\left(\mu_x,\ \sigma_x^2\right)\text{ and } Y\sim N(\mu_y,\ \sigma_y^2) \):
$$ f_Z\left(z\right)=\int_{-\infty}^{\infty}{f_X\left(x\right)f_Y(z-x)\,dx} $$
We substitute the density of \(X\) for \(f_X(x)\) and evaluate the density of \(Y\) at \(z-x\) for \(f_Y(z-x)\).
Therefore,
$$ \begin{align*}
f_Z(z) & =\int _{ -\infty }^{ \infty }{ \cfrac { 1 }{ \sqrt { 2\pi { \sigma }_{ x }^{ 2 } } } \exp } \left\{ -\cfrac { { \left( x-{ \mu }_{ x } \right) }^{ 2 } }{ 2{ \sigma }_{ x }^{ 2 } } \right\} \cfrac { 1 }{ \sqrt { 2\pi { \sigma }_{ y }^{ 2 } } } \exp\left\{ -\cfrac { { \left( z-x-{ \mu }_{ y } \right) }^{ 2 } }{ 2{ \sigma }_{ y }^{ 2 } } \right\} dx \\
f_Z(z) & =\cfrac { 1 }{ \sqrt { 2\pi \left( { \sigma }_{ x }^{ 2 }+{ \sigma }_{ y }^{ 2 } \right) } } \exp\left\{ -\cfrac { { \left( z-{ \mu }_{ x }-{ \mu }_{ y } \right) }^{ 2 } }{ 2\left( { \sigma }_{ x }^{ 2 }+{ \sigma }_{ y }^{ 2 } \right) } \right\}
\end{align*} $$
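A numerical sketch of this convolution (the means and standard deviations below are arbitrary example values, not from the derivation) confirms that the integral matches the closed-form \(N(\mu_x+\mu_y,\ \sigma_x^2+\sigma_y^2)\) density:

```python
import math
from scipy.integrate import quad

mu_x, s_x = 1.0, 2.0    # arbitrary example parameters for X
mu_y, s_y = -0.5, 1.5   # arbitrary example parameters for Y

def npdf(u, mu, sigma):
    return math.exp(-(u - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def f_z(z):
    # Convolution integral f_Z(z) = integral of f_X(x) f_Y(z - x) dx.
    val, _ = quad(lambda x: npdf(x, mu_x, s_x) * npdf(z - x, mu_y, s_y), -30, 30)
    return val

# Closed form: Z ~ N(mu_x + mu_y, s_x^2 + s_y^2).
for z in (-2.0, 0.5, 3.0):
    exact = npdf(z, mu_x + mu_y, math.sqrt(s_x**2 + s_y**2))
    assert abs(f_z(z) - exact) < 1e-6
```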
Note:
The argument above covers the sum of two independent normal random variables. Now suppose we have the sum of three independent normal random variables, \(X+Y+W\).
From the above discussion, \(X+Y\) is normal, and \(W\) is assumed to be normal. We also assume that \(X\), \(Y\), and \(W\) are all independent.
Therefore, \(X+Y\) is independent of \(W\), and we are again dealing with the sum of two independent normal random variables. It follows that \(X+Y+W\) is also normal.
Learning Outcome
Topic 2.f: Univariate Random Variables – Determine the sum of independent random variables (Poisson and normal).