### Determine the sum of independent random variables (Poisson and normal)

If $$X$$ and $$Y$$ are independent continuous random variables, then the probability density function of $$X+Y$$ is given by the equation below:

$${ f }_{ X+Y }\left( a \right) =\int _{ -\infty }^{ \infty }{ { f }_{ X }\left( a-y \right) } { f }_{ Y }\left( y \right) dy$$

The cumulative distribution function of $$X+Y$$, called the convolution of the distributions of $$X$$ and $$Y$$, is given by the equation below:

$${ F }_{ X+Y }\left( a \right) =\int _{ -\infty }^{ \infty }{ { F }_{ X }\left( a-y \right) } { f }_{ Y }\left( y \right) dy$$
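As a numerical illustration (a sketch, not part of the original text), the convolution integral can be approximated with a midpoint Riemann sum. Here we assume two standard exponential densities, whose sum is known to have density $$f_{X+Y}(a)=a{e}^{-a}$$:

```python
import math

def f_exp(x):
    """Standard exponential pdf (rate 1): nonzero only for x >= 0."""
    return math.exp(-x) if x >= 0 else 0.0

def conv_pdf(a, f_x, f_y, lo=0.0, hi=20.0, n=20000):
    """Approximate f_{X+Y}(a) = integral of f_X(a-y) * f_Y(y) dy
    with a midpoint Riemann sum over [lo, hi]."""
    dy = (hi - lo) / n
    total = 0.0
    for i in range(n):
        y = lo + (i + 0.5) * dy
        total += f_x(a - y) * f_y(y) * dy
    return total

approx = conv_pdf(1.0, f_exp, f_exp)
exact = 1.0 * math.exp(-1.0)  # a * e^{-a} evaluated at a = 1
```

The same helper works for any pair of densities, including the uniform example worked out below.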

Example

Given two independent uniform random variables with the probability density functions shown below, find the probability density function of $$X+Y$$.

$$f\left( x \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le x\le 2 \\ 0 & otherwise \end{cases} \quad f\left( y \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le y\le 2 \\ 0 & otherwise \end{cases}$$

$${ f }_{ X+Y }\left( a \right) =\int _{ -\infty }^{ \infty }{ { f }_{ X }\left( a-y \right) } \left( { 1 }/{ 2 } \right) dy$$

Since $$f(y)$$ is nonzero only for $$y$$ between 0 and 2,

$${ f }_{ X+Y }\left( a \right) =\int _{ 0 }^{ 2 }{ { f }_{ X }\left( a-y \right) } \left( { 1 }/{ 2 } \right) dy$$

Also, since $$f(x)$$ is nonzero only for $$x$$ between 0 and 2, the integrand is nonzero only when $$0\le a-y\le 2$$, and

\begin{align*} & { f }_{ X+Y }\left( a \right) =\int _{ 0 }^{ a }{ \left( \frac { 1 }{ 2 } \right) } \left( \frac { 1 }{ 2 } \right) dy={ \left[ \frac { 1 }{ 4 } y \right] }_{ y=0 }^{ y=a }=\frac { 1 }{ 4 } a\quad for\quad 0 < a < 2 \\ & { f }_{ X+Y }\left( a \right) =\int _{ a-2 }^{ 2 }{ \left( \frac { 1 }{ 2 } \right) } \left( \frac { 1 }{ 2 } \right) dy={ \left[ \frac { 1 }{ 4 } y \right] }_{ y=a-2 }^{ y=2 }=\left( \frac { 1 }{ 4 } \right) \left( 2-\left( a-2 \right) \right) =\left( \frac { 1 }{ 4 } \right) \left( 4-a \right) \quad for\quad 2 < a < 4 \\ \end{align*}
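As an informal sanity check (a simulation sketch with a fixed seed, not part of the derivation), we can sample the two uniforms and compare empirical probabilities against values implied by the triangular density just obtained:

```python
import random

random.seed(42)  # fixed seed so the check is reproducible

N = 200_000
samples = [random.uniform(0, 2) + random.uniform(0, 2) for _ in range(N)]

# From the derived density: P(X+Y <= 2) = integral of a/4 over [0,2] = 1/2,
# and P(X+Y <= 3) = 1 - integral of (4-a)/4 over [3,4] = 7/8.
p_le_2 = sum(s <= 2 for s in samples) / N
p_le_3 = sum(s <= 3 for s in samples) / N
```

Both empirical proportions land within Monte Carlo error of the exact values 0.5 and 0.875.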

If $$X$$ and $$Y$$ are independent random variables, then the following are true:

$$P \left(X=x\ \text{and}\ Y=y \right) = P\left(X=x \right) \ast P\left(Y=y \right)={ f }_{ X } \left(x \right)\ast { f }_{ Y } \left(y \right)$$

Example

Given the experiment of rolling two dice simultaneously, find the probability of rolling a 6 on both dice.

$$P \left(X=6 \quad and \quad Y=6 \right) = P \left(X=6 \right) \ast P\left(Y=6 \right) = {1}/{6} \ast {1}/{6} = {1}/{36}$$
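The multiplication rule can be confirmed by brute-force enumeration of the 36 equally likely outcomes (a small illustrative sketch, not from the original text):

```python
from fractions import Fraction

# Enumerate all 36 equally likely (die1, die2) outcomes.
outcomes = [(x, y) for x in range(1, 7) for y in range(1, 7)]

# Exactly one outcome, (6, 6), has a six on both dice.
p_both_six = Fraction(sum(1 for x, y in outcomes if x == 6 and y == 6),
                      len(outcomes))
```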

$$E \left(XY \right)=E \left(X \right) \ast E\left(Y \right)$$

Example

Given the experiment of rolling two dice simultaneously, find $$E(XY)$$.

$$E\left(XY \right) = E \left(X \right) \ast E \left(Y \right) = 3.5 \ast 3.5 = 12.25$$

since a fair die has $$E(X)=E(Y)={\left(1+2+3+4+5+6\right)}/{6}=3.5$$.

The expectation of a sum is also additive (this holds even without independence):

$$E \left(X+Y \right)=E \left(X \right)+E \left(Y \right)$$
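The value $$E(XY)=12.25$$ can be double-checked by enumerating the joint outcomes directly (illustrative sketch):

```python
# E(XY) for two fair dice, averaging over the 36 equally likely outcomes.
e_xy = sum(x * y for x in range(1, 7) for y in range(1, 7)) / 36

# Each die alone has E = (1 + 2 + ... + 6) / 6 = 3.5,
# so independence gives E(XY) = 3.5 * 3.5 = 12.25.
e_x = sum(range(1, 7)) / 6
```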

$$Var \left(X+Y \right)=Var \left(X \right)+Var \left(Y \right)$$

Example

Given two independent uniform random variables shown by the probability density functions below, find $$E(X+Y)$$ and $$Var(X+Y)$$.

$$f\left( x \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le x\le 2 \\ 0 & otherwise \end{cases} \quad f\left( y \right) =\begin{cases} \frac { 1 }{ 2 } & 0\le y\le 2 \\ 0 & otherwise \end{cases}$$

\begin{align*} & E \left(X+Y \right) = E \left(X \right) + E \left(Y \right) \\ & E \left(X \right) = {\left(2+0\right)}/{2} = 1 \\ & E \left(Y \right) = {\left(2+0\right)}/{2} = 1 \\ & E \left(X+Y \right) = 1 + 1 = 2 \\ \\ & Var \left (X+Y \right) = Var \left(X \right) + Var \left(Y \right) \\ & Var \left(X \right) = {\left(2-0 \right)^2}/{12} = {1}/{3} \\ & Var \left(Y \right) = {\left(2-0 \right)^2}/{12} = {1}/{3} \\ & Var \left (X+Y \right) = {1}/{3}+{1}/{3}={2}/{3} \\ \end{align*}
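A quick simulation (a sketch with a fixed seed, not part of the original solution) agrees with $$E(X+Y)=2$$ and $$Var(X+Y)=2/3$$:

```python
import random

random.seed(0)  # reproducible

N = 200_000
sums = [random.uniform(0, 2) + random.uniform(0, 2) for _ in range(N)]

mean = sum(sums) / N                            # should be near 2
var = sum((s - mean) ** 2 for s in sums) / N    # should be near 2/3
```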

We can also use the probability density function of $$X+Y$$ that was derived above to verify the expected value:

$${ f }_{ X+Y }\left( x \right) =\begin{cases} \cfrac { 1 }{ 4 } x & 0\le x\le 2 \\ \left( \cfrac { 1 }{ 4 } \right) \left( 4-x \right) & 2\le x\le 4 \\ 0 & otherwise \end{cases}$$

\begin{align*} E\left( X+Y \right) & =\int _{ 0 }^{ 2 }{ \left( \frac { 1 }{ 4 } \right) x\ast x\, dx } +\int _{ 2 }^{ 4 }{ \left( \frac { 1 }{ 4 } \right) \left( 4-x \right) \ast x\, dx } \\ & ={ \left[ \frac { 1 }{ 12 } { x }^{ 3 } \right] }_{ x=0 }^{ x=2 }+{ \left[ \left( \frac { 1 }{ 2 } \right) { x }^{ 2 }-\frac { 1 }{ 12 } { x }^{ 3 } \right] }_{ x=2 }^{ x=4 } \\ & =\frac { 2 }{ 3 } +\frac { 4 }{ 3 } =2 \\ \end{align*}
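The same verification can be done numerically with a midpoint Riemann sum over the piecewise density (a small sketch, not from the original text):

```python
def f_sum(x):
    """Triangular density of X + Y derived above."""
    if 0 <= x <= 2:
        return x / 4
    if 2 < x <= 4:
        return (4 - x) / 4
    return 0.0

# Midpoint rule for E(X+Y) = integral of x * f(x) over [0, 4].
n, lo, hi = 40_000, 0.0, 4.0
dx = (hi - lo) / n
expected = sum((lo + (i + 0.5) * dx) * f_sum(lo + (i + 0.5) * dx) * dx
               for i in range(n))
```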

If $$X$$ and $$Y$$ are independent, the moment generating function of $$X+Y$$ is the product of their individual moment generating functions:

\begin{align*} & { M }_{ X+Y }\left( t \right) ={ M }_{ X }\left( t \right) \ast { M }_{ Y }\left( t \right) \end{align*}

Example

Given the following moment generating functions of independent random variables, $$X$$ and $$Y$$, find the moment generating function of $$X+Y$$.

$${ M }_{ X }\left( t \right) =exp\left\{ .2\left( { e }^{ t }-1 \right) \right\} \quad \quad \quad { M }_{ Y }\left( t \right) =exp\left\{ .3\left( { e }^{ t }-1 \right) \right\}$$

$${ M }_{ X+Y }\left( t \right) = exp\left\{ .2\left( { e }^{ t }-1 \right) \right\} \ast exp\left\{ .3\left( { e }^{ t }-1 \right) \right\}$$

$${ M }_{ X+Y }\left( t \right) =exp\left\{ .2{ e }^{ t }-.2+.3{ e }^{ t }-.3 \right\} =exp\left\{ .5{ e }^{ t }-.5 \right\} =exp\left\{ .5\left( { e }^{ t }-1 \right) \right\}$$
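Since $$exp\left\{ .5\left( { e }^{ t }-1 \right) \right\}$$ has the form of a Poisson moment generating function with $$\lambda=.5$$, the identity can be spot-checked numerically at a few values of $$t$$ (an illustrative sketch):

```python
import math

def mgf_poisson(lam, t):
    """MGF of a Poisson(lambda) random variable: exp(lambda * (e^t - 1))."""
    return math.exp(lam * (math.exp(t) - 1))

# The product of the individual MGFs should equal the Poisson(.5) MGF at every t.
checks = [(mgf_poisson(0.2, t) * mgf_poisson(0.3, t), mgf_poisson(0.5, t))
          for t in (-1.0, 0.0, 0.5, 1.0)]
```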

Sum of Normal Random Variables

If $$X$$ and $$Y$$ are independent normally distributed random variables with means $$\mu_x$$, $$\mu_y$$ and standard deviations $$\sigma_x$$, $$\sigma_y$$ respectively, then $$X+Y$$ is normally distributed with parameters $$\mu= \mu_x+\mu_y$$ and $${ \sigma }^{ 2 }={ \sigma }_{ x }^{ 2 }+{ \sigma }_{ y }^{ 2 }$$.
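A simulation sketch illustrates this (the parameter values below are illustrative assumptions, not from the text, and the seed is fixed for reproducibility):

```python
import random

random.seed(1)  # reproducible
mu_x, sigma_x = 1.0, 0.5   # illustrative parameters for X
mu_y, sigma_y = 2.0, 1.2   # illustrative parameters for Y

N = 200_000
sums = [random.gauss(mu_x, sigma_x) + random.gauss(mu_y, sigma_y)
        for _ in range(N)]

mean = sum(sums) / N                          # near mu_x + mu_y = 3.0
var = sum((s - mean) ** 2 for s in sums) / N  # near 0.25 + 1.44 = 1.69
```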

Sum of Poisson Random Variables

If $$X$$ and $$Y$$ are independent Poisson random variables with parameters $$\lambda_x$$ and $$\lambda_y$$ respectively, then $$X+Y$$ is a Poisson random variable with parameter $$\lambda = \lambda_x + \lambda_y$$.
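Python's standard library has no Poisson sampler, so this sketch uses Knuth's multiplication method, with the parameters $$\lambda_x=.2$$ and $$\lambda_y=.3$$ borrowed from the MGF example above, to check that the sum behaves like a Poisson variable with $$\lambda=.5$$ (mean and variance both near .5):

```python
import math
import random

random.seed(2)  # reproducible

def poisson_sample(lam):
    """Knuth's method: count uniform draws until their product
    falls below e^{-lam}."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

N = 100_000
sums = [poisson_sample(0.2) + poisson_sample(0.3) for _ in range(N)]

mean = sum(sums) / N                           # near 0.2 + 0.3 = 0.5
var = sum((s - mean) ** 2 for s in sums) / N   # Poisson: also near 0.5
```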

Learning Outcome

Topic 2.f: Univariate Random Variables – Determine the sum of independent random variables (Poisson and normal).