##### State and apply the Central Limit Theorem

For this learning objective, some knowledge of the normal distribution and of how to use the Z-table is assumed. The central limit theorem is one of the most important results in probability theory. It states that the sum of…
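As a quick numerical sketch of the theorem (using only Python's standard library; the sample sizes and seed below are hypothetical choices), the standardized sum of i.i.d. Uniform(0, 1) variables behaves approximately like a standard normal:

```python
import random
import statistics

random.seed(42)

# Sum n i.i.d. Uniform(0, 1) variables; by the CLT the standardized sum
# is approximately standard normal for large n.
n, trials = 30, 20_000
mu, var = n * 0.5, n * (1 / 12)          # mean and variance of the sum
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

# Standardize and compare to the normal benchmark P(|Z| <= 1) ~ 0.6827.
z = [(s - mu) / var ** 0.5 for s in sums]
within_one_sd = sum(abs(v) <= 1 for v in z) / trials
print(round(within_one_sd, 2))  # close to 0.68
```

The empirical fraction inside one standard deviation lands near the normal value 0.6827 even though each summand is uniform, not normal.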

##### Calculate probabilities and moments for linear combinations of independent random variables

Given random variables \(X_1,\ X_2,\ldots,X_p\) and constants \(c_1, c_2,\ldots,\ c_p\), then: $$Y=c_1X_1+c_2X_2+\ldots+c_pX_p$$ is a linear combination of \(X_1,\ X_2,\ldots,\ X_p\).

###### Mean of a Linear Combination

If \(Y=c_1X_1+c_2X_2+\ldots+c_pX_p\), then: $$E\left(Y\right)=c_1E\left(X_1\right)+c_2E\left(X_2\right)+\ldots+c_pE(X_p)$$ This is true because, recall, if we have \(u\left(X,Y\right)=X+Y\), and let’s say…
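A minimal sketch of the mean formula above, together with the companion variance rule \(Var(Y)=c_1^2Var(X_1)+\ldots+c_p^2Var(X_p)\) that holds when the \(X_i\) are independent. The constants and moments below are hypothetical:

```python
# Moments of Y = c1*X1 + ... + cp*Xp for independent X_i, given each
# E(X_i) and Var(X_i). All numeric values here are hypothetical.
c = [2.0, -1.0, 3.0]          # constants c_i
means = [1.0, 4.0, 0.5]       # E(X_i)
variances = [0.25, 1.0, 2.0]  # Var(X_i)

# E(Y) = sum c_i * E(X_i); with independence, Var(Y) = sum c_i^2 * Var(X_i).
e_y = sum(ci * mi for ci, mi in zip(c, means))
var_y = sum(ci ** 2 * vi for ci, vi in zip(c, variances))
print(e_y, var_y)  # -0.5 20.0
```

Note that the signs of the constants affect the mean but not the variance, since each \(c_i\) enters the variance squared.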

##### Determine the distribution of a transformation of jointly distributed random variables

###### Transformation for Bivariate Discrete Random Variables

Let \(X_1\) and \(X_2\) be discrete random variables with joint probability mass function \(f_{X_1,X_2}(x_1,x_2)\) defined on a two-dimensional set \(A\). Define the following functions: $$ y_1 =g_1 (x_1, x_2)$$ and $$y_2 =g_2(x_1,x_2)$$…
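A small sketch of the idea, assuming a hypothetical joint pmf and the illustrative transformation \(y_1=x_1+x_2\), \(y_2=x_1-x_2\): each point in the image receives the probability of its preimage.

```python
from collections import defaultdict

# Joint pmf of (X1, X2) on a small set A; these values are hypothetical.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Transformation y1 = x1 + x2, y2 = x1 - x2: the pmf of (Y1, Y2)
# assigns each (y1, y2) the total probability of its preimage.
pmf_y = defaultdict(float)
for (x1, x2), p in joint.items():
    pmf_y[(x1 + x2, x1 - x2)] += p

print(dict(pmf_y))
```

Because this particular map is one-to-one on the support, each output pair collects exactly one input probability; a many-to-one map would accumulate several.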

##### Calculate joint moments, such as the covariance and the correlation coefficient

Recall that we have looked at the joint probability function of two random variables \(X\) and \(Y\), whether discrete or continuous. The variables are considered independent if: $$ P\left(X=x,\ Y=y\right)=P\left(X=x\right)P\left(Y=y\right),\ \ \text{for all } x, y \text{ (discrete case)} $$ and $$ f_{XY}\left(x,\ y\right)=f_X\left(x\right)f_Y\left(y\right),\ \ \text{for…
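Given a joint pmf, the covariance \(Cov(X,Y)=E(XY)-E(X)E(Y)\) and the correlation coefficient \(\rho = Cov(X,Y)/(\sigma_X \sigma_Y)\) can be computed directly; the pmf values below are hypothetical:

```python
# Covariance and correlation coefficient from a small, hypothetical
# joint pmf over (x, y) pairs.
joint = {(0, 0): 0.2, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.5}

ex  = sum(x * p for (x, y), p in joint.items())      # E(X)
ey  = sum(y * p for (x, y), p in joint.items())      # E(Y)
exy = sum(x * y * p for (x, y), p in joint.items())  # E(XY)
ex2 = sum(x * x * p for (x, y), p in joint.items())  # E(X^2)
ey2 = sum(y * y * p for (x, y), p in joint.items())  # E(Y^2)

cov = exy - ex * ey                                  # Cov(X, Y)
rho = cov / ((ex2 - ex ** 2) ** 0.5 * (ey2 - ey ** 2) ** 0.5)
print(round(cov, 3), round(rho, 3))
```

A positive covariance here reflects that the heaviest probability sits on \((1,1)\), where \(X\) and \(Y\) move together.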

##### Calculate variance, standard deviation for conditional and marginal probability distributions

###### Variance and Standard Deviation for Marginal Probability Distributions

Generally, the variance of a function \(g(X,Y)\) of jointly distributed random variables \(X\) and \(Y\) is given by: $$ \operatorname{Var}\left(g\left(X,Y\right)\right)=E\left[g\left(X,Y\right)^2\right]-\left(E\left[g\left(X,Y\right)\right]\right)^2 $$ The standard deviation of joint random variables is the square root…
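A sketch of the marginal calculation, assuming a hypothetical joint pmf: collapse over \(y\) to get the marginal pmf of \(X\), then apply \(Var(X)=E(X^2)-[E(X)]^2\).

```python
from collections import defaultdict

# Marginal variance and standard deviation of X from a hypothetical
# joint pmf: sum out y, then use Var(X) = E(X^2) - (E(X))^2.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

marginal_x = defaultdict(float)
for (x, y), p in joint.items():
    marginal_x[x] += p          # f_X(x) = sum over y of f(x, y)

ex  = sum(x * p for x, p in marginal_x.items())
ex2 = sum(x * x * p for x, p in marginal_x.items())
var_x = ex2 - ex ** 2
std_x = var_x ** 0.5
print(var_x, round(std_x, 4))
```

The same two-step pattern (marginalize, then take moments) works for \(Y\) by summing over \(x\) instead.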

##### Explain and apply joint moment generating functions

We can derive the moments of most distributions by evaluating their probability functions, integrating or summing as necessary. However, moment generating functions offer a relatively simpler approach to obtaining moments.

###### Univariate Random Variables

In the univariate case, the moment generating…
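A numerical sketch of the univariate idea, assuming a hypothetical discrete pmf: the derivative of \(M(t)=E\left(e^{tX}\right)\) at \(t=0\) recovers \(E(X)\). Here the derivative is approximated with a central difference rather than computed symbolically.

```python
import math

# Moment generating function of a discrete X (hypothetical pmf):
# M(t) = E(e^{tX}); its derivatives at t = 0 give the raw moments.
pmf = {0: 0.3, 1: 0.5, 2: 0.2}

def mgf(t):
    return sum(p * math.exp(t * x) for x, p in pmf.items())

# Approximate M'(0) by a central difference; it should match E(X).
h = 1e-5
m_prime_0 = (mgf(h) - mgf(-h)) / (2 * h)
e_x = sum(x * p for x, p in pmf.items())
print(round(m_prime_0, 6), e_x)
```

Higher derivatives at \(t=0\) would give \(E(X^2)\), \(E(X^3)\), and so on, which is what makes the MGF a compact summary of all the moments.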

##### Calculate moments for joint, conditional, and marginal random variables

###### Moments of a Probability Mass Function

The \(n\)-th moment about the origin of a random variable is the expected value of its \(n\)-th power. Moments about the origin are \(E(X), E(X^2), E(X^3), E(X^4),\ldots\) For…
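A minimal sketch with a hypothetical pmf, computing raw moments \(E(X^n)\) directly and using the first two to recover the variance:

```python
# n-th raw moment E(X^n) of a discrete X (hypothetical pmf).
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

def raw_moment(n):
    return sum((x ** n) * p for x, p in pmf.items())

first, second = raw_moment(1), raw_moment(2)
variance = second - first ** 2   # Var(X) = E(X^2) - (E(X))^2
print(first, second, round(variance, 4))
```

The same `raw_moment` helper extends to any order; only the first few are typically needed for mean, variance, skewness, and kurtosis.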

##### Determine conditional and marginal probability functions

###### Conditional Distributions

Conditional probability is a key part of Bayes' theorem, which describes the probability of an event based on prior knowledge of conditions that might be related to the event. It differs from joint probability, which does not rely…
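A sketch of the conditional construction, assuming a hypothetical joint pmf: the conditional pmf is the joint pmf divided by the marginal, \(f(y\mid x)=f(x,y)/f_X(x)\).

```python
from collections import defaultdict

# Conditional pmf f(y | x) = f(x, y) / f_X(x), built from a
# hypothetical joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

marginal_x = defaultdict(float)
for (x, y), p in joint.items():
    marginal_x[x] += p          # f_X(x) = sum over y of f(x, y)

conditional = {(x, y): p / marginal_x[x] for (x, y), p in joint.items()}
print(conditional[(0, 1)])      # f(y=1 | x=0) = 0.3 / 0.4
```

For each fixed \(x\), the conditional probabilities sum to 1, which is the sanity check that distinguishes a conditional pmf from the joint pmf it came from.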

##### Explain and perform calculations concerning joint probability functions, probability density functions, and cumulative distribution functions.

###### Bivariate Distributions (Joint Probability Distributions)

Sometimes certain events are defined by the interaction of two measurements. Events explained by the interaction of two variables constitute what we call bivariate distributions. Put simply, a bivariate distribution…
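A brief sketch with a hypothetical joint pmf, evaluating the joint cumulative distribution function \(F(a,b)=P(X\le a,\ Y\le b)\) by summing the pmf over the qualifying pairs:

```python
# Joint cdf F(a, b) = P(X <= a, Y <= b) from a hypothetical joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def joint_cdf(a, b):
    return sum(p for (x, y), p in joint.items() if x <= a and y <= b)

print(joint_cdf(0, 1), joint_cdf(1, 1))
```

Evaluated at the largest support point, the cdf returns 1, since every pair then satisfies both inequalities.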