2019 Syllabus – Learning Outcomes

General Probability
1.a – Define set functions, Venn diagrams, sample space, and events. Define probability as a set function on a collection of events and state the basic axioms of probability.
1.b – Calculate probabilities using addition and multiplication rules.
1.c – Define…

More Details
State and apply the Central Limit Theorem

For this chapter, a working knowledge of the normal distribution and of how to use a standard normal table is assumed. The central limit theorem is one of the most important results in probability theory. It states that the…
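A minimal sketch of the theorem in action, using an illustrative example (i.i.d. exponential variables with mean 1, and the threshold 55, are choices made here, not taken from the text): the sum of many i.i.d. variables is approximately normal, so a probability about the sum can be read off the standard normal CDF and checked by simulation.

```python
import math
import random

# Illustrative setup: X_1..X_n i.i.d. Exponential(1), so mu = 1, sigma^2 = 1.
n = 50
mu, sigma2 = 1.0, 1.0

def std_normal_cdf(z):
    # Phi(z) computed via the error function (stands in for a normal table)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# CLT: S_n = X_1 + ... + X_n is approximately N(n*mu, n*sigma^2),
# so P(S_n <= 55) ~= Phi((55 - n*mu) / sqrt(n*sigma^2))
clt_prob = std_normal_cdf((55 - n * mu) / math.sqrt(n * sigma2))

# Monte Carlo estimate of the same probability, for comparison
random.seed(0)
trials = 20_000
hits = sum(
    sum(random.expovariate(1.0) for _ in range(n)) <= 55
    for _ in range(trials)
)
mc_prob = hits / trials
print(round(clt_prob, 3), round(mc_prob, 3))
```

The two numbers agree to within a couple of percentage points; the small gap reflects the skewness of the exponential distribution, which the normal approximation ignores.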

More Details
Calculate probabilities and moments for linear combinations of independent random variables

Probabilities and moments such as the mean and variance of joint random variables should already be familiar to the reader. Here we calculate the expected value of a linear combination of two independent random variables. For instance, let \(u(X,Y) =…
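As a hedged sketch of the rules this outcome covers (the coefficients and moments below are illustrative choices, not from the text): for independent \(X\) and \(Y\), the mean of \(aX + bY\) is linear in the individual means, and the variance picks up squared coefficients.

```python
# Illustrative moments for independent X and Y (assumed values):
a, b = 2.0, -3.0
mu_x, var_x = 1.0, 4.0
mu_y, var_y = 5.0, 9.0

# For independent X, Y:
#   E[aX + bY]   = a*E[X] + b*E[Y]
#   Var(aX + bY) = a^2 * Var(X) + b^2 * Var(Y)
mean_u = a * mu_x + b * mu_y
var_u = a * a * var_x + b * b * var_y
print(mean_u, var_u)  # -13.0 97.0
```

Note that the variance formula has no cross term precisely because independence makes the covariance zero.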

More Details
Determine the distribution of a transformation of jointly distributed random variables

Consider a transformation of a single random variable \(X\) with pdf \(f(x)\). In the continuous case, let \(Y = u(X)\) be an increasing or decreasing function of \(X\) with inverse \(X = v(Y)\); then the pdf of \(Y\) is…
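A sketch of the change-of-variable formula under illustrative assumptions (the choice \(X \sim \text{Exponential}(1)\) and \(Y = X^2\) is mine, not from the text): with inverse \(x = v(y) = \sqrt{y}\), the formula \(f_Y(y) = f_X(v(y))\,|v'(y)|\) gives \(f_Y(y) = e^{-\sqrt{y}}/(2\sqrt{y})\), which we can check numerically against simulation.

```python
import math
import random

# Transformed pdf from the change-of-variable formula:
#   f_Y(y) = f_X(v(y)) * |v'(y)| = exp(-sqrt(y)) / (2 * sqrt(y))
def f_y(y):
    return math.exp(-math.sqrt(y)) / (2.0 * math.sqrt(y))

# P(Y <= 1) via a midpoint-rule integral of f_Y over (0, 1)
steps = 10_000
integral = sum(f_y((k + 0.5) / steps) / steps for k in range(steps))

# Same probability estimated by simulating X and squaring it
random.seed(1)
trials = 100_000
empirical = sum(
    random.expovariate(1.0) ** 2 <= 1.0 for _ in range(trials)
) / trials
print(round(integral, 3), round(empirical, 3))
```

Both estimates land near \(1 - e^{-1} \approx 0.632\), which is reassuring since \(P(Y \le 1) = P(X \le 1)\) for this increasing transformation.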

More Details
Calculate joint moments, such as the covariance and the correlation coefficient

Covariance and Correlation Coefficient for Joint Random Variables In learning outcomes covered previously, we looked at the joint p.m.f. of two discrete or continuous random variables \(X\) and \(Y\), and we also established the condition required for \(X\) and \(Y\) to…
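A minimal worked example of the two joint moments named in this outcome, using an illustrative joint p.m.f. (the table below is an assumption for demonstration, not from the text): covariance is \(E[XY] - E[X]E[Y]\), and the correlation coefficient rescales it by the marginal standard deviations.

```python
import math

# Illustrative joint p.m.f.: joint[(x, y)] = P(X = x, Y = y)
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Expectation of any function g(X, Y) under the joint p.m.f.
def E(g):
    return sum(p * g(x, y) for (x, y), p in joint.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)

# Cov(X, Y) = E[XY] - E[X]E[Y]
cov = E(lambda x, y: x * y) - ex * ey

# rho = Cov(X, Y) / (sd(X) * sd(Y))
rho = cov / math.sqrt(
    (E(lambda x, y: x * x) - ex**2) * (E(lambda x, y: y * y) - ey**2)
)
print(round(cov, 4), round(rho, 4))  # -0.02 -0.0891
```

Here the covariance is slightly negative, and the correlation coefficient stays in \([-1, 1]\) as it must.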

More Details
Calculate variance, standard deviation for conditional and marginal probability distributions

Variance and Standard Deviation for Joint Random Variables The variance of a function \(g(X,Y)\) of jointly distributed random variables is expressed in terms of its first and second moments: $$Var(g(X,Y))= E\left[g(X,Y)^{2}\right] - \left(E\left[g(X,Y)\right]\right)^{2}$$ The standard deviation of joint random variables is no more than…
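A short sketch of the marginal and conditional cases this outcome asks for, on an illustrative joint p.m.f. (the table is an assumption for demonstration, not from the text): marginalize first, then apply the usual variance formula; for a conditional variance, renormalize the relevant row first.

```python
import math

# Illustrative joint p.m.f.: joint[(x, y)] = P(X = x, Y = y)
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginal distribution of X, then its variance and standard deviation
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
mean_x = sum(x * p for x, p in px.items())
var_x = sum(x * x * p for x, p in px.items()) - mean_x**2
sd_x = math.sqrt(var_x)

# Conditional distribution of Y given X = 1: renormalize by P(X = 1)
p_x1 = px[1]
py_given = {y: p / p_x1 for (x, y), p in joint.items() if x == 1}
mean_y1 = sum(y * p for y, p in py_given.items())
var_y1 = sum(y * y * p for y, p in py_given.items()) - mean_y1**2
print(round(var_x, 4), round(sd_x, 4), round(var_y1, 4))
```

The same two-step pattern (reduce to a one-dimensional distribution, then take \(E[\cdot^2] - E[\cdot]^2\)) covers both the marginal and the conditional case.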

More Details
Explain and apply joint moment generating functions

The Moment-Generating Function Introduction We can derive the moments of most distributions by integrating or summing over their probability functions as necessary. However, moment generating functions offer a relatively simpler approach to obtaining moments. In the univariate case, the moment…
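A hedged numerical sketch of the key property (the binomial example is an illustrative assumption, not from the text): derivatives of the MGF at \(t = 0\) recover the raw moments, so \(M'(0) = E[X]\) and \(M''(0) = E[X^2]\). Here the derivatives are approximated by finite differences rather than computed symbolically.

```python
import math

# Illustrative choice: X ~ Binomial(3, 0.5), whose MGF is
#   M(t) = E[e^{tX}] = (0.5 + 0.5 * e^t)^3
def mgf(t):
    return (0.5 + 0.5 * math.exp(t)) ** 3

# Central finite differences at t = 0:
#   M'(0)  = E[X],   M''(0) = E[X^2]
h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2
var = m2 - m1**2
print(round(m1, 4), round(var, 4))  # 1.5 0.75
```

These match the known binomial moments \(E[X] = np = 1.5\) and \(Var(X) = np(1-p) = 0.75\); the joint (bivariate) MGF works the same way with partial derivatives in each argument.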

More Details
Calculate moments for joint, conditional, and marginal random variables

Moments of a Probability Mass Function The n-th moment about the origin of a random variable is the expected value of its n-th power. The moments about the origin are \(E(X), E(X^{2}), E(X^{3}), E(X^{4}), \ldots\) For the…
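The definition above translates directly into a sum over the support; a minimal sketch, using a fair six-sided die as an illustrative p.m.f. (my choice, not from the text):

```python
# Illustrative p.m.f.: a fair six-sided die
pmf = {x: 1 / 6 for x in range(1, 7)}

def moment(n):
    # n-th moment about the origin: E[X^n] = sum of x^n * P(X = x)
    return sum(x**n * p for x, p in pmf.items())

print(moment(1), round(moment(2), 4))  # 3.5 and 91/6 ~= 15.1667
```

The first moment is the familiar mean of 3.5, and \(E[X^2] = 91/6\) is the ingredient needed for the variance \(E[X^2] - E[X]^2\).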

More Details
Determine conditional and marginal probability functions

Conditional Distributions Conditional probability is a key part of Bayes' theorem. In plain language, it is the probability of one thing being true given that another thing is true. It differs from joint probability, which is the probability that both things…
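A minimal numerical sketch of the distinction drawn above, with illustrative probabilities (the specific numbers are assumptions, not from the text): the joint probability multiplies the conditional by the marginal, and Bayes' theorem inverts the conditioning.

```python
# Illustrative inputs (assumed values)
p_b = 0.3              # P(B)
p_a_given_b = 0.5      # P(A | B)
p_a_given_not_b = 0.2  # P(A | B')

# Joint probability: P(A and B) = P(A | B) * P(B)
p_joint = p_a_given_b * p_b

# Total probability: P(A) = P(A|B)P(B) + P(A|B')P(B')
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

# Bayes' theorem: P(B | A) = P(A | B) * P(B) / P(A)
p_b_given_a = p_a_given_b * p_b / p_a
print(round(p_joint, 3), round(p_a, 3), round(p_b_given_a, 4))
```

Note how the conditional \(P(B \mid A) \approx 0.517\) differs from both the joint \(P(A \cap B) = 0.15\) and the reverse conditional \(P(A \mid B) = 0.5\).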

More Details
Explain and perform calculations concerning joint probability functions

Bivariate Distributions of the Discrete Type (Joint Probability) Sometimes events are defined by the interaction of two measurements; events explained by the interaction of two variables constitute what we call bivariate distributions….
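A short sketch of a discrete bivariate distribution, using an illustrative joint p.m.f. (the formula \(p(x,y) = (x+y)/21\) is a common textbook-style example assumed here, not taken from the text): a valid joint p.m.f. sums to 1 over its support, and event probabilities are sums of the relevant cells.

```python
# Illustrative joint p.m.f. on x in {1,2,3}, y in {1,2}:
#   p(x, y) = (x + y) / 21
joint = {(x, y): (x + y) / 21 for x in (1, 2, 3) for y in (1, 2)}

# A valid joint p.m.f. must sum to 1 over its support
total = sum(joint.values())

# Event probabilities sum the relevant cells, e.g. P(X + Y <= 3)
p_event = sum(p for (x, y), p in joint.items() if x + y <= 3)
print(round(total, 6), round(p_event, 4))  # 1.0 and 8/21 ~= 0.381
```

The same cell-summing pattern yields marginal distributions (sum over one coordinate) and conditional ones (renormalize a row or column).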

More Details