Apply the concepts of deductibles, coinsurance, benefit limits, and inflation to convert a given loss amount from a policyholder into the corresponding payment amount for an insurance company

Policy modifications refer to changes made to the loss random variable for an insurance product. In this chapter, we will explore several policy modifications, each serving a specific purpose in enhancing insurance coverage. These modifications include: deductibles, benefit/policy limits, coinsurance…
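As a minimal sketch of how these pieces combine, the function below converts a ground-up loss into the insurer's payment. The ordering (deductible first, then coinsurance, then the benefit limit on the insurer's payment) and all figures are illustrative assumptions, not the only convention used in practice.

```python
def insurer_payment(loss, deductible=0.0, limit=float("inf"), coinsurance=1.0):
    """Convert a policyholder's loss into the insurer's payment.

    Assumed order of operations: subtract the deductible, apply the
    coinsurance factor, then cap the result at the policy limit.
    """
    after_deductible = max(loss - deductible, 0.0)
    payment = coinsurance * after_deductible
    return min(payment, limit)

# A loss of 2,500 with a 500 deductible, 80% coinsurance, and a 1,500 limit:
# (2500 - 500) * 0.8 = 1600, capped at the 1,500 limit.
print(insurer_payment(2500, deductible=500, coinsurance=0.8, limit=1500))  # 1500.0
```

A loss below the deductible produces no payment at all, which is why the `max(..., 0.0)` guard is needed.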

Exam P Syllabus – Learning Outcomes

General Probability
1.a – Define set functions, Venn diagrams, sample space, and events. Define probability as a set function on a collection of events and state the basic axioms of probability.
1.b – Calculate probabilities using addition and multiplication rules.
1.c – Define…

State and apply the Central Limit Theorem

For this learning objective, some knowledge of the normal distribution and of how to use the Z-table is assumed. The central limit theorem is one of the most important results in probability theory. It states that the sum of…
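The Z-table lookup can be reproduced with the standard normal CDF. The sketch below (the distribution parameters are an assumed example) approximates a probability for a sum of i.i.d. random variables via the CLT, using only the standard library:

```python
import math

# Standard normal CDF via the error function (no Z-table needed).
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Assumed example: X_1, ..., X_100 i.i.d. with mean 2 and variance 9.
# By the CLT, S = X_1 + ... + X_n is approximately Normal(n*mu, n*sigma^2).
n, mu, sigma2 = 100, 2.0, 9.0
s_mean, s_sd = n * mu, math.sqrt(n * sigma2)

# Approximate P(S <= 230): standardize and look up the normal CDF.
z = (230 - s_mean) / s_sd   # (230 - 200) / 30 = 1.0
print(round(phi(z), 4))     # 0.8413
```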

Calculate probabilities for linear combinations of independent normal random variables

Definition: Let \(X_1, X_2,\ldots,X_n\) be random variables and let \(c_1, c_2,\ldots, c_n\) be constants. Then, $$ Y=c_1X_1+c_2X_2+\ldots+c_nX_n $$ is a linear combination of \(X_1, X_2,\ldots, X_n\). In this reading, however, we will base our discussion only on the linear combinations…
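For independent normal \(X_i\), the linear combination \(Y\) is again normal with \(E[Y]=\sum c_i\mu_i\) and \(\mathrm{Var}(Y)=\sum c_i^2\sigma_i^2\). A small sketch of this fact (the function name and the numbers in the example are assumptions for illustration):

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def linear_combo_prob(y, coeffs, means, variances):
    """P(Y <= y) for Y = sum c_i * X_i with independent normal X_i."""
    mean_y = sum(c * m for c, m in zip(coeffs, means))
    var_y = sum(c * c * v for c, v in zip(coeffs, variances))
    return phi((y - mean_y) / math.sqrt(var_y))

# Assumed example: Y = 2*X1 - X2 with X1 ~ N(1, 4), X2 ~ N(0, 1) independent.
# Then Y ~ N(2, 17), so P(Y <= 2) is exactly 0.5.
print(linear_combo_prob(2, [2, -1], [1, 0], [4, 1]))  # 0.5
```

Note that the coefficients are squared in the variance, so a negative coefficient still increases the spread of \(Y\).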

Determine the distribution of a transformation of jointly distributed random variables

Transformation for Bivariate Discrete Random Variables Let \(X_1\) and \(X_2\) be discrete random variables with joint probability mass function \(f_{X_1,X_2}(x_1,x_2)\) defined on a two-dimensional set \(A\). Define the following functions: $$y_1 = g_1(x_1, x_2)$$ and $$y_2 = g_2(x_1, x_2)$$…
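In the discrete case, the pmf of the transformed pair is found by summing the probabilities of all \((x_1, x_2)\) points that map to the same \((y_1, y_2)\). A minimal sketch, with an assumed uniform joint pmf on \(\{0,1\}^2\):

```python
from collections import defaultdict

def transform_pmf(joint_pmf, g1, g2):
    """Pmf of (Y1, Y2) = (g1(X1,X2), g2(X1,X2)) from a joint pmf dict.

    Probabilities of all (x1, x2) mapping to the same (y1, y2) are summed.
    """
    out = defaultdict(float)
    for (x1, x2), p in joint_pmf.items():
        out[(g1(x1, x2), g2(x1, x2))] += p
    return dict(out)

# Assumed example: (X1, X2) uniform on {0,1}^2, Y1 = X1 + X2, Y2 = X1 - X2.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
new_pmf = transform_pmf(pmf, lambda a, b: a + b, lambda a, b: a - b)
print(new_pmf[(1, 1)])  # 0.25  (only (x1, x2) = (1, 0) maps to (1, 1))
```

When the transformation is one-to-one, each output point inherits exactly one probability; otherwise several input points pool into one, which the summation handles automatically.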

Calculate joint moments, such as the covariance and the correlation coefficient

Let \(X\) and \(Y\) be two discrete random variables with joint probability mass function \(f\left(x, y\right)\). Then, the random variables \(X\) and \(Y\) are said to be independent if and only if $$ f\left(x, y\right)=f\left(x\right)\times f\left(y\right),\ …
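The joint moments named in this objective follow directly from the joint pmf: \(\mathrm{Cov}(X,Y)=E[XY]-E[X]E[Y]\) and \(\rho=\mathrm{Cov}(X,Y)/(\sigma_X\sigma_Y)\). A minimal sketch (the perfectly dependent two-point pmf in the example is an assumption for illustration):

```python
import math

def cov_and_corr(joint_pmf):
    """Covariance and correlation coefficient from a joint pmf dict {(x, y): p}."""
    ex  = sum(x * p for (x, _), p in joint_pmf.items())
    ey  = sum(y * p for (_, y), p in joint_pmf.items())
    exy = sum(x * y * p for (x, y), p in joint_pmf.items())
    ex2 = sum(x * x * p for (x, _), p in joint_pmf.items())
    ey2 = sum(y * y * p for (_, y), p in joint_pmf.items())
    cov = exy - ex * ey
    corr = cov / math.sqrt((ex2 - ex ** 2) * (ey2 - ey ** 2))
    return cov, corr

# Assumed example: X = Y on {0, 1}, each with probability 0.5.
pmf = {(0, 0): 0.5, (1, 1): 0.5}
cov, corr = cov_and_corr(pmf)
print(cov, corr)  # 0.25 1.0  (perfect positive dependence)
```

For independent variables the factorization \(f(x,y)=f(x)f(y)\) forces \(E[XY]=E[X]E[Y]\), so the covariance (and correlation) is zero.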

Calculate variance and standard deviation for conditional and marginal probability distributions

Variance and Standard Deviation for Conditional Discrete Distributions In the previous readings, we introduced the concept of conditional distribution functions for a random variable \(X\) given \(Y=y\), and the conditional distribution of \(Y\) given \(X=x\). We defined the conditional distribution function…
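The conditional variance follows the same recipe as an ordinary variance, but uses the conditional pmf \(f(x\mid y)=f(x,y)/f_Y(y)\). A minimal sketch (the joint pmf in the example is an assumed illustration):

```python
def conditional_variance(joint_pmf, y):
    """Var(X | Y = y) from a joint pmf dict {(x, y): p}."""
    # Marginal probability f_Y(y), then the conditional pmf f(x | y).
    p_y = sum(p for (_, yy), p in joint_pmf.items() if yy == y)
    cond = {x: p / p_y for (x, yy), p in joint_pmf.items() if yy == y}
    mean = sum(x * p for x, p in cond.items())
    return sum((x - mean) ** 2 * p for x, p in cond.items())

# Assumed example: given Y = 0, X is 0 or 1 with equal conditional probability,
# so Var(X | Y = 0) = 0.5 * 0.5 = 0.25.
pmf = {(0, 0): 0.2, (1, 0): 0.2, (2, 1): 0.6}
print(conditional_variance(pmf, 0))  # 0.25
```

The conditional standard deviation is just the square root of this result.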

Explain and apply joint moment generating functions

We can derive the moments of most distributions directly from their probability functions, by integrating or summing as necessary. However, moment generating functions offer a relatively simpler approach to obtaining moments. Univariate Random Variables In the univariate case, the moment generating…
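For a discrete pair, the joint MGF \(M(t_1,t_2)=E\!\left[e^{t_1X+t_2Y}\right]\) can be evaluated directly from the joint pmf, and the mixed partial \(\partial^2 M/\partial t_1\,\partial t_2\) at \((0,0)\) gives \(E[XY]\). The sketch below checks this numerically with a central finite difference (an approximation for illustration, not a symbolic derivation); the example pmf is assumed:

```python
import math

def joint_mgf(joint_pmf, t1, t2):
    """Joint MGF of a discrete pair: M(t1, t2) = E[exp(t1*X + t2*Y)]."""
    return sum(p * math.exp(t1 * x + t2 * y) for (x, y), p in joint_pmf.items())

def exy_from_mgf(joint_pmf, h=1e-4):
    """Approximate E[XY] as the mixed partial of M at (0, 0)."""
    m = lambda a, b: joint_mgf(joint_pmf, a, b)
    return (m(h, h) - m(h, -h) - m(-h, h) + m(-h, -h)) / (4 * h * h)

# Assumed example: X = Y on {0, 1} with equal probability, so E[XY] = 0.5.
pmf = {(0, 0): 0.5, (1, 1): 0.5}
print(round(exy_from_mgf(pmf), 4))  # 0.5
```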

Calculate moments for joint, conditional, and marginal random variables

Moments of a Probability Mass Function The n-th moment about the origin of a random variable is the expected value of its n-th power. Moments about the origin are \(E(X), E(X^2), E(X^3), E(X^4), \ldots\) For…
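For a discrete random variable the n-th moment about the origin is the weighted sum \(E(X^n)=\sum_x x^n\, f(x)\). A minimal sketch (the fair-coin pmf is an assumed example):

```python
def raw_moment(pmf, n):
    """n-th moment about the origin, E[X^n], from a pmf dict {x: p}."""
    return sum((x ** n) * p for x, p in pmf.items())

# Assumed example: X is the number of heads in one fair coin toss, X in {0, 1}.
# Since 0^n = 0 and 1^n = 1, every raw moment equals P(X = 1) = 0.5.
pmf = {0: 0.5, 1: 0.5}
print(raw_moment(pmf, 1), raw_moment(pmf, 2))  # 0.5 0.5
```

Central moments such as the variance follow from the raw ones, e.g. \(\mathrm{Var}(X)=E(X^2)-[E(X)]^2\).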
