Big Data

Big data is a term that describes large, complex datasets. These datasets are analyzed with computers to uncover patterns and trends, particularly those related to human behavior. Big data includes traditional sources, such as company reports and government data, as well as non-traditional…

More Details
Introduction to Big Data Techniques

Fintech refers to technological innovation in designing and delivering financial services and products. At its core, fintech has helped companies, business owners, and investment managers better manage their operations through specialized software and algorithms. Note that the term fintech is…

More Details
Parametric and Non-Parametric Tests

Parametric versus Non-parametric Tests of Independence: A parametric test is a hypothesis test concerning a population parameter that relies on specific assumptions about the distribution of the data. If these assumptions are not met, non-parametric tests are used instead. In summary, researchers use non-parametric…
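
As a rough sketch of the distinction (not the article’s own example), the snippet below runs a parametric test of independence, the t-test on the Pearson correlation, alongside its non-parametric counterpart, the Spearman rank correlation; the simulated data are illustrative assumptions.

```python
# Sketch: parametric vs. non-parametric test of independence between two series.
# The simulated data are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 0.4 * x + rng.normal(scale=0.9, size=50)  # weakly related to x

# Parametric: significance test on the Pearson correlation (assumes bivariate normality).
pearson_r, pearson_p = stats.pearsonr(x, y)

# Non-parametric: Spearman rank correlation makes no distributional assumption.
spearman_rho, spearman_p = stats.spearmanr(x, y)

print(f"Pearson  r   = {pearson_r:.3f}, p-value = {pearson_p:.4f}")
print(f"Spearman rho = {spearman_rho:.3f}, p-value = {spearman_p:.4f}")
```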

More Details
Hypothesis Tests of Risk and Return

Hypothesis Test Concerning a Single Mean: The z-test is the appropriate hypothesis test when the sampling distribution of the sample mean is normally distributed and the population standard deviation is known. The z-statistic is the test statistic used in such a test. Testing \(\bf{H_0: \mu…
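
As a minimal sketch with made-up figures (sample mean, hypothesized mean, known standard deviation, and sample size are all assumptions), the code below computes the z-statistic for a test of a single mean and its two-sided p-value.

```python
# Sketch: z-test for a single mean with a known population standard deviation.
# The sample figures and hypothesized mean are illustrative assumptions.
import math
from scipy.stats import norm

x_bar = 0.062      # sample mean (e.g., mean monthly return)
mu_0 = 0.050       # hypothesized population mean under H0
sigma = 0.040      # known population standard deviation
n = 36             # sample size

# z = (x_bar - mu_0) / (sigma / sqrt(n))
z = (x_bar - mu_0) / (sigma / math.sqrt(n))
p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value

print(f"z-statistic = {z:.3f}, p-value = {p_value:.4f}")
```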

More Details
Hypothesis Testing

A hypothesis is a proposed statement about a population’s characteristics, often framed as an opinion or claim about an issue. Statistical tests are used to determine whether a hypothesis is supported by the data. Hypothesis testing uses sample data to evaluate if a sample…
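
A minimal sketch of the general workflow, assuming invented sample data and a 5% significance level: state the hypotheses, compute a test statistic from the sample, and compare the p-value with the chosen significance level.

```python
# Sketch of the hypothesis-testing workflow using a one-sample t-test.
# The data and the 5% significance level are illustrative assumptions.
import numpy as np
from scipy import stats

alpha = 0.05
sample = np.array([0.021, 0.034, -0.010, 0.045, 0.012, 0.030, 0.008, 0.027])

# H0: the population mean return is 0; Ha: it is not 0.
t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)

decision = "reject H0" if p_value < alpha else "fail to reject H0"
print(f"t = {t_stat:.3f}, p-value = {p_value:.4f} -> {decision} at the 5% level")
```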

More Details
Resampling

Resampling refers to repeatedly drawing samples from the original observed sample in order to make statistical inferences about population parameters. The two commonly used resampling methods are the bootstrap and the jackknife. Bootstrap: Using a computer, the bootstrap resampling…
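
Since the bootstrap is sketched separately under Bootstrap Resampling below, the illustration here uses the jackknife: each observation is left out in turn, the statistic is recomputed, and the spread of the leave-one-out estimates gives a standard-error estimate. The data are invented for demonstration.

```python
# Sketch: jackknife estimate of the standard error of the sample mean.
# The data are invented purely for illustration.
import numpy as np

data = np.array([4.2, 5.1, 6.3, 4.8, 5.9, 5.5, 4.4, 6.0, 5.2, 4.9])
n = len(data)

# Leave-one-out means: recompute the statistic with each observation removed.
loo_means = np.array([np.delete(data, i).mean() for i in range(n)])

# Jackknife SE: sqrt((n - 1)/n * sum((theta_i - theta_bar)^2))
jackknife_se = np.sqrt((n - 1) / n * np.sum((loo_means - loo_means.mean()) ** 2))

print(f"sample mean = {data.mean():.3f}")
print(f"jackknife SE of the mean = {jackknife_se:.3f}")
```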

More Details
The Central Limit Theorem

The central limit theorem asserts that “given a population described by any probability distribution having mean \(\mu\) and finite variance \(\sigma^2\), the sampling distribution of the sample mean \(\bar{X}\) computed from random samples of size \(n\) from this population will…
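
To see the statement in action, the short simulation below (with an assumed exponential population, so \(\mu = 2\) and \(\sigma^2 = 4\)) draws many random samples of size \(n = 50\) and checks that the sample means center on \(\mu\) with variance close to \(\sigma^2/n\).

```python
# Sketch: illustrate the central limit theorem by simulation.
# The exponential population and the sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
mu, sigma2 = 2.0, 4.0           # exponential(scale=2) has mean 2 and variance 4
n = 50                          # sample size
num_samples = 10_000

# Draw many random samples and record each sample mean.
samples = rng.exponential(scale=2.0, size=(num_samples, n))
sample_means = samples.mean(axis=1)

print(f"mean of sample means     = {sample_means.mean():.3f}  (theory: {mu})")
print(f"variance of sample means = {sample_means.var():.4f}  (theory: sigma^2/n = {sigma2 / n})")
```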

More Details
Probability Sampling Methods

Sampling is the systematic process of selecting a subset or sample from a larger population. Sampling is essential because it is costly and time-consuming to analyze the whole population. Sampling methods can be broadly categorized into probability sampling and non-probability…
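
As an illustrative sketch (the population, the two strata, and the sample size of 50 are assumptions, not the article’s example), the code below contrasts a simple random sample with a proportionally stratified sample drawn with NumPy.

```python
# Sketch: simple random sampling vs. proportional stratified sampling.
# The population, strata, and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
population = np.arange(1000)                                   # 1,000 units
strata = np.where(population < 600, "small_cap", "large_cap")  # 60% / 40% split

# Simple random sampling: every unit has an equal chance of selection.
srs = rng.choice(population, size=50, replace=False)

# Stratified sampling: sample each stratum in proportion to its population share.
small_ids = population[strata == "small_cap"]
large_ids = population[strata == "large_cap"]
stratified = np.concatenate([
    rng.choice(small_ids, size=30, replace=False),  # 60% of 50
    rng.choice(large_ids, size=20, replace=False),  # 40% of 50
])

print(f"simple random sample: {len(srs)} units")
print(f"stratified sample:    {len(stratified)} units (30 small_cap, 20 large_cap)")
```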

More Details
Bootstrap Resampling

Resampling: Resampling means repeatedly drawing samples from the original observed sample to make statistical inferences about population parameters. There are two common methods: the bootstrap and the jackknife. Here, we’ll focus on the bootstrap method. Bootstrap Resampling: Bootstrap resampling relies on computer…
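
A minimal computational sketch with invented data: resample the observed sample with replacement many times, recompute the mean on each resample, and use the spread of the bootstrap means as an estimate of the standard error. The 5,000 resamples are an assumption.

```python
# Sketch: bootstrap estimate of the standard error of the sample mean.
# The data and the number of resamples are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
data = np.array([4.2, 5.1, 6.3, 4.8, 5.9, 5.5, 4.4, 6.0, 5.2, 4.9])
n_boot = 5_000

# Draw resamples of the same size as the original sample, with replacement.
boot_means = np.array([
    rng.choice(data, size=len(data), replace=True).mean()
    for _ in range(n_boot)
])

print(f"sample mean            = {data.mean():.3f}")
print(f"bootstrap SE of mean   = {boot_means.std(ddof=1):.3f}")
print(f"95% bootstrap interval = ({np.percentile(boot_means, 2.5):.3f}, "
      f"{np.percentile(boot_means, 97.5):.3f})")
```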

More Details
Monte Carlo Simulation

Monte Carlo simulations generate a large number of random draws from specified probability distributions, which helps in estimating the probability of various outcomes. We will give an example to illustrate how a Monte Carlo simulation is implemented. Steps Involved in Project Appraisal: Imagine…
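
The article’s project-appraisal example is not reproduced here; as a generic sketch under assumed inputs, the code below simulates a project’s NPV many times and estimates the probability that it is positive. The initial outlay, cash-flow distribution, and discount rate are all assumptions.

```python
# Sketch: Monte Carlo simulation of a project's NPV.
# Initial outlay, cash-flow distribution, and discount rate are all assumed.
import numpy as np

rng = np.random.default_rng(11)
n_sims = 100_000
initial_outlay = 1_000.0
discount_rate = 0.10
years = np.arange(1, 6)  # five annual cash flows

# Each year's cash flow is drawn from a normal distribution (assumed inputs).
cash_flows = rng.normal(loc=300.0, scale=80.0, size=(n_sims, len(years)))

# Discount each simulated cash-flow path and compute its NPV.
discount_factors = (1 + discount_rate) ** (-years)
npv = cash_flows @ discount_factors - initial_outlay

print(f"mean simulated NPV = {npv.mean():.1f}")
print(f"P(NPV > 0)         = {(npv > 0).mean():.3f}")
```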

More Details