Probability_Theory

This post just serves as a formula stack: a quick reference for the probability formulas below.

Formulas

Law of Total Probability

IF: $\{A_{i}: i=1,2,3,\dots,n\}$ is a finite or countably infinite partition of a sample space.

THEN for any event $B$:
$$
P(B)=\sum_{i=1}^n P(A_{i})P(B|A_{i})
$$
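
As a quick sanity check, here is a minimal Python sketch of the formula; the partition probabilities and conditionals are made-up example numbers.

```python
# Law of total probability: P(B) = sum_i P(A_i) * P(B | A_i).
# Hypothetical example values for a 3-part partition.
P_A = [0.2, 0.5, 0.3]          # P(A_i), must sum to 1
P_B_given_A = [0.1, 0.4, 0.7]  # P(B | A_i)

P_B = sum(pa * pb for pa, pb in zip(P_A, P_B_given_A))
print(P_B)  # 0.02 + 0.20 + 0.21 = 0.43
```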

Bayes’ Theorem

IF: $\{A_{i}: i=1,2,3,\dots,n\}$ is a finite or countably infinite partition of a sample space (the $A_{i}$ occur first), and $B$ is a fixed event (which occurs second).

THEN for any event $A_{k}$ $(k\in\{1,2,3,\dots,n\})$:

$$
P(A_{k}|B)=\frac{P(A_{k})P(B|A_{k})}{\sum_{i=1}^nP(A_{i})P(B|A_{i})}
$$
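
A minimal Python sketch, reusing the same hypothetical numbers as above; it simply divides each term of the total-probability sum by the whole sum.

```python
# Bayes' theorem: P(A_k | B) = P(A_k) P(B | A_k) / sum_i P(A_i) P(B | A_i).
# Hypothetical example values, same as in the previous sketch.
P_A = [0.2, 0.5, 0.3]
P_B_given_A = [0.1, 0.4, 0.7]

denom = sum(pa * pb for pa, pb in zip(P_A, P_B_given_A))
posterior = [pa * pb / denom for pa, pb in zip(P_A, P_B_given_A)]
print(posterior)       # [0.0465..., 0.4651..., 0.4883...]
print(sum(posterior))  # 1.0 -- posteriors over the partition sum to one
```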

Binomial Distribution $X \sim B(n, p)$

IF the random variable $X$ follows the binomial distribution with parameters $n \in \mathbb{N}$ and $p \in [0,1]$, we write $X \sim B(n, p)$.

THEN the probability of getting exactly $k$ successes in $n$ independent Bernoulli trials is given by the probability mass function:

$$
P\left(X=k\right)=C_{n}^kp^k(1-p)^{n-k}
$$
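
Below is a minimal Python sketch of this PMF; the values $n=10$, $p=0.5$, $k=3$ are arbitrary examples.

```python
# Binomial PMF: P(X = k) = C(n, k) * p^k * (1-p)^(n-k).
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(3, 10, 0.5))  # 120 / 1024 = 0.1171875
```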

Poisson Distribution $X \sim P(\lambda)$

IF a discrete random variable $X$ has a Poisson distribution with parameter $\lambda>0$, we write $X \sim P(\lambda)$ or $X \sim \pi(\lambda)$.

THEN its probability mass function, for $k=0,1,2,\dots$, is given by:

$$
P\left(X=k\right)=\frac{\lambda^k}{k!}e^{-\lambda}
$$
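
A minimal Python sketch of the Poisson PMF; $\lambda=3$ and $k=2$ are arbitrary example values.

```python
# Poisson PMF: P(X = k) = lambda^k / k! * e^(-lambda), for k = 0, 1, 2, ...
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam**k / factorial(k) * exp(-lam)

print(poisson_pmf(2, 3.0))  # 3^2/2! * e^(-3), approximately 0.2240
```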

Continuous Uniform Distribution $X \sim U(a,b)$

IF the probability density function of the continuous random variable $X$ is:

$$
f(x) = \begin{cases}
\frac{1}{b-a},& a<x<b\\
0,& \text{otherwise} \\
\end{cases}
$$

THEN we write $X \sim U(a,b)$, and the cumulative distribution function is:

$$
F(x) = \begin{cases}
0,& x<a\\
\frac{x-a}{b-a},& a \leq x < b \\
1,& x \geq b
\end{cases}
$$
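
A minimal Python sketch of this piecewise CDF; the endpoints $a=2$, $b=6$ and the point $x=3$ are arbitrary examples.

```python
# CDF of the continuous uniform distribution U(a, b).
def uniform_cdf(x, a, b):
    if x < a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

print(uniform_cdf(3.0, 2.0, 6.0))  # (3-2)/(6-2) = 0.25
```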

Exponential Distribution $X \sim E(\lambda)$

IF the probability density function of the continuous random variable $X$ with rate parameter $\lambda > 0$ is:

$$
f(x;\lambda) = \begin{cases}
\lambda e^{-\lambda x},& x\geq 0 \\
0,& x<0 \\
\end{cases}
$$

THEN we write $X \sim E(\lambda)$, and the cumulative distribution function is given by:

$$
F(x)=\begin{cases}
1-e^{-\lambda x},&x \geq 0\\
0,&x<0\\
\end{cases}
$$
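
A minimal Python sketch of both the PDF and the CDF; $\lambda=2$ and $x=1$ are arbitrary example values.

```python
# PDF and CDF of the exponential distribution E(lambda).
from math import exp

def expon_pdf(x, lam):
    return lam * exp(-lam * x) if x >= 0 else 0.0

def expon_cdf(x, lam):
    return 1.0 - exp(-lam * x) if x >= 0 else 0.0

print(expon_pdf(1.0, 2.0))  # 2 * e^(-2), approximately 0.2707
print(expon_cdf(1.0, 2.0))  # 1 - e^(-2), approximately 0.8647
```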

Normal Distribution $X \sim N(\mu,\sigma^2)$

The normal distribution is also called the Gaussian distribution.

IF there is a real-valued random variable $X$ whose probability density function has the general form:

$$
f(x)=\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}
$$

THEN we write $X \sim N(\mu,\sigma^2)$. The parameter $\mu$ is the mean or expectation of the distribution (and also its median and mode), while the parameter $\sigma$ is its standard deviation. The variance of the distribution is $\sigma^2$.
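
A minimal Python sketch of the normal PDF; evaluating it at $x=\mu=0$, $\sigma=1$ just recovers the peak value $1/\sqrt{2\pi}$.

```python
# PDF of the normal distribution N(mu, sigma^2).
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sqrt(2 * pi) * sigma)

print(normal_pdf(0.0, 0.0, 1.0))  # 1/sqrt(2*pi), approximately 0.3989
```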

Standard Normal Distribution $X \sim N(0,1)$

IF $X \sim N(\mu,\sigma^2)$.

THEN, when $\mu=0$ and $\sigma=1$, we write $X \sim N(0,1)$. It is described by the probability density function:

$$
\varphi(x) = \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}
$$

and the cumulative distribution function is given by:

$$
\Phi(x) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^x e^{-\frac{t^2}{2}} dt
$$
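
The integral has no closed form, but a minimal sketch can evaluate $\Phi(x)$ through the error function, using the identity $\Phi(x) = \tfrac{1}{2}\left(1 + \operatorname{erf}(x/\sqrt{2})\right)$; $x=1.96$ is an arbitrary example point.

```python
# Standard normal CDF via the error function:
# Phi(x) = (1 + erf(x / sqrt(2))) / 2.
from math import erf, sqrt

def std_normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

print(std_normal_cdf(0.0))   # 0.5
print(std_normal_cdf(1.96))  # approximately 0.975
```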

To Be Continued…

