# Statistical economics/Common Probability Density Functions

Probability distributions can be categorized by the continuity of the variables involved. This page groups common distributions by whether they describe discrete or continuous random variables.

## General Notation

• ${\displaystyle P(X=x)}$ is the probability that the random variable ${\displaystyle X}$ takes on the value ${\displaystyle x}$.
• ${\displaystyle E[X]}$ is the expected value of the random variable ${\displaystyle X}$, and is equal to ${\displaystyle \sum \limits _{x}xp(x)}$ for discrete random variables and ${\displaystyle \int \limits _{x}xp(x)dx}$ for continuous variables.
• var${\displaystyle (X)}$ is the variance of the random variable ${\displaystyle X}$, and is a measure of the spread of the possible values of the variable. If ${\displaystyle X}$ takes on a large range of values, then the variance is larger than if ${\displaystyle X}$ only takes on values relatively close to each other.
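The definitions above can be computed directly for a small discrete distribution. This is a minimal sketch using a made-up three-point PMF (the values and probabilities are assumptions chosen for illustration, not from the text):

```python
# Hypothetical discrete distribution: P(X=0)=0.2, P(X=1)=0.5, P(X=2)=0.3.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

# E[X] = sum over x of x * p(x)
mean = sum(x * p for x, p in pmf.items())            # 1.1

# var(X) = E[X^2] - (E[X])^2
variance = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2  # 0.49
```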

## Probability Distributions of Discrete Random Variables

Some outcomes only take on discrete values, commonly (but not always) integers. These outcomes can be modeled by discrete random variables, whose values appear with probabilities determined by the probability distribution that characterizes the variable. Examples of discrete outcomes are the number of cars passing a landmark each day, the number of customers spending more than \$50 each day, or simply the number of heads that show up during a coin-tossing contest.

Notes for Discrete Random Variables:

• ${\displaystyle p(x)}$ is the probability that ${\displaystyle X}$ takes on the value ${\displaystyle x}$ (i.e., ${\displaystyle P(X=x)}$). This is commonly referred to as the Probability Mass Function (PMF) for discrete random variables.

### The Bernoulli Distribution

This is possibly the simplest probability distribution, with only two possible outcomes: success or failure. A common example of a Bernoulli random variable is a coin toss. A Bernoulli distribution is described by the parameter ${\displaystyle p}$, which is the probability of success. Naturally, ${\displaystyle (1-p)}$ is the probability of failure (if we were hoping for the coin to land on heads, this would be the probability that it lands on tails instead).

#### Notation

If a random variable ${\displaystyle X}$ follows a Bernoulli distribution, it can be denoted by ${\displaystyle X\sim {\text{Bernoulli}}(p)}$, where ${\displaystyle p}$ is the probability of success or the favorable outcome. Conventionally, since this is a binary variable, we say that the result is a success when ${\displaystyle X=1}$, and a failure if ${\displaystyle X=0}$.

#### Distribution Details

• ${\displaystyle p(1)=p,\quad p(0)=1-p}$
• ${\displaystyle X\in \{0,1\}}$
• ${\displaystyle E[X]=p}$
• var${\displaystyle (X)=p(1-p)}$
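The Bernoulli details above can be verified numerically. A minimal sketch, assuming an example success probability ${\displaystyle p=0.3}$ (a value chosen for illustration):

```python
# Bernoulli(p): P(X=1) = p, P(X=0) = 1 - p.
p = 0.3  # assumed example value

pmf = {1: p, 0: 1 - p}
mean = sum(x * q for x, q in pmf.items())                     # E[X] = p = 0.3
variance = sum(x ** 2 * q for x, q in pmf.items()) - mean ** 2  # p(1-p) = 0.21
```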

### The Binomial Distribution

A Binomial random variable ${\displaystyle X}$ can take on integer values from 0 to ${\displaystyle n}$, where ${\displaystyle n}$ is the total number of identical trials, and ${\displaystyle X}$ is the number of successful trials. This means that ${\displaystyle X}$ can be understood as a sum of ${\displaystyle n}$ Bernoulli random variables, each with probability ${\displaystyle p}$ of success, where each Bernoulli random variable is the success or failure (1 or 0) of an individual trial. A common example of a Binomial random variable is the number of heads that result from flipping a coin ${\displaystyle n}$ times, with the probability ${\displaystyle p}$ of landing on heads.

#### Notation

If a random variable ${\displaystyle X}$ follows a Binomial distribution, it can be denoted by ${\displaystyle X\sim {\text{Binomial}}(n,p)}$ or ${\displaystyle X\sim {\text{B}}(n,p)}$, where ${\displaystyle n}$ is the number of trials and ${\displaystyle p}$ is the probability of success of the favorable outcome for each trial.

#### Distribution Details

• ${\displaystyle p(x)={\binom {n}{x}}p^{x}(1-p)^{n-x}}$
• ${\displaystyle X\in \{0,1,2,...,n\}}$
• ${\displaystyle E[X]=np}$
• var${\displaystyle (X)=np(1-p)}$
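The Binomial PMF and moments above can be checked by direct summation. A minimal sketch, assuming example values ${\displaystyle n=10}$ and ${\displaystyle p=0.5}$:

```python
from math import comb

n, p = 10, 0.5  # assumed example values

def binom_pmf(x):
    # p(x) = C(n, x) * p^x * (1-p)^(n-x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

total = sum(binom_pmf(x) for x in range(n + 1))     # probabilities sum to 1
mean = sum(x * binom_pmf(x) for x in range(n + 1))  # E[X] = np = 5.0
variance = sum((x - mean) ** 2 * binom_pmf(x) for x in range(n + 1))  # np(1-p) = 2.5
```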

### The Geometric Distribution

#### Notation

A Geometric random variable ${\displaystyle Y}$ counts the number of independent Bernoulli trials up to and including the first success. If ${\displaystyle Y}$ follows a Geometric distribution, it can be denoted by ${\displaystyle Y\sim {\text{Geometric}}(p)}$, where ${\displaystyle p}$ is the probability of success of the favorable outcome.

#### Distribution Details

• ${\displaystyle p(y)=(1-p)^{y-1}p}$
• ${\displaystyle Y\in \{1,2,3,...\}}$
• ${\displaystyle E[Y]={\frac {1}{p}}}$
• var${\displaystyle (Y)={\frac {1-p}{p^{2}}}}$
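Because the Geometric support is infinite, its moments can be approximated by truncating the sum. A minimal sketch, assuming an example value ${\displaystyle p=0.25}$ (truncating at 500 terms is effectively exact here):

```python
p = 0.25  # assumed example value

def geom_pmf(y):
    # p(y) = (1-p)^(y-1) * p
    return (1 - p) ** (y - 1) * p

ys = range(1, 500)  # truncated support; the tail is negligibly small
mean = sum(y * geom_pmf(y) for y in ys)                    # ≈ 1/p = 4
variance = sum((y - mean) ** 2 * geom_pmf(y) for y in ys)  # ≈ (1-p)/p^2 = 12
```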

### The Negative Binomial Distribution

#### Notation

A Negative Binomial random variable ${\displaystyle Y}$ counts the number of independent Bernoulli trials up to and including the ${\displaystyle r}$th success. If ${\displaystyle Y}$ follows a Negative Binomial distribution, it can be denoted by ${\displaystyle Y\sim {\text{NB}}(r,p)}$, where ${\displaystyle r}$ is the number of desired successes and ${\displaystyle p}$ is the probability of success of the favorable outcome.

#### Distribution Details

• ${\displaystyle p(y)={\binom {y-1}{r-1}}p^{r}(1-p)^{y-r}}$
• ${\displaystyle Y\in \{r,r+1,r+2,...\}}$
• ${\displaystyle E[Y]={\frac {r}{p}}}$
• var${\displaystyle (Y)=r{\frac {(1-p)}{p^{2}}}}$
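The Negative Binomial PMF above can be checked the same way, truncating the infinite support. A minimal sketch, assuming example values ${\displaystyle r=3}$ and ${\displaystyle p=0.5}$:

```python
from math import comb

r, p = 3, 0.5  # assumed example values

def nb_pmf(y):
    # p(y) = C(y-1, r-1) * p^r * (1-p)^(y-r)
    return comb(y - 1, r - 1) * p ** r * (1 - p) ** (y - r)

ys = range(r, 300)  # truncated support; the tail is negligibly small
mean = sum(y * nb_pmf(y) for y in ys)                    # ≈ r/p = 6
variance = sum((y - mean) ** 2 * nb_pmf(y) for y in ys)  # ≈ r(1-p)/p^2 = 6
```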

### The Poisson Distribution

#### Notation

If a random variable ${\displaystyle Y}$ follows a Poisson distribution, it can be denoted by ${\displaystyle Y\sim {\text{Pois}}(\lambda )}$, where ${\displaystyle \lambda }$ is the mean of the variable.

#### Distribution Details

• ${\displaystyle p(y)={\frac {\lambda ^{y}}{y!}}e^{-\lambda }}$
• ${\displaystyle Y\in \{0,1,2,...\}}$
• ${\displaystyle E[Y]=\lambda }$
• var${\displaystyle (Y)=\lambda }$

### The Hypergeometric Distribution

#### Notation

If a random variable ${\displaystyle Y}$ follows a Hypergeometric distribution, it can be denoted by ${\displaystyle Y\sim {\text{Hypergeometric}}(N,n,r)}$, where ${\displaystyle N}$ is the total size of the population, ${\displaystyle n}$ is the size of the sample taken, and ${\displaystyle r}$ is the number of favorable objects in the population.

#### Distribution Details

• ${\displaystyle p(y)={\frac {{\binom {r}{y}}{\binom {N-r}{n-y}}}{\binom {N}{n}}};\quad y\leq r;n-y\leq N-r}$
• ${\displaystyle Y\in \{0,1,...,n\}}$
• ${\displaystyle E[Y]={\frac {nr}{N}}}$
• var${\displaystyle (Y)=n\left({\frac {r}{N}}\right)\left({\frac {N-r}{N}}\right)\left({\frac {N-n}{N-1}}\right)}$
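The Hypergeometric PMF and moments above can be checked by summing over the valid values of ${\displaystyle y}$. A minimal sketch, assuming example values ${\displaystyle N=20}$, ${\displaystyle n=5}$, ${\displaystyle r=8}$:

```python
from math import comb

N, n, r = 20, 5, 8  # assumed example values

def hyper_pmf(y):
    # p(y) = C(r, y) * C(N-r, n-y) / C(N, n)
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

# keep only y satisfying the constraints y <= r and n - y <= N - r
ys = [y for y in range(n + 1) if y <= r and n - y <= N - r]
mean = sum(y * hyper_pmf(y) for y in ys)  # nr/N = 2.0
variance = sum((y - mean) ** 2 * hyper_pmf(y) for y in ys)
# variance = n * (r/N) * ((N-r)/N) * ((N-n)/(N-1)) = 18/19 ≈ 0.947
```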

### The Uniform Distribution

#### Notation

If a random variable ${\displaystyle X}$ follows a Uniform distribution, it can be denoted by ${\displaystyle X\sim {\text{Unif}}(N)}$, where ${\displaystyle N}$ is the total number of outcomes. A uniform distribution has equal probability over all possible outcomes, which is simply ${\displaystyle p={\frac {1}{N}}}$.

#### Distribution Details

• ${\displaystyle X\in \{{\text{set of all possible outcomes}}\}}$

If the set of outcomes contains only the consecutive integers ${\displaystyle 1,2,...,N}$, then:

• ${\displaystyle E[X]={\frac {N+1}{2}}}$
• var${\displaystyle (X)={\frac {N^{2}-1}{12}}}$
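The formulas for consecutive integer outcomes can be verified directly. A minimal sketch, assuming the fair six-sided die example (so the number of outcomes is 6):

```python
N = 6  # assumed example: a fair six-sided die, outcomes 1..N

pmf = {x: 1 / N for x in range(1, N + 1)}  # equal probability 1/N each
mean = sum(x * p for x, p in pmf.items())                    # (N+1)/2 = 3.5
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # (N^2-1)/12 = 35/12
```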

## Probability Distributions of Continuous Random Variables

Notes for Continuous Random Variables:

• ${\displaystyle f(x)}$ is the probability density of ${\displaystyle X}$ at the value ${\displaystyle x}$. This is commonly referred to as the Probability Density Function (PDF) for continuous random variables. For continuous variables, since any range of real numbers has infinitely many numbers in it, the probability of ${\displaystyle X}$ taking on any single number is 0. For continuous random variables, we instead ask about the probability that ${\displaystyle X}$ will fall within some range of numbers. To get this probability, we define ${\displaystyle f(x)}$ as a density function that gives us probability when integrated. In other words, the probability that ${\displaystyle X}$ is between the real numbers ${\displaystyle a}$ and ${\displaystyle b}$ is ${\displaystyle P(a\leq X\leq b)=\int _{a}^{b}f(x)dx}$.
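The idea that a density gives probability when integrated can be illustrated numerically. A minimal sketch, assuming the standard exponential density ${\displaystyle f(x)=e^{-x}}$ for ${\displaystyle x\geq 0}$ as an example, and approximating the integral with a midpoint Riemann sum:

```python
from math import exp

def f(x):
    # assumed example density: standard exponential, f(x) = e^(-x) for x >= 0
    return exp(-x) if x >= 0 else 0.0

def prob(a, b, steps=100_000):
    # midpoint Riemann sum approximation of the integral of f over [a, b]
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

p = prob(0.0, 1.0)  # P(0 <= X <= 1); exact value is 1 - e^(-1) ≈ 0.6321
```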