Statistical economics/Common Probability Density Functions

Probability distributions can be divided into categories based on the continuity of the variables involved. This page organizes these distributions by their use of discrete or continuous random variables.

General Notation

Here are some notations used throughout this page:

  • $P(X = x)$ is the probability that the random variable $X$ takes on the value $x$.
  • $E[X]$ is the expected value of the random variable $X$, and is equal to $\sum_{x} x \, P(X = x)$ for discrete random variables and $\int_{-\infty}^{\infty} x \, f(x) \, dx$ for continuous variables.
  • $\operatorname{var}(X) = E\left[(X - E[X])^2\right]$ is the variance of the random variable $X$, and is a measure of the spread of the possible values of the variable. If $X$ takes on a large range of values, then the variance is larger than if $X$ only takes on values relatively close to each other (see the short numerical sketch after this list).
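
To make these definitions concrete, here is a minimal sketch in Python using a small made-up discrete distribution (the values and probabilities below are invented purely for illustration). It computes the expected value and variance by direct summation, exactly as the formulas above prescribe.

```python
# Minimal sketch: mean and variance of a hypothetical discrete random variable
# X taking the values 0, 1, 2 with probabilities 0.2, 0.5, 0.3 (invented example).

values = [0, 1, 2]
probs = [0.2, 0.5, 0.3]

# E[X] = sum over x of x * P(X = x)
mean = sum(x * p for x, p in zip(values, probs))

# var(X) = E[(X - E[X])^2] = sum over x of (x - E[X])^2 * P(X = x)
variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))

print(mean)      # 1.1
print(variance)  # 0.49
```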

Probability Distributions of Discrete Random Variables

Some outcomes only take on discrete values, commonly (but not always) integers. These outcomes can be modeled by discrete random variables that can only take on those values, with each value appearing with a probability determined by the distribution that characterizes the random variable. Examples of discrete outcomes are the number of cars passing a landmark each day, the number of customers spending more than $50 each day, or simply the number of heads that show up during a coin tossing contest.

Notes for Discrete Random Variables:

  • $p(x)$ is the probability that $X$ takes on the value $x$ (i.e., $p(x) = P(X = x)$). This is commonly referred to as the Probability Mass Function (PMF) for discrete random variables.

The Bernoulli Distribution

This is possibly the simplest probability distribution, with only two possible outcomes: success or failure. A common example of a Bernoulli random variable is a coin toss. A Bernoulli distribution is described by the parameter $p$, which is the probability of success. Naturally, $1 - p$ is the probability of failure (if we were hoping for the coin to land on heads, this would be the probability that it lands on tails instead).

Notation

If a random variable $X$ follows a Bernoulli distribution, it can be denoted by $X \sim \operatorname{Bernoulli}(p)$, where $p$ is the probability of success or the favorable outcome. Conventionally, since this is a binary variable, we say that the result is a success when $X = 1$, and a failure if $X = 0$.

Distribution Details

  • $P(X = 1) = p$ and $P(X = 0) = 1 - p$
  • $E[X] = p$
  • $\operatorname{var}(X) = p(1 - p)$
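
As an illustration, here is a minimal sketch (with an arbitrarily chosen $p = 0.3$) that checks the mean and variance formulas above against a brute-force Monte Carlo simulation of Bernoulli trials.

```python
import random

p = 0.3            # probability of success (arbitrary value for illustration)
n_draws = 100_000  # number of simulated Bernoulli trials

# Simulate Bernoulli(p): each draw is 1 (success) with probability p, else 0.
draws = [1 if random.random() < p else 0 for _ in range(n_draws)]

empirical_mean = sum(draws) / n_draws
empirical_var = sum((x - empirical_mean) ** 2 for x in draws) / n_draws

print(empirical_mean, p)           # both close to 0.3
print(empirical_var, p * (1 - p))  # both close to 0.21
```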


The Binomial Distribution

A Binomial random variable $X$ can take on integer values from 0 to $n$, where $n$ is the total number of identical trials, and $X$ is the number of successful trials. This means that $X$ can be understood as a sum of $n$ Bernoulli random variables, each with probability of success $p$, where each Bernoulli random variable is the success or failure (1 or 0) of an individual trial. A common example of a Binomial random variable is the number of heads that result from flipping a coin $n$ times, with $p$ the probability of landing on heads.

Notation

If a random variable $X$ follows a Binomial distribution, it can be denoted by $X \sim \operatorname{B}(n, p)$ or $X \sim \operatorname{Bin}(n, p)$, where $n$ is the number of trials and $p$ is the probability of success of the favorable outcome for each trial.

Distribution Details

  • $P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}$ for $k = 0, 1, \ldots, n$
  • $E[X] = np$
  • $\operatorname{var}(X) = np(1 - p)$
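
The sketch below (with arbitrary example values $n = 10$, $p = 0.3$) evaluates this PMF with Python's math.comb and confirms that the probabilities sum to 1 and that the mean computed from the PMF matches $np$.

```python
from math import comb

n, p = 10, 0.3  # arbitrary example values

def binom_pmf(k: int) -> float:
    """P(X = k) for X ~ B(n, p): choose which k of the n trials succeed."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

pmf = [binom_pmf(k) for k in range(n + 1)]

print(sum(pmf))                               # ~1.0
print(sum(k * q for k, q in enumerate(pmf)))  # ~3.0, i.e. n * p
print(n * p * (1 - p))                        # 2.1, the variance n*p*(1-p)
```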


The Geometric Distribution

Notation

If a random variable $X$ follows a Geometric distribution, it can be denoted by $X \sim \operatorname{Geo}(p)$, where $p$ is the probability of success of the favorable outcome. Here $X$ counts the number of trials needed to obtain the first success.

Distribution Details

  • $P(X = k) = (1 - p)^{k - 1} p$ for $k = 1, 2, 3, \ldots$
  • $E[X] = \frac{1}{p}$
  • $\operatorname{var}(X) = \frac{1 - p}{p^2}$
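
As a rough check, the following sketch simulates geometric outcomes (with an arbitrary $p = 0.25$) by counting how many Bernoulli trials are needed until the first success, and compares the empirical mean to $1/p$.

```python
import random

p = 0.25          # arbitrary success probability for illustration
n_runs = 100_000  # number of simulated geometric experiments

def trials_until_first_success() -> int:
    """Run Bernoulli(p) trials until one succeeds; return how many were needed."""
    count = 1
    while random.random() >= p:
        count += 1
    return count

samples = [trials_until_first_success() for _ in range(n_runs)]

print(sum(samples) / n_runs)  # close to 1 / p = 4.0
print((1 - p) / p**2)         # theoretical variance = 12.0
```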


The Negative Binomial Distribution

Notation

If a random variable $X$ follows a Negative Binomial distribution, it can be denoted by $X \sim \operatorname{NB}(r, p)$, where $r$ is the number of desired successes and $p$ is the probability of success of the favorable outcome. Here $X$ counts the number of trials needed to obtain the $r$-th success.

Distribution Details

  • $P(X = k) = \binom{k - 1}{r - 1} p^r (1 - p)^{k - r}$ for $k = r, r + 1, r + 2, \ldots$
  • $E[X] = \frac{r}{p}$
  • $\operatorname{var}(X) = \frac{r(1 - p)}{p^2}$
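
The sketch below (arbitrary values $r = 3$, $p = 0.4$) evaluates this PMF over a long but finite range and checks that the probabilities sum to roughly 1 and that the mean is close to $r/p$. Note that libraries such as SciPy parameterize the negative binomial by the number of failures instead of the number of trials, so their mean is shifted down by $r$.

```python
from math import comb

r, p = 3, 0.4  # arbitrary example values: wait for the 3rd success, p = 0.4 per trial

def nbinom_pmf(k: int) -> float:
    """P(X = k): the r-th success lands exactly on trial k."""
    return comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

ks = range(r, 200)  # truncate the infinite support; the tail beyond 200 is negligible
pmf = [nbinom_pmf(k) for k in ks]

print(sum(pmf))                             # ~1.0
print(sum(k * q for k, q in zip(ks, pmf)))  # ~7.5, i.e. r / p
print(r * (1 - p) / p**2)                   # 11.25, the variance
```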


The Poisson Distribution

Notation

If a random variable $X$ follows a Poisson distribution, it can be denoted by $X \sim \operatorname{Poisson}(\lambda)$, where $\lambda$ is the mean of the variable.

Distribution Details

  • $P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}$ for $k = 0, 1, 2, \ldots$
  • $E[X] = \lambda$
  • $\operatorname{var}(X) = \lambda$
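
To illustrate, the following sketch (with an arbitrary $\lambda = 4$) evaluates the Poisson PMF and verifies numerically that both the mean and the variance computed from it come out close to $\lambda$.

```python
from math import exp, factorial

lam = 4.0  # arbitrary example rate / mean

def poisson_pmf(k: int) -> float:
    """P(X = k) = lambda^k * e^(-lambda) / k!"""
    return lam**k * exp(-lam) / factorial(k)

ks = range(0, 100)  # truncate the infinite support; the tail is negligible here
pmf = [poisson_pmf(k) for k in ks]

mean = sum(k * q for k, q in zip(ks, pmf))
var = sum((k - mean) ** 2 * q for k, q in zip(ks, pmf))

print(mean, var)  # both ~4.0, matching lambda
```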


The Hypergeometric Distribution

Notation

If a random variable $X$ follows a Hypergeometric distribution, it can be denoted by $X \sim \operatorname{HG}(N, n, m)$, where $N$ is the total size of the population, $n$ is the size of the sample taken, and $m$ is the number of favorable objects in the population. Here $X$ counts the number of favorable objects that appear in the sample, which is drawn without replacement.

Distribution Details

  • $P(X = k) = \frac{\binom{m}{k}\binom{N - m}{n - k}}{\binom{N}{n}}$
  • $E[X] = \frac{nm}{N}$
  • $\operatorname{var}(X) = n \cdot \frac{m}{N}\left(1 - \frac{m}{N}\right)\frac{N - n}{N - 1}$
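
For a concrete (made-up) urn example, suppose a population of $N = 50$ items contains $m = 12$ favorable ones and we sample $n = 10$ without replacement. The sketch below evaluates the PMF with math.comb and checks the mean against $nm/N$.

```python
from math import comb

N, n, m = 50, 10, 12  # made-up urn: 50 items, 12 favorable, sample of 10

def hypergeom_pmf(k: int) -> float:
    """P(X = k): k favorable items drawn out of m, the rest from the N - m others."""
    return comb(m, k) * comb(N - m, n - k) / comb(N, n)

ks = range(0, min(n, m) + 1)
pmf = [hypergeom_pmf(k) for k in ks]

print(sum(pmf))                                       # ~1.0
print(sum(k * q for k, q in zip(ks, pmf)))            # 2.4, i.e. n * m / N
print(n * (m / N) * (1 - m / N) * (N - n) / (N - 1))  # ~1.49, the variance
```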


The Uniform Distribution

Notation

If a random variable $X$ follows a Uniform distribution, it can be denoted by $X \sim \operatorname{U}(n)$, where $n$ is the total number of outcomes. A uniform distribution has equal probability over all possible outcomes, which is simply $P(X = x) = \frac{1}{n}$.

Distribution Details

If the set of outcomes contains only the consecutive integers starting at 1 (that is, $1, 2, \ldots, n$), then:

  • $E[X] = \frac{n + 1}{2}$
  • $\operatorname{var}(X) = \frac{n^2 - 1}{12}$
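
As a check, the sketch below treats a fair six-sided die as a uniform distribution on $1, \ldots, 6$ and compares the mean and variance computed by direct summation with the closed-form expressions above.

```python
n = 6  # a fair six-sided die: outcomes 1..6, each with probability 1/n

outcomes = range(1, n + 1)
mean = sum(x / n for x in outcomes)               # direct summation of x * P(X = x)
var = sum((x - mean) ** 2 / n for x in outcomes)  # direct summation of (x - E[X])^2 * P(X = x)

print(mean, (n + 1) / 2)     # 3.5 and 3.5
print(var, (n**2 - 1) / 12)  # ~2.9167 for both
```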


Probability Distributions of Continuous Random Variables

Notes for Continuous Random Variables:

  • $f(x)$ is the probability density of $X$ at the value $x$. This is commonly referred to as the Probability Density Function (PDF) for continuous random variables. For continuous variables, since any range of real numbers contains infinitely many numbers, the probability of $X$ taking on any single number is 0. For continuous random variables, we instead ask about the probability that $X$ will fall within some range of numbers. To get this probability, we define $f(x)$ as a density function that gives us probability when integrated. In other words, the probability that $X$ is between the real numbers $a$ and $b$ is $P(a \le X \le b) = \int_a^b f(x)\,dx$ (see the sketch after this list).
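
To make the integral concrete, here is a minimal sketch using the exponential density $f(x) = \lambda e^{-\lambda x}$ with an arbitrary $\lambda = 2$, chosen only for illustration. It approximates $P(a \le X \le b)$ by a simple Riemann sum and compares it with the exact answer $e^{-\lambda a} - e^{-\lambda b}$.

```python
from math import exp

lam = 2.0        # arbitrary rate for the illustrative exponential density
a, b = 0.5, 1.5  # ask for P(a <= X <= b)

def density(x: float) -> float:
    """Exponential PDF: f(x) = lambda * e^(-lambda * x) for x >= 0."""
    return lam * exp(-lam * x)

# Approximate the integral of f over [a, b] with a midpoint Riemann sum.
steps = 10_000
dx = (b - a) / steps
prob = sum(density(a + (i + 0.5) * dx) * dx for i in range(steps))

print(prob)                           # ~0.318
print(exp(-lam * a) - exp(-lam * b))  # exact value for comparison
```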

The Uniform Distribution

The Normal Distribution

The Poisson Scatter Distribution

The Exponential Distribution

The Gamma Distribution

The Chi-square Distribution

The Beta Distribution

The Distribution