 # Statistics Discrete Probability Distributions


Contents: Introduction, Symbols, General Notes, Uniform Distribution, Binomial Distribution, Poisson Distribution, Hypergeometric Distribution

Introduction

A discrete random variable can only take on distinct, separate values, i.e. 0, 1, 2, 3, etc.  Typical discrete variables include the number of children in a family, attendance at a theatre, the number of patients in a hospital ward, and the number of defective bulbs in a box of ten.

The notes on this webpage relate to the probability distributions primarily associated with discrete random variables.

Symbols

In the notes below the probability distributions relate to the probability of successes.  In practice they could equally relate to failures, or to outcomes with desired values (e.g. a die throw of 6).

X = random variable
μ = distribution mean
E(X) = expected mean of random variable
Var(X) = variance of random variable
σ² = distribution variance
σ = distribution standard deviation
f(x) = probability density function
P(k) = probability that k successes occur in n trials
F(x) = probability distribution function (in n trials, the probability that the number of successes is at most x)
xm = arithmetic mean of sample
sx² = variance of sample
sx = standard deviation of sample
N = number of items in distribution
n = number of trials / sample items
M = number of success items in options (= pN)
k = number of successes
p = probability of success, 0 < p < 1
pi = probability that x = xi
xi = discrete value of random sample

General Notes

The expectation E(X) is synonymous with the mean, and μ is often used to represent the mean of a random variable.
Also, σ² is often used in place of Var(X) to represent the variance of a random variable.
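As a quick illustration of these definitions (the values and probabilities below are assumed for the example, not taken from the notes), E(X) and Var(X) can be computed directly from a probability table:

```python
# Illustrative probability table for a discrete random variable X
# (the values and probabilities here are assumed for the example).
values = [0, 1, 2, 3]          # x_i, e.g. number of children in a family
probs = [0.2, 0.4, 0.3, 0.1]   # p_i, must sum to 1

mean = sum(x * p for x, p in zip(values, probs))               # mu = E(X) = sum of x_i * p_i
var = sum((x - mean) ** 2 * p for x, p in zip(values, probs))  # sigma^2 = Var(X) = E[(X - mu)^2]

print(mean, var)   # mean = 1.3, variance = 0.81
```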

Uniform Distribution

Note:
The uniform distribution is often used for continuous variables in addition to discrete variables, so I have included the equations for both.

A distribution in which every possible value of the variable has the same probability is called a rectangular or uniform distribution.   The most obvious uniform probability distribution is that related to throwing a fair die: the probability of each of the numbers 1 to 6 is the same, at 1/6.   For a discrete uniform distribution, P(k) = 1/n where n is the number of possible values of k.

For a discrete uniform distribution over the n values x1, x2, ..., xn:

The probability density function f(xi) = 1/n
The probability distribution function F(xk) = k/n (values taken in increasing order)
The distribution mean μ = (x1 + x2 + ... + xn)/n
The variance σ² = Σ(xi − μ)²/n

For a continuous uniform distribution over the interval [a, b]:

The probability density function f(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise
The probability distribution function F(x) = (x − a)/(b − a)
The distribution mean μ = (a + b)/2
The variance σ² = (b − a)²/12

Binomial Distribution
Table of binomial probability distributions

This distribution relates to the number of times a success occurs in n independent trials.
The binomial distribution is important in sampling with replacement.
The probability of the identified event in a single trial = p
P(k) means the probability that in n random trials the number of successes is exactly k.

The conditions for a binomial probability distribution are:

The trials must be independent
Each trial results in only two outcomes, e.g. a success or a failure
Each trial has the same probability of success
The number of trials is fixed in advance
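With these conditions in hand, the binomial probabilities can be computed directly; a minimal sketch in Python (the function names and example values are mine, not from the notes):

```python
from math import comb

def binomial_pmf(k, n, p):
    # P(k) = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def binomial_cdf(x, n, p):
    # F(x): probability of up to and including x successes in n trials
    return sum(binomial_pmf(k, n, p) for k in range(x + 1))

# Example: a box of n = 10 bulbs, each defective with probability p = 0.1.
# Probability of exactly one defective bulb:
p_one = binomial_pmf(1, 10, 0.1)   # 10 * 0.1 * 0.9**9, roughly 0.387
```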

The probability density function is P(k) = C(n, k) p^k (1 − p)^(n−k), where C(n, k) = n!/(k!(n − k)!)
The probability distribution function is F(x) = Σ (k = 0 to x) C(n, k) p^k (1 − p)^(n−k); this identifies the probability that up to and including x successes occur
The expected mean μ = np
The variance σ² = np(1 − p)

The graph below shows some typical binomial distribution probability density forms.

Poisson Distribution
Table of Poisson Probability distributions

The Poisson distribution is simply a limiting case of the binomial distribution with p → 0 and n → ∞ such that the mean m = np approaches a finite value.  Typically, a Poisson random variable is a count of the number of successful events that occur in a certain time interval or spatial area.    For example, the number of cars passing a fixed point in a fixed time period, the number of defects per unit area of material, or the number of defects per unit length of thread.

The Poisson distribution is sometimes used to approximate the Binomial distribution with parameters n and p.   When the number of observations n is large, and the success probability p is small, the binomial distribution approaches the Poisson distribution with the parameter given by m = np.   This is useful since the computations involved in calculating binomial probabilities are significantly reduced.
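A quick numerical check of this approximation (the values of n, p, and k below are chosen for illustration only):

```python
from math import comb, exp, factorial

n, p = 1000, 0.003   # large n, small p (illustrative values)
m = n * p            # Poisson parameter m = np = 3.0

k = 2
binom = comb(n, k) * p ** k * (1 - p) ** (n - k)   # exact binomial P(k)
poisson = exp(-m) * m ** k / factorial(k)          # Poisson approximation
# The two values agree to roughly three decimal places here.
```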

The following conditions are required for a Poisson distribution:

The basic measurement, e.g. the length of the observation period, the area, or the length of material, needs to be fixed in advance;
the events occur at a constant average rate;
the numbers of events occurring in disjoint intervals are statistically independent.
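Under these conditions the probabilities that follow can be evaluated directly; a minimal sketch (the function name and the rate m = 4 are mine):

```python
from math import exp, factorial

def poisson_pmf(k, m):
    # P(k) = e^(-m) * m^k / k!, with m the mean number of events per interval
    return exp(-m) * m ** k / factorial(k)

# Example: cars pass a fixed point at an average rate of m = 4 per minute.
p_none = poisson_pmf(0, 4)   # probability of no cars in a given minute, e^(-4)
```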

The probability density function is P(k) = e^(−m) m^k / k!, where m = np is the mean number of events
The probability distribution function is F(x) = Σ (k = 0 to x) e^(−m) m^k / k!
The expected mean μ = m
The variance σ² = m

The graph below shows some typical Poisson distribution probability density forms.

Hypergeometric Distribution

Whereas the binomial probability function involves sampling with replacement, i.e. each trial is independent, the hypergeometric distribution involves sampling without replacement.  The trials are therefore not independent.

The probability density function is P(k) = C(M, k) C(N − M, n − k) / C(N, n), where p = M/N
The probability distribution function is F(x) = Σ (k = 0 to x) C(M, k) C(N − M, n − k) / C(N, n)
The expected mean μ = np = nM/N
The variance σ² = np(1 − p)(N − n)/(N − 1)

The graph below shows some typical hypergeometric distribution probability density forms.
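As a sketch (the function name and example values are mine), the hypergeometric probabilities can be computed with binomial coefficients:

```python
from math import comb

def hypergeom_pmf(k, N, M, n):
    # P(k) = C(M, k) * C(N - M, n - k) / C(N, n):
    # k successes when n items are drawn without replacement
    # from N items, M of which are successes.
    return comb(M, k) * comb(N - M, n - k) / comb(N, n)

# Example: a box of N = 10 bulbs contains M = 3 defective ones;
# a sample of n = 4 is drawn without replacement.
p_none = hypergeom_pmf(0, 10, 3, 4)   # C(7, 4) / C(10, 4) = 35/210 = 1/6
```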