Relationships among probability distributions
In probability theory and statistics, there are several relationships among probability distributions. These relations can be categorized into the following groups:
- One distribution is a special case of another with a broader parameter space;
- Transforms (function of a random variable);
- Combinations (function of several variables);
- Approximate (limit) relationships;
- Compound (or Bayesian) relationships;
- Duality;
- Conjugate priors.
Special case of distribution parametrization
- A binomial random variable with n = 1 is a Bernoulli random variable.
- A negative binomial distribution with n = 1 is a geometric distribution.
- A gamma distribution with shape parameter α = 1 and scale parameter θ is an exponential distribution with expected value θ.
- A gamma random variable with shape parameter α = ν/2 and rate parameter β = 1/2 (equivalently, scale θ = 2) is a chi-squared random variable with ν degrees of freedom.
- A chi-squared distribution with 2 degrees of freedom is an exponential distribution with mean 2 and vice versa.
- A Weibull random variable with shape parameter γ = 1 and scale parameter β is an exponential random variable with mean β.
- A beta random variable with parameters α = β = 1 is a uniform random variable.
- A beta-binomial random variable with α = β = 1 is a discrete uniform random variable over the values 0, ..., n.
- A random variable with a t distribution with one degree of freedom is a Cauchy random variable.
- When c = 1, the Burr type XII distribution becomes the Pareto Type II distribution.
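As an illustration, the following sketch (using Python with scipy, which is an assumption of this example rather than part of the relationships themselves) numerically checks two of the special cases above; the parameter values are arbitrary.

```python
from scipy import stats

p = 0.3
# A binomial with n = 1 has the same pmf as a Bernoulli(p).
assert abs(stats.binom.pmf(1, n=1, p=p) - stats.bernoulli.pmf(1, p)) < 1e-12

theta, x = 2.5, 1.7
# A gamma with shape alpha = 1 and scale theta has the same pdf as an exponential with mean theta.
assert abs(stats.gamma.pdf(x, a=1, scale=theta) - stats.expon.pdf(x, scale=theta)) < 1e-12
print("special-case checks passed")
```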
Transform of a variable
Multiple of a random variable
Multiplying the variable by any positive real constant yields a scaling of the original distribution. Some families are self-replicating, meaning that the scaling yields a distribution from the same family, albeit with different parameters:
normal distribution, gamma distribution, Cauchy distribution, exponential distribution, Erlang distribution, Weibull distribution, logistic distribution, error distribution, power-law distribution, Rayleigh distribution.
Example:
- If X is a gamma random variable with shape parameter α and rate parameter β, then Y = aX is a gamma random variable with shape α and rate β/a.
- If X is a gamma random variable with shape parameter α and scale parameter θ, then Y = aX is a gamma random variable with shape α and scale aθ (see the sketch below).
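The scaling rule can be checked numerically; the sketch below (scipy assumed, arbitrary parameter values) compares the density of Y = aX obtained by a change of variables with the claimed gamma density.

```python
from scipy import stats

alpha, theta, a, y = 3.0, 2.0, 1.5, 4.2
# Density of Y = aX obtained from the density of X ~ Gamma(shape=alpha, scale=theta).
lhs = stats.gamma.pdf(y / a, a=alpha, scale=theta) / a
# Claimed density: Y ~ Gamma(shape=alpha, scale=a*theta).
rhs = stats.gamma.pdf(y, a=alpha, scale=a * theta)
assert abs(lhs - rhs) < 1e-12
```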
Linear function of a random variable
Distributions for which a linear function of a random variable stays within the same family include: normal distribution, Cauchy distribution, logistic distribution, error distribution, power distribution, Rayleigh distribution.
Example:
- If Z is a normal random variable with mean μ and variance σ², then X = aZ + b is a normal random variable with mean aμ + b and variance a²σ².
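A quick Monte Carlo check of the linear rule (numpy assumed, arbitrary parameters): the sample mean and standard deviation of aZ + b should be close to aμ + b and |a|σ.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, a, b = 1.0, 2.0, -0.5, 3.0
x = a * rng.normal(mu, sigma, size=1_000_000) + b
print(x.mean(), a * mu + b)      # both close to 2.5
print(x.std(), abs(a) * sigma)   # both close to 1.0
```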
Reciprocal of a random variable
Distributions for which the reciprocal of a random variable stays within the same family include the Cauchy, F, and log-logistic distributions.
Examples:
- If X is a Cauchy(μ, σ) random variable, then 1/X is a Cauchy(μ/C, σ/C) random variable, where C = μ² + σ².
- If X is an F(ν1, ν2) random variable, then 1/X is an F(ν2, ν1) random variable.
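The F-distribution relation can be verified through its CDF, since P(1/X ≤ x) = 1 − P(X < 1/x). A small sketch (scipy assumed, arbitrary degrees of freedom):

```python
from scipy import stats

d1, d2, x = 5, 8, 1.3
lhs = stats.f.cdf(x, dfn=d2, dfd=d1)          # claimed CDF of 1/X, i.e. F(d2, d1)
rhs = 1 - stats.f.cdf(1 / x, dfn=d1, dfd=d2)  # CDF of 1/X computed from X ~ F(d1, d2)
assert abs(lhs - rhs) < 1e-12
```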
Other cases
Examples:
- If X is a beta(α, β) random variable, then 1 − X is a beta(β, α) random variable.
- If X is a binomial(n, p) random variable, then n − X is a binomial(n, 1 − p) random variable.
- If X is a continuous random variable with cumulative distribution function FX, then FX(X) is a standard uniform random variable (see the sketch after this list).
- If X is a normal random variable then eX is a lognormal random variable.
- If X is an exponential random variable with mean β, then X^(1/γ) is a Weibull random variable.
- The square of a standard normal random variable has a chi-squared distribution with one degree of freedom.
- If X is a Student's t random variable with ν degrees of freedom, then X² is an F(1, ν) random variable.
- If X is a double exponential random variable with mean 0 and scale λ, then |X| is an exponential random variable with mean λ.
- A geometric random variable is the floor of an exponential random variable.
- A rectangular random variable is the floor of a uniform random variable.
- A reciprocal random variable is the exponential of a uniform random variable.
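The cumulative-distribution relation above can also be read in reverse: applying a distribution's inverse CDF to a standard uniform random variable produces that distribution (inverse transform sampling). A minimal Monte Carlo sketch, assuming numpy and scipy, using the inverse CDF −ln(1 − u) of the exponential with mean 1:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
x = -np.log(1.0 - u)              # exponential(mean 1) inverse CDF applied to uniforms
print(stats.kstest(x, "expon"))   # Kolmogorov-Smirnov test: consistent with exponential(1)
```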
Functions of several variables
Sum of variables
The distribution of the sum of independent random variables is the convolution of their distributions. Suppose Z = X1 + X2 + ... + Xn is the sum of n independent random variables, each with probability mass function fXi. Then Z has probability mass function fZ = fX1 * fX2 * ... * fXn, the convolution of the individual mass functions.
If the sum has a distribution from the same family of distributions as the original variables, that family of distributions is said to be closed under convolution.
Examples of such univariate distributions are: normal distributions, Poisson distributions, binomial distributions, negative binomial distributions, gamma distributions, chi-squared distributions, Cauchy distributions, hyperexponential distributions.
Examples:
- If X1 and X2 are independent Poisson random variables with means μ1 and μ2 respectively, then X1 + X2 is a Poisson random variable with mean μ1 + μ2 (a numerical check follows this list).
- The sum of independent gamma random variables with a common scale parameter has a gamma distribution, with shape equal to the sum of the shapes.
- If X1 is a Cauchy(μ1, σ1) random variable and X2 is an independent Cauchy(μ2, σ2) random variable, then X1 + X2 is a Cauchy(μ1 + μ2, σ1 + σ2) random variable.
- If X1 and X2 are independent chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then X1 + X2 is a chi-squared random variable with ν1 + ν2 degrees of freedom.
- If X1 is a normal(μ1, σ1²) random variable and X2 is an independent normal(μ2, σ2²) random variable, then X1 + X2 is a normal(μ1 + μ2, σ1² + σ2²) random variable.
- The sum of N independent chi-squared random variables, each with 1 degree of freedom, has a chi-squared distribution with N degrees of freedom.
- The sum of n independent Bernoulli(p) random variables is a binomial(n, p) random variable.
- The sum of n independent geometric random variables with probability of success p is a negative binomial random variable with parameters n and p.
- The sum of n independent exponential random variables, each with mean β, is a gamma random variable with shape n and scale β.
- Equivalently, when the exponential random variables have a common rate parameter, their sum has an Erlang distribution, a special case of the gamma distribution.
- The sum of the squares of N independent standard normal random variables has a chi-squared distribution with N degrees of freedom.
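The convolution formula can be illustrated numerically for the Poisson example above; the sketch below (numpy and scipy assumed, arbitrary means) convolves two Poisson probability mass functions and compares the result with the Poisson pmf of mean μ1 + μ2.

```python
import numpy as np
from scipy import stats

mu1, mu2, kmax = 2.0, 3.5, 60
k = np.arange(kmax + 1)
# Convolve the two pmfs; entries up to kmax are exact values of P(X1 + X2 = k).
pmf_sum = np.convolve(stats.poisson.pmf(k, mu1), stats.poisson.pmf(k, mu2))[: kmax + 1]
assert np.allclose(pmf_sum, stats.poisson.pmf(k, mu1 + mu2), atol=1e-12)
```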
Product of variables
Example:
- If X1 and X2 are independent log-normal random variables with parameters (μ1, σ1²) and (μ2, σ2²) respectively, then X1X2 is a log-normal random variable with parameters (μ1 + μ2, σ1² + σ2²).
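A Monte Carlo sketch of the log-normal product rule (numpy and scipy assumed, arbitrary parameters); the Kolmogorov-Smirnov test compares X1X2 with the claimed log-normal distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu1, s1, mu2, s2 = 0.2, 0.5, -0.3, 0.8
x1 = rng.lognormal(mu1, s1, size=100_000)
x2 = rng.lognormal(mu2, s2, size=100_000)
s = np.hypot(s1, s2)  # sqrt(s1**2 + s2**2)
# scipy's lognorm uses shape s = sigma and scale = exp(mu).
print(stats.kstest(x1 * x2, "lognorm", args=(s, 0, np.exp(mu1 + mu2))))
```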
Minimum and maximum of independent random variables
The minimum of several independent random variables remains within the same family for the following distributions: Bernoulli distribution, geometric distribution, exponential distribution, extreme value distribution, Pareto distribution, Rayleigh distribution, Weibull distribution.
Examples:
- If X1 and X2 are independent geometric random variables with probability of success p1 and p2 respectively, then min(X1, X2) is a geometric random variable with probability of success p = p1 + p2 − p1p2. The relationship is simpler if expressed in terms of the probability of failure: q = q1q2.
- If X1 and X2 are independent exponential random variables with rates μ1 and μ2 respectively, then min(X1, X2) is an exponential random variable with rate μ = μ1 + μ2 (see the sketch below).
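The exponential case follows directly from survival functions, since P(min(X1, X2) > x) = P(X1 > x)P(X2 > x); a one-line numerical check (scipy assumed, arbitrary rates):

```python
from scipy import stats

mu1, mu2, x = 1.5, 0.7, 0.9
lhs = stats.expon.sf(x, scale=1 / mu1) * stats.expon.sf(x, scale=1 / mu2)  # P(X1 > x) * P(X2 > x)
rhs = stats.expon.sf(x, scale=1 / (mu1 + mu2))                             # survival of exponential with rate mu1 + mu2
assert abs(lhs - rhs) < 1e-12
```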
Similarly, the maximum of several independent random variables remains within the same family for the Bernoulli distribution and the power law distribution.
Other
- If X and Y are independent standard normal random variables, X/Y is a Cauchy random variable.
- If X1 and X2 are independent chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then (X1/ν1)/(X2/ν2) is an F(ν1, ν2) random variable.
- If X is a standard normal random variable and U is an independent chi-squared random variable with ν degrees of freedom, then X/√(U/ν) is a Student's t random variable with ν degrees of freedom.
- If X1 is a gamma(α1, θ) random variable and X2 is an independent gamma(α2, θ) random variable with the same scale θ, then X1/(X1 + X2) is a beta(α1, α2) random variable. More generally, if X1 is a gamma(α1, β1) random variable and X2 is an independent gamma(α2, β2) random variable, with β1 and β2 scale parameters, then β2X1/(β2X1 + β1X2) is a beta(α1, α2) random variable.
- If X and Y are independent exponential random variables with mean μ, then X − Y is a double exponential random variable with mean 0 and scale μ.
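A Monte Carlo sketch of the first relation in this list (numpy and scipy assumed): the ratio of two independent standard normal samples is compared with the standard Cauchy distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
z = rng.standard_normal((2, 200_000))
print(stats.kstest(z[0] / z[1], "cauchy"))  # consistent with the standard Cauchy distribution
```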
Approximate (limit) relationships
An approximate or limit relationship means either that the combination of an infinite number of iid random variables tends to some distribution, or that the limit when a parameter tends to some value approaches a different distribution.
Examples:
- Given certain conditions, the sum of a sufficiently large number of iid random variables, each with finite mean and variance, will be approximately normally distributed. This is the central limit theorem.
- X is a hypergeometric random variable, counting the successes in a sample of size n drawn without replacement from a population of size N containing m successes. If N and m are large compared to n, and p = m/N is not close to 0 or 1, then X approximately has a binomial(n, p) distribution.
- If X is a beta-binomial random variable with parameters (n, α, β), let p = α/(α + β); if α + β is large, then X approximately has a binomial(n, p) distribution.
- If X is a binomial(n, p) random variable with n large and np small, then X approximately has a Poisson(np) distribution (see the sketch at the end of this list).
- If X is a negative binomial random variable with r large, P near 1, and r(1 − P) = λ, then X approximately has a Poisson distribution with mean λ.
- If X is a Poisson random variable with large mean, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X.
- If X is a binomial(n, p) random variable with np and n(1 − p) large, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X, i.e. np and np(1 − p).
- If X is a beta random variable with parameters α and β equal and large, then X approximately has a normal distribution with the same mean and variance, i.e. mean α/(α + β) and variance αβ/((α + β)²(α + β + 1)).
- If X is a gamma(α, β) random variable and the shape parameter α is large relative to the scale parameter β, then X approximately has a normal distribution with the same mean and variance.
- If X is a Student's t random variable with a large number of degrees of freedom ν then X approximately has a standard normal distribution.
- If X is an F(ν, ω) random variable with ω large, then νX is approximately distributed as a chi-squared random variable with ν degrees of freedom.
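As a numerical illustration of the Poisson approximation to the binomial mentioned above (numpy and scipy assumed, arbitrary n and p), the sketch below measures how close the two probability mass functions are for a large n and small p.

```python
import numpy as np
from scipy import stats

n, p = 1000, 0.003
k = np.arange(30)  # essentially all of the probability mass lies in this range
tv = 0.5 * np.abs(stats.binom.pmf(k, n, p) - stats.poisson.pmf(k, n * p)).sum()
print(f"total variation distance ~ {tv:.1e}")  # prints a small value
```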
Compound (or Bayesian) relationships
Examples:
- If X | N is a binomial random variable, where parameter N is a random variable with negative-binomial distribution, then X is distributed as a negative-binomial.
- If X | N is a binomial random variable, where parameter N is a random variable with Poisson distribution, then X is distributed as a Poisson.
- If X | μ is a Poisson random variable and the parameter μ is a random variable with a gamma distribution, then X is distributed as a negative binomial; this is sometimes called the gamma-Poisson distribution (see the sketch at the end of this section).
Some compound distributions have been given their own names, including the beta-binomial distribution, the beta-Pascal distribution, and the gamma-normal distribution.
Examples:
- If X | p is a binomial random variable and the parameter p is a random variable with a beta distribution, then X is distributed as a beta-binomial.
- If X | p is a negative binomial random variable and the parameter p is a random variable with a beta distribution, then X is distributed as a beta-Pascal.
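A Monte Carlo sketch of the gamma-Poisson relation above (numpy and scipy assumed, arbitrary parameters): drawing the Poisson mean from a gamma(shape r, scale θ) distribution and comparing the resulting counts with a negative binomial with parameters n = r and p = 1/(1 + θ).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
r, theta, n_samples = 2.0, 1.5, 200_000
lam = rng.gamma(r, theta, size=n_samples)  # random Poisson means from Gamma(shape=r, scale=theta)
x = rng.poisson(lam)                       # gamma-Poisson mixture
k = np.arange(15)
empirical = np.array([(x == kk).mean() for kk in k])
theoretical = stats.nbinom.pmf(k, r, 1 / (1 + theta))
print(np.abs(empirical - theoretical).max())  # small Monte Carlo error
```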