Bernoulli distribution


In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability $p$ and the value 0 with probability $q = 1 - p$. Less formally, it can be thought of as a model for the set of possible outcomes of any single experiment that asks a yes–no question. Such questions lead to outcomes that are boolean-valued: a single bit whose value is success/yes/true/one with probability $p$ and failure/no/false/zero with probability $q$. It can be used to represent a coin toss where 1 and 0 would represent "heads" and "tails", respectively, and $p$ would be the probability of the coin landing on heads (or, with the labels swapped, of landing on tails). In particular, unfair coins would have $p \neq 1/2$.
The Bernoulli distribution is a special case of the binomial distribution where a single trial is conducted. It is also a special case of the two-point distribution, for which the possible outcomes need not be 0 and 1.
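As a concrete illustration of the coin-toss interpretation, the following is a minimal Python sketch (not part of the original article; the probability value is illustrative) that draws samples from a Bernoulli distribution using only the standard library:

```python
import random

def bernoulli_sample(p: float) -> int:
    """Return 1 ("heads") with probability p, else 0 ("tails")."""
    return 1 if random.random() < p else 0

# Example: an unfair coin with p = 0.7 (illustrative value).
tosses = [bernoulli_sample(0.7) for _ in range(10)]
print(tosses)
```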

Properties

If $X$ is a random variable with this distribution, then:
$$\Pr(X=1) = p = 1 - \Pr(X=0) = 1 - q.$$
The probability mass function $f$ of this distribution, over possible outcomes $k$, is
$$f(k;p) = \begin{cases} p & \text{if } k = 1, \\ q = 1-p & \text{if } k = 0. \end{cases}$$
This can also be expressed as
$$f(k;p) = p^{k}(1-p)^{1-k} \qquad \text{for } k \in \{0,1\},$$
or as
$$f(k;p) = pk + (1-p)(1-k) \qquad \text{for } k \in \{0,1\}.$$
The Bernoulli distribution is a special case of the binomial distribution with $n = 1$.
The kurtosis goes to infinity for high and low values of $p$, but for $p = 1/2$ the two-point distributions, including the Bernoulli distribution, have a lower excess kurtosis, namely $-2$, than any other probability distribution.
The Bernoulli distributions for $0 \le p \le 1$ form an exponential family.
The maximum likelihood estimator of $p$ based on a random sample is the sample mean.
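As a rough check that the two closed forms of the probability mass function above agree on $k \in \{0,1\}$, here is a short Python sketch (the function names are illustrative, not part of any established library API, and the value of $p$ is an assumption for the example):

```python
def pmf_cases(k: int, p: float) -> float:
    """Piecewise form: p for k = 1, and 1 - p for k = 0."""
    return p if k == 1 else 1 - p

def pmf_power(k: int, p: float) -> float:
    """Closed form p**k * (1 - p)**(1 - k), valid for k in {0, 1}."""
    return p ** k * (1 - p) ** (1 - k)

p = 0.3  # illustrative success probability
for k in (0, 1):
    assert abs(pmf_cases(k, p) - pmf_power(k, p)) < 1e-12
    print(k, pmf_cases(k, p))
```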

Mean

The expected value of a Bernoulli random variable $X$ is
$$\operatorname{E}[X] = p.$$
This is due to the fact that for a Bernoulli distributed random variable $X$ with $\Pr(X=1) = p$ and $\Pr(X=0) = q$ we find
$$\operatorname{E}[X] = \Pr(X=1)\cdot 1 + \Pr(X=0)\cdot 0 = p \cdot 1 + q \cdot 0 = p.$$
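The expectation can be checked empirically with a simple Monte Carlo sketch in Python (the value of $p$ and the sample size are illustrative assumptions); the sample mean is also the maximum likelihood estimate of $p$ mentioned above:

```python
import random

p = 0.25       # illustrative value
n = 100_000    # illustrative sample size
sample = [1 if random.random() < p else 0 for _ in range(n)]

# The sample mean should be close to the theoretical expectation E[X] = p.
print(sum(sample) / n)
```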

Variance

The variance of a Bernoulli distributed $X$ is
$$\operatorname{Var}[X] = pq = p(1-p).$$
We first find
$$\operatorname{E}[X^{2}] = \Pr(X=1)\cdot 1^{2} + \Pr(X=0)\cdot 0^{2} = p \cdot 1^{2} + q \cdot 0^{2} = p = \operatorname{E}[X].$$
From this follows
$$\operatorname{Var}[X] = \operatorname{E}[X^{2}] - \operatorname{E}[X]^{2} = \operatorname{E}[X] - \operatorname{E}[X]^{2} = p - p^{2} = p(1-p) = pq.$$
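A short Python sketch of the same computation (the value of $p$ is an illustrative assumption), making explicit that $\operatorname{E}[X]$ and $\operatorname{E}[X^2]$ coincide for a Bernoulli variable:

```python
p = 0.25  # illustrative value

# E[X] and E[X^2] coincide for a Bernoulli variable, since 0**2 == 0 and 1**2 == 1.
ex = p * 1 + (1 - p) * 0
ex2 = p * 1**2 + (1 - p) * 0**2

var = ex2 - ex**2
print(var, p * (1 - p))  # both equal 0.1875 for p = 0.25
```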

Skewness

The skewness is $\frac{q-p}{\sqrt{pq}} = \frac{1-2p}{\sqrt{pq}}$. When we take the standardized Bernoulli distributed random variable $\frac{X - \operatorname{E}[X]}{\sqrt{\operatorname{Var}[X]}}$ we find that this random variable attains $\frac{q}{\sqrt{pq}}$ with probability $p$ and attains $-\frac{p}{\sqrt{pq}}$ with probability $q$. Thus we get
$$\gamma_1 = \operatorname{E}\left[\left(\frac{X-\operatorname{E}[X]}{\sqrt{\operatorname{Var}[X]}}\right)^{3}\right] = p\cdot\left(\frac{q}{\sqrt{pq}}\right)^{3} + q\cdot\left(-\frac{p}{\sqrt{pq}}\right)^{3} = \frac{1}{\sqrt{pq}^{\,3}}\left(pq^{3} - qp^{3}\right) = \frac{pq}{\sqrt{pq}^{\,3}}(q-p) = \frac{q-p}{\sqrt{pq}}.$$
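The same calculation can be sketched numerically in Python (the value of $p$ is an illustrative assumption), confirming that the expectation of the cubed standardized variable matches the closed form $\frac{q-p}{\sqrt{pq}}$:

```python
from math import sqrt

p = 0.2          # illustrative value
q = 1 - p
sd = sqrt(p * q)

# Standardized outcomes (x - E[X]) / sd, weighted by their probabilities.
skew = p * (q / sd) ** 3 + q * (-p / sd) ** 3
print(skew, (q - p) / sd)  # the two expressions agree (1.5 for p = 0.2)
```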

Higher moments and cumulants

The central moment of order $k$ is given by
$$\mu_k = (1-p)(-p)^{k} + p(1-p)^{k}.$$
The first six central moments are
$$\begin{aligned}
\mu_1 &= 0, \\
\mu_2 &= p(1-p), \\
\mu_3 &= p(1-p)(1-2p), \\
\mu_4 &= p(1-p)\bigl(1-3p(1-p)\bigr), \\
\mu_5 &= p(1-p)(1-2p)\bigl(1-2p(1-p)\bigr), \\
\mu_6 &= p(1-p)\bigl(1-5p(1-p)\bigl(1-p(1-p)\bigr)\bigr).
\end{aligned}$$
The higher central moments can be expressed more compactly in terms of $\mu_2$ and $\mu_3$:
$$\begin{aligned}
\mu_4 &= \mu_2\bigl(1-3\mu_2\bigr), \\
\mu_5 &= \mu_3\bigl(1-2\mu_2\bigr), \\
\mu_6 &= \mu_2\bigl(1-5\mu_2(1-\mu_2)\bigr).
\end{aligned}$$
The first six cumulants are
$$\begin{aligned}
\kappa_1 &= p, \\
\kappa_2 &= \mu_2, \\
\kappa_3 &= \mu_3, \\
\kappa_4 &= \mu_2\bigl(1-6\mu_2\bigr), \\
\kappa_5 &= \mu_3\bigl(1-12\mu_2\bigr), \\
\kappa_6 &= \mu_2\bigl(1-30\mu_2(1-4\mu_2)\bigr).
\end{aligned}$$
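The compact expressions for the higher central moments can be verified numerically with a small Python sketch (the value of $p$ is an illustrative assumption), computing each $\mu_k = \operatorname{E}[(X-p)^k]$ directly from the two outcomes:

```python
p = 0.3  # illustrative value
q = 1 - p

def central_moment(k: int) -> float:
    """mu_k = E[(X - p)**k], computed directly from the two outcomes 0 and 1."""
    return q * (0 - p) ** k + p * (1 - p) ** k

# Compare with the compact expressions in terms of mu_2 and mu_3.
mu2, mu3 = central_moment(2), central_moment(3)
assert abs(central_moment(4) - mu2 * (1 - 3 * mu2)) < 1e-12
assert abs(central_moment(5) - mu3 * (1 - 2 * mu2)) < 1e-12
assert abs(central_moment(6) - mu2 * (1 - 5 * mu2 * (1 - mu2))) < 1e-12
print(mu2, mu3)
```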

Related distributions