Method of moments (statistics)


In statistics, the method of moments is a method of estimation of population parameters.
It starts by expressing the population moments as functions of the parameters of interest. Those expressions are then set equal to the sample moments. The number of such equations is the same as the number of parameters to be estimated. Those equations are then solved for the parameters of interest. The solutions are estimates of those parameters.
The method of moments was introduced by Pafnuty Chebyshev in 1887 in the proof of the central limit theorem. The idea of matching empirical moments of a distribution to the population moments dates back at least to Pearson.

Method

Suppose that the problem is to estimate $k$ unknown parameters $\theta_1, \theta_2, \dots, \theta_k$ characterizing the distribution $f_W(w; \theta)$ of the random variable $W$. Suppose the first $k$ moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$s:

$$\mu_1 \equiv \operatorname{E}[W] = g_1(\theta_1, \theta_2, \ldots, \theta_k),$$
$$\mu_2 \equiv \operatorname{E}[W^2] = g_2(\theta_1, \theta_2, \ldots, \theta_k),$$
$$\vdots$$
$$\mu_k \equiv \operatorname{E}[W^k] = g_k(\theta_1, \theta_2, \ldots, \theta_k).$$

Suppose a sample of size $n$ is drawn, resulting in the values $w_1, \dots, w_n$. For $j = 1, \dots, k$, let

$$\widehat{\mu}_j = \frac{1}{n} \sum_{i=1}^{n} w_i^{\,j}$$

be the $j$-th sample moment, an estimate of $\mu_j$. The method of moments estimator for $\theta_1, \theta_2, \dots, \theta_k$, denoted by $\widehat{\theta}_1, \widehat{\theta}_2, \dots, \widehat{\theta}_k$, is defined as the solution (if there is one) to the equations:

$$\widehat{\mu}_j = g_j(\widehat{\theta}_1, \widehat{\theta}_2, \ldots, \widehat{\theta}_k), \qquad j = 1, 2, \ldots, k.$$
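As a concrete illustration of this procedure (the specific distribution here is our choice, not part of the definition above), the following Python sketch estimates the shape $\alpha$ and scale $\theta$ of a Gamma distribution, whose first two population moments are $\mu_1 = \alpha\theta$ and $\mu_2 = \alpha\theta^2 + \alpha^2\theta^2$; inverting these gives closed-form estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: data drawn from a Gamma(alpha, theta) distribution.
alpha_true, theta_true = 2.0, 3.0
w = rng.gamma(shape=alpha_true, scale=theta_true, size=10_000)

# Sample moments: mu_hat_j = (1/n) * sum_i w_i**j
mu1 = np.mean(w)
mu2 = np.mean(w**2)

# Solve the moment equations mu_1 = alpha*theta,
# mu_2 = alpha*theta**2 + (alpha*theta)**2:
theta_hat = (mu2 - mu1**2) / mu1  # variance divided by mean
alpha_hat = mu1 / theta_hat       # mean divided by scale

print(f"alpha_hat = {alpha_hat:.3f} (true {alpha_true}), "
      f"theta_hat = {theta_hat:.3f} (true {theta_true})")
```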

Advantages and disadvantages

The method of moments is fairly simple and yields consistent estimators, though these estimators are often biased.
In some respects, when estimating parameters of a known family of probability distributions, this method has been superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have a higher probability of being close to the quantities to be estimated and are more often unbiased.
However, in some cases the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be computed much more quickly and easily. Because they are so easy to compute, method-of-moments estimates may be used as a first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton-Raphson method. In this way the method of moments can assist in finding maximum likelihood estimates.
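To make that connection concrete, here is a hedged sketch continuing the hypothetical Gamma example from above (the choice of distribution and all variable names are ours): the Gamma shape parameter has no closed-form maximum likelihood estimator, but the method-of-moments estimate is a convenient starting point for Newton-Raphson iteration on the profile likelihood equation $\ln\alpha - \psi(\alpha) = \ln\bar{w} - \overline{\ln w}$, where $\psi$ is the digamma function.

```python
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(1)
w = rng.gamma(shape=2.0, scale=3.0, size=10_000)  # hypothetical data

# Method-of-moments estimate of the shape, used as the initial value.
mu1, mu2 = np.mean(w), np.mean(w**2)
alpha = mu1**2 / (mu2 - mu1**2)

# Newton-Raphson on f(alpha) = ln(alpha) - digamma(alpha) - s = 0,
# where s = ln(mean(w)) - mean(ln(w)).
s = np.log(np.mean(w)) - np.mean(np.log(w))
for _ in range(50):
    f = np.log(alpha) - digamma(alpha) - s
    f_prime = 1.0 / alpha - polygamma(1, alpha)
    step = f / f_prime
    alpha -= step
    if abs(step) < 1e-10:
        break

theta = np.mean(w) / alpha  # MLE of the scale given the shape
print(f"MLE: alpha = {alpha:.4f}, theta = {theta:.4f}")
```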
In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments fall outside of the parameter space; it then does not make sense to rely on them. That problem never arises with the method of maximum likelihood. Also, estimates by the method of moments are not necessarily sufficient statistics; that is, they sometimes fail to take into account all relevant information in the sample.
When estimating other structural parameters (for example, parameters of a utility function rather than of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to maximum likelihood estimation.

Examples

An example application of the method of moments is the estimation of polynomial probability density distributions. In this case, an approximating polynomial of order $n$ is defined on an interval $[a, b]$. The method of moments then yields a system of equations, whose solution involves the inversion of a Hankel matrix.
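Below is a minimal sketch of that construction under our own assumptions (the function name and the uniform-distribution test case are illustrative, not from the article): matching the moments $\mu_0, \dots, \mu_n$ of a density $p(x) = \sum_k c_k x^k$ on $[a, b]$ gives the linear system $\sum_k c_k \,(b^{j+k+1} - a^{j+k+1})/(j+k+1) = \mu_j$, whose coefficient matrix depends only on $j+k$ and is therefore a Hankel matrix.

```python
import numpy as np

def polynomial_density_coeffs(moments, a, b):
    """Coefficients c_0..c_n of p(x) = sum_k c_k x**k on [a, b] matching
    the given moments mu_0..mu_n (mu_0 = 1 for a normalized density)."""
    n = len(moments) - 1
    # H[j, k] = integral of x**(j+k) over [a, b]; the entries depend only
    # on j + k, so H is a Hankel matrix.
    H = np.empty((n + 1, n + 1))
    for j in range(n + 1):
        for k in range(n + 1):
            H[j, k] = (b**(j + k + 1) - a**(j + k + 1)) / (j + k + 1)
    return np.linalg.solve(H, np.asarray(moments, dtype=float))

# Sanity check: the moments 1, 1/2, 1/3 of the uniform density on [0, 1]
# should be matched by the constant polynomial p(x) = 1.
print(polynomial_density_coeffs([1.0, 0.5, 1.0 / 3.0], a=0.0, b=1.0))
# -> approximately [1, 0, 0]
```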

Uniform distribution

Consider the uniform distribution on the interval $[a, b]$, $U(a, b)$. If $W \sim U(a, b)$ then we have

$$\mu_1 = \operatorname{E}[W] = \tfrac{1}{2}(a + b),$$
$$\mu_2 = \operatorname{E}[W^2] = \tfrac{1}{3}\left(a^2 + ab + b^2\right).$$

Solving these equations gives

$$\widehat{a} = \mu_1 - \sqrt{3\left(\mu_2 - \mu_1^2\right)},$$
$$\widehat{b} = \mu_1 + \sqrt{3\left(\mu_2 - \mu_1^2\right)}.$$

Given a set of samples $\{w_i\}$ we can use the sample moments $\widehat{\mu}_1$ and $\widehat{\mu}_2$ in these formulae in order to estimate $a$ and $b$.

Note, however, that this method can produce inconsistent results in some cases. For example, the set of samples $\{0, 0, 0, 0, 1\}$ results in the estimates $\widehat{a} = \tfrac{1}{5} - \tfrac{2\sqrt{3}}{5}$ and $\widehat{b} = \tfrac{1}{5} + \tfrac{2\sqrt{3}}{5}$, even though $\widehat{b} \approx 0.893 < 1$, and so it is impossible for the set to have been drawn from $U(\widehat{a}, \widehat{b})$ in this case.
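The following sketch (the helper name is ours) applies these formulae, reproducing both a well-behaved case and the inconsistent case just described.

```python
import numpy as np

def uniform_mom(w):
    """Method-of-moments estimates of (a, b) for samples w from U(a, b)."""
    mu1 = np.mean(w)
    mu2 = np.mean(np.square(w))
    half_width = np.sqrt(3.0 * (mu2 - mu1**2))
    return mu1 - half_width, mu1 + half_width

# Well-behaved case: samples actually drawn from U(2, 5).
rng = np.random.default_rng(2)
print(uniform_mom(rng.uniform(2.0, 5.0, size=10_000)))  # close to (2, 5)

# Inconsistent case from the text: b_hat < 1 = max(w), so the sample
# could not have been drawn from U(a_hat, b_hat).
print(uniform_mom(np.array([0.0, 0.0, 0.0, 0.0, 1.0])))  # ~(-0.49, 0.89)
```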