Divergence (statistics)
In statistics and information geometry, a divergence or a contrast function is a function which establishes the "distance" from one probability distribution to another on a statistical manifold. A divergence is a weaker notion than a distance: in particular, a divergence need not be symmetric and need not satisfy the triangle inequality.
Definition
Suppose S is a space of all probability distributions with common support. Then a divergence on S is a function D(· ‖ ·) : S × S → ℝ satisfying
- D(p ‖ q) ≥ 0 for all p, q ∈ S,
- D(p ‖ q) = 0 if and only if p = q.
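A minimal numerical sketch of these two properties (assuming NumPy; the Kullback–Leibler divergence used here is defined in the Examples section below, and the helper name kl_divergence is illustrative):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p ‖ q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))  # positive for p ≠ q (non-negativity)
print(kl_divergence(p, p))  # exactly 0 only when the arguments coincide
print(kl_divergence(q, p))  # generally differs from D(p ‖ q): no symmetry is required
```

The third line illustrates the sense in which a divergence is weaker than a distance: swapping the arguments generally changes the value.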
Geometrical properties
Many properties of divergences can be derived if we restrict S to be a statistical manifold, meaning that it can be parametrized with a finite-dimensional coordinate system θ, so that for a distribution p ∈ S we can write p = p(θ).

For a pair of points p, q ∈ S with coordinates θ_p and θ_q, denote the partial derivatives of D(p ‖ q) as

  D((∂_i)_p ‖ q) ≝ ∂/∂θ_p^i D(p ‖ q),
  D((∂_i ∂_j)_p ‖ (∂_k)_q) ≝ ∂/∂θ_p^i ∂/∂θ_p^j ∂/∂θ_q^k D(p ‖ q), etc.

Now we restrict these functions to the diagonal p = q, and denote

  D[∂_i, ·] : p ↦ D((∂_i)_p ‖ p),
  D[∂_i, ∂_j] : p ↦ D((∂_i)_p ‖ (∂_j)_p), etc.

By definition, the function D(p ‖ q) is minimized at p = q, and therefore

  D[∂_i, ·] = 0,
  D[∂_i ∂_j, ·] = −D[∂_i, ∂_j] ≝ −g_ij(p),
where the matrix g = (g_ij) is positive semi-definite and defines a unique Riemannian metric on the manifold S.
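A numerical sketch of this construction (assuming NumPy; the Bernoulli family and the helper names are illustrative): the mixed second derivative of the Kullback–Leibler divergence across the diagonal, taken with a minus sign, should recover the Fisher information 1/(θ(1 − θ)) of the Bernoulli family.

```python
import numpy as np

def kl_bernoulli(tp, tq):
    """Kullback–Leibler divergence between Bernoulli(tp) and Bernoulli(tq)."""
    return tp * np.log(tp / tq) + (1 - tp) * np.log((1 - tp) / (1 - tq))

def induced_metric(theta, h=1e-4):
    """g(theta) = -d²D/dθ_p dθ_q on the diagonal θ_p = θ_q = theta,
    approximated by a central finite difference of the mixed partial."""
    mixed = (kl_bernoulli(theta + h, theta + h) - kl_bernoulli(theta + h, theta - h)
             - kl_bernoulli(theta - h, theta + h) + kl_bernoulli(theta - h, theta - h)) / (4 * h * h)
    return -mixed

theta = 0.3
print(induced_metric(theta))        # ≈ 4.7619
print(1 / (theta * (1 - theta)))    # Fisher information of Bernoulli(0.3): 4.7619...
```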
A divergence D also defines a unique torsion-free affine connection ∇^(D) with coefficients

  Γ_ij,k = −D[∂_i ∂_j, ∂_k],
and the dual to this connection ∇* is generated by the dual divergence D*, defined as D*(p ‖ q) ≝ D(q ‖ p).
Thus, a divergence D generates on a statistical manifold a unique dualistic structure (g^(D), ∇^(D), ∇^(D*)). The converse is also true: every torsion-free dualistic structure on a statistical manifold is induced from some globally defined divergence function.
For example, when D is an f-divergence for some function ƒ(·), it generates the metric g^(D_f) = c·g and the connection ∇^(D_f) = ∇^(α), where g is the canonical Fisher information metric, ∇^(α) is the α-connection, c = ƒ′′(1), and α = 3 + 2ƒ′′′(1)/ƒ′′(1).
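An illustrative evaluation of these formulas for the generator f(u) = −ln u of the Kullback–Leibler divergence (listed in the table below); the sign attached to α depends on the convention chosen for the α-connection:

```latex
% c = f''(1) and \alpha = 3 + 2 f'''(1)/f''(1) for f(u) = -\ln u (illustrative)
\[
  f(u) = -\ln u, \qquad f''(u) = u^{-2}, \qquad f'''(u) = -2u^{-3},
\]
\[
  c = f''(1) = 1, \qquad
  \alpha = 3 + \frac{2 f'''(1)}{f''(1)} = 3 + \frac{2(-2)}{1} = -1 .
\]
```

Under this convention the Kullback–Leibler divergence thus induces the Fisher metric itself (c = 1) and the α-connection with α = −1.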
Examples
The two most important divergences are the relative entropy (Kullback–Leibler divergence), which is central to information theory and statistics, and the squared Euclidean distance. Minimizing these two divergences is the main way that linear inverse problems are solved, via the principle of maximum entropy and least squares, notably in logistic regression and linear regression.

The two most important classes of divergences are the f-divergences and the Bregman divergences; however, other types of divergence functions are also encountered in the literature. The only divergence that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence; the squared Euclidean divergence is a Bregman divergence, but not an f-divergence.
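A brief sketch of this overlap for discrete distributions on the probability simplex, using the f-divergence and Bregman conventions of the two sections below: the Kullback–Leibler divergence arises from the generator f(u) = −ln u in the first sense and from the negative entropy F(x) = Σ_i x_i ln x_i in the second.

```latex
% Kullback--Leibler divergence as an f-divergence and as a Bregman divergence,
% for probability vectors p, q (so that \sum_i (p_i - q_i) = 0).
\[
  D_f(p \parallel q) = \sum_i p_i\, f\!\left(\tfrac{q_i}{p_i}\right)
                     = \sum_i p_i \ln \tfrac{p_i}{q_i},
  \qquad f(u) = -\ln u,
\]
\[
  B_F(p; q) = F(p) - F(q) - \langle \nabla F(q),\, p - q\rangle
            = \sum_i p_i \ln p_i - \sum_i p_i \ln q_i
            = \sum_i p_i \ln \tfrac{p_i}{q_i},
  \qquad F(x) = \sum_i x_i \ln x_i .
\]
```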
f-divergences
This family of divergences is generated through functions f(u), convex on u > 0 and such that f(1) = 0. Then an f-divergence is defined as

  D_f(p ‖ q) = ∫ p(x) f(q(x)/p(x)) dx.

Some important f-divergences, with their generators:

| Divergence | D_f(p ‖ q) | f(u) |
| --- | --- | --- |
| Kullback–Leibler divergence | ∫ p ln(p/q) dx | −ln u |
| squared Hellinger distance | ∫ (√p − √q)² dx | (√u − 1)² |
| Jeffreys divergence | ∫ (p − q) ln(p/q) dx | (u − 1) ln u |
| Chernoff's α-divergence | (4/(1 − α²)) (1 − ∫ p^((1−α)/2) q^((1+α)/2) dx) | (4/(1 − α²)) (1 − u^((1+α)/2)) |
| exponential divergence | ∫ p (ln(p/q))² dx | (ln u)² |
| Kagan's divergence | (1/2) ∫ (p − q)²/p dx | (1/2)(1 − u)² |
| (α, β)-product divergence | (2/((1 − α)(1 − β))) ∫ (1 − (q/p)^((1−α)/2)) (1 − (q/p)^((1−β)/2)) p dx | (2/((1 − α)(1 − β))) (1 − u^((1−α)/2)) (1 − u^((1−β)/2)) |
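A minimal sketch of the definition above for discrete distributions (assuming NumPy; the helper names and the choice of generators are illustrative):

```python
import numpy as np

def f_divergence(p, q, f):
    """Discrete f-divergence: D_f(p ‖ q) = Σ_i p_i · f(q_i / p_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * f(q / p)))

# A few generators from the table above
generators = {
    "Kullback–Leibler": lambda u: -np.log(u),
    "squared Hellinger": lambda u: (np.sqrt(u) - 1) ** 2,
    "Jeffreys": lambda u: (u - 1) * np.log(u),
    "Kagan": lambda u: 0.5 * (1 - u) ** 2,
}

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

for name, f in generators.items():
    print(f"{name}: {f_divergence(p, q, f):.6f}")
```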
If a Markov process has a positive equilibrium probability distribution P*, then D_f(P(t) ‖ P*) is a monotonically non-increasing function of time, where the probability distribution P(t) is a solution of the Kolmogorov forward equations (master equation), used to describe the time evolution of the probability distribution in the Markov process. This means that all f-divergences D_f(P(t) ‖ P*) are Lyapunov functions of the Kolmogorov forward equations. The converse statement is also true: if H(P) is a Lyapunov function for all Markov chains with positive equilibrium P* and is of the trace form H(P) = Σ_i h(P_i, P*_i), then H(P) = Σ_i P*_i f(P_i / P*_i) for some convex function f, i.e. H is an f-divergence. Bregman divergences in general do not have this property and can increase during a Markov process.
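A minimal discrete-time sketch of this monotonicity (assuming NumPy; the transition matrix is illustrative, and the same non-increasing behaviour holds for discrete-time chains by the data-processing property of f-divergences):

```python
import numpy as np

# Row-stochastic transition matrix of an irreducible Markov chain (illustrative)
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])

# Positive equilibrium distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

def kl(p, q):
    """Kullback–Leibler divergence for strictly positive discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

mu = np.array([0.8, 0.1, 0.1])    # initial distribution
for t in range(10):
    print(t, kl(mu, pi))          # non-increasing in t
    mu = mu @ P
```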
Bregman divergences
Bregman divergences correspond to convex functions on convex sets. Given a strictly convex, continuously differentiable function F on a convex set, known as the Bregman generator, the Bregman divergence measures the convexity of F: the error of the linear approximation of F from q as an approximation of the value at p:

  B_F(p; q) = F(p) − F(q) − ⟨∇F(q), p − q⟩.

The dual divergence to a Bregman divergence is the divergence generated by the convex conjugate F* of the Bregman generator of the original divergence. For example, for the squared Euclidean distance the generator is x², while for the relative entropy the generator is the negative entropy Σ x ln x.
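A minimal sketch of this construction (assuming NumPy; the function names are illustrative), checking that the generator ‖x‖² reproduces the squared Euclidean distance and that the negative entropy reproduces the relative entropy on the probability simplex:

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """Bregman divergence B_F(p; q) = F(p) − F(q) − ⟨∇F(q), p − q⟩."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(F(p) - F(q) - np.dot(grad_F(q), p - q))

sq_norm = lambda x: np.dot(x, x)                 # generator ‖x‖²
grad_sq_norm = lambda x: 2 * x

neg_entropy = lambda x: np.sum(x * np.log(x))    # generator Σ x_i ln x_i
grad_neg_entropy = lambda x: np.log(x) + 1

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(bregman(sq_norm, grad_sq_norm, p, q), np.sum((p - q) ** 2))                # equal
print(bregman(neg_entropy, grad_neg_entropy, p, q), np.sum(p * np.log(p / q)))   # equal
```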