Entropy power inequality


In information theory, the entropy power inequality is a result that relates to the so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam showed that the condition is in fact necessary.

Statement of the inequality

For a random variable X : Ω → Rn with probability density function f : Rn → R, the differential entropy of X, denoted h(X), is defined to be

h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, dx,

and the entropy power of X, denoted N(X), is defined to be

N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}.
In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K.
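To see where the normalization 1/(2πe) comes from, one can substitute the differential entropy of a Gaussian into the definition above; the following short derivation is a standard computation, included here only as an illustration.

h(X) = \frac{1}{2} \log\!\big( (2\pi e)^n |K| \big)
\quad\Longrightarrow\quad
N(X) = \frac{1}{2\pi e}\, \exp\!\Big( \frac{2}{n} \cdot \frac{1}{2} \log\big( (2\pi e)^n |K| \big) \Big)
= \frac{1}{2\pi e}\,(2\pi e)\,|K|^{1/n} = |K|^{1/n}.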
Let X and Y be independent random variables with probability density functions in the Lp space Lp(Rn) for some p > 1. Then

N(X + Y) \geq N(X) + N(Y).
Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
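For independent Gaussian vectors the inequality can be checked directly, since the sum of independent Gaussians is again Gaussian; the following restatement, with covariance matrices K_X and K_Y introduced only for this example, is a standard observation. If X and Y are independent Gaussians with covariance matrices K_X and K_Y, then X + Y has covariance matrix K_X + K_Y, and the entropy power inequality reads

|K_X + K_Y|^{1/n} \geq |K_X|^{1/n} + |K_Y|^{1/n},

which is Minkowski's determinant inequality; for positive definite covariance matrices, equality holds exactly when K_X and K_Y are proportional, matching the equality condition above.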