Brascamp–Lieb inequality


In mathematics, the Brascamp–Lieb inequality is either of two inequalities. The first is a result in geometry concerning integrable functions on n-dimensional Euclidean space. It generalizes the Loomis–Whitney inequality and Hölder's inequality. The second is a result of probability theory which gives a concentration inequality for log-concave probability distributions. Both are named after Herm Jan Brascamp and Elliott H. Lieb.

The geometric inequality

Fix natural numbers m and n. For 1 ≤ i ≤ m, let n_i ∈ N and let c_i > 0 so that
$$\sum_{i=1}^{m} c_i n_i = n.$$
Choose non-negative, integrable functions
$$f_i \in L^1\left(\mathbb{R}^{n_i}; [0, +\infty]\right)$$
and surjective linear maps
$$B_i : \mathbb{R}^{n} \to \mathbb{R}^{n_i}.$$
Then the following inequality holds:
$$\int_{\mathbb{R}^{n}} \prod_{i=1}^{m} f_i\left(B_i x\right)^{c_i} \, \mathrm{d}x \leq D^{-1/2} \prod_{i=1}^{m} \left( \int_{\mathbb{R}^{n_i}} f_i(y) \, \mathrm{d}y \right)^{c_i},$$
where D is given by
$$D = \inf\left\{ \left. \frac{\det\left( \sum_{i=1}^{m} c_i B_i^{\mathrm{T}} A_i B_i \right)}{\prod_{i=1}^{m} \left(\det A_i\right)^{c_i}} \;\right|\; A_i \text{ is a positive-definite } n_i \times n_i \text{ matrix} \right\}.$$
Another way to state this is that the constant D is what one would obtain by restricting attention to the case in which each f_i is a centered Gaussian function, namely f_i(y) = exp(−⟨y, A_i y⟩).
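As a concrete numerical sketch (assuming NumPy and SciPy are available; the configuration of three unit vectors in R^2 with exponents c_i = 2/3, the test functions f_i(y) = exp(−|y|), and the helper name bl_constant are illustrative choices, not part of the statement above), one can approximate D by minimizing over the A_i, which reduce to positive scalars in the rank-one case n_i = 1, and check the inequality on a grid:

```python
import numpy as np
from scipy.optimize import minimize

# Three unit vectors in R^2 at 60-degree spacing with exponents c_i = 2/3,
# so that sum_i c_i n_i = 2 = n and sum_i c_i u_i u_i^T = I_2.
angles = np.array([0.0, np.pi / 3, 2 * np.pi / 3])
U = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # rows are the u_i
c = np.full(3, 2.0 / 3.0)

def bl_constant(U, c):
    """Approximate D = inf det(sum_i c_i a_i u_i u_i^T) / prod_i a_i^{c_i} over a_i > 0."""
    def objective(log_a):
        a = np.exp(log_a)  # optimize in log-space to keep the a_i positive
        M = sum(ci * ai * np.outer(ui, ui) for ci, ai, ui in zip(c, a, U))
        return np.log(np.linalg.det(M)) - np.dot(c, log_a)
    res = minimize(objective, x0=np.zeros(len(c)), method="Nelder-Mead")
    return float(np.exp(res.fun))

D = bl_constant(U, c)
print("D ~", D)  # close to 1 for this configuration

# Check the inequality on a truncated grid for the non-Gaussian choice f_i(y) = exp(-|y|).
f = lambda y: np.exp(-np.abs(y))
xs = np.linspace(-12.0, 12.0, 1201)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs)
integrand = np.ones_like(X)
for ci, ui in zip(c, U):
    integrand *= f(ui[0] * X + ui[1] * Y) ** ci
lhs = integrand.sum() * dx * dx                                    # integral of prod_i f_i(B_i x)^{c_i}
rhs = D ** -0.5 * np.prod([(f(xs).sum() * dx) ** ci for ci in c])  # D^{-1/2} prod_i (int f_i)^{c_i}
print("LHS ~", lhs, "<= RHS ~", rhs)
```

For this configuration the printed left-hand side (about 3.9) stays below the right-hand side (about 4.0), and D comes out close to 1 because the vectors satisfy the hypothesis of the geometric case discussed below.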

Relationships to other inequalities

The geometric Brascamp–Lieb inequality

The geometric Brascamp–Lieb inequality is a special case of the above, and was used by Keith Ball, in 1989, to provide upper bounds for volumes of central sections of cubes.
For i = 1, ..., m, let c_i > 0 and let u_i ∈ S^{n−1} be a unit vector; suppose that c_i and u_i satisfy
$$x = \sum_{i=1}^{m} c_i (x \cdot u_i) u_i$$
for all x in R^n. Let f_i ∈ L^1(R; [0, +∞]) for each i = 1, ..., m. Then
$$\int_{\mathbb{R}^{n}} \prod_{i=1}^{m} f_i(x \cdot u_i)^{c_i} \, \mathrm{d}x \leq \prod_{i=1}^{m} \left( \int_{\mathbb{R}} f_i(y) \, \mathrm{d}y \right)^{c_i}.$$
The geometric Brascamp–Lieb inequality follows from the Brascamp–Lieb inequality as stated above by taking n_i = 1 and B_i x = x · u_i. Then, for z_i ∈ R,
$$B_i^{\mathrm{T}} z_i = z_i u_i.$$
It follows that D = 1 in this case.
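In brief, the reduction works as follows (a sketch of the standard argument): with n_i = 1, each A_i is a positive scalar a_i, and B_i^T a_i B_i = a_i u_i u_i^T, so
$$D = \inf\left\{ \left. \frac{\det\left( \sum_{i=1}^{m} c_i a_i u_i u_i^{\mathrm{T}} \right)}{\prod_{i=1}^{m} a_i^{c_i}} \;\right|\; a_i > 0 \right\}.$$
Taking every a_i = 1 makes the numerator det(I_n) = 1 by the hypothesis on the u_i, so D ≤ 1, while the determinant inequality det(∑_i c_i a_i u_i u_i^T) ≥ ∏_i a_i^{c_i}, which holds under the same hypothesis and was used by Ball in this context, gives D ≥ 1.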

Hölder's inequality

As another special case, take n_i = n, B_i = id, the identity map on R^n, replacing f_i by f_i^{1/c_i}, and let c_i = 1/p_i for 1 ≤ i ≤ m. Then
$$\sum_{i=1}^{m} \frac{1}{p_i} = 1$$
and the log-concavity of the determinant of a positive definite matrix implies that D = 1. This yields Hölder's inequality in R^n:
$$\int_{\mathbb{R}^{n}} \prod_{i=1}^{m} f_i(x) \, \mathrm{d}x \leq \prod_{i=1}^{m} \left\| f_i \right\|_{p_i}.$$
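Spelled out (a sketch of the standard argument), the determinant step is the concavity of log det on positive definite matrices: since the weights 1/p_i sum to 1,
$$\det\left( \sum_{i=1}^{m} \frac{1}{p_i} A_i \right) \geq \prod_{i=1}^{m} \left(\det A_i\right)^{1/p_i},$$
so the ratio defining D is at least 1, and choosing A_1 = ⋯ = A_m makes it exactly 1, whence D = 1.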

The concentration inequality

Consider a probability density function p(x) = exp(−φ(x)). This probability density function p is said to be a log-concave measure if the function φ is convex. Such probability density functions have tails which decay exponentially fast, so most of the probability mass resides in a small region around the mode of p. The Brascamp–Lieb inequality gives another characterization of the concentration of p by bounding the variance of any statistic S(x).
Formally, let S(x) be any differentiable function. The Brascamp–Lieb inequality reads:
$$\operatorname{var}_p\big(S(x)\big) \leq E_p\left( \nabla^{\mathrm{T}} S(x)\, \big[H \varphi(x)\big]^{-1}\, \nabla S(x) \right),$$
where H is the Hessian and ∇ is the nabla symbol.
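As a minimal Monte Carlo sketch (assuming NumPy; the standard Gaussian density, the statistic S(x) = sin x, and the sample size are arbitrary illustrative choices), the bound can be checked in one dimension, where φ(x) = x²/2 and Hφ ≡ 1:

```python
import numpy as np

# One-dimensional standard Gaussian: p(x) is proportional to exp(-phi(x)) with
# phi(x) = x^2 / 2, so the Hessian H(phi) is identically 1.  Statistic: S(x) = sin(x).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

S = np.sin(x)   # values of the statistic S(x)
dS = np.cos(x)  # its derivative S'(x)

lhs = S.var()                  # var_p(S)
rhs = np.mean(dS ** 2 / 1.0)   # E_p[ S'(x) H(phi)^{-1} S'(x) ], with H(phi) = 1
print(lhs, "<=", rhs)          # roughly 0.432 <= 0.568
```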

Relationship with other inequalities

The Brascamp–Lieb inequality is an extension of the Poincaré inequality, which only concerns Gaussian probability distributions.
The Brascamp–Lieb inequality is also related to the Cramér–Rao bound. While Brascamp–Lieb gives an upper bound, the Cramér–Rao bound gives a lower bound on the same variance var_p(S(x)). The expressions are almost identical:
$$\operatorname{var}_p\big(S(x)\big) \geq E_p\left( \nabla^{\mathrm{T}} S(x) \right) \big[E_p\big(H \varphi(x)\big)\big]^{-1} E_p\left( \nabla S(x) \right).$$
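Continuing the same Gaussian sketch (again assuming NumPy, with the same arbitrary statistic S(x) = sin x), the two bounds can be compared side by side; for the standard Gaussian Hφ ≡ 1, so E_p(Hφ) = 1 as well:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)  # samples from the standard Gaussian

S, dS = np.sin(x), np.cos(x)        # statistic S(x) and its derivative S'(x)
variance = S.var()                  # var_p(S)

upper = np.mean(dS ** 2)            # Brascamp-Lieb:  E_p[ S'(x)^2 / H(phi) ],  H(phi) = 1
lower = np.mean(dS) ** 2            # Cramer-Rao:     E_p[S'(x)]^2 / E_p[H(phi)]
print(lower, "<=", variance, "<=", upper)  # roughly 0.368 <= 0.432 <= 0.568
```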
Further discussion of both points can be found in "Log-concavity and strong log-concavity: A review" by A. Saumard and J. Wellner.