Weakly dependent random variables


In probability theory, weak dependence of random variables is a generalization of independence that is weaker than the concept of a martingale. A sequence of random variables is weakly dependent if distinct blocks of the sequence have a covariance that asymptotically decreases to 0 as the blocks are separated further in time. Weak dependence primarily appears as a technical condition in various probabilistic limit theorems.

Formal definition

Fix a set S ⊆ ℝ, a sequence of sets of measurable functions F = (F_n)_{n∈ℕ} with F_n ⊆ L^∞(S^n), a decreasing sequence θ = (θ_r) ↓ 0, and a function ψ. A sequence (X_n) of random variables valued in S is (F, ψ, θ)-weakly dependent iff, for all u, v, r ∈ ℕ, for all f ∈ F_u and g ∈ F_v, and for all indices i_1 ≤ ⋯ ≤ i_u ≤ i_u + r ≤ j_1 ≤ ⋯ ≤ j_v, we have

|Cov(f(X_{i_1}, …, X_{i_u}), g(X_{j_1}, …, X_{j_v}))| ≤ ψ(f, g, u, v) θ_r.

Note that the covariance does not decay to 0 uniformly in f and g.
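The covariance-decay condition can be illustrated numerically. The sketch below (not from the article; all names and parameters are illustrative) simulates a stationary AR(1) process, a standard example of a weakly dependent sequence, and estimates Cov(X_0, X_r) across realizations with f = g = the identity. The covariance decays geometrically in the gap r, playing the role of θ_r:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many independent realizations of a stationary AR(1) process
# X_{t+1} = phi * X_t + eps_t. For this process,
# Cov(X_i, X_{i+r}) = phi^r * Var(X_0), which decays geometrically in r.
phi = 0.7
n_paths, n_steps = 20000, 60
x = np.empty((n_paths, n_steps))
x[:, 0] = rng.normal(0, 1 / np.sqrt(1 - phi**2), n_paths)  # stationary start
for t in range(1, n_steps):
    x[:, t] = phi * x[:, t - 1] + rng.normal(0, 1, n_paths)

def lag_cov(r):
    """Empirical Cov(X_0, X_r) across realizations (f = g = identity)."""
    return float(np.mean(x[:, 0] * x[:, r]) - np.mean(x[:, 0]) * np.mean(x[:, r]))

covs = {r: lag_cov(r) for r in (1, 5, 10, 40)}
print(covs)
```

For φ = 0.7 the estimated covariance at lag 40 is orders of magnitude smaller than at lag 1, mirroring the decreasing sequence θ_r in the definition.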

Common applications

Weak dependence is a sufficiently weak condition that many natural instances of stochastic processes exhibit it. In particular, weak dependence is a natural condition for the ergodic theory of random functions.
Weak dependence serves as a sufficient substitute for independence in the Lindeberg–Lévy central limit theorem. For this reason, specializations often appear in the probability literature on limit theorems. These include Withers' condition for strong mixing, Tran's "absolute regularity in the locally transitive sense," and Birkel's "asymptotic quadrant independence."
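The central-limit behavior under weak dependence can be checked by simulation. The sketch below (my own illustration, not from the article) forms normalized partial sums of a stationary AR(1) sequence; under weak dependence they are approximately normal, but with the long-run variance σ²/(1 − φ)², which sums the autocovariances, rather than the marginal variance of the i.i.d. case:

```python
import numpy as np

rng = np.random.default_rng(1)

# Partial sums of a stationary AR(1) process X_{t+1} = phi*X_t + eps_t,
# normalized by sqrt(n). Under weak dependence these are approximately
# normal with the long-run variance 1 / (1 - phi)^2 (innovation variance 1),
# which accounts for the covariance terms ignored in the i.i.d. CLT.
phi, n_paths, n = 0.7, 20000, 500
x = np.empty((n_paths, n))
x[:, 0] = rng.normal(0, 1 / np.sqrt(1 - phi**2), n_paths)  # stationary start
for t in range(1, n):
    x[:, t] = phi * x[:, t - 1] + rng.normal(0, 1, n_paths)

s = x.sum(axis=1) / np.sqrt(n)       # normalized partial sums
long_run_var = 1.0 / (1 - phi) ** 2  # about 11.1 for phi = 0.7
print(s.mean(), s.var(), long_run_var)
```

The sample mean of the normalized sums is near 0 and their variance is near the long-run variance, as the weakly dependent CLT predicts.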
Weak dependence also functions as a substitute for strong mixing. Again, generalizations of the latter are specializations of the former; an example is Rosenblatt's mixing condition.
Other uses include a generalization of the Marcinkiewicz–Zygmund inequality and Rosenthal inequalities.
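For context, the classical Marcinkiewicz–Zygmund inequality in the independent case (the statement below is the standard form for independent, mean-zero random variables, not the weakly dependent generalization the article refers to) reads:

```latex
A_p \, \mathbb{E}\!\left[\Big(\sum_{i=1}^{n} X_i^2\Big)^{p/2}\right]
\;\le\;
\mathbb{E}\!\left[\Big|\sum_{i=1}^{n} X_i\Big|^{p}\right]
\;\le\;
B_p \, \mathbb{E}\!\left[\Big(\sum_{i=1}^{n} X_i^2\Big)^{p/2}\right],
\qquad p \ge 1,
```

where the constants A_p and B_p depend only on p. The generalizations mentioned above replace independence with a weak-dependence condition at the cost of modified constants.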
Martingales are weakly dependent, so many results about martingales also hold true for weakly dependent sequences. An example is Bernstein's bound on higher moments, which can be relaxed to only require