Wald's equation


In probability theory, Wald's equation, Wald's identity or Wald's lemma is an important identity that simplifies the calculation of the expected value of the sum of a random number of random quantities. In its simplest form, it relates the expectation of a sum of randomly many finite-mean, independent and identically distributed random variables to the expected number of terms in the sum and the random variables' common expectation under the condition that the number of terms in the sum is independent of the summands.
The equation is named after the mathematician Abraham Wald. An identity for the second moment is given by the Blackwell–Girshick equation.

Basic version

Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of real-valued, independent and identically distributed random variables and let $N \ge 0$ be an integer-valued random variable that is independent of the sequence $(X_n)_{n\in\mathbb{N}}$. Suppose that $N$ and the $X_n$ have finite expectations. Then

$$\operatorname{E}[X_1 + \dots + X_N] = \operatorname{E}[N]\, \operatorname{E}[X_1]\,.$$

Example

Roll a six-sided die. Take the number on the die (call it $N$) and roll that number of six-sided dice to get the numbers $X_1, \dots, X_N$, and add up their values. By Wald's equation, the resulting value on average is

$$\operatorname{E}[N]\, \operatorname{E}[X] = 3.5 \times 3.5 = 12.25\,.$$
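The following Monte Carlo sketch (an illustration added here, not part of the classical statement; the sample size is an arbitrary choice) estimates this average:

```python
import random

def dice_trial(rng: random.Random) -> int:
    """Roll one die for N, then sum N further die rolls (this is S_N)."""
    n = rng.randint(1, 6)  # the random number of terms N
    return sum(rng.randint(1, 6) for _ in range(n))

rng = random.Random(0)
trials = 100_000
print(sum(dice_trial(rng) for _ in range(trials)) / trials)
# prints a value close to E[N] * E[X] = 3.5 * 3.5 = 12.25
```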

General version

Let $(X_n)_{n\in\mathbb{N}}$ be an infinite sequence of real-valued random variables and let $N$ be a nonnegative integer-valued random variable.

Assume that:

1. $(X_n)_{n\in\mathbb{N}}$ are all integrable (finite-mean) random variables,
2. $\operatorname{E}[X_n \mathbf{1}_{\{N \ge n\}}] = \operatorname{E}[X_n] \operatorname{P}(N \ge n)$ for every natural number $n$, and
3. the infinite series satisfies $\sum_{n=1}^\infty \operatorname{E}\bigl[|X_n| \mathbf{1}_{\{N \ge n\}}\bigr] < \infty$.

Then the random sums

$$S_N := \sum_{n=1}^N X_n, \qquad T_N := \sum_{n=1}^N \operatorname{E}[X_n]$$

are integrable and

$$\operatorname{E}[S_N] = \operatorname{E}[T_N]\,.$$

If, in addition,

4. $(X_n)_{n\in\mathbb{N}}$ all have the same expectation, and
5. $N$ has finite expectation,

then

$$\operatorname{E}[S_N] = \operatorname{E}[N]\, \operatorname{E}[X_1]\,.$$

Remark: Usually, the name Wald's equation refers to this last equality.

Discussion of assumptions

Clearly, assumption (1) is needed to formulate assumption (2) and Wald's equation. Assumption (2) controls the amount of dependence allowed between the sequence $(X_n)_{n\in\mathbb{N}}$ and the number $N$ of terms; see the counterexample below for the necessity. Note that assumption (2) is satisfied when $N$ is a stopping time for a sequence of independent random variables $(X_n)_{n\in\mathbb{N}}$. Assumption (3) is of a more technical nature, implying absolute convergence and therefore allowing arbitrary rearrangement of an infinite series in the proof.
If assumption (5) is satisfied, then assumption (3) can be strengthened to the simpler condition

6. there exists a real constant $C$ such that $\operatorname{E}\bigl[|X_n| \mathbf{1}_{\{N \ge n\}}\bigr] \le C \operatorname{P}(N \ge n)$ for all natural numbers $n$.

Indeed, using assumption (6),

$$\sum_{n=1}^\infty \operatorname{E}\bigl[|X_n| \mathbf{1}_{\{N \ge n\}}\bigr] \le C \sum_{n=1}^\infty \operatorname{P}(N \ge n)\,,$$

and the last series equals the expectation of $N$ by the tail-sum formula $\operatorname{E}[N] = \sum_{n=1}^\infty \operatorname{P}(N \ge n)$, which is finite by assumption (5). Therefore, (5) and (6) imply assumption (3).
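To make the tail-sum formula concrete, here is a small numerical sketch (the Poisson law for $N$ and the truncation at 200 terms are arbitrary choices for illustration):

```python
from math import exp, factorial

lam = 4.0  # rate of an arbitrarily chosen Poisson-distributed N
pmf = [exp(-lam) * lam**n / factorial(n) for n in range(200)]

mean_direct = sum(n * p for n, p in enumerate(pmf))    # E[N]
mean_tails = sum(sum(pmf[n:]) for n in range(1, 200))  # sum of P(N >= n)
print(mean_direct, mean_tails)  # both approximately lam = 4.0
```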
Assume in addition to (1) and (5) that

7. $N$ is independent of the sequence $(X_n)_{n\in\mathbb{N}}$, and
8. there exists a real constant $C$ such that $\operatorname{E}[|X_n|] \le C$ for all natural numbers $n$.

Then all the assumptions (1), (2), (5) and (6), hence also (3), are satisfied. In particular, the conditions (4) and (8) are satisfied if

9. the random variables $(X_n)_{n\in\mathbb{N}}$ all have the same distribution.

Note that the random variables of the sequence $(X_n)_{n\in\mathbb{N}}$ don't need to be independent.
The interesting point is to admit some dependence between the random number $N$ of terms and the sequence $(X_n)_{n\in\mathbb{N}}$. A standard version is to assume (1), (5), (8) and the existence of a filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$ such that

10. $N$ is a stopping time with respect to the filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$, and
11. $X_n$ and $\mathcal{F}_{n-1}$ are independent for every $n\in\mathbb{N}$.

Then (10) implies that the event $\{N \ge n\} = \{N \le n-1\}^{\mathrm{c}}$ is in $\mathcal{F}_{n-1}$, hence by (11) independent of $X_n$. This implies (2), and together with (8) it implies (6).

For convenience and to specify the relation of the sequence $(X_n)_{n\in\mathbb{N}}$ and the filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$, the following additional assumption is often imposed:

12. the sequence $(X_n)_{n\in\mathbb{N}}$ is adapted to the filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$, meaning that $X_n$ is $\mathcal{F}_n$-measurable for every $n\in\mathbb{N}$.

Note that (11) and (12) together imply that the random variables $(X_n)_{n\in\mathbb{N}}$ are independent.

Application

An application is in actuarial science when considering the total claim amount

$$S_N = \sum_{n=1}^N X_n$$

within a certain time period, say one year, arising from a random number $N$ of individual insurance claims, whose sizes are described by the random variables $(X_n)_{n\in\mathbb{N}}$; if the claims arrive according to a Poisson process, $S_N$ follows a compound Poisson distribution. Under the above assumptions, Wald's equation can be used to calculate the expected total claim amount when information about the average claim number per year and the average claim size is available. Under stronger assumptions and with more information about the underlying distributions, Panjer's recursion can be used to calculate the distribution of $S_N$.
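As a hedged illustration of this computation (the Poisson rate, the lognormal claim-size law, and all parameter values below are arbitrary assumptions for the sketch, not from the text above):

```python
import math
import random

rng = random.Random(42)
claim_rate = 50.0     # assumed average number of claims per year, E[N]
mu, sigma = 7.0, 1.0  # assumed lognormal parameters for claim sizes

def poisson_sample(lam: float) -> int:
    """Knuth's method for sampling a Poisson random variable."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def total_claims() -> float:
    """One year's total claim amount S_N = X_1 + ... + X_N."""
    n = poisson_sample(claim_rate)
    return sum(rng.lognormvariate(mu, sigma) for _ in range(n))

mean_claim = math.exp(mu + sigma**2 / 2)  # E[X] for the lognormal
trials = 20_000
estimate = sum(total_claims() for _ in range(trials)) / trials
print(estimate, claim_rate * mean_claim)  # Wald: E[S_N] = E[N] * E[X]
```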

Examples

Example with dependent terms

Let $N$ be an integrable, $\mathbb{N}_0$-valued random variable, which is independent of the integrable, real-valued random variable $Z$ with $\operatorname{E}[Z] = 0$. Define $X_n := (-1)^n Z$ for all $n\in\mathbb{N}$. Then assumptions (1), (5), (7), and (8) with $C := \operatorname{E}[|Z|]$ are satisfied, hence also (2) and (6), and Wald's equation applies. If the distribution of $Z$ is not symmetric, then (9) does not hold. Note that, when $Z$ is not almost surely equal to the zero random variable, then (11) and (12) cannot hold simultaneously for any filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$, because $Z$ cannot be independent of itself: independence would give $\operatorname{E}[Z^2] = (\operatorname{E}[Z])^2 = 0$, which is impossible since $Z$ is not almost surely zero.
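A simulation sketch of this construction (the uniform law for $Z$ and the law of $N$ below are arbitrary choices for illustration):

```python
import random

rng = random.Random(7)

def trial() -> float:
    """S_N with X_n = (-1)**n * Z, where N is independent of Z."""
    z = rng.uniform(-1.0, 1.0)  # an integrable Z with E[Z] = 0
    n = rng.randint(0, 10)      # an integrable N independent of Z
    return sum((-1) ** k * z for k in range(1, n + 1))

trials = 200_000
print(sum(trial() for _ in range(trials)) / trials)
# close to 0 = E[N] * E[X_1], as Wald's equation predicts
```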

Example where the number of terms depends on the sequence

Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of independent, symmetric, $\{-1,+1\}$-valued random variables. For every $n\in\mathbb{N}$ let $\mathcal{F}_n$ be the σ-algebra generated by $X_1,\dots,X_n$ and define $N := n$ when $X_n$ is the first random variable taking the value $+1$. Note that $\operatorname{P}(N = n) = 1/2^n$, hence $\operatorname{E}[N] < \infty$ by the ratio test. The assumptions (1), (5) and (9), hence (4) and (8) with $C := 1$, as well as (10), (11), and (12) hold, hence also (2) and (6), and Wald's equation applies. However, (7) does not hold, because $N$ is defined in terms of the sequence $(X_n)_{n\in\mathbb{N}}$. Intuitively, one might expect to have $\operatorname{E}[S_N] > 0$ in this example, because the summation stops right after a one, thereby apparently creating a positive bias. However, Wald's equation shows that this intuition is misleading.
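Since the result defies the intuition just described, a direct simulation sketch is instructive:

```python
import random

rng = random.Random(1)

def trial() -> int:
    """Sum symmetric +/-1 variables, stopping at the first +1."""
    total = 0
    while True:
        x = rng.choice((-1, 1))
        total += x
        if x == 1:        # N is the index of the first +1
            return total  # S_N = 1 - (N - 1) = 2 - N

trials = 200_000
print(sum(trial() for _ in range(trials)) / trials)
# close to 0 = E[N] * E[X_1], confirming Wald's equation
```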

Counterexamples

A counterexample illustrating the necessity of assumption ()

Consider a sequence $(X_n)_{n\in\mathbb{N}}$ of i.i.d. random variables, taking each of the two values 0 and 1 with probability ½. Define $N := 1 - X_1$. Then $S_N$ is identically equal to zero, hence $\operatorname{E}[S_N] = 0$, but $\operatorname{E}[X_1] = ½$ and $\operatorname{E}[N] = ½$, and therefore Wald's equation does not hold. Indeed, the assumptions (1), (3), (4) and (5) are satisfied; however, the equation in assumption (2) holds for all $n\in\mathbb{N}$ except for $n = 1$.
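A short numerical sketch of this counterexample:

```python
import random

rng = random.Random(3)

def trial() -> tuple[int, int, int]:
    """One realization of (S_N, N, X_1) with N = 1 - X_1."""
    x1 = rng.randint(0, 1)
    n = 1 - x1
    s_n = x1 if n == 1 else 0  # S_N = X_1 if N = 1, else the empty sum 0
    return s_n, n, x1

samples = [trial() for _ in range(100_000)]
print(sum(s for s, _, _ in samples) / len(samples))  # E[S_N] = 0
print(sum(n for _, n, _ in samples) / len(samples) * 0.5)
# E[N] * E[X_1] = 0.25, so Wald's equation fails here
```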

A counterexample illustrating the necessity of assumption ()

Very similar to the second example above, let $(X_n)_{n\in\mathbb{N}}$ be a sequence of independent, symmetric random variables, where $X_n$ takes each of the values $2^n$ and $-2^n$ with probability ½. Let $N$ be the first $n\in\mathbb{N}$ such that $X_n = 2^n$. Then, as above, $N$ has finite expectation, hence assumption (5) holds. Since $\operatorname{E}[X_n] = 0$ for all $n\in\mathbb{N}$, assumptions (1) and (4) hold. However, since $S_N = -2^1 - 2^2 - \dots - 2^{N-1} + 2^N = 2$ almost surely, Wald's equation $\operatorname{E}[S_N] = 0$ cannot hold.
Since $N$ is a stopping time with respect to the filtration generated by $(X_n)_{n\in\mathbb{N}}$, assumption (2) holds, see above. Therefore, only assumption (3) can fail, and indeed, since

$$\{N \ge n\} = \{X_i = -2^i \text{ for } i = 1,\dots,n-1\}$$

and therefore $\operatorname{P}(N \ge n) = 1/2^{\,n-1}$ for every $n\in\mathbb{N}$, it follows that

$$\sum_{n=1}^\infty \operatorname{E}\bigl[|X_n| \mathbf{1}_{\{N\ge n\}}\bigr] = \sum_{n=1}^\infty 2^n \operatorname{P}(N \ge n) = \sum_{n=1}^\infty 2 = \infty\,.$$
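The almost-sure value $S_N = 2$ can be checked directly by simulation; this sketch collects the set of observed values:

```python
import random

rng = random.Random(5)

def trial() -> int:
    """Sum X_n in {+2**n, -2**n}, stopping at the first positive term."""
    total, n = 0, 0
    while True:
        n += 1
        x = rng.choice((-1, 1)) * 2**n
        total += x
        if x > 0:          # N is the first n with X_n = +2**n
            return total   # S_N

print({trial() for _ in range(10_000)})  # prints {2}: S_N = 2 almost surely
```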

A proof using the optional stopping theorem

Assume (1), (5), (8), (10), (11) and (12). Using assumption (1), define the sequence of random variables

$$M_n := \sum_{i=1}^n \bigl(X_i - \operatorname{E}[X_i]\bigr), \qquad n\in\mathbb{N}_0\,.$$

Assumption (11) implies that the conditional expectation of $X_n$ given $\mathcal{F}_{n-1}$ equals $\operatorname{E}[X_n]$ almost surely for every $n\in\mathbb{N}$, hence $(M_n)_{n\in\mathbb{N}_0}$ is a martingale with respect to the filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$ by assumption (12). Assumptions (5), (8) and (10) make sure that we can apply the optional stopping theorem, hence $M_N = S_N - T_N$ is integrable and

$$\operatorname{E}[S_N - T_N] = \operatorname{E}[M_0] = 0\,. \qquad (13)$$

Due to assumption (8),

$$|T_N| = \Bigl|\sum_{i=1}^N \operatorname{E}[X_i]\Bigr| \le \sum_{i=1}^N \operatorname{E}[|X_i|] \le C N\,,$$

and due to assumption (5) this upper bound is integrable. Hence we can add the expectation of $T_N$ to both sides of Equation (13) and obtain by linearity

$$\operatorname{E}[S_N] = \operatorname{E}[T_N]\,.$$
Remark: Note that this proof does not cover the above example with dependent terms.
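As a sketch of the martingale mechanism used in this proof (the die-roll law and the stopping rule are arbitrary illustrative choices, not from the text above), one can simulate $M_N = S_N - T_N$ and observe that its mean vanishes:

```python
import random

rng = random.Random(11)

def stopped_martingale() -> float:
    """M_N = S_N - N * E[X] for die rolls stopped at the first six."""
    total, n = 0, 0
    while True:
        n += 1
        x = rng.randint(1, 6)
        total += x
        if x == 6:                  # N: first six, a stopping time
            return total - 3.5 * n  # M_N = S_N - T_N with E[X] = 3.5

trials = 200_000
print(sum(stopped_martingale() for _ in range(trials)) / trials)
# close to 0 = E[M_0], as the optional stopping theorem asserts
```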

General proof

This proof uses only Lebesgue's monotone and dominated convergence theorems.
We prove the statement as given above in three steps.

Step 1: Integrability of the random sum $S_N$

We first show that the random sum $S_N$ is integrable. Define the partial sums

$$S_i = \sum_{n=1}^i X_n, \qquad i\in\mathbb{N}_0\,. \qquad (14)$$

Since $N$ takes its values in $\mathbb{N}_0$ and since $S_0 = 0$, it follows that

$$S_N = \sum_{i=1}^\infty S_i\, \mathbf{1}_{\{N=i\}}\,.$$

The Lebesgue monotone convergence theorem implies that

$$\operatorname{E}[|S_N|] = \sum_{i=1}^\infty \operatorname{E}\bigl[|S_i|\, \mathbf{1}_{\{N=i\}}\bigr]\,.$$

By the triangle inequality,

$$|S_i| \le \sum_{n=1}^i |X_n|, \qquad i\in\mathbb{N}\,.$$

Using this upper estimate and changing the order of summation, we obtain

$$\operatorname{E}[|S_N|] \le \sum_{i=1}^\infty \sum_{n=1}^i \operatorname{E}\bigl[|X_n|\, \mathbf{1}_{\{N=i\}}\bigr] = \sum_{n=1}^\infty \operatorname{E}\bigl[|X_n|\, \mathbf{1}_{\{N\ge n\}}\bigr], \qquad (15)$$

where the last equality follows using the monotone convergence theorem. By assumption (3), the infinite series on the right-hand side of (15) converges, hence $S_N$ is integrable.

Step 2: Integrability of the random sum $T_N$

We now show that the random sum $T_N$ is integrable. Define the partial sums

$$T_i = \sum_{n=1}^i \operatorname{E}[X_n], \qquad i\in\mathbb{N}_0, \qquad (16)$$

of real numbers. Since $N$ takes its values in $\mathbb{N}_0$ and since $T_0 = 0$, it follows that

$$T_N = \sum_{i=1}^\infty T_i\, \mathbf{1}_{\{N=i\}}\,.$$

As in step 1, the Lebesgue monotone convergence theorem implies that

$$\operatorname{E}[|T_N|] = \sum_{i=1}^\infty |T_i| \operatorname{P}(N=i)\,.$$

By the triangle inequality,

$$|T_i| \le \sum_{n=1}^i \bigl|\operatorname{E}[X_n]\bigr|, \qquad i\in\mathbb{N}\,.$$

Using this upper estimate and changing the order of summation, we obtain

$$\operatorname{E}[|T_N|] \le \sum_{n=1}^\infty \bigl|\operatorname{E}[X_n]\bigr| \operatorname{P}(N\ge n)\,. \qquad (17)$$

By assumption (2),

$$\bigl|\operatorname{E}[X_n]\bigr| \operatorname{P}(N\ge n) = \bigl|\operatorname{E}[X_n] \operatorname{P}(N\ge n)\bigr| = \bigl|\operatorname{E}\bigl[X_n \mathbf{1}_{\{N\ge n\}}\bigr]\bigr| \le \operatorname{E}\bigl[|X_n|\, \mathbf{1}_{\{N\ge n\}}\bigr]\,.$$

Substituting this into (17) yields

$$\operatorname{E}[|T_N|] \le \sum_{n=1}^\infty \operatorname{E}\bigl[|X_n|\, \mathbf{1}_{\{N\ge n\}}\bigr],$$

which is finite by assumption (3), hence $T_N$ is integrable.

Step 3: Proof of the identity

To prove Wald's equation, we essentially go through the same steps again without the absolute value, making use of the integrability of the random sums $S_N$ and $T_N$ in order to show that they have the same expectation.

Using the dominated convergence theorem with dominating random variable $|S_N|$ and the definition of the partial sum $S_i$ given in (14), it follows that

$$\operatorname{E}[S_N] = \sum_{i=1}^\infty \operatorname{E}\bigl[S_i \mathbf{1}_{\{N=i\}}\bigr] = \sum_{i=1}^\infty \sum_{n=1}^i \operatorname{E}\bigl[X_n \mathbf{1}_{\{N=i\}}\bigr]\,.$$

Due to the absolute convergence proved in (15) above using assumption (3), we may rearrange the summation and obtain that

$$\operatorname{E}[S_N] = \sum_{n=1}^\infty \sum_{i=n}^\infty \operatorname{E}\bigl[X_n \mathbf{1}_{\{N=i\}}\bigr] = \sum_{n=1}^\infty \operatorname{E}\bigl[X_n \mathbf{1}_{\{N\ge n\}}\bigr]\,,$$

where we used assumption (1) and the dominated convergence theorem with dominating random variable $|X_n|$ for the second equality. Due to assumption (2) and the σ-additivity of the probability measure,

$$\operatorname{E}\bigl[X_n \mathbf{1}_{\{N\ge n\}}\bigr] = \operatorname{E}[X_n] \operatorname{P}(N\ge n) = \operatorname{E}[X_n] \sum_{i=n}^\infty \operatorname{P}(N=i)\,.$$

Substituting this result into the previous equation, rearranging the summation, using linearity of expectation and the definition of the partial sum $T_i$ of expectations given in (16),

$$\operatorname{E}[S_N] = \sum_{i=1}^\infty \sum_{n=1}^i \operatorname{E}[X_n] \operatorname{P}(N=i) = \sum_{i=1}^\infty T_i \operatorname{P}(N=i) = \sum_{i=1}^\infty \operatorname{E}\bigl[T_i \mathbf{1}_{\{N=i\}}\bigr]\,.$$

By using dominated convergence again with dominating random variable $|T_N|$,

$$\operatorname{E}[S_N] = \operatorname{E}[T_N]\,.$$

If assumptions (4) and (5) are satisfied, then by linearity of expectation,

$$\operatorname{E}[T_N] = \operatorname{E}\Bigl[\sum_{n=1}^N \operatorname{E}[X_1]\Bigr] = \operatorname{E}[N]\, \operatorname{E}[X_1]\,.$$

This completes the proof.

Further generalizations