Decision rule


In decision theory, a decision rule is a function which maps an observation to an appropriate action. Decision rules play an important role in the theory of statistics and economics, and are closely related to the concept of a strategy in game theory.
In order to evaluate the usefulness of a decision rule, it is necessary to have a loss function detailing the outcome of each action under different states.

Formal definition

Given an observable random variable X over the probability space (𝒳, Σ, P_θ), determined by a parameter θ ∈ Θ, and a set A of possible actions, a decision rule is a function δ : 𝒳 → A.
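
As a small, purely illustrative sketch of this definition in Python, consider estimating the bias θ of a coin from n tosses. Here the observation X is the number of heads, the action is a reported estimate of θ, and a squared-error loss evaluates the rule; the coin-flip setup, the function names (decision_rule, loss, risk), and the choice of the sample-proportion rule are assumptions made for this example, not part of the article.

# Illustrative sketch only: the binomial setup, squared-error loss, and
# sample-proportion rule are assumptions, not taken from the article.

from math import comb


def decision_rule(x: int, n: int) -> float:
    """A deterministic decision rule δ : {0, ..., n} → [0, 1].

    Maps the observed number of heads x to the sample proportion,
    which is reported as the estimate of the unknown bias θ.
    """
    return x / n


def loss(theta: float, action: float) -> float:
    """Squared-error loss L(θ, a): the cost of taking action a
    when the true state of nature is θ."""
    return (theta - action) ** 2


def risk(theta: float, n: int) -> float:
    """Expected loss E_θ[L(θ, δ(X))] of the rule above,
    computed exactly by summing over the Binomial(n, θ) pmf."""
    return sum(
        comb(n, x) * theta**x * (1 - theta) ** (n - x)
        * loss(theta, decision_rule(x, n))
        for x in range(n + 1)
    )


if __name__ == "__main__":
    # For the sample-proportion rule the expected loss works out to
    # θ(1 - θ)/n, so this prints approximately 0.0021.
    print(risk(theta=0.3, n=100))

The expected-loss computation in risk is one standard way a loss function is used to compare decision rules: a rule with uniformly smaller expected loss is preferred.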

Examples of decision rules