Physical information


Physical information is a form of information. In physics, it refers to the information contained in a physical system. Physical information is an important concept used in a number of fields of study in physics. For example, in quantum mechanics, the form of physical information known as quantum information is used in many descriptions of quantum phenomena, such as quantum observation, quantum entanglement, and the causal relationships between quantum objects that interact with one another at short or long range.
In a general sense, information is that which resolves uncertainty: it describes the details of whatever the uncertainty is about. The description itself is, however, independent of any particular language.
When clarifying the subject of information, care should be taken to distinguish between the following specific cases:
As the above usages are all conceptually distinct from each other, overloading the word "information" to denote several of these concepts simultaneously can lead to confusion. Accordingly, this article uses more specific phrases whenever the intended meaning is not made clear by the context.

Classical versus quantum information

The instance of information that is contained in a physical system is generally considered to specify that system's "true" state.
When discussing the information that is contained in physical systems according to modern quantum physics, we must distinguish between classical information and quantum information. Quantum information specifies the complete quantum state vector of a system, whereas classical information, roughly speaking, only picks out a definite quantum state if we are already given a prespecified set of distinguishable quantum states to choose from; such a set forms a basis for the vector space of all the possible pure quantum states. Quantum information could thus be expressed by providing a choice of a basis such that the actual quantum state is equal to one of the basis vectors, together with the classical information specifying which of these basis vectors is the actual one.
Note that the amount of classical information in a quantum system gives the maximum amount of information that can actually be measured and extracted from that quantum system for use by external classical systems, since only basis states are operationally distinguishable from each other. The impossibility of differentiating between non-orthogonal states is a fundamental principle of quantum mechanics, equivalent to Heisenberg's uncertainty principle. Because of its more general utility, the remainder of this article will deal primarily with classical information, although quantum information theory does also have some potential applications that are currently being actively explored by both theorists and experimentalists.
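To make the distinction concrete, the following sketch (a hypothetical two-state example, not drawn from the article itself) represents a qubit's quantum information as its full complex amplitude vector, while the classical information extractable by measurement is limited to identifying one member of a prespecified orthogonal basis; non-orthogonal states have a nonzero overlap and so cannot be told apart reliably.

```python
import numpy as np

# Quantum information: the full complex amplitude vector of a qubit
# (a hypothetical two-state system used purely for illustration).
psi = np.array([1.0, 0.0])                # basis state |0>
phi = np.array([1.0, 1.0]) / np.sqrt(2)   # superposition (|0> + |1>)/sqrt(2)

# Classical information is relative to a prespecified basis; here the
# computational basis {|0>, |1>}.
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def overlap(a, b):
    """Squared inner product |<a|b>|^2: 0 means perfectly distinguishable."""
    return abs(np.vdot(a, b)) ** 2

# Orthogonal basis states are operationally distinguishable...
print(overlap(basis[0], basis[1]))   # 0.0 -> one reliable classical bit

# ...but a superposition overlaps both basis states, so no single
# measurement can distinguish psi from phi with certainty.
print(overlap(psi, phi))             # 0.5 -> outcomes are ambiguous
```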

Quantifying classical physical information

An amount of physical information may be quantified, as in information theory, as follows. For a system S, defined abstractly in such a way that it has N distinguishable states that are consistent with its description, the amount of information I(S) contained in the system's state can be said to be log(N). The logarithm is selected for this definition since it has the advantage that this measure of information content is additive when concatenating independent, unrelated subsystems; e.g., if subsystem A has N_A distinguishable states (information content I(A) = log(N_A)) and an independent subsystem B has N_B distinguishable states (information content I(B) = log(N_B)), then the concatenated system has N_A·N_B distinguishable states and an information content I(AB) = log(N_A·N_B) = log(N_A) + log(N_B) = I(A) + I(B). We expect information to be additive from our everyday associations with the meaning of the word, e.g., that two pages of a book can contain twice as much information as one page.
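A minimal numerical sketch of this definition, with illustrative state counts chosen arbitrarily: the information content of a system is the logarithm of its number of distinguishable states, and it adds across independent subsystems.

```python
import math

def info_content(n_states, base=2):
    """Information content (in units set by the log base) of a system
    with n_states distinguishable states: I = log(n_states)."""
    return math.log(n_states, base)

N_A, N_B = 8, 16                 # distinguishable states of two independent subsystems
I_A = info_content(N_A)          # 3.0 bits
I_B = info_content(N_B)          # 4.0 bits

# The concatenated system has N_A * N_B distinguishable states, so its
# information content is the sum of the parts: log(N_A*N_B) = log(N_A) + log(N_B).
I_AB = info_content(N_A * N_B)   # 7.0 bits
assert math.isclose(I_AB, I_A + I_B)
print(I_A, I_B, I_AB)
```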
The base of the logarithm used in this definition is arbitrary, since it affects the result by only a multiplicative constant, which determines the unit of information that is implied. If the log is taken base 2, the unit of information is the binary digit, or bit; if we use a natural logarithm instead, we might call the resulting unit the "nat". In magnitude, a nat is identical to Boltzmann's constant k or the ideal gas constant R, although these particular quantities are usually reserved to measure physical information that happens to be entropy, and that is expressed in physical units such as joules per kelvin or kilocalories per mole-kelvin.
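The unit relationships can be made explicit with a short sketch (the byte-sized example is arbitrary; the constant is the standard SI value): one bit equals ln 2 nats, and a nat of entropy corresponds to Boltzmann's constant when expressed in joules per kelvin.

```python
import math

K_B = 1.380649e-23        # Boltzmann's constant, J/K (exact by SI definition)

def bits_to_nats(bits):
    # 1 bit = ln(2) nats, since log2(x) = ln(x) / ln(2)
    return bits * math.log(2)

def nats_to_joules_per_kelvin(nats):
    # One nat of entropy corresponds to k_B in thermodynamic units.
    return nats * K_B

# Example: the information capacity of one byte, expressed three ways.
bits = 8
nats = bits_to_nats(bits)                   # ~5.545 nats
entropy = nats_to_joules_per_kelvin(nats)   # ~7.66e-23 J/K
print(bits, nats, entropy)
```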

Physical information and entropy

An easy way to understand the underlying unity between physical entropy and information-theoretic entropy is as follows:
Entropy is simply that portion of the physical information contained in a system of interest whose identity is unknown.
This informal characterization corresponds both to von Neumann's formal definition of the entropy of a mixed quantum state and to Claude Shannon's definition of the entropy of a probability distribution over classical signal states or messages. Incidentally, the credit for Shannon's entropy formula really belongs to Boltzmann, who derived it much earlier for use in his H-theorem of statistical mechanics.
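To illustrate the correspondence (the probability distribution and density matrix below are invented for the example), the following sketch computes Shannon's entropy of a classical distribution and von Neumann's entropy of a mixed quantum state with the same eigenvalue spectrum; the two agree, as expected.

```python
import numpy as np

def shannon_entropy(p, base=2):
    """H(p) = -sum_i p_i log p_i, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

def von_neumann_entropy(rho, base=2):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    return shannon_entropy(eigvals, base=base)

# A classical distribution over four messages...
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))          # 1.75 bits

# ...and a mixed quantum state with the same eigenvalue spectrum.
rho = np.diag(p)
print(von_neumann_entropy(rho))    # also 1.75 bits
```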
Furthermore, even when the state of a system is known, we can say that the information in the system is still effectively entropy if that information is effectively incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. Note that this definition of entropy can even be viewed as equivalent to the previous one if we take a meta-perspective, and say that for observer A to "know" the state of system B means simply that there is a definite correlation between the state of observer A and the state of system B; this correlation could thus be used by a meta-observer to compress his own description of the joint system AB.
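This meta-perspective can be made quantitative with a small invented joint distribution: when A and B are correlated, the joint entropy H(A,B) is less than H(A) + H(B), and the difference (the mutual information) is exactly the saving a meta-observer obtains by describing the pair jointly rather than separately.

```python
import numpy as np

def entropy(p, base=2):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

# Joint distribution P(A, B) for two correlated binary variables
# (values chosen only for illustration).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

H_A  = entropy(joint.sum(axis=1))   # marginal entropy of A: 1.0 bit
H_B  = entropy(joint.sum(axis=0))   # marginal entropy of B: 1.0 bit
H_AB = entropy(joint)               # joint entropy: ~1.72 bits

# The correlation lets a meta-observer describe (A, B) jointly using
# H(A) + H(B) - H(A,B) fewer bits than describing A and B separately.
mutual_information = H_A + H_B - H_AB
print(H_AB, mutual_information)     # ~1.72 bits, ~0.28 bits
```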
Due to this connection with algorithmic information theory, entropy can be said to be that portion of a system's information capacity which is "used up," that is, unavailable for storing new information. The rest of a system's information capacity might be called extropy, and it represents the part of the system's information capacity which is potentially still available for storing newly derived information. The fact that physical entropy is basically "used-up storage capacity" is a direct concern in the engineering of computing systems; e.g., a computer must first remove the entropy from a given physical subsystem in order for that subsystem to be used to store some newly computed information.

Extreme physical information

In a theory developed by B. Roy Frieden, "physical information" is defined as the loss of Fisher information that is incurred during the observation of a physical effect. Thus, if the effect has an intrinsic information level J but is observed at information level I, the physical information is defined to be the difference I − J. This defines an information Lagrangian. Frieden's principle of extreme physical information (EPI) states that extremizing I − J by varying the system's probability amplitudes gives the correct amplitudes for most or even all physical theories. The EPI principle was recently proven: it follows from a system of mathematical axioms due to L. Hardy that defines all known physics.
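For orientation, here is a brief numerical sketch of the Fisher information on which EPI is built (the Gaussian example and the grid are arbitrary choices, not part of Frieden's formulation): for a probability density p(x) and a location parameter, I = ∫ p′(x)²/p(x) dx, which for a Gaussian of standard deviation σ equals 1/σ².

```python
import numpy as np

def fisher_information(p, x):
    """Numerically estimate I = integral of p'(x)^2 / p(x) dx for a density p
    sampled on the grid x (location-parameter Fisher information)."""
    dp = np.gradient(p, x)
    return np.trapz(dp**2 / p, x)

# A Gaussian density with standard deviation sigma; the exact answer is 1/sigma^2.
sigma = 2.0
x = np.linspace(-10 * sigma, 10 * sigma, 20001)
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

print(fisher_information(p, x))   # ~0.25, i.e. 1/sigma^2
```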