Entropic force


In physics, an entropic force acting in a system is an emergent phenomenon resulting from the entire system's statistical tendency to increase its entropy, rather than from a particular underlying force on the atomic scale.
The entropic force can be considered an emergent manifestation of the entropic interaction. The concept of entropic interaction was traditionally phrased in a subjunctive mood, for example: "macromolecule links, as if, entropically repel each other at short distances and entropically attract each other at long distances". In the modern view, the entropic interaction is considered a real interaction: a mutual influence of open thermodynamic systems on each other through the transfer of information about their states, which changes their entropies and moves the systems into more probable conditions. The entropic interaction is realized through the well-known basic interactions, via processes occurring throughout the universe, including the Solar System, the Earth, and living organisms; in this view the basic interactions are regarded as derivative of the entropic interaction. The entropic interaction is not a consequence of some entropy charge and an accompanying field, and it should not be identified with a distribution of entropy in space. Rather, it reflects the "order" and "structure" of space and of the physical systems within it, and ultimately affects the energy, behavior, and evolution of those systems, as well as of space as a whole. The entropic interaction results in the alteration of a physical system's symmetry, free energy, and other characteristics. Through this interaction, all material objects in Nature exert a certain influence on each other, regardless of the distance between them.

Mathematical formulation

In the canonical ensemble, the entropic force \(\mathbf{F}\) associated with a macrostate partition \(\{\mathbf{X}\}\) is given by

\[ \mathbf{F}(\mathbf{X}_0) = T \, \nabla_{\mathbf{X}} S(\mathbf{X}) \,\big|_{\mathbf{X}_0} \]

where \(T\) is the temperature, \(S(\mathbf{X})\) is the entropy associated with the macrostate \(\mathbf{X}\), and \(\mathbf{X}_0\) is the present macrostate.
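The formula can be illustrated numerically on a toy macrostate partition (this example is not from the article): N ideal-gas particles in a box divided into two halves, with the macrostate X being the number of particles in the left half. The entropy is S(X) = k_B ln Ω(X) with Ω(X) the binomial count of microstates, and the entropic force T dS/dX pushes the system toward the maximum-entropy macrostate X = N/2.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # temperature, K
N = 100             # particles in a box split into two halves

def entropy(X):
    """S(X) = k_B ln Omega(X), with Omega(X) = C(N, X) microstates
    for X of N particles in the left half of the box."""
    return k_B * (math.lgamma(N + 1) - math.lgamma(X + 1) - math.lgamma(N - X + 1))

def entropic_force(X, dX=1):
    """Entropic force conjugate to X, F = T dS/dX, by central finite difference."""
    return T * (entropy(X + dX) - entropy(X - dX)) / (2 * dX)

print(entropic_force(20) > 0)   # True: pushes X up toward N/2
print(entropic_force(80) < 0)   # True: pushes X down toward N/2
```

The force vanishes at X = N/2, the most probable macrostate, even though no microscopic force acts on any individual particle.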

Examples

Mach's principle

According to Mach's principle, local physical laws are determined by the large-scale structure of the universe, and changes in any part of the universe have a corresponding impact on all of its parts. In this view, such changes are mediated first of all by the entropic interaction: once they take place in one part of the universe, the entropy of the universe as a whole changes as well, so the entire universe "feels" the change at the same time. In other words, the entropic interaction between different parts of a thermodynamic system happens instantly, without the transfer of any material substance, meaning it is always a long-range action. Afterward, processes emerge inside the system to transfer substances or portions of energy in the appropriate direction; these are produced by one of the basic interactions in the mode of short-range action.

Heat dispersion

Heat dispersion is one example of the entropic interaction. When one end of a metal rod is heated, a non-homogeneous temperature distribution is created along it. Because of the entropic interaction between different parts of the rod, the entropy of the entire rod changes instantly, and a tendency toward a homogeneous temperature distribution appears; this is the long-range action. The process of heat conduction then emerges to realize this tendency through short-range action. Overall, this is an example of the coexistence of long- and short-range actions in one process.
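The tendency toward a homogeneous temperature can be demonstrated with a toy simulation (not from the article): a 1D rod discretized by an explicit finite-difference conduction scheme with insulated ends. Tracking S = Σᵢ ln(Tᵢ) (heat-capacity and volume constants dropped) shows the rod's entropy rising monotonically as the profile homogenizes.

```python
import math

# Toy 1D rod: hot left half, cold right half; explicit finite-difference
# heat conduction with insulated (Neumann) boundaries.
n = 20
T = [400.0] * (n // 2) + [300.0] * (n // 2)   # temperatures in K
alpha = 0.2                                    # diffusion number (stable for <= 0.5)

def total_entropy(temps):
    # S = sum_i ln(T_i), up to additive/multiplicative constants
    return sum(math.log(t) for t in temps)

entropies = [total_entropy(T)]
for _ in range(2000):
    T = ([T[0] + alpha * (T[1] - T[0])]
         + [T[i] + alpha * (T[i-1] - 2*T[i] + T[i+1]) for i in range(1, n-1)]
         + [T[-1] + alpha * (T[-2] - T[-1])])
    entropies.append(total_entropy(T))

# Entropy only increases (up to float rounding) while energy is conserved,
# and the final temperature profile is nearly uniform.
print(all(b >= a - 1e-9 for a, b in zip(entropies, entropies[1:])))  # True
print(max(T) - min(T) < 1.0)                                         # True
```

Each update step is a doubly stochastic mixing of the temperatures, which is exactly why Σᵢ ln(Tᵢ) cannot decrease: the conduction (short-range action) realizes the entropic tendency step by step.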

Pressure of an ideal gas

The internal energy of an ideal gas depends only on its temperature, not on the volume of its containing box, so it is not an energy effect that drives the increase in volume the way gas pressure does. This implies that the pressure of an ideal gas has an entropic origin.
What is the origin of such an entropic force? The most general answer is that the effect of thermal fluctuations tends to bring a thermodynamic system toward a macroscopic state that corresponds to a maximum in the number of microscopic states that are compatible with this macroscopic state. In other words, thermal fluctuations tend to bring a system toward its macroscopic state of maximum entropy.
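This entropic origin can be checked numerically (an illustrative sketch, not from the article): the volume-dependent part of the Sackur–Tetrode entropy is S(V) = N k_B ln V + (terms independent of V), and the entropic force conjugate to the volume, P = T (∂S/∂V), reproduces the ideal gas law.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23         # one mole of particles
T = 300.0            # temperature, K
V = 0.0224           # m^3, roughly a molar volume (illustrative numbers)

def S(V):
    # Volume-dependent part of the ideal-gas (Sackur-Tetrode) entropy
    return N * k_B * math.log(V)

dV = 1e-9
P_entropic = T * (S(V + dV) - S(V - dV)) / (2 * dV)   # P = T dS/dV
P_ideal    = N * k_B * T / V                          # ideal gas law

print(P_entropic)                                     # ~1.11e5 Pa
print(abs(P_entropic - P_ideal) / P_ideal < 1e-5)     # True
```

The pressure appears even though no interaction energy depends on V: it is entirely the statistical tendency of the gas toward the higher-entropy (larger-volume) macrostate.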

Brownian motion

The entropic approach to Brownian motion was initially proposed by R. M. Neumann. Neumann derived the entropic force for a particle undergoing three-dimensional Brownian motion using the Boltzmann equation, denoting this force as a diffusional driving force or radial force. In the paper, three example systems are shown to exhibit such a force.
A standard example of an entropic force is the elasticity of a freely-jointed polymer molecule. For an ideal chain, maximizing its entropy means reducing the distance between its two free ends. Consequently, the ideal chain exerts a force between its two free ends that tends to collapse the chain; this entropic force is proportional to the distance between the ends. The entropic force of a freely-jointed chain has a clear mechanical origin and can be computed using constrained Lagrangian dynamics.
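A small Monte Carlo sketch (illustrative, not from the article) makes the statistics behind the entropic spring concrete: sampling freely-jointed chains of N links of length b confirms the Gaussian-chain result ⟨R²⟩ = N b², which underlies the linear force law F = 3 k_B T r / (N b²) for small extensions r.

```python
import math
import random

random.seed(0)
N, b = 100, 1.0        # number of links and link length (arbitrary units)
samples = 5000

def end_to_end_sq():
    """Squared end-to-end distance of one freely-jointed chain realization."""
    x = y = z = 0.0
    for _ in range(N):
        u = random.uniform(-1.0, 1.0)        # cos(polar angle): uniform on sphere
        phi = random.uniform(0.0, 2 * math.pi)
        s = math.sqrt(1.0 - u * u)
        x += b * s * math.cos(phi)
        y += b * s * math.sin(phi)
        z += b * u
    return x * x + y * y + z * z

mean_R2 = sum(end_to_end_sq() for _ in range(samples)) / samples
print(mean_R2 / (N * b * b))   # close to 1.0: <R^2> = N b^2

# Entropic spring constant for small extensions: k = 3 k_B T / (N b^2)
k_B, T = 1.380649e-23, 300.0
k_spring = 3 * k_B * T / (N * b * b)
```

Because far more chain conformations have their ends close together than far apart, stretching the chain lowers its entropy, and the restoring force is purely entropic, proportional to T.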

Hydrophobic force

Another example of an entropic force is the hydrophobic force. At room temperature, it partly originates from the loss of entropy by the 3D network of water molecules when they interact with molecules of a dissolved substance. Each water molecule is capable of donating two hydrogen bonds and accepting two others, so water molecules can form an extended three-dimensional network. Introducing a non-hydrogen-bonding surface disrupts this network; the water molecules rearrange themselves around the surface so as to minimize the number of disrupted hydrogen bonds. This is in contrast to hydrogen fluoride or ammonia, which mainly form linear chains.
If the introduced surface has an ionic or polar nature, water molecules can stand upright on one or two of their four sp3 orbitals. These orientations allow easy movement, i.e., preserve degrees of freedom, and thus lower the entropy only minimally. A non-hydrogen-bonding surface with moderate curvature, however, forces the water molecules to sit tight against the surface, spreading three hydrogen bonds tangential to it, which then become locked into a clathrate-like basket shape. Water molecules involved in this clathrate-like basket around the non-hydrogen-bonding surface are constrained in their orientation. Thus, any event that minimizes such a surface is entropically favored. For example, when two such hydrophobic particles come very close, the clathrate-like baskets surrounding them merge, releasing some of the water molecules into the bulk of the water and leading to an increase in entropy.
Another related and counter-intuitive example of an entropic force is protein folding, a spontaneous process in which the hydrophobic effect also plays a role. Structures of water-soluble proteins typically have a core in which hydrophobic side chains are buried away from water, stabilizing the folded state. Charged and polar side chains are situated on the solvent-exposed surface, where they interact with surrounding water molecules. Minimizing the number of hydrophobic side chains exposed to water is the principal driving force behind the folding process, although formation of hydrogen bonds within the protein also stabilizes protein structure.

Colloids

Entropic forces are important and widespread in the physics of colloids, where they are responsible for the depletion force and the ordering of hard particles, such as the crystallization of hard spheres, the isotropic-nematic transition in liquid crystal phases of hard rods, and the ordering of hard polyhedra. Because of this, entropic forces can be an important driver of self-assembly.
Entropic forces arise in colloidal systems due to the osmotic pressure that comes from particle crowding. This was first discovered in, and is most intuitive for, colloid-polymer mixtures described by the Asakura–Oosawa model. In this model, polymers are approximated as finite-sized spheres that can penetrate one another but cannot penetrate the colloidal particles. The inability of the polymers to penetrate the colloids leads to a region around each colloid in which the polymer density is reduced. If the regions of reduced polymer density around two colloids overlap, as the colloids approach one another, the polymers in the system gain additional free volume equal to the volume of the intersection of the reduced-density regions. This additional free volume increases the entropy of the polymers and drives the colloids to form locally dense-packed aggregates. A similar effect occurs in sufficiently dense colloidal systems without polymers, where osmotic pressure also drives the local dense packing of colloids into a diverse array of structures that can be rationally designed by modifying the shape of the particles. For anisotropic particles, these effects are referred to as directional entropic forces.
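The Asakura–Oosawa picture can be sketched quantitatively (illustrative parameter values, not from the article): in the dilute-polymer limit the depletion potential between two colloids is U(d) = -n_p k_B T · V_ov(d), where n_p is the polymer number density and V_ov(d) is the lens-shaped overlap volume of the two depletion zones (spheres of radius a = R + r_p) at center separation d.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
R = 100e-9           # colloid radius, m (illustrative)
r_p = 10e-9          # effective polymer radius, m (illustrative)
n_p = 1e21           # polymer number density, m^-3 (illustrative)

def overlap_volume(d, a):
    """Lens volume shared by two spheres of radius a at center distance d."""
    if d >= 2 * a:
        return 0.0
    return (math.pi / 12.0) * (2 * a - d) ** 2 * (4 * a + d)

def U_depletion(d):
    """Asakura-Oosawa depletion potential at colloid center separation d."""
    a = R + r_p
    if d < 2 * R:
        return math.inf              # hard-core overlap forbidden
    return -n_p * k_B * T * overlap_volume(d, a)

# The attraction is strongest at contact and vanishes beyond d = 2(R + r_p).
print(U_depletion(2 * R) < 0)                       # True: attractive at contact
print(U_depletion(2 * (R + r_p) + 1e-12) == 0.0)    # True: zero beyond the zone
```

No direct colloid-colloid attraction is involved: the effective force comes entirely from the polymers' entropy gain when the depletion zones overlap.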

Controversial examples

Some forces that are generally regarded as conventional forces have been argued to be actually entropic in nature. These theories remain controversial and are the subject of ongoing work. Matt Visser, professor of mathematics at Victoria University of Wellington, New Zealand, criticizes selected approaches in "Conservative Entropic Forces".

Gravity

In 2009, Erik Verlinde argued that gravity can be explained as an entropic force. He claimed that gravity is a consequence of the "information associated with the positions of material bodies". This model combines the thermodynamic approach to gravity with Gerard 't Hooft's holographic principle, and implies that gravity is not a fundamental interaction but an emergent phenomenon.
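The heart of the argument can be sketched in a few lines (a heavily compressed sketch of the published heuristic derivation, not a rigorous result):

```latex
% Entropy change when a mass m approaches a holographic screen by \Delta x,
% combined with the Unruh temperature of an accelerated observer:
\Delta S = 2\pi k_B \,\frac{mc}{\hbar}\,\Delta x ,
\qquad
k_B T = \frac{\hbar a}{2\pi c} .
% The entropic force relation F\,\Delta x = T\,\Delta S then yields F = ma.
% For a spherical screen of radius R enclosing mass M, holography assigns
% N = A c^3/(G\hbar) bits to the area A = 4\pi R^2, and equipartition
% E = \tfrac{1}{2} N k_B T = M c^2 fixes T; substituting back gives
F = G\,\frac{M m}{R^2} .
```

In this reading, Newton's law of gravitation emerges from counting information on holographic screens rather than from a fundamental gravitational field.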

Other forces

In the wake of the discussion started by Verlinde, entropic explanations for other fundamental forces have been suggested, including Coulomb's law and the electroweak and strong forces. The same approach has been argued to explain dark matter, dark energy, and the Pioneer effect.

Links to adaptive behavior

It has been argued that causal entropic forces lead to the spontaneous emergence of tool use and social cooperation. Causal entropic forces, by definition, maximize entropy production between the present and a future time horizon, rather than greedily maximizing instantaneous entropy production as typical entropic forces do.
A formal connection between the mathematical structure of the discovered laws of nature, intelligence, and entropy-like measures of complexity was noted in 2000 by Andrei Soklakov in the context of Occam's razor.