Correlation function (astronomy)


In astronomy, a correlation function describes the distribution of galaxies in the universe. By default, "correlation function" refers to the two-point autocorrelation function. The two-point autocorrelation function is a function of one variable, the separation distance; it describes the excess probability, relative to an unclustered random distribution, of finding two galaxies separated by this distance. It can be thought of as a lumpiness factor: the higher the value for some distance scale, the more lumpy the universe is at that distance scale.
The following definition is often cited: given a randomly chosen galaxy, the probability of finding another galaxy in a volume element $dV$ at separation $r$ is

$$dP = n\,[1 + \xi(r)]\,dV,$$

where $n$ is the mean number density of galaxies and $\xi(r)$ is the two-point correlation function.
However, it can only be correct in the statistical sense that it is averaged over a large number of galaxies chosen as the first, random galaxy. If just one random galaxy is chosen, then the definition is no longer correct, firstly because it is meaningless to talk of just one "random" galaxy, and secondly because the function will vary wildly depending on which galaxy is chosen, in contradiction with its definition as a function.
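The averaging over many "first" galaxies is exactly what estimators of the correlation function do in practice: pair counts in the data catalogue are compared against pair counts in a random, unclustered catalogue with the same geometry. The sketch below (Python/NumPy, all names illustrative) uses the simple natural estimator $DD/RR - 1$; real analyses typically prefer the Landy-Szalay estimator and must handle survey masks and edge effects.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_counts(points, edges):
    """Histogram of pairwise separations for a 3-D point set (unique pairs)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)   # upper triangle: each pair once
    return np.histogram(d[iu], bins=edges)[0]

def xi_natural(data, n_random, box, edges):
    """Natural estimator xi = DD/RR - 1 in bins of separation."""
    random = rng.uniform(0, box, size=(n_random, 3))
    dd = pair_counts(data, edges).astype(float)
    rr = pair_counts(random, edges).astype(float)
    # normalize each histogram by its total number of unique pairs
    nd, nr = len(data), len(random)
    dd /= nd * (nd - 1) / 2
    rr /= nr * (nr - 1) / 2
    return dd / rr - 1

box = 100.0
edges = np.linspace(5, 50, 10)
data = rng.uniform(0, box, size=(500, 3))   # an unclustered mock catalogue
xi = xi_natural(data, 1000, box, edges)     # should scatter around zero
```

Because the mock catalogue here is itself unclustered, the estimated $\xi$ is consistent with zero in every bin, up to shot noise; a clustered catalogue would give positive values at small separations.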
Assuming the universe is isotropic, the correlation function is a function of the scalar distance $r = |\mathbf{r}|$ alone. The two-point correlation function can then be written as

$$\xi(r) = \langle \delta(\mathbf{x})\,\delta(\mathbf{x} + \mathbf{r}) \rangle,$$
where $\delta(\mathbf{x}) = \rho(\mathbf{x})/\bar\rho - 1$ is a unitless measure of overdensity, defined at every point, and $\bar\rho$ is the mean density. Letting $V$ be the averaging volume, it can also be expressed as the integral

$$\xi(r) = \frac{1}{V} \int_V \delta(\mathbf{x})\,\delta(\mathbf{x} + \mathbf{r})\, d^3x.$$
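The volume-average form can be checked numerically on a toy periodic density field. The sketch below (Python/NumPy, illustrative setup) builds $\delta$ from a sinusoidal density and averages $\delta(x)\,\delta(x+r)$ over the box; for a pure cosine overdensity of amplitude $A$ and wavenumber $k$, the expected result is $\xi(r) = (A^2/2)\cos(kr)$.

```python
import numpy as np

# toy periodic 1-D density field: mean density plus a sinusoidal overdensity
n = 256
L = 1.0
x = np.linspace(0, L, n, endpoint=False)
rho = 1.0 + 0.2 * np.cos(2 * np.pi * 4 * x / L)   # 4 full waves across the box

delta = rho / rho.mean() - 1.0                     # unitless overdensity

def xi(delta, shift):
    """Volume-averaged <delta(x) delta(x+r)> for a lag of `shift` grid cells."""
    return np.mean(delta * np.roll(delta, -shift))
```

At zero lag this returns $A^2/2 = 0.02$, and at a lag of half a wavelength (32 cells here) it returns $-0.02$, matching the analytic expectation.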
The spatial correlation function is related to the Fourier-space power spectrum of the galaxy distribution, $P(k)$, as

$$\xi(r) = \frac{1}{2\pi^2} \int_0^\infty P(k)\,\frac{\sin kr}{kr}\,k^2\,dk.$$
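This integral is straightforward to evaluate numerically for a given power spectrum. The sketch below (Python/NumPy; the function name, cutoff `kmax`, and the toy Gaussian spectrum are illustrative choices, not a standard API) uses a simple Riemann sum, which suffices because the integrand vanishes at both ends of the range:

```python
import numpy as np

def xi_from_pk(r, pk, kmax=50.0, nk=20000):
    """xi(r) = (1 / 2 pi^2) * integral of P(k) sin(kr)/(kr) k^2 dk, by Riemann sum."""
    k = np.linspace(1e-6, kmax, nk)
    # np.sinc(x) = sin(pi x)/(pi x), so sin(kr)/(kr) = np.sinc(k r / pi)
    integrand = pk(k) * np.sinc(k * r / np.pi) * k**2
    return integrand.sum() * (k[1] - k[0]) / (2 * np.pi**2)

# toy power spectrum with a known small-r limit: for P(k) = exp(-k^2),
# xi(0) = (1 / 2 pi^2) * integral of k^2 exp(-k^2) dk = sqrt(pi) / (8 pi^2)
pk = lambda k: np.exp(-k**2)
```

As a sanity check, `xi_from_pk(0.0, pk)` reproduces the analytic value $\sqrt{\pi}/(8\pi^2)$ to high accuracy, since $\sin(kr)/(kr) \to 1$ as $r \to 0$.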
The n-point autocorrelation functions for n greater than 2 or cross-correlation functions for particular object types are defined similarly to the two-point autocorrelation function.
The correlation function is important for theoretical models of physical cosmology because it provides a means of testing models that make different assumptions about the contents of the universe.