In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms $g_1(x),\ldots,g_k(x)$ of degree m such that
$$h(x)=\sum_{i=1}^{k} g_i(x)^2 .$$
Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2 (binary forms), 2m = 2 (quadratic forms), or n = 3 and 2m = 4 (ternary quartics) a form is SOS if and only if it is positive. The same is also valid for the analogous problem on positive symmetric forms. Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired by a sequence of forms that are SOS.
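As an illustrative aside, a classical example of a positive form that is not SOS is the Motzkin form in three variables,
$$h(x)=x_1^4x_2^2+x_1^2x_2^4-3x_1^2x_2^2x_3^2+x_3^6 ,$$
which is nonnegative on all of $\mathbb{R}^3$ (by the arithmetic–geometric mean inequality) yet admits no representation as a sum of squares of forms; note that here n = 3 and 2m = 6, outside the cases covered by Hilbert's result.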
To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as
$$h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}$$
where $x^{\{m\}}$ is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying
$$h(x)=x^{\{m\}\prime}Hx^{\{m\}}$$
and $L(\alpha)$ is a linear parameterization of the linear space
$$\mathcal{L}=\left\{L=L':\ x^{\{m\}\prime}Lx^{\{m\}}=0\right\}.$$
The dimension of the vector $x^{\{m\}}$ is given by
$$\sigma(n,m)=\binom{n+m-1}{m}$$
whereas the dimension of the vector $\alpha$ is given by
$$\omega(n,2m)=\frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).$$
Then, h(x) is SOS if and only if there exists a vector $\alpha$ such that
$$H+L(\alpha)\ge 0,$$
meaning that the matrix $H+L(\alpha)$ is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression $h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}$ was introduced with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as Gram matrix.
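As a quick numerical illustration of the two dimension formulas above, the following minimal Python sketch (the function names sigma and omega are chosen here purely for illustration) evaluates them for small cases:

```python
from math import comb

def sigma(n, m):
    # Number of monomials of degree m in n variables, i.e. the dimension of x^{m}
    return comb(n + m - 1, m)

def omega(n, two_m):
    # Dimension of the parameter vector alpha in the SMR of a form of degree 2m
    m = two_m // 2
    s = sigma(n, m)
    return s * (s + 1) // 2 - sigma(n, two_m)

# Degree-4 forms: two variables -> base of size 3, one free parameter;
# three variables -> base of size 6, six free parameters.
print(sigma(2, 2), omega(2, 4))  # 3 1
print(sigma(3, 2), omega(3, 4))  # 6 6
```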
Examples
Consider the form of degree 4 in two variables $h(x)=x_1^4-x_1^2x_2^2+x_2^4$. We have
$$m=2,\quad x^{\{m\}}=\begin{pmatrix}x_1^2\\ x_1x_2\\ x_2^2\end{pmatrix},\quad H+L(\alpha)=\begin{pmatrix}1 & 0 & -\alpha_1\\ 0 & -1+2\alpha_1 & 0\\ -\alpha_1 & 0 & 1\end{pmatrix}.$$
Since there exists $\alpha$ such that $H+L(\alpha)\ge 0$, namely $\alpha_1=1$, it follows that h(x) is SOS; indeed, $\alpha_1=1$ yields the decomposition $h(x)=(x_1^2-x_2^2)^2+x_1^2x_2^2$.
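The feasibility of this LMI can also be checked numerically. The following is a minimal sketch assuming the CVXPY package; the variable G plays the role of $H+L(\alpha)$ with respect to the monomial base $(x_1^2,\ x_1x_2,\ x_2^2)'$:

```python
import cvxpy as cp

# Gram (SMR) matrix for h(x) = x1^4 - x1^2*x2^2 + x2^4 in the base (x1^2, x1*x2, x2^2)'
G = cp.Variable((3, 3), symmetric=True)
constraints = [
    G >> 0,                        # the LMI: G must be positive semidefinite
    G[0, 0] == 1,                  # coefficient of x1^4
    G[2, 2] == 1,                  # coefficient of x2^4
    2 * G[0, 1] == 0,              # coefficient of x1^3 * x2
    2 * G[1, 2] == 0,              # coefficient of x1 * x2^3
    2 * G[0, 2] + G[1, 1] == -1,   # coefficient of x1^2 * x2^2
]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status)  # "optimal" means the LMI is feasible, hence h is SOS
print(G.value)         # any factorization G = P'P then gives an explicit SOS decomposition
```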
Consider a form of degree 4 in three variables. In this case m = 2, the base vector is $x^{\{m\}}=(x_1^2,\ x_1x_2,\ x_1x_3,\ x_2^2,\ x_2x_3,\ x_3^2)'$ with dimension $\sigma(3,2)=6$, and the parameterization $L(\alpha)$ depends on $\omega(3,4)=6$ free parameters, so that $H+L(\alpha)$ is a 6 × 6 symmetric matrix. Establishing whether such a form is SOS again amounts to the LMI feasibility test $H+L(\alpha)\ge 0$.
Generalizations
Matrix SOS
A matrix form F(x) of dimension r (i.e., r × r) and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms $G_1(x),\ldots,G_k(x)$ of degree m such that
$$F(x)=\sum_{i=1}^{k} G_i(x)'G_i(x).$$
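As a small illustrative example (with n = 2, r = 2 and m = 1), take the single matrix form
$$G_1(x)=\begin{pmatrix}x_1 & x_2\\ 0 & x_1\end{pmatrix},\qquad F(x)=G_1(x)'G_1(x)=\begin{pmatrix}x_1^2 & x_1x_2\\ x_1x_2 & x_1^2+x_2^2\end{pmatrix},$$
so F(x) is a matrix SOS form of dimension 2 and degree 2.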
Matrix SMR
To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any F(x) can be written according to the SMR as
$$F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)$$
where $\otimes$ is the Kronecker product of matrices, H is any symmetric matrix satisfying
$$F(x)=\left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)$$
and $L(\alpha)$ is a linear parameterization of the linear space
$$\mathcal{L}=\left\{L=L':\ \left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.$$
The dimension of the vector $\alpha$ is given by
$$\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).$$
Then, F(x) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:
$$H+L(\alpha)\ge 0.$$
This expression was introduced in order to establish whether a matrix form is SOS via an LMI.
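The following minimal Python sketch (function names are chosen here for illustration) evaluates this dimension formula and checks that for r = 1 it reduces to the scalar-case formula given earlier:

```python
from math import comb

def sigma(n, m):
    # dimension of the base vector x^{m}: monomials of degree m in n variables
    return comb(n + m - 1, m)

def omega_matrix(n, two_m, r):
    # dimension of alpha in the matrix SMR of an r x r matrix form of degree 2m
    m = two_m // 2
    s = sigma(n, m)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, two_m)) // 2

# For r = 1 this coincides with the scalar formula omega(n, 2m):
print(omega_matrix(2, 4, 1))  # 1, as in the scalar two-variable quartic example
print(omega_matrix(3, 4, 2))  # dimension of alpha for a 2 x 2 matrix quartic in three variables
```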
Noncommutative polynomial SOS
Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X_1, ..., X_n) and equipped with the involution T, such that T fixes R and X_1, ..., X_n and reverses words formed by X_1, ..., X_n. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f = f^T. When the evaluation of a symmetric noncommutative polynomial f at any tuple of real symmetric matrices of any dimension r × r results in a positive semidefinite matrix, f is said to be matrix-positive. A noncommutative polynomial is SOS if there exist noncommutative polynomials $h_1,\ldots,h_k$ such that
$$f(X)=\sum_{i=1}^{k} h_i(X)^{T}h_i(X).$$
Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive. Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.
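As a minimal numerical sketch (the polynomial and matrix sizes below are chosen purely for illustration), one can observe the matrix-positivity of an SOS noncommutative polynomial by evaluating it at random symmetric matrices; here $f(X_1,X_2)=X_1^2+X_1X_2^2X_1=X_1^TX_1+(X_2X_1)^T(X_2X_1)$, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(A1, A2):
    # f(X1, X2) = X1^2 + X1 X2^2 X1, an SOS noncommutative polynomial:
    # under the involution T it equals X1^T X1 + (X2 X1)^T (X2 X1).
    return A1 @ A1 + A1 @ A2 @ A2 @ A1

for r in (2, 3, 5):
    # Evaluate at random real symmetric r x r matrices
    B1, B2 = rng.standard_normal((r, r)), rng.standard_normal((r, r))
    A1, A2 = (B1 + B1.T) / 2, (B2 + B2.T) / 2
    eigenvalues = np.linalg.eigvalsh(f(A1, A2))
    print(r, eigenvalues.min() >= -1e-9)  # PSD up to numerical tolerance
```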