Polynomial SOS

This article is about representing a non-negative polynomial as a sum of squares of polynomials. For representing a polynomial as a sum of squares of rational functions, see Hilbert's seventeenth problem. For the sum of squares of consecutive integers, see square pyramidal number. For representing an integer as a sum of squares of integers, see Lagrange's four-square theorem.

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree m such that


h(x)=\sum_{i=1}^k g_i(x)^2 .
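
For instance, with n = 2 and m = 1, the quadratic form h(x) = x_1^2 - 2x_1x_2 + 2x_2^2 is SOS with k = 2, since h(x) = (x_1-x_2)^2 + x_2^2.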

Every form that is SOS is also nonnegative, and although the converse is not always true, Hilbert proved that for n = 2 (binary forms), 2m = 2 (quadratic forms), and n = 3 with 2m = 4 (ternary quartics) a form is SOS if and only if it is nonnegative.[1] The same is also valid for the analogous problem on positive symmetric forms.[2][3]
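
A classical example of a nonnegative form that is not SOS is the Motzkin form

M(x)=x_1^4x_2^2+x_1^2x_2^4+x_3^6-3x_1^2x_2^2x_3^2,

which is nonnegative by the arithmetic-geometric mean inequality but admits no representation as a sum of squares of forms.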

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4][5] Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms \{f_\epsilon\} that are SOS.[6]

Square matricial representation (SMR)

Establishing whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as


h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}}

where x^{\{m\}} is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime denotes the transpose, H is any symmetric matrix satisfying


h(x)=x^{\left\{m\right\}'}Hx^{\{m\}}

and L(\alpha) is a linear parameterization of the linear space


\mathcal{L}=\left\{L=L':~x^{\{m\}'} L x^{\{m\}}=0\right\}.
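
For example, for n = 2 and m = 2, with x^{\{2\}}=\left(x_1^2,\,x_1x_2,\,x_2^2\right)', the space \mathcal{L} is one-dimensional and is spanned by

L(\alpha_1)=\left(\begin{array}{ccc}0&0&-\alpha_1\\0&2\alpha_1&0\\-\alpha_1&0&0\end{array}\right),

since x^{\{2\}'}L(\alpha_1)x^{\{2\}}=-2\alpha_1 x_1^2x_2^2+2\alpha_1 x_1^2x_2^2=0 for every x.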

The dimension of the vector x^{\{m\}} is given by


\sigma(n,m)=\binom{n+m-1}{m}

whereas the dimension of the vector \alpha is given by


\omega(n,2m)=\frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).
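
For example, for n = 2 and 2m = 4 one has \sigma(2,2)=3 and \sigma(2,4)=5, hence \omega(2,4)=\frac{1}{2}\cdot 3\cdot 4-5=1, so that L(\alpha) depends on a single scalar parameter; this is the situation of the first example below.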

Then, h(x) is SOS if and only if there exists a vector \alpha such that


H + L(\alpha) \ge 0,

meaning that the matrix H + L(\alpha) is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}} was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.[8]
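
A minimal sketch of this feasibility test, for the quartic h(x)=x_1^4-x_1^2x_2^2+x_2^4 treated in the first example below, might look as follows in Python; the CVXPY modeling package is assumed to be available (it is not part of the SMR literature), and the symmetric matrix variable M plays the role of H+L(\alpha), constrained so that its entries reproduce the coefficients of h:

    import cvxpy as cp

    # Gram matrix M = H + L(alpha) in the monomial basis (x1^2, x1*x2, x2^2)
    M = cp.Variable((3, 3), symmetric=True)

    constraints = [
        M >> 0,                       # LMI: M must be positive semidefinite
        M[0, 0] == 1,                 # coefficient of x1^4
        2 * M[0, 1] == 0,             # coefficient of x1^3 x2
        2 * M[0, 2] + M[1, 1] == -1,  # coefficient of x1^2 x2^2
        2 * M[1, 2] == 0,             # coefficient of x1 x2^3
        M[2, 2] == 1,                 # coefficient of x2^4
    ]

    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve()
    print(problem.status)  # "optimal" means the LMI is feasible, hence h is SOS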

Examples


Consider the form of degree 2m = 4 in n = 2 variables h(x)=x_1^4-x_1^2x_2^2+x_2^4. We have

m=2,~x^{\{m\}}=\left(\begin{array}{c}x_1^2\\x_1x_2\\x_2^2\end{array}\right),
~H+L(\alpha)=\left(\begin{array}{ccc}
1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1
\end{array}\right).

Since there exists \alpha such that H+L(\alpha)\ge 0, namely \alpha_1=1, it follows that h(x) is SOS.
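
In particular, factoring H+L(\alpha) at \alpha_1=1 as C'C with C=\left(\begin{array}{ccc}1&0&-1\\0&1&0\end{array}\right) yields the explicit decomposition h(x)=(x_1^2-x_2^2)^2+(x_1x_2)^2.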

Consider the form of degree 2m = 4 in n = 3 variables h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4. We have

m=2,~x^{\{m\}}=\left(\begin{array}{c}x_1^2\\x_1x_2\\x_1x_3\\x_2^2\\x_2x_3\\x_3^2\end{array}\right),
~H+L(\alpha)=\left(\begin{array}{cccccc}
2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\
-1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\
0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\
-\alpha_1&0&\alpha_4&5&0&-\alpha_6\\
-\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\
-\alpha_3&-\alpha_5&-1&-\alpha_6&0&1
\end{array}\right).

Since H+L(\alpha)\ge 0 for \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57), it follows that h(x) is SOS.
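
The positive semidefiniteness claimed here can be verified numerically; a minimal sketch in Python, assuming NumPy is available (the reported minimum eigenvalue should come out nonnegative up to rounding):

    import numpy as np

    # alpha values quoted above
    a = [1.18, -0.43, 0.73, 1.13, -0.37, 0.57]

    # H + L(alpha) in the monomial basis (x1^2, x1*x2, x1*x3, x2^2, x2*x3, x3^2)
    M = np.array([
        [ 2,      -1.25,       0,         -a[0], -a[1],   -a[2]],
        [-1.25,    2*a[0],     0.5+a[1],   0,    -a[3],   -a[4]],
        [ 0,       0.5+a[1],   2*a[2],     a[3],  a[4],   -1   ],
        [-a[0],    0,          a[3],       5,     0,      -a[5]],
        [-a[1],   -a[3],       a[4],       0,     2*a[5],  0   ],
        [-a[2],   -a[4],      -1,         -a[5],  0,       1   ],
    ])

    # The matrix is positive semidefinite iff its smallest eigenvalue is >= 0
    print(np.linalg.eigvalsh(M).min())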

Generalizations

Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms G_1(x),\ldots,G_k(x) of degree m such that


F(x)=\sum_{i=1}^k G_i(x)'G_i(x) .
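
For instance, the 2 \times 2 matrix form F(x)=\left(\begin{array}{cc}x_1^2&x_1x_2\\x_1x_2&x_2^2\end{array}\right) of degree 2 is SOS with k = 1, since F(x)=G_1(x)'G_1(x) with G_1(x)=\left(\begin{array}{cc}x_1&x_2\\0&0\end{array}\right).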

Matrix SMR

Establishing whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as


F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)

where \otimes is the Kronecker product of matrices, H is any symmetric matrix satisfying


F(x)=\left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)

and L(\alpha) is a linear parameterization of the linear space


\mathcal{L}=\left\{L=L':~\left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.

The dimension of the vector \alpha is given by


\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).
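
For example, for n = 2, 2m = 2 and r = 2 one has \sigma(2,1)=2 and \sigma(2,2)=3, hence \omega(2,2,2)=\frac{1}{2}\cdot 2\left(2(2\cdot 2+1)-3\cdot 3\right)=1.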

Then, F(x) is SOS if and only if there exists a vector \alpha such that the following LMI holds:


H+L(\alpha) \ge 0.

The expression F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.

Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X_1,\ldots,X_n) and equipped with the involution ^T, such that ^T fixes R and X_1,\ldots,X_n and reverses words formed by X_1,\ldots,X_n. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f=f^T. A symmetric noncommutative polynomial f is said to be matrix-positive if, whenever real symmetric matrices of any dimension r \times r are substituted for the letters X_1,\ldots,X_n, the resulting matrix is positive semidefinite.

A noncommutative polynomial is SOS if there exist noncommutative polynomials h_1,\ldots,h_k such that

 f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).
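
For instance, f(X)=X_1^2+X_1X_2+X_2X_1+X_2^2 is SOS with k = 1, since f(X)=(X_1+X_2)^T(X_1+X_2).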

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]

References

  1. Hilbert, David (September 1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen 32 (3): 342–350. doi:10.1007/bf01443605.
  2. Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's papers in pure and applied mathematics 46: 385–405.
  3. Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (May 2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and its Applications 496: 114–120. doi:10.1016/j.laa.2016.01.024.
  4. Lasserre, Jean B. "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik 89 (5): 390–398. doi:10.1007/s00013-007-2251-y.
  5. Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials" (PDF). Journal of Pure and Applied Algebra 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
  6. Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review 49 (4): 651–669. doi:10.1137/070693709.
  7. Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Proceedings of the 5th European Control Conference. Karlsruhe, Germany: IEEE. pp. 1446–1451.
  8. Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". Proceedings of Symposia in Pure Mathematics. pp. 103–125.
  9. Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Proceedings of the 42nd IEEE Conference on Decision and Control. Maui, Hawaii: IEEE. pp. 4670–4675.
  10. Helton, J. William (September 2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics 156 (2): 675–694. doi:10.2307/3597203.
  11. Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (25 October 2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications 55 (1): 137–153. doi:10.1007/s10589-012-9513-8.
