This chapter introduces the heart of SOS programming, namely SOS polynomials, i.e., polynomials that can be expressed as sums of squares of polynomials. To this end, the chapter first recalls the standard representation of polynomials based on vectors of monomials, referred to as power vectors. It is then shown how polynomials can be represented through a quadratic form in an extended space by using power vectors, a representation known in the literature as the Gram matrix method or the square matricial representation (SMR). SOS polynomials are then introduced, showing in particular that a necessary and sufficient condition for a polynomial to be SOS is that the polynomial admits a positive semidefinite SMR matrix, a condition that amounts to an LMI feasibility test. This result is subsequently reformulated by introducing the SOS index, which quantifies the extent to which a polynomial is SOS and which can be computed by solving an SDP. The chapter proceeds by introducing the representation of parameter-dependent polynomials and matrix polynomials via power vectors and the SMR, the concepts of SOS parameter-dependent polynomials and SOS matrix polynomials, and their characterization via the SMR and LMIs. Then, the problem of extracting power vectors from linear subspaces is addressed, which will play a key role in establishing optimality in SOS programming and for which a simple solution based on computing the roots of a univariate polynomial is presented. Lastly, the gap between positive polynomials and SOS polynomials is briefly discussed, recalling classical and recent results. The studies mentioned above are presented for the case of general polynomials and, where appropriate, for the special cases of locally quadratic polynomials (i.e., polynomials without constant and linear monomials) and homogeneous polynomials (i.e., polynomials whose monomials all have the same degree), which will be exploited throughout the book.
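As a minimal illustration of the SMR/Gram-matrix test sketched above, the following Python snippet (a toy example, assuming NumPy; the polynomial, power vector, and SMR matrix are chosen for illustration and are not from the chapter) verifies that p(x) = x^4 + 2x^2 + 1 admits a positive semidefinite SMR matrix with respect to the power vector z(x) = (1, x, x^2)^T, thereby certifying that p is SOS:

```python
import numpy as np

# Toy example: p(x) = x^4 + 2x^2 + 1 with power vector z(x) = (1, x, x^2)^T.
# Then p(x) = z(x)^T Q z(x) for a symmetric SMR matrix Q; the coefficient 2
# of x^2 splits as Q[1,1] + 2*Q[0,2] = 0 + 2*1 in the choice below.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Sanity check: the quadratic form reproduces p at a sample point.
x = 0.7
z = np.array([1.0, x, x**2])
assert abs(z @ Q @ z - (x**4 + 2 * x**2 + 1)) < 1e-12

# p is SOS if and only if some SMR matrix of p is positive semidefinite;
# this particular Q is PSD, so it certifies the SOS property directly.
eigvals, eigvecs = np.linalg.eigh(Q)
assert np.min(eigvals) > -1e-9  # all eigenvalues nonnegative

# An explicit SOS decomposition follows from the eigendecomposition:
# p(x) = sum_i lam_i * (v_i^T z(x))^2, which here recovers (x^2 + 1)^2.
```

For polynomials whose SMR matrix is not unique, one searches over the affine family of SMR matrices for a PSD member, which is the LMI feasibility test mentioned in the chapter; a semidefinite programming solver would be used in place of the fixed matrix above.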