MathGroup Archive 2001


Re: Sum of Squares

  • To: mathgroup at smc.vnet.net
  • Subject: [mg26600] Re: [mg26587] Sum of Squares
  • From: Daniel Lichtblau <danl at wolfram.com>
  • Date: Thu, 11 Jan 2001 10:39:12 -0500 (EST)
  • References: <200101090652.BAA00248@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Will Self wrote:
> 
> Is there, or could there be, a "sum of squares" function for
> multivariate polynomials, which would rewrite an expression as a sum of
> squares, when possible?
> 
> It might work like this:
> 
> SumOfSquares[5 x^2 + 8 x y + 5 y^2 + 2 x z - 2 y z + 2 z^2]
> 
> --->
> 
> (x + 2y - z)^2 + (2x + y + z)^2
> 
> Will Self
> email appreciated
> 


In general this is not feasible: most polynomials are not sums of squares
at all, and even a polynomial that is nonnegative everywhere need not be
one (Motzkin's x^4*y^2 + x^2*y^4 - 3*x^2*y^2 + 1 is the standard example).

For the special case of a homogeneous quadratic polynomial one can do
this (not uniquely, as you'll see) by diagonalizing the symmetric
quadratic form matrix for that polynomial.

In[139]:= poly = 5*x^2 + 8*x*y + 5*y^2 + 2*x*z - 2*y*z + 2*z^2;

In[140]:= vars = {x,y,z};

In[141]:= qform[poly_, vars_] := With[{jac = Map[D[poly,#]&, vars]},
        1/2 * Map[D[jac,#]&, vars]]          

In[142]:= mat = qform[poly, vars]
Out[142]= {{5, 4, 1}, {4, 5, -1}, {1, -1, 2}}

In[143]:= {evals, evecs} = Eigensystem[mat]
Out[143]= {{0, 3, 9}, {{-1, 1, 1}, {1, -1, 2}, {1, 1, 0}}}

Since our matrix is symmetric we can diagonalize using an orthogonal
matrix, which means in effect we must normalize the eigenvectors.
Actually this need not always be sufficient because there can be
eigenspaces of dimension larger than one if we have multiple
eigenvalues; in this case one would need to use e.g. Gram-Schmidt or
QRDecomposition to make them pairwise orthogonal. Not an issue in this
example.
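
As an aside (not part of the original session), later versions of
Mathematica provide Orthogonalize, which does the Gram-Schmidt step and
the normalization in one call. Eigenvectors of a symmetric matrix
belonging to distinct eigenvalues are already orthogonal, so it only
mixes vectors within a repeated eigenspace, and the rows remain
eigenvectors of mat:

(* sketch; equivalent to the manual normalization below when all
   eigenvalues are distinct *)
orthonormalvecs = Orthogonalize[evecs];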

To normalize them:

In[146]:= mults = Map[1/Sqrt[#.#]&, evecs];

In[147]:= InputForm[newvecs = mults*evecs]
Out[147]//InputForm= 
{{-(1/Sqrt[3]), 1/Sqrt[3], 1/Sqrt[3]},
 {1/Sqrt[6], -(1/Sqrt[6]), Sqrt[2/3]},
 {1/Sqrt[2], 1/Sqrt[2], 0}}

Check that our change-of-basis matrix is orthogonal:

In[148]:= Transpose[newvecs] . newvecs == IdentityMatrix[3]
Out[148]= True

With respect to the above change-of-basis matrix newvecs our quadratic
form is now diagonalized, and moreover we have
Inverse[newvecs]==Transpose[newvecs]. Thus we will get a new set of
linear polynomials, newvecs . vars, for which we'll have our sum of
squares.

In[149]:= linpolys = newvecs . vars;

In[150]:= sumofsquares = evals . linpolys^2
Out[150]= 9 (x/Sqrt[2] + y/Sqrt[2])^2 + 3 (x/Sqrt[6] - y/Sqrt[6] + Sqrt[2/3] z)^2

Quick check:

In[153]:= Expand[sumofsquares]==poly
Out[153]= True
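
For reference, the steps above can be wrapped up into a function of the
sort you asked about. This is only a sketch: it reuses the qform defined
in In[141], it applies only to homogeneous quadratics in the given
variables, and (per the remark about multiple eigenvalues above) it
assumes the eigenvectors come out pairwise orthogonal.

SumOfSquares[poly_, vars_List] :=
  Module[{mat, evals, evecs, newvecs, linpolys},
    mat = qform[poly, vars];              (* symmetric matrix of the form *)
    {evals, evecs} = Eigensystem[mat];
    newvecs = Map[#/Sqrt[#.#]&, evecs];   (* normalize the eigenvectors *)
    linpolys = newvecs . vars;            (* new linear coordinates *)
    evals . linpolys^2]                   (* weighted sum of squares *)

SumOfSquares[poly, {x,y,z}] should reproduce Out[150] above.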


If this change-of-basis part is unclear, use the facts that

(i) Transpose[newvecs] . DiagonalMatrix[evals] . newvecs == mat
(ii) poly == vars . mat . vars

Hence

poly == Transpose[newvecs.vars] . DiagonalMatrix[evals] . (newvecs.vars)
 == linpolys . DiagonalMatrix[evals] . linpolys
 == evals . linpolys^2
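
Both facts are easy to check in the same session (each line below should
return True; Simplify is just a precaution in case the radicals are not
combined automatically):

(* fact (i): the change of basis recovers the original matrix *)
Simplify[Transpose[newvecs] . DiagonalMatrix[evals] . newvecs == mat]

(* fact (ii): the quadratic form reproduces the polynomial *)
Expand[vars . mat . vars] == Expand[poly]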


Daniel Lichtblau
Wolfram Research

