MathGroup Archive 2001

Re: Proof For Cov. Matrix equation?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg30005] Re: [mg29904] Proof For Cov. Matrix equation?
  • From: Daniel Lichtblau <danl at wolfram.com>
  • Date: Sat, 21 Jul 2001 00:49:12 -0400 (EDT)
  • References: <200107180608.CAA18955@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Robert Parry wrote:
> 
> I know I've already posted this message to alt.math.recreational (5 days
> ago) but to date I have had no reply. Could anyone help me with the
> following derivation/proof?
> 
> I'm looking for a short proof for the nxn version of the following problem.
> I managed to prove it for the 2x2 case but ended up with a 3 page proof so
> I'd hate to attempt the proof for larger matrices.
> 
> Here's the 2x2 case: (For the nxn case x,y becomes x1,x2,...,xn etc.)
> 
> Let x, y & r be 3 random variables (in this case timeseries)
> now r = x*Vx + y*Vy is a weighted mean (where Vx & Vy are weights with
> Vx + Vy = 1)
> Let the Covariance Matrices H and T be defined as
> 
> H = [ var[x] covar[x,y]; covar[x,y] var[y] ]
> (Historic Covariance)
> 
> and T = [ var[x-r] covar[x-r,y-r]; covar[x-r,y-r] var[y-r] ]
> (Tracking Error Covariance)
> 
> Prove that
> 
> [ Wx Wy ].T.[ Wx; Wy ] = [ Wx-Vx Wy-Vy ].H.[ Wx-Vx; Wy-Vy ]
> where Wx + Wy = 1
> 
> (remember that:
>     covar[x,y] = covar[y,x]
>     covar[a*x1+b*x2,y] = a*covar[x1,y] + b*covar[x2,y]
> 
> - i.e. symmetric and bilinear)
> 
> Thanks in advance,
> Robert


This being a Mathematica-related news group, we'll warm up with some
Mathematica code to automate the proofs for low dimension.

First some notation. I use avec = {a[1],a[2],...} where you have
{Vx, Vy}, bvec = {b[1],b[2],...} where you have {Wx, Wy}, and
xvars = {x[1],x[2],...} where you have {x, y} (or {x1,x2,...}). Thus
your 'r' becomes avec.xvars. For the covariance matrix I use elements
of the form v[x[i],x[j]], abbreviated in turn as v[i,j].
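
For concreteness, here is the notation instantiated at dimension 2 (an
illustration only; the code below generates these with Array):

avec = {a[1], a[2]};    (* your {Vx, Vy} *)
bvec = {b[1], b[2]};    (* your {Wx, Wy} *)
xvars = {x[1], x[2]};   (* your {x, y} *)
r = avec.xvars          (* a[1]*x[1] + a[2]*x[2], your x*Vx + y*Vy *)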

Now we put in place rules that make the covariance matrix symmetric and
furthermore make the function v[a,b] bilinear.

(* canonical order: v[i,j] with i <= j *)
v[i_Integer, j_Integer] /; j < i := v[j, i]
(* abbreviate v[x[i],x[j]] as v[i,j] *)
v[x[i_Integer], x[j_Integer]] := v[i, j]
(* pull scalar factors out of either slot *)
v[s_*x[i_Integer], y_] /; Head[s] =!= x := s*v[x[i], y]
v[y_, s_*x[i_Integer]] /; Head[s] =!= x := s*v[y, x[i]]
(* additivity in either slot *)
v[a_ + b_, y_] := v[a, y] + v[b, y]
v[y_, a_ + b_] := v[y, a] + v[y, b]
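
As a quick sanity check that these rules give symmetry and
bilinearity, the reductions one should see are indicated in comments:

v[2*x[1] + x[2], x[3]]    (* reduces to 2*v[1, 3] + v[2, 3] *)
v[x[3], x[1]]             (* reduces to v[1, 3] *)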

With this we now form a function to demonstrate the identity in a
given dimension. We define local variables of the form noted above,
form the "translated" variables transvars = xvars - avec.xvars (whose
j-th element is x[j] - avec.xvars), form the matrix tmat by applying v
to all pairs {transvars[[i]], transvars[[j]]}, and take the difference
of the two dot products in question.

At this point it suffices to simplify the expression subject to the
relation b[1]+...+b[dim] == 1. Note that this is the only restriction
we need enforce on the vectors (in particular we do not care about the
sum of the a values). But we can make this much more efficient in
higher dimensions if we first rewrite the difference as a polynomial
whose monomials are built from the a[i], x[j], and v[m,n], with
coefficients in the b[k]. We can then simplify each coefficient
individually (far cheaper) and check that they all simplify to zero.
Clearly this suffices: the relation among the b[k] cannot simplify the
whole thing to zero unless it simplifies each such coefficient to
zero.

Finally we output the leaf count of the difference and also whether the
difference simplified to zero: True means our identity is proven.

identityProof[dim_Integer] /; dim > 0 := Module[
    {vmat, avec, bvec, xvars, transvars, tmat, diff, coeffs, rel},
    vmat = Array[v, {dim, dim}];
    avec = Array[a, {dim}];
    bvec = Array[b, {dim}];
    xvars = Array[x, {dim}];
    transvars = xvars - avec.xvars;
    (* tracking-error covariance: v applied to all pairs of transvars *)
    tmat = Outer[v, transvars, transvars];
    diff = bvec.tmat.bvec - (bvec - avec).vmat.(bvec - avec);
    (* coefficients of diff, viewed as a polynomial in the a[i],
       x[j], and v[m,n] with coefficients in the b[k] *)
    coeffs =
      Map[#[[2]] &,
        First[Internal`DistributedTermsList[diff,
            Join[avec, xvars, Flatten[vmat]]]]];
    rel = Apply[Plus, bvec] == 1;
    (* return the size of diff, and whether all coefficients vanish *)
    {LeafCount[diff],
      Apply[And, Map[(Simplify[#, rel] == 0) &, coeffs]]}
    ]

In[11]:= Table[Timing[identityProof[j]], {j,1,15}]
Out[11]= {{0.04 Second, {46, True}}, {0.15 Second, {333, True}}, 
  {0.23 Second, {1177, True}}, {0.81 Second, {3097, True}}, 
  {1.61 Second, {6771, True}}, {2.9 Second, {13045, True}}, 
  {4.74 Second, {22933, True}}, {7.54 Second, {37617, True}}, 
  {11.32 Second, {58447, True}}, {16.82 Second, {86941, True}}, 
  {24.37 Second, {124785, True}}, {33.59 Second, {173833, True}}, 
  {46.19 Second, {236107, True}}, {61.27 Second, {313797, True}}, 
  {82.49 Second, {409261, True}}}
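
An aside: Internal`DistributedTermsList is an undocumented internal
function. In later versions of Mathematica the coeffs line inside
identityProof could, I believe, be replaced by the documented
CoefficientRules (a sketch, assuming it accepts the same generalized
"variables" such as v[1,2]):

coeffs = Map[Last,
    CoefficientRules[diff, Join[avec, xvars, Flatten[vmat]]]];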

So with a couple dozen lines of code we can verify the identity
automatically in any not-too-large dimension. Now for a general proof.

Again, some notation. Your matrix H (I'll refer to it below as hmat) has
entries H[[i,j]] equal to what I have called v[i,j]. Your T (what I
called tmat in identityProof) has entries T[[i,j]] equal to

v[x[i]-avec.xvars, x[j]-avec.xvars]

This expands (using, more or less, Mathematica notation) as

tmat[[i,j]] = v[i,j] - Sum[a[k]*(v[i,k]+v[k,j]),{k,dim}] +
  Sum[a[k]*a[m]*v[k,m],{k,dim},{m,dim}]

Thus tmat may be written as hmat + mat2 + mat3 with mat2 and mat3
elements given by the second and third sums respectively.
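
We can have the v rules from the warm-up confirm this expansion in a
small dimension; the difference below should Expand to zero (a sketch
reusing those rules):

dim = 3;
avec = Array[a, dim]; xvars = Array[x, dim];
transvars = xvars - avec.xvars;
Expand[v[transvars[[1]], transvars[[2]]] -
    (v[1, 2] - Sum[a[k]*(v[1, k] + v[k, 2]), {k, dim}] +
      Sum[a[k]*a[m]*v[k, m], {k, dim}, {m, dim}])]   (* expect 0 *)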

Now we look at the equation we are to show. We have on the left side
bvec.tmat.bvec and on the right side (bvec-avec).hmat.(bvec-avec). The
left side expands symbolically as

bvec.hmat.bvec + bvec.mat2.bvec + bvec.mat3.bvec

Note that I am using a Mathematica convention which, for purposes of dot
products with matrices, treats vectors as row or column according to
whichever makes sense (so on the left of a matrix it's a row vector, on
the right it's a column vector).

The right side expands (using the symmetry of hmat) as

bvec.hmat.bvec - 2*avec.hmat.bvec + avec.hmat.avec
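
(This too is easy to confirm symbolically: with the v rules loaded,
Array[v, {dim, dim}] is automatically symmetric by canonicalization,
and the following should Expand to zero.)

dim = 3;
hmat = Array[v, {dim, dim}];
avec = Array[a, dim]; bvec = Array[b, dim];
Expand[(bvec - avec).hmat.(bvec - avec) -
    (bvec.hmat.bvec - 2*avec.hmat.bvec + avec.hmat.avec)]   (* expect 0 *)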

We will show that

(i) bvec.mat3.bvec == avec.hmat.avec

and

(ii) bvec.mat2.bvec == -2*avec.hmat.bvec

under the assumption that Sum[b[i],{i,dim}] == 1; this suffices to
prove the identity.

First, avec.hmat.avec expands as Sum[a[k]*a[m]*v[k,m],{k,dim},{m,dim}],
which is the same as each element of mat3 as defined above. We will
call that sum s. Since all elements of mat3 are the same, mat3.bvec is
simply a vector with all components given by s*Sum[b[j],{j,dim}].
Under our assumption this in turn is just s, so the matrix-times-vector
product gives svec = Table[s,{dim}]. Similarly bvec.svec =
Sum[b[j],{j,dim}]*s = s, again using our assumption. This finishes (i).

We now split mat2 into mat2a and mat2b where

mat2a[[i,j]] = -Sum[a[k]*v[i,k],{k,dim}]

and

mat2b[[i,j]] = -Sum[a[k]*v[k,j],{k,dim}]

We see that mat2b = Transpose[mat2a] (using the symmetry of v) and
hence bvec.mat2.bvec = 2*bvec.mat2a.bvec. Moreover each row of mat2a
is constant, and we have (mat2a.bvec)[[i]] given by the product
-Sum[a[k]*v[i,k],{k,dim}]*Sum[b[j],{j,dim}]. This just reduces to
-Sum[a[k]*v[i,k],{k,dim}] under the operating assumption about bvec.
So we obtain

bvec.mat2.bvec = -2*bvec.Table[Sum[a[k]*v[i,k],{k,dim}],{i,dim}]

This Table is Mathematica notation for the vector described above as
mat2a.bvec; if it helps readability, a more conventional ASCII
notation might be

{Sum[a[k]*v[1,k],{k,dim}], Sum[a[k]*v[2,k],{k,dim}], ... ,
  Sum[a[k]*v[dim,k],{k,dim}]}

Now just notice that this is exactly the same as -2*bvec.hmat.avec,
which equals -2*avec.hmat.bvec by the symmetry of hmat. This
demonstrates (ii) and thus finishes the proof of the identity.
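
For the skeptical reader, (i) and (ii) can themselves be checked
symbolically in a small dimension. A sketch (again reusing the v
rules; eliminating b[dim] enforces the assumption on bvec exactly, so
both differences should Expand to zero):

dim = 3;
hmat = Array[v, {dim, dim}];
avec = Array[a, dim]; bvec = Array[b, dim];
mat2 = Table[-Sum[a[k]*(v[i, k] + v[k, j]), {k, dim}],
    {i, dim}, {j, dim}];
mat3 = Table[Sum[a[k]*a[m]*v[k, m], {k, dim}, {m, dim}],
    {i, dim}, {j, dim}];
elim = b[dim] -> 1 - Sum[b[j], {j, dim - 1}];
Expand[(bvec.mat3.bvec - avec.hmat.avec) /. elim]     (* (i): expect 0 *)
Expand[(bvec.mat2.bvec + 2*avec.hmat.bvec) /. elim]   (* (ii): expect 0 *)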


Daniel Lichtblau
Wolfram Research

