MathGroup Archive 2001


Proof For Cov. Matrix equation?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg29904] Proof For Cov. Matrix equation?
  • From: "Robert Parry" <rparry at ct.bbd.co.za>
  • Date: Wed, 18 Jul 2001 02:08:47 -0400 (EDT)
  • Sender: owner-wri-mathgroup at wolfram.com

I know I've already posted this message to alt.math.recreational (5 days
ago) but to date I have had no reply. Could anyone help me with the
following derivation/proof?

I'm looking for a short proof of the nxn version of the following problem.
I managed to prove the 2x2 case, but my proof ran to three pages, so I'd
hate to attempt the same approach for larger matrices.

Here's the 2x2 case (for the nxn case, x, y becomes x1, x2, ..., xn, etc.):

Let x, y and r be three random variables (in this case time series), where
r = x*Vx + y*Vy is a weighted mean (Vx and Vy are weights with Vx + Vy = 1).
Let the Covariance Matrices H and T be defined as

H = [ var[x] covar[x,y]; covar[x,y] var[y] ]
(Historic Covariance)

and T = [ var[x-r] covar[x-r,y-r]; covar[x-r,y-r] var[y-r] ]
(Tracking Error Covariance)
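
For concreteness, here is a rough Mathematica sketch of how H and T might be
estimated from simulated series (the names series, covH and covT are just
placeholders for this illustration, not part of the problem statement):

n = 2;                                 (* 2x2 case; increase n for the nxn case *)
len = 1000;                            (* length of each simulated time series *)
SeedRandom[7];
series = RandomVariate[NormalDistribution[0, 1], {n, len}];   (* rows play the role of x, y *)
v = Normalize[RandomReal[{0, 1}, n], Total];                  (* weights Vx, Vy with total 1 *)
r = v.series;                                                 (* the weighted-mean series r *)
covH = Covariance[Transpose[series]];                         (* H: historic covariance *)
covT = Covariance[Transpose[series - ConstantArray[r, n]]];   (* T: tracking-error covariance *)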


Prove that

[ Wx Wy ].T.[ Wx; Wy ] = [ Wx-Vx Wy-Vy ].H.[ Wx-Vx; Wy-Vy ]
where Wx + Wy = 1


(Remember that
    covar[x,y] = covar[y,x]
    covar[a*x1+b*x2,y] = a*covar[x1,y] + b*covar[x2,y]
i.e. covariance is symmetric and bilinear.)
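
As a sanity check (not a proof), the 2x2 identity can be verified symbolically
in Mathematica by using the bilinearity above to write T = A.H.Transpose[A],
where A expresses x-r and y-r in terms of x and y. The symbols vxx, vxy, vyy,
v1, v2, w1, w2 below are placeholders for var[x], covar[x,y], var[y], Vx, Vy,
Wx, Wy:

covH = {{vxx, vxy}, {vxy, vyy}};       (* H in terms of var[x], covar[x,y], var[y] *)
a = {{1 - v1, -v2}, {-v1, 1 - v2}};    (* x - r = (1-v1) x - v2 y,  y - r = -v1 x + (1-v2) y *)
covT = a.covH.Transpose[a];            (* T follows from bilinearity of covar *)
diff = {w1, w2}.covT.{w1, w2} -
       ({w1, w2} - {v1, v2}).covH.({w1, w2} - {v1, v2});
Expand[diff /. w2 -> 1 - w1]           (* gives 0, so the identity holds when w1 + w2 == 1 *)

The same construction appears to carry over to the nxn case with
A = IdentityMatrix[n] - ConstantArray[v, n], since w.A reduces to w - v
whenever Total[w] == 1.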

Thanks in advance,
Robert




