Proof For Cov. Matrix equation?
I know I've already posted this message to alt.math.recreational (5 days ago), but to date I have had no reply. Could anyone help me with the following derivation/proof? I'm looking for a short proof of the nxn version of the following problem. I managed to prove the 2x2 case, but ended up with a 3-page proof, so I'd hate to attempt the proof for larger matrices.

Here's the 2x2 case (for the nxn case, x, y become x1, x2, ..., xn, etc.):

Let x, y and r be three random variables (in this case time series), where

    r = x*Vx + y*Vy

is a weighted mean (Vx and Vy are weights with Vx + Vy = 1).

Let the covariance matrices H and T be defined as

    H = [ var[x]      covar[x,y] ]    (Historic Covariance)
        [ covar[x,y]  var[y]     ]

    T = [ var[x-r]        covar[x-r,y-r] ]    (Tracking Error Covariance)
        [ covar[x-r,y-r]  var[y-r]       ]

Prove that

    [ Wx  Wy ] . T . [ Wx ]  =  [ Wx-Vx  Wy-Vy ] . H . [ Wx-Vx ]
                     [ Wy ]                            [ Wy-Vy ]

where Wx + Wy = 1.

(Remember that covar[x,y] = covar[y,x] and covar[a*x1 + b*x2, y] = a*covar[x1,y] + b*covar[x2,y] -- i.e. covariance is symmetric and bilinear.)

Thanks in advance,
Robert
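A quick numerical sanity check of the claimed identity (a sketch, not a proof; the series and weight values below are arbitrary illustrative choices). The check relies on the observation that, because Wx + Wy = 1, the combination Wx*(x-r) + Wy*(y-r) equals (Wx-Vx)*x + (Wy-Vy)*y, so both quadratic forms are the variance of the same quantity:

```python
import random

def cov(a, b):
    # Population covariance (1/n) of two equal-length series.
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n

random.seed(0)
n = 1000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.5 * xi + random.gauss(0, 1) for xi in x]   # correlated with x

Vx, Vy = 0.3, 0.7   # benchmark weights, Vx + Vy = 1
Wx, Wy = 0.8, 0.2   # portfolio weights, Wx + Wy = 1

r  = [Vx * xi + Vy * yi for xi, yi in zip(x, y)]  # weighted mean series
xr = [xi - ri for xi, ri in zip(x, r)]            # x - r
yr = [yi - ri for yi, ri in zip(y, r)]            # y - r

# Left side: [Wx Wy] . T . [Wx; Wy], expanded term by term
lhs = Wx**2 * cov(xr, xr) + 2 * Wx * Wy * cov(xr, yr) + Wy**2 * cov(yr, yr)

# Right side: [Wx-Vx Wy-Vy] . H . [Wx-Vx; Wy-Vy]
dx, dy = Wx - Vx, Wy - Vy
rhs = dx**2 * cov(x, x) + 2 * dx * dy * cov(x, y) + dy**2 * cov(y, y)

print(abs(lhs - rhs) < 1e-9)  # the two quadratic forms agree
```

The same expansion works for the nxn case: with weight vectors w and v summing to 1, w'Tw = var[w'z - r] = var[(w - v)'z] = (w - v)'H(w - v), using the bilinearity of covariance.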