MathGroup Archive 1998


Re: Differentiation ?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg14703] Re: [mg14668] Differentiation ?
  • From: Jurgen Tischer <jtischer at col2.telecom.com.co>
  • Date: Tue, 10 Nov 1998 01:20:57 -0500
  • Organization: Universidad del Valle
  • References: <004d01be0b2c$71a4e0d0$0100a8c0@bertha.bureau>
  • Sender: owner-wri-mathgroup at wolfram.com

Well Yves,
I don't think math works differently in statistics, so let me try once
more.
You start with
  e'e = (y - X b)'(y - X b)
where X is in M[n,m+1], b is in R^(m+1) (I hope not in R^(m-1) as you
wrote), and e and y are in R^n. First of all, you have an error in your
expansion; it should read
  e'e = y'y - (X b)'y - y'X b + (X b)'X b = y'y - b'X'y - y'X b + b'X'X b.
Now since b'X'y is a real, it is equal to its transpose y'X b, so you
can combine the two middle terms and get
  e'e = y'y - 2 y'X b + b'X'X b.
Next we differentiate the function f : R^(m+1) --> R, f(b) = e'e, with
respect to b. Again the first term is constant, so its derivative is
zero; the second term is linear, so it is its own derivative; and the
third term is quadratic, so by the rule for bilinear functions (for
g(b) = b'A b one has Dg(b)(u) = u'A b + b'A u) we have

Df(b)(u) = -2 y'X u + u'X'X b + b'X'X u

As before, the second term is a real, so we can transpose it and
combine it with the third to get

Df(b)(u) = -2 y'X u + 2 b'X'X u = 2 ( b'X'X - y'X ) u

So the derivative of f at the point b is

Df(b) = 2 ( b'X'X - y'X )

The condition for minimality gives 

b'X'X = y'X

or, taking the transpose of both sides,

X'X b = X'y.

So let's come back to Mathematica. First of all, I think you should
consult the Statistics`LinearRegression` package (an illustration with
its Regress function follows below). Checking the validity of the
above formula on a general example is now much easier, since we can
work with plain vectors instead of matrices:
 
In[1]:=
y = {y1, y2};
b = {b1, b2, b3};
X = {{x11, x12, x13}, {x21, x22, x23}};

In[2]:= Outer[D, {(y - X.b).(y - X.b)}, {b1, b2, b3}][[1]] ==
    2 (b.Transpose[X].X - y.X) // Simplify

Out[2]= True

You notice I didn't use Transpose on the vectors; this is due to the
definition of Dot. To allow the natural identification of M(n,1) and so
on with R^n, and of M(1,1) with R, Dot accepts things like
{y1,y2}.{y1,y2}, which in matrix form would be {{y1,y2}}.{{y1},{y2}}.
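For instance, just to make the identification visible (this little
check is only an illustration):

In[3]:= {{y1, y2}.{y1, y2}, {{y1, y2}}.{{y1}, {y2}}}

Out[3]= {y1^2 + y2^2, {{y1^2 + y2^2}}}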
If you insist on checking the above formula with all the transposes,
you can do the following:

In[4]:=
y = {{y1}, {y2}};
b = {{b1}, {b2}, {b3}};
X = {{x11, x12, x13}, {x21, x22, x23}};

In[5]:= Outer[D, Transpose[y - X.b].(y - X.b), {b1, b2, b3}][[1]] ==
    2 (Transpose[b].Transpose[X].X - Transpose[y].X) // Simplify

Out[5]= True
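Finally, to see the normal equations at work, here is a small made-up
data set (the numbers are only an illustration); solving X'X b = X'y
with LinearSolve gives the same coefficients as Fit:

In[6]:=
X = {{1, 0.}, {1, 1.}, {1, 2.}, {1, 3.}};
y = {1.1, 1.9, 3.2, 3.9};
LinearSolve[Transpose[X].X, Transpose[X].y]

Out[6]= {1.07, 0.97}

In[7]:= Fit[{{0, 1.1}, {1, 1.9}, {2, 3.2}, {3, 3.9}}, {1, x}, x]

Out[7]= 1.07 + 0.97 x

And, if I remember the package interface correctly, Regress from
Statistics`LinearRegression` does all of this (plus the diagnostics)
in one call:

In[8]:=
<<Statistics`LinearRegression`
Regress[{{0, 1.1}, {1, 1.9}, {2, 3.2}, {3, 3.9}}, {1, x}, x]

The output is a list of rules containing the parameter table, R^2, the
ANOVA table, and so on.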

Yves Gauvreau wrote:
> I'll give you the author's solution approach in full.
> 
> """""""""""""""""""
> This is the general linear regression model
> 
>         Y = XB + E                                                   eq 1
> 
> where
> 
> Y is an n x 1 matrix of observed values
> X is an n x (m+1) matrix of independent variables
> B is an (m-1) x 1 matrix of the unknown parameters
> E is an n x 1 matrix of error terms
> 
> The sum of squares to be minimized is written in matrix form as
> 
>         E'E = (Y - XB)' (Y - XB) = Y'Y - B'X'Y + B'X'XB                eq 2
> 
> To minimize this function, we take the derivative with respect to the matrix
> B and get the following:
> 
>         D[E'E,B] = -2 X'Y + 2X'XB                                    eq 3
> 
> Equating to zero yields:
> 
>         (X'X)Bhat = X'Y                                              eq 4
> 
> The solution to this matrix equation is
> 
>         Bhat = inverse(X'X)X'Y                                       eq 5
> 
> """"""""""""""""""""""
> 
> As you can see, the final solution is correct. As you mentioned, I had
> the same concern about the -2 in equation 3, and that was the reason
> for my query: I wanted to know if I could solve this in Mathematica
> and arrive at the same result. Due to the way Mathematica does things,
> I found that I could not set up the problem literally, and I hoped
> someone had an idea on how to do it.
> 
> Thanks
> 
> Yves


