MathGroup Archive 1998


Re: Differentiation ?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg14702] Re: [mg14668] Differentiation ?
  • From: Yves Gauvreau <gauy at videotron.ca>
  • Date: Sun, 8 Nov 1998 21:15:58 -0500
  • Sender: owner-wri-mathgroup at wolfram.com

-----Original Message-----
From: Jurgen Tischer <jtischer at col2.telecom.com.co>
To: mathgroup at smc.vnet.net
Subject: [mg14702] Re: [mg14668] Differentiation ?


>Hi Yves,
>as to your first problem:
>
>In[1]:= s = Sum[y[i]^2 - 2 mu y[i] + mu^2, {i, 1, n}]
>
>In[2]:= Evaluate /@ D[s, mu]
>
>Out[2]= Sum[2*mu - 2*y[i], {i, 1, n}]
>
>You need the Evaluate/@ because Sum has the Attribute HoldAll. Now if
>you insist on the formula you ended up with (or better, on the correct
>version), you would have to write some simplification rules to apply to
>the result, something like:
>
>In[3]:= simplifySum={
>    Sum[x_+y_.,{n_,na_,ne_}]/;FreeQ[x,n]:>(ne-na+1)x+ Sum[y,{n,na,ne}],
>    Sum[x_ y_,{n_,na_,ne_}]/;FreeQ[x,n]:>x Sum[y,{n,na,ne}]}
>
>In[4]:= Evaluate/@D[s,mu]/.simplifySum
>
>Out[4]= 2*mu*n + Sum[-2*y[i], {i, 1, n}]
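A quick way to sanity-check the simplified form is to compare it with an
explicit sum for a small n, say n = 3 (a minimal sketch, with the y[i]
kept symbolic):

In[5]:= Simplify[(Evaluate /@ D[s, mu] /. simplifySum /. n -> 3) ==
          D[Sum[y[i]^2 - 2 mu y[i] + mu^2, {i, 1, 3}], mu]]

Out[5]= True

Both sides reduce to 6 mu - 2 (y[1] + y[2] + y[3]).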
>
>
>Now to the second problem:
>First of all, I think your formula (apart from the trivial error of a
>factor of 2 in the first term) is wrong. In any case it's a problem of
>interpretation. Let me show what I mean with an example, using matrices
>in M(2,2). My interpretation of your formula E'E = Y'Y - B'X'Y + B'X'XB
>is that you have a function f[B_]:=Y'Y-B'X'Y+B'X'X B where Y, X are
>constants (in M(2,2)), and you are looking for the derivative. By
>linearity of the derivative we can interchange differentiation and
>addition, so: the first term is constant, so its derivative is 0; the
>second term is linear in B, so its derivative is THE SAME linear
>function; and the third term is quadratic, so by the rules for
>derivatives of bilinear forms (see for example Dieudonné, Treatise on
>Analysis) we get altogether D[f,B][U] == -U'X'Y + U'X'X B + B'X'X U. So
>far the treatment without Mathematica; now let's do it with Mathematica.
>We identify M(2,2) with R^4 (just use Flatten on the matrices), so the
>original function now reads
>
>In[1]:=
>x={{x11,x12},{x21,x22}};
>y={{y11,y12},{y21,y22}};
>b={{b11,b12},{b21,b22}};
>u={{u11,u12},{u21,u22}};
>
>In[2]:= f[{b11_,b12_,b21_,b22_}]=
>  Flatten[Transpose[y].y-Transpose[b].Transpose[x].y+
>      Transpose[b].Transpose[x].x.b];
>
>In[3]:= df=Outer[D,f[{b11,b12,b21,b22}],{b11,b12,b21,b22}];
>
>(* This df is in M(4,4) and represents the derivative in the usual
>matrix form. Let's check whether the two results coincide. *)
>
>In[4]:= df.Flatten[u]==
>    Flatten[-Transpose[u].Transpose[x].y+Transpose[u].Transpose[x].x.b+
>        Transpose[b].Transpose[x].x.u]//Simplify
>
>Out[4]= True
>
>This brings us back to your original question: is it possible to
>implement this type of differentiation in Mathematica? As I see it, if
>you need a concrete derivative of that type (in finite, low dimension)
>the method of identifying M(n,n) with R^(n^2) is viable. If you want
>differentiation in Banach spaces in a theoretical fashion, you would
>have to implement it, and I think that would be quite a challenge (think
>only of how to implement the rule for multilinear functions and how to
>identify a multilinear function automatically).
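To illustrate that recipe on its own, here is a minimal sketch for the
quadratic map B -> B.B, reusing the 2 x 2 matrices b and u defined above;
by the product rule for bilinear maps its derivative in the direction U
is U.B + B.U, and the flattened 4 x 4 Jacobian reproduces exactly that:

In[5]:= g = Flatten[b.b];             (* the map B -> B.B, flattened to R^4 *)
        dg = Outer[D, g, Flatten[b]]; (* its 4 x 4 Jacobian matrix *)
        dg.Flatten[u] == Flatten[u.b + b.u] // Simplify

Out[5]= True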
>
>
>Jurgen
>
>
>
>
>Yves Gauvreau wrote:
>>
>> Hi,
>>
>> I saw this equation in a book (Sum => greek SIGMA)
>>
>>         Sum ei^2 = Sum (yi^2 - 2 mu yi + mu^2)
>>
>> That's easy to implement, but there's this one
>>
>>         D[ Sum ei^2, mu] = 2 Sum yi - 2 n mu
>>
>> and this one too
>>
>>         E'E = Y'Y - B'X'Y + B'X'XB
>>
>>         D[E'E,B] = -2 X'Y + 2X'XB
>>
>> the ' means Transpose[]
>>
>> How can I implement this type of differentiation?  Is it possible to
>> do that in Mathematica?
>>
>> Thanks
>> Yves
>
>

I'll give you the author's full solution approach.

"""""""""""""""""""
This is the general linear regression model

        Y = XB + E                                              (eq 1)

where

Y is an n x 1 matrix of observed values,
X is an n x (m+1) matrix of independent variables,
B is an (m+1) x 1 matrix of the unknown parameters,
E is an n x 1 matrix of error terms.

The sum of squares to be minimized is written in matrix form as

        E'E = (Y - XB)' (Y - XB) = Y'Y - B'X'Y + B'X'XB         (eq 2)

To minimize this function, we take the derivative with respect to the
matrix B and get the following:

        D[E'E,B] = -2 X'Y + 2X'XB                               (eq 3)

Equating to zero yields:

        (X'X)Bhat = X'Y                                         (eq 4)

The solution to this matrix equation is

        Bhat = inverse(X'X)X'Y                                  (eq 5)

""""""""""""""""""""""

As you can see, the final solution is correct. As you mentioned, I had
the same concern about the -2 in equation 3, and that is the reason for
my query. I wanted to know if I could solve this in Mathematica and
arrive at the same result. Because of the way Mathematica does things,
I found that I could not set up the problem literally, and I hoped
someone had an idea on how to do it.
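
For what it's worth, one way to check eq 3 and eq 5 directly in
Mathematica, along the lines of Jurgen's flattening approach, is to take
a small concrete case. The sketch below assumes n = 3 observations and
m = 1 regressor (so X is 3 x 2 with a constant column), with all entries
kept symbolic; the names xmat, yvec, beta are arbitrary placeholders:

In[1]:= xmat = {{1, x1}, {1, x2}, {1, x3}};   (* X: 3 x 2 design matrix *)
        yvec = {{y1}, {y2}, {y3}};            (* Y: 3 x 1 observations *)
        beta = {{b0}, {b1}};                  (* B: 2 x 1 parameters *)
        ee = (Transpose[yvec - xmat.beta].(yvec - xmat.beta))[[1, 1]];

In[2]:= grad = {{D[ee, b0]}, {D[ee, b1]}};    (* derivative of E'E w.r.t. B *)
        grad == -2 Transpose[xmat].yvec + 2 Transpose[xmat].xmat.beta // Simplify

Out[2]= True

In[3]:= sol = Solve[{D[ee, b0] == 0, D[ee, b1] == 0}, {b0, b1}];
        ({b0, b1} /. First[sol]) ==
          Flatten[Inverse[Transpose[xmat].xmat].Transpose[xmat].yvec] // Simplify

Out[3]= True

So eq 3 and eq 5 both check out for this case.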

Thanks

Yves


