MathGroup Archive 2001

Re: FindMinimum and least square minimization

  • To: mathgroup at smc.vnet.net
  • Subject: [mg30723] Re: FindMinimum and least square minimization
  • From: "Carl K. Woll" <carlw at u.washington.edu>
  • Date: Sat, 8 Sep 2001 02:56:04 -0400 (EDT)
  • Organization: University of Washington
  • References: <9nam8i$nqd$1@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Hi John,

This question has come up before, and the solution I recommend is to use
NonlinearFit while teaching Mathematica to take derivatives of your
function, as you did in your example.

Note first, however, that your sample problem is not a nonlinear fitting
question but a linear one. When I tried to apply NonlinearFit to this problem
with user-defined derivative functions, Mathematica would not return a result.
I have therefore modified your example by including a nonlinear term. So, we
have the data:

In[13]:=
y[x_]:=2.31 x^3+1.2x^2+3.42 Sin[1. x]+131.56;
xydata=Table[{u,y[u]},{u,-10,10,0.1}];

Next, we define the model:

In[15]:=
Clear[model]
model[a_,b_,c_,d_,e_,x_?NumericQ]:=a x^3+b x^2+c Sin[d x]+e

Note the use of ?NumericQ above. If your model is indeed defined by an NDSolve
routine, it may be necessary to include this check for each of the arguments,
so that Mathematica never evaluates the model with nonnumeric quantities
(running NDSolve with symbolic quantities would produce errors).
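
For example (a toy model, not your actual system), a parameter-dependent
NDSolve call wrapped this way simply remains unevaluated until it is given a
number:

(* toy illustration only: a "model" defined through NDSolve for
   w'[t] == -k w[t], w[0] == 1, evaluated at t = 1 *)
Clear[toy, w]
toy[k_?NumericQ] :=
  First[w[1] /. NDSolve[{w'[t] == -k w[t], w[0] == 1}, w, {t, 0, 1}]]

toy[k]     (* left unevaluated, so NDSolve never sees the symbol k *)
toy[0.5]   (* returns a number, roughly Exp[-0.5] *)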

Next, define the derivatives of your model with respect to each parameter:

In[31]:=
Clear[Derivative]
(* derivatives with respect to a and b, defined with argument patterns *)
Derivative[1, 0, 0, 0, 0, 0][model][a_,b_,c_,d_,e_,x_] = x^3 ;
Derivative[0, 1, 0, 0, 0, 0][model][a_,b_,c_,d_,e_,x_] = x^2 ;
(* derivatives with respect to c, d, and e, defined as pure functions of the six arguments *)
Derivative[0, 0, 1, 0, 0, 0][model] = Sin[#4*#6] & ;
Derivative[0, 0, 0, 1, 0, 0][model] = #3*#6*Cos[#4*#6] & ;
Derivative[0, 0, 0, 0, 1, 0][model] = 1 & ;
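
If you want to double-check hand-coded derivatives like these, one quick way
is to compare them against a finite difference at a test point; for example,
for the derivative with respect to d:

(* compare the hand-coded derivative in d with a finite-difference
   approximation at an arbitrary test point *)
With[{h = 10.^-6, a = 1., b = 1., c = 2., d = .8, e = 100., x = 2.5},
  {Derivative[0, 0, 0, 1, 0, 0][model][a, b, c, d, e, x],
   (model[a, b, c, d + h, e, x] - model[a, b, c, d, e, x])/h}]

The two numbers should agree to several digits.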

I have included two ways to define your derivatives. If your model is the
output of an NDSolve operation, you can obtain these derivatives by suitably
differentiating your system of differential equations and then running
NDSolve on these differentiated equations. If you have trouble defining the
derivatives of your ODE system, give us a simple example of the type of ODE
you are working with.
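
To sketch what I mean with a toy system (not your actual ODE): the derivative
of the solution with respect to a parameter satisfies its own differential
equation, obtained by differentiating the original one, and the two can be
solved together in a single NDSolve call:

(* toy sketch: solve the ODE together with its sensitivity equation.
   Here wa[t] stands for D[w[t], a]; its equation and initial condition
   come from differentiating w'[t] == -a w[t], w[0] == 1 with respect to a. *)
Clear[sol, w, wa]
sol[a_?NumericQ] :=
  NDSolve[{w'[t] == -a w[t], w[0] == 1,
           wa'[t] == -w[t] - a wa[t], wa[0] == 0},
          {w, wa}, {t, 0, 2}]

(* the solution and its derivative with respect to a, at t = 1 and a = 0.7 *)
{w[1], wa[1]} /. First[sol[0.7]]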

Finally, load the package

In[11]:=
<< "Statistics`NonlinearFit`"

and run NonlinearFit on your data:

In[37]:=
NonlinearFit[xydata,model[a,b,c,d,e,x],x,{{a,1},{b,1},{c,2},{d,.8},{e,100}}]

Out[37]=
model[2.31, 1.2, 3.42, 1., 131.56, x]

The time and memory used by this method on this problem were minimal.

One of the nice things about getting NonlinearFit to work with this type of
problem is that you can also use NonlinearRegress to obtain a wealth of
statistical information about the fit.
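
For instance, with the same data, model, and starting values, something along
these lines should also return parameter confidence intervals, an ANOVA table,
and other diagnostics:

(* NonlinearRegress has the same calling syntax as NonlinearFit but
   reports fit statistics in addition to the best-fit parameters *)
NonlinearRegress[xydata, model[a, b, c, d, e, x], x,
  {{a, 1}, {b, 1}, {c, 2}, {d, .8}, {e, 100}}]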

Feel free to respond if you have any trouble putting the above into
practice.

Carl Woll
Physics Dept
U of Washington

"Dr J. Satherley" <js1 at liverpool.ac.uk> wrote in message
news:9nam8i$nqd$1 at smc.vnet.net...
>
> Dear All
> I have a complicated non-linear least square minimization problem. It is to
> fit numerical data to a function that has to be solved using NDSolve at each
> iteration. To do this I have written a function to compute the sum of
> squares between each data point and the value of the function at this point.
> I then use FindMinimum to find the values of the parameters which minimise
> the sum of squares. Mathematica does this very slowly and I want to find a
> more efficient way to do the calculation. To that end I have worked on a
> simple example to assist me with finding improvements, the main one of which
> is to supply the partial derivatives of the function with respect to each
> parameter. However, the example leaves me a little perplexed and I wonder if
> anyone out there can enlighten me on the points I raise below.
>
> Here's the example:
>
> First the initial equation:
> y[x_] := 2.31  x^3 + 1.2  x^2 + 3.42  Sin[x] + 131.56
>
> Generate some data using this equation:
> xydata = Table[{u, y[u]}, {u, -10, 10, 0.1}]
>
> Write the function to calculate the sum of squares. I'm simply trying to
> find the parameters in the above equation:
> Q[a_?NumberQ,b_?NumberQ,c_?NumberQ,d_?NumberQ]:=Module[{output,fn,temp},
>       fn=#[[2]]-(a #[[1]]^3+b #[[1]]^2+c Sin[#[[1]]]+d )&;
>       temp=fn/@xydata;
>       output=temp.temp;
>       total=totala+totalb+totalc+totald;
>       Print[output," ",++n," ",a," ",b," ",c," ",d," ",totala," ",totalb,
>         " ",totalc," ",totald," ",total];
>       (* this prints out what is happening at each iteration *)
>       output
>       ];
>
> Define the partial derivatives of Q wrt each parameter:
> dQda[a_, b_, c_, d_] := totala = Plus @@ (2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]]+d))*#1[[1]]^3 & ) /@ xydata;
> dQdb[a_, b_, c_, d_] := totalb = Plus @@ (2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]]+d))*#1[[1]]^2 & ) /@ xydata;
> dQdc[a_, b_, c_, d_] := totalc = Plus @@ (2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]]+d))*Sin[#1[[1]]] & ) /@ xydata;
> dQdd[a_, b_, c_, d_] := totald = Plus @@ (2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]]+d)) & ) /@ xydata;
> Derivative[1, 0, 0, 0][Q] := dQda;
> Derivative[0, 1, 0, 0][Q] := dQdb;
> Derivative[0, 0, 1, 0][Q] := dQdc;
> Derivative[0, 0, 0, 1][Q] := dQdd;
>
> Run FindMinimum:
> MemoryInUse[]
>
> n=0;result2=FindMinimum[Q[a,b,c,d],{a,2},{b,2},{c,2},{d,2},MaxIterations->50]
> MemoryInUse[]
>
> These are the points I've noted when running these functions:
> 1. I was expecting the convergence to be rather rapid compared to giving 2
> starting values to FindMinimum. However, it is only marginally quicker -
> maybe 150 iterations instead of 220. Is this to be expected? Or have I not
> formulated the problem correctly?
> 2. I was expecting the sum of the 4 partial derivative functions to approach
> zero at the convergence point. However, it was not as close as I would have
> thought - for example only around 0.007.
> 3. Reaching the correct solution is more sensitive to the choice of starting
> values when using FindMinimum together with the partial derivatives. Using
> FindMinimum with 2 starting values does take more iterations but reaches a
> solution with a smaller sum of squares (4 orders of magnitude less).
> 4. I have noticed that Mathematica uses a vast amount of memory when I've
> performed my actual problem. It uses up all the available RAM (256MB on my
> system) and then runs off the hard disk using the swap file. That is why I've
> included the MemoryInUse before and after running FindMinimum to monitor the
> memory use. Even with my simple example memory is not returned for use. I'm
> using Mathematica 4.0.1 on Windows 98. Is there anything I can do to fix
> this problem? On my real problem, after running the function a couple of
> times it slows dramatically once the hard disk is accessed.
>
> I'd be grateful for any comments or suggestions related to these
> observations, particularly those that may reduce the number of iterations and
> help with the memory-use problem.
>
> Cheers
> John Satherley
>



