MathGroup Archive 2001


Re: FindMinimum and least square minimization

  • To: mathgroup at smc.vnet.net
  • Subject: [mg30718] Re: FindMinimum and least square minimization
  • From: Mike Yukish <may106 at psu.edu>
  • Date: Sat, 8 Sep 2001 02:55:50 -0400 (EDT)
  • Organization: Penn State University, Center for Academic Computing
  • References: <9nam8i$nqd$1@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

"Dr J. Satherley" wrote:

> Dear All
> I have a complicated non-linear least square minimization problem. It is to
> fit numerical data to a function that has to be solved using NDSolve at each
> iteration. To do this I have written a function to compute the sum of
> squares between each data point and the value of the function at this point.
> I then use FindMinimum to find the values of the parameters which minimise
> the sum of squares. Mathematica does this very slowly and I want to find a
> more efficient way to do the calculation. To that end I have worked on a
> simple example to assist me with finding improvements, the main one of which
> is to supply the partial derivatives of the function with respect to each
> parameter. However, the example leaves me a little perplexed and I wonder if
> anyone out there can enlighten me on the points I raise below.

An "in general" point I noticed is that computing each partial requires a summation over all of the points in your data, so computing one partial is probably about as expensive as evaluating the sum of squares itself. With four parameters, that means five full sums over the data for each iteration (the value plus four partials).
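As a sketch of the point in Python (the thread's actual objective calls NDSolve at each evaluation; a toy linear model with two parameters stands in for it here), note that the objective and each analytic partial all loop over the same data:

```python
# Hypothetical data and model, standing in for the NDSolve-based problem.
data = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2), (3.0, 6.8)]  # (x, y) pairs

def Q(a, b):
    # Sum of squared residuals: one full pass over the data.
    return sum((a * x + b - y) ** 2 for x, y in data)

def grad_Q(a, b):
    # Analytic partials: each is *also* a full sum over the data,
    # so a gradient costs about p extra "passes" for p parameters.
    dQda = sum(2.0 * (a * x + b - y) * x for x, y in data)
    dQdb = sum(2.0 * (a * x + b - y) for x, y in data)
    return dQda, dQdb
```

With two parameters this is three sums per iteration; with the four parameters in the thread it is five.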

In computing the partials by finite differencing, you are evaluating the function at five points, which is close to the same computational load as when you supplied the partials. So not much speedup there. You would need fewer iterations, since directly computed partials are more accurate than ones calculated by finite differencing, but not many fewer.
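A minimal sketch of that cost accounting, again in Python with a hypothetical toy objective: a forward-difference gradient needs one base evaluation plus one bumped evaluation per parameter, i.e. p + 1 evaluations of Q, and carries an O(h) error the analytic partials avoid:

```python
def fd_grad(Q, params, h=1e-6):
    # Forward differences: p + 1 evaluations of Q for p parameters.
    base = Q(params)
    grad = []
    for i in range(len(params)):
        bumped = list(params)
        bumped[i] += h
        grad.append((Q(bumped) - base) / h)
    return grad

# Toy stand-in objective (the real one would call NDSolve per evaluation):
def Q_toy(p):
    return (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
```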

Also, FindMinimum uses a steepest-descent algorithm, which generally can be improved upon.

I would be tempted to do a design of experiments over the set of parameters a, b, c, d, then build an approximation to Q[a, b, c, d] and its derivatives, use that to get the approximate minimum, and use the real deal for the last little bit.
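One way that idea could look, sketched in Python with assumed details (a pretend-expensive two-parameter Q standing in for the NDSolve-based objective, a grid as the design of experiments, and a quadratic least-squares fit as the surrogate):

```python
import numpy as np

def Q(a, b):
    # Pretend-expensive objective with its minimum at (a, b) = (2, -1).
    return (a - 2.0) ** 2 + 2.0 * (b + 1.0) ** 2 + 3.0

# 1. Design of experiments: evaluate Q on a coarse grid of parameter values.
A, B = np.meshgrid(np.linspace(0, 4, 5), np.linspace(-3, 1, 5))
pts = np.column_stack([A.ravel(), B.ravel()])
vals = np.array([Q(a, b) for a, b in pts])

# 2. Fit a quadratic surrogate c0 + c1*a + c2*b + c3*a^2 + c4*a*b + c5*b^2.
a, b = pts[:, 0], pts[:, 1]
X = np.column_stack([np.ones_like(a), a, b, a**2, a * b, b**2])
c, *_ = np.linalg.lstsq(X, vals, rcond=None)

# 3. Minimize the surrogate: set its gradient to zero (a 2x2 linear solve).
H = np.array([[2 * c[3], c[4]], [c[4], 2 * c[5]]])
g = np.array([c[1], c[2]])
a_min, b_min = np.linalg.solve(H, -g)
```

The surrogate minimum (a_min, b_min) then serves as the starting point for FindMinimum on the real, expensive Q for "the last little bit."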


