MathGroup Archive 2001


Re: FindMinimum and least square minimization

  • To: mathgroup at
  • Subject: [mg30735] Re: FindMinimum and least square minimization
  • From: "Alan Mason" <amason2 at>
  • Date: Sat, 8 Sep 2001 02:56:35 -0400 (EDT)
  • References: <9nam8i$nqd$>
  • Sender: owner-wri-mathgroup at

Dear John,
In my experience, Mathematica is not the tool of choice for nonlinear
optimizations like the one you describe.  Although FindMinimum has been
revamped for Version 4, it still is not very smart.
I recommend you look at the book "Numerical Recipes in C" by Press et al.
If you have differentiability (and it sounds like you do, though there is
ambiguity related to your choice of sampling points), you can use the Powell
or conjugate gradient methods, described in that book.
Be sure to check that your results are stable relative to the number and
choice of sampling points.
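To make the differentiable case concrete, here is a minimal Fletcher-Reeves conjugate-gradient sketch in Python (this is not the Numerical Recipes code; the toy model, the data, and the line-search constants are all made up for illustration):

```python
# Fletcher-Reeves nonlinear conjugate gradient on a tiny least-squares fit.
# The model y = a x + b, the data, and the constants are illustrative only.

xs = [i / 10.0 for i in range(-10, 11)]
ys = [2.0 * x + 1.0 for x in xs]              # data from y = 2 x + 1

def S(p):
    # sum of squared residuals
    a, b = p
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

def grad(p):
    # analytic gradient of S
    a, b = p
    ga = sum(-2.0 * (y - (a * x + b)) * x for x, y in zip(xs, ys))
    gb = sum(-2.0 * (y - (a * x + b)) for x, y in zip(xs, ys))
    return [ga, gb]

def backtrack(p, d, g, alpha=1.0, shrink=0.5, c=1e-4):
    # Armijo backtracking line search along descent direction d
    base = S(p)
    slope = sum(gi * di for gi, di in zip(g, d))
    while S([pi + alpha * di for pi, di in zip(p, d)]) > base + c * alpha * slope:
        alpha *= shrink
    return alpha

def cg_minimize(p, iters=100):
    g = grad(p)
    d = [-gi for gi in g]
    for _ in range(iters):
        gg = sum(gi * gi for gi in g)
        if gg < 1e-18:
            break
        if sum(gi * di for gi, di in zip(g, d)) >= 0:
            d = [-gi for gi in g]             # safeguard: restart downhill
        alpha = backtrack(p, d, g)
        p = [pi + alpha * di for pi, di in zip(p, d)]
        g_new = grad(p)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves beta
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return p

a, b = cg_minimize([0.0, 0.0])                # should recover a = 2, b = 1
```

The point is the structure these methods exploit: a cheap objective, an analytic gradient, and a line search; the real NR routines add the bookkeeping to make this robust.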
If your problem is only continuous (this is typical when you want to
minimize a uniform-norm estimate, and this might be appropriate in your
situation, as it avoids the choice of a discrete set of sampling points), the
downhill simplex algorithm -- sometimes affectionately called the
"amoeba" -- is very effective, especially when combined with simulated
annealing.

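Purely to illustrate the idea (this is a compact variant, not the NR amoeba; the coefficients 1, 2, 0.5, 0.5 are the textbook defaults and the test function is made up):

```python
# Downhill simplex ("amoeba") in pure Python: reflection, expansion,
# inside contraction, and shrink, on a simplex of n+1 points.

def nelder_mead(f, start, step=0.5, iters=200):
    n = len(start)
    # initial simplex: the start point plus one perturbed point per axis
    simplex = [list(start)]
    for i in range(n):
        p = list(start)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)                   # best first, worst last
        best, second, worst = simplex[0], simplex[-2], simplex[-1]
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        xr = [cen[i] + (cen[i] - worst[i]) for i in range(n)]   # reflect
        fr = f(xr)
        if fr < f(best):
            xe = [cen[i] + 2.0 * (cen[i] - worst[i]) for i in range(n)]
            simplex[-1] = xe if f(xe) < fr else xr              # expand
        elif fr < f(second):
            simplex[-1] = xr                                    # keep reflection
        else:
            xc = [cen[i] + 0.5 * (worst[i] - cen[i]) for i in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc                                # contract
            else:
                # shrink every vertex toward the best one
                simplex = [best] + [
                    [best[i] + 0.5 * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

# demo on a smooth bowl with minimum at (1, 2)
best = nelder_mead(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2, [0.0, 0.0])
```

Note that only function values are used -- no derivatives -- which is exactly why the amoeba suits merely-continuous objectives.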
You can find the code for all this and more in the book, and also at the
book's web site.  Buy the book, then download the code from that site.  You can build a
dynamic link library from the code if you like (I did), but the code needs
considerable editing. For example, the function prototypes are unduly
restrictive, forcing a wholesale abuse of global variables; if you have a
C++ compiler, build the code in C++ and overload functions as you see fit.
However, the algorithms themselves are excellent.

Using those methods I've been able to find a sum of nine Gaussians that
approximates x/(x+1) uniformly on [0, Infinity) to better than 0.8%.
(This "Gaussian geminals" problem is important in computational chemistry.)
In this particular case the problem is only continuous, not differentiable,
and the amoeba works beautifully.
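The objective in such a fit is a uniform-norm (max-error) estimate, which is what the amoeba would minimize. A Python sketch of that objective (the Gaussian form c_k * Exp[-w_k x^2], the trial parameters, and the sample grid are assumptions for illustration; a genuine fit on the half-line needs care with the tail, e.g. a change of variables):

```python
import math

def target(x):
    # the function being approximated
    return x / (x + 1.0)

def gaussian_sum(x, params):
    # params: flat list of (coefficient, width) pairs c1, w1, c2, w2, ...
    return sum(c * math.exp(-w * x * x)
               for c, w in zip(params[0::2], params[1::2]))

def uniform_error(params, grid):
    # the uniform-norm objective: worst absolute error over the grid
    return max(abs(target(x) - gaussian_sum(x, params)) for x in grid)

grid = [0.1 * i for i in range(201)]              # samples on [0, 20]
err = uniform_error([0.5, 0.1, 0.3, 1.0], grid)   # arbitrary trial parameters
```

Minimizing `uniform_error` over the 2k parameters is precisely the kind of continuous, non-differentiable problem the amoeba handles well.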

I'm not knocking Mathematica, which is one of my favorite tools -- but you
can't live on Mathematica alone.  You might consider optimizing your
approximation separately, then passing the optimized function to NDSolve.  I
really think Numerical Recipes in C (or Fortran) is the way to go if you do
a lot of nonlinear multivariate optimizations.


"Dr J. Satherley" <js1 at> wrote in message
news:9nam8i$nqd$1 at
> Dear All
> I have a complicated non-linear least-squares minimization problem. It is
> necessary to fit numerical data to a function that has to be solved using
> NDSolve at each iteration. To do this I have written a function to compute
> the sum of squares between each data point and the value of the function at
> that point. I then use FindMinimum to find the values of the parameters which
> minimise the sum of squares. Mathematica does this very slowly and I want to
> find a more efficient way to do the calculation. To that end I have worked on
> a simple example to assist me with finding improvements, the main one of
> which is to supply the partial derivatives of the function with respect to
> each parameter. However, the example leaves me a little perplexed and I
> wonder if anyone out there can enlighten me on the points I raise below.
> Here's the example:
> First the initial equation:
> y[x_] := 2.31  x^3 + 1.2  x^2 + 3.42  Sin[x] + 131.56
> Generate some data using this equation:
> xydata = Table[{u, y[u]}, {u, -10, 10, 0.1}]
> Write the function to calculate the sum of squares. I'm simply trying to
> find the parameters in the above equation:
> Q[a_?NumberQ, b_?NumberQ, c_?NumberQ, d_?NumberQ] := Module[{output, fn, temp},
>       fn = #[[2]] - (a #[[1]]^3 + b #[[1]]^2 + c Sin[#[[1]]] + d) &;
>       temp = fn /@ xydata;
>       output = temp.temp;
>       total = totala + totalb + totalc + totald;
>       Print[output, " ", ++n, " ", a, " ", b, " ", c, " ", d, " ", totala,
>         " ", totalb, " ", totalc, " ", totald, " ", total];
>       (*this prints out what is happening at each iteration*)
>       output
>       ];
> Define the partial derivatives of Q wrt each parameter:
> dQda[a_, b_, c_, d_] := totala = Plus @@ (-2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]] + d))*#1[[1]]^3 & ) /@ xydata;
> dQdb[a_, b_, c_, d_] := totalb = Plus @@ (-2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]] + d))*#1[[1]]^2 & ) /@ xydata;
> dQdc[a_, b_, c_, d_] := totalc = Plus @@ (-2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]] + d))*Sin[#1[[1]]] & ) /@ xydata;
> dQdd[a_, b_, c_, d_] := totald = Plus @@ (-2*(#1[[2]] - (a*#1[[1]]^3 +
> b*#1[[1]]^2 + c*Sin[#1[[1]]] + d)) & ) /@ xydata;
> Derivative[1, 0, 0, 0][Q] := dQda;
> Derivative[0, 1, 0, 0][Q] := dQdb;
> Derivative[0, 0, 1, 0][Q] := dQdc;
> Derivative[0, 0, 0, 1][Q] := dQdd;
> Run FindMinimum, bracketed by MemoryInUse[] to monitor memory:
> MemoryInUse[]
> FindMinimum[Q[a, b, c, d], {a, a0}, {b, b0}, {c, c0}, {d, d0}]
> MemoryInUse[]
> These are the points I've noted when running these functions:
> 1. I was expecting the convergence to be rather rapid compared to giving 2
> starting values to FindMinimum. However, it is only marginally quicker -
> maybe 150 iterations instead of 220. Is this to be expected? Or have I not
> formulated the problem correctly?
> 2. I was expecting the sum of the 4 partial derivative functions to be
> zero at the convergence point. However, it was not as close as I would have
> thought - for example, only around 0.007.
> 3. Reaching the correct solution is more sensitive to the choice of starting
> values when using FindMinimum together with the partial derivatives. Using
> FindMinimum with 2 starting values does take more iterations but reaches a
> solution with a smaller sum of squares (4 orders of magnitude less).
> 4. I have noticed that Mathematica uses a vast amount of memory when I've
> performed my actual problem. It uses up all the available RAM (256 MB on my
> system) and then runs off the hard disk using the swap file. That is why I
> included the MemoryInUse before and after running FindMinimum to monitor
> memory use. Even with my simple example memory is not returned for use. I'm
> using Mathematica 4.0.1 on Windows 98. Is there anything I can do to fix
> this problem? On my real problem, after running the function a couple of
> times it slows dramatically once the hard disk is accessed.
> I'd be grateful for any comments or suggestions related to these
> observations, particularly those that may reduce the number of iterations or
> fix the problem with the memory use.
> Cheers
> John Satherley
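One cheap way to gain confidence in hand-coded gradients like those in the quoted example is a finite-difference check. A Python sketch using the post's model and data (the test point is arbitrary, chosen only for illustration):

```python
import math

# Model and data from the quoted example: y = a x^3 + b x^2 + c Sin[x] + d
xs = [-10.0 + 0.1 * i for i in range(201)]
ys = [2.31 * x**3 + 1.2 * x**2 + 3.42 * math.sin(x) + 131.56 for x in xs]

def Q(a, b, c, d):
    # sum of squared residuals, as in the quoted Q
    return sum((y - (a * x**3 + b * x**2 + c * math.sin(x) + d)) ** 2
               for x, y in zip(xs, ys))

def dQda(a, b, c, d):
    # analytic partial dQ/da; the chain rule contributes the factor -2
    return sum(-2.0 * (y - (a * x**3 + b * x**2 + c * math.sin(x) + d)) * x**3
               for x, y in zip(xs, ys))

# central-difference check at an arbitrary test point
a0, b0, c0, d0 = 2.0, 1.0, 3.0, 130.0
h = 1e-6
fd = (Q(a0 + h, b0, c0, d0) - Q(a0 - h, b0, c0, d0)) / (2.0 * h)
analytic = dQda(a0, b0, c0, d0)
```

If the two numbers disagree in sign or magnitude, the supplied derivative is wrong, and FindMinimum will be fed misleading search directions.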
