MathGroup Archive 1995


Re: Optimizing FindMinimum for speed using intermediate calculations.

  • To: mathgroup at christensen.cybernetics.net
  • Subject: [mg1238] Re: Optimizing FindMinimum for speed using intermediate calculations.
  • From: rubin at msu.edu (Paul A. Rubin)
  • Date: Fri, 26 May 1995 05:10:03 -0400
  • Organization: Michigan State University

In article <3pjl4l$7h0 at news0.cybernetics.net>,
   Bill Harbaugh <harbaugh at students.wisc.edu> wrote:
->Hello
->
->I am trying to maximize a likelihood function using FindMinimum.  This 
->involves finding the parameters (x, y in my simplified example) that 
->maximize (minimize, in my example) the sum of the values of a function 
->applied to some data (ddist, in my example).  
->
->My actual problem is very long and my question is about how to optimize 
->my mma code for speed.
->
->This function involves some intermediate results (x1, y1 in my example) 
->which vary with the x, y parameters but are constant across the d's.  
->(In my actual problem, these intermediate results are found using NSolve, 
->so I am specifying the functions using := rather than =.)  I am trying to 
->calculate these only once for each iteration on x, y, and then apply the 
->results to each of the d's in the list ddist.  The code below shows how I 
->do this without trying to optimize.
->
->
->ddist={1,2,3,4,5,6,7,8,9,10}
->x1:=x+1
->y1:=y+2
->l[d_]:=(x1*d)^2+(y1*d)^2
->
->f=Apply[Plus,
->    Map[l,ddist]
->  ]
->
->FindMinimum[f,{x,{-1,1}},{y,{-1,1}}]
->
->This works, but it seems the following should be faster.
->
->Clear[l];
->Clear[f];
->l[d_]:=(x1*d)^2+(y1*d)^2
->
->f={x1=x+1,y1=y+2,Apply[Plus,
->    Map[l,ddist]
->  ]}
->
->FindMinimum[f[[3]],{x,{-1,1}},{y,{-1,1}}]
->
->This also works, but timing results are inconclusive. I would appreciate 
->suggestions on any other approaches that might speed this up.  I am also 
->asking mma support for help, and will forward their reply.
->
->Bill Harbaugh
->harbaugh at students.wisc.edu
->
I don't see why the second version should be faster.  In both cases, you 
compute an expression (f in the first case, f[[3]] in the second) that is 
quadratic in x and y, and feed that to FindMinimum.  The only difference 
between the two calls to FindMinimum is that the second forces FindMinimum 
to extract the third entry of a list (f), which would slow it down a tad.  
As to the steps leading up to FindMinimum, I don't see any meaningful 
difference.
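
If you want to convince yourself of that, compare the two objectives 
directly.  A quick check using your example's definitions (f1 and f2 are 
just my labels):

  ddist = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
  f1 = Apply[Plus, Map[((x + 1) #)^2 + ((y + 2) #)^2 &, ddist]];
  x1 = x + 1; y1 = y + 2;
  f2 = {x1, y1, Apply[Plus, Map[(x1 #)^2 + (y1 #)^2 &, ddist]]};
  Expand[f1 - f2[[3]]]      (* returns 0: identical objectives *)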

I suspect, though, that your example above does not illustrate your actual 
problem.  In both versions of the example, you end up with an objective 
function directly computable from x and y.  It sounds as though, in your 
actual problem, there is an intermediate step (getting x1, y1 from x and y) 
that cannot be written in closed form.  That being the case, the question 
is:  where are your CPU cycles going?  Is the time-consuming part getting 
from {x, y} to {x1, y1}, or getting from {x1, y1} to f (and its 
derivatives)?  Or is it the sheer number of iterations you're doing?
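
A crude way to answer that is to time the two stages separately.  Here is 
a sketch using your example's functions as stand-ins (step1 and obj are 
names I made up; in your real problem, step1 would wrap the NSolve call):

  ddist = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
  step1[x_?NumberQ, y_?NumberQ] := {x + 1., y + 2.}   (* stand-in for the NSolve step *)
  obj[x1_?NumberQ, y1_?NumberQ] :=
    Apply[Plus, Map[(x1 #)^2 + (y1 #)^2 &, ddist]]

  Timing[Do[step1[0.3, -0.7], {1000}]]     (* cost of {x, y} -> {x1, y1} *)
  Timing[Do[obj[1.3, 1.3], {1000}]]        (* cost of {x1, y1} -> f      *)

Whichever loop dominates tells you where to spend your effort; if both 
are cheap, the iteration count itself is the likely culprit.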

Off the cuff, the only suggestion I can give you for speeding up your 
problem (and it's a shot in the dark) would be to start with a subset of 
the data, solve using just the subset (which should go faster), then add 
back some of the deleted observations and solve again, using the previous 
solution as the starting values for FindMinimum.  That way, the early 
iterations (getting you into the general vicinity of the optimum) are done 
at lower computational cost.  You could also start with a fairly sloppy 
precision goal and/or iteration limit, and get pickier as you add points 
(and, hopefully, get closer to the optimum).
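
In code, the subset idea might look something like the following sketch 
(subsets, sol, and min are names I invented; the objective is the one from 
your example):

  ddist = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
  obj[data_] := Apply[Plus, Map[((x + 1) #)^2 + ((y + 2) #)^2 &, data]]

  subsets = {Take[ddist, 3], Take[ddist, 6], ddist};
  sol = {x -> -1., y -> -1.};      (* rough initial guess *)
  Do[
    {min, sol} = FindMinimum[obj[subsets[[k]]],
        {x, x /. sol}, {y, y /. sol}],
    {k, Length[subsets]}]
  {min, sol}

You could also tack a looser AccuracyGoal or MaxIterations option onto the 
early calls and tighten it on the last one.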

Unfortunately, FindMinimum does not give you a list of intermediate results 
for x and y.  Steepest descent has been known to zigzag its way toward an 
optimum.  If you knew that was happening, you could switch to a fancier 
optimization method (something based on conjugate gradients or quasi-Newton 
methods) and perhaps get faster convergence.  Then again, you'd have to 
program it yourself.
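
You can, however, record the intermediate points yourself: define the 
objective so it evaluates only for numeric arguments and logs each point 
it sees.  A sketch along those lines (trace and obj are my names):

  ddist = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
  trace = {};
  obj[xn_?NumberQ, yn_?NumberQ] :=
    (AppendTo[trace, {xn, yn}];
     Apply[Plus, Map[((xn + 1) #)^2 + ((yn + 2) #)^2 &, ddist]])

  FindMinimum[obj[x, y], {x, {-1, 1}}, {y, {-1, 1}}]
  trace      (* every {x, y} at which the objective was evaluated *)

Because obj is then a black box, FindMinimum needs two starting values per 
variable so it can estimate derivatives by differencing.  If the trace 
zigzags, that would be your cue to try the fancier methods.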

Paul


**************************************************************************
* Paul A. Rubin                                  Phone: (517) 432-3509   *
* Department of Management                       Fax:   (517) 432-1111   *
* Eli Broad Graduate School of Management        Net:   RUBIN at MSU.EDU    *
* Michigan State University                                              *
* East Lansing, MI  48824-1122  (USA)                                    *
**************************************************************************
Mathematicians are like Frenchmen:  whenever you say something to them,
they translate it into their own language, and at once it is something
entirely different.                                    J. W. v. GOETHE

