MathGroup Archive 2000


Re: FindMinimum

  • To: mathgroup at smc.vnet.net
  • Subject: [mg22702] Re: FindMinimum
  • From: Robert Knapp <rknapp at wolfram.com>
  • Date: Wed, 22 Mar 2000 00:28:08 -0500 (EST)
  • Organization: Wolfram Research, Inc.
  • References: <8aqrrd$a9p@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Johannes Ludsteck wrote:
> 
> Dear Mathgroup members
> I have some problems with the sparse documentation of the
> FindMinimum Procedures.
> 
> 1) To the best of my knowledge, the Newton and quasi-Newton
> methods use the gradient and the Hessian of the minimand, or an
> approximation to them. Mathematica, however, allows me only to
> supply an analytical gradient, but not the Hessian. Why? (I think
> that it would speed up the minimization considerably if an
> analytical Hessian could be used.)

This may be allowed in some future version.  The reason not to allow it
is that user-supplied Hessian functions are often incorrect (numerical
analysis texts regularly warn about this), and unless your function is a
complicated program, Mathematica can compute the Hessian itself by
symbolic differentiation.

Typically, up to the precision allowed by machine numbers, QuasiNewton
will be faster than the full Newton method, and QuasiNewton builds its
own Hessian approximation.  There are two reasons it is faster.  First,
it does not have to evaluate the Hessian, and second, because of the
update procedure, it can use simple computations to keep a factored form
of the Hessian approximation for solving the necessary linear system,
thereby saving O(n^3) work on each step.
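
The saving from the update can be seen in a small sketch.  The following
pure-Python BFGS is not Mathematica's implementation (the function names
and the crude backtracking line search are illustrative assumptions); it
keeps an approximation H to the *inverse* Hessian, so each step needs
only matrix-vector products, O(n^2), rather than an O(n^3) linear solve:

```python
def bfgs_minimize(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimal BFGS sketch: keep an approximation H to the INVERSE
    Hessian, so a step needs only matrix-vector products (O(n^2))
    instead of an O(n^3) linear solve."""
    n = len(x0)
    H = [[float(i == j) for j in range(n)] for i in range(n)]  # H = I
    x = list(x0)
    g = grad(x)
    for _ in range(max_iter):
        # Search direction p = -H g: no linear system is solved.
        p = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        # Crude backtracking line search (illustrative only).
        t, fx = 1.0, f(x)
        while f([x[i] + t * p[i] for i in range(n)]) > fx and t > 1e-12:
            t *= 0.5
        x_new = [x[i] + t * p[i] for i in range(n)]
        g_new = grad(x_new)
        if max(abs(gi) for gi in g_new) < tol:
            return x_new
        # Rank-two BFGS update of the inverse Hessian: O(n^2) work.
        s = [x_new[i] - x[i] for i in range(n)]
        y = [g_new[i] - g[i] for i in range(n)]
        sy = sum(si * yi for si, yi in zip(s, y))
        if sy > 1e-12:  # keep H positive definite
            Hy = [sum(H[i][j] * y[j] for j in range(n)) for i in range(n)]
            yHy = sum(yi * Hyi for yi, Hyi in zip(y, Hy))
            for i in range(n):
                for j in range(n):
                    H[i][j] += ((sy + yHy) * s[i] * s[j] / sy**2
                                - (Hy[i] * s[j] + s[i] * Hy[j]) / sy)
        x, g = x_new, g_new
    return x
```

Real implementations add a proper (Wolfe) line search and keep the
approximation in a factored form, as described above; this sketch only
shows where the O(n^3) solve disappears.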

If you need to do high-precision computations, a good strategy is to use
QuasiNewton with machine numbers to get a good starting guess, and then
to use Newton with arbitrary-precision numbers.  With the good starting
guess, the faster convergence rate of Newton's method will pay off.
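
The two-stage strategy carries over to other systems.  A Python sketch
using the standard-library decimal module (the variable names and the
example function are illustrative; in Mathematica one would instead run
FindMinimum again at higher WorkingPrecision) minimizes
f(x) = cosh(x) - x by driving f'(x) = sinh(x) - 1 to zero:

```python
from decimal import Decimal, getcontext
import math

# Stage 1: machine precision.  A few float Newton steps on
# f'(x) = sinh(x) - 1 give a rough minimizer of f(x) = cosh(x) - x.
x = 1.0
for _ in range(5):
    x -= (math.sinh(x) - 1.0) / math.cosh(x)

# Stage 2: polish with Newton's method in 50-digit arithmetic.
# Newton roughly doubles the number of correct digits per step, so
# the good starting guess makes this stage converge in a few steps.
getcontext().prec = 50
one, two = Decimal(1), Decimal(2)
xd = Decimal(x)
for _ in range(6):
    e = xd.exp()
    sinh, cosh = (e - one / e) / two, (e + one / e) / two
    xd -= (sinh - one) / cosh

# xd is now asinh(1) = ln(1 + sqrt(2)) to about 50 digits.
```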

> 
> 2) I suspect that Mathematica uses a secant method if it cannot
> compute analytical gradients. In my application it would be efficient
> to compute a numerical gradient (and a Hessian) and to use the
> gradient or quasi-Newton method, since my minimand is smooth
> and globally convex, but not differentiable analytically. Can I force
> Mathematica to use this strategy?

This will be implemented in a future version of Mathematica.  For now,
the only way to force it is to supply, via the Gradient option, a
gradient function that computes the gradient numerically.
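
For the numerical gradient itself, a central-difference formula is the
usual choice.  A minimal Python sketch of the idea (the helper name
num_grad is an assumption, not Mathematica syntax):

```python
def num_grad(f, x, h=1e-6):
    """Central-difference gradient of f at x; truncation error O(h^2)."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

# Example: gradient of x^2 + y^2 at (1, 2) is approximately (2, 4).
g = num_grad(lambda v: v[0] ** 2 + v[1] ** 2, [1.0, 2.0])
```

Each gradient evaluation costs 2n function evaluations; h trades
truncation error against roundoff, and something near the square root of
machine epsilon is a common choice.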

> 
> Thank you,
>         Johannes Ludsteck
> 
> Johannes Ludsteck
> Centre for European Economic Research (ZEW)
> Department of Labour Economics,
> Human Resources and Social Policy
> Phone (+49)(0)621/1235-157
> Fax (+49)(0)621/1235-225
> 
> P.O.Box 103443
> D-68034 Mannheim
> GERMANY
> 
> Email: ludsteck at zew.de

