MathGroup Archive 2000

Re: FindMinimum and Gradient

  • To: mathgroup at smc.vnet.net
  • Subject: [mg24168] Re: [mg24099] FindMinimum and Gradient
  • From: Carl Woll <carlw at u.washington.edu>
  • Date: Wed, 28 Jun 2000 22:50:49 -0400 (EDT)
  • References: <200006270451.AAA04907@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Grischa,

Since you are exploring a two-parameter subspace, one thought is simply to Plot3D
S[a,b,n] over some region; you may then be able to eyeball where the minimum is.
The plot may also reveal why FindMinimum is having a tough time finding the
minimum; for example, the minimum may sit in a valley with a very shallow incline.
At any rate, looking at the plot should lead you to a good starting point for
FindMinimum.
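
For instance, something along these lines (just a sketch: the plot range for a and b
is arbitrary here, h and p are assumed to be defined already, and n0 stands for your
fixed n):

(* surface plot of S over part of the a,b>0 region of interest; adjust the range as needed *)
Plot3D[S[a, b, n0], {a, 0.01, 5}, {b, 0.01, 5},
  PlotPoints -> 40, AxesLabel -> {"a", "b", "S"}]

(* a contour plot of the same region can make a shallow valley easier to spot *)
ContourPlot[S[a, b, n0], {a, 0.01, 5}, {b, 0.01, 5},
  PlotPoints -> 40, Contours -> 30]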

As for your second question, about the Newton and Quasi-Newton methods, I don't
know exactly why it doesn't work, but I have a suggested workaround. Whenever I
try to use FindMinimum (or NonlinearFit, etc.) with a function that Mathematica is
unable to differentiate symbolically, rather than using the Gradient option I
prefer to teach Mathematica how to differentiate the function. In your example,
it's very simple for a human to differentiate S. Using your notation, we have

daS[a_, b_, n_] := -Sum[h[k]*D[p[k, a, b, n], a]/p[k, a, b, n], {k, -n, n}]
dbS[a_, b_, n_] := -Sum[h[k]*D[p[k, a, b, n], b]/p[k, a, b, n], {k, -n, n}]

which I suspect is what you did. However, rather than feeding these derivatives to
the Gradient option, just teach Mathematica how to use this information to compute
the derivatives of S:

(* attach the hand-computed derivatives to S itself, so that D[S[a, b, n], a]
   and D[S[a, b, n], b] evaluate instead of staying symbolic *)
Derivative[1, 0, 0][S] := daS
Derivative[0, 1, 0][S] := dbS

Now, when FindMinimum tries to compute the derivatives of your function, it knows
what to do, even when you use the Newton and Quasi-Newton methods.
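
For example (again just a sketch, assuming the definitions above are in place; as
and bs are whatever starting values you pick, and n0 is your fixed n):

(* the derivative of S now evaluates through daS rather than remaining an unevaluated Derivative *)
D[S[a, b, n0], a]

(* so the Gradient option is no longer needed, even with Method->Newton *)
FindMinimum[S[a, b, n0], {a, as}, {b, bs},
  Method -> Newton, MaxIterations -> 1000]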

Carl Woll

Grischa Stegemann wrote:

> Dear group
>
> I'm using Mathematica 3.0.2.0 on Solaris.
>
> In trying to minimize relative entropy, i.e. a positive function like
>   S[a_, b_, n_] := -Sum[h[k]*Log[p[k, a, b, n]], {k, -n, n}]
>         /; a > 0 && b > 0
>   S[a_, b_, n_] := -1 /; a <= 0 || b <= 0
> with FindMinimum I just encountered two general problems.
>
> I'm only interested in a,b>0 with a fixed n. Since Mathematica is not able to
> compute the gradient symbolically, I specified it with the Gradient option. I
> defined the gradient as 0 whenever a,b>0 does not hold. Thus, if FindMinimum
> returns -1 for the minimum, I know that the algorithm left the range of
> interest. But this never happened.
>
> Roughly I tried this with several starting points {as,bs}:
> FindMinimum[S[a,b,n0],{a,as},{b,bs},
>   Gradient->{daS[a,b,n0],dbS[a,b,n0]},
>   MaxIterations->1000];
>
> First of all, it works fine in many cases (depending on n0 and h). But in the
> other cases I always get
> FindMinimum::"fmlim": "The minimum could not be bracketed in 1000 iterations."
> I tried to play around with AccuracyGoal and PrecisionGoal, but even setting
> them to 1 doesn't make things better.
> So, what exactly does this message mean? I cannot find any explanation of it in
> the documentation. Where did FindMinimum get lost?
>
> The second problem came up because of the first one: I wanted to play with the
> Method option.
> But if I try Method->Newton or Method->QuasiNewton, Mathematica complains
> FindMinimum::"fmgs":
>     Could not symbolically find the gradient of S[a, b, n]. Try
>     giving two starting values for each variable.
> This seems inexplicable, since I'm using Gradient->{daS[a,b,n0],dbS[a,b,n0]},
> doesn't it?
>
> Any suggestions about this? Thanks a lot in advance.
> --
>                          Grischa Stegemann
>
> ----------------------------------------------------------------------
> Grischa Stegemann                       Technische Universitaet Berlin
> email: Stegemann at physik.tu-berlin.de
>
> *** We are here on Earth to do good for others.
> *** What the others are here for, I do not know.  (W.H. Auden)


