Re: Constraint evaluation in NMinimize
- To: mathgroup at smc.vnet.net
- Subject: [mg122715] Re: Constraint evaluation in NMinimize
- From: Ray Koopman <koopman at sfu.ca>
- Date: Mon, 7 Nov 2011 05:53:17 -0500 (EST)
----- Daniel Lichtblau <danl at wolfram.com> wrote:
> On 11/04/2011 06:02 AM, Ray Koopman wrote:
>> I want to minimize f[x] with respect to x, subject to g[x].
>> Both f and g depend in part on h[x], so to avoid calculating h[x]
>> twice I use an auxiliary variable:
>>
>> NMinimize[{t = h[x]; f[x,t], g[x,t]}, {x}].
>>
>> That appears to work, but it depends on the constraint never being
>> evaluated without first evaluating the minimand, and I can't find
>> anything in the documentation that says that that will always be
>> the case. Can anyone help?
>>
>> (I asked a similar question several years ago, at which time DrBob
>> pointed me to sec 2.6.4 in the book, but it's not obvious to me if
>> the standard evaluation procedure necessarily applies here.)
>>
>
> Could memoize h[x] for numeric x. That way you don't need to second
> guess the evaluation internals of NMinimize.
>
> h[x_?NumericQ] := h[x] = ...
>
> I have some familiarity with those internals, and offhand I've no idea
> whether your method above is guaranteed to work.
>
> For purposes of speed it might also be useful to memoize f and g.
> Some methods in NMinimize, such as DifferentialEvolution, may
> recompute at the same points quite often.
>
> An alternative might be:
>
> objandconstraint[x_?NumericQ] := Module[{t}, t=h[x]; {f[x,t],g[x,t]}]
>
> (This could also be memoized.)
> Then do:
>
> NMinimize[objandconstraint[x], {x}]
>
> Caveat: I have not tried this (short on time at the moment) and make
> no guarantee as to whether it will work. It might be DOA.
>
> Daniel Lichtblau
> Wolfram Research
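Spelled out, the memoization suggestion amounts to something like the
following. The bodies of h, f, and g here are just placeholders standing
in for the real definitions, which never appear in this thread:

h[x_?NumericQ] := h[x] = Sin[x] + x^2                   (* memoized for numeric x *)
f[x_?NumericQ, t_?NumericQ] := f[x, t] = (x - 1)^2 + t  (* memoized objective *)
g[x_?NumericQ, t_?NumericQ] := g[x, t] = t - x          (* memoized constraint value *)

NMinimize[{f[x, h[x]], g[x, h[x]] >= 0}, {x}]  (* inequality chosen arbitrarily for the sketch *)

Since h only ever runs once per distinct numeric x, writing h[x] in both
the objective and the constraint no longer costs anything extra.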
I couldn't get a combined objective-and-constraint function to work.
I had to use two separate functions, which made it impossible to
avoid computing the same thing twice. Also, the constraint is of the
form g[x] == y, and the only way to get the constraint function to
work was to have it return only g[x] and let NMinimize compare that
to y. That approach ended up taking about 22 seconds, which memoizing
reduced to 18 seconds, for a problem that originally took about 6.5
seconds with the objective and constraint computations done inline.
I guess I was trying to fix something that wasn't broke.
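For concreteness, the version that finally did work has roughly this
shape, reusing the placeholder h, f, and g from the sketch above; the
value of y is made up, since the real target isn't shown here:

obj[x_?NumericQ] := f[x, h[x]]   (* objective *)
con[x_?NumericQ] := g[x, h[x]]   (* returns only the constraint value; NMinimize compares it to y *)

y = 0.5;   (* made-up stand-in for the actual target value *)
NMinimize[{obj[x], con[x] == y}, {x}]

Each of obj and con calls h[x] on its own, which is the duplicated work
mentioned above; memoizing h is what took the timing from about 22
seconds down to about 18.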
An interesting result -- but not surprising, because I've run into it
before in many different situations -- was a slowdown with repeated
calls. Here is a typical sequence of times for the first 20 calls:
{6.53, 6.77, 7.13, 7.17, 7.17, 7.24, 7.23, 7.26, 7.34, 7.29,
7.35, 7.31, 7.25, 7.26, 7.32, 7.30, 7.29, 7.29, 7.33, 7.29}
It seems to asymptote, or at least increase much more slowly,
after 10 calls or so.
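A sequence like that is easy enough to collect with a loop along these
lines; the NMinimize call here is only a toy stand-in for the real one:

run[] := NMinimize[{(x - 1)^2 + Sin[3 x], -2 <= x <= 2}, {x}];  (* toy problem *)
timings = Table[First @ AbsoluteTiming[run[]], {20}]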