Re: Findminimum too slow for iterative reweighted least squares
- To: mathgroup at smc.vnet.net
- Subject: [mg123532] Re: Findminimum too slow for iterative reweighted least squares
- From: Oliver Ruebenkoenig <ruebenko at wolfram.com>
- Date: Sat, 10 Dec 2011 07:29:38 -0500 (EST)
- Delivered-to: l-mathgroup@mail-archive0.wolfram.com
- References: <201112091055.FAA03827@smc.vnet.net>
On Fri, 9 Dec 2011, Alberto Maydeu-Olivares wrote:
> Hi, I'm trying to use FindMinimum to minimize a weighted least squares
> function where the weight matrix needs to be updated at each
> iteration. The weight matrix involves the inverse of a symbolic
> matrix. This should be fast if FindMinimum first evaluates the
> symbolic matrix and then takes the inverse. Given how long it takes,
> it is obviously not doing that. I wonder if anyone can suggest a way
> to force FindMinimum to do that. I tried Evaluate, but it does not
> work. I attach a toy example so that you can try. Finding the minimum
> of this toy example takes a full second; a realistic-size job takes
> minutes when it should not take more than a couple of seconds.
>
> Thanks a lot for your help.
>
>
> model = {l1^2 + ps, l2*l1, l2^2 + ps, l3*l2, l3*l1, l3^2 + ps};
> theta = {l1, l2, l3, ps};
> j = Outer[D, model, theta];
> data = {2., .42, 3., .56, .48, 1.};
> startval = Transpose[{theta, {.5, .5, .5, .5}}];
> e = data - model;
> mat = {{2 (l1^2 + ps)^2, 2 l1 l2 (l1^2 + ps), 2 l1^2 l2^2,
> 2 l2 l3 (l1^2 + ps), 2 l1 l2^2 l3,
> 2 l2^2 l3^2}, {2 l1 l2 (l1^2 + ps),
> ps (l2^2 + ps) + l1^2 (2 l2^2 + ps), 2 l1 l2 (l2^2 + ps),
> l1 l3 (l1^2 + l2^2 + ps), l2 l3 (l1^2 + l2^2 + ps),
> 2 l1 l2 l3^2}, {2 l1^2 l2^2, 2 l1 l2 (l2^2 + ps), 2 (l2^2 + ps)^2,
> 2 l1^2 l2 l3, 2 l1 l3 (l2^2 + ps),
> 2 l1^2 l3^2}, {2 l2 l3 (l1^2 + ps), l1 l3 (l1^2 + l2^2 + ps),
> 2 l1^2 l2 l3, l2^2 l3^2 + (l1^2 + ps) (l3^2 + ps),
> l1 l2 (2 l3^2 + ps), 2 l2 l3 (l3^2 + ps)}, {2 l1 l2^2 l3,
> l2 l3 (l1^2 + l2^2 + ps), 2 l1 l3 (l2^2 + ps),
> l1 l2 (2 l3^2 + ps), l1^2 l3^2 + (l2^2 + ps) (l3^2 + ps),
> 2 l1 l3 (l3^2 + ps)}, {2 l2^2 l3^2, 2 l1 l2 l3^2, 2 l1^2 l3^2,
> 2 l2 l3 (l3^2 + ps), 2 l1 l3 (l3^2 + ps), 2 (l3^2 + ps)^2}};
>
> (*brute force approach to benchmark*)
> w = Inverse[mat];
> irls = FindMinimum[e.w.e, Evaluate[Sequence @@ startval],
>     Gradient -> -2 e.w.j]; // Timing
>
> (*this should work, in fact, it takes almost twice as long*)
> w := Inverse[mat];
> irls = FindMinimum[Evaluate[e.w.e], Evaluate[Sequence @@ startval],
>     Gradient -> Evaluate[-2 e.w.j]]; // Timing
>
>
Alberto,
try
ewe = e.w.e;
ewj = -2 e.w.j;
irls = FindMinimum[ewe, Evaluate[Sequence @@ startval],
Gradient -> ewj]; // AbsoluteTiming
which is somewhat faster. Using Method -> "Newton" gets rid of the
messages but takes longer; perhaps supplying a Hessian would help. Or
use, say, Method -> "ConjugateGradient", which is again faster but
produces messages. You could also use ParallelTry with different
methods.
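For instance, here is a sketch of racing the candidate methods with
ParallelTry; it assumes parallel kernels are available (LaunchKernels[])
and reuses the ewe, ewj, and startval definitions from above. Whichever
method finishes first supplies the result.

```mathematica
(* Make the symbols from the main kernel visible to the subkernels. *)
DistributeDefinitions[ewe, ewj, startval];

(* Run FindMinimum with each method on a separate kernel and keep
   the first result that comes back. *)
irls = ParallelTry[
   FindMinimum[ewe, Evaluate[Sequence @@ startval],
      Gradient -> ewj, Method -> #] &,
   {"Newton", "QuasiNewton", "ConjugateGradient"}];
```

Note that the methods may converge to slightly different points, so
this trades reproducibility for speed.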
Oliver