Re: Re: Re: Re: FindMinimum and the minimum-radius circle
- To: mathgroup at smc.vnet.net
- Subject: [mg50235] Re: [mg50213] Re: [mg50193] Re: Re: FindMinimum and the minimum-radius circle
- From: Daniel Lichtblau <danl at wolfram.com>
- Date: Sun, 22 Aug 2004 00:19:45 -0400 (EDT)
- References: <cfuq6g$653$1@smc.vnet.net> <200408180834.EAA08732@smc.vnet.net> <cg204q$msq$1@smc.vnet.net> <200408200857.EAA12448@smc.vnet.net> <200408210704.DAA24766@smc.vnet.net>
- Sender: owner-wri-mathgroup at wolfram.com
Janos D. Pinter wrote:

> Hello mathgroup,
>
> as stated in previous messages, the minimum-radius circle problem is
> convex, but constrained: hence, an unconstrained optimization approach may
> or may not work...

But it can certainly be made to work on an unconstrained formulation. This
takes some nondefault option settings to work around problems that arise
because such a formulation is not everywhere smooth.

> Using the convex hull as a constraint set will work, of
> course. NMinimize and other constrained optimizers should then also work,
> even in local search mode if they have that option.
>
> Instead of further philosophical discussions on the subject, why don't we
> solve one or more standardized test models, using the same numerical
> example. (I assume here that SeedRandom is platform and version
> independent... as it should be.)
>
> Here is a simple model instance, for your perusal and experiments.
>
> SeedRandom[1];
> points = Partition[Table[Random[], {1000}], 2];
> constraints = (#[[1]] - x0)^2 + (#[[2]] - y0)^2 <= r^2 & /@ points;
> soln = NMinimize[{r, constraints}, {x0, y0, {r, 0, 10}}]
>
> {0.6628150350002375, {r -> 0.6628150350002375, x0 -> 0.48684175822621106,
>   y0 -> 0.46146943870460916}}
>
> timedsoln = NMinimize[{r, constraints}, {x0, y0, {r, 0, 10}}] //
>   AbsoluteTiming
>
> {30.694136`8.938600406559628*Second, {0.6628150350002375,
>   {r -> 0.6628150350002375, x0 -> 0.48684175822621106,
>    y0 -> 0.46146943870460916}}}
>
> That is, on a P4 1.6 WinXP machine it takes about 30 seconds to generate
> this solution.
>
> Next, I solve the same model using the MathOptimizer Professional package,
> in local search mode:
>
> Needs["MathOptimizerPro`callLGO`"]
> callLGO[r, constraints, {{x0, 0, 1}, {y0, 0, 1}, {r, 0, 1}},
>   Method -> LS] // AbsoluteTiming
>
> {6.158856`8.24104504348061*Second,
>  {0.662815036, {x0 -> 0.4868417589, y0 -> 0.4614694394, r -> 0.662815036},
>   4.770200900949817*^-11}}
>
> The two solutions are fairly close (the solution methods are different).
> The MathOptimizer Professional (LGO) solver time is ~6 seconds. The result
> also shows that the maximum constraint error of this solution is
> ~4.770200900949817*^-11. (This error could be reduced, if necessary, by
> changing the model and some LGO options.)
>
> Obviously, one could also use more sophisticated models and other point
> sets etc., as long as we all use the same one(s), for objective
> comparisons. (The proof-of-the-pudding principle.)
>
> Regards,
> Janos
> [...]

What about the unconstrained formulation?

In[47]:= Timing[NMinimize[
  Sqrt[Max[Map[(x - #[[1]])^2 + (y - #[[2]])^2 &, points]]], {x, y}]]

Out[47]= {3.54 Second, {0.662815, {x -> 0.486831, y -> 0.461458}}}

I note that this can be made faster still by preprocessing to extract the
convex hull of the data points. Also, one can attain a faster result, with
but slight loss of accuracy, using

FindMinimum[..., Gradient -> {"FiniteDifference", "DifferenceOrder" -> 2}]

You can do this with arbitrary reasonable starting points, but clearly the
mean or median values would be good choices.

In[52]:= Timing[FindMinimum[
      Sqrt[Max[Map[(x - #[[1]])^2 + (y - #[[2]])^2 &, points]]],
      {x, #[[1]]}, {y, #[[2]]},
      Gradient -> {"FiniteDifference", "DifferenceOrder" -> 2}] &[Mean[points]]]

Out[52]= {1.34 Second, {0.663207, {x -> 0.487394, y -> 0.461453}}}

Experimentation reveals that by taking DifferenceOrder -> 4 I can recover
the NMinimize/MOP result for this particular example, albeit at some loss
of speed.
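For reference, a sketch of that fourth-order call: it differs from In[52]
above only in the DifferenceOrder setting, and the output and timing are
omitted here since they depend on the machine and version used.

(* same objective and starting point as In[52], but with a
   fourth-order finite-difference gradient *)
Timing[FindMinimum[
      Sqrt[Max[Map[(x - #[[1]])^2 + (y - #[[2]])^2 &, points]]],
      {x, #[[1]]}, {y, #[[2]]},
      Gradient -> {"FiniteDifference", "DifferenceOrder" -> 4}] &[Mean[points]]]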
Again, this will perform much better if you work with the convex hull. Here
is how fast this becomes.

Needs["DiscreteMath`ComputationalGeometry`"]

In[69]:= Timing[With[{hull = points[[ConvexHull[points]]]},
   FindMinimum[
       Sqrt[Max[Map[(x - #[[1]])^2 + (y - #[[2]])^2 &, hull]]],
       {x, #[[1]]}, {y, #[[2]]},
       Gradient -> {"FiniteDifference", "DifferenceOrder" -> 2}] &[Mean[hull]]]]

Out[69]= {0.04 Second, {0.66286, {x -> 0.486905, y -> 0.461467}}}

My thanks to Bobby Treat, Rob Knapp, and Tom Burton for bringing to my
attention some of the finer points regarding speed and avoidance of
premature convergence by FindMinimum.

Daniel Lichtblau
Wolfram Research
- Follow-Ups:
- Re: Re: Re: Re: Re: FindMinimum and the minimum-radius circle
- From: "Fred Simons" <f.h.simons@tue.nl>
- Re: Re: Re: Re: Re: FindMinimum and the minimum-radius circle
- References:
- Re: FindMinimum and the minimum-radius circle
- From: Thomas Burton <tburton@brahea.com>
- Re: Re: FindMinimum and the minimum-radius circle
- From: Thomas Burton <tburton@brahea.com>
- Re: Re: Re: FindMinimum and the minimum-radius circle
- From: "Janos D. Pinter" <jdpinter@hfx.eastlink.ca>
- Re: FindMinimum and the minimum-radius circle