Re: Using LevenbergMarquardt Method with a complicated function
- To: mathgroup at smc.vnet.net
- Subject: [mg53084] Re: Using LevenbergMarquardt Method with a complicated function
- From: algaba at alumni.uv.es (algaba)
- Date: Thu, 23 Dec 2004 07:58:14 -0500 (EST)
- Sender: owner-wri-mathgroup at wolfram.com
Seems it doesn't work. I tried some examples to understand the procedure,
and they worked, but when I implemented it in my program it failed. I
added the instruction

  R = Join[ResAlpha, ResDelta, ResAlphaComp, ResDeltaComp];

to specify the residual explicitly and used it in the minimization, but
it gives me the error message

  FindMinimum::fmgz: Encountered a gradient which is effectively zero.
  The result returned may not be a minimum; it may be a maximum or a
  saddle point.

It is true that Mathematica now gives me some results that, in
principle, appear to be good, but in the next step I see this is not the
case: R is a null list, and subsequent work with the data gives me
indeterminate expressions.

I now think the problem is that the residuals are not constant but can
vary (in fact, the number of residuals can also differ each time we run
the algorithm), but I don't know how that affects Mathematica. I have
also tried removing some summands and introducing the residuals in
several other ways: creating the list before the execution, inside the
function, just when invoking the Levenberg-Marquardt method, etc. So I
think I am on the right track, because I do get some (but surely wrong)
results.

So, what's the next step? Thanks.
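To be concrete, the pattern I am trying to make work looks roughly like
this (a stripped-down sketch only: p1, p2, p3 stand for my actual
starting values, and the model steps that build the residual lists are
elided, as in my original message below):

  Res[Per0_?NumericQ, Ppa0_?NumericQ, Ecc0_?NumericQ] :=
    Module[{},
      (* same steps and definitions as in ChiSquare, which build
         ResAlpha, ResDelta, ResAlphaComp and ResDeltaComp *)
      Sqrt[2] Join[ResAlpha, ResDelta, ResAlphaComp, ResDeltaComp]]

  FindMinimum[ChiSquare[Per0, Ppa0, Ecc0],
    {{Per0, p1}, {Ppa0, p2}, {Ecc0, p3}},
    Method -> {"LevenbergMarquardt", "Residual" -> Res[Per0, Ppa0, Ecc0]}]

(The Sqrt[2] is meant to make Res.Res/2 equal ChiSquare, following the
example quoted below.)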
--- algaba wrote:

> Hi.
> I have defined a very long function like this:
>
> ChiSquare[Per0_?NumericQ, Ppa0_?NumericQ, Ecc0_?NumericQ] :=
>
>   (some steps and definitions here)
>
>   ChiSQ = Sum[ResAlpha[[i]]^2 + ResDelta[[i]]^2,
>        {i, 1, Length[TExp]}] +
>      Sum[ResAlphaComp[[i]]^2 + ResDeltaComp[[i]]^2,
>        {i, 1, Length[TComp]}]);
>
> which tries to find the chi-square of an array of data. Now I want to
> minimize it, and I use FindMinimum, which works well. The problem
> arises when I want to use the Levenberg-Marquardt method, which seems
> better suited to this kind of function (as you can see, it is a sum of
> squares). But when I run Mathematica 5 it gives me this error message:
>
> FindMinimum::notlm: The objective function for the method
>   LevenbergMarquardt must be in a least-squares form:
>   Sum[f[i][x]^2,{i,1,n}] or Sum[w[i] f[i][x]^2,{i,1,n}]
>   with positive w[i].
>
> I think the function meets all the requirements. Why do I get this
> error? Is it maybe because of the long definition of the function? Is
> it because Mathematica doesn't see this function as a sum of squares
> but as a sequence of steps?
> What can I do to solve this problem? I do want to use this method to
> minimize the chi-square. Thanks.

Because the definition is set up to evaluate only with numerical values
of the arguments, Mathematica cannot do the computations necessary to
decompose the sum of squares. To use the Levenberg-Marquardt method, the
function needs to be decomposed into a residual function r[X] such that
f[X] = r[X].r[X]/2.

The LevenbergMarquardt method has a method option that allows you to
specify the residual explicitly for cases like this, where you may not
want the sum of squares evaluated symbolically. Here is a simple example
that should give you an idea of how it works:

In[1]:= SS[x_?NumberQ, y_?NumberQ] := (x - 1)^2 + 100 (y - x^2)^2

In[2]:= FindMinimum[SS[x, y], {{x, 1}, {y, -1}},
          Method -> "LevenbergMarquardt"]

From In[2]:=
FindMinimum::notlm: The objective function for the method
  LevenbergMarquardt must be in a least-squares form:
  Sum[f[i][x]^2,{i,1,n}] or Sum[w[i] f[i][x]^2,{i,1,n}]
  with positive w[i].

Out[2]= FindMinimum[SS[x, y], {{x, 1}, {y, -1}},
          Method -> LevenbergMarquardt]

In[3]:= R[x_?NumberQ, y_?NumberQ] := Sqrt[2] {x - 1, 10 (y - x^2)}

In[4]:= FindMinimum[SS[x, y], {{x, 1}, {y, -1}},
          Method -> {"LevenbergMarquardt", "Residual" -> R[x, y]}]

Out[4]= {0., {x -> 1., y -> 1.}}

Note that you could leave out the Sqrt[2] and minimize 2 SS[x, y], which
would be slightly more efficient.

There is more explanation and more examples in the Advanced
Documentation for unconstrained optimization. Look in the Mathematica
help browser under

  Advanced Documentation -> Optimization -> Unconstrained Optimization ->
    Methods for Local Minimization -> Gauss-Newton Methods

Rob Knapp
Wolfram Research
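A quick way to sanity-check a decomposition like this is to compare the
two forms at sample exact values (the numbers here are arbitrary, using
the definitions from In[1] and In[3] above):

  With[{x = 1/2, y = 2}, R[x, y].R[x, y]/2 == SS[x, y]]

which should return True.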