Re: Gradient in FindMinimum
- To: mathgroup at smc.vnet.net
- Subject: [mg24049] Re: [mg24006] Gradient in FindMinimum
- From: "Mark Harder" <harderm at ucs.orst.edu>
- Date: Wed, 21 Jun 2000 02:20:09 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
Johannes,

Getting FindMinimum (and NonlinearRegression) to work when the gradient, and for that matter the target function itself, cannot be evaluated symbolically is currently a real problem of mine, and I'm seeking professional counselling for it (from Support at Wolfram, ;-) ). Your problem, I think, is tractable. Taking your example,

In[317]:= f[x_] := x^2 + NIntegrate[Exp[-t^2], {t, -Infinity, x}];

I tried to find a symbolic derivative:

In[318]:= D[f[x], x]

NIntegrate::nintp: Encountered the non-number x at {t} = {t}.

Out[318]= Indeterminate

Yes, it is indeterminate, but calling FindMinimum with Method -> Gradient and a variable specification that gives two starting values around 1 works:

In[319]:= FindMinimum[f[x], {x, {.99, 1.01}}, Method -> Gradient]

Out[319]= {0.666069, {x -> -0.419365}}

But is this what you want for your full-blown problem? That is, what did you mean by

>(minimization algorithms which don't use the gradient
>are impracticable.)

???

Now, there is a numerical-derivative routine called ND[] in the standard add-on package NumericalMath`NLimit`. It evaluates its first argument with a symbolic variable (which won't work for me: I need to use PseudoInverse, which takes forever to evaluate symbolically, or SVD, which *only* works on numeric arguments), and it complains about your function, but then goes ahead and returns what look like reasonable results, to me anyway:

In[322]:= {ND[f[x], x, 1], ND[f[x], x, -.419365], ND[f[x], x, -1]}

NIntegrate::nintp: Encountered the non-number x at {t} = {t}.
NIntegrate::nintp: Encountered the non-number x at {t} = {t}.
NIntegrate::nintp: Encountered the non-number x at {t} = {t}.
General::stop: Further output of NIntegrate::nintp will be suppressed during this calculation.

Out[322]= {2.3678794389572717`, -4.789926799402722`*^-7, -1.6321205594031925`}

So, if the "gradient-free" method of optimization I proposed above won't work for you, perhaps you could construct a numeric gradient function with ND[]. If that doesn't work, I'd really like to know how to solve your complicated problem, too.
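For concreteness, here is one way the ND[] idea might be wired into FindMinimum's Gradient option. This is only a rough, untested sketch: numGrad is a placeholder name of my own, and the _?NumericQ patterns keep f and numGrad from ever being evaluated with a symbolic argument, which is what triggers the NIntegrate::nintp messages above.

    Needs["NumericalMath`NLimit`"]   (* standard add-on package that provides ND *)

    (* objective evaluates only for numeric arguments *)
    f[x_?NumericQ] := x^2 + NIntegrate[Exp[-t^2], {t, -Infinity, x}]

    (* numeric derivative of f at x0 via ND; u is a dummy variable so the
       ND call does not interfere with the x used by FindMinimum *)
    numGrad[x0_?NumericQ] := Module[{u}, ND[f[u], u, x0]]

    (* hand the numeric gradient to FindMinimum explicitly *)
    FindMinimum[f[x], {x, 1}, Gradient -> {numGrad[x]}]

For a many-variable problem the same idea should extend by building the gradient list from one ND call per coordinate (holding the other coordinates at their current numeric values), though I have not checked how happily FindMinimum accepts a gradient expression that only evaluates for numeric arguments.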
-mark harder
harderm at ucs.orst.edu

-----Original Message-----
From: Johannes Ludsteck <ludsteck at zew.de>
To: mathgroup at smc.vnet.net
Subject: [mg24049] [mg24006] Gradient in FindMinimum

>Dear MathGroup Members,
>Unfortunately I got no answer when I sent the question below last
>week to the mathgroup mailing list. Since I think that the problem
>is not a very special one, but a general problem with the way
>Mathematica treats numerical integrals in the computation of
>gradients, I am trying again to get an answer.
>I want to minimize a complicated function which contains
>numerical integrals. Since the function is too complicated for a
>direct demonstration, I give a simple example which makes the
>structure of the problem clear:
>
>The (example) function to be minimized is:
>f[x_] := NIntegrate[g[t], {t, -Infinity, x}]
>
>(g is a known function; however, symbolic integration is
>impossible).
>
>When I request numerical minimization of this function by typing
>
>FindMinimum[f[x], {x, 1}]
>
>Mathematica gives me the following error message:
>
>FindMinimum::fmgl: Gradient {Indeterminate} is not a length 1
>list of real numbers at {x} = {1.}.
>
>Apparently, Mathematica is not able to find the gradient
>symbolically. A simple solution would be to define f using Integrate
>(without the N prefix) and to wrap it with N[ ]:
>
>f[x_] := N[ Integrate[g[t], {t, -Infinity, x}] ]
>
>However, since the function contains some hundred terms,
>evaluation of the function takes several minutes. (Mathematica then
>tries to find the integral symbolically before applying the numerical
>integration procedure.) This makes optimization impracticable
>(the function I want to optimize has about 40 variables!).
>
>Are there any suggestions for how to avoid computing the gradient
>manually? (Minimization algorithms which don't use the gradient
>are impracticable.)
>That is, how can I tell Mathematica to use the first definition
>
>f[x_] := NIntegrate[g[t], {t, -Infinity, x}]
>
>for evaluation of the function and the second
>
>f[x_] := N[ Integrate[g[t], {t, -Infinity, x}] ]
>
>for the computation of the gradient?
>
>Thank you
>
>P.S. If you want to reproduce the error message, you can use a
>simple definition:
>
>f[x_] := x^2 + NIntegrate[ Exp[-t^2], {t, -Infinity, x} ]
>
>
>Johannes Ludsteck
>Centre for European Economic Research (ZEW)
>Department of Labour Economics,
>Human Resources and Social Policy
>Phone (+49)(0)621/1235-157
>Fax (+49)(0)621/1235-225
>
>P.O. Box 103443
>D-68034 Mannheim
>GERMANY
>
>Email: ludsteck at zew.de
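P.S. For the simple definition in your postscript the gradient is actually available in closed form: the derivative of NIntegrate[g[t], {t, -Infinity, x}] with respect to x is just g[x], by the fundamental theorem of calculus, so for that example it is 2 x + Exp[-x^2]. If something similar holds in your real problem, you could hand the gradient to FindMinimum yourself, and it never has to differentiate (or symbolically integrate) anything. Again only an untested sketch:

    (* numeric-only objective, as above *)
    f[x_?NumericQ] := x^2 + NIntegrate[Exp[-t^2], {t, -Infinity, x}]

    (* the derivative of the integral term is the integrand evaluated at x *)
    FindMinimum[f[x], {x, 1}, Gradient -> {2 x + Exp[-x^2]}]

The minimum this finds should agree with the one above, x -> -0.419365, which is where 2 x + Exp[-x^2] vanishes.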