numerical differentiation

• To: mathgroup at smc.vnet.net
• Subject: [mg61127] numerical differentiation
• From: Joerg Schaber <schaber at molgen.mpg.de>
• Date: Tue, 11 Oct 2005 03:20:20 -0400 (EDT)
• Sender: owner-wri-mathgroup at wolfram.com

Hi,

I defined a cost function for a data-fitting problem that involves
solving an ordinary differential equation, i.e.

costfunc[{x1,x2}]:=Module[{},
...
model=NDSolve[ ... ]

(* sum of squared residuals *)
sim = First[m3[tp] /. model];
Return[(sim - data).W.(sim - data)];
];

NMinimize[costfunc[{x1,x2}],{x1,x2}...];

This gives me estimated optimal parameters x1 and x2. So far, so good.

Now I want to calculate asymptotic confidence intervals for the
estimated parameters x1 and x2. One option is to calculate the Hessian
of costfunc, whose scaled inverse approximates the covariance matrix.
However, numerical differentiation like

N[D[costfunc[{x1,x2}],{{x1,x01},{x2,x02}}]] does not seem to work.

Does anybody have a hint on how I can get numerical derivatives of
costfunc in this case? Or how can I recover the Hessian or Jacobian
when I use FindMinimum? Is there a routine for finite differences?
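For what it's worth, a central-difference Hessian can be sketched by hand; the function name hessianFD and the default step size below are illustrative, not a built-in routine, and the step h would need tuning against the noise introduced by NDSolve's tolerances:

```mathematica
(* Central-difference approximation of the Hessian of a scalar
   function f that takes a parameter list, evaluated at point p.
   hessianFD and the default step h = 10^-4 are illustrative. *)
hessianFD[f_, p_List, h_: 10^-4] :=
  Module[{n = Length[p], e},
    (* e[i]: perturbation of size h along the i-th coordinate *)
    e[i_] := h Table[If[k == i, 1, 0], {k, n}];
    Table[
      (f[p + e[i] + e[j]] - f[p + e[i] - e[j]] -
       f[p - e[i] + e[j]] + f[p - e[i] - e[j]])/(4 h^2),
      {i, n}, {j, n}]]

(* e.g. hessianFD[costfunc, {x1opt, x2opt}] at the fitted optimum *)
```

On the diagonal (i == j) this reduces to the usual second-difference formula (f[p + 2h] - 2 f[p] + f[p - 2h])/(4 h^2), so one definition covers both diagonal and mixed partials. Since NDSolve only returns the solution to its AccuracyGoal/PrecisionGoal, h should not be taken much smaller than the square root of that error, or the differences will be dominated by solver noise.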

best,

joerg
