MathGroup Archive 2010

Automatic Differentiation of Mathematica code

  • To: mathgroup at smc.vnet.net
  • Subject: [mg110594] Automatic Differentiation of Mathematica code
  • From: pratip <pratip.chakraborty at gmail.com>
  • Date: Sun, 27 Jun 2010 04:55:32 -0400 (EDT)

Dear Experts,

Is there an implementation of automatic differentiation (AD) in
Mathematica? If a function is purely numerical I can use ND, but that
gives only a finite-difference approximation of the derivative, and for
a function that is expensive to evaluate ND takes a huge amount of time.
So my question is whether it is possible to apply AD to the function's
code. There is a very small toy example in the Demonstrations Project:
http://demonstrations.wolfram.com/AutomaticDifferentiation/
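
To illustrate what I mean by AD, here is a minimal forward-mode sketch in
the spirit of that Demonstration. The head dual[value, derivative] and the
helper adD are names made up for illustration, not part of any package, and
only a handful of arithmetic rules are shown:

(* forward-mode AD with dual numbers: carry a value and its derivative *)
dual /: dual[v1_, d1_] + dual[v2_, d2_] := dual[v1 + v2, d1 + d2];
dual /: c_?NumericQ + dual[v_, d_] := dual[c + v, d];
dual /: dual[v1_, d1_] dual[v2_, d2_] := dual[v1 v2, d1 v2 + v1 d2];
dual /: c_?NumericQ dual[v_, d_] := dual[c v, c d];
dual /: Sin[dual[v_, d_]] := dual[Sin[v], Cos[v] d];
dual /: Exp[dual[v_, d_]] := dual[Exp[v], Exp[v] d];

(* derivative of f at x0: seed the input with derivative 1 *)
adD[f_, x0_?NumericQ] := f[dual[x0, 1]][[2]]

adD[(# Sin[#] + Exp[#]) &, 2.]  (* Sin[2.] + 2. Cos[2.] + E^2. ~ 7.466 *)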

But my function involves about 500 lines of Mathematica code and takes
around 2.3 seconds to evaluate on an eight-core machine. It has more
than 80 variables, all of them numerical. Any idea how to compute the
gradient of such a function? Isn't AD the right choice here?
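
For comparison, a plain central-difference gradient can at least be
parallelized over the coordinates; this is still finite differences, not
AD. In the sketch below, gradFD, f, and the step h are placeholder names,
and f is assumed to take a numeric vector and return a real number:

(* parallel central-difference gradient: 2 n evaluations of f for n variables *)
gradFD[f_, x_?VectorQ, h_: 10.^-4] :=
  ParallelTable[
    With[{e = UnitVector[Length[x], i]},
      (f[x + h e] - f[x - h e])/(2 h)],
    {i, Length[x]}]

For 80 variables that is 160 evaluations of f spread over the kernels
started with LaunchKernels[]; if f carries its own definitions they have
to be sent to the subkernels with DistributeDefinitions[f] first.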

If I then try to minimize this function with linear/nonlinear
constraints on the 80 variables, Mathematica has great trouble making
progress. Any ideas on this would be really helpful. Could anyone also
point me to a good genetic algorithm implementation in Mathematica? My
function is quite non-smooth, and I need a derivative-free algorithm in
case there is no way to compute the gradients efficiently.
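
As far as I understand, NMinimize already has derivative-free methods
built in, and its "DifferentialEvolution" method is an evolutionary
(GA-like) optimizer that handles constraints without gradients. Here is a
toy sketch; obj, the constraints, and the dimension are placeholders
standing in for the real 80-variable problem:

vars = Array[x, 10];  (* 10 here to keep the toy fast; 80 in the real problem *)

(* black-box style objective: evaluates only for numeric arguments *)
obj[v_ /; VectorQ[v, NumericQ]] := Total[v^2];

NMinimize[
  {obj[vars],
   Total[vars] <= 5,                 (* example linear constraint *)
   And @@ Thread[-1 <= vars <= 1]},  (* example bounds *)
  vars,
  Method -> {"DifferentialEvolution", "SearchPoints" -> 50}]

Other derivative-free choices for Method are "NelderMead",
"SimulatedAnnealing", and "RandomSearch".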

Best regards,

Pratip

