D[...] change in 5.1

*To*: mathgroup at smc.vnet.net
*Subject*: [mg58686] D[...] change in 5.1
*From*: Alexei Akolzin <akolzine at uiuc.edu>
*Date*: Thu, 14 Jul 2005 02:49:13 -0400 (EDT)
*Organization*: University of Illinois at Urbana-Champaign
*Sender*: owner-wri-mathgroup at wolfram.com

Hi,

In the previous version, the two lines below seemed to work as intended:

In: n /: D[n[i_], x[j_], NonConstants -> {n, r}] := (1/r)(d[i,j] - n[i] n[j]);
    D[1 + n[k], x[l]]
Out: (-n[k] n[l] + d[k,l])/r

But now, in version 5.1, I get something like:

Out: D[n, x[l], NonConstants -> {J, r, n}]    (1)

The funny part is that D[n[k], x[l]] on its own *is* recognized and substituted by the expression associated with the definition of n. I wonder whether there is a way to get Mathematica 5.1 to recognize n[k] as an indexed symbol n.

Thanks,
Alexei Akolzin.
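[A possible direction, not from the original post: since the unevaluated output in (1) shows a NonConstants list whose contents and order differ from the literal {n, r} in the upvalue pattern, one untested guess is that making the pattern order-insensitive lets the rule match whatever list D passes along internally. A minimal sketch, assuming that is the cause:]

```mathematica
(* Hypothetical workaround sketch: match any NonConstants list
   that contains n, instead of the literal {n, r}. The symbols
   n, x, r, d are taken from the original post. *)
n /: D[n[i_], x[j_], NonConstants -> nc_List] /; MemberQ[nc, n] :=
  (1/r) (d[i, j] - n[i] n[j])
```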