Pattern evaluation depending on order of definitions

*To*: mathgroup at smc.vnet.net
*Subject*: [mg74610] Pattern evaluation depending on order of definitions
*From*: "Hannes Kessler" <HannesKessler at hushmail.com>
*Date*: Wed, 28 Mar 2007 01:42:24 -0500 (EST)

Hello Mathematica experts,

please consider the following two examples:

    _g[1] := -1;
    _g[n_Integer] := 1;
    g["something"][1] --> -1

    _h[n_Integer] := 1;
    _h[1] := -1;
    h["something"][1] --> 1

The first example is what I want: objects with head g applied to 1 should return -1, and applied to other integers should return +1. The only difference in the second example is the order of the definitions. It appears that Mathematica does not check whether a later definition matches h["something"][1] more specifically. This is different in the following two examples:

    gg[1] := -1;
    gg[n_Integer] := 1;
    gg[1] --> -1

    hh[n_Integer] := 1;
    hh[1] := -1;
    hh[1] --> -1

Here, the order of the definitions has no influence: Mathematica checks all definitions and chooses the most specific match. What is the reason for this different behaviour?

Thanks in advance,
Hannes Kessler
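[One way to investigate the difference is to inspect what Mathematica actually stored for each symbol. The sketch below uses the standard inspection functions SubValues and DownValues; the working assumption (not confirmed in the post) is that definitions whose left-hand side has a pattern head, like _h[1], are kept as SubValues in the order they were entered, whereas ordinary DownValues are automatically reordered so that more specific patterns are tried first.]

    (* Definitions with a pattern head on the left are attached to h,
       but not as ordinary DownValues *)
    _h[n_Integer] := 1;
    _h[1] := -1;
    SubValues[h]   (* assumption: listed in the order of definition *)

    (* Ordinary definitions, which Mathematica reorders so that the
       more specific pattern hh[1] is tried before hh[n_Integer] *)
    hh[n_Integer] := 1;
    hh[1] := -1;
    DownValues[hh]

If the two lists come back in different orders, that would be consistent with the observed results, since Mathematica tries stored rules in the order they appear in the corresponding value list.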