MathGroup Archive 2001

Re: Re: idiom for recurrence relations

  • To: mathgroup at smc.vnet.net
  • Subject: [mg26844] Re: [mg26815] Re: [mg26714] idiom for recurrence relations
  • From: Daniel Lichtblau <danl at wolfram.com>
  • Date: Thu, 25 Jan 2001 01:13:22 -0500 (EST)
  • References: <200101180557.AAA16636@smc.vnet.net> <948rtc$kli@smc.vnet.net> <200101240918.EAA03640@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Mitch Harris wrote:
> 
> In article <948rtc$kli at smc.vnet.net>,
>   Daniel Lichtblau <danl at wolfram.com> wrote:
> > maharrix at my-deja.com wrote:
> > >
> > > I have a problem which is mostly esthetic, but could have efficiency
> > > repercussions. First, the general question: What is the best way to
> > > implement recurrence relations (e.g. Fibonacci[n])?
> 
> [...some possibilities deleted...]
> 
> > > So, how do the above rate and what are the reasonable alternatives?
> >
> > One good way might be via matrix powers. Below I show several variations.
> 
> [...some variations deleted...]
> 
> > fib5[n_] := MatrixPower[{{1,1},{1,0}},n-1][[1,1]]
> >
> > fib6[n_] := Fibonacci[n]
> 
> [...timings deleted...]
> 
> > Fifth method uses a divide-and-conquer approach in MatrixPower. So it
> > will be O(log(n)*M(n)) where M(n) is the cost of multiplying a pair of
> > n-bit integers. The built-in Fibonacci code does something similar, I
> > believe.
> 
> Great answer! (I wish I had done that to begin with). Your matrix method
> is one kind of alternative that seems fruitful (it generalizes at least
> for allowing coefficients that are not constants, but doesn't work with
> non-linear recurrences, which is not a terrible restriction).
> 
> But I have a couple of nits to pick about running times. You say that
> versions 2, 3, and 4 are O(n^2). Yes, they are, but specifically only for
> the Fibonacci recurrence (and others whose growth is exponential). I would
> prefer to separate the number of steps performed (O(n) for these three)
> and then take into account the size of the answer: e.g., 1) for polynomial
> coefficients you can get super-exponential growth, and 2) for appropriate
> constants (e.g. binomial coefficients) you can get plain old polynomial
> solutions to the recurrence, which gives an O(log n) multiplier.

Yes. What I said applied to that particular recurrence, not to
recurrences in general. Also note that what I said for the last methods
was wrong (I corrected that in a follow-up post). Those are just
O(M(n)), with no log(n) factor: the operand sizes roughly double at each
squaring step, so the costs form a geometric series dominated by the
final multiplication.
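
For concreteness, here is a minimal timing sketch comparing the
matrix-power version against the built-in (the definitions are the ones
quoted above; the test sizes are only illustrative, and absolute timings
will vary by machine and version):

  fib5[n_] := MatrixPower[{{1,1},{1,0}},n-1][[1,1]]
  fib6[n_] := Fibonacci[n]

  (* Sanity check, then time both on a few sizes. *)
  fib5[20] == fib6[20]
  Table[{n, First[Timing[fib5[n];]], First[Timing[fib6[n];]]},
    {n, {10^4, 10^5, 10^6}}]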


> Also, you state for method 5 that 'M(n)' is the time for multiplying
> two n-bit integers. Since method 5 is really multiplying n matrices, I
> would think this algorithm is really O(n M(k)), where M(k) is the cost of
> multiplying two k by k matrices that encode a kth-order recurrence
> (which is currently roughly O(k^2.37) by the Coppersmith/Winograd method).
> As you say, by divide and conquer for handling exponentiation, you can
> make this O(log(n) M(k)) for constant matrices (as you get with
> recurrences with constant coefficients). I have not taken into account
> the sizes of the integers being multiplied within the matrix
> multiplication.

But you have to account somewhere for the size of the scalar factors;
this is unavoidable. I realize your omission is by intent, and I am just
pointing out that for computations such as the Fibonacci example this
can be a dominant factor. That said, yes, it makes good sense to
separate effects of matrix products, number of steps, and scalar
multiplication.
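
As a rough quantitative illustration of that point (a sketch; the
bit-length helper is just for illustration): fib5[n] returns an integer
with about n*Log[2, GoldenRatio], roughly 0.69*n, bits, so the
large-integer products are what grow with n.

  bitLength[m_Integer?Positive] := Ceiling[Log[2, m + 1]]

  (* Bit counts of Fibonacci numbers versus the 0.69*n estimate. *)
  Table[{n, bitLength[Fibonacci[n]], N[n Log[2, GoldenRatio]]},
    {n, {100, 1000, 10000}}]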

I did not include a factor to account for matrix multiplication. The
reason is that I was using an order-2 recurrence (a 2x2 matrix) and simply
let the estimate swallow the multiplicative constant that accounts for the
number of products generated in taking such a matrix product. This is
realistic for smallish orders (up to maybe a few dozen?), after which
asymptotically fast matrix multiplication methods may start to make a
big difference.

On the other hand, note that these recurrences, when expressed as matrix
products, are working with companion matrices (1's on the subdiagonal,
zeroes everywhere else except the last column, or the transpose of this,
depending on how you set it up). So one might do these products in a way
that involves, say, O(k^2) scalar multiplications. But this might not be
possible in a divide-and-conquer matrix power setting (offhand I do not
know whether or how these ideas might be combined). This could be
something to think about if you want to work with high-degree
recurrences.
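
To make the companion-matrix setup concrete, here is a minimal sketch
for a kth-order constant-coefficient recurrence (the function names and
the ordering of the initial values are just one possible convention):

  (* Companion matrix for a[n] = c1 a[n-1] + ... + ck a[n-k], k >= 2:
     coefficients across the top row, 1's on the subdiagonal. *)
  companionMatrix[coeffs_List] := With[{k = Length[coeffs]},
    Join[{coeffs}, Table[If[j == i, 1, 0], {i, k - 1}, {j, k}]]]

  (* nth term, given initial values {a[k], ..., a[1]}, most recent first;
     assumes n >= Length[coeffs]. *)
  recurrenceTerm[coeffs_List, init_List, n_Integer] :=
    First[MatrixPower[companionMatrix[coeffs], n - Length[coeffs]] . init]

  (* Fibonacci as a check: a[n] = a[n-1] + a[n-2], a[2] = a[1] = 1. *)
  recurrenceTerm[{1, 1}, {1, 1}, 10]   (* 55 *)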


> Thanks for the info. Your analysis is exactly what I should have done to
> begin with. By the way, is there any sort of collection somewhere of
> idioms of the kind I am thinking of? E.g. those embedded in the
> Mathematica Book, like Fold[Plus, 0, list] for summing a list. For the
> longest time I was using things like
> Table[Sum[list[[i]], {i, 1, k}], {k, 1, Length[list]}], knowing that there
> was a much easier functional way to do it, but not knowing which
> functional operator to start with.

There may be such a collection (and to be honest I could sometimes use a
look at it myself). What I find is that a lot of the "better" ways just
gradually sink in with experience. For a slightly more concrete
suggestion, whenever you find yourself using an explicit iterator (as in
the example above), you might cast about for a way that avoids it.
Also, given that your older approach uses explicit nested iterators
whereas an "in your head" approach would have a single loop, you could
simply try to think of ways to attain that more efficient method. I am
not sure whether this would lead you right to FoldList, but it might, or
it might at least take you to something else within a constant multiple
in efficiency and/or elegance.
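
For the cumulative-sum example above, a minimal sketch of the two
approaches (the list contents are just a placeholder):

  list = {3, 1, 4, 1, 5, 9};

  (* Explicit nested iterators: O(n^2) additions. *)
  Table[Sum[list[[i]], {i, 1, k}], {k, 1, Length[list]}]

  (* One pass with FoldList: O(n) additions; Rest drops the initial 0. *)
  Rest[FoldList[Plus, 0, list]]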


> Mitch Harris
> 
> Sent via Deja.com
> http://www.deja.com/


Daniel Lichtblau
Wolfram Research

