scaling of ndsolve to large systems
- To: mathgroup at smc.vnet.net
- Subject: [mg60124] scaling of ndsolve to large systems
- From: Christopher Klausmeier <klausme1 at msu.edu>
- Date: Sat, 3 Sep 2005 02:06:17 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
I've been using Mathematica's NDSolve command to solve fairly large
systems of ODEs (hundreds of equations). Unfortunately it is slower
than I'd like, and perhaps unreasonably so. In particular, I noticed
that there is a certain problem size at which performance gets
distinctly worse. I've checked that this is not due to running out
of RAM and swapping to virtual memory.
To test this more carefully, I timed how long it would take to
simultaneously solve x identical uncoupled pairs of equations/initial
conditions. I thought that the time would scale linearly with the
number of equations, but it doesn't. Regressing log runtime against
log x, I found that runtime ~ x^1.08 (1 <= x <= 128), ~ x^2.64
(130 <= x <= 192), and ~ x^1.33 (194 <= x <= 256).
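For concreteness, here is a minimal sketch of the kind of timing
experiment described above. The harmonic-oscillator pair and the helper
name timeFor are my own stand-ins, not the system from the original
test; any uncoupled pair of equations would do:

```mathematica
(* Solve x identical, uncoupled pairs of ODEs and time NDSolve.
   The pair u' == v, v' == -u is a hypothetical stand-in. *)
timeFor[x_] := Module[{vars, eqns, ics},
   vars = Flatten[Table[{u[i][t], v[i][t]}, {i, x}]];
   eqns = Flatten[Table[{u[i]'[t] == v[i][t],
        v[i]'[t] == -u[i][t]}, {i, x}]];
   ics = Flatten[Table[{u[i][0] == 1., v[i][0] == 0.}, {i, x}]];
   First[Timing[NDSolve[Join[eqns, ics], vars, {t, 0, 100}];]]];

(* Estimate the exponent p in runtime ~ x^p by regressing log
   runtime on log x. (In version 5, Timing returns the time in
   Second units; divide by Second before taking the Log.) *)
data = Table[{x, timeFor[x]}, {x, 16, 256, 16}];
Fit[Log[data], {1, logx}, logx]  (* slope of logx is p *)
```

The fitted slope from the log-log regression is the scaling exponent;
running this over ranges that straddle x = 128 should reproduce the
break in scaling.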
Any ideas what goes wrong when x passes 128? Even better, is there a
way to maintain that nice near-linear scaling across all problem sizes?
This is with Mathematica 5.1 on my PowerBook running Mac OS X 10.4.2,
but a friend's PC gives similar results.
thanks -- Chris
Kellogg Biological Station
Michigan State University
Hickory Corners MI 49060
Email: klausme1 at msu.edu
Phone: (269) 671-4987