MathGroup Archive 2010


Re: Export

  • To: mathgroup at
  • Subject: [mg110247] Re: Export
  • From: "Mitch Stonehocker" <mitch at>
  • Date: Thu, 10 Jun 2010 08:08:15 -0400 (EDT)
  • References: <>

I offer the following suggestion without knowing exactly what you're after or
how you need to process the data.

When you read data, Mathematica consumes memory to retain it.  When you
write data, Mathematica doesn't release any memory.  Therefore, when you are
looping through a process like read data, manipulate, add to the original
data, write new data, Mathematica will consume ever more memory each time
data is read.
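
For example (a schematic loop; the file name, chunk size, and pass count are
only illustrative), MemoryInUse[] shows the footprint growing on each pass
because the accumulated data is retained:

```mathematica
(* schematic: each pass reads a chunk and appends it to a growing list *)
all = {};
strm = OpenRead["1laserGrid.txt"];
Do[
 chunk = ReadList[strm, Expression, 1000];
 all = Join[all, chunk];   (* earlier data is retained, so memory grows *)
 Print[MemoryInUse[]],     (* reported footprint increases on each pass *)
 {i, 1, 10}];
Close[strm];
```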

You may be able to control exactly how memory grows, and avoid reading and
writing files, by using a technique like the one exemplified here:

(* store values as indexed definitions rather than in one growing expression *)
data[0] = a;
data[1] = b;

(* indices can be nested *)
data[0, 0] = c; data[0, 1] = d;



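Extending the example above, a rolling two-step window keeps only the last two
time steps in memory (f, init1, init2, and nMax are placeholders, not from the
original problem):

```mathematica
(* rolling two-step window built on indexed definitions *)
data[1] = init1;
data[2] = init2;
Do[
 data[n] = f[data[n - 1], data[n - 2]];
 Unset[data[n - 2]],   (* release the step that is no longer needed *)
 {n, 3, nMax}];
```

Unset removes the stored definition, so the total footprint stays bounded no
matter how many steps are run.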
I hope that helps.


-----Original Message-----
From: Clint [mailto:clint.zeringue at] 
Sent: Wednesday, June 09, 2010 7:22 AM
To: mathgroup at
Subject: [mg110247] Re: Export

Hi Bill,

Thank you for your response. I just read this post, and in the meantime I
found this workaround:

Read a file and skip so many records down (here I skip (iter - 1)*Nz records):
strm = OpenRead[ToFileName[{NotebookDirectory[]}, "1laserGrid.txt"]];
If[iter != 1, Skip[strm, Record, (iter - 1)*Nz]];
EL = ReadList[strm, Expression];

Append to a file:

strm = OpenAppend[ToFileName[{NotebookDirectory[]}, "1laserGrid.txt"],
    FormatType -> OutputForm];
(* write the list one element per line, dropping the enclosing braces *)
WriteString[strm,
  StringDrop[
   "\n" <> StringDrop[
     StringReplace[ToString[ELNewNew, InputForm], "," -> "\n"],
     1], -1]];
Close[strm];

Both your solution and this one seem to give me the same problem, which I
will describe below.

The reason for using export and import is that I max out the 32 GB RAM on my
PC. So to keep RAM down I use file I/O.

While the "write" stream takes no time at all and doesn't depend on the
current file size, the read command gets bogged down as the file gets larger
and larger. I naively thought that using "Skip" in the read would prevent the
processor from reading in the entire file, which starts to take a very long
time as the file size approaches 100,000 KB.

This is making simulations virtually impossible to run, since they take
10 hours, primarily because I keep looping through this read command every
time I propagate my PDE, and each read takes a minute or so.

I'm at a loss as to what to do here.

I have no choice but to use file I/O due to RAM limitations, but I don't see
a way around my problem there either :(

One way I thought of was to maintain two data files: one where I store all my
data, and a second where I store just the last two iterations of data, since
that is all I need to propagate my PDE forward in time anyway. However, I
thought I might not have to do that if I could use non-sequential data
pulling from a file in Mathematica, but I guess that isn't possible?
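
Non-sequential pulling of records, as asked about above, can be sketched with
stream positions rather than Skip (a sketch only; nRecords and k are
placeholders, and the file is assumed to hold one record per line):

```mathematica
(* index the file once, recording the byte offset after each record *)
strm = OpenRead[ToFileName[{NotebookDirectory[]}, "1laserGrid.txt"]];
pos[0] = 0;
Do[Read[strm, Record]; pos[n] = StreamPosition[strm], {n, 1, nRecords}];

(* later: jump straight to record k + 1 without rescanning the file *)
SetStreamPosition[strm, pos[k]];
next = Read[strm, Record];
Close[strm];
```

Because SetStreamPosition seeks directly to a byte offset, the cost of
fetching a record no longer grows with the size of the file.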
