Re: Export
- To: mathgroup at smc.vnet.net
- Subject: [mg110247] Re: Export
- From: "Mitch Stonehocker" <mitch at aitoconsulting.com>
- Date: Thu, 10 Jun 2010 08:08:15 -0400 (EDT)
- References: <201006091121.HAA12074@smc.vnet.net>
I make the following suggestion not knowing exactly what you're after or how you need/want to process the data. When Mathematica reads data, it consumes memory to retain it; when it writes data, it doesn't release any memory. Therefore, in a loop like "read data, manipulate, add to the original data, write new data", Mathematica consumes ever more memory each time data is read. You may be able to control exactly how memory grows, and avoid reading and writing files altogether, using the indexed-variable technique exemplified here:

    In[1]:= data[0] = a; data[1] = b;

    In[2]:= ?data
    Global`data
    data[0] = a
    data[1] = b

    In[3]:= data[0, 0] = c; data[0, 1] = d;

    In[4]:= ?data
    Global`data
    data[0] = a
    data[1] = b
    data[0, 0] = c
    data[0, 1] = d

I hope that helps.

Cheers,
Mitch

-----Original Message-----
From: Clint [mailto:clint.zeringue at kirtland.af.mil]
Sent: Wednesday, June 09, 2010 7:22 AM
To: mathgroup at smc.vnet.net
Subject: [mg110247] Re: Export

Hi Bill,

Thank you for your response. I just read this post, and in the meantime I found this workaround.

Read a file, skipping a given number of records (here I skip over so many earlier iterations):

    strm = OpenRead[ToFileName[{NotebookDirectory[]}, "1laserGrid.txt"]];
    If[iter != 1, Skip[strm, Record, (iter - 1)*Nz]];
    EL = ReadList[strm, Expression];
    Close[strm];

Append to a file:

    strm = OpenAppend[ToFileName[{NotebookDirectory[]}, "1laserGrid.txt"],
       FormatType -> OutputForm];
    WriteString[strm,
      "\n" <> StringDrop[StringDrop[
         StringReplace[ToString[ELNewNew, InputForm], "," -> "\n"], 1], -1]];
    Close[strm];

Both your solution and this one give me the same problem, which I will describe below. The reason for using Export and Import is that I max out the 32 GB of RAM on my PC, so to keep RAM usage down I use file I/O. While the write stream takes no time at all and doesn't depend on the current file size, the read command gets bogged down as the file gets larger and larger.
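[An illustrative aside, not part of the original exchange: the slowdown described above happens because Skip still scans every record it passes over, so each read costs time proportional to the file size. One way around that is to record the byte offset of each block with StreamPosition while writing, then jump straight to a block with SetStreamPosition. A minimal sketch, where offsets, dataBlock, and k are hypothetical names and the file name is borrowed from Clint's code:]

```mathematica
(* Sketch: remember where each iteration's block starts, then seek
   directly to it later instead of Skip-ing through earlier records. *)

offsets = {};  (* byte offset at which each iteration's block begins *)

(* while appending, record the current stream position first *)
strm = OpenAppend["1laserGrid.txt", FormatType -> OutputForm];
AppendTo[offsets, StreamPosition[strm]];
WriteString[strm, ToString[dataBlock, InputForm] <> "\n"];
Close[strm];

(* later: jump straight to iteration k's block; cost is independent
   of how large the file has grown *)
strm = OpenRead["1laserGrid.txt"];
SetStreamPosition[strm, offsets[[k]]];
block = Read[strm, Record];
Close[strm];
```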
I naively thought that using Skip in the read would prevent the processor from reading in the entire file, which starts to take a very long time as the file size approaches 100,000 KB. This is making simulations virtually impossible to run: they take 10 hours, primarily because I loop through this read command every time I propagate my PDE, and each read takes a minute or so.

I'm at a loss about what to do here. I have no choice but to use file I/O due to RAM limitations, but I don't see a way around my problem there either. :( One way I thought of is to maintain two data files: one where I store all my data, and a second where I store just the last two iterations of data, since that is all I need to propagate my PDE forward in time anyway. However, I thought I would not have to do that if I could pull data non-sequentially from a file in Mathematica, but I guess that isn't possible?
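[An illustrative aside, not part of the original exchange: the two-file idea above can be combined with Mitch's indexed variables so that only the last two iterations ever sit in RAM, while everything is still archived to disk. A sketch under stated assumptions: field0, field1, step, and nSteps are hypothetical stand-ins for the initial data and the PDE update, and the archive file name is borrowed from Clint's code.]

```mathematica
(* Sketch: data[Mod[n, 2]] holds iteration n.  Writing iteration n+1
   into data[Mod[n + 1, 2]] overwrites the block that is two steps
   old, so in-memory storage stays constant at two iterations. *)

data[0] = field0;   (* iteration 0 (hypothetical initial data) *)
data[1] = field1;   (* iteration 1 (hypothetical initial data) *)

archive = OpenAppend["1laserGrid.txt", FormatType -> OutputForm];
Do[
  new = step[data[Mod[n - 1, 2]], data[Mod[n, 2]]];  (* advance the PDE *)
  WriteString[archive, ToString[new, InputForm] <> "\n"];  (* archive it *)
  data[Mod[n + 1, 2]] = new,   (* reuse the stale slot *)
  {n, 1, nSteps}];
Close[archive];
```

Reads of the full history then only happen in post-processing, not inside the time-stepping loop.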
- References:
  - Re: Export
    - From: Clint <clint.zeringue@kirtland.af.mil>
  - Re: Export