MathGroup Archive 2012


Re: Memory Blowup Issues

  • To: mathgroup at smc.vnet.net
  • Subject: [mg126718] Re: Memory Blowup Issues
  • From: Bill Rowe <readnews at sbcglobal.net>
  • Date: Sat, 2 Jun 2012 05:43:11 -0400 (EDT)
  • Delivered-to: l-mathgroup@mail-archive0.wolfram.com

On 6/1/12 at 5:18 AM, kjm at KevinMcCann.com (Kevin J. McCann) wrote:

>I am dealing with some good sized data sets from satellite data.
>These data are stored in hdf format. The problem is that I process a
>sequence of these data sets by Importing and then processing;
>however, it seems that the memory usage grows with each Import, even
>if I Remove the data.

Try setting $HistoryLength = 0; at the start of a session when
working with large data sets. This keeps Mathematica from
saving copies of results on the In/Out stack, which can
quickly consume memory.
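A minimal sketch of that session pattern (the HDF file name and
the processing function are placeholders, not part of your setup):

    $HistoryLength = 0;  (* keep no In/Out history this session *)

    data = Import["granule.hdf", "Datasets"];  (* hypothetical file *)
    result = processData[data];                (* your processing step *)
    Remove[data];                              (* drop the raw import *)

With $HistoryLength at its default of Infinity, every Out[n] would
still hold a reference to the imported data even after Remove, which
is exactly the growth you are seeing.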

Another approach is to load the CleanSlate package before
importing large data sets. CleanSlate can restore the session to
the state it was in immediately after the package was loaded,
including clearing the In/Out stack.
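Something like the following, assuming the package sits in the
Utilities` context as it does in the copy I have (check the context
path against your installation):

    Needs["Utilities`CleanSlate`"];  (* load before any big imports *)

    data = Import["granule.hdf", "Datasets"];  (* hypothetical file *)
    (* ... process data ... *)

    CleanSlate[];  (* purge symbols created since the package loaded,
                      and clear the In/Out lists *)

Calling CleanSlate[] between files in your loop should keep the
session's memory footprint roughly constant from one import to the
next.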



