MathGroup Archive 2012



Re: Memory Blowup Issues

  • To: mathgroup at smc.vnet.net
  • Subject: [mg126742] Re: Memory Blowup Issues
  • From: Ralph Dratman <ralph.dratman at gmail.com>
  • Date: Sun, 3 Jun 2012 05:01:48 -0400 (EDT)
  • References: <201206010918.FAA11834@smc.vnet.net>

Kevin,

Thank you for passing along those tips. I usually set $HistoryLength = 1
and also turn off "Enable Notebook History Tracking" in the
Preferences dialog -- though I'm not sure the latter makes any
difference.
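
For concreteness, here is roughly what I mean -- just a sketch, with
"bigData" and "fileName" standing in for whatever you actually import:

$HistoryLength = 1;                  (* keep only the most recent Out[] *)
bigData = Import[fileName, "Data"];  (* placeholder names, not real code *)
(* ... work with bigData ... *)
bigData =.                           (* drop the reference once you are done *)
MemoryInUse[]                        (* check whether the space actually came back *)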

I just want to re-emphasize how egregious I think the situation with
crashing really is. It is unacceptable. I hope everyone is making that
absolutely clear to WRI. How can we continue to recommend a program
that crashes this often? It defies common sense.

Ralph


On Sat, Jun 2, 2012 at 9:45 AM, Kevin J. McCann <kjm at kevinmccann.com> wrote:
> I already have 12 GB, so that is not a solution. Others have suggested
> $HistoryLength = 0, which works. On the other hand, myvar = . did not change
> the amount of memory in use at all.
>
> I guess the acceptability of the memory/crash problem depends on your point of
> view. If indeed the ability to access previous outputs is important to the
> way you work (it is not to me), then I can see why it needs to be there even
> though it could lead to a crash in extreme cases like mine. However, the
> ability to limit this with $HistoryLength gives flexibility to the user.
>
> I do not make use of the whole In[] Out[] thing, and, in fact, turn those
> labels off in all my notebooks.
>
> Kevin
>
> On 6/2/2012 5:48 AM, Ralph Dratman wrote:
>>
>> Solution: install more memory in your computer?
>>
>> That answer is sort of a wisecrack, but in real life, my Mac Mini now
>> has 8 GB of RAM for the benefit of Mathematica -- yet the program
>> still somehow manages to fill up memory and crash. That is why I keep
>> glancing at the desktop memory monitor. If memory is almost full, I
>> can usually stop Mathematica before things get really serious.
>>
>> That said, my experience has been that disposing of variables (using
>> myvar=.) does help quite a bit. But I am not reading files very often.
>> In your case, maybe something about reading a file is making it harder
>> to free the memory used.
>>
>> By the way, in case you might be thinking that this whole crash
>> problem is pretty unacceptable behavior for an app that sells for a
>> lot of money in 2012.... yeah, you're right, it is.
>>
>> Ralph
>>
>>
>>
>> On Fri, Jun 1, 2012 at 5:18 AM, Kevin J. McCann <kjm at kevinmccann.com> wrote:
>>>
>>> I am dealing with some good-sized satellite data sets stored in HDF
>>> format. The problem is that I process a sequence of these data sets by
>>> Importing and then processing; however, it seems that the memory usage
>>> grows with each Import, even if I Remove the data.
>>>
>>> Here is a somewhat simplified example:
>>>
>>> MemoryInUse[] (* some data already read in *)
>>> 2781549920
>>>
>>> Data=Import[fileName,"Data"]; (* import a new data set *)
>>> MemoryInUse[]
>>> 3302168936
>>>
>>> Data=Import[fileName,"Data"]; (* import it again *)
>>> MemoryInUse[]
>>> 3822781920
>>>
>>> Remove[Data] (* this has no effect on the memory usage *)
>>> MemoryInUse[]
>>> 3822787840
>>>
>>> As you can see, I will shortly run out of memory. Any ideas?
>>>
>>> Thanks,
>>>
>>> Kevin
>>>
>>
>
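
P.S. Kevin, regarding Remove[Data] having no effect above: my guess (and
it is only a guess) is that the Out[] history was still holding a
reference to the imported expression, so removing the symbol did not
release the storage. With history turned off and the variable cleared
between imports, something like the loop below is what I would try -- a
sketch only, with "fileName" standing in for one of your HDF files:

$HistoryLength = 0;                 (* keep no Out[] references at all *)
Do[
  data = Import[fileName, "Data"];  (* placeholder file name *)
  (* ... process data here ... *)
  Clear[data],                      (* release the last reference before the next pass *)
  {3}
]
MemoryInUse[]  (* should stay roughly flat if nothing else holds a reference *)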


