Re: 1GB Kernel Memory Limit/Out of memory errors
I was out of email contact for the weekend, and while going through the weekend's MathGroup load, I noticed the following two similar posts, from Friday night and Sunday night. The Friday one doesn't seem to have had any public responses, so I thought I'd respond to both in one shot.

First, let me say categorically that Mathematica does not have a 1 GB limit, and it has no limitation based solely on the amount of physical RAM. Like virtually every other Windows application out there, Mathematica does not manage its own paging file or its own memory limitations. It merely allocates memory using Microsoft's implementation of the C 'malloc' function, which is subject to the standard limitations imposed by the operating system. On a Win32 system, those limitations amount to 2 GB of addressable space per process, or 3 GB if the system is booted with the /3GB switch and the application is compiled appropriately to take advantage of it, which Mathematica is. This addressable space is backed by virtual memory, which is the combination of physical RAM and the page files managed directly by Windows itself.

Practically, you're never going to see MemoryInUse get very close to that limit. Some of the addressable space is not going to be directly available to Mathematica, for several reasons. But in practice, I've been able to get a Mathematica kernel to use around 1.85 GB (if memory serves me correctly...it's been a while since I've done this) on a 2 GB system.

However, there are several legitimate reasons why a Mathematica kernel could issue an out-of-memory message while MemoryInUse is displaying a much lower number. Without seeing your programs, I can only speculate, but here are some possibilities.

* Mathematica failed in the middle of making a very large memory allocation because the memory just wasn't available. If MemoryInUse reports 1 GB and a single allocation needs another 1.5 GB, the math just doesn't work.
* Mathematica failed to make a smaller (but probably still in the tens of megabytes) allocation because of memory fragmentation. Since there are only 2 GB (or 3 GB) of addressable space, the pattern of past memory allocations and frees may leave the address space so fragmented that no really large contiguous blocks remain. A single 50 MB request will fail if there is no contiguous 50 MB chunk of addressable space, regardless of how much memory is actually in use.

* Much less commonly, it's possible to exhaust other, much more limited, memory-based resources. This includes stack space, which could affect highly recursive calculations, and limits on the number of handles, which might come into play if you access an extremely large number of files without subsequently closing their streams.

A note about the memory fragmentation issue: if this were the problem, then quite possibly you could run your evaluations on a 64-bit version without any extra RAM or a larger paging file. Because the addressable space is so much larger there, getting a contiguous block of addresses ceases to be a problem, and only the allocated blocks, no matter what their addresses are, need to be backed by virtual memory.

Sincerely,

John Fultz
jfultz at wolfram.com
User Interface Group
Wolfram Research, Inc.

-------------------------------

On Fri, 5 Jun 2009 03:02:49 -0400 (EDT), Pillsy wrote:
> Hi, all,
>
> I'm currently working on a problem that requires a whole lot of memory
> to complete. Sometimes, depending on the specific parameters being
> used, I will receive the dreaded "No more memory available" message.
> Doing my best to track memory usage with MemoryInUse, MaxMemoryUsed
> and the Windows Task Manager, I find that the kernel quits when it
> hits a limit of around one gigabyte.
>
> I'm using Version 7.0 on 32-bit Windows XP without the /3GB switch
> active; neither switching to a 64-bit OS nor activating the /3GB
> switch is a possibility.
> However, I was under the distinct impression
> that even without the /3GB switch active I should have a per-process
> memory limit of 2 GB, not 1 GB. Is there some other reason the kernel
> might quit, due to either Mathematica settings or OS settings?
>
> As for modifying my program, I've done the obvious things like setting
> $HistoryLength = 0 and Clear-ing things that are no longer needed, and
> none of them have done the trick in all the cases I've needed to deal
> with. I've also done an array of not-so-obvious things that haven't
> worked either.
>
> Any suggestions would be greatly appreciated.
>
> Thanks,
> Pillsy

----------------------------

On Mon, 8 Jun 2009 03:05:48 -0400 (EDT), Weldon MacDonald wrote:
> I'm operating on some largish problems in which I'm making a graph
> using MakeGraph from the Combinatorica package. When I exceed a
> certain size, the thing runs for a bit, gives an out-of-memory
> error, and then the kernel shuts down. I'm running Vista on a dual core
> with 3 GB of RAM.
> I've tracked down the problem, I think: Mathematica runs the operation
> in RAM, and when it exceeds the on-board RAM, it shuts down. My question
> is, why doesn't it build a pagefile like most Windows applications?
> Then I could run anything that didn't actually use up the free disk
> space.