14MB image very quickly exhausts 512MB RAM.
- To: mathgroup at smc.vnet.net
- Subject: [mg43453] 14MB image very quickly exhausts 512MB RAM.
- From: "Virgilio, Vincent" <Vincent.Virgilio at itt.com>
- Date: Wed, 17 Sep 2003 07:59:19 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
Hello,

I cannot get Mathematica 5.0 (Windows NT, 512MB RAM) to manipulate a 14MB data set efficiently. The data set is a 2D image in a single file, stored as a sequence of unsigned 16-bit integers in row-major order.

I started with Utilities`BinaryFiles`, which was impossibly slow. Next I tried the more promising MathLink program binary.exe, which provides the package FastBinaryFiles, a replacement for Utilities`BinaryFiles`. That reads the data into memory in reasonable time (~1 min), but memory usage of the Mathematica kernel skyrockets, increasing by about 70MB. After that, about all I can do is partition the data once. Any further manipulation exhausts memory quickly: MathKernel memory usage reaches ~350MB, and the machine grinds to a virtual (memory) halt, pun intended.

Similar things happen when I ReadList bytes. The data is read much faster than even with FastBinaryFiles (~4 sec, great!), but memory usage is just as excessive. I can't even translate the bytes to unsigned ints; virtual memory thrashes (again) in the conversion function. Packed arrays don't seem to help. As a long shot, I thought perhaps the arbitrary-precision data structures were hogging memory, so I set $MinPrecision = $MaxPrecision = 16. No improvement.

So, my Mathematica chops are not up to a seemingly common task. But perhaps I am misapplying Mathematica. I have seen a reference somewhere in the Mathematica documentation that describes 400kb files as 'large'. If that is Wolfram Inc's frame of reference, then I have to change horses to another system, or to Python/Mayavi/Vtk, since 14MB is only a small test case. I had hoped to process much larger chunks of much larger images (512MB).

Regards,

Vince Virgilio
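[For comparison with the Python route mentioned above: the task — read a file of row-major unsigned 16-bit integers into a compact in-memory form and partition it into rows — can be sketched with only the Python standard library. The file name, image dimensions, and sample values below are hypothetical; array.array stores the pixels in a flat C buffer of 2-byte items rather than boxed objects, which is what keeps memory usage close to the file size.]

```python
import array
import os
import struct
import sys
import tempfile

# Hypothetical test image: 4 x 3 pixels, values 0..11, written as
# little-endian unsigned 16-bit integers in row-major order.
width, height = 4, 3
values = list(range(width * height))

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(struct.pack("<%dH" % len(values), *values))
    path = f.name

# Read the whole file into a typed array ('H' = unsigned short,
# 2 bytes on common platforms) -- one compact buffer, not a list
# of boxed integers.
pixels = array.array("H")
with open(path, "rb") as f:
    pixels.fromfile(f, width * height)
if sys.byteorder == "big":
    pixels.byteswap()  # file is little-endian; fix up on big-endian hosts

# Partition into rows by slicing the flat buffer.
rows = [pixels[i * width:(i + 1) * width].tolist() for i in range(height)]

os.remove(path)
print(rows[0])  # prints [0, 1, 2, 3]
```

For a real 14MB file this reads ~7 million values into roughly 14MB of buffer, since no per-element Python objects are created until a slice is converted with tolist().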