MathGroup Archive 2003

Re: 14MB image very quickly exhausts 512MB RAM.

  • To: mathgroup at smc.vnet.net
  • Subject: [mg43483] Re: 14MB image very quickly exhausts 512MB RAM.
  • From: Jens-Peer Kuska <kuska at informatik.uni-leipzig.de>
  • Date: Thu, 18 Sep 2003 05:38:45 -0400 (EDT)
  • Organization: Universitaet Leipzig
  • References: <bk9js5$1e$1@smc.vnet.net>
  • Reply-to: kuska at informatik.uni-leipzig.de
  • Sender: owner-wri-mathgroup at wolfram.com

Hi,

set

$HistoryLength=0;

and use Unset (`x =.`) and Share[] whenever possible.
Give the functions you write a Hold* attribute
(e.g. HoldAll) so that Mathematica does not create
a copy of your data set when you pass it in.
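
A minimal sketch of those settings (the names `data` and `processImage` are placeholders, not part of the original message):

```mathematica
$HistoryLength = 0;    (* don't keep In[]/Out[] copies of large results *)

SetAttributes[processImage, HoldAll];  (* arguments arrive unevaluated,
                                          so the data is not copied *)

Share[];               (* compact memory by sharing identical
                          subexpressions; returns bytes recovered *)

data =.                (* Unset the symbol when done to free its storage *)
```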

You should keep in mind that Mathematica always stores
integers as at least 32-bit values, so your 16-bit integer
image needs at least twice the memory of the raw file.
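
You can see this directly with ByteCount; the sketch below uses current Mathematica syntax (RandomInteger postdates version 5.0), and the exact figure depends on the machine word size:

```mathematica
(* a million 16-bit values: 2 MB in the raw file *)
img = RandomInteger[{0, 65535}, {1000, 1000}];

(* in memory each element occupies a full machine word,
   so this reports several times the raw file size *)
ByteCount[img]
```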

I have 256-400 MByte volume data sets and Mathematica
works fine on them with 1.5 GByte of RAM.
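
One memory-lean way to get unsigned 16-bit data in, sketched here under the assumptions that the file is named "image.dat" and is big-endian (swap the dot vector for little-endian data):

```mathematica
(* read raw bytes, pair them up, and combine each pair into
   one unsigned 16-bit value: hi*256 + lo for big-endian data *)
bytes = ReadList["image.dat", Byte];
img   = Developer`ToPackedArray[Partition[bytes, 2] . {256, 1}];
```

Keeping the result packed (Developer`ToPackedArray) is what keeps the per-element overhead down; an unpacked list of a million integers costs far more than the packed array.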

Regards
  Jens


"Virgilio, Vincent" wrote:
> 
> Hello,
> 
> I cannot get Mathematica 5.0 (Windows NT, 512MB RAM) to efficiently manipulate a 14MB data set, which is a 2D image in a single file. The file contains a sequence of unsigned 16-bit integers stored in row-major order.
> 
> I started with Utilities`BinaryFiles` - impossibly slow.
> 
> Then, the more promising MathLink program binary.exe, which provides the package FastBinaryFiles, a replacement for Utilities`BinaryFiles`. That reads the data into memory in reasonable time (~1 min). But memory usage of the Mathematica kernel skyrockets; it increases by about 70MB. After that, about all I can do is partition the data once. Any further manipulation exhausts memory quickly, MathKernel memory usage reaches ~350MB, and the machine grinds to a virtual (memory) halt, pun intended.
> 
> Similar things happen when I ReadList bytes. The data is read much faster than even with FastBinaryFiles (~4 sec, great!), but memory usage is just as excessive. I can't even translate the bytes to unsigned ints - virtual memory thrashes (again) on the conversion function.
> 
> Packed arrays don't seem to help. As a long shot, I thought perhaps the 'arbitrary precision' data structures were hogging memory, so I set $MinPrecision=$MaxPrecision=16. No improvement.
> 
> So, my Mathematica chops are not up to a seemingly common task.
> 
> But perhaps I am misapplying Mathematica. I have seen a reference somewhere in the Mathematica documentation that refers to 400kb files as 'large'. If that is Wolfram Inc's frame of reference, then I have to change horses to another system, such as Python/Mayavi/VTK, since 14MB is only a small test case. I had hoped to process much larger chunks of much larger images (512MB).
> 
> Regards,
> 
> Vince Virgilio

