SVD performance issues under Linux64?
- To: mathgroup at smc.vnet.net
- Subject: [mg75227] SVD performance issues under Linux64?
- From: renormalize at hotmail.com
- Date: Sat, 21 Apr 2007 23:07:41 -0400 (EDT)
I have access to two dual-core computers running Mathematica 5.2:

DESKTOP:
  3.2GHz Pentium D940
  4GB RAM
  Windows XP32 SP2
  Mathematica 5.2.0
  BenchmarkReport[] Result = 1.74

SERVER:
  3GHz Xeon 5160
  8GB RAM
  SUSE Linux Enterprise Server 10 (x86_64)
  Mathematica 5.2.0
  BenchmarkReport[] Result = 3.17

I intend to run a long sequence of SVDs on big matrices, with many runs
requiring over 4GB of RAM and several hours to complete. With its high
benchmark score and 8GB of RAM, the server looks ideal for this task. But
I'm troubled by the performance comparison for the following SVD test
code, run from fresh kernels:

  m = Table[Random[], {3000}, {3000}];
  Timing[SingularValueDecomposition[m];]

  DESKTOP: 69 sec
  SERVER:  96 sec

The poorer performance on the server does not appear to be a simple
1-processor-vs-2 issue: the system performance monitors on both machines
show both cores running at 100% during the last 10-20 seconds of the SVD
test computation.

The system administrator for the server suggests that the problem may
stem from linear-algebra libraries that are poorly optimized for Linux64.
I'd really like to take advantage of all the hardware horsepower
available on the server. The only two fixes suggested by the sys admin
are:

1) Link Mathematica on the server to math libraries better optimized for
   Linux64. (But is this even a possibility?)

2) Install Windows XP64 on the server, and then install and run
   Mathematica in that environment. (A big hassle!)

Can anyone offer other suggestions or insights?

Regards,
Ron
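
P.S. For anyone who wants to reproduce the comparison, here is a sketch
of the test extended to separate CPU time from wall-clock time. It
assumes AbsoluteTiming is available (it was added in 5.2). On a dual-core
machine with a threaded BLAS, the CPU figure reported by Timing can
exceed the wall-clock figure by up to a factor of two, so a ratio near 1
on the server would suggest the decomposition is effectively running on
one core for most of the computation.

  (* Sketch only: Timing reports kernel CPU time, which with a threaded  *)
  (* BLAS may be summed over cores; AbsoluteTiming reports elapsed       *)
  (* wall-clock time. Assumes AbsoluteTiming exists (new in 5.2).        *)
  SeedRandom[1];  (* fixed seed for a reproducible test matrix *)
  m = Table[Random[], {3000}, {3000}];
  cpu  = Timing[SingularValueDecomposition[m];][[1]];
  wall = AbsoluteTiming[SingularValueDecomposition[m];][[1]];
  {cpu, wall, cpu/wall}  (* ratio near 2 => both cores busy for the whole run *)

Running the two measurements as separate calls roughly doubles the test
time, but keeps the CPU and wall-clock numbers independent of each other.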