MathGroup Archive 2012


Re: Importing large file into table and calculating takes a long time. How to improve efficiency?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg126338] Re: Importing large file into table and calculating takes a long time. How to improve efficiency?
  • From: j p <ipschka at yahoo.com>
  • Date: Wed, 2 May 2012 05:45:06 -0400 (EDT)
  • Delivered-to: l-mathgroup@mail-archive0.wolfram.com
  • References: <201205010924.FAA05249@smc.vnet.net> <CAEtRDSdh3peZUoq9agRw_T74Z262s-ZXJF3uMnpujC5K7tybHg@mail.gmail.com>
  • Reply-to: j p <ipschka at yahoo.com>

Thanks, Bob, for your suggestions. I certainly like the Map and Norm functions, and I knew there had to be a better way to convert from hex to decimal.
You were also right about dataX, dataY, and dataZ not being defined; I had left those definitions out of the code in my original question by mistake.


But when I use the line

Map[FromDigits[#, 16] &, dataHex, {2}]

I get the error message:


FromDigits::nlst: The expression 0.` is not a list of digits or a string of valid digits.
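FromDigits expects its first argument to be a string of digits (or a list of digits), so the real number 0. that Import produced triggers this message. A quick check illustrates the difference:

```mathematica
FromDigits["009d", 16]  (* a valid hex string: returns 157 *)
FromDigits[0., 16]      (* a real number: emits FromDigits::nlst, as above *)
```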

The input data may look something like this:

{{0., 24, "009d"}, {0., 28, 9}, {0., 28, 99}, {"00dc", 27, 98}, {0., 29, 95},

where a zero is imported as the real number 0. (hence the decimal point) rather than as a hex string. So I added the following to the import function to replace all occurrences of "0.":
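(The replacement code itself is not shown in the archived message.) A minimal sketch of one way to handle the mixed entries; dataHex and toDecimal are illustrative names, not from the original post:

```mathematica
(* Sample data in the shape described above *)
dataHex = {{0., 24, "009d"}, {0., 28, 9}, {"00dc", 27, 98}};

(* Replacing the stray reals, as the post describes: *)
cleaned = dataHex /. 0. -> "0";

(* Bare integers such as 24 would still trigger FromDigits::nlst,
   so a pattern-based converter is more robust: *)
toDecimal[s_String] := FromDigits[s, 16];
toDecimal[n_Integer] := n;  (* assumes integer entries already hold the intended values *)
toDecimal[r_Real] := 0;     (* the stray 0. entries stand for zero *)

Map[toDecimal, dataHex, {2}]
(* -> {{0, 24, 157}, {0, 28, 9}, {220, 27, 98}} *)
```

Note the assumption in toDecimal[n_Integer]: if numeric-looking hex fields (e.g. "24") were parsed by Import as decimal integers, they would instead need FromDigits[IntegerString[n], 16] to be reinterpreted as hex.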

