Re: Need speed partitioning WeatherData
- To: mathgroup at smc.vnet.net
- Subject: [mg112525] Re: Need speed partitioning WeatherData
- From: "Hans Michel" <hmichel at cox.net>
- Date: Sat, 18 Sep 2010 07:27:10 -0400 (EDT)
Try
Map (or Apply) AbsoluteTime onto your list:
In[1]:= AbsoluteTime[{1990, 10, 1, 0, 0, 0}]
Out[1]= 2863728000
Then use any of the grouping, selection, and partitioning functions available in
Mathematica. Since the dates are now plain numbers, you can even use subtraction.
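A minimal sketch of that approach (the sample dates and the one-day threshold are my own illustration, not from Pedro's data):

```mathematica
(* Convert each date list to seconds since the epoch in one cheap pass. *)
dates = {{1990, 10, 1, 0, 0, 0}, {1990, 10, 1, 6, 0, 0},
         {1990, 10, 2, 0, 0, 0}, {1999, 10, 1, 0, 0, 0}};
secs = AbsoluteTime /@ dates;

(* Gaps between consecutive samples, in seconds, via plain subtraction. *)
gaps = Differences[secs];

(* e.g. group consecutive samples that are no more than one day apart: *)
Split[secs, #2 - #1 <= 86400 &]
```

Split here breaks the list wherever the gap between neighbors exceeds 86400 seconds, which replaces the per-pair DateDifference calls entirely.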
Hans
-----Original Message-----
From: P. Fonseca [mailto:public at fonseca.info]
Sent: Friday, September 17, 2010 5:41 AM
To: mathgroup at smc.vnet.net
Subject: [mg112525] [mg112488] Need speed partitioning WeatherData
Hi,
Meanwhile I figured out that the problem comes from the time
DateDifference takes.
One process of DateDifference costs 0.0045 seconds on my computer (ex.
DateDifference[{1990, 10, 1, 0, 0, 0}, {1999, 10, 1, 0, 0, 0}]).
In all the different algorithms I implemented, each one of the 500 000
samples is "DateDifferenced" against around 3 of its neighbors. This
means:
3 * 500 000 * 0.0045 = 6750 seconds !!!!!
I can now imagine a way of partitioning the data without applying so
many times the DateDifference function: one first run to register the
differences between every consecutive pair, and then I just work with
these difference values.
Nevertheless, this still means 500 000*0.0045=2250 seconds!
Should I create my own DateDifference function (fewer options -> faster)?
Does someone have a simpler solution for a 100x speed-up? (I have
already dropped the illusion of a couple of seconds...)
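For reference, a stripped-down date difference along the lines Pedro asks about can be built on AbsoluteTime; the name fastDateDifference is hypothetical, and it assumes full {y, m, d, h, m, s} date lists and whole-second resolution:

```mathematica
(* Hypothetical minimal replacement for DateDifference, returning days.
   86400 = seconds per day; the trailing dot forces a machine real. *)
fastDateDifference[d1_List, d2_List] :=
  (AbsoluteTime[d2] - AbsoluteTime[d1])/86400.

fastDateDifference[{1990, 10, 1, 0, 0, 0}, {1999, 10, 1, 0, 0, 0}]
(* 3287. days, matching DateDifference for the same pair *)
```

Better still, convert the whole sample list with AbsoluteTime once and take Differences, so each date is converted a single time instead of once per neighboring pair.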
Thank you in advance,
Pedro