Re: Re: Problem with BinCounts
- To: mathgroup at smc.vnet.net
- Subject: [mg90950] Re: [mg90927] Re: Problem with BinCounts
- From: Darren Glosemeyer <darreng at wolfram.com>
- Date: Thu, 31 Jul 2008 02:57:16 -0400 (EDT)
- References: <g6mas3$i0d$1@smc.vnet.net> <200807300751.DAA17667@smc.vnet.net>
Valeri Astanoff wrote:
> On 29 Jul, 07:46, Mark Teagarden <Mark.Teagar... at UTSA.EDU> wrote:
>
>> Hi,
>>
>> I'm trying to do an autocorrelation analysis on an artificially generated
>> neuronal spike train, but BinCounts is behaving strangely.
>>
>> I first generate a list of spike times for an oscillator with a mean
>> frequency of 1 Hz, plus some Gaussian jitter:
>>
>> st = NestList[(1 + RandomReal[NormalDistribution[0, 0.1]] + # &), 0, 1000];
>>
>> Then I do the autocorrelation like this:
>>
>> BinCounts[st, {# - 2.00, # + 2.00, 0.01}] & /@ st;
>>
>> ac = Last[Accumulate[%]]/Length[%];
>>
>> The output of BinCounts ought to be a list the same length as st, with each
>> list element having a fixed length of 400. And yet I get error messages
>> like so:
>>
>> Thread::tdlen: Objects of unequal length in
>>   {13,11,15,9,12,13,14,13,9,6,<<390>>}+{0,1,0,0,0,0,0,0,0,0,<<389>>}
>>   cannot be combined. >>
>>
>> And sure enough, if I run Dimensions /@ the output of the BinCounts line, I
>> get one or two elements with a length of 399. This happens completely at
>> random, and I don't understand why. It seems to me that BinCounts ought to
>> give me a fixed output length, regardless of what the results of the
>> operation actually were. These shorter elements do not occur near the
>> beginning or end, either, so it's not an edge effect. Furthermore, the
>> values of st corresponding to the shorter elements of ac don't have any
>> unusual properties like being whole numbers or anything. I will add that
>> sometimes the code works correctly, and it's not obvious why.
>>
>> Has anyone else encountered such behavior before? What did you do about it?
>
> Good day,
>
> All I can say is that if you rationalize 'st'
> and all the parameters, the problem doesn't occur:
>
> st=NestList[(1+RandomReal[NormalDistribution[0,0.1]]+#&),
> 0,1000]//Rationalize[#,10^-6]&;
>
> BinCounts[st, {# - 2, # + 2, 1/100}] & /@ st;
>
>
> V.Astanoff
>
>
It's clear that this is a consequence of inexact arithmetic, but not yet
clear whether it is avoidable in general. Basically, a small amount of
numerical error is enough to cause a bin not to be included (effectively,
a step going slightly beyond the end point in the inexact representation,
which cannot happen with exact numbers) for some starting values.
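
For reference, the problem itself can be reproduced compactly with the
original moving-bin specification (a sketch that just combines the two lines
from the original post; since the failures are random, any given run may or
may not show the short elements):

(* sketch: the poster's moving-bin spec; with exact arithmetic every
   element would have length 400, but machine-real bin limits can lose
   a bin, giving the 399-length elements reported above *)
st = NestList[(1 + RandomReal[NormalDistribution[0, 0.1]] + # &), 0, 1000];
Union[Length /@ (BinCounts[st, {# - 2.00, # + 2.00, 0.01}] & /@ st)]
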
A way around this is to shift the data being binned rather than the bins
themselves, so every spike uses the same fixed bin specification, as we see
here in 100 trials:
In[1]:= Table[
          st = NestList[(1 + RandomReal[NormalDistribution[0, 0.1]] + # &), 0, 1000];
          bc2 = BinCounts[st - #, {-2.00, 2.00, 0.01}] & /@ st;
          Map[Length, bc2] // Union, {100}] // Union

Out[1]= {{400}}
Better still is to do this and also use an exact binning specification, which
removes numerical error from the bin edges entirely:
In[2]:= Table[
          st = NestList[(1 + RandomReal[NormalDistribution[0, 0.1]] + # &), 0, 1000];
          bc2 = BinCounts[st - #, {-2, 2, 1/100}] & /@ st;
          Map[Length, bc2] // Union, {100}] // Union

Out[2]= {{400}}
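
From there, the autocorrelation itself can be finished exactly as in the
original post (a sketch: ac reproduces the poster's
Last[Accumulate[%]]/Length[%] step, while the bin centers and the plot are
illustrative additions, not part of the original code):

(* elementwise sum of the per-spike histograms, normalized by the
   number of spikes; equivalent to Last[Accumulate[%]]/Length[%] *)
ac = Total[bc2]/Length[bc2];

(* exact centers of the 400 bins in the {-2, 2, 1/100} spec *)
centers = Range[-399, 399, 2]/200;

ListPlot[Transpose[{centers, ac}], PlotRange -> All]
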
Darren Glosemeyer
Wolfram Research
- References:
  - Re: Problem with BinCounts
    - From: Valeri Astanoff <astanoff@gmail.com>
  - Re: Problem with BinCounts