Approximate entropy applied to the Pi digits

*To*: mathgroup at smc.vnet.net
*Subject*: [mg52359] Approximate entropy applied to the Pi digits
*From*: Roger Bagula <tftn at earthlink.net>
*Date*: Wed, 24 Nov 2004 02:32:32 -0500 (EST)
*Reply-to*: tftn at earthlink.net
*Sender*: owner-wri-mathgroup at wolfram.com

In Ivars Peterson's MathTrek column "Randomness, Risk, and Financial Markets" (http://www.maa.org/mathland/mathtrek_10_11_04.html) I found the reference to Steven M. Pincus's approximate entropy. I applied it to Hofstadter's sequence, the last digits of the primes, and the digits of Pi, and the approximate entropy came out larger in that order.

My program is really slow, but it does seem to give the ApEn function as defined in the paper. The way I've calculated it is a lot like a largest Lyapunov exponent, but it is more a probability measure on the variables than a direct result of the variables. It is also harder to compute and takes longer than a Lyapunov exponent, since it involves two distinct sums. In its method of calculation it is more closely related to the correlation dimension than to the Kaplan-Yorke dimension.

```mathematica
Clear[f, n, d, c, Phi, ApEn, a, i, j, k, r, m, g, digits]
(* Steven M. Pincus, "Approximate entropy as a measure of system
   complexity", PNAS, vol. 88, pp. 2297-2301, March 1991 *)
digits = 100
$MaxExtraPrecision = digits
f[n_] := Floor[Mod[10^n*Pi, 10]]  (* n-th decimal digit of Pi *)

(* approximate entropy for the Pi digit sequence *)
(* Chebyshev distance between the length-m windows starting at i and j *)
d[i_, j_, m_, n_] := Max[Table[Abs[f[i + k - 1] - f[j + k - 1]], {k, 1, m}]]
(* fraction of the n - m + 1 windows within r of window i,
   per Pincus's definition *)
c[i_, r_, m_, n_] :=
  N[Sum[If[d[i, j, m, n] <= r, 1, 0], {j, 1, n - m + 1}]]/(n - m + 1)
Phi[r_, m_, n_] := Sum[N[Log[c[i, r, m, n]]], {i, 1, n - m + 1}]/(n - m + 1)
ApEn[m_, r_, n_] := Phi[r, m, n] - Phi[r, m + 1, n]

a = Table[ApEn[2, 0.18, k], {k, 3, digits}]
ga = ListPlot[a, PlotJoined -> True]
y = Fit[a, {1, x}, x]
gb = Plot[y, {x, 1, digits}]
Show[{ga, gb}]
```

Respectfully,
Roger L. Bagula
tftn at earthlink.net
11759 Waterhill Road, Lakeside, Ca 92040-2905
tel: 619-561-0814
alternative email: rlbtftn at netscape.net
URL: http://home.earthlink.net/~tftn
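For readers without Mathematica, the same ApEn(m, r, N) computation can be sketched in Python. This is a re-implementation of Pincus's definitions (Phi as the average log of the self-match-inclusive correlation counts), not the author's original program; the function name and the use of a short hard-coded digit list are my own choices for illustration.

```python
import math


def ap_en(u, m, r):
    """Approximate entropy ApEn(m, r) of the sequence u (Pincus, PNAS 1991)."""
    n = len(u)

    def phi(m):
        # All length-m windows of u; there are n - m + 1 of them.
        windows = [u[i:i + m] for i in range(n - m + 1)]
        # For each window, count windows within Chebyshev distance r
        # (self-matches included, as in Pincus's definition).
        counts = [
            sum(1 for w in windows
                if max(abs(x - y) for x, y in zip(v, w)) <= r)
            for v in windows
        ]
        # Phi^m(r): average of log C_i^m(r).
        return sum(math.log(c / len(windows)) for c in counts) / len(windows)

    return phi(m) - phi(m + 1)


# First 20 decimal digits of Pi as a short test sequence.
pi_digits = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4]
print(ap_en(pi_digits, 2, 0.18))
```

With r = 0.18 on integer digit sequences, only exact window matches count, so the tolerance mainly matters for real-valued data.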