Re: convolution vs. NMinimize
- To: mathgroup at smc.vnet.net
- Subject: [mg52445] Re: convolution vs. NMinimize
- From: Bill Rowe <readnewsciv at earthlink.net>
- Date: Sat, 27 Nov 2004 01:40:56 -0500 (EST)
- Sender: owner-wri-mathgroup at wolfram.com
On 11/26/04 at 1:04 AM, lupos at cheerful.com (robert) wrote:
>especially (but not only) with sample lengths which are powers of 2
>convolution can be done efficiently by FFT (Mathematica Fourier[])
>the "convolution theorem" states that to perform convolution(f,g)
>you may take the Fourier transforms F=Fourier[f] and
>G=Fourier[g] and get the convolution by calculating
>InverseFourier[F*G]
>for a sample length n
>Fourier[] calculates in n*Log[n] time (because of FFT)
>ListConvolve[] calculates in n*n time
>the difference is enormous for large n.
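The quoted convolution theorem can be checked directly with a short, self-contained sketch. This is Python rather than Mathematica (so it can stand alone outside the kernel); it assumes circular convolution of equal, power-of-2 length sequences, and the names `fft`, `fft_convolve`, and `direct_circular_convolve` are illustrative helpers, not builtins:

```python
import cmath

def fft(a, invert=False):
    # Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of 2.
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def fft_convolve(f, g):
    # Convolution theorem: InverseFourier[Fourier[f] * Fourier[g]].
    n = len(f)
    F = fft([complex(x) for x in f])
    G = fft([complex(x) for x in g])
    inv = fft([a * b for a, b in zip(F, G)], invert=True)
    return [x.real / n for x in inv]  # inverse transform carries a 1/n factor

def direct_circular_convolve(f, g):
    # O(n^2) textbook definition, used only as a reference for comparison.
    n = len(f)
    return [sum(f[j] * g[(k - j) % n] for j in range(n)) for k in range(n)]

f = [1.0, 2.0, 3.0, 4.0]
g = [0.5, -1.0, 0.0, 2.0]
fast = fft_convolve(f, g)
slow = direct_circular_convolve(f, g)
print(all(abs(x - y) < 1e-9 for x, y in zip(fast, slow)))  # → True
```

The two FFTs and the inverse each cost O(n log n) while the pointwise product is O(n), which is where the claimed n*Log[n] total comes from.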
I am pretty sure ListConvolve makes use of the fast Fourier transform. In fact, a quick check:
data=Table[Random[],{65536}];
Timing[ListConvolve[{1,-1},data];]
{0.04 Second,Null}
data=Table[Random[],{2^20}];
Timing[ListConvolve[{1,-1},data];]
{0.8 Second,Null}
2^20 Log[2^20]/(2^16 Log[2^16])//N
20.
indicates the time for ListConvolve is proportional to n Log[n].
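The arithmetic behind that comparison is easy to redo. A minimal check (in Python rather than Mathematica) of the predicted cost ratio for an n log n algorithm going from n = 2^16 to n = 2^20:

```python
import math

# (2^20 * log 2^20) / (2^16 * log 2^16) = 2^4 * (20/16) = 20,
# matching the measured timing ratio 0.8 s / 0.04 s = 20 above.
ratio = (2**20 * math.log(2**20)) / (2**16 * math.log(2**16))
print(round(ratio, 9))  # → 20.0
```

By contrast, a pure O(n^2) method would predict a ratio of (2^20/2^16)^2 = 256, far from what was measured.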
--
To reply via email subtract one hundred and four