Re: Parametric curve fitting
- To: mathgroup at smc.vnet.net
- Subject: [mg18601] Re: [mg18568] Parametric curve fitting
- From: "Mark E. Harder" <harderm at ucs.orst.edu>
- Date: Tue, 13 Jul 1999 01:01:30 -0400
- Sender: owner-wri-mathgroup at wolfram.com
Virgil;
Just a suggestion; I don't have time to demonstrate this idea now,
but it is similar to data analyses I do all the time, so I'll describe it
in words.
Instead of treating your sampled time series as functions with noise
and searching them pairwise for correlations, treat them as a set of m
vectors in Real N-space (assuming that the entries are Real numbers).
Place the N-vectors as columns in a matrix, call it A. Then the correlation
between the i-th and j-th columns of A is proportional to the i,j-th element
of the correlation matrix Transpose[A].A.
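For instance, a rough sketch with made-up data (the series x1, x2, x3
and the names A and corr are mine, purely for illustration):

  (* each series is a list of N sampled values; here m = 3, N = 100 *)
  x1 = Table[Sin[t/10.] + 0.1 Random[], {t, 100}];
  x2 = Table[2. Sin[t/10.] + 0.1 Random[], {t, 100}];
  x3 = Table[Cos[t/7.] + 0.1 Random[], {t, 100}];
  A = Transpose[{x1, x2, x3}];   (* N x m, one series per column *)
  corr = Transpose[A].A          (* m x m correlation matrix *)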
correlations, we need to filter noise out of this matrix, which I believe
can be done using Mathematica's SingularValueDecomposition function. The
SVD of matrix A returned as the product of 3 matrices, i.e.
A=U.S.Transpose[V]. U and V are orthonormal matrices. S is a diagonal
matrix with its largest elements listed first, and the columns of U and the
rows of V are listed in order of decreasing significance in approximating A.
See a good book on applied linear algebra for the algebraic and statistical
details, eg. Noble & Daniel. Now, the correlation matrix, Transpose[A].A=
V.[S^2] .Transpose[V]. (The columns of V are eigenvectors and the elements
of S^2 the corresponding eigenvalues of the correlation matrix. ) To
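Continuing the sketch above (this assumes a Mathematica version in which
SingularValueDecomposition returns {U, S, V} with A == U.S.Transpose[V];
see the NB below if your version returns the factors transposed):

  {u, s, v} = SingularValueDecomposition[N[A]];
  (* columns of v are the eigenvectors of the correlation matrix, and
     Transpose[s].s carries the eigenvalues (squared singular values) *)
  Chop[corr - v.Transpose[s].s.Transpose[v]]   (* should be all zeros *)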
To eliminate noise, examine the elements of S, looking for a sharp
drop-off beyond which adding more vectors to the expansion only
reconstructs noise; then drop all elements of S^2 and the corresponding
columns of V beyond that point, and reconstruct the correlation matrix
from only the significant elements that remain (see the sketch below).
The correlations in the filtered correlation matrix should now be mostly
noise-free, and therefore visible.
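A rough sketch of that truncation, continuing from the decomposition
above (the cutoff k is something you would choose by eye from the
singular values, not something fixed by this example):

  ListPlot[SingularValueList[N[A]]]    (* look for the sharp drop-off *)
  k = 2;                               (* keep the k largest singular values *)
  vK = v[[All, Range[k]]];             (* corresponding columns of V *)
  s2K = Take[Transpose[s].s, k, k];    (* top-left k x k block of S^2 *)
  corrFiltered = vK.s2K.Transpose[vK]  (* noise-filtered correlation matrix *)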
NB. Mathematica returns U and V as their transposes!
If you have any questions, e-mail me and I will try to answer.
-mark
-----Original Message-----
From: Virgil Stokes <virgil.stokes at neuro.ki.se>
To: mathgroup at smc.vnet.net
Subject: [mg18601] [mg18568] Parametric curve fitting
>Suppose that I have a set of experimental time series data, say
>
> x1(t) , t = 1,2,...N
> x2(t), t = 1,2,...N
> x3(t), t = 1,2,...N
> .
> .
> .
> xm(t), t = 1,2,...N
>
>where each represents a sequence of sampled values (containing
>measurement error). I would like to see what pairs (if any)
>are linearly related. That is, if I plotted x1(t) vs x2(t) and it
>appeared that it could be approximated by a straight line, then
>one might assume that these could be linearly related. However, this
>is not a simple regression problem, since both of the measured
>time series which we are trying to fit contain measurement errors,
>and thus one of the basic assumptions of ordinary regression is violated.
>
>Are there any Mathematica functions available that could be used
>for this type of time series fitting problem?
>
>-- Virgil