Re: How to numerically estimate an asymptotic equivalent?
*To*: mathgroup at smc.vnet.net
*Subject*: [mg46368] Re: How to numerically estimate an asymptotic equivalent?
*From*: Franck Michel <Franck.Michel at wanadoo.fr>
*Date*: Mon, 16 Feb 2004 23:41:53 -0500 (EST)
*References*: <c0qjin$kai$1@smc.vnet.net>
*Sender*: owner-wri-mathgroup at wolfram.com
Thank you for your answer.
"Bill Rowe" <readnewsciv at earthlink.net> wrote:
> The basic problem is you are applying statistical methods and concepts to
a non-statistical problem.
>
> For the usual linear regression problem, you are trying to find a
relationship between a measured response and some set of predictors. The
measured response is assumed to be the sum of the true response and an error
term. The error term is assumed to come from a normal distribution with mean
= zero. The confidence intervals Regress computes are based on this
assumption about errors.
>
> But you don't have errors in the same sense in your problem. You know for
several values of n the precise value of the sequence since you have
computed it. There is no error term. Consequently, you cannot expect Regress
to produce meaningful confidence limits. In fact, the normal meaning of
confidence limits doesn't apply to your problem.
Yes, you're right. In fact, I had thought about that, but I was wondering if the
information provided by Regress could be used to give some insight into
the quality of the estimates, even if the exact values of the standard
errors are not meaningful.
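For reference, the kind of call I have been using looks roughly like this (a sketch only; the data and the basis functions here are placeholders, not my actual sequence):

```
(* sketch: load the standard linear-regression package
   (Mathematica 5) and fit a placeholder sequence, Log[n!],
   against placeholder basis functions; ParameterTable shows
   the estimates and their (here not meaningful) SEs *)
Needs["Statistics`LinearRegression`"]
data = Table[{n, Log[n!] // N}, {n, 10, 50}];
Regress[data, {1, n Log[n], n, Log[n]}, n,
  RegressionReport -> {ParameterTable, RSquared}]
```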
> Rather than confidence limits, I think you should be more interested in
the maximum difference between the computed sequence values and the
regression model. If that difference is suitably small, then the estimated
parameters (and model) are sufficient. Note, since you are looking for the
model that is asymptotically correct, it follows the difference between the
computed sequence values and the model should decrease as n gets bigger. So,
I would plot the residuals to see if this is the case.
>
> Also, this points out another difference between the usual regression
problem and the problem you are trying to solve. In the normal regression
problem you expect the residuals to be independent of the predictor
variables. This should not be the case for your problem.
Yes, I've plotted the residuals; you can see them at
http://www.medicis.polytechnique.fr/~fmichel/figure.gif
The shape of this figure is independent of the range of n, and independent
of the function being estimated (I have always obtained the same shape in all
my trials, in particular with the divergent expansions coming from classical
special functions like erf or gamma).
There is no real decrease in the values of the residuals, but the estimated
function is rapidly increasing. If we consider the plot of the relative
values of the residuals
http://www.medicis.polytechnique.fr/~fmichel/figure1.gif
there is a small decrease.
I don't know how to explain the shape of the figure (like a camel's back):
it looks like the graph of a degree-4 polynomial.
With only three parameters in the estimation (s, ln(A) and beta), the plot of
the residuals looks like a cubic curve; with only s and ln(A), it is a
parabola.
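One quick way to check this "camel's back" impression is to fit a low-degree polynomial to the residuals themselves. A sketch, assuming the residuals have been collected as {n, residual} pairs in a list res (a name I am making up here):

```
(* sketch: res is assumed to be a list of {n, residual} pairs;
   fit a degree-4 polynomial and overlay it on the residuals *)
shape = Fit[res, {1, n, n^2, n^3, n^4}, n];
Show[ListPlot[res],
  Plot[shape, {n, Min[First /@ res], Max[First /@ res]}]]
```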
> >Are there other possible methods to numerically estimate an asymptotic
> >equivalent like this one?
>
> There are a variety of techniques. I would probably use NMinimize to
minimize the sum of the absolute differences rather than Regress.
Thank you for this suggestion of NMinimize. I'll try it and keep you
informed.
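A first attempt along these lines might look like the following sketch (the data and the model form are placeholders for my actual sequence; Log[n!] stands in for the computed values, and the parameter names echo my s, ln(A) and beta):

```
(* sketch: minimize the sum of absolute differences between a
   placeholder model for log(a_n) and placeholder computed
   values; the model form and the data are assumptions *)
data = Table[{n, Log[n!] // N}, {n, 10, 50}];
model[n_] := lnA + s n Log[n] + beta Log[n] + c n;
NMinimize[
  Total[Abs[model[#1] - #2] & @@@ data],
  {lnA, s, beta, c}]
```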
All other suggestions are welcome. I'm not a Mathematica specialist; in
fact, I started to use this CAS because I noticed on the web that the
function Regress might be useful for my problem.
--
Franck