MathGroup Archive 2003


Re: WeibullDistribution

  • To: mathgroup at smc.vnet.net
  • Subject: [mg42671] Re: WeibullDistribution
  • From: Bill Rowe <listuser at earthlink.net>
  • Date: Sat, 19 Jul 2003 03:20:07 -0400 (EDT)
  • Sender: owner-wri-mathgroup at wolfram.com

On 7/18/03 at 5:25 AM, robert.nowak at ims.co.at (Robert Nowak) wrote:


> > The primary disadvantage of using NonlinearFit is the difficulty in
> > finding the true least squares fit, i.e., the set of parameters
> > that makes the summed square error globally minimal. It is often
> > the case that there are several local minima, and it is easy for the
> > non-linear algorithm to get trapped in one of them. And the real
> > difficulty is that there is no simple way of determining when this
> > happens.

> could it be that problems which are transformable into equivalent
> linear problems (such as in the Weibull case) don't encounter the
> problem of more than one local minimum?

I don't know whether this is true in general, and I doubt that it is. Even if it were, it would probably hold only in the ideal situation, i.e., when you know the true probability of getting the data set you have. In practice, measurement noise and other issues produce some small non-zero offset from the true values, and I am sure this is enough to create multiple minima for all but the simplest models. At least, my experience with nonlinear regression to find Weibull parameters certainly indicates there are multiple minima when using real data.
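To make the linearization idea concrete, here is a minimal sketch (in Python, since the thread's Mathematica code is not shown) of the transformed-problem fit being discussed: taking logs of the Weibull CDF, 1 - exp(-(t/eta)^beta), gives ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta), which is linear in ln(t), so ordinary least squares has a single, closed-form minimum. The function name and the use of Benard's median-rank approximation for the empirical CDF are illustrative choices, not anything from the thread:

```python
import math
import random

def weibull_fit_linearized(data):
    """Estimate Weibull shape (beta) and scale (eta) by ordinary least
    squares on the linearized CDF:
        ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)
    OLS on a line has a unique minimum, so no local-minima trapping."""
    t = sorted(data)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        # Benard's median-rank approximation for the empirical CDF
        F = (i - 0.3) / (n + 0.4)
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - F)))
    # closed-form simple linear regression y = a*x + b
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    a = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    b = ybar - a * xbar
    shape = a                      # beta is the slope
    scale = math.exp(-b / a)       # eta recovered from the intercept
    return shape, scale

random.seed(1)
# synthetic data: scale eta = 2.0, shape beta = 1.5
sample = [random.weibullvariate(2.0, 1.5) for _ in range(500)]
shape, scale = weibull_fit_linearized(sample)
```

As the discussion notes, the transform biases the fit (errors are no longer homoscedastic in the transformed coordinates), but with real measurement noise that bias is typically smaller than the parameter uncertainty itself.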

> > From a practical standpoint, no, the linear fit to the transformed
> > problem is good as is, without adjustments. Generally, the
> > uncertainty in the fitted parameters is larger than the bias,
> > particularly when attempting to find parameters for a given
> > distribution.

> oh yes, but now the quest is for the "BEST" method

I can see some point in finding faster, more computationally efficient methods, or methods that reduce the uncertainty in the final estimates. But I see no purpose in doing additional work to get a more theoretically correct answer when it differs from the quicker method's answer by less than the uncertainty in the estimate.

