MathGroup Archive 2009


Re: linear regression with errors in both variables

  • To: mathgroup at smc.vnet.net
  • Subject: [mg96384] Re: linear regression with errors in both variables
  • From: Bill Rowe <readnews at sbcglobal.net>
  • Date: Thu, 12 Feb 2009 06:41:26 -0500 (EST)

Joerg wrote:

>I want to test the hypothesis that my data follows a known simple
>linear relationship, y = a + bx. However, I have (known)
>measurement errors in both the y and the x values.

>I suppose just a simple linear regression does not do here.

>Any suggestions how do test this correctly?

You've not made it clear what your end purpose is. The standard
model assumed for simple linear regression is

y = a + b x + error

where the error term is assumed to be drawn from a normal
distribution. A simple regression may still give you a good
enough result, even though it is no longer the maximum
likelihood estimate when there is error in x.
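
For example, an ordinary least-squares fit looks like this (the
data here are invented purely for illustration):

  (* ordinary least squares, ignoring any error in x *)
  data = {{0.9, 2.1}, {2.1, 3.9}, {2.9, 6.2}, {4.2, 7.8}, {5.1, 10.1}};
  lm = LinearModelFit[data, x, x];
  Normal[lm]            (* the fitted a + b x *)
  lm["ParameterTable"]  (* estimates with standard errors *)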

You say the error is known. If by this you mean you have a
definite error value for each measurement, then the obvious
thing to do is subtract those errors from the measurements
before doing the regression.
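
As a minimal sketch of that correction (dx and dy below are
invented values standing in for your known per-point errors):

  (* subtract known, definite errors before fitting *)
  xObs = {1.0, 2.2, 3.1, 4.0};  dx = {0.1, -0.1, 0.2, 0.0};
  yObs = {2.9, 5.2, 7.0, 9.1};  dy = {-0.2, 0.1, 0.0, 0.1};
  corrected = Transpose[{xObs - dx, yObs - dy}];
  LinearModelFit[corrected, x, x]["BestFitParameters"]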

If the error is random without specific values and you truly
need maximum likelihood estimates for the regression parameters,
then you will need to create your own functions. Mathematical
details of dealing with errors in variables can be found at
<http://en.wikipedia.org/wiki/Total_least_squares>.
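
If you want to experiment before writing a full maximum
likelihood routine, here is a minimal total least squares
(orthogonal regression) sketch. It assumes the x and y errors
have equal variance; with known but unequal error scales, divide
each coordinate by its error standard deviation first:

  (* total least squares for y = a + b x via the singular value
     decomposition of the centered data matrix *)
  tlsFit[data_] := Module[{mx, my, centered, v, b, a},
    {mx, my} = Mean[data];
    centered = Map[# - {mx, my} &, data];
    (* right singular vector for the smallest singular value *)
    v = Last[Transpose[SingularValueDecomposition[N[centered]][[3]]]];
    b = -v[[1]]/v[[2]];  (* slope; undefined only for a vertical line *)
    a = my - b mx;       (* intercept *)
    {a, b}]

  tlsFit[{{0.9, 2.1}, {2.1, 3.9}, {2.9, 6.2}, {4.2, 7.8}, {5.1, 10.1}}]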


