MathGroup Archive 2009


Re: linear regression with errors in both variables

  • To: mathgroup at smc.vnet.net
  • Subject: [mg96285] Re: linear regression with errors in both variables
  • From: dh <dh at metrohm.com>
  • Date: Wed, 11 Feb 2009 05:17:06 -0500 (EST)
  • References: <gmrmga$a1k$1@smc.vnet.net>


Hi Joerg,

a least-squares procedure minimizes Sum_i (yreg_i - y_i)^2, where y_i is a
measured value and yreg_i is the value of the regression line at x_i. In
your case you want to minimize the sum of the squared distances
perpendicular to the line. The perpendicular distance from a point
{x, y} to the line a + b x is Abs[b x + a - y]/Sqrt[1 + b^2].

Let's denote the line by a + b x and assume that the data is in d; then
we get the sum of squares by:

res[a_, b_] =
  1/(1 + b^2) Plus @@ (((b #[[1]] + a - #[[2]])^2) & /@ d)

We then minimize this expression over a and b:

sol = Minimize[res[a, b], {a, b}]
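
For a self-contained check, here is a minimal runnable sketch. The data d
below is made up for illustration (points scattered about y = 1 + 2 x),
and NMinimize is used in place of Minimize since the sample data is
machine precision:

(* hypothetical sample data with noise in both coordinates *)
d = {{0.1, 1.3}, {0.9, 2.8}, {2.1, 5.3}, {3.0, 6.9}, {3.9, 9.2}};

(* sum of squared perpendicular distances to the line a + b x *)
res[a_, b_] = 1/(1 + b^2) Plus @@ (((b #[[1]] + a - #[[2]])^2) & /@ d);

(* minimize over intercept a and slope b *)
sol = NMinimize[res[a, b], {a, b}]

For this data the fitted a should come out close to 1 and b close to 2.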



hope this helps, Daniel



Joerg wrote:
> Hi,
> 
> I want to test the hypothesis that my data
> follows a known simple linear relationship,
> y = a + bx. However, I have (known) measurement
> errors in both the y and the x values.
> 
> I suppose just a simple linear regression
> does not do here.
> 
> Any suggestions on how to test this correctly?
> 
> Thanks,
> 
> joerg



