MathGroup Archive 1999

Re: Simultaneous Forward and Reverse Polynomial Fits ?

  • To: mathgroup at smc.vnet.net
  • Subject: [mg20847] Re: [mg20817] Simultaneous Forward and Reverse Polynomial Fits ?
  • From: Daniel Lichtblau <danl at wolfram.com>
  • Date: Wed, 17 Nov 1999 03:41:06 -0500 (EST)
  • References: <199911142314.SAA02176@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Steve wrote:
> 
> Is there a way to fit a set of x,y pairs to two polynomials such that
> equations of the form
> 
> y=f(x)   (1)
> 
> AND
> 
>  x=f(y)   (2)
> 
> are returned ?
> 
>  I can do this in 2 separate steps by using the Fit function to get
> y=f(x) and then repeating this to get x=f(y), assuming a certain
> polynomial structure. But I am looking for a better way that could
> take into account the performance of both regressions simultaneously.
> 
> In other words, when I perform fit #1, Mathematica has no knowledge
> that, when a y1 value is computed from equation 1 using x1 as input
> and this same y1 is then provided to equation 2, x1 should be
> returned. In general I doubt the exact values can be expected to be
> recovered, but perhaps the difference between the x1's could be
> minimized. The only such minimization that takes place (with my
> method) is to the degree achieved by the individual data fits.
> 
> There seem to be 3 separate measures of regression performance
> operating here:
> 
> 1) how well does equation 1 match the data ?
> 
> 2) how well does equation 2 match the data ?
> 
> and
> 
> 3) how well does the output of equation 2 match the input of equation
> 1 when the output of equation 1 is used as input to equation 2 ?
> 
> I'm using Mathematica 3.0 and polynomials of degree greater than 8.
> 
> Thanks for any help.
> 
> Steve
> For any e-mail responses, please remove *NOSPAM* from my address.

Perhaps what you want to do really needs to be quantified according to
something you want to minimize. Here is a possibility.

Say you want to approximate 

y = f[x] = a_m*x^m + ... + a_1*x + a_0

and similarly

x = g[y] = b_n*y^n + ... + b_1*y + b_0

Step 1: Do as you have done to obtain linear least-squares fits for f[x]
and g[y] independently. These provide initial guesses for
a_0,...,a_m,b_0,...,b_n.
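
A minimal sketch of Step 1, assuming the data sit in a variable called
data as a list of {x, y} pairs, and taking m = n = 3 purely for
illustration (the names xyData, fFit, a0 and so on are just illustrative,
not from any package):

  m = 3; n = 3;                        (* degrees, for illustration *)
  xyData = data;                       (* {{x1, y1}, {x2, y2}, ...} *)
  yxData = Reverse /@ data;            (* {{y1, x1}, {y2, x2}, ...} *)
  fFit = Fit[xyData, Table[x^k, {k, 0, m}], x];  (* initial y = f[x] *)
  gFit = Fit[yxData, Table[y^k, {k, 0, n}], y];  (* initial x = g[y] *)
  a0 = CoefficientList[fFit, x];       (* initial a_0, ..., a_m *)
  b0 = CoefficientList[gFit, y];       (* initial b_0, ..., b_n *)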

Step 2: Use these initial guesses for a nonlinear minimization. The
objective function will be a sum of squares formed by

(y1 - (a_m*(b_n*y1^n+...+b_1*y1+b_0)^m + ... +
	a_1*(b_n*y1^n+...+b_1*y1+b_0) + a_0) )^2 + ... +
(x1 - (b_n*(a_m*x1^m+...+a_1*x1+a_0)^n + ...) )^2 + ...

In other words, find {a_0,...,a_m,b_0,...,b_n} that minimize

Sum[(yj-f[g[yj]])^2, {j,t}] + Sum[(xj-g[f[xj]])^2, {j,t}]

where t is the number of data points.
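
Continuing the sketch above (again just an illustration; the coefficient
names a[i], b[i] and the symbol objective are illustrative choices),
Step 2 could be run with FindMinimum, seeded with the coefficients from
Step 1:

  f[x_] := Sum[a[i] x^i, {i, 0, m}]    (* f with unknown coefficients *)
  g[y_] := Sum[b[i] y^i, {i, 0, n}]    (* g with unknown coefficients *)
  objective = Sum[(data[[j, 2]] - f[g[data[[j, 2]]]])^2 +
                  (data[[j, 1]] - g[f[data[[j, 1]]]])^2,
                  {j, Length[data]}];
  startingPoints = Join[
      Table[{a[i], a0[[i + 1]]}, {i, 0, m}],
      Table[{b[i], b0[[i + 1]]}, {i, 0, n}]];
  FindMinimum @@ Join[{objective}, startingPoints]

The result should be the minimal value of the objective together with
rules for the refined a[i] and b[i]. Note that f[g[y]] has degree m*n,
so for high degrees the symbolic objective gets large.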

A couple of remarks: it is best that t be a lot larger than Max[m,n]
(I believe t > Max[m,n]^2 is recommended). Also, you may get better
numerical results by fitting rational functions rather than polynomials.


Daniel Lichtblau
Wolfram Research

