MathGroup Archive 2012


Re: Surface Smoothing

  • To: mathgroup at
  • Subject: [mg127638] Re: Surface Smoothing
  • From: "Nicholas Kormanik" <nkormanik at>
  • Date: Thu, 9 Aug 2012 03:53:03 -0400 (EDT)
  • References: <jvo0fu$cfq$> <>
  • Reply-to: <nkormanik at>

Good point, Kevin.

The spiky behavior does make things rather scary.

By incorporating additional factors (beyond the two in the contour map) I
hope to minimize the bad spikes and maximize the good ones.  For now,
though, I'd settle for finding a relatively decent "neighborhood," should
one actually exist - i.e., the "sweet spot."

An analogy: a dangerous minefield.  If I absolutely have to walk through
it, I'd like to find the path with the lowest probability of being blown
up.

Nicholas Kormanik

-----Original Message-----
From: Kevin J. McCann [mailto:kjm at] 
Sent: Tuesday, August 07, 2012 9:16 AM
To: nkormanik at
Subject: [mg127638] Re: Surface Smoothing


Any smoothing implicitly assumes that you "know" what the data should look
like. So I assume you know that the spiky behavior is not "correct".
Given that, how about a least-squares (LSQ) fit to some satisfactorily
smooth function, e.g., a 2D polynomial or a truncated Fourier series?
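In Mathematica itself, the suggestion amounts to something like
Fit[data, {1, x, y, x^2, x y, y^2}, {x, y}]. To make the idea concrete
outside Mathematica, here is a minimal pure-Python sketch of the same
least-squares fit of a quadratic surface to (x, y, z) samples, solving the
normal equations directly; the function names (fit_quadratic_surface, etc.)
are my own for illustration, not from the thread.

```python
def basis(x, y):
    """Quadratic basis functions evaluated at (x, y)."""
    return [1.0, x, y, x * x, x * y, y * y]

def fit_quadratic_surface(points):
    """points: iterable of (x, y, z) samples.
    Returns the 6 coefficients of the least-squares quadratic surface."""
    n = 6
    # Accumulate the normal equations A^T A c = A^T z.
    ata = [[0.0] * n for _ in range(n)]
    atz = [0.0] * n
    for x, y, z in points:
        b = basis(x, y)
        for i in range(n):
            atz[i] += b[i] * z
            for j in range(n):
                ata[i][j] += b[i] * b[j]
    # Solve by Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atz[col], atz[piv] = atz[piv], atz[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atz[r] -= f * atz[col]
    # Back substitution.
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        s = sum(ata[i][j] * coeffs[j] for j in range(i + 1, n))
        coeffs[i] = (atz[i] - s) / ata[i][i]
    return coeffs

def evaluate(coeffs, x, y):
    """Evaluate the fitted smooth surface at (x, y)."""
    return sum(c * b for c, b in zip(coeffs, basis(x, y)))
```

Evaluating the fitted surface on a grid gives the smoothed "terrain" in
which to look for a safe neighborhood, with the spikes averaged away.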


