Difficult (?) 2-variable saddlepoint problem with sum of Gaussians
 To: mathgroup at smc.vnet.net
 Subject: [mg5757] Difficult (?) 2-variable saddlepoint problem with sum of Gaussians
 From: Michael Hucka <hucka at eecs.umich.edu>
 Date: Tue, 14 Jan 1997 10:42:32 -0500
 Organization: University of Michigan EECS, Ann Arbor, Mich., USA
 Sender: owner-wri-mathgroup at wolfram.com
I have the following problem with which I hope someone can lend me a hand.
In three dimensions, I need to space two overlapping, elliptical Gaussian
"mountains" as far apart as possible but not so far apart that a saddle point
forms between them. To visualize this, in two dimensions, picture the sum of
two Gaussians
   s(x) = g(-x) + g(x)

where g(-x) and g(x) denote copies of g centered at -x and +x, respectively.
When the function s is graphed with increasing values of x > 0, the two
Gaussians are spaced farther and farther apart, and you get a range of shapes:
        _                __               _    _
       / \              /  \             / \  / \
      /   \            /    \           /   \/   \
     /     \          /      \         /          \
    '       `        '        `       '            `
      Fig. 1           Fig. 2            Fig. 3
      x = 0            x small           x larger
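In fact, for the two-dimensional case the transition can be pinned down exactly.  Taking g to be a unit-variance Gaussian g(u) = Exp[-u^2/2] (an assumed normalization; the post does not fix one), the second derivative of s at the midpoint u = 0 is 2 (x^2 - 1) Exp[-x^2/2], so the dip of Fig. 3 first appears when the half-separation x exceeds one standard deviation.  A quick numeric cross-check, sketched in Python rather than Mathematica:

```python
import math

def g(u):
    """Unit-variance, zero-mean Gaussian bump (normalization unimportant here)."""
    return math.exp(-0.5 * u * u)

def s(u, x):
    """Sum of two copies of g placed at -x and +x."""
    return g(u - x) + g(u + x)

def dip_test(x, h=1e-5):
    """Second derivative of s at the midpoint u = 0 (central difference).
    Negative -> one hump (Figs. 1-2); positive -> the dip of Fig. 3."""
    return (s(h, x) - 2.0 * s(0.0, x) + s(-h, x)) / h**2

# Bisect for the sign change, i.e. the largest dip-free half-separation.
lo, hi = 0.5, 2.0                     # dip_test(0.5) < 0, dip_test(2.0) > 0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    (lo, hi) = (mid, hi) if dip_test(mid) < 0 else (lo, mid)

threshold = 0.5 * (lo + hi)
print(round(threshold, 4))            # -> 1.0: the dip appears at x = sigma
```

(The analytic value follows from s''(0) = 2 g''(x) = 2 (x^2 - 1) Exp[-x^2/2], which changes sign at x = 1.)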
In two dimensions, it is possible to find the largest value that x can take
on without the dip appearing in the top of the sum in Figure 3.  In my
three-dimensional case, I need to express the Gaussians as polar-coordinate
functions of a distance rho at an angle theta.  In three dimensions, then, the
mountains are arranged as shown in the following horizontal cross section
through their bases (and the separation is exaggerated):
[Figure: horizontal cross-section through the bases, in polar coordinates.
The theta = 0 axis points up along rho; the two elliptical blobs (each one
Gaussian sliced horizontally) lie symmetrically on either side of the axis,
each at an angle theta from it.]

Assume polar coordinates of theta and rho, and a sum of Gaussians of the form

   s(theta,rho) = g(-theta,rho) + g(theta,rho)

When theta = 0, the two blobs will lie on top of each other and add into one
peak (a local maximum).  As theta increases, the blobs will spread apart and
the sum will decrease, and eventually a saddle point will appear.

What I need to do is to find the largest value of the angle theta before the
saddle point appears between the two Gaussian "mountains".
If I understand it correctly from my basic calc (and I'm quite rusty at
this), one way of testing for an extreme value in some function f(x,y) of two
variables is to use the second-derivative test, which goes like this:

1) Find where the first partial derivative in each direction = 0, or where
   one or both fail to exist.

2) If there is a point (a,b) at which both are zero, examine the second
   partial derivatives in each direction as well as the combination
   fxx fyy - fxy^2 (where fxx is the second partial derivative in the x
   direction, fyy is the 2nd deriv. in the y direction, and fxy is the
   partial in each direction successively).  The conditions are:

   a) if fxx < 0 and fxx fyy - fxy^2 > 0, it's a relative max
   b) if fxx > 0 and fxx fyy - fxy^2 > 0, it's a relative min
   c) if fxx fyy - fxy^2 < 0, it's a saddle point
   d) the test is inconclusive if fxx fyy - fxy^2 = 0.
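As a concrete sanity check of those rules, the test can be run numerically with central finite differences; classify below is a hypothetical helper written for illustration (a Python sketch, not from the post):

```python
import math

def classify(f, a, b, h=1e-4):
    """Second-derivative test at a critical point (a, b) of f, with the
    second partials estimated by central finite differences."""
    fxx = (f(a + h, b) - 2 * f(a, b) + f(a - h, b)) / h**2
    fyy = (f(a, b + h) - 2 * f(a, b) + f(a, b - h)) / h**2
    fxy = (f(a + h, b + h) - f(a + h, b - h)
           - f(a - h, b + h) + f(a - h, b - h)) / (4 * h**2)
    disc = fxx * fyy - fxy**2
    if disc > 0:
        return "relative max" if fxx < 0 else "relative min"
    if disc < 0:
        return "saddle point"
    return "inconclusive"

print(classify(lambda x, y: -x**2 - y**2, 0, 0))   # -> relative max
print(classify(lambda x, y: x**2 + y**2, 0, 0))    # -> relative min
print(classify(lambda x, y: x**2 - y**2, 0, 0))    # -> saddle point
```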
First, it is not clear to me how to express the transition point when the
saddle point begins to appear. I thought that maybe it would be correct to
look for where fxx fyy - fxy^2 = 0, but actually that is explicitly a
condition where the test is supposed to be inconclusive.
Second, I've tried to solve the equations with the help of Mathematica 2.2,
but the combination of cosine & sine terms and exponentials makes solving for
the various zeros very difficult. The particular Gaussian I need to use is:
   g(theta,rho) = (2 Pi rho Cos[theta])^n E^(-2 Pi^2 rho^2 (sx^2 Cos[theta]^2 + sy^2 Sin[theta]^2))
The exponent n is an integer >= 1. The sigmas, sx and sy, are positive
reals.  (This equation was converted from one in Cartesian coordinates to
polar coordinates, because I think it is easier to express the angular
relationship between the Gaussians. That's why the sigmas are expressed in
the x and y directions.)  Although n is shown as a parameter, I have only a
finite number of values of n for which solutions are needed (n = 1-7), so I
can in fact examine the cases individually and thus eliminate n from the eq.
I am hoping the solution(s) will have the form of an expression of theta in
terms of sx and sy.
As mentioned above, for my purposes, it's enough to look at the sum of two
such Gaussians,
   s(theta,rho) = g(-theta,rho) + g(theta,rho)
Taking partial derivatives with respect to rho and theta of the sum leads to
quite an ugly mess. I've tried various manipulations and simplifications,
and using the solving facilities in Mathematica to try to obtain a solution
to this. It appears that the first partial derivatives in the rho and theta
direction will go to zero when
   rho = Sqrt[n] / (2 Pi Sqrt[sx^2 Cos[theta]^2 + sy^2 Sin[theta]^2])
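That radius does check out numerically for a single term g (each term depends on rho the same way, so the sum's radial derivative vanishes there too).  Here is a quick central-difference check in Python, with n = 1, sx = 1, sy = 2, theta = 0.3 as assumed sample values, not from the post:

```python
import math

def g(theta, rho, n=1, sx=1.0, sy=2.0):
    """The Gaussian from the post, in polar form."""
    w = sx**2 * math.cos(theta)**2 + sy**2 * math.sin(theta)**2
    return (2 * math.pi * rho * math.cos(theta))**n * math.exp(-2 * math.pi**2 * rho**2 * w)

def rho_star(theta, n=1, sx=1.0, sy=2.0):
    """The claimed critical radius Sqrt[n] / (2 Pi Sqrt[w])."""
    w = sx**2 * math.cos(theta)**2 + sy**2 * math.sin(theta)**2
    return math.sqrt(n) / (2 * math.pi * math.sqrt(w))

theta, h = 0.3, 1e-6
r = rho_star(theta)
dg = (g(theta, r + h) - g(theta, r - h)) / (2 * h)   # central difference in rho
print(abs(dg) < 1e-6)   # -> True: the radial derivative vanishes at rho_star
```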
Next, I tried looking at the second partial derivative of s(theta,rho) with
respect to rho. It is possible to substitute back the expression above for
rho into this second derivative, and then to try to solve for the value of
theta. To make it easier, I've set n = 1 for the first case. Unfortunately,
the result is still either too complex to solve, or I'm doing something badly
wrong, because I get indeterminate results. Through various manipulations
I've been able to massage the expressions so that they involve only cosines
and exponentials, and that allows the Solve[] function to work at least
partially, but then I get answers such as theta = Pi/2, which must be wrong.
This makes me think I am approaching the problem badly.
Can anyone offer suggestions for alternative (and hopefully easier) solution
paths to this problem? I probably don't understand how saddlepoint problems
should be solved.
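One path that sidesteps the symbolic mess: treat the birth of the saddle as the moment the angular second derivative at the symmetric midpoint crosses zero.  If the sum is read as two rotated copies, s(theta,rho) = g(theta - t, rho) + g(theta + t, rho) with half-angle t (my reading, not spelled out in the post), then symmetry forces the mixed partial to vanish at the midpoint (theta = 0, rho = critical radius), so the discriminant fxx fyy - fxy^2 there reduces to f_tt f_rr with f_rr < 0; the "inconclusive" condition fxx fyy - fxy^2 = 0 is exactly the transition, and one can simply bisect on t for the sign change of f_tt.  A numeric sketch in Python, with n = 4 and sx = sy = 1 as assumed sample values; in that symmetric case the angular profile on the critical circle is proportional to Cos[theta - t]^4 + Cos[theta + t]^4, whose curvature at theta = 0 changes sign at t = Pi/6, so the answer is known in advance:

```python
import math

N, SX, SY = 4, 1.0, 1.0    # assumed sample parameters, not from the post

def g(theta, rho):
    w = SX**2 * math.cos(theta)**2 + SY**2 * math.sin(theta)**2
    return (2 * math.pi * rho * math.cos(theta))**N * math.exp(-2 * math.pi**2 * rho**2 * w)

def s(theta, rho, t):
    """Sum of two copies of g rotated to the half-angles -t and +t."""
    return g(theta - t, rho) + g(theta + t, rho)

def f_tt(t, h=1e-4):
    """Angular second derivative of s at the symmetric point (theta = 0,
    rho = critical radius), by central differences."""
    w = SX**2 * math.cos(t)**2 + SY**2 * math.sin(t)**2
    rho = math.sqrt(N) / (2 * math.pi * math.sqrt(w))
    return (s(h, rho, t) - 2 * s(0.0, rho, t) + s(-h, rho, t)) / h**2

# Bisect on the half-angle t: f_tt < 0 means one merged peak, f_tt > 0 means
# a dip (and hence a saddle) has formed between the two mountains.
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    (lo, hi) = (mid, hi) if f_tt(mid) < 0 else (lo, mid)

t_threshold = 0.5 * (lo + hi)
print(round(t_threshold, 4), round(math.pi / 6, 4))   # -> 0.5236 0.5236
```

For unequal sx and sy the same bisection runs unchanged; only the closed-form cross-check is lost.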

Mike Hucka   hucka at umich.edu   <URL: http://ai.eecs.umich.edu/people/hucka>
PhD wannabe, computational models of visual processing (AI Lab)    University
UNIX systems administrator & programmer/analyst (EECS DCO)         of Michigan