MathGroup Archive 1996


Re: Constrained Min, non-linear

  • To: mathgroup at smc.vnet.net
  • Subject: [mg4590] Re: Constrained Min, non-linear
  • From: "Stas" <URYASEV at dnenet.nov.dne.bnl.gov>
  • Date: Wed, 21 Aug 1996 03:25:13 -0400
  • Sender: owner-wri-mathgroup at wolfram.com

>From:             elsner at avalon.msfc.nasa.gov (Ron Elsner)
>Subject:          [mg4510] Re: Constrained Min, non-linear

>Hi,

>I'm not yet quite familiar with all Mathematica features (who is?) and I'm 
>looking for a solution of a rather simple problem. I can easily solve it by 
>hand but I wanna know how to do this with Mathematica. Here it is:

>Where is f(x,y)=x^2+y^2 minimal, if x and y are
>constrained to abs(x+y)=1 ?

>Is it possible to solve this without programming?

>Please answer via email, thanks in advance, Stefan.

Stefan asks for an email response.  I would also like to see the 
answer to this posted here.

Ron Elsner
NASA/MSFC

++++++++++++++++++++++++++++++++++++++++++++++++++++++++

As formulated, this is an optimization problem with a nonsmooth constraint. There is a large literature on nonsmooth optimization; for example, algorithms for solving nonsmooth problems are described in

Uryasev, S.P., "New Variable Metric Algorithms for Nondifferentiable Optimization Problems," Journal of Optimization Theory and Applications, Vol. 71, No. 2, pp. 359-388, 1991.

I have FORTRAN codes implementing the algorithms described in this paper. Recently,
I translated one of the FORTRAN codes into the MATHEMATICA language, and I am planning
to submit it to MathSource. The implemented version of the algorithm solves unconstrained
optimization problems, so I used a nonsmooth (exact) penalty function to fold the constraint
into the objective function. The problem has two local minima. I started the algorithm from
different initial approximations; it converged to the point (0.5, 0.5) or (-0.5, -0.5),
depending upon the initial approximation.
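The penalty construction used in the code below can be written out explicitly (a sketch; c denotes the penalty constant, set to 10 in the code):

```latex
F(x_1,x_2) \;=\; x_1^2 + x_2^2 \;+\; c\,\bigl|\,|x_1+x_2| - 1\,\bigr|, \qquad c = 10 .
```

This is a nonsmooth exact penalty: if c exceeds the magnitude of the Lagrange multiplier of the constraint (which, as far as I can tell, equals 1 at the solution here), unconstrained minimizers of F coincide with the constrained minimizers, so c = 10 is more than sufficient.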

The code needs two user-supplied functions that calculate the objective function and a
generalized gradient. Below I give the code for these functions, written with self-explanatory
identifiers, followed by two runs of the algorithm that converge to the two different minima.

                           Stas Uryasev,
                           Brookhaven National Laboratory, (516) 3447849


(* Load the variable metric algorithm for nonsmooth optimization *)
Needs["NonsmoothOptimization`VariableMetricAlgorithm`"];

(* Penalized objective: x1^2 + x2^2 + c*| |x1+x2| - 1 |.
   Sign[u]*u equals Abs[u]; it is written this way to
   parallel the subgradient code in grad below. *)
fun[{x1_,x2_}]:=
Module[{fn,PenaltyConstant,Constraint,ObjectiveFunction},
PenaltyConstant=10;
Constraint=Sign[x1+x2]*(x1+x2) - 1;   (* = Abs[x1+x2] - 1 *)
ObjectiveFunction=x1^2+x2^2;
fn= ObjectiveFunction + 
     PenaltyConstant*Sign[Constraint]*Constraint;
Return[fn]
];

(* Generalized gradient of the penalized objective *)
grad[{x1_,x2_}]:=
Module[{gr,PenaltyConstant,GrConstraint,GrObjectiveFunction,
        Constraint},
PenaltyConstant=10;
Constraint=Sign[x1+x2]*(x1+x2) - 1;
GrConstraint={Sign[x1+x2],Sign[x1+x2]};   (* subgradient of Abs[x1+x2] *)
GrObjectiveFunction={2*x1,2*x2};
gr= GrObjectiveFunction + 
     PenaltyConstant * Sign[Constraint] * GrConstraint;
Return[gr]
];


x0={10,2};
VariableMetricAlgorithm[x0,NumberIterations -> 200,
ArgumentPrecision->10^-20]
VariableMetricAlgorithm::lstop3: Algorithm performed 
                                specified NumberIterations
{0.5, {x1 -> 0.5, x2 -> 0.5}}



x0={-10,-1};
VariableMetricAlgorithm[x0,NumberIterations -> 200,
ArgumentPrecision->10^-20]
VariableMetricAlgorithm::lstop3: Algorithm performed 
                                specified NumberIterations
{0.5, {x1 -> -0.5, x2 -> -0.5}}
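The numerical results agree with the hand calculation. The constraint |x+y| = 1 splits into the two lines x+y = 1 and x+y = -1; on the first, the Lagrange conditions give

```latex
\min_{x+y=1}\; x^2+y^2:\qquad 2x=\lambda,\;\; 2y=\lambda
\;\Longrightarrow\; x=y=\tfrac{1}{2},\qquad f=\tfrac{1}{2},
```

and symmetrically x = y = -1/2 on x+y = -1, so both points are global minimizers with objective value 1/2, matching the two runs above.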


