Re: How to Solve a Quadratic Programming problem?
*To*: mathgroup at smc.vnet.net
*Subject*: [mg4216] Re: How to Solve a Quadratic Programming problem?
*From*: rubin at msu.edu (Paul A. Rubin)
*Date*: Tue, 18 Jun 1996 03:25:59 -0400
*Organization*: Michigan State University
*Sender*: owner-wri-mathgroup at wolfram.com
In article <4pisve$jk5 at dragonfly.wolfram.com>,
daniele rizzi <drizzi at chiostro.univr.it> wrote:
->I'd like to solve a QP problem, in the following form:
->
->min transpose(x)*H*x
->
->subject to transpose(E)*x = E_f
-> \sum_{i=1}^n x_i = 1
-> 0 <= x_i <= 1 \forall i \in {1, ..., n}
->
->where : x is an n-term solution vector;
-> E is an n-term constraint vector;
-> E_f is a scalar parameter of the problem;
-> H is an n x n symmetric positive (semi)definite matrix.
->
->Mathematica is quite good at solving linearly constrained (LP) problems
->(via Simplex-based routines) and general unconstrained instances, but what
->about a quadratic objective function? Is there some "ready-made" routine,
->or do I have to write my own code to tackle it?
->
There is no built-in routine for this in the current version of
Mathematica. I don't know what is on tap for version 3.0.
You might contact Jean-Christophe Culioli (culioli at cas.ensmp.fr); I believe
he has working code to solve nonlinear programs with constraints.
If you need to do it yourself, a quick-and-dirty approach is to relax the
equality constraints into the objective as penalty terms. The drawback is
that as the constraint errors get close to zero, their squares get very
small, so convergence is not great (i.e., the answer you get may not satisfy
the constraints very accurately). An example:
In[1]:=
H = {{5., 3., -1.}, {3., 4., 0.}, {-1., 0., 10.}};
e = {1., 4., 2.};
ef = 3.1;
u = {1., 1., 1.};
xvec = Array[ x, 3 ] (* vector of variables *)
Out[1]=
{x[1], x[2], x[3]}
In[2]:=
xlist = {#, 0.5, 0., 1.}& /@ xvec (* argument list for FindMinimum *)
Out[2]=
{{x[1], 0.5, 0., 1.}, {x[2], 0.5, 0., 1.}, {x[3], 0.5, 0., 1.}}
In[3]:=
f = (* objective function with quadratic penalties *)
Simplify[
xvec . (H . xvec) + mu1 (e . xvec - ef)^2 + mu2 (u . xvec - 1)^2
];
In[4]:=
s = (* note the huge values for the penalty coefficients mu1, mu2 *)
FindMinimum[
Evaluate[ f /. {mu1 -> 10^12, mu2 -> 10^12} ],
Evaluate[ Sequence@@xlist ]
]
Out[4]=
{2.66847, {x[1] -> 0.114286, x[2] -> 0.607143, x[3] -> 0.278571}}
In[5]:=
xx = xvec /. s[[2]];
u . xx (* check conformance to constraints *)
e . xx
Out[5]=
1.
3.1
This can be a bit dicey in general, particularly if your optimum lies on
the boundary of the feasible region (FindMinimum generates an error message
if it thinks it is about to step outside the region).
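If you want to sanity-check the penalty approach outside Mathematica, here is a minimal Python sketch (my addition, not part of the original session). Because the penalized objective x.H.x + mu (e.x - ef)^2 + mu (u.x - 1)^2 is itself an unconstrained quadratic, its exact minimizer solves the linear system (H + mu (e e^T + u u^T)) x = mu (ef e + u), which a small Gaussian elimination handles for this 3x3 example:

```python
# Exact minimizer of the penalized quadratic (same H, e, ef, u as above).
# Setting the gradient 2Hx + 2mu(e.x - ef)e + 2mu(u.x - 1)u to zero gives
# (H + mu(e e^T + u u^T)) x = mu(ef e + u); we solve that system directly.

def solve(A, b):
    """Gaussian elimination with partial pivoting (fine for a 3x3 system)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

H = [[5., 3., -1.], [3., 4., 0.], [-1., 0., 10.]]
e = [1., 4., 2.]
ef = 3.1
u = [1., 1., 1.]
mu = 1e6  # large, but not so large that the system turns ill-conditioned

A = [[H[i][j] + mu * (e[i] * e[j] + u[i] * u[j]) for j in range(3)]
     for i in range(3)]
b = [mu * (ef * e[i] + u[i]) for i in range(3)]
x = solve(A, b)

obj = sum(x[i] * H[i][j] * x[j] for i in range(3) for j in range(3))
print(x, obj)
```

Solved exactly like this, the penalized problem gives an objective of about 2.6610, slightly below the FindMinimum value 2.66847 above (at a different feasible point); that gap is precisely the accuracy issue the penalty caveat warns about.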
You can also use ConstrainedMin in a cutting plane scheme:
In[6]:=
const = { e . xvec == ef, u . xvec == 1 }; (* equation constraints *)
cuts = {}; (* cutting planes *)
vlist = Append[ xvec, z ]; (* variables - z = objective value *)
In[7]:=
s = ConstrainedMin[ z, Union[ const, cuts ], vlist ]
Out[7]=
{0, {x[1] -> 0.3, x[2] -> 0.7, x[3] -> 0, z -> 0}}
In[8]:=
x0 = xvec /. s[[2]]; (* compute the correct value of the quadratic *)
z0 = x0 . (H . x0)
Out[8]=
3.67
In[9]:=
cuts = Append[ cuts, z >= Simplify[ z0 + 2 (H . x0) . (xvec - x0) ] ]
Out[9]=
{z >= -3.67 + 7.2 x[1] + 7.4 x[2] - 0.6 x[3]}
In[10]:=
s = ConstrainedMin[ z, Union[ const, cuts ], vlist ] (* try again *)
Out[10]=
{0.13, {x[1] -> 0, x[2] -> 0.55, x[3] -> 0.45, z -> 0.13}}
In[11]:=
x0 = xvec /. s[[2]];
z0 = x0 . (H . x0)
Out[11]=
3.235
In[12]:=
cuts = Append[ cuts, z >= Simplify[ z0 + 2 (H . x0) . (xvec - x0) ] ]
Out[12]=
{z >= -3.67 + 7.2 x[1] + 7.4 x[2] - 0.6 x[3],
z >= -3.235 + 2.4 x[1] + 4.4 x[2] + 9. x[3]}
And so forth. To keep the problem size compact, you can delete some of the
older cuts (first in, first out should work) and limit the number of
iterations. The x values will converge to the constrained optimum
(assuming H is positive semidefinite), and the z value will converge to the
optimal value of the quadratic objective. x will always be feasible (but
possibly suboptimal); z will be superoptimal (an optimistic lower bound on
the true minimum).
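The cuts generated in In[9] and In[12] are just first-order Taylor (tangent-plane) underestimates of the convex quadratic q(x) = x.H.x at the current LP solution x0: q(x) >= q(x0) + 2(H x0).(x - x0). A small Python check (my addition, not part of the original session) reproduces the coefficients above:

```python
# Tangent-plane cut for q(x) = x.H.x at a point x0:
#   z >= q(x0) + 2(H x0).(x - x0)  =  -q(x0) + 2(H x0).x
# This is a valid underestimate whenever H is positive semidefinite.

H = [[5., 3., -1.], [3., 4., 0.], [-1., 0., 10.]]

def cut(x0):
    """Return (constant, coefficients) of the cut z >= constant + coeffs.x."""
    Hx0 = [sum(H[i][j] * x0[j] for j in range(3)) for i in range(3)]
    q0 = sum(x0[i] * Hx0[i] for i in range(3))
    return -q0, [2.0 * g for g in Hx0]

print(cut([0.3, 0.7, 0.0]))    # the cut of Out[9]
print(cut([0.0, 0.55, 0.45]))  # the second cut of Out[12]
```

Up to floating-point noise these are the cuts z >= -3.67 + 7.2 x[1] + 7.4 x[2] - 0.6 x[3] and z >= -3.235 + 2.4 x[1] + 4.4 x[2] + 9. x[3] shown above.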
Good luck.
Paul
**************************************************************************
* Paul A. Rubin Phone: (517) 432-3509 *
* Department of Management Fax: (517) 432-1111 *
* Eli Broad Graduate School of Management Net: RUBIN at MSU.EDU *
* Michigan State University *
* East Lansing, MI 48824-1122 (USA) *
**************************************************************************
Mathematicians are like Frenchmen: whenever you say something to them,
they translate it into their own language, and at once it is something
entirely different. J. W. v. GOETHE