Re: Re: LeastSquares using LinearProgramming?
*To*: mathgroup at smc.vnet.net
*Subject*: [mg25] Re: [mg89032] Re: [mg88986] LeastSquares using LinearProgramming?
*From*: "Gregory Duncan" <gmduncan at gmail.com>
*Date*: Sun, 25 May 2008 02:04:10 -0400 (EDT)
*References*: <200805230705.DAA25655@smc.vnet.net>
*Reply-to*: gmduncan at econ.berkeley.edu
Beyond that, the L1 norm is often preferred to L2 for statistical problems
because of its resistance to outliers. If one wants to do a restricted least
squares, one can solve a quadratic program; for positivity restrictions one
can instead reparameterize x = Exp[y] and perform an unconstrained
optimization (note this keeps x strictly positive, reaching zero only in the
limit). One can also use a penalty function and do an unconstrained
regression. Alternatively, one can enumerate the zero restrictions: with k
coefficients, run the unrestricted regression, then the k regressions with
each single coefficient set to zero, then the regressions with each pair of
coefficients set to zero, and so on (2^k regressions in all), keeping for
each regression the sum of squared residuals (SSR), then pick the regression
with the smallest SSR.
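The Exp reparameterization above can be sketched as follows; the matrix m,
vector b, and starting points are made up purely for illustration:

```mathematica
(* Nonnegative least squares via the x = Exp[y] reparameterization:
   minimize ||m.x - b||^2 over y with x = Exp[y], so x stays strictly
   positive. Data here is illustrative only. *)
m = {{1., 1.}, {1., 2.}, {1., 3.}};
b = {1., 2., 2.};
vars = Array[y, 2];
obj = Total[(m.Exp[vars] - b)^2];
fit = FindMinimum[obj, Transpose[{vars, {0., 0.}}]];
x = Exp[vars] /. fit[[2]]
```

Because Exp[y] is never exactly zero, a coefficient that "wants" to be zero
will show up as a very large negative y; the subset-enumeration approach is
one way to handle such boundary solutions exactly.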
Greg Duncan
Dept Economics
U California, Berkeley
On Sat, May 24, 2008 at 12:53 AM, Daniel Lichtblau <danl at wolfram.com> wrote:
> Gareth Russell wrote:
> > Hi,
> >
> > Is it possible to specify a least-squares minimization through the
> > LinearProgramming function? In other words, exactly the same as
> > LeastSquares, with the extra constraint that all x>=0?
> >
> > Presumably it comes down to specifying the input c correctly in the
> > LinearProgramming function. But I can't see how to do that such that
> > what is being minimized is the standard least-squares function
> > ||m.x-b||^2
> >
> > Thanks,
> >
> > Gareth
>
> That objective function would be quadratic, so no, LinearProgramming
> will not like that.
>
> You could instead try
>
> FindMinimum[{objective,constraints}, vars,
> Method->"QuadraticProgramming"]
>
> This method is, alas, not documented, and I'd imagine it could disappear
> (which would be a shame, because it works really well, bordering on
> magic, for some problems). A documented alternative that might work well
> is "InteriorPoint", as it also can handle e.g. nonnegativity constraints.
>
> If you can settle for an L_1 norm (so it's no longer least squares), you
> can minimize a new variable abs, with new constraints -abs<=m.x-b<=abs.
>
> Daniel Lichtblau
> Wolfram Research
>
>
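Daniel's L_1 reformulation maps directly onto LinearProgramming, which by
default minimizes c.v subject to A.v >= rhs with all variables nonnegative
(so the x >= 0 constraint from the original question comes for free). A
minimal sketch, with illustrative m and b, introducing one slack variable
t_i per residual so that -t <= m.x - b <= t:

```mathematica
(* L1 fit with x >= 0 via LinearProgramming. Variables are {x, t};
   minimizing Total[t] subject to t_i >= |(m.x - b)_i|. *)
m = {{1., 1.}, {1., 2.}, {1., 3.}};
b = {1., 2., 2.};
{k, n} = Dimensions[m];
c = Join[ConstantArray[0, n], ConstantArray[1, k]];
A = Join[
   ArrayFlatten[{{-m, IdentityMatrix[k]}}],  (* t - m.x >= -b *)
   ArrayFlatten[{{m, IdentityMatrix[k]}}]    (* m.x + t >=  b *)
   ];
rhs = Join[-b, b];
sol = LinearProgramming[c, A, rhs];  (* all variables >= 0 by default *)
x = Take[sol, n]
```

At the optimum each t_i equals |(m.x - b)_i|, so the objective Total[t] is
exactly the L_1 residual norm.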