Re: Re: Re: LeastSquares using LinearProgramming?
- To: mathgroup at smc.vnet.net
- Subject: [mg89109] Re: [mg89085] Re: [mg89032] Re: [mg88986] LeastSquares using LinearProgramming?
- From: danl at wolfram.com
- Date: Mon, 26 May 2008 01:29:00 -0400 (EDT)
- References: <200805230705.DAA25655@smc.vnet.net>
>> Gareth Russell wrote:
>> > Hi,
>> >
>> > Is it possible to specify a least-squares minimization through the
>> > LinearProgramming function? In other words, exactly the same as
>> > LeastSquares, with the extra constraint that all x>=0?
>> >
>> > Presumably it comes down to specifying the input c correctly in the
>> > LinearProgramming function. But I can't see how to do that such that
>> > what is being minimized is the standard least-squares function
>> > ||m.x-b||^2
>> >
>> > Thanks,
>> >
>> > Gareth

I wrote:

>> [...]
>> If you can settle for an L_1 norm (so it's no longer least squares), you
>> can minimize a new variable abs, with new constraints -abs<=m.x-b<=abs.
>>
>> Daniel Lichtblau
>> Wolfram Research

This elicited the response:

> Beyond that, the L1 norm is preferred to L2 for statistical problems
> because of resistance to outliers.
> [...]
> Greg Duncan
> Dept Economics
> U California, Berkeley

Which motivates me to give a correct approach to finding the L^1 norm.
(What I showed, suitably interpreted, would give the L^infinity norm.
That is often useful, but not for this problem.)

The expression m.x-b is a vector, of length n, say. Call it c, and thus
the jth component is c[[j]]. Create a new vector d, also of length n; it
will denote new variables. Then we minimize Total[d], that is, the sum of
the components of d, subject to 2*n new constraints. For 1<=j<=n, there is
the pair

  {c[[j]] <= d[[j]], -c[[j]] <= d[[j]]}

I think (well, hope, at any rate) that I wrote it correctly this time.

Daniel Lichtblau
Wolfram Research
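For concreteness, here is a minimal Wolfram Language sketch of that construction, using small made-up values for m and b; the example data and the names obj, conmat, rhs, and xsol are illustrative assumptions, not from the original thread. It stacks the original unknowns x (length k) with the new variables d (length n), minimizes Total[d], and imposes m.x - d <= b and m.x + d >= b, which together encode the pair of constraints above for each j. LinearProgramming's default bounds keep all variables nonnegative, which also supplies the desired x >= 0.

(* illustrative data; substitute your own m and b *)
m = {{1., 2.}, {3., 4.}, {5., 6.}};
b = {1., 2., 2.};
{n, k} = Dimensions[m];

(* objective: zero weight on x, weight 1 on each d[[j]], i.e. Total[d] *)
obj = Join[ConstantArray[0, k], ConstantArray[1, n]];

(* constraints on the stacked vector Join[x, d]:
   m.x - d <= b  encodes  c[[j]] <= d[[j]]
   m.x + d >= b  encodes  -c[[j]] <= d[[j]]  *)
conmat = ArrayFlatten[{{m, -IdentityMatrix[n]}, {m, IdentityMatrix[n]}}];
rhs = Join[Map[{#, -1} &, b], Map[{#, 1} &, b]];

sol = LinearProgramming[obj, conmat, rhs];
xsol = Take[sol, k]   (* the nonnegative minimizer of ||m.x-b||_1 *)

At the optimum each d[[j]] equals Abs[c[[j]]], so the objective value obj.sol agrees with Total[Abs[m.xsol - b]] up to numerical tolerance, which gives a quick sanity check on the setup.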
- References:
- LeastSquares using LinearProgramming?
- From: Gareth Russell <russell@njit.edu>