       Re: Solving General Linear Equations

• To: mathgroup at smc.vnet.net
• Subject: [mg71962] Re: Solving General Linear Equations
• From: dh <dh at metrohm.ch>
• Date: Wed, 6 Dec 2006 06:03:40 -0500 (EST)
• Organization: hispeed.ch
• References: <ekmepd$8ll$1@smc.vnet.net>

Hello David,

Consider vector spaces A, A', B, B' of dimensions a, a', b, b', with A' a subspace of A, B' a subspace of B, and b' = a' < b < a. M maps A into B, but because the rank of M is smaller than b, A is mapped onto B'.

Consider the null space A0 of M, that is, all vectors of A that are mapped to the zero of B. We may now split A into A0 and A', the complement of A0 relative to A (let zero be an element of both A' and A0, otherwise A' is not a vector space). Consider further B', the image of A. We see that y must be contained in B', otherwise there is no solution. The mapping from A' to B' is invertible; denote the vector of A' that is mapped onto y as x'. The solution is therefore x = x' + A0 (a coset of the null space).

To get x' we need the inverse of the mapping A' -> B'. This could be done "by hand", but in Mathematica we have PseudoInverse (the Moore-Penrose inverse), which comes in handy. Although PseudoInverse is usually used to get rid of spurious dimensions in measurements, it does essentially what we want (provided we have no really small singular values): x' = PseudoInverse[M].y

A basis of A0 can be obtained by NullSpace[M], and an arbitrary vector from A0 by x''.NullSpace[M], where x'' is a vector of dimension a - a' (the dimension of A0) with arbitrary components. Therefore:

x = PseudoInverse[M].y + x''.NullSpace[M]
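Not part of the original post, but this construction can be cross-checked numerically. A sketch in Python/NumPy (assuming numpy is available): numpy.linalg.pinv plays the role of PseudoInverse, and the trailing right-singular vectors of M give a basis of the null space, like NullSpace[M].

```python
import numpy as np

# The 3x4 rank-2 matrix from the example below, with a consistent right-hand side.
M = np.array([[1., 1., 0., 1.],
              [1., 1., 1., 2.],
              [2., 2., 1., 3.]])
y = np.array([2., 1., 3.])

# x' = PseudoInverse[M].y, the minimum-norm particular solution.
x_part = np.linalg.pinv(M) @ y

# Basis of the null space A0 of M, as rows (like NullSpace[M]):
# the right singular vectors belonging to (near-)zero singular values.
U, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]            # shape (a - a') x a, here 2 x 4

# General solution x = x' + x''.NullSpace[M] for arbitrary components x''.
x2 = np.array([0.7, -1.3])        # arbitrary coefficients
x = x_part + x2 @ null_basis

print(np.allclose(M @ x, y))      # True: every such x solves M.x == y
```

Any choice of the two free components in x2 yields another solution; x_part itself comes out as {1, 1, -1, 0}, matching the Mathematica result below.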

Now we only need to split y into dependent and independent components. Note that y is a vector of B'. B0, the complement of B' relative to B, is mapped to the zero of A by PseudoInverse[M]; therefore B0 is perpendicular to y. We thus have: NullSpace[PseudoInverse[N[M]]].y == 0. These are b - b' equations that let us express b - b' components of y through the remaining b' components.
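As a numerical aside (not in the original post): the null space of PseudoInverse[M] is the left null space of M, so the consistency condition can be sketched in Python/NumPy like this, using the trailing left-singular vectors of M.

```python
import numpy as np

M = np.array([[1., 1., 0., 1.],
              [1., 1., 1., 2.],
              [2., 2., 1., 3.]])

# NullSpace[PseudoInverse[M]] equals the left null space of M: the columns
# of U with (near-)zero singular value, i.e. the complement B0 of B'.
U, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))
left_null = U[:, rank:].T         # shape (b - b') x b, here 1 x 3

# A right-hand side y is consistent exactly when these b - b' equations hold.
y_good = np.array([2., 1., 3.])   # satisfies y3 == y1 + y2
y_bad = np.array([2., 1., 0.])

print(np.allclose(left_null @ y_good, 0))  # True: y lies in B'
print(np.allclose(left_null @ y_bad, 0))   # False: no solution exists
```

Here b - b' = 1, so there is a single constraint, which is the y3 -> y1 + y2 relation derived below with Solve.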

M = {{1, 1, 0, 1}, {1, 1, 1, 2}, {2, 2, 1, 3}};
y = {2, 1, 3};
PseudoInverse[M].y gives {1, 1, -1, 0}

For an arbitrary y = {y1, y2, y3} and x'' = {x1, x2}, we have:

x = PseudoInverse[M].{y1, y2, y3} + {x1, x2}.NullSpace[M]

To reduce the dimension of PseudoInverse[M] from 4x3 to 4x2 we solve: Solve[NullSpace[PseudoInverse[M]].{y1, y2, y3} == 0, y3]. This gives y3 -> y1 + y2, and we can write:

PIM' = PseudoInverse[M][[All, {1, 2}]] + PseudoInverse[M][[All, 3]]

(the third column, which multiplies y3 = y1 + y2, is added to each of the first two columns). With this we can write x as:

x = PIM'.{y1, y2} + {x1, x2}.NullSpace[M]

As a test we calculate M.x:

M.(PIM'.{y1, y2} + {x1, x2}.NullSpace[M]) // Simplify

which evaluates to {y1, y2, y1 + y2}, as required. (The null-space term drops out under M, and the replacement y3 -> y1 + y2 is already built into PIM'.)
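The whole worked example, including the column-combination trick behind PIM', can be reproduced numerically. A Python/NumPy sketch (not in the original post; the symbolic Simplify step is replaced by a check with random y1, y2, x1, x2):

```python
import numpy as np

M = np.array([[1., 1., 0., 1.],
              [1., 1., 1., 2.],
              [2., 2., 1., 3.]])
pim = np.linalg.pinv(M)           # PseudoInverse[M], 4 x 3

# Reduced matrix PIM': add the third column (which multiplies
# y3 = y1 + y2) onto each of the first two columns.
pim_r = pim[:, :2] + pim[:, [2]]  # 4 x 2, via broadcasting

# Null-space basis of M as rows, like NullSpace[M].
U, s, Vt = np.linalg.svd(M)
null_basis = Vt[int(np.sum(s > 1e-10)):]

# For arbitrary y1, y2 and free components x1, x2 ...
rng = np.random.default_rng(0)
y1, y2 = rng.standard_normal(2)
x12 = rng.standard_normal(2)
x = pim_r @ np.array([y1, y2]) + x12 @ null_basis

print(np.allclose(M @ x, [y1, y2, y1 + y2]))  # True, as in the post
```

The broadcast `pim[:, :2] + pim[:, [2]]` mirrors the Mathematica threading of a 4-vector over the rows of a 4x2 matrix in the PIM' definition above.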

Daniel

David Park wrote:

> Dear MathGroup,
>
> I would like to solve a general set of linear equations and obtain not only the overall solution but also the submatrices that go into determining the solution. I know, more or less, how to do it but I don't know how to do it elegantly with Mathematica. I would appreciate any suggestions on the best way to do this.
>
> In the following notation small letters are vectors and capital letters are matrices, and I have put the dimensions of the vectors and matrices in brackets. So the general form of the equations is:
>
> y[m] == M[m,n].x[n]
>
> where n > m and the rank of M is r < m.
>
> Then I would like to have the solution in the form:
>
> x[r] == A[r,r].y[r] + B[r,n-r].x[n-r]
> y[m-r] == C[m-r,r].y[r]
>
> I need to know how the x's and y's are selected from the original set, and I want the matrices A, B and C.
>
> I could append an IdentityMatrix[m] to M and then RowReduce, but then I need an algorithm to partition properly, or do it visually.
>
> A test case might be:
>
> testmat = {{1, 1, 0, 1}, {1, 1, 1, 2}, {2, 2, 1, 3}}
>
> yvector = {2, 1, 3}   or maybe {y1, y2, y3}
>
> David Park