MathGroup Archive 2003


Problem with linear independence


Hi all!

I have a little problem, and here it goes:

In[1]:= a m + b n == c n

This is what I'd like to solve for a, b and c. Of course this has
infinitely many solutions and no definite values for a, b or c.
However, if I assume n and m to be {1, 0} and {0, 1} respectively, I get
exactly the result I want: {a -> 0, b -> c}. So far so good.
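That first step can be sketched like this (untested against a live kernel; substituting the standard basis vectors is one way to encode the linear independence of m and n, and SolveAlways, which solves for the parameters that make an equation hold for all values of the listed variables, should encode it without picking a basis at all):

```mathematica
(* Substitute an explicit independent basis for m and n *)
Solve[a {1, 0} + b {0, 1} == c {0, 1}, {a, b}]
(* expected: {{a -> 0, b -> c}} *)

(* Or treat m and n as arbitrary symbols: the equation must hold
   for every m and n, which is exactly linear independence *)
SolveAlways[a m + b n == c n, {m, n}]
```

Both forms should reduce the equation to a == 0 and b == c.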

The problem now is:

In[2]:= a m + b n == c n + d o

I want o to be linearly independent of m and n. BUT not because I move
one dimension up; rather, m, n and o are all taken from the whole 2D
space, and thus no pair of them can be expected to be linearly dependent.

How do I tell Mathematica this???
How do I say Assuming["m, n, o are linearly independent", Solve[...]]???
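One possible sketch (untested; the symbols p and q are hypothetical parameters I introduce, not part of the original question): since m, n and o all live in the same 2D space and m, n are independent, o can always be written as p m + q n with generic nonzero p and q. Substituting that in and matching coefficients of the independent vectors m and n gives the general relations:

```mathematica
(* o is a generic combination of the independent pair m, n:
   o = p m + q n, with p and q generic and nonzero            *)
SolveAlways[a m + b n == c n + d (p m + q n), {m, n}]
```

Matching coefficients of m and n by hand, this should yield relations equivalent to a == d p and b == c + d q.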

Please help me :) ... Hagen!

