Re: orthonormal eigenvectors
- To: mathgroup at smc.vnet.net
- Subject: [mg67606] Re: [mg67576] orthonormal eigenvectors
- From: Andrzej Kozlowski <akoz at mimuw.edu.pl>
- Date: Sun, 2 Jul 2006 06:27:12 -0400 (EDT)
- References: <200607010912.FAA20405@smc.vnet.net>
- Sender: owner-wri-mathgroup at wolfram.com
On 1 Jul 2006, at 18:12, tkg wrote:

> Hi, I am trying to calculate eigenvalues and the corresponding
> ORTHONORMAL eigenvectors of a square, real, and hermitian matrix
> (10 X 10 matrix). I use either Eigenvalues & Eigenvectors or
> Eigensystem. It gives me real, distinct eigenvalues and the
> corresponding eigenvectors. These eigenvectors are normalized but
> not orthogonal to each other. Then I use the GramSchmidt method to
> make them orthogonal. It makes them orthogonal, but these new
> orthogonal eigenvectors are not consistent with the eigenvalues!
> Do you have any idea how to sort out this problem?
> I mean, is there any way to get eigenvalues with the corresponding
> ORTHONORMAL eigenvectors?
>
> Thank you in advance.

When asking a question like this it is much better to provide an example of the phenomenon you believe you have observed. It need not be your real problem; it could be something less complex where essentially the same problem occurs. As it is, we have to guess. It seems very likely to me that the problems you believe you have observed are simply consequences of working with non-exact numbers. I will illustrate this with an example. Let's create a matrix of random integers:

    M = Array[Random[Integer, {1, 6}] &, {4, 4}];

Adding M to its own transpose gives a symmetric matrix:

    A = M + Transpose[M]

    {{6, 9, 9, 9}, {9, 4, 5, 4}, {9, 5, 2, 10}, {9, 4, 10, 10}}

Let's compute the eigenvectors of this matrix:

    vecs = Eigenvectors[A];

These are exact numbers, and if you look at them by evaluating vecs[[1]] and so on, they will be very complicated. However, note that

    FullSimplify[vecs[[3]].vecs[[2]]]

    0

in other words, they really are orthogonal.
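[Editor's note: for readers who work numerically from the start, the same experiment can be sketched in Python/NumPy (this is my addition, not part of the original post). For a real symmetric matrix, numpy.linalg.eigh returns eigenvectors that are already orthonormal to machine precision, which sidesteps the Gram-Schmidt step entirely:]

```python
import numpy as np

# Build a random integer matrix and symmetrize it, mirroring
# M = Array[Random[Integer, {1, 6}] &, {4, 4}]; A = M + Transpose[M]
rng = np.random.default_rng(0)
M = rng.integers(1, 7, size=(4, 4))   # integers in 1..6, like Random[Integer, {1, 6}]
A = M + M.T

# For a real symmetric matrix, eigh returns eigenvalues and a matrix
# whose COLUMNS are orthonormal eigenvectors (up to rounding error).
vals, vecs = np.linalg.eigh(A)

# Orthonormality check: vecs.T @ vecs should be the identity matrix,
# up to floating-point error on the order of 1e-15.
err = np.abs(vecs.T @ vecs - np.eye(4)).max()
print(err < 1e-12)  # True
```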
However, if you replace the matrix we got by a matrix of approximate numbers, the vectors will no longer be orthogonal:

    vecs = N[vecs]

    {{0.9529856912137404, 0.6574618431021122, 0.8028042069793742, 1.},
     {2.8453996876462306, -1.0843877316975596, -3.7352589900772113, 1.},
     {-1.1208504810138278, 1.2692897136665984, -0.9545977431597417, 1.},
     {-0.4878210955975244, -1.0643582605668658, 0.20510787843211642, 1.}}

Now:

    vecs[[3]] . vecs[[2]]

    1.6653345369377348*^-15

is no longer zero, but a very small number. Applying Chop to it we get 0. Now let's make our vectors (approximately) orthonormal:

    P = #/Norm[#] & /@ vecs;

We would expect P.A.Transpose[P] to be a diagonal matrix, but it won't be exactly so:

    P . A . Transpose[P]

    {{29.234760663125755, -1.4286615103984426*^-13, 9.611389246924539*^-14, 2.5628169702774596*^-14},
     {-1.4145618102525984*^-13, -6.0815436387462665, 1.456487713933675*^-16, 3.373374676294516*^-16},
     {9.351371358175255*^-14, -4.342420364370678*^-16, -4.556472906055469, -9.468243517321083*^-16},
     {2.4413370125226212*^-14, 3.4754265094201577*^-16, -1.4716526946101933*^-15, 3.403255881675978}}

But again:

    Chop[%]

    {{29.234760663125755, 0, 0, 0},
     {0, -6.0815436387462665, 0, 0},
     {0, 0, -4.556472906055469, 0},
     {0, 0, 0, 3.403255881675978}}

And we see that we really do have the eigenvalues on the diagonal:

    Eigenvalues[A] // N

    {29.2348, -6.08154, -4.55647, 3.40326}

So everything works as it should, except that one has to take care when using approximate values. In particular, if your original matrix was numerical and ill-conditioned, you may need to use more subtle methods to get correct answers. But before going into that I would have to know exactly what your problem is, because your description so far is too vague.

Andrzej Kozlowski
Tokyo, Japan
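[Editor's note: the diagonalization check above (P.A.Transpose[P] giving the eigenvalues on the diagonal, after Chop) can be reproduced in Python/NumPy. This is my sketch, not the original Mathematica session; the rounding threshold plays the role of Chop:]

```python
import numpy as np

# Random symmetric integer matrix, as in the post.
rng = np.random.default_rng(1)
M = rng.integers(1, 7, size=(4, 4))
A = M + M.T

# eigh returns orthonormal eigenvectors as COLUMNS; transpose so that
# the rows of P are the orthonormal eigenvectors, matching the post's P.
vals, V = np.linalg.eigh(A)
P = V.T

# P.A.Transpose[P] is only approximately diagonal: the off-diagonal
# entries are tiny but nonzero rounding errors.
D = P @ A @ P.T

# Round the tiny entries away (the NumPy analogue of Chop); what is
# left on the diagonal are the eigenvalues.
D_chopped = np.where(np.abs(D) < 1e-8, 0.0, D)
print(np.allclose(np.diag(D_chopped), vals))  # True
```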
- References:
- orthonormal eigenvectors
- From: tkg <tkghosh@mp.okayama-u.ac.jp>