MathGroup Archive 2004


Re: Re: Eigenvalues and eigenvectors of a matrix with nonpolynomial elements.

  • To: mathgroup at smc.vnet.net
  • Subject: [mg51389] Re: [mg51315] Re: [mg51284] Eigenvalues and eigenvectors of a matrix with nonpolynomial elements.
  • From: Daniel Lichtblau <danl at wolfram.com>
  • Date: Fri, 15 Oct 2004 02:48:26 -0400 (EDT)
  • References: <200410141035.GAA14868@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Goyder Dr HGD wrote:
> Goyder Dr HGD wrote:
> 
>>I need to find the eigenvalues and eigenvectors of matrices where the elements depend on a variable, k, in a nonpolynomial manner. Thus, according to my (limited) knowledge of eigensystems, the eigenvalues are the values of k that make the determinant zero and there should be an eigenvector associated with each eigenvalue. A simple warm-up example is given below; my actual cases will be more complicated. I wish to know if the method I have put together below, using engineering rather than maths, is suitable and what accuracy I can expect. 
> 
> 
> Daniel Lichtblau wrote:
> 
> 
>>No, eigenvalues for a matrix mat are values lambda for which
>>
>>mat - lambda*IdentityMatrix[...]
>>
>>is singular (that is, has zero determinant). In your example these
>>would be functions of k. In simplified form I get
>>
>>Out[8]//InputForm=
>>{-1 + I, -1 - I, (-Cosh[k] - Sin[k] -
>>    Sqrt[Cosh[k]^2 - 2*Cosh[k]*Sin[k] + Sin[k]^2 + 4*Cos[k]*Sinh[k]])/2,
>>  (-Cosh[k] - Sin[k] + Sqrt[Cosh[k]^2 - 2*Cosh[k]*Sin[k] + Sin[k]^2 +
>>      4*Cos[k]*Sinh[k]])/2}
>>
>>Daniel Lichtblau
>>Wolfram Research
> 
> Thank you for your fast response. I apologize for not making myself clear. The warm-up example comes from a differential equation eigenvalue problem rather than a simple matrix problem. The differential equation is given below and results in the matrix that I gave previously. Thus, starting from the beginning, we have a differential equation and boundary conditions which will only be satisfied for certain values of k, which I take to be the eigenvalues. The problem is then to find these eigenvalues and the associated eigensolutions. (My actual problem involves sets of differential equations of this form.) The warm-up differential equation is
> 
> In[103]:=
> DSolve[Derivative[4][y][x] - k^4*y[x] == 0, y[x], x]
> 
> Out[103]=
> {{y[x] -> C[2]/E^(k*x) + E^(k*x)*C[4] + C[1]*Cos[k*x] + C[3]*Sin[k*x]}}
> 
> I have translated this solution to the equivalent form
> 
> In[104]:=
> e1 = A1*Cos[k*x] + A2*Sin[k*x] + A3*Cosh[k*x] + A4*Sinh[k*x]; 
> 
> The boundary conditions are
> 
> In[107]:=
> bc = {0 == (e1 /. x -> 0), 0 == (1/k^2)*(D[e1, {x, 2}] /. x -> 0), 0 == (e1 /. x -> 1), 
>     0 == (1/k)*(D[e1, x] /. x -> 1)}; 
> 
> These give rise to a set of linear equations whose coefficient matrix may be found from
> 
> In[108]:=
> e2 = Normal[CoefficientArrays[bc, {A1, A2, A3, A4}]]
> 
> Out[108]=
> {{0, 0, 0, 0}, {{-1, 0, -1, 0}, {1, 0, -1, 0}, {-Cos[k], -Sin[k], -Cosh[k], -Sinh[k]}, 
>    {Sin[k], -Cos[k], -Sinh[k], -Cosh[k]}}}
> 
> The values of k that satisfy this equation I am calling eigenvalues. Each value of {A1, A2, A3, A4} associated with an eigenvalue I am calling an eigenvector. We thus come to the matrix I defined previously, from which my eigenvalues and eigenvectors must be extracted.
> 
> In[109]:=
> e3 = e2[[2]]
> 
> Out[109]=
> {{-1, 0, -1, 0}, {1, 0, -1, 0}, {-Cos[k], -Sin[k], -Cosh[k], -Sinh[k]}, 
>   {Sin[k], -Cos[k], -Sinh[k], -Cosh[k]}}
> 

Okay. I had considered that you might be doing some sort of 
Sturm-Liouville problem, but the setup looked too much like a standard 
linear algebra eigensystem.

So we begin anew with the matrix and its determinant.

mat = {{-1, 0, -1, 0}, {1, 0, -1, 0},
   {-Cos[k], -Sin[k], -Cosh[k], -Sinh[k]},
   {Sin[k], -Cos[k], -Sinh[k], -Cosh[k]}};
det = Det[mat];
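
As a quick sanity check (the exact form of the output may differ a bit
by version), the determinant simplifies to a familiar transcendental
frequency equation, so the roots in k satisfy Tan[k] == Tanh[k]; this
also matches the detpoly that shows up in Out[144] below.

Simplify[det]
(* 2*Cos[k]*Sinh[k] - 2*Cosh[k]*Sin[k] *)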

For the task at hand you can use NullSpace numerically after first 
extracting a root in k.

In[130]:= root = FindRoot[det==0, {k,6,8}]
Out[130]= {k -> 7.06858}

In[132]:= Chop[NullSpace[mat /. root]]
Out[132]= {{0, -0.999999, 0, 0.00120412}}
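
To gauge the quality of that pair, one can check the residual
directly, and the same recipe picks up further eigenvalues with other
FindRoot brackets. A minimal sketch (the bracket {k,10,11} is mine,
chosen near the next sign change of det, not part of the computation
above):

vec = First[NullSpace[mat /. root]];
Chop[(mat /. root) . vec]    (* residual; should be ~ {0,0,0,0} *)

root2 = FindRoot[det == 0, {k, 10, 11}]    (* next root, near 10.21 *)
Chop[NullSpace[mat /. root2]]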

But there is a symbolic approach as well. We set up the symbolic 
expression mat.vec==0 with vec a vector of indeterminates. We 
algebraicize by converting trigs and hyperbolics to ordinary 
variables and adding subsidiary defining relations, e.g. Sin[k]->sk 
and Cos[k]->ck together with sk^2+ck^2==1. Variations on this idea 
involve different algebraic substitutions. For example we could 
convert to exponentials (a sketch of that variant follows the code 
below), or use the standard rational parametrization of the trig and 
hyperbolic functions.

vars = Array[x,Length[mat]];
subs = {Cos[k]->ck,Sin[k]->sk,Cosh[k]->chk,Sinh[k]->shk};
polys1 = mat . vars /. subs;
polys2 = {ck^2+sk^2-1, chk^2-shk^2-1};
detpoly = Det[mat]/.subs;
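
For the exponential variant mentioned above, a minimal sketch, with u
and v standing for E^(I*k) and E^k (these names are mine, not part of
the computation above):

expsubs = {Cos[k] -> (u + 1/u)/2, Sin[k] -> (u - 1/u)/(2*I),
   Cosh[k] -> (v + 1/v)/2, Sinh[k] -> (v - 1/v)/2};
exppolys = Numerator[Together[mat . vars /. expsubs]];
expdet = Numerator[Together[Det[mat] /. expsubs]];

This builds the relations sk^2+ck^2==1 and chk^2-shk^2==1 into the
substitution itself, at the cost of introducing I, so one then works
over the Gaussian rationals.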

We could "normalize" the eigenvector with a relation that the sum of 
squares of its components must be unity. As it will lead to a simpler 
result, we instead insist that a particular component be unity. This 
must be done carefully as it is is not really a "generic" approach. So 
we use the earlier numeric computation to choose one of the components, 
the second one, say, that did not vanish.

In[144]:= InputForm[polys = Join[polys1,polys2,{x[2]-1,detpoly}]]
Out[144]//InputForm=
{-x[1] - x[3], x[1] - x[3], -(ck*x[1]) - sk*x[2] - chk*x[3] - shk*x[4],
  sk*x[1] - ck*x[2] - shk*x[3] - chk*x[4], -1 + ck^2 + sk^2,
  -1 + chk^2 - shk^2, -1 + x[2], 2*ck*shk - 2*chk*sk}

In[146]:= InputForm[soln = vars /. Solve[polys==0, vars]]
Out[146]//InputForm= {{0, 1, 0, -(chk*ck) + shk*sk}}
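
For comparison, here is the sum-of-squares normalization mentioned
earlier; a sketch only (expect a messier result involving radicals,
which is why fixing a component was preferable):

polysalt = Join[polys1, polys2, {vars . vars - 1, detpoly}];
solnalt = vars /. Solve[polysalt == 0, vars]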


Daniel Lichtblau
Wolfram Research

