MathGroup Archive 2008

Re: LU Decomposition w/o Pivoting

  • To: mathgroup at smc.vnet.net
  • Subject: [mg91784] Re: [mg91711] LU Decomposition w/o Pivoting
  • From: danl at wolfram.com
  • Date: Sun, 7 Sep 2008 22:54:19 -0400 (EDT)
  • References: <200809060607.CAA19726@smc.vnet.net>

> Hello,
>
> I am trying to calculate the signs of the eigenvalues of a bunch of
> moderately large matrices (thousand by thousand), which takes a long
> time.  It has come to my attention that the diagonal terms in an LU
> decomposition have the same signs as the eigenvalues.  So I'm hoping
> that the LU decomposition is faster than calculating the actual
> eigenvalues.  I tried testing this out using Mathematica's built-in
> function LUDecomposition[] and after a large amount of monkeying
> around, I realized the signs of the diagonal terms only match the
> eigenvalues if there is no pivoting employed.  Unfortunately, I cannot
> figure out how to turn off pivoting in LUDecomposition[].  While I am
> sure I could write my own LU decomposition function, I'm also sure it
> would not be nearly as elegant and computationally efficient as the
> built-in function.  So does anyone know how to turn off pivoting in
> LUDecomposition[] or have optimized code to perform an LU decomposition?
>
> -Ben Reedlunn

I am not familiar with that result concerning the signs of the LU
decomposition diagonal, so I will take it on faith that it is correct.
Everything else below is based on that assumption.

I assume your matrices are symmetric, or at least that they have all real
eigenvalues. Forcing "no pivoting" is not always possible, since different
matrix types use different pivot strategies. For real values the pivot is
selected as the largest element in magnitude, so if the matrix tends to
have values from a "nice" distribution then you can do as follows.

(1) Multiply each row by some suitably chosen value.
(2) Take the LU decomposition.
(3) Extract your diagonal.

The hope is that, by choosing the row multipliers well, you force each
pivot to come from the intended row. It is an exercise to show that this
is sufficiently closely related to a pivot-free decomposition of the
original matrix that the signs on the diagonal are the same. Experiments
I tried, at low dimension, used Reverse[10^Range[Length[mat]]]*mat.
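
For concreteness, here is roughly what such a low-dimension experiment
might look like (just a sketch; the matrix, its size, and the weights
are arbitrary stand-ins):

n = 5;
mat = RandomReal[{-1, 1}, {n, n}];
mat = (mat + Transpose[mat])/2;   (* symmetric machine-precision test matrix *)

weights = N[Reverse[10^Range[n]]];   (* heaviest weight on row 1, lightest on row n *)
{lu, perm, cond} = LUDecomposition[weights*mat];

perm === Range[n]              (* True means the weighting suppressed row exchanges *)
Sort[Sign[Diagonal[lu]]]       (* diagonal of U; L carries the unit diagonal *)
Sort[Sign[Eigenvalues[mat]]]   (* compare the sign counts *)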

There are a few problems with this approach. One is that it is appropriate
for matrices of inexact numbers, but not exact ones (for the latter I
think the pivot strategy might be to choose the smallest). Another is that
it will, for sufficiently large dimension, force you into bignums, which
usually eradicates any speed advantage. A third is that it really is only
a heuristic, and will not always prevent nontrivial pivoting, even when
the matrix entries are taken at random from a well behaved distribution.

Plan B is to add a large multiple m of the identity matrix to your matrix,
take the diagonal of either the LU or the Cholesky decomposition (use the
latter if the matrix is symmetric, and square the resulting diagonal), and
subtract m from the resulting values. This is efficient and close to
foolproof, in the sense of not being heuristic in any way; the choice of
multiplier can be figured out a priori from the input sizes (though I'm
not showing details of how to do that). I have not tried to prove in all
detail that this method is correct, but I believe it is.
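
A literal transcription of that recipe, for the symmetric case, might
look like the sketch below. The shift m used here is only a plausible
guess (one plus the infinity norm, which exceeds the largest eigenvalue
in magnitude); the a priori choice of m that the method really calls for
is the part I am not showing, so the comparison at the end may or may not
come out right for this particular m.

n = 5;
mat = RandomReal[{-1, 1}, {n, n}];
mat = (mat + Transpose[mat])/2;   (* symmetric test matrix *)

m = 1. + Norm[mat, Infinity];     (* guessed shift making mat + m*IdentityMatrix[n] positive definite *)
chol = CholeskyDecomposition[mat + m*IdentityMatrix[n]];

Sort[Sign[Diagonal[chol]^2 - m]]   (* square the Cholesky diagonal, subtract the shift *)
Sort[Sign[Eigenvalues[mat]]]       (* the actual eigenvalue signs, for comparison *)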

Daniel Lichtblau
Wolfram Research



