Re: Eigenvalues computation
From: Dirk Laurie
Subject: Re: Eigenvalues computation
Date: Tue, 28 Jul 1998 19:52:43 +0200 (SAT)
In response to some discussion by
A. Scottedward Hodel, Daniel Heiserer, and Daniel Tourde (proposer)
on eigenvalues and eigenvectors as continuous functions of
a matrix.
The question is a comfortable one only when the matrix is
normal (i.e. A'*A = A*A'). In that case eigenvalues are
very well-behaved functions of the matrix.
If a normal matrix A is close to another normal matrix B which
has eigenvector matrix Q (in this case known to be unitary,
i.e. inv(Q)=Q') then Q'*A*Q will be close to a diagonal matrix,
and in fact its diagonal will be extra close to the eigenvalues
of A. More precisely, if A and B differ by O(epsilon), then the
off-diagonal part of Q'*A*Q is of magnitude O(epsilon) and
diag(Q'*A*Q) differs from the eigenvalues of A by O(epsilon^2).
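This is easy to check numerically. The thread's code is Octave, but
here is a small sketch in Python with NumPy (the matrices and epsilon
are made up for illustration): B is a random symmetric (hence normal)
matrix, A is a nearby symmetric matrix, and Q comes from B.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
B = M + M.T                       # symmetric, hence normal
eps = 1e-4
E = rng.standard_normal((n, n))
A = B + eps * (E + E.T)           # nearby normal (symmetric) matrix

_, Q = np.linalg.eigh(B)          # Q is orthogonal for symmetric B
T = Q.T @ A @ Q                   # nearly diagonal

off = T - np.diag(np.diag(T))     # off-diagonal part: expect O(eps)
lam_A = np.linalg.eigh(A)[0]      # eigenvalues of A, ascending
# diagonal of T vs. true eigenvalues of A: expect O(eps^2)
diag_err = np.max(np.abs(np.sort(np.diag(T)) - lam_A))

print(np.max(np.abs(off)))        # roughly the size of eps
print(diag_err)                   # roughly the size of eps^2
```

With eps = 1e-4 the off-diagonal entries come out around 1e-4 while
the diagonal agrees with eig(A) to roughly 1e-8, as the O(epsilon^2)
statement predicts (assuming the eigenvalue gaps of B are O(1)).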
When A is not normal, the eigenvalues are still continuous
functions of A, but these functions may be extremely ill-behaved.
Even in the case where A is normal, it can be numerically delicate
to trace an eigenvector as a function of A when A has nearly
equal eigenvalues. If you use the method
[S,X]=eig(A); [y,p]=sort(diag(X)); S=S(:,p); X=X(p,p);
you get the eigenvalues sorted but the corresponding eigenvectors
of A and B may nevertheless differ drastically.
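A minimal Python/NumPy sketch of that failure mode (the 2x2 matrices
are made up for illustration, not from the original discussion): A and
B below differ by O(eps), yet the eigenvectors of A sit at 45 degrees
to those of B, so sorting the eigenvalues cannot pair the eigenvectors
up.

```python
import numpy as np

eps = 1e-6
A = np.array([[1.0, eps], [eps, 1.0]])              # eigenvectors at 45 degrees
B = np.array([[1.0 + eps, 0.0], [0.0, 1.0 - eps]])  # eigenvectors along the axes

# ||A - B|| = O(eps), yet no eigenvector of A is close to one of B.
_, VA = np.linalg.eigh(A)
_, VB = np.linalg.eigh(B)
overlap = np.abs(VA.T @ VB)   # |cosine| of the angles between eigenvectors
print(overlap)                # every entry is about 1/sqrt(2) = 0.707
```

The nearly equal eigenvalues (gap 2*eps) are what makes the
eigenvectors this sensitive; with well-separated eigenvalues the
overlap matrix would be close to a permutation of the identity.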
These issues are explained clearly in the classic books by
J. H. Wilkinson, "The Algebraic Eigenvalue Problem" and "Rounding
Errors in Algebraic Processes", as well as in later works such as
"Accuracy and Stability of Numerical Algorithms" by Nicholas J. Higham.
-- Dirk Laurie