next up previous contents index
Next: Conditioning Up: Non-Hermitian Eigenproblems  J. Demmel Previous: Equivalences (Similarities)   Contents   Index


We discuss four eigendecompositions, i.e., four matrices that are similar to $A$ and for which the eigenproblem is simpler to solve. The first, diagonal form, exists only when $A$ has $n$ independent eigenvectors. The second, Jordan form, generalizes diagonal form to all matrices, but it can be very ill-conditioned (indeed, it can change discontinuously), so for numerical purposes we typically use the third, Schur form, which is cheaper and more stable to compute. We briefly mention a fourth, which we call Jordan-Schur form, that is as stable to compute as the Schur form yet recovers some of the detailed information about invariant subspaces that the Jordan form provides.

Define $\Lambda = {\rm diag}(\lambda_1 ,\ldots, \lambda_n )$. If there are $n$ independent right eigenvectors $x_1,\ldots,x_n$, we define $X = [x_1,\ldots,x_n]$. $X$ is called an eigenvector matrix of $A$. The $n$ equalities $A x_i = \lambda_i x_i$ for $i=1,\ldots,n$ may also be written $AX = X \Lambda$ or $A = X \Lambda X^{-1}$. The factorization

\begin{displaymath}A = X \Lambda X^{-1}\end{displaymath}

is called the diagonal form of $A$. Note that $X^{-1}A = \Lambda X^{-1}$. Letting $y_i^*$ denote the $i$th row of $X^{-1}$, we see that $y_i^* A = \lambda_i y_i^*$. In other words, the rows of $X^{-1}$ are the left eigenvectors of $A$. Thus, if $A$ has $n$ independent eigenvectors, then it is similar to the diagonal matrix $\Lambda$. In this case we call $A$ diagonalizable.
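A minimal numerical sketch of the diagonal form, using `numpy.linalg.eig` (the 2 by 2 matrix is chosen arbitrarily for illustration):

```python
import numpy as np

# A small diagonalizable matrix (illustrative; eigenvalues are 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, X = np.linalg.eig(A)      # eigenvalues and right eigenvectors
Lam = np.diag(lam)
Xinv = np.linalg.inv(X)

# Diagonal form: A = X Lam X^{-1}.
assert np.allclose(A, X @ Lam @ Xinv)

# The rows of X^{-1} are left eigenvectors: y_i^* A = lambda_i y_i^*.
for i in range(2):
    y = Xinv[i, :]
    assert np.allclose(y @ A, lam[i] * y)
```

The same check works for any diagonalizable $A$; for a defective matrix, `eig` still returns, but the columns of `X` are (numerically) dependent and `inv(X)` is unreliable.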

If we take a subset of $k$ columns of $X$ (say $\hat{X}= X(:,[2,3,5])$ = columns 2, 3, and 5), these columns span an invariant subspace of $A$. If we take the corresponding submatrix $\hat{\Lambda}= {\rm diag}(\lambda_2 , \lambda_3 , \lambda_5 )$ of $\Lambda$, and the corresponding three rows of $X^{-1}$ (say $\hat{Y}^* = X^{-1}([2,3,5],:)$), we can write the corresponding partial diagonal form as $A \hat{X}= \hat{X}\hat{\Lambda}$ or $\hat{Y}^* A \hat{X}= \hat{\Lambda}$. If the columns in $\hat{X}$ are replaced by $k$ different vectors spanning the same invariant subspace, we get a different partial eigendecomposition $A \check{X}= \check{X}\check{A}$, where $\check{A}$ is a $k$ by $k$ matrix whose eigenvalues are those of $\hat{\Lambda}$, though $\check{A}$ may not be diagonal. Similar procedures produce partial eigendecompositions for the other eigendecompositions discussed below.
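The partial diagonal form above can be checked numerically; here is a sketch using a generic 6 by 6 matrix (the matrix and the column choice are illustrative, with the text's columns 2, 3, 5 written 0-based):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))    # a generic matrix (illustrative)

lam, X = np.linalg.eig(A)
Xinv = np.linalg.inv(X)

idx = [1, 2, 4]                    # 0-based version of columns 2, 3, 5
Xhat = X[:, idx]                   # spans an invariant subspace of A
Lhat = np.diag(lam[idx])           # corresponding eigenvalues
Yhat = Xinv[idx, :]                # corresponding rows of X^{-1}

# Partial diagonal form: A Xhat = Xhat Lhat and Yhat A Xhat = Lhat.
assert np.allclose(A @ Xhat, Xhat @ Lhat)
assert np.allclose(Yhat @ A @ Xhat, Lhat)
```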

If all the $\lambda_i$ are distinct, there are $n$ independent eigenvectors, and the diagonal form exists. This is the simplest and most common case. For example, if one picks a matrix ``at random,'' the probability is 1 that its eigenvalues are distinct.
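One can observe this empirically; the sketch below draws a matrix with Gaussian entries (size and seed arbitrary) and checks that all pairwise eigenvalue gaps are well away from zero:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))    # a "random" matrix (illustrative)
lam = np.linalg.eigvals(A)

# With probability 1, a random matrix has distinct eigenvalues, so
# every pairwise gap |lambda_i - lambda_j| should be clearly nonzero.
gaps = [abs(lam[i] - lam[j]) for i in range(8) for j in range(i + 1, 8)]
assert min(gaps) > 1e-8
```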

A diagonal form of the matrix $A(0)$ in (2.3) does not exist, since it has just one independent eigenvector. Instead, we can compute its Jordan form, which is a decomposition

\begin{displaymath}A = XJX^{-1},\end{displaymath}

where $J$ is a block-diagonal matrix, i.e., a matrix of the form $J = {\rm diag} (J_1 ,\ldots,J_e)$, where each $J_i$ is a square matrix called a Jordan block. Each $J_i$ is upper triangular with its single eigenvalue $\lambda_i$ on the diagonal, and ones on the first superdiagonal. $J_i$ has a single independent right and left eigenvector. $A(0)$ is a 2 by 2 Jordan block with its single eigenvalue at 0, and $A(0) + \lambda_i I$ is a 2 by 2 Jordan block with its single eigenvalue at $\lambda_i$. It can be shown that the Jordan form explicitly describes all possible invariant subspaces of $A$ as being spanned by certain subsets of the columns of $X$ [187].
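The defectiveness of a Jordan block can be verified numerically: the dimension of the eigenspace for $\lambda$ is $n - {\rm rank}(J - \lambda I)$, which is 1 for a single block. A short sketch (the shift value is arbitrary):

```python
import numpy as np

# A 2 by 2 Jordan block with eigenvalue 0 (the A(0) of the text).
J = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Eigenspace dimension = n - rank(J - 0*I) = 2 - 1 = 1, so J has only
# one independent eigenvector despite its double eigenvalue.
nullity = J.shape[0] - np.linalg.matrix_rank(J)
assert nullity == 1

# Shifting by mu*I gives a Jordan block with single eigenvalue mu.
mu = 3.0                            # illustrative shift
Jmu = J + mu * np.eye(2)
assert np.allclose(np.linalg.eigvals(Jmu), [mu, mu])
```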

Unfortunately, the Jordan form is generally not suitable for numerical computation. Here are two reasons. First, it can change discontinuously as $A$ changes. For example, the matrix $J$ for $A(\epsilon)$ with $\epsilon \neq 0$ is ${\rm diag}(+ \epsilon , - \epsilon)$, but for $A(0)$, $J=A(0)$ itself. Second, the eigenvector matrix $X$ can be very ill-conditioned, i.e., hard to invert accurately. In the case of $A(\epsilon)$, the condition number of $X$ grows like $1/\epsilon$.
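The growth of the condition number can be seen numerically. Since (2.3) is not reproduced in this section, the sketch below uses an assumed stand-in $A(\epsilon) = \left[\begin{smallmatrix} \epsilon & 1 \\ 0 & -\epsilon \end{smallmatrix}\right]$, which reduces to a 2 by 2 Jordan block at $\epsilon = 0$ and has eigenvalues $\pm\epsilon$, consistent with the behavior described above:

```python
import numpy as np

def A(eps):
    # Assumed stand-in for the A(eps) of (2.3): a perturbed 2x2
    # Jordan block with eigenvalues +eps and -eps.
    return np.array([[eps, 1.0],
                     [0.0, -eps]])

for eps in [1e-2, 1e-4, 1e-6]:
    _, X = np.linalg.eig(A(eps))
    # The two eigenvectors become parallel as eps -> 0, so the
    # condition number of X blows up like 1/eps.
    print(eps, np.linalg.cond(X))
```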

So instead we use eigendecompositions of the form

\begin{displaymath}A = QTQ^*,\end{displaymath}

where $Q$ is a unitary (or orthogonal) similarity transformation, and $T$ is triangular (or quasi-triangular). This is called the Schur form of $A$. When $A$ is complex, or real with all real eigenvalues, $T$ is triangular with the eigenvalues of $A$ on its diagonal. When $A$ is real with complex eigenvalues and $Q$ is real and orthogonal, $T$ cannot be real and triangular; instead, we allow 2 by 2 blocks with complex-conjugate eigenvalues to appear on the diagonal of $T$, which we call quasi-triangular form. For simplicity of presentation, we will consider the complex case, so that $T$ is triangular. Let $\lambda_1 = T_{11}, \ldots, \lambda_n = T_{nn}$ be the eigenvalues in the order they appear on the diagonal of $T$; there is a (different) Schur form for every order. The columns of $Q$ are called Schur vectors. The Schur form does not give explicit eigenvectors and bases for all invariant subspaces the way the diagonal and Jordan forms do, but it can be computed quite stably. The leading $k$ columns of $Q$ span the invariant subspace for eigenvalues $\lambda_1$ through $\lambda_k$; all other invariant subspaces require further computation. For example, eigenvectors can be computed by solving a triangular system of equations taken from $T$ and multiplying by $Q$ [114].
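A sketch of the Schur form and of the triangular-solve recipe for an eigenvector, using `scipy.linalg.schur` (the 5 by 5 complex test matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))

# Complex Schur form: A = Q T Q^*, with T upper triangular.
T, Q = schur(A, output='complex')
assert np.allclose(A, Q @ T @ Q.conj().T)
assert np.allclose(T, np.triu(T))

# The leading k columns of Q span an invariant subspace.
k = 2
assert np.allclose(A @ Q[:, :k], Q[:, :k] @ T[:k, :k])

# An eigenvector for lambda = T[k,k]: solve the triangular system
# (T[:k,:k] - lambda I) y = -T[:k,k], then map back with Q.
lam = T[k, k]
y = np.linalg.solve(T[:k, :k] - lam * np.eye(k), -T[:k, k])
v = Q @ np.concatenate([y, [1.0], np.zeros(5 - k - 1)])
assert np.allclose(A @ v, lam * v)
```

The triangular solve works because $[y^T, 1, 0, \ldots, 0]^T$ is an eigenvector of $T$ exactly when $(T_{1:k,1:k} - \lambda I)y = -T_{1:k,k+1}$, and eigenvectors of $T$ map to eigenvectors of $A$ under $Q$.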

Finally, we consider the Jordan-Schur form. It is quite complicated, so we only summarize its properties here. Like the Schur form, it uses only unitary (orthogonal) transformations, and so can be computed stably. Like the Jordan form, it shows explicitly what the sizes of the Jordan blocks are and gives explicit bases for many more invariant subspaces than the Schur form does.

Many textbooks give explicit solutions for problems such as computing the matrix exponential $e^A$ in terms of the Jordan form of $A$ [114]. These methods should be avoided numerically, because of the difficulty of computing the Jordan form. Nearly all such problems have alternative solutions in terms of the Schur form, or in some cases the Jordan-Schur form.
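As one illustration, `scipy.linalg.funm` evaluates a matrix function via a Schur-based (Schur-Parlett style) algorithm; below it is compared against the diagonal-form formula $f(A) = X f(\Lambda) X^{-1}$, which is safe here only because the (illustrative) matrix is diagonalizable with well-separated eigenvalues:

```python
import numpy as np
from scipy.linalg import funm

# An illustrative diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Schur-based evaluation of e^A, avoiding the Jordan form entirely.
E_schur = funm(A, np.exp)

# Diagonal-form route: e^A = X e^Lam X^{-1} (valid since A is
# diagonalizable; it breaks down badly near defective matrices).
lam, X = np.linalg.eig(A)
E_diag = X @ np.diag(np.exp(lam)) @ np.linalg.inv(X)

assert np.allclose(E_schur, E_diag)
```

Since ${\rm tr}\,e^A = \sum_i e^{\lambda_i}$, the trace of the result should equal $e^5 + e^2$.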

Susan Blackford 2000-11-20