next up previous contents index
Next: Algorithm Up: Symmetric Indefinite Lanczos Method Previous: Symmetric Indefinite Lanczos Method   Contents   Index

Some Properties of Symmetric Indefinite Matrix Pairs

The symmetry of $A$ and $B$ is purely an algebraic property and is not sufficient to ensure any of the special mathematical properties enjoyed by a definite matrix pencil, such as those discussed in §2.3. In fact, it can be shown that any real square matrix $F$ may be written as $F = AB^{-1}$ or $F = B^{-1} A$, where $A$ and $B$ are suitable symmetric matrices; for example, see [353].

The eigenvalues of a definite matrix pencil are all real, but an indefinite pencil may have complex eigenvalues. For example, when

\begin{displaymath}
A = \left[\begin{array}{cc}0&10\\ 10&20\end{array}\right], \quad
B = \left[\begin{array}{cc}-1&0\\ 0&10\end{array}\right],
\end{displaymath}

the eigenvalues of $\{A,B\}$ are the complex conjugate pair $1\pm 3i$. A further distinction from the definite case is that an indefinite matrix pencil may not have a complete set of eigenvectors. As an example, consider the pencil

\begin{displaymath}
\left[\begin{array}{cc}0&1\\ 1&2\end{array}\right]
- \lambda \left[\begin{array}{cc}0&1\\ 1&1\end{array}\right],
\end{displaymath}

which has one eigenvalue $\lambda = 1$ (multiplicity 2) and only one eigenvector, $x = [{1 \atop 0}]$.
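Both small examples above are easy to check numerically. The following sketch (using NumPy, which is not part of this book's methods and is assumed here purely for illustration) confirms that the first pencil has the complex conjugate pair $1\pm 3i$, and that for the second pencil $B^{-1}A$ is a $2\times 2$ Jordan block, so the double eigenvalue $\lambda=1$ carries only one eigenvector.

```python
import numpy as np

# First pencil: an indefinite B produces a complex conjugate eigenvalue pair.
A1 = np.array([[0.0, 10.0], [10.0, 20.0]])
B1 = np.array([[-1.0, 0.0], [0.0, 10.0]])
# The eigenvalues of the pencil {A, B} are those of B^{-1} A.
vals1 = np.sort_complex(np.linalg.eigvals(np.linalg.solve(B1, A1)))
print(vals1)  # approximately [1-3j, 1+3j]

# Second pencil: B^{-1} A is the Jordan block [[1, 1], [0, 1]], so the
# eigenvalue 1 has multiplicity 2 but only the eigenvector [1, 0]^T.
A2 = np.array([[0.0, 1.0], [1.0, 2.0]])
B2 = np.array([[0.0, 1.0], [1.0, 1.0]])
F2 = np.linalg.solve(B2, A2)
vals2, vecs2 = np.linalg.eig(F2)
print(vals2)                      # eigenvalues (numerically) equal to 1
print(abs(np.linalg.det(vecs2)))  # near 0: the computed eigenvectors are parallel
```

The near-zero determinant of the eigenvector matrix is the numerical symptom of defectiveness: a diagonalizable matrix would have a well-conditioned, invertible set of eigenvectors.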

From §2.3, we know that when $B$ is positive definite we can find a matrix of eigenvectors $X$ and a diagonal matrix of eigenvalues $\Lambda$ such that $AX = BX \Lambda$ with $X^TBX=I$. The $B$ inner product $(x,y)_B$ forms a true inner product, and $(x,x)_B^{1/2}$ is a norm.

When $B$ is indefinite and nonsingular and if $B^{-1}A$ is not defective (i.e., no eigenvectors are missing), we can find a full set of eigenvectors, $X$. The equation $AX = BX \Lambda$ still holds, and the eigenvectors can be chosen so that $X^TBX=J$, where $J$ is a diagonal matrix with $1$ and $-1$ on the diagonal (note that it is transpose, not conjugate transpose, even though some vectors can be complex). The $B$ inner product $(x,y)_B$ is an indefinite inner product or pseudo-inner product, and $(x,x)_B$ can be used for normalizing purposes. Unlike the positive definite case, there is a set of vectors having pseudolength zero (as measured by $(x,x)_B$). In fact, it is possible for an eigenvector $x$ to satisfy $x^TAx=x^TBx=0$. This implies that the Rayleigh quotient

\begin{displaymath}\rho = \frac{x^TAx}{x^TBx}\end{displaymath}

is undefined. This phenomenon is illustrated by the diagonal matrices

\begin{displaymath}A = {\rm diag}(-1,1,1), \quad B = {\rm diag}(1,1,-1).\end{displaymath}

The vector $x = \left[ \begin{array}{ccc} 1 & 0 & 1 \end{array} \right]^T$ is an eigenvector of $\{A,B\}$ corresponding to the multiple eigenvalue $\lambda = -1$, but $x^TAx=x^TBx=0$.
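This example can be verified directly. The sketch below (NumPy is assumed for illustration only) checks that $x$ satisfies $Ax = -Bx$, so it is an eigenvector for $\lambda=-1$, while both quadratic forms in the Rayleigh quotient vanish exactly.

```python
import numpy as np

A = np.diag([-1.0, 1.0, 1.0])
B = np.diag([1.0, 1.0, -1.0])
x = np.array([1.0, 0.0, 1.0])

# x is an eigenvector: A x = lambda B x with lambda = -1.
print(np.allclose(A @ x, -(B @ x)))  # True

# Both pseudolengths vanish, so the Rayleigh quotient x^T A x / x^T B x
# is undefined at this eigenvector.
print(x @ A @ x, x @ B @ x)  # 0.0 0.0
```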

The analogy between the definite and the indefinite cases can be taken further. When $B$ is positive definite, $\tilde{u}$ is an approximate eigenvector, and $\tilde{\lambda}$ is an approximate eigenvalue, we have the standard residual bound:

\begin{displaymath}
\vert\lambda-\tilde{\lambda}\vert \le \frac{\Vert(A-\tilde{\lambda} B)\tilde{u}\Vert _{B^{-1}}}{\Vert\tilde{u}\Vert _B},
\end{displaymath}

where $\Vert\cdot \Vert _{B^{-1}}$ is the norm with respect to $B^{-1}$, i.e., $\Vert x \Vert _{B^{-1}}=(x^{\ast}B^{-1}x)^{1/2}$ (similarly for the other norm). Also see §5.7. When $B$ is indefinite and nonsingular, $X^TBX=J$, and $B^{-1}A$ is not defective, one can prove that

\begin{displaymath}
\vert\lambda-\tilde{\lambda}\vert \le \frac{\Vert(A-\tilde{\lambda} B)\tilde{u}\Vert _{\bar{X}X^{T}}}{\Vert\tilde{u}\Vert _{(XX^{\ast})^{-1}}};
\end{displaymath}

note that $B^{-1}=XX^{\ast}$ in the definite case. A similar bound is given by

\begin{displaymath}
\vert\lambda-\tilde{\lambda}\vert \le \frac{\Vert(A-\tilde{\lambda} B)\tilde{u}\Vert _2}{\Vert\tilde{u}\Vert _2}\,
\Vert B^{-1}\Vert _2 \Vert X\Vert _2\Vert X^{-1}\Vert _2.
\end{displaymath}

Even though these bounds are not directly computable, they imply that a small residual indicates an accurate approximate eigenvalue. When $B$ is singular there are similar bounds; see [161].
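The second bound can be checked numerically on the $2\times 2$ pencil from the start of this section. In the sketch below (a sanity check, not a proof; NumPy is assumed for illustration, with $X$ taken as the eigenvector matrix returned by `eig` and the approximate eigenpair formed by artificially perturbing a true one), the computed error never exceeds the right-hand side.

```python
import numpy as np

A = np.array([[0.0, 10.0], [10.0, 20.0]])
B = np.array([[-1.0, 0.0], [0.0, 10.0]])

# Eigenvalues/eigenvectors of the pencil {A, B} via B^{-1} A.
F = np.linalg.solve(B, A)
lam, X = np.linalg.eig(F)

# Artificially perturbed approximate eigenpair (mu, u).
rng = np.random.default_rng(0)
u = X[:, 0] + 1e-4 * rng.standard_normal(2)
mu = lam[0] + 1e-4

lhs = np.min(np.abs(lam - mu))                 # distance to nearest eigenvalue
residual = np.linalg.norm((A - mu * B) @ u) / np.linalg.norm(u)
rhs = residual * np.linalg.norm(np.linalg.inv(B), 2) * np.linalg.cond(X, 2)
print(lhs <= rhs)  # True: the residual bound holds
```

This is a Bauer-Fike-type bound applied to $B^{-1}A$, which is why the condition number of the eigenvector matrix $X$ appears on the right-hand side.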

If both $A$ and $B$ are singular, or close to singular, worse problems may occur. Assume that there is a nonzero vector $z$ such that $Az=Bz=0$; then any complex number $\lambda$ is an eigenvalue. A more general case is illustrated in the example below:

\begin{displaymath}
A = \left[\begin{array}{ccc}1 & 0 & 0\\ 0 & -1 & 0\\ 0 & 0 & 0\end{array}\right], \quad
B = \left[\begin{array}{ccc}1 & 0 & 1\\ 0 & -1 & 1\\ 1 & 1 & 0\end{array}\right].
\end{displaymath}

The characteristic polynomial $\det (A-\lambda B)$ is identically equal to zero, so any complex number is an eigenvalue; we have a singular matrix pencil. The null spaces of $A$ and $B$ are one-dimensional and are spanned by $z_a=[0~0~1]^T$ and $z_b=[-1~1~1]^T$, respectively. Note that the intersection of the null spaces is just the zero vector. The singularity of the problem implies that $z_a^TBz_a=0$ (and $z_b^TAz_b=0$). To solve a singular problem using numerical methods, the singularity must be removed. It is a hopeless task to attack the original problem, as can be seen by perturbing $A$ and $B$. Set $a_{3,3}=\epsilon$ and add $\delta$ to $b_{1,1}$; then $\det (A-\lambda B)=\delta\lambda^3+\epsilon(\lambda-1)(1-\lambda(1+\delta))$. So if $\epsilon$ or $\delta$ is nonzero this is no longer a singular pencil. Taking $\delta=0$, for any $\epsilon> 0$ we have the eigenvalues $1,1$, and $\infty$. If $\delta>0$ and $\epsilon = 0$, the eigenvalues are $0,0$, and $0$. When $\delta=\epsilon\rightarrow 0$ the eigenvalues approach roughly $0.56984$ and $0.21508\pm1.30714i$. Note that the perturbations can be made arbitrarily small. A general discussion of singular pencils can be found in §8.7.
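The perturbation experiment above can be reproduced numerically. The following sketch (NumPy is assumed purely for illustration) sets $\delta=\epsilon=10^{-4}$ and checks that the eigenvalues of the regularized pencil are close to the limiting values quoted in the text.

```python
import numpy as np

eps = delta = 1e-4

# Perturbed pencil: a_{3,3} = eps, and delta added to b_{1,1}.
A = np.diag([1.0, -1.0, eps])
B = np.array([[1.0 + delta, 0.0, 1.0],
              [0.0, -1.0, 1.0],
              [1.0, 1.0, 0.0]])

# With delta != 0, B is nonsingular and the pencil is regular.
vals = np.linalg.eigvals(np.linalg.solve(B, A))

# Limiting eigenvalues quoted in the text for delta = eps -> 0.
targets = [0.56984, 0.21508 + 1.30714j, 0.21508 - 1.30714j]
for t in targets:
    print(np.min(np.abs(vals - t)) < 1e-2)  # True for each limiting value
```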

In practice, if we have a singular pencil, $A - \lambda B$ is singular for every $\lambda$, and a good LU factorization routine should give a warning (when used in steps (a) and (b) of §8.6.2 below). Another sign of a singular pencil is that the eigenvalue routine produces different, seemingly random eigenvalues on each run on the same problem.
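The LU-based symptom can be illustrated directly: for the singular pencil above, $A-\sigma B$ is singular for every shift $\sigma$, so the determinant computed from an LU factorization is zero up to roundoff no matter which $\sigma$ is tried. The sketch below (NumPy is assumed for illustration; the random shifts are arbitrary) demonstrates this.

```python
import numpy as np

# The singular pencil from the example above.
A = np.diag([1.0, -1.0, 0.0])
B = np.array([[1.0, 0.0, 1.0],
              [0.0, -1.0, 1.0],
              [1.0, 1.0, 0.0]])

# Try several arbitrary shifts: A - sigma*B is singular for all of them.
rng = np.random.default_rng(1)
shifts = rng.uniform(-10.0, 10.0, size=5)
dets = [np.linalg.det(A - s * B) for s in shifts]
worst = max(abs(d) for d in dets)
print(worst < 1e-8)  # True: every shifted matrix is numerically singular
```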

Susan Blackford 2000-11-20