
Implicit Restart

An unfortunate aspect of the Lanczos process is that there is no way to determine in advance how many steps will be needed to determine the eigenvalues of interest within a specified accuracy. The required number of steps is determined by the distribution of the eigenvalues $\lambda(A)$ and by the choice of starting vector $v_1$. In many cases, convergence will not occur until $k$ gets very large, and maintaining the numerical orthogonality of $ V_k $, at a cost of $O(nk^2)$ floating point operations, quickly becomes intractable for large $k$.

There is another way to maintain orthogonality: one may limit the size of the basis set and use a restarting scheme. Restarting means replacing the starting vector $v_1$ with an ``improved'' starting vector $v_1^+$ and computing a new Lanczos factorization with the new vector. The structure of the residual vector $r_k$ serves as a guide: in theory, $r_k$ will vanish if $v_1$ is a linear combination of $k$ eigenvectors of $A$. Our goal is to make this happen iteratively. In the symmetric case, these eigenvectors are orthogonal and hence form an orthonormal basis for an invariant subspace.
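The claim about the residual can be checked numerically. The following sketch (Python/NumPy; the helper `lanczos`, the random test matrix, and the use of full reorthogonalization are illustrative choices, not part of the template) runs $k$ Lanczos steps once from a random starting vector and once from a vector lying in the span of $k$ eigenvectors:

```python
import numpy as np

def lanczos(A, v1, k):
    """k Lanczos steps with full reorthogonalization (illustrative helper).
    Returns the basis V, the tridiagonal T, and the residual vector r_k."""
    n = A.shape[0]
    V = np.zeros((n, k))
    T = np.zeros((k, k))
    r = v1
    for j in range(k):
        b = np.linalg.norm(r)
        V[:, j] = r / b
        if j > 0:
            T[j, j - 1] = T[j - 1, j] = b
        w = A @ V[:, j]
        T[j, j] = V[:, j] @ w
        r = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)  # orthogonalize against V
        r -= V[:, :j + 1] @ (V[:, :j + 1].T @ r)     # one reorthogonalization pass
    return V, T, r

rng = np.random.default_rng(0)
n, k = 50, 4
A = rng.standard_normal((n, n)); A = (A + A.T) / 2    # symmetric test matrix
lam, X = np.linalg.eigh(A)

v_random = rng.standard_normal(n)
v_invariant = X[:, :k] @ rng.standard_normal(k)       # combination of k eigenvectors

print(np.linalg.norm(lanczos(A, v_random, k)[2]))     # O(1): no early termination
print(np.linalg.norm(lanczos(A, v_invariant, k)[2]))  # near roundoff: r_k vanishes
```

The second residual vanishes (to roundoff) because the Krylov space generated from $v_1$ is then a $k$-dimensional invariant subspace.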

The need for restart was recognized early on by Karush [258] soon after the appearance of the original algorithm of Lanczos  [285]. Subsequently, there were developments by Paige [347], Cullum and Donath [89], and Golub and Underwood [197]. All of these schemes are explicit in the sense that a new starting vector is produced by some process and an entirely new Lanczos factorization is constructed.

In this section we will describe implicit restart, a technique that combines the implicitly shifted QR scheme with a $k$-step Lanczos factorization to obtain a truncated form of the implicitly shifted QR iteration. The numerical difficulties and storage problems normally associated with the Lanczos process are avoided. The algorithm is capable of computing a few ($k$) eigenvalues with user-specified features such as the algebraically largest or smallest eigenvalues or those of largest magnitude, using storage for only a moderate multiple of $k$ vectors. The computed eigenvectors form a basis for the desired $k$-dimensional eigenspace and are numerically orthogonal to working precision.

Implicit restart provides a means to extract interesting information from large Krylov subspaces while avoiding the storage and numerical difficulties associated with the standard approach. It does this by continually compressing the interesting information into a fixed-size $k$-dimensional subspace. This is accomplished through the implicitly shifted QR mechanism. A Lanczos factorization of length $m = k+p$,

$A V_m = V_m T_m + r_m e_m^{\ast},$

is compressed to a factorization of length $k$ that retains the eigeninformation of interest. This is accomplished by applying $p$ implicitly shifted QR steps. The first stage of this shift process results in

$A V_m^+ = V_m^+ T_m^+ + r_m e_m^{\ast} Q, \qquad (4.21)$

where $ V_m^+ = V_m Q$, $T_m^+ = Q^{\ast} T_m Q$, and $Q = Q_1 Q_2 \cdots Q_p.$ Each $Q_j$ is the orthogonal matrix associated with the shift $\mu_j$ used during the shifted QR algorithm. Because of the Hessenberg structure of the matrices $Q_j$, it turns out that the first $k-1$ entries of the vector $e_m^{\ast} Q$ are zero. This implies that the leading $k$ columns in equation (4.21) remain in a Lanczos relation. Equating the first $k$ columns on both sides of (4.21) provides an updated $k$-step Lanczos factorization

$A V_k^+ = V_k^+ T_k^+ + r_k^+ e_k^{\ast},$

with an updated residual of the form $ r_k^+ = V_m^+ e_{k+1} \beta_k + r_m Q(m,k)$, where $\beta_k = T_m^+(k+1,k)$ and $Q(m,k)$ denotes the $(m,k)$ entry of $Q$. Using this as a starting point, it is possible to apply $p$ additional steps of the Lanczos process to return to the original $m$-step form.
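The compression described above can be illustrated numerically. In the Python/NumPy sketch below (an illustration, not the template's implementation: explicit QR factorizations stand in for the implicit bulge chase, to which they are equivalent in exact arithmetic, and all variable names are ad hoc), an $m$-step factorization is built, $p$ shifted QR steps are applied with exact shifts, and the leading $k$ columns are verified to satisfy a Lanczos relation again; the first $k-1$ entries of $e_m^{\ast} Q$ are also checked to vanish:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, p = 60, 3, 5
m = k + p
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric test matrix

# Build an m-step Lanczos factorization A V = V T + r e_m^* (full reorth.).
V = np.zeros((n, m)); T = np.zeros((m, m))
r = rng.standard_normal(n)
for j in range(m):
    b = np.linalg.norm(r)
    V[:, j] = r / b
    if j > 0:
        T[j, j - 1] = T[j - 1, j] = b
    w = A @ V[:, j]
    T[j, j] = V[:, j] @ w
    r = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
    r -= V[:, :j + 1] @ (V[:, :j + 1].T @ r)

# Compress to length k with p shifted QR steps (explicit QR here; the
# implicit bulge chase is equivalent in exact arithmetic).
q = np.zeros(m); q[-1] = 1.0                 # accumulates e_m^* Q
for mu in np.linalg.eigvalsh(T)[:p]:         # exact shifts: unwanted Ritz values
    Q, _ = np.linalg.qr(T - mu * np.eye(m))
    T = Q.T @ T @ Q; V = V @ Q; q = q @ Q

print(np.linalg.norm(q[:k - 1]))             # first k-1 entries of e_m^* Q: ~0

r_plus = V[:, k] * T[k, k - 1] + r * q[k - 1]   # r_k^+ = V_m^+ e_{k+1} beta_k + r_m Q(m,k)
Vk, Tk = V[:, :k], T[:k, :k]
ek = np.zeros(k); ek[-1] = 1.0
print(np.linalg.norm(A @ Vk - Vk @ Tk - np.outer(r_plus, ek)))  # near roundoff
```

Both printed norms are at roundoff level, confirming the structural claims made above for this example.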

A template for the implicitly restarted Lanczos method (IRLM) is given in Algorithm 4.7.

Algorithm 4.7: Implicitly Restarted Lanczos Method for HEP

{\rm (1)}\> Compute an $m$-step Lanczos factorization $A V_m = V_m T_m + r_m e_m^{\ast}$
{\rm (2)}\> {\bf repeat} until convergence
{\rm (3)}\> \> Compute the eigensystem of $T_m$ and select the $p$ shifts $\mu_1, \mu_2, \ldots, \mu_p$
{\rm (4)}\> \> $q^{\ast} = e_m^{\ast}$
{\rm (5)}\> \> {\bf for} $j = 1, 2, \ldots, p$
{\rm (6)}\> \> \> Factor $[Q_j, R_j] = {\rm qr}(T_m - \mu_j I)$
{\rm (7)}\> \> \> $T_m = Q_j^{\ast} T_m Q_j$; \quad $V_m = V_m Q_j$
{\rm (8)}\> \> \> $q^{\ast} = q^{\ast} Q_j$
{\rm (9)}\> \> {\bf end for}
{\rm (10)}\> \> $r_k = v_{k+1} \beta_k + r_m \sigma_k$; \quad $V_k = V_m(1{:}n,1{:}k)$; \quad $T_k = T_m(1{:}k,1{:}k)$
{\rm (11)}\> \> Beginning with the $k$-step Lanczos factorization $A V_k = V_k T_k + r_k e_k^{\ast}$,
{\rm (12)}\> \> apply $p$ additional steps of the Lanczos process to obtain a new $m$-step Lanczos factorization $A V_m = V_m T_m + r_m e_m^{\ast}$
{\rm (13)}\> \>{\bf end repeat}
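A compact Python/NumPy sketch of the whole iteration may help clarify the template. Everything here is illustrative rather than prescribed by Algorithm 4.7: full reorthogonalization, explicit QR steps instead of the bulge chase, exact shifts at the $p$ unwanted Ritz values, a crude stopping rule on $\|r_k\|$, and all function names are choices of this sketch:

```python
import numpy as np

def extend(A, V, T, r, m):
    """Extend a j-step Lanczos factorization A V = V T + r e_j^* to m steps
    (j = 0 builds one from scratch).  Full reorthogonalization for clarity;
    breakdown (r = 0) is not handled in this sketch."""
    n, j = V.shape
    V = np.hstack([V, np.zeros((n, m - j))])
    Tm = np.zeros((m, m)); Tm[:j, :j] = T
    for i in range(j, m):
        b = np.linalg.norm(r)
        V[:, i] = r / b
        if i > 0:
            Tm[i, i - 1] = Tm[i - 1, i] = b
        w = A @ V[:, i]
        Tm[i, i] = V[:, i] @ w
        r = w - V[:, :i + 1] @ (V[:, :i + 1].T @ w)  # orthogonalize against V
        r -= V[:, :i + 1] @ (V[:, :i + 1].T @ r)     # one extra pass
    return V, Tm, r

def irlm(A, v1, k, p, tol=1e-9, maxiter=300):
    """IRLM sketch for the k algebraically largest eigenvalues of symmetric A.
    Exact shifts at the p smallest Ritz values; explicit shifted QR stands in
    for the bulge chase; ||r_k|| < tol is a deliberately crude stopping rule."""
    m = k + p
    V, T, r = extend(A, np.zeros((A.shape[0], 0)), np.zeros((0, 0)), v1, m)
    for _ in range(maxiter):
        shifts = np.linalg.eigvalsh(T)[:p]          # unwanted Ritz values
        q = np.zeros(m); q[-1] = 1.0                # accumulates e_m^* Q
        for mu in shifts:
            Q, _ = np.linalg.qr(T - mu * np.eye(m))
            T = Q.T @ T @ Q; V = V @ Q; q = q @ Q
        r = V[:, k] * T[k, k - 1] + r * q[k - 1]    # keep BOTH terms (see text)
        V, T = V[:, :k], T[:k, :k]
        if np.linalg.norm(r) < tol:
            break
        V, T, r = extend(A, V, T, r, m)
    return np.linalg.eigvalsh(T)[-k:]               # wanted Ritz values, ascending

rng = np.random.default_rng(3)
n, k, p = 100, 4, 8
A = rng.standard_normal((n, n)); A = (A + A.T) / 2  # symmetric test matrix
ritz = irlm(A, rng.standard_normal(n), k, p)
print(np.max(np.abs(ritz - np.linalg.eigh(A)[0][-k:])))  # small if converged
```

Note that only $m = k + p$ basis vectors are ever stored, in line with the storage claim made earlier in this section.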

We will now describe some implementation details, referring to the respective phases in Algorithm 4.7.

Ideally, for eigenvalue calculations, one should attempt to construct a starting vector $v$ that is dominant in eigenvector directions of interest. If there is a sequence of closely related eigenvalue problems, the solution of the previous problem may provide a starting vector for the next. In the absence of any other considerations, a random vector is a reasonable choice.

Algorithm 4.8 below may be used to compute this initial $(m = k+p)$-step Lanczos factorization.

The shifts $\mu_j$ are selected with respect to the user's ``wanted set'' specification and the current and perhaps past information about the spectrum of $T_m$; see §4.5.2 below.

Apply $p$ steps of the shifted QR iteration to $T_m$ using $\mu_1, \mu_2, \ldots, \mu_p$ as shifts. This should be done using the implicitly shifted QR variant (i.e., the ``bulge chase'' mechanism). If exact shifts are used, then $\beta_{k} = T_m(k+1,k)$ should be zero on completion of the $p$ steps of QR, and the leading submatrix $T_m(1:k,1:k)$ should have $\theta_1, \theta_2, \ldots, \theta_k$ as its eigenvalues.
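The deflation effect of exact shifts can be observed directly on a tridiagonal matrix. In this small Python/NumPy experiment (the random $T_m$ and all names are ad hoc, and explicit QR factorizations are used in place of the bulge chase), $p$ shifted QR steps with exact shifts drive $T_m(k+1,k)$ to roundoff level and leave the wanted eigenvalues in the leading block:

```python
import numpy as np

rng = np.random.default_rng(2)
m, p = 10, 4
k = m - p
# A random unreduced symmetric tridiagonal T_m.
a = rng.standard_normal(m)
b = np.abs(rng.standard_normal(m - 1)) + 0.1
T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
theta = np.linalg.eigvalsh(T)                     # eigenvalues, ascending

# p steps of shifted QR with exact shifts (the p smallest eigenvalues).
for mu in theta[:p]:
    Q, _ = np.linalg.qr(T - mu * np.eye(m))
    T = Q.T @ T @ Q

print(abs(T[k, k - 1]))                           # beta_k = T(k+1,k): ~roundoff
print(np.max(np.abs(np.linalg.eigvalsh(T[:k, :k]) - theta[p:])))  # wanted survive
```

Each exact-shift QR step deflates one unwanted eigenvalue into the trailing block, so after $p$ steps the leading $k \times k$ block carries the remaining (wanted) eigenvalues.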

When exact shifts are used, the properties mentioned in steps (6)-(9) are usually approximately true in finite precision. However, they might not be achieved due to rounding errors. Therefore, it is important to include both terms in the update of the residual vector $r_k \leftarrow v_{k+1}\beta_{k} + r_m \sigma_k$. Note that, with exact shifts, the updated $ V_k $ and $T_k$ will provide a new $k$-step Lanczos factorization with Ritz values and vectors that are the best approximations to the user specification that have been produced so far.

This step requires a slight modification of the Lanczos process discussed previously. It begins with step $k$ of the usual scheme and assumes that all $k$ Lanczos vectors have been retained in a matrix $ V_k $. The details are given in Algorithm 4.8 below.
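A sketch of this modified process follows (Python/NumPy; the function name and the full-reorthogonalization choice are this sketch's own, with $j = 0$ recovering the ordinary start-up). Continuing a $k$-step factorization to $m$ steps produces the same factorization as a single straight $m$-step run, since both perform identical operations:

```python
import numpy as np

def lanczos_from(A, V, T, r, m):
    """Continue a j-step Lanczos factorization A V = V T + r e_j^* to m steps,
    orthogonalizing each new vector against ALL retained columns of V
    (j = 0 gives the ordinary start-up).  Illustrative sketch only."""
    n, j = V.shape
    V = np.hstack([V, np.zeros((n, m - j))])
    Tm = np.zeros((m, m)); Tm[:j, :j] = T
    for i in range(j, m):
        b = np.linalg.norm(r)
        V[:, i] = r / b
        if i > 0:
            Tm[i, i - 1] = Tm[i - 1, i] = b
        w = A @ V[:, i]
        Tm[i, i] = V[:, i] @ w
        r = w - V[:, :i + 1] @ (V[:, :i + 1].T @ w)
        r -= V[:, :i + 1] @ (V[:, :i + 1].T @ r)   # reorthogonalize
    return V, Tm, r

rng = np.random.default_rng(4)
n, k, m = 40, 3, 7
A = rng.standard_normal((n, n)); A = (A + A.T) / 2  # symmetric test matrix
v1 = rng.standard_normal(n)

# A straight m-step run ...
V1, T1, r1 = lanczos_from(A, np.zeros((n, 0)), np.zeros((0, 0)), v1, m)
# ... agrees with a k-step run that is then continued to m steps.
Vk, Tk, rk = lanczos_from(A, np.zeros((n, 0)), np.zeros((0, 0)), v1, k)
V2, T2, r2 = lanczos_from(A, Vk, Tk, rk, m)

print(np.linalg.norm(V1 - V2), np.linalg.norm(T1 - T2))  # both zero
```

After an implicit restart the restart vectors $V_k$ are not, of course, those of a straight run; the point of the sketch is only that the extension step itself is the ordinary Lanczos recurrence begun at step $k$.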

Susan Blackford 2000-11-20