where $V_k \in \mathbf{C}^{n \times k}$ has orthonormal columns, $V_k^* f_k = 0$, and $H_k \in \mathbf{C}^{k \times k}$ is upper Hessenberg with nonnegative subdiagonal elements. We shall call this a *$k$-step Arnoldi factorization* of $A$.

A template for computing a $k$-step Arnoldi factorization is given in Algorithm 7.6.

We will now describe some implementation details, referring to the respective phases in Algorithm 7.6.

**(1)**- Choose the initial starting vector $v_1$ and normalize it. Ideally, for eigenvalue
calculations, one should attempt to construct a $v_1$ that is dominant
in the eigenvector directions of interest. In the absence of any other
considerations, a random vector is a reasonable choice.
**(2)**- Initialize the Arnoldi procedure. In the subsequent step, $f/\|f\|$ will
become the new Arnoldi basis vector. This step should include
a reorthogonalization just as all of the others do.
**(7)**- Normalize to get the new basis vector $v_{j+1} = f_j/\beta_j$, where
$\beta_j = \|f_j\|$; partially
update the factorization to get the
matrices $V_{j+1} = (V_j, v_{j+1})$ and $\hat{H}_j = \begin{pmatrix} H_j \\ \beta_j e_j^T \end{pmatrix}$;
and compute the new information
$w = A v_{j+1}$.
The norm calculation $\beta_j = \|f_j\|$
need not be recomputed
here. It has already been computed at step (12) and can be saved for reuse
here.
**(10)**- This is the classical Gram-Schmidt step, $h = V_{j+1}^* w$ and
$f_{j+1} = w - V_{j+1} h$, which orthogonalizes $w$ with
respect to the columns of $V_{j+1}$.
This formulation allows
Level 2 BLAS (matrix-vector) operations. A more detailed discussion
is given below.
**(12)**- The sequence of statements in this if-clause assures that the
new direction $f_{j+1}$ is numerically orthogonal to the previously
computed directions, i.e., to the columns of $V_{j+1}$. The parameter $\eta$
must be specified ($0 < \eta < 1$).
The test $\|f_{j+1}\| < \eta \|h\|$ asks the question, ``Is $w$ nearly in
the space $\mathcal{R}(V_{j+1})$ that already has been constructed?''
The ratio
$\|f_{j+1}\|/\|h\| = \tan\theta$,
where $\theta$ is the angle that the vector $w$ makes with $\mathcal{R}(V_{j+1})$
(i.e., the angle between $w$ and its projection $V_{j+1} h$).
A larger value of $\eta$ is a more stringent orthogonality requirement.
With $\eta = 0$ the algorithm reverts to classical Gram-Schmidt (CGS).
An additional detail is needed to completely specify this process.
If, after updating $f_{j+1} \leftarrow f_{j+1} - V_{j+1} s$ and
$h \leftarrow h + s$ (where $s = V_{j+1}^* f_{j+1}$ is the correction), the
relation $\|f_{j+1}\| < \eta \|h\|$ still holds,
then numerically we have
$w \in \mathcal{R}(V_{j+1})$. In this case,
the Arnoldi procedure should either be terminated or a randomly generated
vector should be orthogonalized with respect to the columns of $V_{j+1}$
to replace $f_{j+1}$, and $\beta_{j+1}$ should be set to zero
to continue the factorization. The value of $\eta$ used in
the ARPACK implementation is $\eta = 1/\sqrt{2}$
(see §7.6.9).
Additional discussion of this orthogonalization device is given below.
**(17)**- $H_{j+1}$ is now a $(j+1) \times (j+1)$ upper Hessenberg matrix.
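The annotated steps above can be sketched in NumPy as follows. This is a minimal illustration in real arithmetic with dense matrices, not ARPACK's actual interface; the function name `arnoldi` and the step numbers in the comments refer to the annotations above.

```python
import numpy as np

def arnoldi(A, v, k, eta=1.0 / np.sqrt(2.0)):
    """k-step Arnoldi factorization  A V = V H + f e_k^T.

    Returns V (n x k, orthonormal columns), H (k x k, upper Hessenberg),
    and the residual vector f with V.T @ f ~= 0.
    """
    n = A.shape[0]
    V = np.zeros((n, k))
    H = np.zeros((k, k))
    v = v / np.linalg.norm(v)              # step (1): normalize the start vector
    V[:, 0] = v
    w = A @ v                              # step (2): initialize
    alpha = v @ w
    f = w - alpha * v
    H[0, 0] = alpha
    s = V[:, :1].T @ f                     # reorthogonalize the first step too
    f -= V[:, :1] @ s
    H[0, 0] += s[0]
    for j in range(k - 1):
        beta = np.linalg.norm(f)
        if beta == 0.0:                    # invariant subspace found
            break
        v = f / beta                       # step (7): new basis vector
        V[:, j + 1] = v
        H[j + 1, j] = beta                 # subdiagonal entry of H
        w = A @ v
        h = V[:, :j + 2].T @ w             # step (10): classical Gram-Schmidt
        f = w - V[:, :j + 2] @ h
        if np.linalg.norm(f) < eta * np.linalg.norm(h):   # step (12): DGKS test
            s = V[:, :j + 2].T @ f         # one refinement pass
            f -= V[:, :j + 2] @ s
            h += s
        H[:j + 2, j + 1] = h               # new column of H
    return V, H, f
```

The returned quantities satisfy $AV_k = V_k H_k + f_k e_k^T$ up to rounding, which is easy to verify numerically.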

The mechanism used to orthogonalize the new information $w = Av_{j+1}$ against the existing basis $V_{j+1}$ is the CGS. It is notoriously unstable and will fail miserably in this setting without modification. One remedy is to use the modified Gram-Schmidt (MGS) process as used in Algorithm 7.3. Unfortunately, MGS will also fail to produce orthogonal vectors in the restarting situation we are about to discuss, and it cannot be expressed with Level 2 BLAS in this setting. At step (12), we have included an iterative refinement technique proposed by Daniel, Gragg, Kaufman, and Stewart (DGKS) in [96]. This scheme provides an excellent way to construct a vector $f_{j+1}$ that is numerically orthogonal to the columns of $V_{j+1}$. The simple test specified at step (12) is used to avoid this DGKS correction if it is not needed.
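The instability of unmodified CGS, and the effect of a single DGKS refinement pass, can be seen in a small experiment. The test matrix below, with nearly parallel columns, is contrived for illustration and is not from the text.

```python
import numpy as np

def cgs(X, refine=False):
    """QR by classical Gram-Schmidt, optionally with one DGKS refinement pass."""
    n, m = X.shape
    Q = np.zeros((n, m))
    for j in range(m):
        w = X[:, j]
        h = Q[:, :j].T @ w              # CGS projection coefficients
        f = w - Q[:, :j] @ h
        if refine:                      # DGKS: one extra orthogonalization pass
            s = Q[:, :j].T @ f
            f = f - Q[:, :j] @ s
        Q[:, j] = f / np.linalg.norm(f)
    return Q

rng = np.random.default_rng(1)
# nearly parallel columns: a random perturbation of a large rank-one matrix
X = rng.standard_normal((200, 20)) + 1e8 * np.outer(rng.standard_normal(200), np.ones(20))
Q_cgs = cgs(X)
Q_dgks = cgs(X, refine=True)
err_cgs = np.linalg.norm(Q_cgs.T @ Q_cgs - np.eye(20))
err_dgks = np.linalg.norm(Q_dgks.T @ Q_dgks - np.eye(20))
print(err_cgs, err_dgks)  # CGS loses orthogonality badly; the refined version does not
```

The heavy cancellation in forming $f$ makes the single-pass CGS basis visibly non-orthogonal, while the refined basis is orthogonal to near working precision.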

This mechanism maintains orthogonality to full working precision at very reasonable cost. The special situation imposed by the restarting scheme we are about to discuss makes this modification essential for obtaining accurate eigenvalues and numerically orthogonal Schur vectors (eigenvectors in the Hermitian case). Schemes based on MGS are often sufficient for solving linear systems because they do construct a basis set that is linearly independent even though it might not be numerically orthogonal. The quality of MGS orthogonality is dependent upon the condition number (linear independence) of the original set of vectors. The implicit restarting mechanism we are about to present will be less effective and may even fail if numerical orthogonality is not maintained.
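The dependence of MGS orthogonality on the conditioning of the input vectors can likewise be checked numerically. The construction of test matrices with a prescribed condition number is an assumption of this sketch, not part of the text.

```python
import numpy as np

def mgs(X):
    """QR by modified Gram-Schmidt (right-looking)."""
    Q = X.astype(float).copy()
    m = Q.shape[1]
    for j in range(m):
        Q[:, j] /= np.linalg.norm(Q[:, j])
        for i in range(j + 1, m):       # remove the v_j component from later columns
            Q[:, i] -= (Q[:, j] @ Q[:, i]) * Q[:, j]
    return Q

def matrix_with_condition(n, m, kappa, rng):
    """Random n x m matrix with prescribed 2-norm condition number kappa."""
    U, _ = np.linalg.qr(rng.standard_normal((n, m)))
    W, _ = np.linalg.qr(rng.standard_normal((m, m)))
    s = np.logspace(0, -np.log10(kappa), m)   # singular values from 1 down to 1/kappa
    return U @ np.diag(s) @ W.T

rng = np.random.default_rng(2)
errs = {}
for kappa in (1e2, 1e6, 1e10):
    Q = mgs(matrix_with_condition(100, 20, kappa, rng))
    errs[kappa] = np.linalg.norm(Q.T @ Q - np.eye(20))
    print(kappa, errs[kappa])  # orthogonality error grows with the condition number
```

The observed loss of orthogonality scales roughly like machine precision times the condition number, in line with the remark above that MGS quality depends on the linear independence of the original vectors.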

It has been well documented that failure to maintain orthogonality leads to several numerical difficulties within the Lanczos or Arnoldi procedure. In the Hermitian case, Paige [347] showed that the loss of orthogonality occurs precisely when an eigenvalue of $H_j$ is close to an eigenvalue of $A$. In fact, the Arnoldi vectors lose orthogonality in the direction of the associated approximate eigenvector. Moreover, failure to maintain orthogonality results in spurious copies of the approximate eigenvalue produced by the Arnoldi method. In the Hermitian case, several remedies short of full reorthogonalization have been proposed. These are discussed in §4.4.
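Paige's observation is easy to reproduce: run the plain three-term Lanczos recurrence with no reorthogonalization on a diagonal matrix with one well-separated eigenvalue, and watch orthogonality collapse and duplicate Ritz values appear. The matrix and iteration count below are illustrative choices, not from the text.

```python
import numpy as np

def lanczos_plain(Adiag, m, rng):
    """Three-term Lanczos on A = diag(Adiag) with NO reorthogonalization."""
    n = len(Adiag)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    v = rng.standard_normal(n)
    V[:, 0] = v / np.linalg.norm(v)
    vprev = np.zeros(n)
    b = 0.0
    for j in range(m):
        w = Adiag * V[:, j] - b * vprev       # w = A v_j - beta_{j-1} v_{j-1}
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        b = np.linalg.norm(w)
        beta[j] = b
        vprev = V[:, j]
        if j + 1 < m:
            V[:, j + 1] = w / b
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    return V, T

rng = np.random.default_rng(3)
d = np.concatenate([np.linspace(1.0, 10.0, 99), [1000.0]])  # one well-separated eigenvalue
V, T = lanczos_plain(d, 40, rng)
orth_loss = np.linalg.norm(V.T @ V - np.eye(40))
ritz = np.linalg.eigvalsh(T)
copies = int(np.sum(np.abs(ritz - 1000.0) < 1e-6))
print(orth_loss, copies)  # orthogonality is lost; 1000.0 reappears among the Ritz values
```

The loss sets in right after the Ritz value for the dominant eigenvalue converges, and the converged eigenvalue is then produced again as a spurious copy, exactly the failure mode described above.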