We include a numerical example for testing purposes, so that potential users of the Jacobi-Davidson algorithms can verify and compare their results.
The symmetric matrix $A$ is of dimension $n = 1000$. The diagonal entries
are $a_{i,i} = i$, the codiagonal entries are
$a_{i,i+1} = a_{i+1,i} = 0.5$, and furthermore,
$a_{1,1000} = a_{1000,1} = 0.5$.
All other entries are zero. This example has been
taken from [88] and is discussed, in the context of the
Jacobi-Davidson algorithm, in [411, p. 410].
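For readers who wish to reproduce the experiment, the test matrix can be assembled as in the following minimal Python/SciPy sketch (not part of the original text); it assumes a symmetric matrix of order 1000 with diagonal entries $a_{i,i}=i$, codiagonal entries $0.5$, and corner entries $a_{1,1000}=a_{1000,1}=0.5$, which should be checked against the cited sources:

```python
import numpy as np
from scipy.sparse import diags

# Assumed test matrix: order n = 1000, diagonal a_ii = i,
# codiagonal entries 0.5, and corner entries a_{1,n} = a_{n,1} = 0.5.
n = 1000
A = diags(
    [0.5 * np.ones(n - 1), np.arange(1.0, n + 1), 0.5 * np.ones(n - 1)],
    offsets=[-1, 0, 1],
).tolil()
A[0, n - 1] = A[n - 1, 0] = 0.5
A = A.tocsr()

# Sanity checks: the matrix is symmetric and has the stated pattern.
assert abs(A - A.T).max() == 0.0
assert A[499, 499] == 500.0 and A[0, 999] == 0.5
```

The sparse (tridiagonal plus corners) representation keeps matrix-vector products cheap, which is all the Jacobi-Davidson iteration needs.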
We use Algorithm 4.17 for the computation of the
largest eigenvalues. The input parameters are the starting vector
$v_0$, the tolerance $\epsilon$, the subspace dimension parameters
$m_{\min}$ and $m_{\max}$, and the target value $\tau$.
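As an independent check (not Algorithm 4.17), reference values for the wanted exterior eigenvalues can be obtained with ARPACK through SciPy's `eigsh`; a sketch, again assuming the matrix entries above and a hypothetical choice of ten wanted eigenvalues:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Assumed test matrix (order 1000, diagonal i, codiagonals 0.5, corners 0.5).
n = 1000
A = diags(
    [0.5 * np.ones(n - 1), np.arange(1.0, n + 1), 0.5 * np.ones(n - 1)],
    offsets=[-1, 0, 1],
).tolil()
A[0, n - 1] = A[n - 1, 0] = 0.5
A = A.tocsr()

# Reference values via ARPACK, independent of any Jacobi-Davidson code.
k = 10  # number of wanted eigenvalues; hypothetical choice
vals, vecs = eigsh(A, k=k, which="LA")  # "LA": largest algebraic

# Each pair should satisfy ||A x - lambda x|| ~ machine precision * ||A||.
for lam, x in zip(vals, vecs.T):
    assert np.linalg.norm(A @ x - lam * x) < 1e-6
```

Comparing a Jacobi-Davidson implementation against such reference pairs is a convenient way to verify results, in the spirit of this test example.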
We show graphically the norm of the residual vector as a function of
the iteration number in Figure 4.5. Every time the norm
is less than the tolerance $\epsilon$, we have determined an eigenvalue within
this precision, and the iteration is continued with deflation for the
next eigenvalue. The four pictures represent, lexicographically, the
following different situations:
In Figure 4.6, we give the convergence history for interior
eigenvalues, as obtained with Algorithm 4.17 (top parts) and with
Algorithm 4.19 (bottom parts), with the following input
specifications: the number of wanted eigenvalues $k_{\max}$, the starting
vector $v_0$, the tolerance $\epsilon$, the subspace dimension parameters
$m_{\min}$ and $m_{\max}$, and a target value $\tau$ in the interior of
the spectrum.
Again, every time the curve gets below the tolerance $\epsilon$, this
indicates convergence of an approximated eigenvalue to within that
tolerance. For all figures, we used 5 steps of GMRES to solve the
correction equation in (32). For the left figures, we did not use
preconditioning. For the right figures, we preconditioned GMRES with
a preconditioner $K$, as in Algorithm 4.18.