What Methods Are Covered?

Many iterative methods have been developed, and it is impossible to cover them all. We chose the methods below either because they illustrate the historical development of iterative methods, or because they represent the current state-of-the-art for solving large sparse linear systems. The methods we discuss are:

  1. Jacobi
  2. Gauss-Seidel
  3. Successive Over-Relaxation (SOR)
  4. Symmetric Successive Over-Relaxation (SSOR)
  5. Conjugate Gradient (CG)
  6. Minimal Residual (MINRES) and Symmetric LQ (SYMMLQ)
  7. Conjugate Gradients on the Normal Equations (CGNE and CGNR)
  8. Generalized Minimal Residual (GMRES)
  9. Biconjugate Gradient (BiCG)
  10. Quasi-Minimal Residual (QMR)
  11. Conjugate Gradient Squared (CGS)
  12. Biconjugate Gradient Stabilized (Bi-CGSTAB)
  13. Chebyshev Iteration
For each method we present a general description, including a discussion of the history of the method and numerous references to the literature. We also give the mathematical conditions for selecting a given method.
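
To give a concrete flavor of what these methods look like in practice, here is a minimal sketch of the simplest one, the Jacobi method, written in Python with NumPy. This is an illustration only, not one of the book's templates: the function name, parameters, and stopping test are our own choices, and a production implementation would exploit sparsity rather than form dense matrices.

    import numpy as np

    def jacobi(A, b, x0=None, tol=1e-8, max_iter=500):
        # Split A into its diagonal D and off-diagonal remainder R,
        # then iterate x <- D^{-1} (b - R x) until the update is small.
        n = len(b)
        x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
        D = np.diag(A)            # diagonal entries of A (must be nonzero)
        R = A - np.diagflat(D)    # off-diagonal part of A
        for k in range(max_iter):
            x_new = (b - R @ x) / D
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new, k + 1
            x = x_new
        return x, max_iter

    # A small, strictly diagonally dominant system, for which
    # Jacobi iteration is guaranteed to converge.
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 4.0, 1.0],
                  [0.0, 1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    x, iterations = jacobi(A, b)
    print(x, iterations)

Even this toy example shows the pattern shared by all the methods listed above: each iteration costs roughly one matrix-vector product, and convergence depends on properties of the system (here, diagonal dominance).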

We do not intend to write a "cookbook", and have deliberately avoided the words "numerical recipes", because these phrases imply that our algorithms can be used blindly, without knowledge of the system of equations. The state of the art in iterative methods does not permit this: some knowledge about the linear system is needed to guarantee convergence of these algorithms, and generally the more that is known, the more the algorithm can be tuned. Thus, we have chosen to present an algorithmic outline, with guidelines for choosing a method and implementing it on particular kinds of high-performance machines. We also discuss the use of preconditioners and relevant data storage issues.


