
Conjugate Gradient Method (CG)

   

The Conjugate Gradient method is an effective method for symmetric positive definite systems. It is the oldest and best known of the nonstationary methods discussed here. The method proceeds by generating vector sequences of iterates (i.e., successive approximations to the solution), residuals corresponding to the iterates, and search directions used in updating the iterates and residuals. Although the length of these sequences can become large, only a small number of vectors need to be kept in memory. In every iteration of the method, two inner products are performed in order to compute update scalars that are defined to make the sequences satisfy certain orthogonality conditions. On a symmetric positive definite linear system these conditions imply that the distance to the true solution is minimized in the $A$-norm (the energy norm).

The iterates $x^{(i)}$ are updated in each iteration by a multiple $\alpha_i$ of the search direction vector $p^{(i)}$:
\[
x^{(i)} = x^{(i-1)} + \alpha_i p^{(i)} .
\]
Correspondingly the residuals $r^{(i)} = b - A x^{(i)}$ are updated as
\[
r^{(i)} = r^{(i-1)} - \alpha_i q^{(i)} \qquad \mbox{where} \qquad q^{(i)} = A p^{(i)} . \qquad (1)
\]
The choice $\alpha_i = r^{(i-1)T} r^{(i-1)} / p^{(i)T} A p^{(i)}$ minimizes $r^{(i)T} A^{-1} r^{(i)}$ over all possible choices for $\alpha$ in equation (1).
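
For readers who want to see why this choice of $\alpha_i$ is optimal, here is a short sketch of the standard argument (not part of the original text); it uses the orthogonality $r^{(i-1)T} p^{(i-1)} = 0$, which follows from the conditions discussed below. Substituting the residual update into $f(\alpha) = r^{(i)T} A^{-1} r^{(i)}$ gives a quadratic in $\alpha$:
\[
f(\alpha) = \bigl(r^{(i-1)} - \alpha A p^{(i)}\bigr)^T A^{-1} \bigl(r^{(i-1)} - \alpha A p^{(i)}\bigr)
          = r^{(i-1)T} A^{-1} r^{(i-1)} - 2\alpha\, r^{(i-1)T} p^{(i)} + \alpha^2\, p^{(i)T} A p^{(i)} .
\]
Setting $f'(\alpha) = 0$ yields $\alpha_i = r^{(i-1)T} p^{(i)} / p^{(i)T} A p^{(i)}$; substituting $p^{(i)} = r^{(i-1)} + \beta_{i-1} p^{(i-1)}$ together with $r^{(i-1)T} p^{(i-1)} = 0$ reduces the numerator to $r^{(i-1)T} r^{(i-1)}$.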

The search directions are updated using the residuals
\[
p^{(i)} = r^{(i-1)} + \beta_{i-1} p^{(i-1)} , \qquad (2)
\]
where the choice $\beta_i = r^{(i)T} r^{(i)} / r^{(i-1)T} r^{(i-1)}$ ensures that $p^{(i)}$ and $A p^{(i-1)}$ - or equivalently, $r^{(i)}$ and $r^{(i-1)}$ - are orthogonal. In fact, one can show that this choice of $\beta_i$ makes $p^{(i)}$ and $r^{(i)}$ orthogonal to all previous $A p^{(j)}$ and $r^{(j)}$, respectively.
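
As an illustration only (this is not the code of the figure referenced below), a minimal unpreconditioned CG loop that implements exactly these recurrences might look as follows in Python/NumPy, tested here on a small randomly generated symmetric positive definite system:

    import numpy as np

    def cg(A, b, x0=None, tol=1e-10, maxiter=None):
        """Minimal unpreconditioned Conjugate Gradient sketch (illustration only)."""
        n = len(b)
        x = np.zeros(n) if x0 is None else x0.copy()
        r = b - A @ x                 # r^(0) = b - A x^(0)
        p = r.copy()                  # p^(1) = r^(0)
        rho = r @ r                   # r^(i-1)^T r^(i-1)
        for _ in range(maxiter or 10 * n):
            q = A @ p                 # q^(i) = A p^(i)
            alpha = rho / (p @ q)     # alpha_i = r^T r / p^T A p
            x = x + alpha * p         # x^(i) = x^(i-1) + alpha_i p^(i)
            r = r - alpha * q         # r^(i) = r^(i-1) - alpha_i q^(i)
            rho_new = r @ r
            if np.sqrt(rho_new) < tol:
                break
            beta = rho_new / rho      # beta_i = r^(i)^T r^(i) / r^(i-1)^T r^(i-1)
            p = r + beta * p          # p^(i+1) = r^(i) + beta_i p^(i)
            rho = rho_new
        return x

    # Small test on a random SPD matrix (assumed example, not from the text).
    rng = np.random.default_rng(0)
    B = rng.standard_normal((50, 50))
    A = B @ B.T + 50 * np.eye(50)     # symmetric positive definite by construction
    b = rng.standard_normal(50)
    x = cg(A, b)
    print(np.linalg.norm(A @ x - b))  # residual norm, should be near the tolerance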

The pseudocode for the Preconditioned Conjugate Gradient Method is given in the figure below. It uses a preconditioner $M$; for $M = I$ one obtains the unpreconditioned version of the Conjugate Gradient Algorithm. In that case the algorithm may be further simplified by skipping the ``solve'' line and replacing $z^{(i-1)}$ by $r^{(i-1)}$ (and $z^{(0)}$ by $r^{(0)}$).

  
Figure: The Preconditioned Conjugate Gradient Method
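
Only the caption of that figure is reproduced here. As a rough sketch of the preconditioned iteration (assuming the preconditioner is supplied as a routine M_solve that applies $M^{-1}$ to a vector; this is a reading of the algorithm, not the book's own code), the loop could be written as follows. Passing the identity for M_solve recovers the unpreconditioned method above.

    import numpy as np

    def pcg(A, b, M_solve, x0=None, tol=1e-10, maxiter=None):
        """Sketch of the Preconditioned CG iteration; M_solve(r) returns M^{-1} r."""
        n = len(b)
        x = np.zeros(n) if x0 is None else x0.copy()
        r = b - A @ x
        z = M_solve(r)                 # the "solve M z = r" step
        p = z.copy()
        rho = r @ z                    # r^(i-1)^T z^(i-1)
        for _ in range(maxiter or 10 * n):
            q = A @ p
            alpha = rho / (p @ q)
            x = x + alpha * p
            r = r - alpha * q
            if np.linalg.norm(r) < tol:
                break
            z = M_solve(r)             # with M = I this is simply z = r
            rho_new = r @ z
            beta = rho_new / rho
            p = z + beta * p
            rho = rho_new
        return x

    # Example with a Jacobi (diagonal) preconditioner; M_solve = lambda r: r gives plain CG.
    rng = np.random.default_rng(1)
    B = rng.standard_normal((50, 50))
    A = B @ B.T + 50 * np.eye(50)
    b = rng.standard_normal(50)
    x = pcg(A, b, lambda r: r / np.diag(A))
    print(np.linalg.norm(A @ x - b))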




