We've now laid the groundwork for a meaningful definition of the gradient in the setting of a constraint manifold. At this point, one could run off and try to do a steepest descent search to minimize one's objective function. Trouble will arise, however, when one discovers that there is no sensible way to combine a point $Y$ and a displacement $\Delta$ to produce a new point, because, for finite $t$, the straight-line update $Y + t\Delta$ violates the constraint equations and thus does not give a point on the manifold.
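To make the difficulty concrete, here is a small numerical sketch (the setup and names are ours, not from the template software): a point $Y$ on the Stiefel manifold $Y^T Y = I$, a tangent direction $\Delta$, and the naive update $Y + t\Delta$, which fails the constraint for any $t \neq 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2

# A point on the Stiefel manifold: Y with Y^T Y = I.
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))

# A tangent vector at Y: Delta = Y A + (I - Y Y^T) B with A skew-symmetric.
A = rng.standard_normal((p, p))
A = A - A.T
B = rng.standard_normal((n, p))
Delta = Y @ A + (np.eye(n) - Y @ Y.T) @ B

# The naive update leaves the manifold: Z^T Z = I + t^2 Delta^T Delta != I.
t = 0.1
Z = Y + t * Delta
violation = np.linalg.norm(Z.T @ Z - np.eye(p))
print(violation)  # strictly positive for any t != 0
```

Note that the first-order terms in $t$ cancel exactly because $Y^T\Delta$ is skew-symmetric; the violation is purely second order, which is why small steps *almost* work.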

For any manifold, one updates $Y$ by solving a set of differential
*equations of motion*
of the form

$$\ddot Y + \Gamma(\dot Y, \dot Y) = 0.$$

The $\Gamma(\dot Y, \dot Y)$ term is crafted to ensure that $\dot Y(t)$ remains a tangent vector for all times $t$, thus keeping the path $Y(t)$ on the manifold. It is called the *Christoffel symbol*.

To see how these equations could be satisfied,
we take the infinitesimal constraint equation for $\dot Y$,

$$\dot Y^T Y + Y^T \dot Y = 0,$$

and differentiate it with respect to $t$, to get

$$\ddot Y^T Y + Y^T \ddot Y = -2\,\dot Y^T \dot Y.$$

This can be satisfied generally by $\ddot Y = -Y \dot Y^T \dot Y + T$, where $T$ is some arbitrary tangent vector.
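This can be checked numerically (a NumPy sketch with our own helper names): for any tangent $T$, the candidate $\ddot Y = -Y\dot Y^T \dot Y + T$ satisfies the differentiated constraint, since $Y^T Y = I$ and $T^T Y + Y^T T = 0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 5, 2
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))

def tangent(Y, rng):
    # Random tangent vector at Y: Y A + (I - Y Y^T) B, with A skew-symmetric.
    n, p = Y.shape
    A = rng.standard_normal((p, p))
    A = A - A.T
    return Y @ A + (np.eye(n) - Y @ Y.T) @ rng.standard_normal((n, p))

Yd = tangent(Y, rng)            # plays the role of \dot Y
T = tangent(Y, rng)             # arbitrary tangent vector
Ydd = -Y @ (Yd.T @ Yd) + T      # candidate \ddot Y

lhs = Ydd.T @ Y + Y.T @ Ydd
rhs = -2 * Yd.T @ Yd
print(np.linalg.norm(lhs - rhs))  # ~0 up to rounding
```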

In the next subsection, we will have reason to consider
$\Gamma(\Delta_1, \Delta_2)$ for $\Delta_1 \neq \Delta_2$.
For technical reasons, one usually requests that

$$\Gamma(\Delta_1, \Delta_2) = \Gamma(\Delta_2, \Delta_1)$$

(this is called the *torsion-free* condition) and that differentiation preserve inner products along paths,

$$\frac{d}{dt}\langle \Delta_1, \Delta_2 \rangle = \langle \dot\Delta_1 + \Gamma(\dot Y, \Delta_1),\, \Delta_2 \rangle + \langle \Delta_1,\, \dot\Delta_2 + \Gamma(\dot Y, \Delta_2) \rangle$$

(this is called *compatibility with the metric*). Together, these two conditions determine $\Gamma$ uniquely; the result is the Levi-Civita connection.
The connection for the Stiefel manifold using two different inner products
(the Euclidean and the canonical) can be found in the work by Edelman,
Arias, and Smith (see [151,155,416]). The function `connection`
computes $\Gamma(\Delta_1, \Delta_2)$ in the template software.
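For illustration only, here is a sketch of what such a routine might compute for the *Euclidean* inner product, where the Christoffel term works out to $\Gamma(\Delta_1,\Delta_2) = Y(\Delta_1^T\Delta_2 + \Delta_2^T\Delta_1)/2$. This formula is our reading of the embedded geometry, not the template software's `connection` function, and the canonical-metric formula differs (see the Edelman, Arias, and Smith references).

```python
import numpy as np

def connection_euclidean(Y, D1, D2):
    # Christoffel term for the Stiefel manifold with the embedded Euclidean
    # metric (illustrative sketch, not the template `connection` routine).
    return Y @ (D1.T @ D2 + D2.T @ D1) / 2.0

# Check the torsion-free (symmetry) property on random tangent vectors.
rng = np.random.default_rng(3)
n, p = 5, 2
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))

def tangent(Y, rng):
    n, p = Y.shape
    A = rng.standard_normal((p, p))
    A = A - A.T
    return Y @ A + (np.eye(n) - Y @ Y.T) @ rng.standard_normal((n, p))

D1, D2 = tangent(Y, rng), tangent(Y, rng)
gap = np.linalg.norm(connection_euclidean(Y, D1, D2)
                     - connection_euclidean(Y, D2, D1))
print(gap)  # 0 by construction
```

Setting $\Delta_1 = \Delta_2 = \dot Y$ recovers $\Gamma(\dot Y, \dot Y) = Y\dot Y^T\dot Y$, consistent with the particular solution $\ddot Y = -Y\dot Y^T\dot Y$ above.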

Usually the solution of the equations of motion on a manifold is
very difficult to carry out. For the Stiefel manifold,
analytic solutions exist and can be found in the aforementioned
literature, though we have found very little performance
degradation between moving along paths via the equations of motion and
simply performing some orthogonalizing factorization on $Y + t\Delta$,
as long as the displacements are small. The `move` function
supports multiple methods of geodesic motion, depending on the degree
of approximation desired.
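The small-displacement alternative mentioned above can be sketched as a QR-based step (a common choice of orthogonalizing factorization; the function name and sign convention here are ours, not the template `move` function):

```python
import numpy as np

def move_qr(Y, Delta, t):
    # Approximate geodesic step: take the naive step Y + t*Delta, then
    # restore orthonormality with a QR factorization.
    Q, R = np.linalg.qr(Y + t * Delta)
    # Fix column signs so the factor varies continuously with t.
    Q = Q * np.sign(np.diag(R))
    return Q

# Usage: step from a random Stiefel point along a random tangent direction.
rng = np.random.default_rng(2)
n, p = 6, 2
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))
A = rng.standard_normal((p, p))
A = A - A.T
Delta = Y @ A + (np.eye(n) - Y @ Y.T) @ rng.standard_normal((n, p))
Y1 = move_qr(Y, Delta, 0.1)   # back on the manifold: Y1^T Y1 = I
```

This agrees with the exact geodesic to first order in $t$, which is why the performance penalty is small when displacements are small.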