

Transfer Residual Error to Backward Error.

It can be proved that there are Hermitian matrices $E$ and $F$, e.g.,
\begin{displaymath}
(E,F) = \left[-r\wtd x^*-\wtd x r^*
+\left(\wtd\beta\wtd x^* A\wtd x-\wtd\alpha\wtd x^* B\wtd x\right)\wtd x\wtd x^*\right]\cdot
\left(\wtd\beta I, -\wtd\alpha I\right),
\end{displaymath} (104)

such that $(\wtd\alpha,\wtd\beta)$ and $\wtd x$ are an exact eigenvalue and its corresponding eigenvector of $\{A+E, B+F\}$. We are interested in such matrices $(E,F)$ with smallest possible norms. It turns out that the best possible $(E,F)=(E_2,F_2)$ for the spectral norm $\Vert\cdot\Vert _2$ and the best possible $(E,F)=(E_{F},F_{F})$ for Frobenius norm $\Vert\cdot\Vert _{F}$ satisfy
\begin{displaymath}
\Vert(E_2, F_2)\Vert _2={\Vert r\Vert _2}, \quad
\Vert(E_{F},F_{F})\Vert _{F}=\sqrt{2\Vert r\Vert _2^2
- (\wtd\beta\wtd x^* A\wtd x-\wtd\alpha\wtd x^* B\wtd x)^2}.
\end{displaymath} (105)

See [256,431,473]. In fact, $(E_{F},F_{F})$ is given explicitly by (5.40). Therefore, if $\Vert r\Vert _2$ is small, the computed $(\wtd\alpha,\wtd\beta)$ and $\wtd x$ form an exact eigenpair of nearby matrices. Error analysis of this kind is called backward error analysis, and the matrices $(E,F)$ are called backward errors.
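The construction (104) and the norm identity (105) can be checked numerically. Below is a minimal NumPy sketch (an illustration, not from the source): it assumes the normalization $\Vert\wtd x\Vert _2=1$ and a real pair $(\wtd\alpha,\wtd\beta)$ with $\wtd\alpha^2+\wtd\beta^2=1$, and the Rayleigh-quotient choice of $(\wtd\alpha,\wtd\beta)$ is just one plausible way to manufacture an approximate eigenpair.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# A Hermitian and B Hermitian positive definite (the definite pair setting)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = C @ C.conj().T + n * np.eye(n)

# An approximate eigenpair: a unit vector x and the Rayleigh-quotient pair
# (alpha, beta) ~ (x^* A x, x^* B x), normalized so alpha^2 + beta^2 = 1
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x /= np.linalg.norm(x)
a = (x.conj() @ A @ x).real
b = (x.conj() @ B @ x).real
alpha, beta = a / np.hypot(a, b), b / np.hypot(a, b)

# Residual r = beta*A*x - alpha*B*x and the scalar rho = x^* r
# (= beta x^* A x - alpha x^* B x, real in this Hermitian setting)
r = beta * (A @ x) - alpha * (B @ x)
rho = (x.conj() @ r).real

# Backward error (E, F) as in (104): with M = -r x^* - x r^* + rho x x^*,
# take E = beta*M and F = -alpha*M; both are Hermitian
M = (-np.outer(r, x.conj()) - np.outer(x, r.conj())
     + rho * np.outer(x, x.conj()))
E, F = beta * M, -alpha * M

# ((alpha, beta), x) is an exact eigenpair of {A+E, B+F}
res = np.linalg.norm(beta * (A + E) @ x - alpha * (B + F) @ x)
print(res)  # zero up to round-off

# The Frobenius norm of (E, F) matches (105): sqrt(2||r||^2 - rho^2)
nEF = np.sqrt(np.linalg.norm(E, 'fro')**2 + np.linalg.norm(F, 'fro')**2)
print(abs(nEF - np.sqrt(2 * np.linalg.norm(r)**2 - rho**2)))
```

The identity $\wtd\beta E\wtd x-\wtd\alpha F\wtd x=(\wtd\alpha^2+\wtd\beta^2)M\wtd x=-r$ is what makes the pair exact for the perturbed matrices, which the first printed residual confirms.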

We say that an algorithm delivering an approximate eigenpair $((\wtd\alpha,\wtd\beta),\wtd x)$ is $\tau $-backward stable for the pair with respect to the norm $\Vert\cdot\Vert$ if the pair is an exact eigenpair of $\{A+E, B+F\}$ for some $(E,F)$ with $\Vert(E,F)\Vert\le\tau$. With this definition, precise statements can be made about the backward stability of the algorithm that computes $((\wtd\alpha,\wtd\beta),\wtd x)$. By convention, an algorithm is called backward stable if $\tau = O(\epsilon_M \Vert(A,B)\Vert)$.
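This stability criterion can be tested on a concrete solve. The sketch below (the solver route and the constant in the tolerance are assumptions, not from the source) reduces a real symmetric definite pair to a standard problem via a Cholesky factor of $B$, solves it with np.linalg.eigh, and by (105) takes $\tau=\Vert r\Vert _2=\Vert(E_2,F_2)\Vert _2$ to compare against $\epsilon_M\Vert(A,B)\Vert _2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8

# Real symmetric A and symmetric positive definite B (a sample definite pair)
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
C = rng.standard_normal((n, n)); B = C @ C.T + n * np.eye(n)

# Cholesky reduction: B = L L^T, (L^{-1} A L^{-T}) u = lambda u, x = L^{-T} u
L = np.linalg.cholesky(B)
At = np.linalg.solve(L, np.linalg.solve(L, A).T)  # L^{-1} A L^{-T}
w, U = np.linalg.eigh(At)
lam = w[0]
x = np.linalg.solve(L.T, U[:, 0])
x /= np.linalg.norm(x)                            # unit 2-norm, as in the text

# Pair form (alpha, beta) with alpha/beta = lam and alpha^2 + beta^2 = 1
s = np.hypot(lam, 1.0)
alpha, beta = lam / s, 1.0 / s

# tau = ||r||_2 = ||(E_2, F_2)||_2 by (105), versus eps * ||(A, B)||_2
tau = np.linalg.norm(beta * (A @ x) - alpha * (B @ x))
bound = np.finfo(float).eps * np.linalg.norm(np.hstack([A, B]), 2)
print(tau, bound)
```

For a well-conditioned $B$ such as this one, the printed $\tau$ is a modest multiple of $\epsilon_M\Vert(A,B)\Vert _2$, i.e., the computed pair is backward stable in the sense above.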


Susan Blackford 2000-11-20