analytical eigenvalue problem



Posted by Patrick van der Smagt on September 03, 1997 at 06:12:05:

I keep hoping that someone knows of a paper, book,
or other reference that may help me further here.

I have a matrix

Q_ij = sum_p ( x_i x_j + x_i y_j + y_i x_j + y_i y_j )

where each x and y also has an index p (not shown).
Furthermore, for all k, E{x_k} = E{y_k} = 0.
Therefore, Q consists of two cross-correlation
matrices, which are each other's transposes, added to
two covariance matrices. Equivalently, each term of the
sum over p is the rank-one outer product
(x_p + y_p)(x_p + y_p)^T, so Q is symmetric and
positive semidefinite (positive definite whenever the
vectors x_p + y_p span the whole space).
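
For concreteness, here is a minimal numerical sketch of
this construction (the dimensions and the random,
zero-mean data are assumptions, purely for
illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n, P = 5, 200                     # sizes chosen only for illustration
    X = rng.standard_normal((P, n))   # row p is the vector x_p
    Y = rng.standard_normal((P, n))   # row p is the vector y_p
    X -= X.mean(axis=0)               # enforce E{x_k} = 0
    Y -= Y.mean(axis=0)               # enforce E{y_k} = 0

    # Q_ij = sum_p ( x_i x_j + x_i y_j + y_i x_j + y_i y_j )
    #      = sum_p (x_p + y_p)(x_p + y_p)^T
    Q = X.T @ X + X.T @ Y + Y.T @ X + Y.T @ Y

    print(np.allclose(Q, Q.T))        # True: Q is symmetric
    print(np.linalg.eigvalsh(Q))      # ascending; all >= 0 up to round-off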

With the Courant-Fischer minimax theorem I can find
a lower bound for the eigenvalues of Q, namely:
assuming that |x_k| = 1, they are of order 1
(but what if |x_k| <= 1, or unbounded?).
But how do I find an upper bound?
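
As a cheap numerical sanity check (not the analytical
bound I am after), the trace and the Gershgorin row
sums both bound the largest eigenvalue of a symmetric
positive semidefinite matrix from above. Continuing
the sketch above (same Q):

    # Cheap numerical upper bounds on lambda_max(Q), reusing Q from above.
    #   lambda_max <= trace(Q)               (all eigenvalues are >= 0)
    #   lambda_max <= max_i sum_j |Q_ij|     (Gershgorin / infinity norm)
    lam_max = np.linalg.eigvalsh(Q)[-1]
    print("lambda_max      :", lam_max)
    print("trace bound     :", np.trace(Q))
    print("Gershgorin bound:", np.abs(Q).sum(axis=1).max())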

Basically, I want to know whether Q is better
conditioned through the addition of the xx, xy,
and yx terms.
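
One purely experimental way to look at this, again
with the X, Y, and Q from the sketch above, is to
compare the condition numbers directly:

    # Does adding the xx, xy, yx terms change the conditioning?
    # Only an experiment on the sample data above, not a bound.
    print("cond(sum_p y y^T):", np.linalg.cond(Y.T @ Y))
    print("cond(Q)          :", np.linalg.cond(Q))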

Second, related problem: if I take a matrix

Q'_ij = sum_p y_i y_j

where E{y_k} != 0 (summed over p), then this matrix
appears to be very badly conditioned (in my case).
Does the condition improve when I center the y's?
Does it improve when I add the xy, yx, and xx terms?
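
To see the effect of centering, here is a
self-contained sketch with synthetic y's that share a
large common mean m (an assumption chosen to make the
effect visible): the uncentered Q' is then dominated
by the rank-one term P m m^T, which in examples like
this makes it much worse conditioned than the centered
version.

    import numpy as np

    rng = np.random.default_rng(1)
    n, P = 5, 200
    Y = rng.standard_normal((P, n)) + 10.0   # y_p with a large nonzero mean

    Qp = Y.T @ Y                             # Q'_ij = sum_p y_i y_j, uncentered
    Yc = Y - Y.mean(axis=0)                  # center the y's
    Qp_centered = Yc.T @ Yc

    print("cond(Q'), raw     :", np.linalg.cond(Qp))
    print("cond(Q'), centered:", np.linalg.cond(Qp_centered))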

