Remarks: Let $A$ and $B$ be $n \times n$ matrices.
- We said that if $A$ and $B$ are similar, then they have the same eigenvalues. In particular, if $A$ is diagonalizable, then $A$ is similar to a diagonal matrix $D$. Thus, $A$ and $D$ have the same eigenvalues, which means the diagonal elements of $D$ are the eigenvalues of $A$. Now if $A$ is diagonalizable, then there exists a nonsingular matrix $P$ such that $P^{-1}AP = D$. In this case, we say $P$ diagonalizes $A$, or $A$ is diagonalizable via $P$. Now after explaining the relationship between $A$ and $D$ (they have the same eigenvalues), the questions that arise are:
- What is the relationship between $P$ and $A$?
- Are $P$ and $D$ unique?
The answer to the first is that the columns of $P$ are eigenvectors of $A$. The answer to the second is that $P$ is not unique and $D$ is not necessarily unique. In fact, if you multiply $P$ by any nonzero number, then the resulting matrix will also diagonalize $A$. Also, if you change the order of the columns of $P$, then the resulting matrix will diagonalize $A$. If you do so, $D$ also may change (it will have the same elements as the original, but the location of these elements may change), which means $D$ is not necessarily unique.
Remark: To diagonalize an $n \times n$ matrix $A$ that has $n$ linearly independent eigenvectors, find $n$ linearly independent eigenvectors $v_1, v_2, \ldots, v_n$ of $A$. Then form the matrix $P = [v_1 \; v_2 \; \cdots \; v_n]$. Now $P^{-1}AP = D$, where $D$ is a diagonal matrix whose diagonal elements are the eigenvalues of $A$ corresponding to the eigenvectors $v_1, v_2, \ldots, v_n$.
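The recipe above, and the non-uniqueness of $P$, can be checked numerically. This is a minimal sketch assuming numpy is available; the $3 \times 3$ matrix is a hypothetical example, and `np.linalg.eig` returns the eigenvalues together with a matrix whose columns are corresponding eigenvectors.

```python
import numpy as np

# Hypothetical symmetric matrix with distinct eigenvalues, so it is
# diagonalizable with n linearly independent eigenvectors.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors of A
D = np.linalg.inv(P) @ A @ P     # P^{-1} A P is (numerically) diagonal

# Off-diagonal entries of D vanish; the diagonal holds the eigenvalues.
off_diag = D - np.diag(np.diag(D))

# Non-uniqueness: scaling P by a nonzero number still diagonalizes A,
# and yields the same D.
P2 = 2.0 * P
D2 = np.linalg.inv(P2) @ A @ P2
```

Reordering the columns of `P` would likewise diagonalize `A`, but with the diagonal entries of `D` permuted accordingly.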
- Now recall that the eigenvalues of Hermitian matrices and their diagonal elements are real, and the eigenvalues of skew-Hermitian matrices and their diagonal elements have zero real parts. Also, recall that if $U$ is unitary, then $U^{-1} = U^H$ (note $U$ can be complex), $U$ is normal, $|\det(U)| = 1$, $\|Ux\|_2 = \|x\|_2$, and if $\lambda$ is an eigenvalue of $U$, then $|\lambda| = 1$. Moreover, any two distinct columns of $U$ are orthogonal (i.e. $u_i^H u_j = 0$ when $i \neq j$, and each one of them is a unit vector). Here are more facts about these matrices:
- (Schur's Theorem) If $A$ is an $n \times n$ matrix, then there exists a unitary matrix $U$ such that $U^H A U = T$, where $T$ is upper-triangular. Moreover, $A$ and $T$ have the same eigenvalues.
- From the above, note that we can write $A = U T U^H$ (proof: exercise). This is called the Schur decomposition of $A$, or the Schur normal form.
- From Schur's theorem: If $A$ is an $n \times n$ Hermitian or skew-Hermitian matrix, then there exists a unitary matrix $U$ such that $U^H A U = D$, where $D$ is diagonal. Thus, $A$ is diagonalizable, and we say in this case that $A$ is unitarily diagonalizable.
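Unitary diagonalization of a Hermitian matrix can be sketched with numpy's `eigh` routine (which is specific to Hermitian/symmetric eigenproblems); the $2 \times 2$ matrix below is a hypothetical example.

```python
import numpy as np

# A Hermitian matrix: equal to its conjugate transpose.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

eigvals, U = np.linalg.eigh(A)   # U is unitary; eigvals are real

UH = U.conj().T                  # U^H; also equals U^{-1} since U is unitary
D = UH @ A @ U                   # U^H A U is (numerically) diagonal
```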
Proof of the Hermitian Case: By Schur's Theorem, there exists a unitary matrix $U$ and an upper-triangular matrix $T$ such that $U^H A U = T$. Now take the Hermitian transpose of both sides to get $U^H A^H U = T^H$. Thus, since $A^H = A$, $U^H A U = T^H$. But $U^H A U = T$ also. Therefore, $T = T^H$, which implies $T$ is diagonal (an upper-triangular matrix equal to its Hermitian transpose must be diagonal). The skew-Hermitian case is similar.
- $T$ in Schur's decomposition is diagonal iff $A$ is normal. Moreover, when $A$ is normal, the columns of $U$ are eigenvectors of $A$. Thus, $A$ is unitarily diagonalizable iff $A$ is normal. Moreover, $A$ is normal iff $A$ has a complete orthonormal set of eigenvectors.
- From the previous part, if you take $A$ to be real symmetric, then there exists an orthogonal matrix $Q$ such that $Q^T A Q = D$, where $D$ is diagonal. Thus, $A$ is diagonalizable, and we say in this case that $A$ is orthogonally diagonalizable (proof: exercise). The eigenvalues of a real symmetric matrix are real, and its eigenvectors can be chosen real. Also, eigenvectors corresponding to different eigenvalues are orthogonal.
- If $A$ is Hermitian, then eigenvectors that correspond to different eigenvalues are orthogonal.
Proof: Let $(\lambda, x)$ and $(\mu, y)$ be two eigenpairs of $A$, where $\lambda \neq \mu$. We have to prove that $y^H x = 0$. Now consider
$$\lambda y^H x = y^H (A x) = (A^H y)^H x = (A y)^H x = (\mu y)^H x = \bar{\mu}\, y^H x = \mu\, y^H x,$$
where we used $A^H = A$ and the fact that $\mu$ is real. Therefore, $\lambda y^H x = \mu y^H x$, which implies $(\lambda - \mu) y^H x = 0$. Since $\lambda \neq \mu$, then $y^H x = 0$.
- A real matrix $A$ is orthogonally diagonalizable iff $A$ has $n$ orthonormal eigenvectors iff $A$ is real symmetric.
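The orthogonality claim above can be checked numerically. A minimal sketch on a hypothetical $2 \times 2$ Hermitian matrix with distinct eigenvalues:

```python
import numpy as np

# Hermitian matrix: equal to its conjugate transpose.
A = np.array([[2.0, 1j],
              [-1j, 2.0]])

eigvals, V = np.linalg.eigh(A)  # eigenvalues 1 and 3, distinct
x, y = V[:, 0], V[:, 1]         # eigenvectors for the two eigenvalues

inner = y.conj() @ x            # y^H x, should be ~0 by the theorem
```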
- (Cayley-Hamilton Theorem) Every matrix satisfies its characteristic
equation.
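For a $2 \times 2$ matrix the characteristic equation is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A) = 0$, so Cayley-Hamilton says $A^2 - \operatorname{tr}(A)A + \det(A)I = 0$. A quick numerical check on a hypothetical matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# A satisfies its own characteristic equation: the residual is the zero matrix.
residual = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
```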
Transforming Complex Hermitian Eigenvalue Problems to Real Ones

Let $C = A + iB$ be a complex Hermitian matrix, where $A$ and $B$ are real $n \times n$ matrices, and let $(\lambda, x + iy)$ be an eigenpair of $C$, where $x$ and $y$ are in $\mathbb{R}^n$ (recall that $\lambda$ is real because $C$ is Hermitian). Now, $C(x + iy) = \lambda(x + iy)$ if and only if
$$\begin{bmatrix} A & -B \\ B & A \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \lambda \begin{bmatrix} x \\ y \end{bmatrix}.$$
Note that since $C$ is Hermitian, $A$ is symmetric and $B$ is skew-symmetric. Hence, the block matrix above is real symmetric. Thus, we managed to reduce a complex Hermitian eigenvalue problem of order $n$ to a real symmetric eigenvalue problem of order $2n$.
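The reduction can be sketched numerically; $A$ and $B$ below are hypothetical. Each eigenvalue of $C$ shows up twice among the eigenvalues of the real symmetric block matrix, since both $(x, y)$ and $(-y, x)$ solve the real problem.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # symmetric
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # skew-symmetric
C = A + 1j * B                      # complex Hermitian, order n = 2

M = np.block([[A, -B],
              [B,  A]])             # real symmetric, order 2n = 4

eig_C = np.sort(np.linalg.eigvalsh(C))   # n real eigenvalues
eig_M = np.sort(np.linalg.eigvalsh(M))   # 2n real eigenvalues
```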
The Companion Matrix

Let $C$ be the $n \times n$ matrix such that $c_{i,i+1} = 1$ for $i = 1, \ldots, n-1$, $c_{n,j} = -a_{j-1}$ for $j = 1, \ldots, n$, and $c_{ij} = 0$ otherwise. By expanding the determinant of $C - \lambda I$ across the last row, you will find that the characteristic polynomial of $C$ is
$$p(\lambda) = (-1)^n \left( \lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_1 \lambda + a_0 \right).$$
The matrix $C$ is called the companion matrix of the polynomial $p$. The companion matrix is sometimes defined to be the transpose of the matrix above, and it satisfies the following: $C$ and $C^T$ have the same characteristic polynomial, their eigenvalues are exactly the roots of $p$, and, by the Cayley-Hamilton Theorem, $p(C) = 0$.
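A sketch of the construction for a hypothetical cubic $p(\lambda) = \lambda^3 - 6\lambda^2 + 11\lambda - 6$, whose roots are $1, 2, 3$: the eigenvalues of the companion matrix recover the roots of the polynomial.

```python
import numpy as np

coeffs = [-6.0, 11.0, -6.0]     # a_0, a_1, a_2 of the monic cubic
n = 3

C = np.zeros((n, n))
C[np.arange(n - 1), np.arange(1, n)] = 1.0   # ones on the superdiagonal
C[-1, :] = [-c for c in coeffs]              # last row: -a_0, -a_1, -a_2

eig = np.sort(np.linalg.eigvals(C))          # should be the roots 1, 2, 3
```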
Definition: The spectral radius of an $n \times n$ matrix $A$, denoted $\rho(A)$, is the maximum eigenvalue of $A$ in magnitude; i.e., if the eigenvalues of $A$ are $\lambda_1, \lambda_2, \ldots, \lambda_n$, then
$$\rho(A) = \max_{1 \le i \le n} |\lambda_i|.$$
Definition: Let $A$ be an $n \times n$ matrix and let $B$ be the $n \times n$ matrix such that $b_{ij} = |a_{ij}|$.
- The one-norm of $A$, denoted $\|A\|_1$, is the maximum column sum of $B$.
- The $\infty$-norm of $A$, denoted $\|A\|_\infty$, is the maximum row sum of $B$.
- The two-norm (or spectral norm) of $A$, denoted $\|A\|_2$, is the non-negative square root of $\rho(A^H A)$. Note that $A^H A$ is square and Hermitian (symmetric when $A$ is real).
- The Frobenius norm of $A$, denoted $\|A\|_F$, is $\sqrt{\sum_{i=1}^{n} \sum_{j=1}^{n} |a_{ij}|^2}$.
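The four definitions can be checked against numpy's built-in `np.linalg.norm` on a hypothetical real matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
B = np.abs(A)                       # entrywise absolute values

one_norm = B.sum(axis=0).max()      # maximum column sum of B
inf_norm = B.sum(axis=1).max()      # maximum row sum of B
two_norm = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))  # sqrt of rho(A^H A)
fro_norm = np.sqrt((B ** 2).sum())  # Frobenius norm
```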
Remark: There are further properties and equivalent definitions of matrix norms, but we do not have time to go over them.
Theorem (Gerschgorin): Let $A$ be an $n \times n$ matrix and define the disks
$$D_i = \left\{ z \in \mathbb{C} : |z - a_{ii}| \le \sum_{j \neq i} |a_{ij}| \right\}, \quad i = 1, \ldots, n.$$
If $\lambda$ is an eigenvalue of $A$, then $\lambda$ is located in $\bigcup_{i=1}^{n} D_i$.
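A numerical check of the theorem on a hypothetical $3 \times 3$ matrix: every eigenvalue lies in some disk centered at a diagonal entry $a_{ii}$ with radius equal to the off-diagonal row sum.

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, -4.0]])

centers = np.diag(A)
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)  # sum_{j != i} |a_ij|

eigvals = np.linalg.eigvals(A)
# Every eigenvalue must fall in at least one Gerschgorin disk.
in_some_disk = [any(abs(lam - c) <= r + 1e-12 for c, r in zip(centers, radii))
                for lam in eigvals]
```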
Iyad Abu-Jeib
January 1, 2005