Definition: Let $A$ be an $n \times n$ matrix. The number $\lambda$ is called an eigenvalue of $A$ if there exists a nonzero vector $x$ such that $Ax = \lambda x$. Every nonzero vector $x$ satisfying $Ax = \lambda x$ is called an eigenvector of $A$ corresponding to the eigenvalue $\lambda$, and $(\lambda, x)$ is called an eigenpair of $A$. The set of all eigenvalues of $A$ is called the spectrum of $A$, denoted $\sigma(A)$.
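For instance (the matrix and vector here are only an illustrative choice): if $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ and $x = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$, then $Ax = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3x$, so $3$ is an eigenvalue of $A$, $x$ is an eigenvector of $A$ corresponding to $3$, and $(3, x)$ is an eigenpair of $A$.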
Remarks:
- Eigenvalues are also called proper values (eigen is a German word meaning proper), characteristic values, or latent values. Eigenvectors are also called proper vectors, characteristic vectors, or latent vectors.
- $Ax = \lambda x$ iff $(A - \lambda I)x = 0$. Thus, $\lambda$ is an eigenvalue of $A$ iff $(A - \lambda I)x = 0$ has a nontrivial solution (i.e. the solution space is not just the zero vector) iff $A - \lambda I$ is singular iff $\det(A - \lambda I) = 0$.
- Eigenvectors are also called right eigenvectors. Left eigenvectors are defined as follows: a nonzero vector $y$ is said to be a left eigenvector of $A$ associated with the eigenvalue $\lambda$ iff $y^T A = \lambda y^T$ (i.e. $y$ is a right eigenvector of $A^T$ associated with the eigenvalue $\lambda$). Note that $y$ here is a column vector. Note also that $A$ and $A^T$ have the same eigenvalues but not necessarily the same eigenvectors.
- If the eigenvalues of $A$ are distinct, and $x_i$ is a right eigenvector of $A$ corresponding to the eigenvalue $\lambda_i$ and $y_j$ is a left eigenvector of $A$ corresponding to the eigenvalue $\lambda_j$, where $\lambda_i \neq \lambda_j$, then $y_j^T x_i = 0$; if $\lambda_i = \lambda_j$ (i.e. $i = j$), then $y_i^T x_i \neq 0$.
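To illustrate the last two remarks with a concrete (purely illustrative) matrix, take $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$, whose eigenvalues are $2$ and $3$. Right eigenvectors: $x_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ for $\lambda_1 = 2$ and $x_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ for $\lambda_2 = 3$. Left eigenvectors: $y_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ and $y_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, since $y_1^T A = (2, -2) = 2\, y_1^T$ and $y_2^T A = (0, 3) = 3\, y_2^T$. As expected, $y_1^T x_2 = 0$ and $y_2^T x_1 = 0$, while $y_1^T x_1 = 1 \neq 0$ and $y_2^T x_2 = 1 \neq 0$.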
Definition: The polynomial $p(\lambda) = \det(A - \lambda I)$ is called the characteristic polynomial of $A$ and the equation $\det(A - \lambda I) = 0$ is called the characteristic equation of $A$. The roots (zeros) of the characteristic polynomial are the eigenvalues of $A$.
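As a quick illustration (same example matrix as above): for $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$, $\det(A - \lambda I) = \det\begin{pmatrix} 2-\lambda & 1 \\ 0 & 3-\lambda \end{pmatrix} = (2-\lambda)(3-\lambda)$, so the characteristic equation $(2-\lambda)(3-\lambda) = 0$ gives the eigenvalues $\lambda = 2$ and $\lambda = 3$.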
Remarks:
- Some people call $\det(\lambda I - A)$ the characteristic polynomial and $\det(\lambda I - A) = 0$ the characteristic equation. Note that the characteristic polynomial is of degree $n$ and $\det(\lambda I - A) = (-1)^n \det(A - \lambda I)$. Thus, $\det(\lambda I - A)$ and $\det(A - \lambda I)$ have the same roots.
- If you expand $\det(\lambda I - A)$, then
  - The coefficient of $\lambda^n$ is $1$ and the coefficient of $\lambda^{n-1}$ is $-\operatorname{tr}(A)$.
  - $\det(\lambda I - A) = (\lambda - \lambda_1)(\lambda - \lambda_2)\cdots(\lambda - \lambda_n)$, where $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$ are the eigenvalues of $A$ (including repeated eigenvalues).
  - The constant term is $\det(-A) = (-1)^n \det(A) = (-1)^n \lambda_1 \lambda_2 \cdots \lambda_n$. Thus, $0$ is an eigenvalue of $A$ iff $A$ is singular iff $\det(A) = 0$ iff $\lambda_1 \lambda_2 \cdots \lambda_n = 0$.
- If you expand $\det(A - \lambda I)$, then you get similar things (but remember $\det(A - \lambda I) = (-1)^n \det(\lambda I - A)$). In particular, you'll have here also $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$ and $\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$. Thus, the determinant of $A$ is equal to the product of its eigenvalues and the trace of $A$ is equal to the sum of its eigenvalues.
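Continuing the illustrative example $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ with eigenvalues $2$ and $3$: $\det(A) = 6 = 2 \cdot 3$ and $\operatorname{tr}(A) = 5 = 2 + 3$, as the product and sum formulas predict.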
Definition: Let $\lambda$ be an eigenvalue of the matrix $A$ and let $E_\lambda$ be the set consisting of the zero vector and all eigenvectors of $A$ associated with $\lambda$. Then $E_\lambda$ is a subspace of $\mathbb{C}^n$ and it's called the eigenspace of $A$ associated with $\lambda$. The dimension of $E_\lambda$ is called the geometric multiplicity of $\lambda$. The algebraic multiplicity of $\lambda$ is the multiplicity of $\lambda$ as a root of the characteristic polynomial. An eigenvalue is called simple if its algebraic multiplicity is 1, and multiple if its algebraic multiplicity is greater than 1. A matrix $A$ is called stable if the real part of each eigenvalue of $A$ is negative (i.e. all the eigenvalues of $A$ lie in the open left half plane).
Theorem: The geometric multiplicity of an eigenvalue is less than or
equal to its algebraic
multiplicity.
Examples:
- Let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then $\lambda = 1$ is an eigenvalue of $A$ of algebraic multiplicity 2 and geometric multiplicity 1. Thus, the geometric multiplicity of $\lambda = 1$ as an eigenvalue of $A$ is less than its algebraic multiplicity.
- Let $A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$. Then $\lambda = 1$ is an eigenvalue of $A$ of algebraic multiplicity 2 and geometric multiplicity 2. Thus, the geometric multiplicity of $\lambda = 1$ as an eigenvalue of $A$ is equal to its algebraic multiplicity.
Definition: An $n \times n$ matrix $A$ is called diagonalizable (or can be diagonalized) iff there exists a nonsingular matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$. If such a matrix $P$ exists, we say $P$ diagonalizes $A$. I.e. $A$ is diagonalizable iff it's similar to a diagonal matrix.
Definition: If an $n \times n$ matrix $A$ has fewer than $n$ linearly independent eigenvectors, $A$ is called defective.
Definition: Let $T$ be a linear operator on a vector space $V$. An eigenvector of $T$ is a nonzero vector $v$ in $V$ such that $T(v) = \lambda v$ for some scalar $\lambda$. In this case, we say $\lambda$ is an eigenvalue of $T$.
Remark: Let $A$ be the matrix of a linear operator $T$ (i.e. $T(x) = Ax$). Then the eigenvalues/eigenvectors of $T$ and $A$ are the same.
Note: The problem in which we have to determine
eigenvalues/eigenvectors is called an eigenvalue problem.
How to find the eigenvalues and associated eigenvectors of an $n \times n$ matrix $A$?
- Eigenvalues: find the roots of the characteristic polynomial $\det(A - \lambda I)$; i.e. solve the characteristic equation $\det(A - \lambda I) = 0$. These are the eigenvalues of $A$.
- Eigenvectors: For each eigenvalue $\lambda$, find a basis for the solution space of $(A - \lambda I)x = 0$. The vectors in the basis are linearly independent eigenvectors of $A$ associated with $\lambda$.
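A short worked example of this procedure (the matrix is chosen only for illustration): let $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$. Then $\det(A - \lambda I) = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3)$, so the eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = 3$. For $\lambda_1 = 1$, $(A - I)x = 0$ becomes $\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} x = 0$, whose solution space is spanned by $x_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$; for $\lambda_2 = 3$, $(A - 3I)x = 0$ becomes $\begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} x = 0$, whose solution space is spanned by $x_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$. Thus $(1, x_1)$ and $(3, x_2)$ are eigenpairs of $A$.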
Reminder of definitions we introduced in the past:
- An $n \times n$ matrix $A$ is called nilpotent iff $A^k = 0$ for some positive integer $k$.
- An $n \times n$ matrix $A$ is called idempotent iff $A^2 = A$.
- An $n \times n$ matrix $A$ is said to be similar to the $n \times n$ matrix $B$ iff there exists an invertible $n \times n$ matrix $P$ such that $B = P^{-1}AP$. Note that similarity is an equivalence relation.
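Quick illustrations (these particular matrices are only examples): $N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ is nilpotent since $N^2 = 0$, and $E = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$ is idempotent since $E^2 = E$. Consistent with the theorems listed further below, the eigenvalues of $N$ are $0, 0$ and the eigenvalues of $E$ are $1, 0$.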
Remark: Let $x_1, x_2, \ldots, x_n$ be eigenvectors of the $n \times n$ matrix $A$ corresponding to the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, and let $P = [x_1\ x_2\ \cdots\ x_n]$. Then $AP = PD$, where $D = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$. If $x_1, x_2, \ldots, x_n$ are linearly independent, then $P$ is nonsingular, and thus we get $P^{-1}AP = D$, which means $P$ diagonalizes $A$. Note also we can write $P^{-1}A = DP^{-1}$. Thus, the columns of $P$ are right eigenvectors of $A$ and the rows of $P^{-1}$ are left eigenvectors of $A$ (and right eigenvectors of $A^T$).
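A minimal numerical sketch of this remark, assuming NumPy is available (the matrix is an arbitrary illustrative choice):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])      # illustrative matrix with distinct eigenvalues

    eigvals, P = np.linalg.eig(A)   # columns of P are right eigenvectors of A
    D = np.diag(eigvals)            # D = diag(lambda_1, ..., lambda_n)

    # Check AP = PD and P^{-1} A P = D (up to rounding error).
    print(np.allclose(A @ P, P @ D))                   # True
    print(np.allclose(np.linalg.inv(P) @ A @ P, D))    # True

    # Rows of P^{-1} act as left eigenvectors: P^{-1} A = D P^{-1}.
    Pinv = np.linalg.inv(P)
    print(np.allclose(Pinv @ A, D @ Pinv))             # True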
Theorems: Let $A$ and $B$ be $n \times n$ matrices.
- If $(\lambda, x)$ is an eigenpair of $A$ and $\alpha$ is a nonzero number, then $(\lambda, \alpha x)$ is an eigenpair of $A$, and $(\lambda^k, x)$ is an eigenpair of $A^k$, $k = 1, 2, \ldots$. If $\lambda$ is a non-real eigenvalue and $A$ is real, then $(\bar{\lambda}, \bar{x})$ is an eigenpair of $A$.
- $A$ and $A^T$ have the same eigenvalues but not necessarily the same eigenvectors.
- $AB$ and $BA$ have the same eigenvalues but not necessarily the same eigenvectors.
- If $A$ is nonsingular, then $(\lambda, x)$ is an eigenpair of $A$ iff $(1/\lambda, x)$ is an eigenpair of $A^{-1}$.
- The eigenvalues of a diagonal/lower-triangular/upper-triangular
matrix are equal to the elements
of the main diagonal.
- $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors. Moreover, if $A$ is similar to a diagonal matrix $D$, then the diagonal elements of $D$ are the eigenvalues of $A$.
- If $A$ is defective, then $A$ is non-diagonalizable.
- If $A$ is nondefective and $x_1, x_2, \ldots, x_n$ are linearly independent eigenvectors of $A$ corresponding to the eigenvalues $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$, respectively, and $P = [x_1\ x_2\ \cdots\ x_n]$, then $P^{-1}AP = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$.
- If $A$ and $B$ are similar, then they have the same eigenvalues. You can easily find a relationship between their eigenvectors.
- If $A$ is idempotent and $\lambda$ is an eigenvalue of $A$, then $\lambda = 0$ or $\lambda = 1$.
- If $A$ is nilpotent and $\lambda$ is an eigenvalue of $A$, then $\lambda = 0$.
- The eigenvalues of a Hermitian matrix are real and the eigenvalues
of a skew-Hermitian matrix have zero
real parts.
- If $\lambda$ is an eigenvalue of a unitary matrix, then $|\lambda| = 1$.
- $A$ is normal iff $A$ has $n$ orthogonal eigenvectors.
- For a $2 \times 2$ matrix $A$, the characteristic polynomial is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$.
- If $A$ is diagonalizable, then so are $A^T$ and $A^k$, $k = 1, 2, \ldots$, and if $A$ is nonsingular, then $A^{-1}$ is also diagonalizable.
- If the eigenvalues of $A$ are distinct, then $A$ is diagonalizable.
- $A$ is diagonalizable iff $A$ has a complete set of $n$ linearly independent eigenvectors.
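A small numerical sanity check of a few of these statements, as a minimal sketch assuming NumPy (the random test matrices are arbitrary illustrative choices):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))

    def sorted_eigs(M):
        # Eigenvalues sorted by real part, then imaginary part, for set comparison.
        return np.sort_complex(np.linalg.eigvals(M))

    # A and A^T have the same eigenvalues.
    print(np.allclose(sorted_eigs(A), sorted_eigs(A.T)))        # True

    # AB and BA have the same eigenvalues.
    print(np.allclose(sorted_eigs(A @ B), sorted_eigs(B @ A)))  # True

    # The eigenvalues of a Hermitian matrix are real.
    H = A + A.T   # real symmetric, hence Hermitian
    print(np.allclose(np.linalg.eigvals(H).imag, 0))            # True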
Exercises:
Prove (1), (2), (4), (5), (7), (9) (here also find the relationship between the eigenvectors of the two matrices involved), (10), (11), (12), (13), (15), (16).