Operations on matrices such as addition, multiplication, scalar
multiplication, transpose, powers (exponents)
of matrices, and related facts.
Operations on vectors such as addition, scalar multiplication, and
dot (inner) product.
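
For illustration, a brief numpy sketch (not part of the course notes) of
these matrix and vector operations:

    import numpy as np

    A = np.array([[1., 2.], [3., 4.]])
    B = np.array([[0., 1.], [1., 0.]])
    print(A + B)                          # matrix addition
    print(3 * A)                          # scalar multiplication
    print(A @ B)                          # matrix multiplication
    print(A.T)                            # transpose
    print(np.linalg.matrix_power(A, 3))   # matrix power A^3

    u = np.array([1., 2., 3.])
    v = np.array([4., 5., 6.])
    print(u + v, 2 * u, np.dot(u, v))     # vector addition, scaling, dot product
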
Special matrices (such as skew-symmetric, symmetric, orthogonal,
diagonal, lower/upper triangular, identity matrix,
zero matrix). Matrices such as idempotent, nilpotent,
involutory, and Householder were introduced in the homework. Matrices
such as
unitary, Hermitian, and skew-Hermitian will be defined later.
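
As a numeric illustration (my own sketch, assuming numpy), the defining
conditions of a few of these special matrices can be checked directly:

    import numpy as np

    S = np.array([[0., 2.], [-2., 0.]])   # skew-symmetric: S^T = -S
    print(np.allclose(S.T, -S))

    Q = np.array([[0., 1.], [1., 0.]])    # orthogonal: Q^T Q = I
    print(np.allclose(Q.T @ Q, np.eye(2)))

    N = np.array([[0., 1.], [0., 0.]])    # nilpotent: N^2 = 0
    print(np.allclose(np.linalg.matrix_power(N, 2), np.zeros((2, 2))))

    P = np.array([[1., 0.], [0., 0.]])    # idempotent: P^2 = P
    print(np.allclose(P @ P, P))
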
Determinants of special matrices such as diagonal, orthogonal,
lower/upper-triangular,
skew-symmetric of odd order, identity, and inverses of special matrices
such as 2x2 and diagonal.
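
For example (an illustrative sketch of my own), the determinant of a
triangular matrix is the product of its diagonal entries, and the 2x2
inverse has a closed form:

    import numpy as np

    L = np.array([[2., 0., 0.],
                  [1., 3., 0.],
                  [4., 5., 6.]])
    print(np.linalg.det(L), np.prod(np.diag(L)))   # both 36

    A = np.array([[1., 2.], [3., 4.]])
    a, b, c, d = A.ravel()
    A_inv = np.array([[d, -b], [-c, a]]) / (a * d - b * c)   # 2x2 inverse formula
    print(np.allclose(A_inv, np.linalg.inv(A)))
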
Minor, cofactor, adjoint, determinant, inverse, and related facts.
Finding the inverse by using the adjoint.
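
A short Python sketch (mine, using the usual definitions) of the
adjoint-based inverse: adj(A) is the transpose of the cofactor matrix,
and A^{-1} = adj(A) / det(A).

    import numpy as np

    def adjoint(A):
        """Adjoint (adjugate): transpose of the cofactor matrix."""
        n = A.shape[0]
        C = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
                C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)   # cofactor
        return C.T

    A = np.array([[2., 1., 0.], [0., 3., 1.], [1., 0., 2.]])
    A_inv = adjoint(A) / np.linalg.det(A)
    print(np.allclose(A_inv, np.linalg.inv(A)))
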
Solving linear systems whose coefficient matrix is nonsingular by
using the inverse, and the disadvantages of this method.
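
A brief illustration (my own sketch) of the main disadvantage: forming
A^{-1} explicitly costs more and is typically less accurate than a
direct solve.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    b = rng.standard_normal(4)

    x_inv = np.linalg.inv(A) @ b      # inverse-based solution (more work, more roundoff)
    x_solve = np.linalg.solve(A, b)   # factorization-based solve (preferred in practice)
    print(np.allclose(x_inv, x_solve))
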
Consistent and inconsistent systems of linear equations, homogeneous
systems of linear equations, equivalent systems of linear equations,
geometric interpretation of a solution of a linear system of equations,
systems with infinitely many solutions, solving a system of linear
equations by substitution and by elimination, solving
singular/non-singular diagonal systems,
solving singular/non-singular lower (resp. upper) triangular systems by
forward (resp. backward)
substitution, writing a system of linear equations in the form
Ax=b, where A is the coefficient matrix, x is the
vector of unknowns, and b is the vector of constant terms.
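
A minimal sketch (my own, assuming a nonsingular triangular coefficient
matrix) of forward and backward substitution:

    import numpy as np

    def forward_substitution(L, b):
        """Solve Lx = b for lower-triangular L with nonzero diagonal."""
        n = len(b)
        x = np.zeros(n)
        for i in range(n):
            x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
        return x

    def backward_substitution(U, b):
        """Solve Ux = b for upper-triangular U with nonzero diagonal."""
        n = len(b)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
        return x

    L = np.array([[2., 0.], [1., 3.]])
    print(forward_substitution(L, np.array([4., 5.])))   # [2. 1.]
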
Cramer's rule for solving linear systems.
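
A short sketch (mine) of Cramer's rule, x_i = det(A_i) / det(A), where
A_i is A with its i-th column replaced by b:

    import numpy as np

    def cramer(A, b):
        """Solve Ax = b by Cramer's rule (A square and nonsingular)."""
        det_A = np.linalg.det(A)
        x = np.zeros(len(b))
        for i in range(len(b)):
            A_i = A.copy()
            A_i[:, i] = b                       # replace column i by b
            x[i] = np.linalg.det(A_i) / det_A
        return x

    A = np.array([[2., 1.], [1., 3.]])
    b = np.array([3., 5.])
    print(cramer(A, b), np.linalg.solve(A, b))   # both [0.8 1.4]
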
Gaussian elimination, row echelon form, solving linear systems by
Gaussian elimination, finding the determinant by using Gaussian
elimination.
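
A compact sketch (my own, with partial pivoting added for stability) of
Gaussian elimination that reduces A to row echelon form and reads the
determinant off the pivots:

    import numpy as np

    def gaussian_elimination_det(A):
        """Reduce A to upper-triangular form U; return (U, det A)."""
        U = A.astype(float).copy()
        n = U.shape[0]
        sign = 1.0
        for k in range(n - 1):
            p = k + np.argmax(np.abs(U[k:, k]))   # partial pivoting
            if p != k:
                U[[k, p]] = U[[p, k]]
                sign = -sign                      # a row swap flips the sign of det
            for i in range(k + 1, n):
                m = U[i, k] / U[k, k]
                U[i, k:] -= m * U[k, k:]          # eliminate below the pivot
        return U, sign * np.prod(np.diag(U))

    A = np.array([[0., 2., 1.], [1., 1., 1.], [2., 1., 3.]])
    U, det_A = gaussian_elimination_det(A)
    print(det_A, np.linalg.det(A))                # both -3
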
Gauss-Jordan elimination, reduced
row echelon form,
elementary matrices, row-equivalent matrices,
solving linear systems by Gauss-Jordan elimination, finding the inverse
by
using Gauss-Jordan elimination,
finding the determinant by using Gauss-Jordan elimination.
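
A sketch (mine) of finding the inverse by Gauss-Jordan elimination:
row-reduce the augmented matrix [A | I] to [I | A^{-1}].

    import numpy as np

    def inverse_gauss_jordan(A):
        """Row-reduce [A | I] to reduced row echelon form [I | A^{-1}]."""
        n = A.shape[0]
        M = np.hstack([A.astype(float), np.eye(n)])
        for k in range(n):
            p = k + np.argmax(np.abs(M[k:, k]))   # partial pivoting
            M[[k, p]] = M[[p, k]]
            M[k] /= M[k, k]                       # scale so the pivot is 1
            for i in range(n):
                if i != k:
                    M[i] -= M[i, k] * M[k]        # clear the rest of the pivot column
        return M[:, n:]

    A = np.array([[2., 1.], [5., 3.]])
    print(inverse_gauss_jordan(A))                # [[ 3. -1.] [-5.  2.]]
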
Span, nullspace, and linear independence &
dependence.
Basis and dimension, extending a linearly independent set in a
vector space V to form a
basis for V, finding, within a set S, a basis for span S.
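
As a quick illustration (my own sketch, assuming sympy), the pivot
columns found by row reduction give a basis, within S, for span S, and
comparing the rank with the number of vectors tests independence:

    from sympy import Matrix

    S = Matrix([[1, 2, 3],
                [2, 4, 1],
                [3, 6, 4]])          # the columns are the vectors in S
    rref_form, pivots = S.rref()
    print(pivots)                    # columns of S forming a basis for span S: (0, 2)
    print(S.rank() == S.shape[1])    # True iff the columns are linearly independent
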
Linear independence of functions and the Wronskian.
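
A small symbolic sketch (mine): if the Wronskian is not identically
zero, the functions are linearly independent.

    from sympy import symbols, sin, cos, Matrix, diff, simplify

    x = symbols('x')
    f, g = sin(x), cos(x)
    W = Matrix([[f, g],
                [diff(f, x), diff(g, x)]]).det()   # Wronskian of f and g
    print(simplify(W))   # -1, never zero, so f and g are linearly independent
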
Basis for and dimension (nullity) of the nullspace of a matrix A
(the solution space of
A x = 0).
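
A short sketch (mine, assuming sympy) of finding such a basis:

    from sympy import Matrix

    A = Matrix([[1, 2, 1],
                [2, 4, 2]])
    basis = A.nullspace()   # basis vectors for the solution space of Ax = 0
    print(basis)            # two vectors, so the nullity is 2
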
Row space and row rank, column space and column rank, rank,
the relationship between rank and nullity, and how these concepts
relate to solutions of homogeneous and non-homogeneous systems,
invertible matrices, and row equivalence.
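
A quick numeric check (my own sketch) of the rank-nullity relationship,
rank(A) + nullity(A) = number of columns of A:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [0, 1, 1, 0],
                [1, 3, 1, 1]])       # third row = first row + second row
    print(A.rank())                                  # 2 (row rank = column rank)
    print(A.rank() + len(A.nullspace()) == A.cols)   # True
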
p-norm of vectors (including one-norm, two-norm,
and infinity-norm);
orthogonal and orthonormal vectors; orthogonal and orthonormal sets;
unit vectors; relationship between orthogonal or orthonormal sets and
linearly independent sets; orthonormal bases for R^n;
orthogonal matrices (revisited);
Gram-Schmidt orthogonalization process.
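
A minimal sketch (mine) of classical Gram-Schmidt, turning a linearly
independent set into an orthonormal set (norms are two-norms):

    import numpy as np

    def gram_schmidt(V):
        """Orthonormalize the columns of V (assumed linearly independent)."""
        Q = np.zeros(V.shape)
        for j in range(V.shape[1]):
            q = V[:, j].astype(float)
            for i in range(j):
                q = q - (Q[:, i] @ V[:, j]) * Q[:, i]   # remove the component along q_i
            Q[:, j] = q / np.linalg.norm(q)             # normalize to a unit vector
        return Q

    V = np.array([[1., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 1.]])
    Q = gram_schmidt(V)
    print(np.allclose(Q.T @ Q, np.eye(3)))              # columns are orthonormal
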
Linear Transformations (Chapter 10): one-to-one, onto, domain, target
(codomain), range, kernel, nullity of a linear
transformation, rank of a linear transformation, composition,
matrix of a linear
transformation, transition matrix and change of basis.
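
A brief sketch (mine) of change of basis for the matrix of a linear
transformation: if the columns of the transition matrix P are the new
basis vectors in standard coordinates, the matrix in the new basis is
P^{-1} A P.

    import numpy as np

    A = np.array([[2., 1.],
                  [0., 3.]])           # matrix of T in the standard basis
    P = np.array([[1., 1.],
                  [0., 1.]])           # transition matrix (new basis as columns)
    A_new = np.linalg.inv(P) @ A @ P   # matrix of T in the new basis
    print(A_new)

    x_new = np.array([1., 2.])         # coordinates of a vector in the new basis
    print(np.allclose(P @ (A_new @ x_new), A @ (P @ x_new)))   # coordinates agree
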
Review of complex numbers.
Matrix conjugate, Hermitian conjugate,
Hermitian, skew-Hermitian, normal, and unitary matrices.
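
As a small numeric illustration (mine), the Hermitian conjugate is the
conjugate transpose, and the defining properties can be checked
directly:

    import numpy as np

    H = np.array([[2., 1 - 1j],
                  [1 + 1j, 3.]])
    print(np.allclose(H, H.conj().T))   # Hermitian: A equals its conjugate transpose

    U = np.array([[1., 1j],
                  [1j, 1.]]) / np.sqrt(2)
    print(np.allclose(U.conj().T @ U, np.eye(2)))   # unitary: U^* U = I
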
Eigenvalues and eigenvectors of matrices, eigenpairs, eigenvalue
problems, eigenspace, geometric multiplicity, algebraic
multiplicity,
defective and
nondefective matrices, diagonalization, similar matrices and their
eigenstructure, theorems related to the eigenstructure.
Complex eigenvalues and eigenvectors.
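
A short sketch (mine): numpy returns eigenpairs, diagonalization
A = P D P^{-1} can be verified directly, and a rotation matrix shows
that a real matrix can have complex eigenvalues.

    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])
    w, P = np.linalg.eig(A)            # eigenvalues w, eigenvectors as the columns of P
    D = np.diag(w)
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # A is nondefective, so A = P D P^{-1}

    R = np.array([[0., -1.],
                  [1., 0.]])           # rotation by 90 degrees
    print(np.linalg.eig(R)[0])         # complex eigenvalues 1j and -1j
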
The spectral radius and matrix norms.
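
A quick illustration (mine): the spectral radius is the largest
eigenvalue magnitude and is bounded above by each induced matrix norm.

    import numpy as np

    A = np.array([[1., 2.],
                  [3., 4.]])
    rho = max(abs(np.linalg.eigvals(A)))   # spectral radius, about 5.37
    print(rho)
    print(np.linalg.norm(A, 1),            # max column sum: 6
          np.linalg.norm(A, 2),            # largest singular value, about 5.46
          np.linalg.norm(A, np.inf))       # max row sum: 7
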
Then: Block matrices.
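
A tiny sketch (mine) of assembling a block matrix with numpy:

    import numpy as np

    A = np.eye(2)
    B = np.zeros((2, 3))
    C = np.ones((1, 2))
    D = np.full((1, 3), 2.)
    M = np.block([[A, B],
                  [C, D]])   # 3x5 matrix built from four blocks
    print(M)
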
If time permits: applications of the topics we learned to other
fields; other topics.