Below is a list of terminology and results you should know for the exam. The exam is cumulative, but there will be an emphasis on material covered since the last midterm. For the terminology you should be prepared to give definitions. For the results, I've indicated which results you should be able to prove. For the others, you should know the statement and how to apply it. This list may change slightly over the next few days as I finish writing the final.
- Vector spaces.
Examples: F^n, Functions Fun(S,F) for a set S, polynomials, vector space of linear maps L(V,W) and matrices M_n(F), vector space of multilinear (possibly alternating) k-forms.
Subspaces. Complements. Direct sums. Linear independence, span. Linear dependence lemma. Linear independence theorem. Dimension, finite- vs. infinite-dimensional.
Kernel, image. Rank--nullity. Examples. Invertibility, equivalent conditions for invertibility, formula for the inverse of a 2x2 matrix. Defining a linear map on a basis. Linear maps and matrices. Matrix multiplication. How to go from a formula for a linear map to its matrix. Eigenvalues and eigenvectors (know how to find them). Diagonalizability. Invariant subspaces. Vector space of linear maps, dual spaces.
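The 2x2 inverse formula and eigenvalue computation above can be sanity-checked numerically. This is a minimal sketch using numpy (not exam material), with an arbitrary example matrix:

```python
import numpy as np

# Hypothetical example: check the 2x2 inverse formula
# inv([[a,b],[c,d]]) = (1/(ad-bc)) [[d,-b],[-c,a]]
A = np.array([[3.0, 1.0],
              [2.0, 4.0]])
a, b, c, d = A.ravel()
det = a * d - b * c                       # determinant ad - bc
inv_formula = np.array([[d, -b],
                        [-c, a]]) / det
assert np.allclose(inv_formula, np.linalg.inv(A))

# Eigenvalues of a 2x2 matrix: roots of t^2 - (tr A) t + det A
tr = a + d
disc = np.sqrt(tr**2 - 4 * det)
eigs = sorted([(tr - disc) / 2, (tr + disc) / 2])
assert np.allclose(eigs, sorted(np.linalg.eigvals(A).real))
```

For this matrix the characteristic polynomial is t^2 - 7t + 10, giving eigenvalues 2 and 5.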
Division algorithm (know the linear algebra proof). Root theorem (with proof). Fundamental theorem of algebra. Factoring polynomials over R and over C. Polynomial of an operator. Satisfied polynomials and eigenvalues. Eigenvector/invariant subspace existence for complex/real operators (with proof).
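The relationship between satisfied polynomials and eigenvalues can be illustrated numerically: if p(T) = 0, every eigenvalue of T is a root of p. A small sketch with a made-up matrix (numpy, not exam material):

```python
import numpy as np

# Hypothetical example: T satisfies p(t) = t^2 - 5t + 6 = (t-2)(t-3),
# so every eigenvalue of T lies in {2, 3}.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])
p_of_T = T @ T - 5 * T + 6 * np.eye(2)   # polynomial of the operator: p(T)
assert np.allclose(p_of_T, 0)             # T satisfies p
# Both eigenvalues of T are roots of p
assert set(np.round(np.linalg.eigvals(T)).astype(int)) <= {2, 3}
```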
Real inner product spaces.
Inner product, inner product space. Pythagorean theorem/Cauchy--Schwarz inequality/Triangle inequality (with proofs). Angles and lengths. Orthonormal vectors. Gram--Schmidt algorithm (be able to apply it). Orthogonal complements. Representation proposition (with proof). Matrix of an inner product.
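The Gram--Schmidt algorithm mentioned above can be sketched in a few lines. A minimal numpy version for the standard dot product on R^n, with an arbitrary example (not exam material):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram--Schmidt: orthonormalize a list of linearly
    independent vectors with respect to the standard dot product."""
    basis = []
    for v in vectors:
        # subtract the projections onto the vectors found so far
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))   # normalize
    return basis

# Example (made up for illustration): two vectors in R^3
e1, e2 = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                       np.array([1.0, 0.0, 1.0])])
assert np.allclose(np.dot(e1, e2), 0)       # orthogonal
assert np.allclose(np.linalg.norm(e1), 1)   # unit length
```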
Operators on inner product spaces.
Adjoints. Computing the adjoint in examples. Spectral theorem. Properties of adjoints (relation between kernel/image of T and its adjoint, with proof). Isometries/characterization (with proof). Positive operators/characterization (with proof). Square roots.
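For a real matrix with the standard inner product, the adjoint is the transpose, i.e. <Tx, y> = <x, T^t y>. A quick numerical check on a randomly chosen example (numpy, not exam material):

```python
import numpy as np

# Random example matrix and vectors (arbitrary, for illustration only)
rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))
x, y = rng.standard_normal(3), rng.standard_normal(3)

# Defining property of the adjoint: <Tx, y> = <x, T* y>, with T* = T^t here
assert np.isclose(np.dot(T @ x, y), np.dot(x, T.T @ y))
```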
Matrix decomposition theorems.
Eigen-decomposition (with proof, know how to compute it for a 2x2 matrix, know how to use it to compute square roots), polar decomposition (with proof in the invertible case), singular value decomposition, singular values (know how to compute singular values for a 2x2).
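Two of the computations above can be checked numerically: the square root of a positive operator via its eigen-decomposition, and singular values as square roots of the eigenvalues of B^t B. A numpy sketch with arbitrary example matrices (not exam material):

```python
import numpy as np

# Square root via eigen-decomposition: A = Q D Q^t, sqrt(A) = Q sqrt(D) Q^t.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # symmetric, eigenvalues 1 and 3
eigvals, Q = np.linalg.eigh(A)       # eigen-decomposition of a symmetric matrix
sqrt_A = Q @ np.diag(np.sqrt(eigvals)) @ Q.T
assert np.allclose(sqrt_A @ sqrt_A, A)

# Singular values of B = square roots of the eigenvalues of B^t B
B = np.array([[1.0, 2.0],
              [0.0, 2.0]])
sv_by_def = np.sqrt(np.linalg.eigvalsh(B.T @ B))[::-1]   # descending order
assert np.allclose(sv_by_def, np.linalg.svd(B, compute_uv=False))
```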
k-multilinear form, alternating form, sign of a permutation, determinant of a matrix. Determinant theorem (i.e. det is the unique function such that...). Know that det(AB)=det(A) det(B) (you don't need to know the proof). Know how to compute 2x2 and 3x3 determinants.
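The 3x3 determinant computation and the multiplicativity det(AB) = det(A) det(B) can both be checked on small examples. A numpy sketch using cofactor expansion along the first row, with arbitrary matrices (not exam material):

```python
import numpy as np

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    a, b, c = m[0]
    return (a * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - b * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + c * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [5.0, 6.0, 0.0]])
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])
assert np.isclose(det3(A), np.linalg.det(A))
# Multiplicativity: det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), det3(A) * det3(B))
```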
Stochastic matrices have eigenvalue 1 (cf. HW6 #10 and the PageRank lecture). Laplace matrix, matrix-tree theorem. Metric spaces, distance geometry theorem.
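The stochastic-matrix result is easy to verify numerically: a column-stochastic matrix P satisfies 1^t P = 1^t, so P^t fixes the all-ones vector and 1 is an eigenvalue of P. A numpy check on an arbitrary transition matrix (not exam material):

```python
import numpy as np

# Arbitrary column-stochastic matrix (each column sums to 1)
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])
assert np.allclose(P.sum(axis=0), 1)             # column-stochastic
assert np.allclose(np.ones(3) @ P, np.ones(3))   # 1^t P = 1^t
eigs = np.linalg.eigvals(P)
assert any(np.isclose(e, 1) for e in eigs)       # 1 is an eigenvalue of P
```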