BYU MATH 213 - Mark Hughes - FINAL STUDY
Angle between vectors : cos θ = (u ⋅ v) / (||u|| ||v||)
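The angle formula can be checked with a short Python sketch (the helper name `angle_between` is mine, not from the course):

```python
import math

def angle_between(u, v):
    """Angle between two vectors via cos(theta) = (u . v) / (||u|| ||v||)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.acos(dot / (norm_u * norm_v))

# (1, 0) and (0, 1) are perpendicular, so the angle is pi/2
print(angle_between((1, 0), (0, 1)))  # 1.5707963267948966
```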
Projection of v onto u : proj_u v = ((v ⋅ u) / (u ⋅ u)) u
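A minimal sketch of the projection formula (the function name `proj` is my choice):

```python
def proj(v, u):
    """Projection of v onto u: ((v . u) / (u . u)) * u."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    c = dot(v, u) / dot(u, u)
    return tuple(c * x for x in u)

# Projecting (1, 2) onto the x-axis direction (3, 0) keeps only the x-part
print(proj((1, 2), (3, 0)))  # (1.0, 0.0)
```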
What if a set contains more vectors than each vector has entries? : the set is linearly dependent, since n > m (more vectors than dimensions)
What happens if a set contains the 0 vector? : it is linearly dependent
Determinant of A (2x2) : det A = ad - bc
Inverse of a 2x2 matrix : A^-1 = (1/(ad-bc)) [d -b] [-c a], defined only when ad - bc ≠ 0
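The determinant and inverse formulas together, as a sketch (the singularity check mirrors the ad - bc ≠ 0 condition; `inverse_2x2` is a name I made up):

```python
def inverse_2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via (1/(ad - bc)) [[d, -b], [-c, a]]."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular (ad - bc = 0)")
    return [[d / det, -b / det], [-c / det, a / det]]

# det = 4*6 - 7*2 = 10, so the inverse is (1/10) [[6, -7], [-2, 4]]
print(inverse_2x2(4, 7, 2, 6))  # [[0.6, -0.7], [-0.2, 0.4]]
```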
When is a transformation linear? :
- T(u+v) = T(u) + T(v)
- T(cu) = cT(u)
SUBSPACE :
- 0 is in S
- u+v is in S (closed under vector addition)
- cu is in S (closed under scalar multiplication)
- span{v1, ..., vk} is always a subspace
How to find a basis for row, col, null :
- row: row reduce and take the nonzero rows
- col: row reduce and take the pivot columns of the original matrix
- null: find the RREF and the free variables; write the solution vectors in terms of the free variables and take the numeric coefficient vectors
How do you diagonalize a matrix A? :
1) find the eigenvalues of A
2) find a basis for the eigenspace of each eigenvalue
3) P = the eigenspace basis vectors as columns, ordered from lowest lambda to highest
4) P^-1 is found by inverting P
5) D = the diagonal matrix of the eigenvalues of A, in the same order as the columns of P
Then A = PDP^-1 (this only works when A has n linearly independent eigenvectors).
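Step 1 of the recipe can be done by hand for a 2x2 matrix: the characteristic polynomial is λ² - (tr A)λ + det A = 0, solved by the quadratic formula. A sketch under the assumption that the eigenvalues are real (`eigenvalues_2x2` is my name):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from lambda^2 - (tr A)lambda + det A = 0."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

# A = [[2, 1], [1, 2]]: tr = 4, det = 3, so lambda = 1 and 3
print(eigenvalues_2x2(2, 1, 1, 2))  # [1.0, 3.0]
```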
3 conditions to prove the null space of A is a subspace :
- the conditions are that 0, u+v, and cu are all in the set
- A0 = 0, so 0 is in Nul A; and if Au = 0 and Av = 0, then A(u+v) = 0 and A(cu) = 0
Finding the standard matrix of a transformation :
- A = [ T(1,0,0) T(0,1,0) T(0,0,1) ], i.e. the columns are T applied to the standard basis vectors
- for transformations on R^2, such as projections, use T(1,0) and T(0,1)
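The column-by-column construction above can be sketched generically (the helper `standard_matrix` is my invention; the example transformation is projection onto the x-axis):

```python
def standard_matrix(T, n):
    """Build the standard matrix: columns are T(e_1), ..., T(e_n)."""
    cols = []
    for i in range(n):
        e = [0] * n
        e[i] = 1       # the i-th standard basis vector
        cols.append(T(e))
    # transpose the list of columns into a list of rows
    return [list(row) for row in zip(*cols)]

# Projection onto the x-axis in R^2: T(x, y) = (x, 0)
proj_x = lambda v: (v[0], 0)
print(standard_matrix(proj_x, 2))  # [[1, 0], [0, 0]]
```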
Orthonormal Set of Vectors : Orthogonal set of vectors that are also unit vectors
Orthogonal Matrix (nxn) :
- UU^T = U^TU = In
- it is square
- U^T = U^-1, and both are orthogonal
- det U = ±1
- |lambda| = 1 for every eigenvalue
- U has orthonormal rows and orthonormal columns
- rank U = n
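The defining condition UUᵀ = I is the same as saying the rows are orthonormal, which is easy to test directly. A sketch (the checker name `is_orthogonal` is mine; rotation matrices are a standard example of orthogonal matrices):

```python
import math

def is_orthogonal(U, tol=1e-9):
    """Check U U^T = I, i.e. the rows of U are orthonormal."""
    n = len(U)
    for i in range(n):
        for j in range(n):
            dot = sum(U[i][k] * U[j][k] for k in range(n))
            if abs(dot - (1 if i == j else 0)) > tol:
                return False
    return True

# A rotation matrix is orthogonal
t = math.pi / 3
R = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
print(is_orthogonal(R))  # True
```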
Gram-Schmidt Process : Given a basis {x1, ..., xp} for a subspace W, set
v1 = x1
v2 = x2 - ((x2 ⋅ v1)/(v1 ⋅ v1)) v1
...
vk = xk - the sum over j < k of ((xk ⋅ vj)/(vj ⋅ vj)) vj
Then {v1, ..., vp} is an orthogonal basis for W; divide each vk by its norm to get an orthonormal basis.
Gram-Schmidt Example : For x1 = (3, 1), x2 = (1, 2):
v1 = x1 = (3, 1)
v2 = x2 - ((x2 ⋅ v1)/(v1 ⋅ v1)) v1 = (1, 2) - (5/10)(3, 1) = (-1/2, 3/2)
Check: v1 ⋅ v2 = 3(-1/2) + 1(3/2) = 0.
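The Gram-Schmidt process can be sketched in a few lines of Python (the function name `gram_schmidt` is mine; the input vectors are assumed linearly independent):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Turn a linearly independent list into an orthogonal list with the same span."""
    basis = []
    for x in vectors:
        v = list(x)
        for b in basis:
            c = dot(x, b) / dot(b, b)              # coefficient of proj_b(x)
            v = [vi - c * bi for vi, bi in zip(v, b)]  # subtract the projection
        basis.append(v)
    return basis

v1, v2 = gram_schmidt([(3, 1), (1, 2)])
print(v1, v2)       # [3, 1] [-0.5, 1.5]
print(dot(v1, v2))  # 0.0 (orthogonal)
```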
Spectral Theorem : An nxn matrix A is orthogonally diagonalizable (A = PDP^T with P orthogonal) IFF A is symmetric
Steps to Orthogonally Diagonalize a matrix (A symmetric) :
1) find the eigenvalues of A
2) find a basis for each eigenspace; apply Gram-Schmidt within an eigenspace if its dimension is > 1, then normalize every vector
3) P = the orthonormal eigenvectors as columns (so P^-1 = P^T)
4) D = the diagonal matrix of eigenvalues, in the same order as the columns of P
Then A = PDP^T.
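A worked instance of these steps (the matrix [[2, 1], [1, 2]] is my choice; its eigenvalues 1 and 3 and unit eigenvectors (1, -1)/√2 and (1, 1)/√2 are computed by hand, and the code just verifies PDPᵀ rebuilds A):

```python
import math

# Orthogonally diagonalize the symmetric matrix A = [[2, 1], [1, 2]].
# Eigenvalues (from lambda^2 - 4*lambda + 3 = 0): 1 and 3.
s = 1 / math.sqrt(2)
P = [[s, s], [-s, s]]   # columns: unit eigenvectors for lambda = 1, then 3
D = [[1, 0], [0, 3]]
Pt = [[P[j][i] for j in range(2)] for i in range(2)]  # P^T = P^-1

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = matmul(matmul(P, D), Pt)  # should reconstruct [[2, 1], [1, 2]]
print([[round(x, 9) for x in row] for row in A])  # [[2.0, 1.0], [1.0, 2.0]]
```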
Singular Value Decomposition Theorem : Any mxn matrix A of rank r can be written A = UΣV^T, where U is an mxm orthogonal matrix, V is an nxn orthogonal matrix, and Σ is an mxn matrix whose first r diagonal entries are the singular values σ1 ≥ σ2 ≥ ... ≥ σr > 0 (all other entries 0).
Steps for SVD :
1) compute A^T A and find its eigenvalues and an orthonormal set of eigenvectors
2) the singular values are σi = sqrt(λi), ordered from largest to smallest
3) V = the orthonormal eigenvectors of A^T A as columns, in the same order
4) Σ = the mxn matrix with σ1, ..., σr on the diagonal
5) for i ≤ r, ui = (1/σi) A vi; extend {u1, ..., ur} to an orthonormal basis of R^m to fill out U
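Steps 1-2 can be carried out by hand for a 2x2 matrix, since the eigenvalues of AᵀA come from a quadratic. A sketch computing just the singular values (the function name is mine; the full U, Σ, V construction is omitted):

```python
import math

def singular_values_2x2(A):
    """Singular values of a 2x2 matrix: square roots of the eigenvalues of A^T A."""
    # B = A^T A is symmetric, so its eigenvalues are real and >= 0
    B = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    tr = B[0][0] + B[1][1]
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    disc = math.sqrt(tr * tr - 4 * det)
    lams = [(tr + disc) / 2, (tr - disc) / 2]  # eigenvalues, largest first
    return [math.sqrt(l) for l in lams]

# A^T A = [[25, -15], [-15, 25]] has eigenvalues 40 and 10,
# so sigma_1 = sqrt(40) and sigma_2 = sqrt(10)
print(singular_values_2x2([[4, 0], [3, -5]]))
```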


