Welcome to MIT 18.06: Linear Algebra! The Spring 2025 course information, materials, and links are recorded below. Course materials for previous semesters are archived in the other branches of this repository. You can dive in right away by reading this introduction to the course by Professor Strang.
Catalog Description: Basic subject on matrix theory and linear algebra, emphasizing topics useful in other disciplines, including systems of equations, vector spaces, determinants, eigenvalues, singular value decomposition, and positive definite matrices. Applications to least-squares approximations, stability of differential equations, networks, Fourier transforms, and Markov processes. Uses linear algebra software. Compared with 18.700, more emphasis on matrix algorithms and many applications.
Instructor: Prof. Nike Sun
Textbook: Introduction to Linear Algebra: 6th Edition.
- Vectors in $\mathbb{R}^2$ and $\mathbb{R}^3$, and generalization to vectors in $\mathbb{R}^N$ (N-dimensional space).
- Vector operations: addition and scalar multiplication. Both operations together: linear combinations.
- The span of a set of vectors $v_1, \ldots, v_k$ is the set of all linear combinations of these vectors: we covered some examples in class.
- Definition of matrix times vector: $Ax = x_1 a_1 + \cdots + x_n a_n$, the linear combination of the columns $a_1, \ldots, a_n$ of $A$ with coefficients $x_1, \ldots, x_n$, where $A$ is an $m \times n$ matrix, and $x$ is in $\mathbb{R}^n$.
Reading: Strang Chapter 1.
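As a quick numerical companion (a NumPy sketch, not part of the official course materials), one can check that a matrix times a vector is the same linear combination of the matrix's columns; the matrix and vector below are made-up examples:

```python
import numpy as np

# Made-up example: A is 3x2, x is a vector in R^2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# Matrix times vector...
Ax = A @ x

# ...equals the linear combination of the columns of A with coefficients from x.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]

print(Ax)                      # [0. 2. 4.]
print(np.allclose(Ax, combo))  # True
```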
- Dot product, vector length, cosine formula.
- The Gaussian elimination algorithm for solving a system of linear equations $Ax = b$: reduce the matrix to row echelon form (REF), and do back-substitution. We worked through an example in class.
- Definition of matrix times matrix: $(AB)_{ij} = \sum_k A_{ik} B_{kj}$, where $A$ is $m \times n$, $B$ is $n \times p$, and $AB$ is $m \times p$. We explained how to view the Gaussian elimination operations as matrix multiplication operations: the steps of Gaussian elimination correspond to changing $Ax = b$ to $E_1 A x = E_1 b$, $E_2 E_1 A x = E_2 E_1 b$, etc.
Reading: Strang 2.1-2.2.
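The "elimination steps are matrix multiplications" point can be illustrated in a few lines of NumPy (a sketch with a made-up 2×2 system, not from lecture): multiplying on the left by an elimination matrix performs a row operation on both the matrix and the right-hand side.

```python
import numpy as np

# Made-up system Ax = b.
A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
b = np.array([1.0, 6.0])

# E1 subtracts 2 * (row 1) from row 2, clearing the entry below the first pivot.
E1 = np.array([[1.0, 0.0],
               [-2.0, 1.0]])

U = E1 @ A   # row echelon form: [[2, 1], [0, 3]]
c = E1 @ b   # the same row operation applied to b: [1, 4]

# Back-substitution on the triangular system U x = c.
x2 = c[1] / U[1, 1]
x1 = (c[0] - U[0, 1] * x2) / U[0, 0]
print(np.allclose(A @ np.array([x1, x2]), b))  # True
```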
- Reviewed the Gaussian elimination example using matrix multiplications to encode the operations.
- Gauss-Jordan elimination has a few additional steps which bring the system to reduced row echelon form (RREF); we did this in the same example, again using matrix multiplications.
- In the example, the final RREF system was $Rx = c$. Moreover we found $R = I$, the identity matrix. In this example it allowed us to read off the solution $x = c$.
- We reviewed basic rules of matrix multiplication: associative $(AB)C = A(BC)$, distributive $A(B + C) = AB + AC$, but not commutative: $AB$ and $BA$ are generally not equal!
- Inversion: if $A$ is an $n \times n$ matrix, it is invertible if there exists a matrix $A^{-1}$, also $n \times n$, such that $A A^{-1} = A^{-1} A = I$, the identity matrix.
- If $A$ is invertible, then Gauss-Jordan elimination converts the system $Ax = b$ to $x = A^{-1} b$. Moreover it converts the augmented matrix $[\,A \mid I\,]$ to $[\,I \mid A^{-1}\,]$.
- Square matrices need not be invertible, and we covered some examples.
Reading: Strang 2.3.
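Gauss-Jordan on the augmented matrix $[\,A \mid I\,]$ can be reproduced with SymPy's exact `rref()`: the left block comes out as $I$ and the right block as $A^{-1}$. A sketch with a made-up invertible matrix (assuming SymPy is installed; not part of the course materials):

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [4, 5]])

# Form [A | I] and bring it to reduced row echelon form.
R, pivots = A.row_join(sp.eye(2)).rref()

A_inv = R[:, 2:]          # right block of the RREF
print(A_inv)              # Matrix([[5/6, -1/6], [-2/3, 1/3]])
print(A_inv == A.inv())   # True
print(A * A_inv)          # the 2x2 identity matrix
```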
- The columns of $A$ are linearly dependent if a non-trivial linear combination of the columns is zero: can write this as $Ax = 0$ for $x$ nonzero. If $A$ is a square matrix with linearly dependent columns, then $A$ cannot be invertible. We covered some examples.
- We defined the column space $C(A)$ of a matrix. An $m \times n$ matrix $A$ can be viewed as a function/map from $\mathbb{R}^n$ to $\mathbb{R}^m$, sending input $x$ in $\mathbb{R}^n$ to output $Ax$ in $\mathbb{R}^m$. The column space is exactly the image of this map. The equation $Ax = b$ is solvable if and only if $b$ lies in $C(A)$.
- We defined general vector spaces and subspaces, and covered some examples.
Reading: Strang 3.1.
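One way to test numerically whether $b$ lies in $C(A)$ is to compare the rank of $A$ with the rank of the augmented matrix $[\,A \mid b\,]$: appending $b$ leaves the rank unchanged exactly when $b$ is a combination of the columns. A NumPy sketch on made-up data (not from lecture):

```python
import numpy as np

# Made-up example: the second column is 2x the first, so C(A) is a line in R^3.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

def in_column_space(A, b):
    """b is in C(A) iff appending b as an extra column does not raise the rank."""
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_column_space(A, np.array([2.0, 4.0, 6.0])))  # True: b is 2 * (first column)
print(in_column_space(A, np.array([1.0, 0.0, 0.0])))  # False: not a multiple of (1, 2, 3)
```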
- Defined the null space $N(A)$ as the set of vectors $x$ such that $Ax = 0$.
- Note that if $A$ is $m \times n$, then $C(A)$ is a subspace of $\mathbb{R}^m$, while $N(A)$ is a subspace of $\mathbb{R}^n$.
- Invertible row operations (such as in Gauss-Jordan elimination) do not affect the null space, so if $R$ is the RREF of $A$, then $N(R) = N(A)$.
- We covered several examples of calculating $N(A)$. We also noted that in all our examples, dim C(A) + dim N(A) = n.
Reading: Strang 3.2.
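SymPy can compute a basis of $N(A)$ exactly, which makes the observation dim C(A) + dim N(A) = n easy to check on small examples. A sketch with a made-up matrix (not from lecture):

```python
import sympy as sp

# Made-up 3x4 example with rank 2, so N(A) should have dimension 4 - 2 = 2.
A = sp.Matrix([[1, 2, 0, 1],
               [2, 4, 1, 3],
               [3, 6, 1, 4]])

null_basis = A.nullspace()                   # list of basis vectors of N(A)
print(A.rank(), len(null_basis))             # 2 2
print(A.rank() + len(null_basis) == A.cols)  # True: dim C(A) + dim N(A) = n

# Every basis vector of N(A) is sent to zero by A.
print(all(A * v == sp.zeros(3, 1) for v in null_basis))  # True
```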
- In this class we covered the general solution for a system of linear equations $Ax = b$.
- The basic principle: if $b$ is not in $C(A)$, then there is no solution. If $b$ is in $C(A)$, then there must exist at least one "particular solution," call it $x_p$. Then the set of all solutions to $Ax = b$ is the set of vectors $x_p + x_n$, where $x_p$ is the particular solution, and $x_n$ is any vector from the null space $N(A)$.
- General recipe for solving:
  - Given $Ax = b$, apply Gauss-Jordan elimination to transform to the RREF system $Rx = c$.
  - If the RREF system contains a row that says 0 = nonzero, then we have a contradiction, and in this case $b$ is not in $C(A)$ and there is no solution.
  - Otherwise, set the free variables to zero to find a particular solution $x_p$.
  - Separately, solve for the null space $N(A) = N(R)$.
  - Then the set of all solutions to $Ax = b$ is the set of vectors $x_p + x_n$, where $x_p$ is the particular solution, and $x_n$ is any vector from $N(A)$.
Reading: Strang 3.3.
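The recipe above can be mirrored in SymPy: `rref()` on the augmented matrix gives a particular solution with the free variables set to zero, and `nullspace()` gives the homogeneous part. A sketch on a made-up consistent system (not from lecture):

```python
import sympy as sp

# Made-up consistent system with one free variable (the second column is free).
A = sp.Matrix([[1, 2, 1],
               [2, 4, 3]])
b = sp.Matrix([3, 7])

# Gauss-Jordan on the augmented matrix [A | b].
R, pivots = A.row_join(b).rref()
print(R)       # Matrix([[1, 2, 0, 2], [0, 0, 1, 1]])
print(pivots)  # (0, 2): the augmented column is not a pivot, so no "0 = nonzero" row

# Particular solution: free variable set to zero, pivot variables read off the last column.
x_p = sp.zeros(3, 1)
for row, col in enumerate(pivots):
    x_p[col] = R[row, R.cols - 1]

# Homogeneous part: a basis vector of N(A); here N(A) is one-dimensional.
x_n = A.nullspace()[0]

print(x_p.T, x_n.T)              # Matrix([[2, 0, 1]]) Matrix([[-2, 1, 0]])
print(A * (x_p + 5 * x_n) == b)  # True: x_p plus anything in N(A) still solves Ax = b
```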
- Throughout this class, we let $v_1, \ldots, v_n$ be a list of n vectors, each in the space $\mathbb{R}^m$. Let $A$ be the $m \times n$ matrix with columns $v_1, \ldots, v_n$.
- The vectors $v_1, \ldots, v_n$ are linearly dependent if a non-trivial linear combination of them equals zero: this corresponds to $N(A)$ being strictly larger than $\{0\}$. Otherwise, we say they are linearly independent: this corresponds to $N(A) = \{0\}$.
- A basis for a vector space $V$ is a list of vectors that span $V$, and are linearly independent. We covered the standard basis $e_1, \ldots, e_m$ for the space $\mathbb{R}^m$.
- Let $V = \operatorname{span}(v_1, \ldots, v_n)$. Then $V$ is the same as $C(A)$. If $v_1, \ldots, v_n$ are linearly independent, then they form a basis for $V$.
- More generally, perform Gauss-Jordan elimination, and let $R$ be the RREF of $A$. Then $N(R) = N(A)$.
- The pivot columns of $R$ form a basis for $C(R)$, and the corresponding columns of $A$ form a basis for $C(A)$.
- Note that rank(A) = # pivots in R = dim C(R) = dim C(A). Meanwhile # free variables in R = dim N(A).
- There are n columns total, and each column must be pivot or free, so n = # pivot + # free = dim C(A) + dim N(A): this is the rank-nullity theorem.
- Lastly, we reviewed that if $A$ is an $m \times n$ matrix, then we view it as a map from $\mathbb{R}^n$ to $\mathbb{R}^m$, sending $x$ in $\mathbb{R}^n$ to $Ax$ in $\mathbb{R}^m$.
Reading: Strang 3.4.
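The pivot-column statement is easy to check with SymPy, since `rref()` also returns the pivot column indices. A sketch with a made-up matrix (not from lecture):

```python
import sympy as sp

# Made-up 3x4 example.
A = sp.Matrix([[1, 3, 3, 2],
               [2, 6, 9, 7],
               [-1, -3, 3, 4]])

R, pivots = A.rref()
print(pivots)   # (0, 2): the first and third columns are the pivot columns

# The corresponding columns of A give a basis for C(A); their count is rank(A).
basis_CA = [A[:, j] for j in pivots]
print(len(basis_CA) == A.rank())    # True

# Rank-nullity: # pivot columns + # free columns = n = number of columns.
free = A.cols - len(pivots)
print(len(pivots) + free == A.cols)  # True
print(len(A.nullspace()) == free)    # True: dim N(A) = # free columns
```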
Note: You should be able to do all of exam 1 using only the information covered in Lectures 1–6, together with the homework, recitations, and reading assigned before exam 1. However, since all the topics of the class are closely connected, material from Lecture 7 might also be helpful to know for the exam, and you are free to use that material on the exam as well. All exams in this class are closed book, closed notes.
- We started the lecture with the definition of the matrix transpose $A^T$: $(A^T)_{ij} = A_{ji}$.
- Note in general $(A + B)^T = A^T + B^T$, and $(AB)^T = B^T A^T$.
- If $A^T = A$, then we say that $A$ is symmetric. Only square matrices can be symmetric.
- We covered the four fundamental subspaces of a matrix, and how to calculate them. Throughout, let A be an $m \times n$ matrix, and let $R = EA$ be the RREF. Thus $E$ is an invertible $m \times m$ matrix that encodes the Gauss-Jordan row operations.
  - Column space $C(A) = E^{-1} C(R)$. This is a subspace of $\mathbb{R}^m$.
  - Null space $N(A) = N(R)$. This is a subspace of $\mathbb{R}^n$.
  - Row space $C(A^T) = C(R^T)$. This is a subspace of $\mathbb{R}^n$.
  - Left null space $N(A^T) = E^T N(R^T)$. This is a subspace of $\mathbb{R}^m$.
Formal reasoning for the above claims:
- Column space: $A = E^{-1} R$ and $R = EA$. Thus $Ax = E^{-1}(Rx)$ for every $x$. This proves $C(A) = E^{-1} C(R)$.
- Null space: $Ax = E^{-1}(Rx)$ and $Rx = E(Ax)$. Since $E$ is invertible, $Ax = 0$ if and only if $Rx = 0$. This proves $N(A) = N(R)$.
- Row space: recall $R = EA$. $A^T = R^T (E^T)^{-1}$ and $R^T = A^T E^T$. Since $E$ is invertible, $E^T$ is also invertible. As $y$ ranges over all of $\mathbb{R}^m$, $E^T y$ also ranges over all of $\mathbb{R}^m$. Therefore $C(A^T) = C(R^T)$.
- Left null space: (There was a typo on the blackboard, so please read this one carefully.) $A^T y = R^T (E^T)^{-1} y$ and $R^T y = A^T E^T y$. Therefore $y$ is in $N(R^T)$ if and only if $E^T y$ is in $N(A^T)$. This proves $N(A^T) = E^T N(R^T)$.
In class, we calculated the four fundamental subspaces on a small example. We verified that the column space and left null space are orthogonal subspaces of $\mathbb{R}^m$.
Reading: Strang 3.5.
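As a numerical companion (a sketch on a made-up matrix, not the example from the blackboard), SymPy can produce bases for all four subspaces, and one can confirm both the dimensions and the orthogonality of $C(A)$ and $N(A^T)$ inside $\mathbb{R}^m$:

```python
import sympy as sp

# Made-up 3x4 matrix of rank 2.
A = sp.Matrix([[1, 3, 3, 2],
               [2, 6, 9, 7],
               [-1, -3, 3, 4]])

col_space  = A.columnspace()    # basis of C(A),   a subspace of R^3
null_space = A.nullspace()      # basis of N(A),   a subspace of R^4
row_space  = A.T.columnspace()  # basis of C(A^T), a subspace of R^4
left_null  = A.T.nullspace()    # basis of N(A^T), a subspace of R^3

# Dimensions: r, r, n - r, m - r with r = 2, m = 3, n = 4.
print([len(s) for s in (col_space, row_space, null_space, left_null)])  # [2, 2, 2, 1]

# C(A) and N(A^T) are orthogonal subspaces of R^3.
print(all(c.dot(y) == 0 for c in col_space for y in left_null))  # True
```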
- In this class we reviewed the four fundamental subspaces of an $m \times n$ matrix $A$.
- We went over an example of how to calculate the four subspaces of $A$, using the RREF $R = EA$.
- Dimensions: both column space and row space have dimension $r = \operatorname{rank}(A)$. The null space has dimension $n - r$, while the left null space has dimension $m - r$.
- We covered the fact that in $\mathbb{R}^n$, the row space and null space are orthogonal complements of one another. In $\mathbb{R}^m$, the column space and left null space are orthogonal complements of one another.
Reading: Strang 4.1.
- We covered what it means for two subspaces of $\mathbb{R}^n$ to be:
  - complementary
  - orthogonal
  - orthogonal complements.
- In particular:
  - If $V$ and $W$ are complementary subspaces of $\mathbb{R}^n$, then any $x$ in $\mathbb{R}^n$ can be uniquely written as $x = v + w$ with $v$ from $V$, $w$ from $W$.
  - If $V$ and $W$ are in addition orthogonal complements, then $v$ is the orthogonal projection of $x$ onto $V$, while $w$ is the orthogonal projection of $x$ onto $W$. Denoted $v = \operatorname{proj}_V(x)$ and $w = \operatorname{proj}_W(x)$.
- We discussed the geometric interpretation of orthogonal projection:
  - $\operatorname{proj}_V(x)$ is the unique vector in $V$ that lies closest to $x$.
  - Equivalently, $\operatorname{proj}_V(x)$ is the unique vector $v$ in $V$ such that $x - v$ is perpendicular to $V$.
- We used the latter characterization to calculate $\operatorname{proj}_V(x)$ where $V$ is the span of a single nonzero vector $a$ in $\mathbb{R}^n$.
Reading: Strang 4.2.
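For the last bullet, the perpendicularity characterization leads to the familiar one-dimensional formula $\operatorname{proj}_V(x) = \frac{a^T x}{a^T a}\, a$ for $V = \operatorname{span}(a)$. A small NumPy sketch with made-up vectors (not from lecture):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # nonzero vector spanning the line V
x = np.array([3.0, 1.0, 5.0])   # vector to be projected

# proj_V(x) = (a.x / a.a) * a
proj = (a @ x) / (a @ a) * a
print(proj)   # [1.6666... 3.3333... 3.3333...]

# The residual x - proj is perpendicular to a (hence to all of V).
print(np.isclose(a @ (x - proj), 0.0))  # True
```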
- We covered the general formulas for orthogonal projection.
  - Projection onto a one-dimensional subspace $V = \operatorname{span}(u)$, where $u$ is a unit vector in $\mathbb{R}^m$: $\operatorname{proj}_V(b) = Pb$ where $P = uu^T$. Note that $P$ is an $m \times m$ symmetric matrix. Its column space is exactly the one-dimensional space $V$, therefore $P$ has rank one.
  - Projection onto a general subspace $V$ of $\mathbb{R}^m$, where $\dim V = n$: first express $V = C(A)$ where $A$ is an $m \times n$ matrix whose columns form a basis of $V$. We showed in class that $\operatorname{proj}_V(b) = Pb$ where $P = A(A^T A)^{-1} A^T$. This is an $m \times m$ symmetric matrix. Its column space is exactly $V$, therefore $P$ has rank $n$.
- Claim: If $A$ is $m \times n$ with rank $n$, then $A^T A$ is invertible. We stated this fact in class, and used it to define $P = A(A^T A)^{-1} A^T$. We did not yet give a justification of this fact, and will do so in a future lecture.
- Note that $\operatorname{proj}_V(b) = Pb = A\hat{x}$ where $\hat{x} = (A^T A)^{-1} A^T b$. This achieves the minimum distance $\|b - A\hat{x}\| = \min_x \|b - Ax\|$, and we call $\hat{x}$ the least squares solution.
- Lastly we went over some examples of the projection matrix formula:
  - In the one-dimensional case $V = \operatorname{span}(u)$ where $u$ is a unit vector, we take $A = u$ and recover the formula $P = uu^T$.
  - If we have an orthonormal basis $u_1, \ldots, u_n$ for $V$, then $P = u_1 u_1^T + \cdots + u_n u_n^T$ where $P$ is the orthogonal projection onto $V$.
Reading: Strang 4.3.
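The general projection matrix can be sanity-checked numerically: $P = A(A^T A)^{-1} A^T$ should be symmetric, satisfy $P^2 = P$, have rank equal to $\dim V$, and leave vectors already in $V = C(A)$ unchanged. A NumPy sketch with a made-up basis matrix (not from lecture):

```python
import numpy as np

# Columns of A form a basis of a 2-dimensional subspace V of R^3 (made-up example).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

# Projection onto V = C(A).
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P, P.T))        # True: P is symmetric
print(np.allclose(P @ P, P))      # True: projecting twice changes nothing
print(np.linalg.matrix_rank(P))   # 2 = dim V
print(np.allclose(P @ A, A))      # True: vectors already in V are fixed by P
```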
- As we learned previously, the equation $Ax = b$ does not have a solution if $b$ does not lie in the column space $C(A)$. In this case, one can instead ask for the least squares (LS) solution: the choice of $x$ that minimizes $\|b - Ax\|$.
- This means $A\hat{x}$ should be precisely the projection of $b$ onto $C(A)$, so from what we previously learned, we see that $A^T A \hat{x} = A^T b$, and consequently $\hat{x} = (A^T A)^{-1} A^T b$.
- Application: given a data set $(x_i, y_i)$ for $i = 1, \ldots, m$, we covered how to find:
  - The straight line $y = cx$ with no intercept that achieves the least squares fit, where $c$ is the slope;
  - The straight line $y = c_0 + c_1 x$ with intercept that achieves the least squares fit, where $c_0$ is the intercept and $c_1$ is the slope;
  - The cubic function $y = c_0 + c_1 x + c_2 x^2 + c_3 x^3$ that achieves the least squares fit.
Reading: Strang 4.3.
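For the line with intercept, the fit amounts to solving the normal equations $A^T A \hat{c} = A^T y$, where the rows of $A$ are $(1, x_i)$; NumPy's built-in `lstsq` should agree. A sketch on made-up data points (not the data set from class):

```python
import numpy as np

# Made-up data points (x_i, y_i).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.1, 2.9, 4.2])

# Columns of A are [1, x_i], so A @ c evaluates the line c0 + c1 * x at each data point.
A = np.column_stack([np.ones_like(x), x])

# Solve the normal equations A^T A c = A^T y.
c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)   # [intercept, slope], roughly [0.99, 1.04]

# Cross-check against NumPy's least squares routine.
c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(c, c_lstsq))   # True
```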
Reading for upcoming lectures: we will continue through Strang Chapter 4.