Systems of Linear Equations: Matrices as Equation Solvers A system of linear equations looks messy on paper, but pack it into a matrix and it becomes a single object you can manipulate, invert, and solve. The geometry then shows at a glance whether a solution exists and whether it is unique.
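To make "pack it into a matrix and solve" concrete, here is a minimal NumPy sketch (the specific system is an illustrative example, not from the original):

```python
import numpy as np

# The system  2x + y = 5,  x - y = 1  packed into matrix form Ax = b.
A = np.array([[2.0,  1.0],
              [1.0, -1.0]])
b = np.array([5.0, 1.0])

# One call treats the whole system as a single object and solves it.
x = np.linalg.solve(A, b)
print(x)  # → [2. 1.]
```

The same `A` also answers the geometric questions: a nonzero determinant means the two lines cross at exactly one point.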
Eigenvalues and Eigenvectors: The Directions That Do Not Rotate When you apply a matrix transformation, most vectors spin off in new directions. Eigenvectors are the rare exceptions: they stay on their own line and simply stretch or shrink — the skeleton hiding inside every matrix.
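A quick NumPy sketch of the "stays on its own line" claim, using an illustrative symmetric matrix of my choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are 3 and 1; eigenvectors lie along [1, 1] and [1, -1].
vals, vecs = np.linalg.eig(A)

v = vecs[:, 0]                       # one eigenvector
# Applying A does not rotate v off its line; it only scales it.
assert np.allclose(A @ v, vals[0] * v)
```

Most other vectors (try `[1, 0]`) come out pointing in a genuinely new direction.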
Determinants: The Volume-Scaling Factor of Matrices The determinant is one number that tells you everything important about a matrix. How much does the transformation scale volume? Is the transformation reversible? Does it flip orientation? The determinant answers all of these. It's not a calculation you memorize. It's a property you understand.
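The three questions can be checked directly; a small NumPy sketch with illustrative matrices:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])       # scales x by 3, y by 2
print(np.linalg.det(A))          # 6.0: the unit square becomes area 6

F = np.array([[0.0, 1.0],
              [1.0, 0.0]])       # swaps the axes: a reflection
print(np.linalg.det(F))          # -1.0: area preserved, orientation flipped

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # second row is twice the first
print(np.linalg.det(S))          # ≈ 0: space collapses to a line, no inverse
```

A determinant of zero is exactly the "not reversible" case: the transformation squashes some direction flat, and no matrix can un-squash it.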
Matrix Multiplication: Why the Rows and Columns Dance The strange row-column procedure for multiplying matrices makes instant sense once you see it as transformation composition. Multiplying two matrices asks: what single transformation does A followed by B equal?
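Composition is something you can verify numerically; a NumPy sketch with an illustrative rotation and scale:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],   # rotate 90 degrees
              [np.sin(theta),  np.cos(theta)]])
S = np.array([[2.0, 0.0],                         # stretch x by 2
              [0.0, 1.0]])

v = np.array([1.0, 0.0])

# Apply R first, then S, one step at a time...
step_by_step = S @ (R @ v)
# ...or build the single composed transformation once and apply it.
composed = (S @ R) @ v

assert np.allclose(step_by_step, composed)
```

Note the order: `S @ R` means "R first, then S", which is why matrix multiplication is generally not commutative.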
Matrices: Arrays That Transform Vectors Matrices look like grids of numbers. Think of them instead as machines that transform space — rotating, scaling, shearing vectors. That shift in perspective makes everything from matrix multiplication to eigenvalues click.
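The machine view in a few lines of NumPy, using a shear as an illustrative transformation:

```python
import numpy as np

shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])   # horizontal shear: x' = x + y, y' = y

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

print(shear @ e1)  # → [1. 0.]  the x-axis is left alone
print(shear @ e2)  # → [1. 1.]  the y-axis tilts to the right
```

The columns of the matrix are just where the machine sends the basis vectors, which is the perspective shift the blurb is pointing at.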
Vectors: Quantities with Direction and Magnitude A vector carries two pieces of information: how much and which way. That combination turns out to be the right language for forces, velocities, gradients, and word embeddings — almost anywhere that pure numbers fall short.
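"How much and which way" splits cleanly in code; a small NumPy sketch with a made-up example vector:

```python
import numpy as np

wind = np.array([3.0, 4.0])        # components along each axis

magnitude = np.linalg.norm(wind)   # "how much": the vector's length
direction = wind / magnitude       # "which way": a unit vector

print(magnitude)   # → 5.0
print(direction)   # → [0.6 0.8]
```

Any vector factors into these two pieces, which is why the same arithmetic covers forces, velocities, and word embeddings alike.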
What Is Linear Algebra? The Mathematics of Many Dimensions Every time you multiply a matrix, you're applying a linear transformation — stretching, rotating, or collapsing a space. Linear algebra is the mathematical language that unifies these operations, and it's the foundation of nearly every quantitative field.