Linear algebra is one of the pillars of modern mathematics, both pure and applied, as well as of scientific computing. It is as important as calculus, maybe more important. You will encounter it and learn more about it in virtually every advanced math course you ever take. This course is just a beginning.
Linear algebra is about what you can do with linear combinations, which are simply sums with coefficients. You are already familiar with examples, such as polynomials: $a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n$. You may have encountered vectors in space such as $a\mathbf{i} + b\mathbf{j} + c\mathbf{k}$.
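To make "sums with coefficients" concrete, here is a small Mathematica sketch; the particular vectors, polynomials, and coefficients are made-up examples, nothing special about them.

```wolfram
(* Made-up examples of linear combinations: vectors and polynomials. *)
v = {1, 0, 2}; w = {0, 1, -1};
3 v - 2 w          (* a linear combination of two vectors: {3, -2, 8} *)

p = 1 + x^2; q = x - 4;
Expand[5 p + 2 q]  (* a linear combination of two polynomials: -3 + 2 x + 5 x^2 *)
```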
In linear algebra, as in every subject, there are both objects and interactions between the objects that require attention. The objects are vector spaces (which are collections of vectors) and the interactions are provided by linear transformations (which are functions that take vectors as inputs and produce vectors as outputs; you can think of matrices).
In linear algebra both concepts and computations are important. The concepts are not difficult, but there are a lot of them. There is a good deal of essential vocabulary. Here is a short list of some of the most important terms: linear combination, linearly independent/dependent, span, basis, dimension, columnspace, rank, nullspace, coordinates with respect to a basis, linear transformation, matrix of a linear transformation, determinant, eigenvector, eigenvalue, inner product, orthonormal basis, projection, orthogonal transformation. For computations, we get some assistance from computers, especially Mathematica.
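As a taste of that assistance, here is a short Mathematica sketch that computes a few of the quantities named above; the matrix A and the two spanning vectors are arbitrary made-up examples.

```wolfram
(* A made-up 3x3 matrix; its second row is twice its first, so its rank is less than 3. *)
A = {{1, 2, 3}, {2, 4, 6}, {1, 1, 1}};
MatrixRank[A]                           (* rank of A *)
NullSpace[A]                            (* a basis for the nullspace of A *)
Eigenvalues[A]                          (* eigenvalues of A *)
Orthogonalize[{{1, 1, 0}, {1, 0, 1}}]   (* an orthonormal basis for the span of two vectors *)
```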
Summary of the subject
- Definition: Linear algebra: the study of linear combinations.
- Definition: Linear combination: sum with coefficients.
- If you can make sums with coefficients (usually real or complex coefficients), then you can do linear algebra. Examples: vectors, matrices, polynomials, functions. If you are into dot products, lengths, and angles, then strictly speaking you are past linear algebra and into geometry, but this geometry comes up in linear algebra courses.
- Definition: Vector space: A collection (set) of objects where you can make linear combinations.
- Definition: Linear map T (or transformation or operator, all synonymous): A function from an input vector space to an output vector space with the property that T applied to a linear combination of input vectors equals the same linear combination of the corresponding outputs: $T(a\mathbf{v} + b\mathbf{w}) = a\,T(\mathbf{v}) + b\,T(\mathbf{w})$.
- Key definitions: (1) span and linear independence, (2) basis, dimension, and rank, (3) eigenvector and eigenvalue.
- You can turn linear transformations into matrices. The matrix is nothing but a short table of values of the function. Each column is an output value of the transformation, written in code (that is, in coordinates), and you have to be told somewhere earlier what the code is and which input vectors go with those outputs. The code is given by bases for the input and output spaces; see the sketch after this list.
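Here is the promised sketch, in Mathematica. The transformation T (a rotation of the plane by 90 degrees) and the choice of the standard basis are made-up examples; the point is only that the columns of the matrix are the outputs of T on the basis vectors, written in coordinates.

```wolfram
(* Build the matrix of a linear map T on R^2 with respect to the standard basis. *)
T[{a_, b_}] := {-b, a};           (* rotation of the plane by 90 degrees *)
e1 = {1, 0}; e2 = {0, 1};
M = Transpose[{T[e1], T[e2]}];    (* each column is T applied to a basis vector *)
M // MatrixForm
M . {3, 5} == T[{3, 5}]           (* True: multiplying by M reproduces T *)
```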
Course components
- Vector spaces, bases, dimension
- Linear transformations, matrix of a linear transformation
- Good bases in the presence of an inner product: orthonormal bases
- Good bases in the presence of a linear transformation: bases of eigenvectors
- Symmetric matrices; the singular value decomposition (sketched below)
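To give a first taste of the last two components, here is a small Mathematica sketch; the symmetric matrix S and the rectangular matrix B are made-up examples.

```wolfram
(* Eigenvectors of a made-up symmetric matrix give an orthonormal basis. *)
S = {{2, 1}, {1, 2}};
{vals, vecs} = Eigensystem[S];             (* eigenvalues and eigenvectors of S *)
Orthogonalize[vecs]                        (* an orthonormal basis of eigenvectors *)

(* Singular value decomposition of a made-up rectangular matrix. *)
B = N[{{3, 1, 1}, {-1, 3, 1}}];
{u, s, v} = SingularValueDecomposition[B]; (* numerically, B equals u . s . Transpose[v] *)
```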