Linear Algebra for Data Science

Overview

Linear algebra is a core requirement for computer science majors. Our aim in this course is to survey the main techniques and methods of linear algebra that are widely used in data science.

We presume the student knows the basic notions of linear algebra (linear vector spaces, independence, linear transformations, orthogonality) but will briefly recall them whenever needed. Over six sessions, we will discuss eigenvalues and eigenvectors, singular value decomposition, least-squares methods, matrix factorizations, and several problems in numerical optimization. The emphasis will be on illustrating applications of the topics under discussion, and several projects will be suggested for numerical implementation of the methods learned.

Course topics

Introductory part: a short overview of the basics of linear algebra, including

• Existence and uniqueness of solutions to systems of linear equations;
• Bases of linear vector spaces;
• Coordinates in different bases;
• Linear transformations between R^n and R^m.
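As a quick refresher on solvability, a minimal NumPy sketch (the matrix and right-hand side are illustrative) checking existence and uniqueness of a solution via the rank, then solving the system:

```python
import numpy as np

# A square system Ax = b has a unique solution iff A has full rank.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

rank = np.linalg.matrix_rank(A)   # 2, so the solution is unique
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))      # True
```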

Topic 1: Orthogonality and related notions

• Orthogonal vectors and subspaces; Pythagoras' theorem
• Four subspaces theorem
• Projections and projectors; orthogonal vs oblique projection
• Least-squares solutions to linear systems and applications in regression models
• Gram-Schmidt orthogonalization
• QR factorization via Gram-Schmidt
• Orthogonal and unitary transformations and their properties
• Householder reflections and Givens rotations; applications to QR factorization
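To preview the connection between Gram-Schmidt and the QR factorization, a minimal NumPy sketch (the function name is illustrative; classical Gram-Schmidt is not numerically robust for ill-conditioned inputs, which is one motivation for Householder reflections):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization of a full-column-rank A via classical Gram-Schmidt."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component along earlier direction
            v -= R[i, j] * Q[:, i]        # remove it
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalize the remainder
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))             # True
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns are orthonormal
```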

Topic 2: Spectral decomposition

• Eigenvalues and eigenvectors
• Diagonalization of matrices
• Jordan canonical form
• Application to differential and difference equations
• Symmetric matrices and their properties
• Spectral decomposition of symmetric matrices
• Singular value decomposition
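A minimal NumPy sketch of the spectral decomposition of a symmetric matrix (the matrix is illustrative): a real symmetric S factors as S = Q diag(w) Q^T with real eigenvalues and orthonormal eigenvectors.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices.
w, Q = np.linalg.eigh(S)
print(w)                                      # [1. 3.]
print(np.allclose(Q @ np.diag(w) @ Q.T, S))   # True
```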

Topic 3: Matrix Factorization

• Spectral decomposition (diagonalization)
• Principal component analysis (PCA)
• Singular value decomposition and applications
• QR factorization
• Cholesky decomposition
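To illustrate how SVD underlies PCA, a minimal sketch (the function name and toy data are illustrative) that projects centered data onto its leading principal components:

```python
import numpy as np

def pca_svd(X, k):
    """Project data X (rows = samples) onto its top-k principal
    components via SVD of the centered data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]   # scores and principal directions

# Points lying exactly on a line in R^2: one component captures everything.
X = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])
scores, components = pca_svd(X, 1)

recon = scores @ components + X.mean(axis=0)
print(np.allclose(recon, X))   # True: rank-1 reconstruction is exact here
```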

Topic 4: Iterative methods for solving linear systems

• Jacobi, Gauss-Seidel, and SOR (successive over-relaxation) methods
• Conjugate gradient method
• Krylov subspace methods
• GMRES method
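As a taste of the iterative viewpoint, a minimal sketch of the Jacobi method (the function name and iteration count are illustrative; convergence is guaranteed, e.g., for strictly diagonally dominant matrices):

```python
import numpy as np

def jacobi(A, b, iters=50):
    """Jacobi iteration for Ax = b: split A = D + R and iterate
    x_{k+1} = D^{-1} (b - R x_k)."""
    D = np.diag(A)
    R = A - np.diag(D)
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - R @ x) / D
    return x

A = np.array([[4.0, 1.0],
              [2.0, 5.0]])   # strictly diagonally dominant
b = np.array([9.0, 19.0])
x = jacobi(A, b)
print(np.allclose(x, np.linalg.solve(A, b)))   # True
```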

Topic 5: Numerical Optimization

• First- and second-order optimality tests
• Gradient descent and Newton’s methods
• Convex analysis
• Saddle point approach
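To contrast the two basic methods on a simple quadratic, a minimal sketch (the matrix, step size, and iteration count are illustrative): for f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite, the gradient is Ax − b, so gradient descent crawls toward the solution of Ax = b while a single Newton step lands on it exactly.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

# Gradient descent with a fixed step (must be < 2 / lambda_max(A)).
x = np.zeros(2)
step = 0.1
for _ in range(500):
    x = x - step * (A @ x - b)

# One Newton step from the origin: x1 = x0 - A^{-1} (A x0 - b).
x0 = np.zeros(2)
x_newton = x0 - np.linalg.solve(A, A @ x0 - b)

x_star = np.linalg.solve(A, b)
print(np.allclose(x, x_star, atol=1e-6))   # True: slow but convergent
print(np.allclose(x_newton, x_star))       # True: exact for a quadratic
```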

Topic 6: Least Squares Methods

• Linear least squares
• Non-linear least squares
• Constrained least squares
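Finally, a minimal sketch of the linear least-squares fit that ties this topic back to the regression application in Topic 1 (the data are illustrative and chosen to lie exactly on a line):

```python
import numpy as np

# Fit y ≈ c0 + c1 * t by minimizing ||X c - y||_2.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])          # exactly on the line y = 1 + 2t
X = np.column_stack([np.ones_like(t), t])   # design matrix [1, t]

c, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(c)   # [1. 2.]
```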