Linear Algebra for Data Science

Overview

Linear algebra is a core course for computer science majors. Our aim in this course is to survey the main techniques and methods of linear algebra that are widely used in data science.

We presume the student knows the basic notions of linear algebra (linear vector spaces, linear independence, linear transformations, orthogonality) but will briefly recall them whenever needed. Over six sessions, we will discuss eigenvalues and eigenvectors, the singular value decomposition, least-squares methods, matrix factorizations, and several problems in numerical optimization. Emphasis will be placed on applications of the topics under discussion, and several projects will be suggested for numerical implementation of the methods learned.

Course topics

Introductory part: a short overview of the basics of linear algebra, including

  • linear systems (i.e., systems of linear equations)
  • matrix algebra (matrices and operations with them)
  • the rank of a matrix and the solution set of the equation Ax = b
  • linear (in)dependence of vectors, bases of linear vector spaces
  • linear transformations and their representation via matrices

Topic 1: Orthogonality 

  • Four subspaces theorem
  • Projections and projectors; orthogonal vs oblique projection
  • Least-squares solutions of linear systems and applications to regression models
  • Gram-Schmidt orthogonalization
  • QR factorization via Gram-Schmidt
  • Orthogonal and unitary transformations and their properties
  • Householder reflections and Givens rotations; applications to QR factorization
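To illustrate how Gram-Schmidt, QR, and least squares fit together, here is a minimal NumPy sketch; the data are illustrative, and classical Gram-Schmidt is used for clarity (in practice, modified Gram-Schmidt or Householder QR is numerically safer):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR factorization (columns of A assumed independent)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component along earlier direction
            v -= R[i, j] * Q[:, i]        # remove it
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalize the remainder
    return Q, R

# Least-squares line fit y ≈ c0 + c1*x via QR: solve the triangular system R c = Qᵀ y
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(x), x])  # design matrix of the regression
Q, R = gram_schmidt_qr(A)
c = np.linalg.solve(R, Q.T @ y)            # fitted intercept and slope
```

This is exactly the least-squares-via-QR route covered in Topic 1: orthogonalize the design matrix, then back-substitute.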

Topic 2: Eigenvalues and eigenvectors

  • Eigenvalues and eigenvectors and their characterization
  • Diagonalisation of a square matrix under a change of basis
  • Calculating A^n for large n and solving the related difference equations
  • Jordan canonical form
  • Application to differential and difference equations
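The "A^n via diagonalisation" item above can be previewed in a few lines of NumPy; the Fibonacci recurrence is an illustrative choice of difference equation:

```python
import numpy as np

# Fibonacci step matrix: [F_{n+1}, F_n]ᵀ = A [F_n, F_{n-1}]ᵀ
A = np.array([[1, 1], [1, 0]], dtype=float)

# Diagonalise: A = V diag(lam) V⁻¹, hence A^n = V diag(lam**n) V⁻¹
lam, V = np.linalg.eig(A)

def mat_power(n):
    """Compute A^n through the eigendecomposition instead of n multiplications."""
    return V @ np.diag(lam**n) @ np.linalg.inv(V)

# A^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]], so entry (0, 1) of A^10 is F_10 = 55
F10 = mat_power(10)[0, 1]
```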

Topic 3: Symmetric matrices and quadratic forms

  • Symmetric matrices and their properties
  • Spectral decomposition of symmetric matrices
  • Quadratic forms
  • Principal component analysis
  • Orthogonal matrices
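As a preview of how the spectral decomposition of a symmetric matrix underlies PCA, here is a minimal sketch on synthetic data (the data-generating matrix is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, stretched strongly along the first coordinate
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
Xc = X - X.mean(axis=0)                    # center the data

# Spectral decomposition of the (symmetric) sample covariance matrix
C = Xc.T @ Xc / (len(Xc) - 1)
evals, evecs = np.linalg.eigh(C)           # eigh: for symmetric matrices, ascending order
order = np.argsort(evals)[::-1]            # reorder to descending variance
evals, evecs = evals[order], evecs[:, order]

# Project onto the leading principal component
scores = Xc @ evecs[:, :1]
```

The variance of the projected scores equals the leading eigenvalue — exactly the "maximal-variance direction" characterization of PCA.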

Topic 4: Matrix Factorisation

  • Singular value decomposition (SVD) and its applications
  • LU factorization and its symmetric variant, the Cholesky decomposition A = LL^⊤
  • QR factorization revisited
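A standard application of the SVD is optimal low-rank approximation (the Eckart–Young theorem); a minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
# A rank-2 matrix plus small noise (illustrative data)
B = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 15))
A = B + 1e-3 * rng.normal(size=B.shape)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-2 approximation: keep only the two largest singular values
A2 = (U[:, :2] * s[:2]) @ Vt[:2, :]
err = np.linalg.norm(A - A2)               # small, since A is nearly rank 2
```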

Topic 5: Iterative methods

  • Iterative methods for solving linear systems
  • Jacobi, Gauss-Seidel, and SOR (successive over-relaxation) methods
  • Conjugate gradient method
  • Krylov subspace methods
  • GMRES method
  • Iterative methods for eigenvalue calculations
  • Power method
  • QR method
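Two of the simplest iterations from this topic — Jacobi for linear systems and the power method for the dominant eigenvalue — can be sketched as follows (the matrix is an illustrative diagonally dominant example):

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Jacobi iteration: x ← D⁻¹(b − R x), with A = D + R split into diagonal and rest."""
    D = np.diag(A)                  # diagonal entries as a vector
    R = A - np.diag(D)              # off-diagonal part
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - R @ x) / D
    return x

def power_method(A, iters=500):
    """Power iteration: repeated multiplication converges to the dominant eigenvector."""
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)      # renormalize to avoid overflow
    lam = x @ A @ x                 # Rayleigh quotient estimate of the eigenvalue
    return lam, x

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # diagonally dominant, so Jacobi converges
b = np.array([1.0, 2.0])
x = jacobi(A, b)
lam, v = power_method(A)                 # eigenvalues of A are 5 and 2
```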

Topic 6: Numerical Optimisation

  • First- and second-order conditions for a minimum of a function
  • Specialization of these conditions to convex functions
  • Lagrangians for constrained optimization and the Kuhn-Tucker saddle-point conditions
  • Gradient descent, conjugate gradient, and Newton methods
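Gradient descent on a convex quadratic ties this topic back to linear algebra: minimizing f(x) = ½xᵀAx − bᵀx is equivalent to solving Ax = b. A minimal sketch with an illustrative positive definite A and a fixed step size:

```python
import numpy as np

# Minimize f(x) = 1/2 xᵀAx − bᵀx; the gradient is Ax − b,
# so the minimizer is the solution of Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.1                               # fixed step; must be < 2 / lambda_max(A)
for _ in range(2000):
    x -= step * (A @ x - b)              # move against the gradient
```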

Topic 7: Least Square Methods

  • Constrained least squares
  • Nonlinear least squares
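Nonlinear least squares is typically attacked by linearizing the residual at each step; a minimal Gauss–Newton sketch for fitting y ≈ a·exp(b·t) to illustrative noiseless data (model and starting point are assumptions for the example):

```python
import numpy as np

# Fit y ≈ a * exp(b * t) by Gauss-Newton: residual r(a, b) = a*exp(b*t) − y;
# each step solves the linearized least-squares problem J·delta = −r.
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)               # synthetic data with true (a, b) = (2, −1.5)

p = np.array([1.0, 0.0])                 # initial guess for (a, b)
for _ in range(50):
    a, b = p
    e = np.exp(b * t)
    r = a * e - y
    J = np.column_stack([e, a * t * e])  # Jacobian of the residuals w.r.t. (a, b)
    delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
    p = p + delta                        # Gauss-Newton update
```

Each inner step is an ordinary linear least-squares problem, so the machinery of Topic 1 (QR, normal equations) applies directly.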

Prerequisites

Basic linear algebra: linear vector spaces, linear independence, linear transformations, and orthogonality.
