Discrete Computation Graphs 2020

Course description

The enormous success of deep learning is partially due to the simplicity of the backpropagation algorithm, which allows one to efficiently compute the gradient of any loss function defined as a composition of differentiable functions. However, in a variety of problems originating in supervised, unsupervised, and reinforcement learning, the computation graph includes a collection of discrete components, such as discrete random variables, graphs, or even programs; a short sketch below illustrates why this breaks plain backpropagation. By the end of this course, you will have learned contemporary methods that allow efficient training of such models.

Prerequisites

Hard requirements: You will need basic knowledge of linear algebra, calculus, probability theory, statistics (mainly concepts related to estimators), and Python.

Optional, but desirable: neural networks, graphical models, variational inference, reinforcement learning, natural-language processing.

Course tools

Bayesian deep learning, latent variable models, latent structure.
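Sampling a discrete random variable has no derivative, so the chain rule cannot be applied through it and backpropagation alone cannot train such a model. Below is a minimal NumPy sketch of one classic workaround, the score-function (REINFORCE) estimator; the Bernoulli parameterization and the loss f are illustrative assumptions, not taken from the course materials.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 0.3                    # parameter of a Bernoulli distribution
f = lambda b: (b - 0.45) ** 2  # loss that depends on the discrete sample b

# Exact gradient of E_{b ~ Bern(theta)}[f(b)] = theta*f(1) + (1-theta)*f(0),
# available here only because the toy expectation can be written in closed form.
exact_grad = f(1.0) - f(0.0)

# Score-function (REINFORCE) estimator: f(b) * d/dtheta log p(b; theta).
# The samples b are discrete, so no gradient flows through them directly.
n = 200_000
b = rng.random(n) < theta
score = np.where(b, 1.0 / theta, -1.0 / (1.0 - theta))  # d/dtheta log p(b)
grad_est = np.mean(f(b.astype(float)) * score)

print(f"exact: {exact_grad:.4f}, REINFORCE estimate: {grad_est:.4f}")
```

The estimator is unbiased but typically has high variance, which is one reason the variance-reduction and relaxation techniques covered in the course matter in practice.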

About the lecturer

Serhii Havrylov

Serhii is a Ph.D. student at the ILCC, University of Edinburgh. Previously, he worked on his thesis at the ILLC, University of Amsterdam. Before starting his Ph.D., he worked as a research engineer for a little more than three years. He holds Bachelor’s and Master’s degrees in applied mathematics from the National Technical University of Ukraine “KPI”.