Course Description
The connections between physics and machine learning have a long history, particularly in the fields of neural networks and statistical learning. Applying concepts and tools from theoretical physics has provided important insights into the process of learning and has even inspired novel ML architectures. Conversely, machine learning methods have recently been applied to distinguish phases of matter and phase transitions, as well as to construct new numerical methods for simulating physical systems. With the advent of deep learning, and the growing use of ML in physics itself, new connections are being discovered. Some of the most intriguing involve analogies between the way information flowing through deep architectures progressively distills the relevant features from vast initial inputs and similar concepts in theoretical physics, embodied in the powerful Renormalization Group formalism.
In the lectures we will review both the classical results on stochastic training of neural networks and the very recent developments.
Prerequisites
Basic notions of probability theory and machine learning are sufficient.
Lecturer
Dr. Maciej Koch-Janusz
ETH Zurich
Maciej is a theoretical physicist with a background in computer science and mathematics. He studied in Poland, the Netherlands and Israel, and currently works at ETH Zurich in Switzerland. His recent research focuses on the relations between statistical physics/information theory and machine learning.
Fields of interest: topological states, statistical physics/information theory/machine learning.
Contact: maciejk@ethz.ch