Introduction to Natural Language Generation


Course Description

This course provides an overview of the Natural Language Generation (NLG) field. We will cover state-of-the-art (SoTA) methods based on neural networks, as well as more traditional methods based on context-free and dependency grammars. We will highlight important differences between generative and discriminative models, such as differences in loss functions and ways of evaluating results. Finally, we will touch upon multilingual NLG, covering currently available methods and open challenges.
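To give a flavour of the loss-function difference mentioned above, here is a minimal PyTorch sketch (illustrative only, not course material; all tensor names and sizes are made up): a generative language model is typically trained with token-level cross-entropy over the vocabulary at every position, while a discriminative classifier uses cross-entropy over a small label set once per input.

```python
import torch
import torch.nn.functional as F

vocab_size, num_classes, seq_len, batch = 1000, 3, 12, 4

# Generative: predict the next token at every position in the sequence,
# so the loss is averaged over batch * seq_len vocabulary-sized decisions.
lm_logits = torch.randn(batch, seq_len, vocab_size)        # model outputs
next_tokens = torch.randint(vocab_size, (batch, seq_len))  # gold next tokens
generative_loss = F.cross_entropy(
    lm_logits.view(-1, vocab_size), next_tokens.view(-1)
)

# Discriminative: predict a single label (e.g. a sentiment class) per input,
# so the loss is one decision over num_classes per example.
clf_logits = torch.randn(batch, num_classes)
labels = torch.randint(num_classes, (batch,))
discriminative_loss = F.cross_entropy(clf_logits, labels)

print(generative_loss.item(), discriminative_loss.item())
```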

Course tools

  • Python, PyTorch, Numpy

Prerequisites

  • Python (some knowledge of PyTorch will be useful)
  • Neural networks (basic concepts, gradient descent, discriminative losses)
  • Basic probability theory and linear algebra

Course level

Intermediate

Lecturer

Dmytro Kalpakchi

Ph.D. Student at KTH Royal Institute of Technology

Currently, Dmytro is a Ph.D. student at KTH Royal Institute of Technology in Stockholm, Sweden. His main interests revolve around Natural Language Processing and Natural Language Generation. He previously received a master's degree in Machine Learning from KTH and a bachelor's degree in Computer Science from the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute".

Fields of interest: NLP, NLG, dialogue systems

Contact: dkalpackchi@gmail.com
