Course Description
This course provides an overview of the field of Natural Language Generation (NLG). We will cover state-of-the-art (SoTA) methods based on neural networks, as well as more traditional methods based on context-free and dependency grammars. We will highlight important differences between generative and discriminative models, such as differences in loss functions and in how results are evaluated. Finally, we will touch upon multilingual NLG, covering currently available methods and open challenges.
Course tools
- Python, PyTorch, NumPy
Prerequisites
- Python (some knowledge of PyTorch will be useful)
- Neural networks (basic concepts, gradient descent, discriminative losses)
- Basic probability theory and linear algebra
Lecturer
Dr. Dmytro Kalpakchi
Ph.D. Student at KTH Royal Institute of Technology