Institute of Computer Science
Transformers (MTAT.06.055), 2021/22 spring


Lectures

  • Given by Mark, mark@tartunlp.ai
  • Tuesdays at 10:15-12:00
  • Open to anyone interested
  • Recordings and slides available after the lecture (see table below)
  • General link to lectures: Panopto
1. Feb 8: Motivation and overview (slides, video)
2. Feb 16: Language models (slides, video)
   Sequential vs. masked LM; probabilistic vs. neural models; fixed vs. contextual embeddings
3. Feb 23: Attention (slides, video)
   Encoder-decoder sequence-to-sequence models and the attention mechanism
4. Mar 1: Self-attention (slides, video)
   The Transformer as introduced for NMT/sequence-to-sequence, incl. self-attention, multiple attention heads and layers, and a discussion of why it works
5. Mar 8: Transformers for NLP (slides)
6. Mar 15: Transformers for images and sound (slides, video)
7. Mar 22: Transformers for bio and health data, time series (slides, video)
8. Mar 29: Output generation and evaluation (slides, video)
9. Apr 12: Neural machine translation (slides, video)
10. Apr 26: Statistical machine translation (slides)
  • Institute of Computer Science
  • Faculty of Science and Technology
  • University of Tartu
In case of technical problems or questions, write to:

Contact the course organizers with organizational and course content questions.
The proprietary copyright of the educational materials belongs to the University of Tartu. Use of the educational materials is permitted for the purposes and under the conditions provided for in copyright law for the free use of a work. When using the educational materials, the user is obligated to credit their author.
Use of the educational materials for other purposes is allowed only with the prior written consent of the University of Tartu.
Terms of use for the Courses environment