Know the pros and cons of applying solutions that predate the transformer, such as RNNs, LSTMs, and CNNs.
Understand the essential theory of transformer-based deep neural networks, including their most important building blocks.
Learn the modern approaches to solving computer vision problems using transformer-based architecture components.
Michał Chromiak, PhD
Director, R&D | UBS
Module 1: Introduction to Attention and Transformers
- The concept of attention and its applications before it became an integral part of the transformer architecture.
- Key building blocks of the transformer architecture, with the intuition behind them and their applications.
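The central building block the module covers, scaled dot-product attention, can be sketched in a few lines of NumPy. This is a minimal illustration (single head, no projections or masking), not the full transformer layer taught in the course:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core attention operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights                   # weighted sum of values

# toy example: 3 queries attending over 4 key/value pairs of dimension 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each row of `weights` sums to 1, so every query output is a convex combination of the value vectors.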
Module 2: Transformers for object detection
- The intuitions behind the recent research
- How the transformer architecture, originating from NLP, has proven to be suitable for the computer vision domain
- State-of-the-art research, with examples of how a 2D image can be used with transformers for object detection
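One common way a 2D image enters a transformer, used for example in DETR-style detectors, is to flatten a spatial feature map into a token sequence and add positional information. The sketch below assumes a hypothetical CNN feature map and uses a purely illustrative random positional signal:

```python
import numpy as np

# a hypothetical CNN feature map: height 4, width 5, 16 channels
H, W, C = 4, 5, 16
feature_map = np.random.default_rng(1).normal(size=(H, W, C))

# flatten the spatial grid into a sequence of H*W tokens of dimension C
tokens = feature_map.reshape(H * W, C)

# add a positional signal (random here, only for illustration) so the
# transformer can recover the spatial layout lost by flattening
positions = np.random.default_rng(2).normal(size=(H * W, C)) * 0.02
tokens = tokens + positions
```

The resulting `(H*W, C)` sequence can be fed to a standard transformer encoder exactly like a sequence of word embeddings.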
Module 3: Slot Attention
- Extracting object-centric representations with Slot Attention
- Enabling generalization to unseen compositions with Slot Attention.
- Relation to unsupervised object discovery from images.
- Explaining Slot Attention based on recent research.
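The key twist Slot Attention adds to standard attention can be sketched as follows: slots compete for input features because the softmax is taken over the slot axis rather than the input axis. This is a simplified, illustrative iteration only; the published method also uses learned projections, a GRU update, and an MLP, which are omitted here:

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention_step(slots, inputs, eps=1e-8):
    """One simplified Slot Attention iteration (projections/GRU omitted).

    The softmax runs over the SLOT axis, so each input feature
    distributes its attention across slots -- slots compete for inputs.
    """
    d = slots.shape[-1]
    attn = softmax(inputs @ slots.T / np.sqrt(d), axis=-1)  # (n_inputs, n_slots)
    # normalize per slot, then take the weighted mean of the inputs
    attn = attn / (attn.sum(axis=0, keepdims=True) + eps)
    return attn.T @ inputs                                  # updated slots

rng = np.random.default_rng(0)
inputs = rng.normal(size=(10, 6))   # 10 input features of dimension 6
slots = rng.normal(size=(3, 6))     # 3 object slots
for _ in range(3):                  # iterative refinement
    slots = slot_attention_step(slots, inputs)
```

After a few iterations each slot binds to a subset of the input features, which is what enables the object-centric representations discussed above.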
This course is for current or aspiring Data Scientists, Machine Learning Engineers, and NLP Practitioners.
Knowledge of the following tools and concepts:
Attention, Transformers, and Convolutional Neural Networks