Course Abstract

Training duration: 90 minutes

In recent years, transfer learning techniques, in which a deep learning model trained on a large dataset is reused to perform similar tasks on another dataset, have become widespread; such models are called pre-trained models. Demand for transfer learning techniques in NLP is at an all-time high. In this session, we will start by introducing the recent breakthroughs in NLP that resulted from the combination of Transfer Learning and Transformer architectures. Then, we'll learn to use the open-source tools released by Hugging Face, such as the Transformers and Tokenizers libraries and the distilled models.
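The transfer-learning idea above can be sketched in a toy, stdlib-only form: a "pretrained" component (here, hypothetical word lists standing in for learned weights) is reused unchanged on a new dataset, and only a small task-specific head is fit. This is a conceptual illustration, not a real NLP model or the Hugging Face API.

```python
# Toy sketch of transfer learning (illustrative only, not a real model):
# the "pretrained" feature extractor is frozen and reused; only a tiny
# task-specific head (a score threshold) is fit on the new dataset.
PRETRAINED_POSITIVE = {"great", "good", "love"}   # stands in for learned weights
PRETRAINED_NEGATIVE = {"bad", "awful", "hate"}

def extract_features(text):
    """Frozen 'pretrained' component: reused as-is on every downstream task."""
    words = set(text.lower().split())
    return len(words & PRETRAINED_POSITIVE) - len(words & PRETRAINED_NEGATIVE)

def fit_head(examples):
    """Task-specific head: pick the score threshold that best separates labels."""
    scores = [(extract_features(text), label) for text, label in examples]
    best_t, best_acc = 0, -1.0
    for t in range(-3, 4):
        acc = sum((s > t) == label for s, label in scores) / len(scores)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

train = [("great movie", True), ("awful plot", False)]
threshold = fit_head(train)
print(extract_features("i love it") > threshold)  # True
```

Fine-tuning a real pre-trained Transformer follows the same pattern at scale: the large pre-trained network is kept (or only lightly updated) while a small task head is trained on the new data.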


Learning Objectives

  • Understanding Transfer Learning in NLP

  • How the Transformers and Tokenizers libraries are organized

  • How to use Transformers and Tokenizers for downstream tasks like text classification, NER, and text generation
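To make the tokenization step concrete, here is a toy, stdlib-only illustration of what a tokenizer does: map raw text to the integer ids a Transformer model consumes. Real tokenizers (such as those in the Hugging Face Tokenizers library) use learned subword vocabularies; this sketch uses a naive whitespace vocabulary purely to show the encoding idea.

```python
# Toy tokenizer sketch (NOT the Hugging Face Tokenizers API): map text
# to integer ids, with id 0 reserved for unknown tokens ([UNK]).
def build_vocab(corpus):
    """Assign an integer id to every whitespace token seen in the corpus."""
    vocab = {"[UNK]": 0}
    for text in corpus:
        for token in text.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

def encode(text, vocab):
    """Convert text to ids, falling back to the [UNK] id for unseen tokens."""
    return [vocab.get(token, vocab["[UNK]"]) for token in text.lower().split()]

vocab = build_vocab(["transformers are powerful", "tokenizers are fast"])
print(encode("transformers are fast", vocab))  # known tokens get their own ids
print(encode("bert is fast", vocab))           # unseen tokens map to 0
```

In the session, the same text-to-ids step is handled by a pre-trained tokenizer paired with its model, so that the ids match the vocabulary the model was trained on.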


Instructor Bio:

Thomas leads the Science Team at Hugging Face Inc., a Brooklyn-based startup working on Natural Language Generation and Natural Language Understanding. After graduating from Ecole Polytechnique (Paris, France), he worked on laser-plasma interactions at the BELLA Center of Lawrence Berkeley National Laboratory (Berkeley, CA). He was accepted for a PhD at MIT (Cambridge, MA) but ended up doing his PhD in statistical/quantum physics at Sorbonne University and ESPCI (Paris, France), working on superconducting materials for the French DARPA (DGA) and Thales. Thomas is interested in Natural Language Processing, Deep Learning, and Computational Linguistics. Much of his research focuses on Natural Language Generation, with Natural Language Understanding serving as a tool for better generation.

Thomas Wolf, PhD

Chief Science Officer | Hugging Face 🤗

Background knowledge

  • This course is for current and aspiring Data Scientists, NLP and ML Engineers, and AI Product Managers

  • Knowledge of the following tools and concepts is useful:

  • Familiarity with Python and Jupyter notebooks

  • Basic understanding of Natural Language Processing techniques

Real-world applications

  • Transformers are extensively used for natural language generation in Conversational AI.

  • More recently, Facebook AI Research released a Transformer-based deep learning model with 669.2M parameters to learn embeddings for protein sequences.

  • ABN AMRO bank is already using Transformers and NLP-powered chatbots to provide better customer service.