Overview

Transfer learning leverages knowledge acquired from related data to improve performance on a target task. Advances in deep learning, together with large labelled datasets such as ImageNet, have made high-performing pre-trained computer vision models possible. As a result, transfer learning, in particular fine-tuning a pre-trained model on a target task, has long been far more common in computer vision than training from scratch.
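As a concrete illustration, here is a minimal sketch of that kind of fine-tuning in PyTorch/torchvision; the ResNet-18 backbone, the 10-class target task, and the hyperparameters are assumptions made for this example, not anything prescribed by the session.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a ResNet-18 with weights pre-trained on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Swap the 1000-class ImageNet head for a new head sized for a
# hypothetical 10-class target task; the backbone weights are reused.
model.fc = nn.Linear(model.fc.in_features, 10)

# Fine-tune all weights with a small learning rate, so the pre-trained
# features are adapted gently rather than overwritten.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Alternative ("feature extraction"): freeze the backbone and train
# only the new head.
# for p in model.parameters():
#     p.requires_grad = False
# for p in model.fc.parameters():
#     p.requires_grad = True
```

The commented-out variant is the cheaper feature-extraction approach; fully fine-tuning typically performs better when the target dataset is large enough to adapt the backbone without overfitting.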
In NLP, transfer learning became the new paradigm starting in 2018: large language models pre-trained on massive corpora (ULMFiT, OpenAI GPT, the BERT family, etc.) have achieved new state-of-the-art results on many NLP tasks.
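As a taste of what this looks like in practice, below is a minimal sketch of fine-tuning a pre-trained BERT model on a text-classification task using the Hugging Face Transformers and Datasets libraries; the bert-base-uncased checkpoint, the IMDB dataset, and the training arguments are illustrative assumptions rather than the session's setup.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Pre-trained encoder plus a freshly initialized classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# IMDB sentiment data stands in here for an arbitrary target task.
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

# Fine-tune the whole model on a small subset (kept small for the sketch).
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-finetuned",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()
```

Different transfer learning techniques, such as feature extraction versus full fine-tuning, differ mainly in which of these pre-trained weights are updated on the target task.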
In this session, we'll cover the different types of transfer learning, the architectures of these pre-trained language models, and how different transfer learning techniques can be applied to NLP tasks. We'll also walk through a variety of problems that can be solved with these models and techniques.

Session Overview

ODSC West 2020: Transfer Learning in NLP

  • Overview and Author Bio
  • Transfer Learning in NLP

Instructor Bio:

Joan Xiao, PhD

Principal Data Scientist | Linc Global

Joan Xiao is a Principal Data Scientist at Linc Global, a commerce-specialized customer care automation company. In her role, she applies novel natural language processing and machine learning techniques to improve customer experience. Previously, she led machine learning and data science teams at companies ranging from startups to Fortune 100. Joan received her Ph.D. in Mathematics and M.S. in Computer Science from the University of Pennsylvania.