This workshop teaches you how to use transformer neural networks and their incarnations (BERT, RoBERTa, GPT-2) to solve real-world natural language use cases. NLP has advanced tremendously over the last few years, and BERT is at the forefront of this success, having achieved state-of-the-art results on 11 different NLP tasks. For businesses, BERT has unlocked NLP use cases that were previously unattainable.
This workshop will teach you what transformers and systems like BERT and GPT-2 are and how to use and modify them for your needs. Organizations have a wealth of unstructured text sources in every line of business, such as employee feedback in human resources, purchase orders and legal documents in contracting and procurement, communication records throughout the organization, and many more. Making sense of this information and organizing it into knowledge and actionable insights to improve business outcomes is a key capability every data scientist should be aware of.
Most NLP problems require domain adaptation and customization to tailor your models to nuances in language usage and business requirements. We will show you how to fine-tune a BERT model to adapt to different language environments (e.g., social media vs. contracts), how to use transfer learning with BERT to carry knowledge across domains, and how to modify the BERT architecture by using different “heads” to accomplish different NLP tasks. Using real-world examples, we will present common pitfalls in using and deploying BERT and transformers, so that you won’t have to make the same mistakes early adopters did.
This tutorial will cover five major topics:
Overview of transformer and BERT use cases. We will go over real-world use cases that can be solved with models like BERT and GPT-2, using several anecdotes from actual companies.
How transformers work. We will go over the input and output of BERT and other transformer-based networks and how they work internally. We will show you how to set up your NLP problems to fit into this paradigm.
How to train and adapt BERT. We will go over the concept and benefits of fine-tuning BERT for your domain and how to do it. We’ll cover adapting BERT to your problems by using different “heads” for different tasks, including using transformers for sequence labeling tasks (e.g., NER), classification tasks (e.g., sentiment), and multi-sentence problems (e.g., Q&A and inference).
What to watch out for when working with BERT. We will present pitfalls and modeling considerations (e.g., fixed length sequences, training speed, and others) so that you are not slowed down by the same mistakes we encountered at first.
Business-level considerations for deploying BERT. We will present approaches to tuning, optimizing, and monitoring BERT for operationalization, including deployment in low-resource environments and performance monitoring in production.
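To give a flavor of what "how transformers work internally" means, here is a minimal sketch of scaled dot-product attention, the core operation inside every transformer layer. This is an illustrative toy in NumPy, not workshop material: the dimensions, random projections, and function names are all stand-ins for the learned components of a real model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Each position computes similarity scores against every position,
    normalizes them into weights, and returns a weighted sum of values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                 # toy sizes (BERT-base uses 512 / 768)
x = rng.normal(size=(seq_len, d_model))
# In a real transformer Q, K, V come from learned projections of x;
# random matrices serve as stand-ins here.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, attn = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
```

The output has the same shape as the input sequence, which is what lets transformer layers be stacked.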
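The idea of swapping "heads" for different tasks can be sketched in a few lines: the same encoder output feeds either a per-token classifier (sequence labeling, e.g., NER) or a classifier over a single summary vector (sequence classification, e.g., sentiment). The random hidden states and weight matrices below are placeholders for a trained encoder and trained heads.

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, hidden, n_labels, n_classes = 6, 8, 5, 2

# Stand-in for the encoder's output: one hidden vector per token,
# with position 0 playing the role of BERT's [CLS] summary token.
hidden_states = rng.normal(size=(seq_len, hidden))

# Token-level head (e.g., NER): a linear layer applied to every token,
# producing one label distribution per position.
W_token = rng.normal(size=(hidden, n_labels))
token_logits = hidden_states @ W_token        # shape (seq_len, n_labels)

# Sequence-level head (e.g., sentiment): a linear layer on [CLS] only,
# producing a single prediction for the whole input.
W_cls = rng.normal(size=(hidden, n_classes))
sequence_logits = hidden_states[0] @ W_cls    # shape (n_classes,)
```

The encoder stays the same in both cases; only the small head on top changes, which is why fine-tuning for a new task is cheap relative to pre-training.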
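One concrete example of the fixed-length-sequence pitfall: BERT-style models consume sequences of a fixed maximum length, so every input must be truncated or padded, with an attention mask marking which positions hold real tokens. The helper below is a simplified sketch of that bookkeeping (the function name and `pad_id` default are illustrative, not a library API).

```python
def pad_or_truncate(token_ids, max_len, pad_id=0):
    """Clip a token-id sequence to max_len, or pad it with pad_id.
    Returns (ids, mask) where mask is 1 for real tokens, 0 for padding."""
    ids = token_ids[:max_len]                        # truncate if too long
    mask = [1] * len(ids) + [0] * (max_len - len(ids))
    ids = ids + [pad_id] * (max_len - len(ids))      # pad if too short
    return ids, mask
```

Silently truncating long documents is a common source of degraded accuracy, so it pays to check how much of your corpus exceeds the chosen maximum length.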
Workshop Overview and Author Bio
Before you get started: Prerequisites and Resources
Transform your NLP Skills: Using BERT (and Transformers) in Real Life
Niels Kasch, PhD
Data Scientist and Founding Partner | Miner & Kasch