Course Abstract

Training duration: 90 minutes

Learn the basics of building a PyTorch model using a structured, incremental, from-first-principles approach. Find out why PyTorch is the fastest-growing Deep Learning framework and how to make use of its capabilities: autograd, dynamic computation graph, model classes, data loaders, and more. The main goal of this training is to show you how PyTorch works: we will start with a simple and familiar example in Numpy and "torch" it! By the end, you should understand PyTorch's key components and how to assemble them into a working model.

DIFFICULTY LEVEL: BEGINNER

Learning Objectives

  • Understand the basic building blocks of PyTorch: tensors, autograd, models, optimizers, losses, datasets, and data loaders

  • Identify the basic steps of gradient descent and how to use PyTorch to make each of them more automatic

  • Build, train, and evaluate a model using mini-batch gradient descent

Instructor

Instructor Bio:

Daniel Voigt Godoy has 20+ years of experience developing solutions, programs, and models across different industries: software development, government, fintech, retail, and mobility. He has 7+ years of experience with data processing, data analysis, machine learning, and statistical tools: Python (numpy, scipy, pandas, scikit-learn), Spark, RStudio, MATLAB, and Statistica. He has experience in stochastic simulation and agent-based modeling, and is an experienced programmer in SQL, Python, Java, R, PowerBuilder, and PHP, with strong programming skills and an eagerness to learn new languages, frameworks, and tools. He has a solid background in statistics, economics, capital markets, debt management, and financial instruments.

Daniel Voigt Godoy

Manager, Financial Advisory Analytics, Deloitte | Dean, Data Science Retreat

Course Outline

Module 1: PyTorch: tensors, tensors, tensors (15 min)    

• Introducing a simple and familiar example: linear regression   

• Generating synthetic data

• Tensors: what they are and how to create them

• CUDA: GPU vs CPU tensors

• Parameters: tensors meet gradients (see the sketch after this list)
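
A minimal sketch of the ideas in this module, assuming the kind of synthetic linear-regression data used throughout the course (the coefficients, noise level, and seed below are illustrative, not the course's exact values):

```python
import numpy as np
import torch

# Synthetic data for linear regression: y = 1 + 2x + noise (illustrative values)
np.random.seed(42)
x = np.random.rand(100, 1)
y = 1 + 2 * x + 0.1 * np.random.randn(100, 1)

# Numpy arrays become PyTorch tensors; move them to the GPU if one is available
device = 'cuda' if torch.cuda.is_available() else 'cpu'
x_tensor = torch.as_tensor(x, dtype=torch.float32, device=device)
y_tensor = torch.as_tensor(y, dtype=torch.float32, device=device)

# Parameters are tensors that require gradients
b = torch.randn(1, requires_grad=True, dtype=torch.float32, device=device)
w = torch.randn(1, requires_grad=True, dtype=torch.float32, device=device)
```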

Module 2: Gradient Descent in Five Easy Steps (15 min)    

• Step 0: initializing parameters    

• Step 1: making predictions in the forward pass

• Step 2: computing the loss, or “how bad is my model?”    

• Step 3: computing gradients, or “how to minimize the loss?”    

• Step 4: updating parameters    

• Bonus: learning rate, the most important hyper-parameter    

• Step 5: rinse and repeat (see the sketch after this list)
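
A compact Numpy sketch of the five steps applied to linear regression, in the spirit of the course's "start in Numpy" approach (the learning rate and number of epochs are illustrative):

```python
import numpy as np

# Synthetic data: y = 1 + 2x + noise (illustrative values)
np.random.seed(42)
x = np.random.rand(100, 1)
y = 1 + 2 * x + 0.1 * np.random.randn(100, 1)

# Step 0: initialize parameters
b, w = np.random.randn(1), np.random.randn(1)

lr = 0.1                         # learning rate, the most important hyper-parameter
for epoch in range(1000):        # Step 5: rinse and repeat
    # Step 1: forward pass (predictions)
    yhat = b + w * x
    # Step 2: compute the loss (mean squared error)
    error = yhat - y
    loss = (error ** 2).mean()
    # Step 3: compute gradients of the loss w.r.t. b and w
    b_grad = 2 * error.mean()
    w_grad = 2 * (x * error).mean()
    # Step 4: update parameters
    b -= lr * b_grad
    w -= lr * w_grad
```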

Module 3: Autograd, your companion for all your gradient needs! (15 min)    

• Computing gradients automatically with the backward method    

• Dynamic Computation Graph: what is that?   

• Optimizers: updating parameters, the PyTorch way

• Loss functions in PyTorch (see the sketch after this list)
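
A sketch of the same training loop, now letting autograd, an optimizer, and a built-in loss function do the heavy lifting (data, learning rate, and epoch count are illustrative):

```python
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

# Same illustrative synthetic data as the Module 1 sketch
np.random.seed(42)
x = np.random.rand(100, 1)
y = 1 + 2 * x + 0.1 * np.random.randn(100, 1)

device = 'cuda' if torch.cuda.is_available() else 'cpu'
x_tensor = torch.as_tensor(x, dtype=torch.float32, device=device)
y_tensor = torch.as_tensor(y, dtype=torch.float32, device=device)

# Parameters, an optimizer to update them, and a loss function
b = torch.randn(1, requires_grad=True, dtype=torch.float32, device=device)
w = torch.randn(1, requires_grad=True, dtype=torch.float32, device=device)
optimizer = optim.SGD([b, w], lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(1000):
    yhat = b + w * x_tensor          # the forward pass builds the dynamic graph
    loss = loss_fn(yhat, y_tensor)
    loss.backward()                  # autograd computes all gradients
    optimizer.step()                 # the optimizer updates the parameters
    optimizer.zero_grad()            # clear gradients before the next iteration
```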

Module 4: Building a Model in PyTorch (15 min)

• Your first custom model in PyTorch    

• Peeking inside a model with state dictionaries

• The importance of setting a model to training mode    

• Nested models, layers, and sequential models

• Organizing our code: the training step (see the sketch after this list)
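
A sketch of the model-related ideas in this module; the class and function names below are illustrative, not the course's exact code:

```python
import torch
import torch.nn as nn

class ManualLinearRegression(nn.Module):
    """A minimal custom model wrapping a single linear layer."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(1, 1)   # a nested, built-in layer

    def forward(self, x):
        return self.linear(x)

device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = ManualLinearRegression().to(device)

# Peek inside the model: its current parameters, keyed by name
print(model.state_dict())

# An equivalent model built with nn.Sequential
model_seq = nn.Sequential(nn.Linear(1, 1)).to(device)

# Organizing the code: a helper that returns a function performing one training step
def make_train_step(model, loss_fn, optimizer):
    def train_step(x, y):
        model.train()                   # set the model to training mode
        yhat = model(x)
        loss = loss_fn(yhat, y)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        return loss.item()
    return train_step
```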

Module 5: Datasets and data loaders (20 min)   

• Your first custom dataset in PyTorch   

• Data loaders and mini-batches    

• Evaluation phase: setting up the stage   

• Organizing our code: the training loop   

• Putting it all together: data preparation, model configuration, and model training

• Taking a break: saving and loading models (see the sketch after this list)
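
A sketch putting the pieces of this module together; the dataset class, split sizes, batch sizes, and checkpoint filename are illustrative assumptions:

```python
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import Dataset, DataLoader, random_split

class CustomDataset(Dataset):
    """A minimal custom dataset wrapping feature and label tensors."""
    def __init__(self, x_tensor, y_tensor):
        self.x = x_tensor
        self.y = y_tensor

    def __getitem__(self, index):
        return self.x[index], self.y[index]

    def __len__(self):
        return len(self.x)

# Same illustrative synthetic data as the earlier sketches
np.random.seed(42)
x = np.random.rand(100, 1)
y = 1 + 2 * x + 0.1 * np.random.randn(100, 1)
dataset = CustomDataset(torch.as_tensor(x, dtype=torch.float32),
                        torch.as_tensor(y, dtype=torch.float32))

# Split into training and validation sets, then wrap them in data loaders
train_data, val_data = random_split(dataset, [80, 20])
train_loader = DataLoader(train_data, batch_size=16, shuffle=True)
val_loader = DataLoader(val_data, batch_size=20)

model = nn.Sequential(nn.Linear(1, 1))
loss_fn = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    model.train()                                   # training mode
    for x_batch, y_batch in train_loader:           # mini-batch gradient descent
        loss = loss_fn(model(x_batch), y_batch)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

    model.eval()                                    # evaluation phase
    with torch.no_grad():                           # no gradients needed here
        val_loss = np.mean([loss_fn(model(x_val), y_val).item()
                            for x_val, y_val in val_loader])

# Taking a break: save a checkpoint and load it back
torch.save({'model_state_dict': model.state_dict(),
            'optimizer_state_dict': optimizer.state_dict()}, 'checkpoint.pth')
checkpoint = torch.load('checkpoint.pth')
model.load_state_dict(checkpoint['model_state_dict'])
```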

Background knowledge

  • This course is for current or aspiring Data Scientists, Machine Learning Engineers, and Deep Learning Practitioners

  • Knowledge of the following tools and concepts: Python, Jupyter notebooks, Numpy, and, preferably, object-oriented programming

  • Basic machine learning concepts may be helpful, but they are not required

Real-world applications

  • Several companies are already “powered by PyTorch”: Facebook, Tesla, OpenAI, and Uber, to name a few.

  • PyTorch can be used to develop deep learning models for a wide range of applications, from natural language processing to self-driving cars.