Foundations of PyTorch

Course Features

Duration: 171 minutes
Delivery Method: Online
Available on: Downloadable Courses
Accessibility: Mobile, Desktop, Laptop
Language: English
Subtitles: English
Level: Beginner
Teaching Type: Self Paced
Video Content: 171 minutes

Course Description

PyTorch has become a popular choice for building deep learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs. It lets you build complex deep learning models while still relying on Python's native support for debugging and visualization. Foundations of PyTorch teaches you how to leverage PyTorch's support for dynamic computation graphs and contrasts that approach with other popular frameworks such as TensorFlow. You will first learn the internals of neurons and neural networks, and see how activation functions, affine transformations, and layers come together inside a deep learning model. Next, you will learn how such a model is trained, that is, how the best values of its parameters are estimated, and how gradient descent optimization is cleverly applied to make this process efficient. You will understand the different types of differentiation that could be used in this process, and how PyTorch uses Autograd to implement reverse-mode auto-differentiation. You will work with PyTorch constructs such as Tensors, Variables, and Gradients, and learn how to build dynamic computation graphs in PyTorch. You will round out the course by contrasting this with TensorFlow, which previously offered only static computation graphs but has since added support for dynamic computation graphs. When you have finished this course, you will be able to build deep learning models in PyTorch and harness the power of dynamic computation graphs.
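The course works through these ideas hands-on. Purely as an illustrative sketch (not taken from the course materials), the snippet below shows how a tensor flagged with requires_grad flows through an affine transformation and an activation function, how calling backward() triggers Autograd's reverse-mode auto-differentiation, and how a single gradient descent step then updates the parameters. The toy data, parameter names (x, w, b), and learning rate are all hypothetical choices made for this example.

```python
import torch

# Hypothetical toy example: one input vector and one target value.
x = torch.tensor([1.0, 2.0, 3.0])
y_true = torch.tensor([1.0])

# Parameters of a single "neuron": an affine transform w @ x + b.
# requires_grad=True asks Autograd to track operations on these tensors.
w = torch.randn(3, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# Forward pass: affine transformation followed by a sigmoid activation.
# The computation graph is recorded dynamically as these lines execute.
y_pred = torch.sigmoid(w @ x + b)

# Squared-error loss for this single example.
loss = (y_pred - y_true).pow(2).sum()

# Reverse-mode auto-differentiation: populates w.grad and b.grad.
loss.backward()

# One gradient descent step (learning rate chosen arbitrarily).
lr = 0.1
with torch.no_grad():
    w -= lr * w.grad
    b -= lr * b.grad
    w.grad.zero_()
    b.grad.zero_()

print(loss.item())
```

One point worth noting: since PyTorch 0.4 the Variable wrapper has been merged into Tensor, so the Variables mentioned in the course correspond to tensors created with requires_grad=True.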

Course Overview

International Faculty
Post Course Interactions
Instructor-Moderated Discussions
Case Studies, Capstone Projects

Skills You Will Gain

What You Will Learn

PyTorch has been a popular choice for building deep learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs

PyTorch lets you build complex deep learning models while still using Python-native support for debugging and visualization

In this course, Foundations of PyTorch, you will gain the ability to leverage PyTorch's support for dynamic computation graphs, and contrast that with other popular frameworks such as TensorFlow

First, you will learn the internals of neurons and neural networks, and see how activation functions, affine transformations, and layers come together inside a deep learning model

Next, you will discover how such a model is trained, that is, how the best values of model parameters are estimated

You will then see how gradient descent optimization is smartly implemented to optimize this process

You will understand the different types of differentiation that could be used in this process, and how PyTorch uses Autograd to implement reverse-mode auto-differentiation

You will work with different PyTorch constructs such as Tensors, Variables, and Gradients

Finally, you will explore how to build dynamic computation graphs in PyTorch (see the sketch after this list)

You will round out the course by contrasting this with the approaches used in TensorFlow, another leading deep learning framework which previously offered only static computation graphs, but has recently added support for dynamic computation graphs

When you’re finished with this course, you will have the skills and knowledge to move on to building deep learning models in PyTorch and harness the power of dynamic computation graphs
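As a final illustration of what "dynamic" means here (again a hypothetical sketch rather than course material), PyTorch records the computation graph while the forward pass runs, so ordinary Python control flow can change the graph's shape from one input to the next, and backward() differentiates through exactly the operations that were actually executed.

```python
import torch

def forward(x, w):
    # Data-dependent forward pass: the number of matrix multiplications
    # recorded in the graph depends on the values flowing through it.
    h = x
    for _ in range(5):
        if h.norm() >= 10:
            break
        h = w @ h  # each iteration adds another node to this run's graph
    return h.sum()

# Hypothetical parameter and input, chosen only for illustration.
w = torch.randn(3, 3, requires_grad=True)
x = torch.ones(3)

out = forward(x, w)  # a fresh graph is built during this call
out.backward()       # reverse-mode differentiation over that graph
print(w.grad.shape)  # torch.Size([3, 3])
```

A static-graph framework has to express that kind of loop through dedicated graph-level control-flow operations, which is part of the contrast with TensorFlow drawn in the course.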

Course Instructors

Janani Ravi

Instructor

Janani has a Master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework...