Generating New Recipes using GPT-2


Course Features

Duration: 120 minutes

Delivery Method: Online

Available on: Limited Access

Accessibility: Desktop, Laptop

Language: English

Subtitles: English

Level: Intermediate

Teaching Type: Self Paced

Video Content: 120 minutes

Course Description

This two-hour project will teach you how to preprocess a text dataset of recipes and split it into training and validation sets. You will learn how to use HuggingFace to fine-tune deep generative models and how to train such models on Google Colab. Finally, you will learn how GPT-2 can be used to generate unique and realistic recipes from that dataset. This project will show you what it takes to fine-tune a large-scale model, including the substantial compute resources required. In this course, you will also learn about knowledge distillation. Note: This course is best suited for learners in the North America region. We are currently working to offer the same experience in other areas.
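The fine-tuning workflow described above can be sketched with the HuggingFace Transformers Trainer API. This is a minimal sketch under stated assumptions: the recipe markers, file format, and hyperparameters below are illustrative choices, not the course's actual configuration.

```python
# Sketch: fine-tuning GPT-2 on recipe text with HuggingFace Transformers.
# The <|startofrecipe|> / <|endofrecipe|> markers and all hyperparameters
# are illustrative assumptions.

def format_recipe(title, ingredients, steps):
    """Flatten one recipe into a single training string with plain-text markers."""
    lines = ["<|startofrecipe|>", f"Title: {title}", "Ingredients:"]
    lines += [f"- {item}" for item in ingredients]
    lines.append("Steps:")
    lines += [f"{i}. {step}" for i, step in enumerate(steps, start=1)]
    lines.append("<|endofrecipe|>")
    return "\n".join(lines)

def main():
    # Heavy imports live here so format_recipe stays usable without the library.
    from transformers import (GPT2LMHeadModel, GPT2TokenizerFast, Trainer,
                              TrainingArguments,
                              DataCollatorForLanguageModeling)

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # In the real project this would be the preprocessed recipe dataset.
    texts = [format_recipe("Pancakes", ["flour", "milk", "eggs"],
                           ["Mix the ingredients.", "Fry on both sides."])]
    encodings = tokenizer(texts, truncation=True, max_length=512)
    train_dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="gpt2-recipes",
                               num_train_epochs=3,
                               per_device_train_batch_size=4),
        train_dataset=train_dataset,
        # mlm=False gives the causal (next-token) objective GPT-2 needs.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
```

Calling `main()` in a Colab session with `transformers` installed would start training; the helper `format_recipe` runs anywhere and shows the assumed flattening step.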

Course Overview

Virtual Labs

International Faculty

Post Course Interactions

Hands-On Training, Instructor-Moderated Discussions

Case Studies, Capstone Projects

Skills You Will Gain

What You Will Learn

Create datasets for large-scale language generation

Fine-tune a large-scale language model on the small, niche task of generating recipes

Clean and preprocess text data for modeling
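The cleaning and train/validation-split steps listed above can be sketched with standard-library Python alone. The normalization rules and the split ratio are assumptions for illustration, not the course's exact recipe pipeline:

```python
import random
import re

def clean_recipes(raw_texts):
    """Normalize whitespace, then drop empty strings and exact duplicates
    (order-preserving)."""
    seen, cleaned = set(), []
    for text in raw_texts:
        text = re.sub(r"\s+", " ", text).strip()
        if text and text not in seen:
            seen.add(text)
            cleaned.append(text)
    return cleaned

def train_val_split(texts, val_fraction=0.1, seed=42):
    """Shuffle deterministically, then carve off a validation slice."""
    texts = list(texts)
    random.Random(seed).shuffle(texts)
    n_val = max(1, int(len(texts) * val_fraction))
    return texts[n_val:], texts[:n_val]

recipes = clean_recipes([
    "Boil  pasta.\nAdd   sauce.",
    "Boil pasta. Add sauce.",   # duplicate after normalization
    "   ",                      # empty after stripping
    "Whisk eggs. Scramble gently.",
    "Knead dough. Bake at 220C.",
])
train, val = train_val_split(recipes, val_fraction=0.25)
print(len(recipes), len(train), len(val))  # 3 recipes -> 2 train, 1 validation
```

Seeding the shuffle keeps the split reproducible across runs, which matters when you later compare fine-tuning experiments on the same validation set.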
