Deploying Machine Learning Models as Microservices Using Docker

Course Features

Duration: 24 minutes
Delivery Method: Online
Available on: Limited Access
Accessibility: Desktop, Laptop
Language: English
Subtitles: English
Level: Intermediate
Teaching Type: Self Paced
Video Content: 24 minutes

Course Description

Modern cloud applications are often built as REST-based microservices running in Docker containers; Docker makes it straightforward to build, ship, and scale the individual components of an application while they communicate with one another over the network. Data scientists use the same techniques to scale machine learning models to production efficiently. This video teaches you how to deploy machine learning models behind a REST API, so that they can serve low-latency requests from applications without a Spark cluster. You'll also learn how to export SparkML models; how Docker can be used to build, deploy, and ship microservices code; and how a model scoring system can support both single on-demand predictions and bulk predictions. Basic knowledge of Scala, Python, Hadoop, Spark, Pandas, SBT or Maven, cloud platforms such as Amazon Web Services, Bash, Docker, and REST is required.
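To make the serving pattern concrete, here is a minimal sketch of a model scoring service that exposes both a single-prediction and a bulk-prediction REST endpoint. It uses Flask and a joblib-serialized scikit-learn model purely as a stand-in; the course itself works with models exported from SparkML (for example as MLeap bundles or PMML), and the endpoint names, payload shapes, and model path below are illustrative assumptions rather than the course's exact API.

```python
# Minimal model scoring service: one endpoint for single on-demand predictions,
# one for bulk predictions. Flask + a joblib-serialized scikit-learn model are
# stand-ins for the exported SparkML/PMML model used in the course.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # hypothetical path to a pre-trained model


@app.route("/predict", methods=["POST"])
def predict_single():
    # Expects a JSON object such as {"features": [1.2, 3.4]}
    features = request.get_json()["features"]
    prediction = model.predict([features])[0]
    return jsonify({"prediction": float(prediction)})


@app.route("/predict/batch", methods=["POST"])
def predict_batch():
    # Expects {"instances": [[...], [...], ...]} and scores them in one call
    instances = request.get_json()["instances"]
    predictions = model.predict(instances)
    return jsonify({"predictions": [float(p) for p in predictions]})


if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is reachable when run inside a container
    app.run(host="0.0.0.0", port=5000)
```

A client would then POST JSON to /predict for a single low-latency prediction or to /predict/batch for bulk scoring, which is the single-versus-bulk split the description refers to.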

Mikhail Semeniuk is a data engineer at Shift Technologies. Before that, he spent six years as a senior-level statistician at UnitedHealth Group, the largest U.S. health insurance provider. He holds a BS in Economics and Financial Math from the University of Minnesota.

Jason Slepicka is a senior data engineer at DataScience. He is currently pursuing a PhD in Computer Science at the University of Southern California Information Sciences Institute.

Course Overview

International Faculty

Post Course Interactions

Instructor-Moderated Discussions

What You Will Learn

Understand how to deploy machine learning models behind a REST API

Learn to utilize Docker containers for REST-based microservices architectures

Explore methods for exporting models trained in SparkML using a library like Combust MLeap (see the export sketch after this list)

See how Docker builds, deploys, and ships application code for microservices (see the Docker SDK sketch after this list)

Discover how to deploy a model using exported PMML with a REST API in a Docker container

Learn to use AWS Elastic Container Service (ECS) to deploy a model hosting server in Docker (see the boto3 sketch after this list)

Pick up techniques that enable a model hosting server to read a model
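For the SparkML export objective above, a rough sketch of serializing a fitted pipeline to an MLeap bundle might look like the following. It assumes the mleap Python package and its PySpark bindings are installed alongside PySpark; the column names, pipeline stages, and output path are illustrative assumptions, not the course's exact example.

```python
# Sketch: fit a small SparkML pipeline and export it as an MLeap bundle,
# assuming the `mleap` package and its PySpark bindings are available.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

import mleap.pyspark  # noqa: F401  (registers MLeap serialization support)
from mleap.pyspark.spark_support import SimpleSparkSerializer  # noqa: F401

spark = SparkSession.builder.appName("mleap-export").getOrCreate()

# Tiny illustrative training set: two numeric features and a binary label
df = spark.createDataFrame(
    [(0.0, 1.1, 0.0), (2.0, 1.0, 1.0), (2.5, 3.0, 1.0), (0.5, 0.2, 0.0)],
    ["x1", "x2", "label"],
)

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["x1", "x2"], outputCol="features"),
    LogisticRegression(featuresCol="features", labelCol="label"),
])
fitted = pipeline.fit(df)

# Serialize the fitted pipeline (plus a sample of transformed data for the
# schema) into a bundle a model server can load without a Spark cluster.
fitted.serializeToBundle("jar:file:/tmp/pipeline.zip", fitted.transform(df))
```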
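As a rough illustration of the Docker build-and-ship step (normally done with the docker CLI or a CI pipeline), the Docker SDK for Python can drive the same workflow programmatically. The image tag, port mapping, and the assumption that a Dockerfile for the scoring service sits in the current directory are all illustrative.

```python
# Sketch: build the scoring-service image and run it as a container using the
# Docker SDK for Python (the `docker` package), roughly equivalent to
# `docker build` followed by `docker run` on the command line.
import docker

client = docker.from_env()

# Build an image from a Dockerfile assumed to be in the current directory
image, build_logs = client.images.build(path=".", tag="model-scoring:latest")

# Run the container, mapping the service port to the host
container = client.containers.run(
    "model-scoring:latest",
    detach=True,
    ports={"5000/tcp": 5000},  # container port/protocol -> host port
)
print("Started container", container.short_id)
```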
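And for the AWS ECS objective, a heavily simplified boto3 sketch of registering a task definition and creating a service is shown below. The cluster name, image URI, resource sizes, and EC2 launch type are assumptions; a real deployment also involves pushing the image to a registry such as ECR, plus IAM roles and load balancing, which are omitted here.

```python
# Sketch: deploy the scoring container to Amazon ECS with boto3.
# Names, sizes, and the pre-existing cluster are illustrative assumptions.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

# Register a task definition describing how to run the scoring container
ecs.register_task_definition(
    family="model-scoring",
    containerDefinitions=[
        {
            "name": "model-scoring",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/model-scoring:latest",
            "memory": 512,
            "portMappings": [{"containerPort": 5000, "hostPort": 5000}],
            "essential": True,
        }
    ],
)

# Create a long-running service that keeps one task alive on an existing cluster
ecs.create_service(
    cluster="ml-models",
    serviceName="model-scoring-service",
    taskDefinition="model-scoring",
    desiredCount=1,
    launchType="EC2",
)
```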

Course Instructors

Jason Slepicka, Instructor

Mikhail Semeniuk, Instructor

Course Reviews

Average rating: 5.0 out of 5, based on 3 reviews.