
Instructor


Peter Chen

Peter Chen is an analytics and data science professional with an eclectic and deep background. He has held various senior positions at companies such as Algebraix Data, Petco, Mitchell International, and Sempra Energy. He has been profiled in media articles ("A Confession of a Data Miner") about his analytics and data mining experience and is widely published in industry trade magazines on analytics and data science. Peter received his BS in Management Science from the Massachusetts Institute of Technology/Sloan School of Management and his Master's in Software Engineering/Data Science from Harvard.

Machine Learning Foundations: Supervised Learning

Instructor: Peter Chen

Practical Approach to Supervised Machine Learning

  • Learn how and when to apply prediction machine learning algorithms through a series of lectures and in-depth quizzes.
  • The instructor has worked in various senior roles for Algebraix Data, Petco, and Sempra Energy. He received his bachelor's from MIT and his Master's in Software Engineering/Data Science from Harvard.

Course Description

My name is Peter Chen and I am the instructor for this course. I want to introduce you to the wonderful world of Machine Learning through practical examples and code. The course covers Supervised Learning algorithms and models in machine learning. More importantly, it gets you up and running quickly with practical, and at times funny, applications of Supervised Learning algorithms. The course includes code and sample data for you to run and learn from, and it encourages you to explore your own datasets using Supervised Learning algorithms. Prerequisites: beginner knowledge of Python and R, which are used mostly for expository purposes; you do NOT need to be a Python or R expert to understand this course. You should also be comfortable with basic math, probability, and statistics.

What am I going to get from this course?
* Understand the two major types of Supervised Machine Learning
* Know when to apply a prediction machine learning algorithm
* Know when to apply a classification machine learning algorithm
* Gain an intuition for the math behind the underlying algorithms and be able to explain it
* Learn how to use Python's scikit-learn library and R libraries to build supervised machine learning models and algorithms
* Apply Python and R code to your own data sets to solve prediction and classification problems
* Evaluate the effectiveness of your machine learning models
* Develop a taste for tinkering with models to improve results
 

Prerequisites and Target Audience

What will students need to know or do before starting this course?
Basic Python and R. You do not need to be an expert programmer; we use these languages mainly for expository purposes. Basic probability and math.
Who should take this course? Who should not?
Students who are interested in a practical introduction to supervised machine learning: less theory, more practical application that gets you up and running. You must like to play with data and code. Enthusiasm is more important than expertise.

Curriculum

Module 1: Introduction
Lecture 1 Introduction to Supervised Machine Learning
05:04

Module 2: Regressions
Lecture 2 Linear Regression
13:51

We will cover the basics of linear regression and work through example code in Python.
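As a taste of what this lecture covers, here is a minimal sketch of a simple linear regression with scikit-learn. The data set is invented for illustration (a made-up mileage-versus-price relationship), not the course's actual data.

```python
# A hypothetical toy data set: mileage (in thousands of miles) vs. price (in $1000s).
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[10], [30], [50], [70], [90]])   # one feature per row
y = np.array([28.0, 24.0, 20.0, 16.0, 12.0])   # exactly linear here

model = LinearRegression().fit(X, y)
print(model.coef_[0], model.intercept_)        # fitted slope and intercept
print(model.predict([[60]]))                   # predicted price at 60k miles
```

On this noise-free toy data the fit recovers the slope (-0.2) and intercept (30) exactly; real data will not be this clean.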

Lecture 3 Linear Regression using Statsmodels Module
08:29

Another approach using the statsmodels module, which provides more statistical tests and parameters than scikit-learn.

Quiz 1 Linear Regression

Lecture 4 Multiple Linear Regression
04:53

This lecture extends the basic concept of linear regression to include more variables, using the same car price prediction data set. The exercise encourages students to turn different variables on and off to see the effect on model performance.

Quiz 2 Multiple Linear Regression Project: Turning Features On/Off

In our lecture example, we used ALL of the independent variables/features and got an R^2 of 0.83. Try turning off some of the features and rerunning the multiple linear regression model to see what R^2 you get. Do more features/predictors always yield a better R^2, or can adding them decrease it?
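A hedged sketch of the quiz idea: fit a multiple linear regression with different feature subsets and compare R^2. The feature names and synthetic data below are invented stand-ins for the course's car price data set.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 100
mileage = rng.uniform(10, 100, n)
age = rng.uniform(1, 15, n)
noise_feature = rng.normal(size=n)             # a feature unrelated to price
price = 35 - 0.15 * mileage - 1.2 * age + rng.normal(scale=1.0, size=n)

features = {"mileage": mileage, "age": age, "noise": noise_feature}

def r2_for(names):
    # Fit on the chosen subset of features and return training R^2.
    X = np.column_stack([features[f] for f in names])
    return LinearRegression().fit(X, price).score(X, price)

for subset in (["mileage"], ["mileage", "age"], ["mileage", "age", "noise"]):
    print(subset, round(r2_for(subset), 3))
```

One caveat worth noticing: on the training data itself, plain R^2 can never decrease when you add a feature; drops show up when you look at adjusted R^2 or a held-out test set.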

Lecture 5 Multiple Linear Regression Addendum: Using the Statsmodels Module
12:01

Another approach using the statsmodels module, which gives more detailed statistical output than scikit-learn.

Lecture 6 Polynomial Regression
05:42

You'll learn a type of regression not frequently discussed: polynomial regression, and how to use it to fit nonlinear relationships.
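A minimal sketch of the idea (on assumed quadratic toy data, not the lecture's): expand the feature into powers, then fit an ordinary linear regression on the expanded features.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

x = np.linspace(-3, 3, 30).reshape(-1, 1)
y = 2.0 * x.ravel() ** 2 + 1.0                 # y = 2x^2 + 1, no noise

# degree=2 with include_bias=False expands x into the columns [x, x^2]
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(x)
model = LinearRegression().fit(X_poly, y)
print(model.score(X_poly, y))                  # R^2 on the training data
```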

Quiz 3 Polynomial Regression Project

Find a data set that exhibits quadratic behavior, run a multiple linear regression, and compare its R^2 with that of a quadratic regression. P.S. Bonus points to those who understood the math joke at the end of Lecture 4.

Lecture 7 Polynomial Regression Addendum: Using Numpy
05:14

Another approach to polynomial regression using NumPy.
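The same quadratic fit can be sketched with NumPy alone via np.polyfit (again on invented toy data, not the lecture's):

```python
import numpy as np

x = np.linspace(-3, 3, 30)
y = 2.0 * x ** 2 + 1.0                 # same toy relationship: y = 2x^2 + 1

coeffs = np.polyfit(x, y, deg=2)       # coefficients, highest power first
print(coeffs)                          # approximately [2.0, 0.0, 1.0]
print(np.polyval(coeffs, 4.0))         # evaluate the fitted polynomial at x = 4
```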

Module 3: Neural Networks
Lecture 8 Neural Networks
24:32

Neural networks are machine learning algorithms modeled after the way the brain learns. You'll learn what they are and use neural networks to predict continuous-value problems such as the earlier car price prediction, and many others.
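As a flavor of what the lecture builds, here is a minimal sketch of a small neural-network regressor using scikit-learn's MLPRegressor. The architecture and data are invented for illustration, not taken from the lecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(X.ravel())                  # a nonlinear target to learn

# Two hidden layers of 32 units each; these sizes are illustrative guesses.
net = MLPRegressor(hidden_layer_sizes=(32, 32), activation="relu",
                   max_iter=2000, random_state=0)
net.fit(X, y)
print(net.predict([[0.5]]))            # one prediction per input row
```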

Quiz 4 Project: Playing with Different Neural Network Architectures

In this assignment, try changing the number of hidden layers, the type of activation functions, and the number of input nodes to see whether that changes the effectiveness of the model's predictions. This is a more open-ended exploration of neural networks and their various components.

Module 4: Regression Trees
Lecture 9 Tree Regressions
14:08

Understand what tree regressions are and how flexible they can be for predicting the often nonlinear problems where typical linear regression can fail.

Quiz 5 Tree Regression

Play with the max_depth of the tree regression. Does increasing max_depth improve the accuracy? At what point do diminishing returns set in; that is, as you keep increasing max_depth, when does the model stop getting better? Play with that parameter to see how it behaves. Also pick another nonlinear problem you'd like to predict using tree regression.
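A sketch of the quiz idea: vary max_depth on a DecisionTreeRegressor and watch the training R^2 saturate. The data below is an invented nonlinear toy problem, not the course's.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.1, size=300)

for depth in (1, 2, 4, 8, 16):
    tree = DecisionTreeRegressor(max_depth=depth, random_state=0).fit(X, y)
    print(depth, round(tree.score(X, y), 3))   # training R^2 per depth
```

Note that the training score can only rise as depth grows; judge diminishing returns on a held-out test set to see where overfitting begins.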

Module 5: Classification
Lecture 10 Introduction to Classification
02:44

The second type of Supervised Learning is classification. This lecture introduces the concept of classification and its applications.

Quiz 6 Classification
Lecture 11 Logistic Regression: Part 1
09:34

Logistic regression, despite its name, is used for classification problems rather than prediction problems. We examine using logistic regression for binary classification of whether an email is spam or not. Armed with this knowledge and code, students can easily modify the code to classify other binary problems.
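A minimal sketch of binary classification with logistic regression. The features below (word counts and message length) are invented for illustration and are not the lecture's actual spam data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per email: [count of "free", count of "$", length/100]
X = np.array([[3, 4, 1.2], [0, 0, 0.8], [5, 2, 0.5],
              [0, 1, 2.0], [4, 5, 0.9], [1, 0, 1.5]])
y = np.array([1, 0, 1, 0, 1, 0])               # 1 = spam, 0 = not spam

clf = LogisticRegression().fit(X, y)
print(clf.predict([[4, 3, 1.0]]))              # predicted class label
print(clf.predict_proba([[4, 3, 1.0]]))        # [P(not spam), P(spam)]
```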

Quiz 7
Lecture 12 Logistic Regression: Part 2
20:47

How to measure the performance of binary classification.
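A sketch of the standard binary-classification performance measures, computed on fixed made-up predictions so the numbers are easy to check by hand:

```python
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, confusion_matrix)

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]

print(confusion_matrix(y_true, y_pred))    # rows: true class, cols: predicted
print(accuracy_score(y_true, y_pred))      # 7 of 10 correct -> 0.7
print(precision_score(y_true, y_pred))     # 3 true positives / 5 predicted positives
print(recall_score(y_true, y_pred))        # 3 true positives / 4 actual positives
```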

Quiz 8
Lecture 13 Logistic Regression: Part 3
26:14

A practical example of spam detection using logistic regression.

Lecture 14 Naive Bayesian: Introductions & Probability Review
21:03
Lecture 15 Naive Bayesian
21:10

Develop an intuitive understanding of Bayes' Rule through a worked example, and see how it can be used to classify things.
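A worked Bayes' Rule calculation in plain Python, with invented numbers (not the lecture's): what is P(spam | contains "free"), given assumed base rates and likelihoods?

```python
p_spam = 0.2                 # prior: 20% of email is spam (assumed)
p_free_given_spam = 0.6      # "free" appears in 60% of spam (assumed)
p_free_given_ham = 0.05      # ...and in 5% of legitimate mail (assumed)

# Total probability of seeing "free" at all:
p_free = p_free_given_spam * p_spam + p_free_given_ham * (1 - p_spam)

# Bayes' Rule: P(spam | free) = P(free | spam) * P(spam) / P(free)
p_spam_given_free = p_free_given_spam * p_spam / p_free
print(p_spam_given_free)     # -> 0.75
```

With these numbers the posterior is 0.12 / 0.16 = 0.75: seeing the word "free" raises the spam probability from 20% to 75%.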

Quiz 9 Naive Bayesian
Lecture 16 Classification Trees
13:08

Understand how tree models can be used for classification in addition to regression. Learn to use the GraphViz application to visualize beautiful classification trees, and apply this powerful modern machine learning algorithm to the classic Iris data set.
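A minimal sketch of a classification tree on the classic Iris data set (which ships with scikit-learn); the max_depth here is an illustrative choice. GraphViz renders prettier diagrams, but export_text gives a dependency-free text view of the same tree.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(iris.data, iris.target)

print(tree.score(iris.data, iris.target))    # accuracy on the training data
print(export_text(tree, feature_names=list(iris.feature_names)))
```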

Quiz 10 Classification Tree Quiz

Reviews

1 Review

Terry P

December, 2016
