Application for the next batch of the Diploma Program is now open.

Machine Learning Techniques

To introduce the main methods and models used in machine learning problems of regression, classification and clustering. To study the properties of these models and methods and learn about their suitability for different problems.

by Ashish Tendulkar

Course ID: BSCS2007

Course Credits: 4

Course Type: Data Science

Recommended Pre-requisites: None

Recommended Co-requisites: BSCS2004 -  Machine Learning Foundations

What you’ll learn

Demonstrating an in-depth understanding of machine learning algorithms: model, objective or loss function, optimization algorithm, and evaluation criteria.
Tweaking machine learning algorithms based on the outcome of experiments: what steps to take in cases of underfitting and overfitting.
Choosing among multiple algorithms for a given task.
Developing an understanding of unsupervised learning techniques.
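The underfitting/overfitting outcome above can be made concrete with a small sketch (not part of the course material; all function names here are illustrative). Fitting polynomials of increasing degree to noisy samples of a sine curve shows the classic pattern: a low-degree model underfits (high error everywhere), while a high-degree model drives training error down at the cost of generalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = sin(2*pi*x) + Gaussian noise
x_train = rng.uniform(0, 1, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 30)
x_test = rng.uniform(0, 1, 30)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 30)

def poly_mse(degree):
    """Fit a polynomial of the given degree by least squares and
    return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for d in (1, 3, 9):
    tr, te = poly_mse(d)
    print(f"degree {d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Raising the degree never increases training error on the same data, which is exactly why training error alone cannot diagnose overfitting; comparing against held-out error, as above, can.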

Course structure & Assessments

12 weeks of coursework, weekly online assignments, 2 in-person invigilated quizzes, and 1 in-person invigilated end-term exam. For details of the standard course structure and assessments, visit the Academics page.

WEEK 1 Introduction to machine learning; Supervised vs unsupervised, batch vs online, instance-based vs model-based; Problems - regression, classification, clustering; Challenges
WEEK 2 Models of regression; Linear regression - least squares; Polynomial regression - learning curves; Regularized linear models - Ridge, LASSO
WEEK 3 Models of regression (contd.); Linear regression - least squares; Polynomial regression - learning curves; Regularized linear models - Ridge, LASSO
WEEK 4 Models of classification; Discriminant functions and decision boundaries - two classes, multiple classes, least squares, perceptron; Probabilistic generative and discriminative models - ML, Naive Bayes, exponential family, logistic regression
WEEK 5 Models of classification (contd.); Discriminant functions and decision boundaries - two classes, multiple classes, least squares, perceptron; Probabilistic generative and discriminative models - ML, Naive Bayes, exponential family, logistic regression
WEEK 6 Models of classification (contd.); Discriminant functions and decision boundaries - two classes, multiple classes, least squares, perceptron; Probabilistic generative and discriminative models - ML, Naive Bayes, exponential family, logistic regression
WEEK 7 Models of classification; Nearest Neighbours - regression and classification problems
WEEK 8 Support Vector Machines; Linear SVM - soft margin classification; Nonlinear SVM - kernels
WEEK 9 Decision Trees, Ensemble Methods and Random Forests; Training decision trees, making predictions; Bagging, Boosting
WEEK 10 Decision Trees, Ensemble Methods and Random Forests (contd.); Training decision trees, making predictions; Bagging, Boosting
WEEK 11 Clustering; k-Means - algorithm, demo and how to select k; HAC (hierarchical agglomerative clustering)
WEEK 12 Neural networks; Multi-layer perceptron, activation functions; Training - SGD and back propagation; Hyperparameters - number of layers, neurons, activation functions; Note: additional material will be provided for neural networks and clustering.
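As a taste of the regularized linear models covered in the regression weeks, here is a minimal ridge-regression sketch using the closed-form regularized normal equations, w = (XᵀX + αI)⁻¹Xᵀy. This is illustrative only (the names ridge_fit and ridge_predict are not from the course), and for simplicity it penalizes the intercept along with the weights.

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Solve the regularized normal equations
    (X1^T X1 + alpha * I) w = X1^T y,
    where X1 is X with a prepended column of ones for the intercept."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])
    A = X1.T @ X1 + alpha * np.eye(X1.shape[1])
    return np.linalg.solve(A, X1.T @ y)

def ridge_predict(w, X):
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])
    return X1 @ w

# Noise-free linear data y = 2x + 1: with a tiny alpha the fit
# essentially reduces to ordinary least squares and recovers it.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2 * X.ravel() + 1
w = ridge_fit(X, y, alpha=1e-8)
print(w)  # close to [1., 2.] (intercept, slope)
```

Larger values of alpha shrink the weights toward zero, trading a little bias for lower variance, which is the mechanism behind the Ridge material in weeks 2-3.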

Prescribed Books

The following are the suggested books for the course:

Pattern Classification by Richard O. Duda, Peter E. Hart, and David G. Stork

Pattern Recognition and Machine Learning by Christopher M. Bishop

The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani, and Jerome Friedman

About the Instructors

Ashish Tendulkar
Research Software Engineer, Google AI, Google

Dr. Ashish Tendulkar is a researcher with Google Research, Bangalore. He holds a Master's degree and a PhD from IIT Bombay. Before his current position, he was an Assistant Professor at IIT Madras and Head of Data Sciences at Persistent Systems, Pune. Ashish is passionate about teaching ML and writing AI-related content in Indian languages.

Other courses by the same instructor: BSCS2008 - Machine Learning Practice