This page provides updated information on what is covered in each lecture, along with PDFs of the handwritten material from each class.


Video recordings and all course notes are being made available on Bodhi Tree.
The course notes are also available from the course page. The calendar page for the previous offering of this course is available here.

Practice problem set: a growing list of practice problems can be found here. You can also refer to this wiki for definitions of several concepts relevant to this course, connections between those concepts through theorems, and proofs of those theorems.

It is very important that you attempt the homework problems that were posted at the end of (almost) each lecture.

You can consider solving the following problems from PRML (Pattern Recognition and Machine Learning): 1.1-1.16, 1.20-1.32, 1.35-1.41, 2.1-2.8, 2.12-2.40, 3.2-3.10, 3.12-3.20, 4.2-4.4, 4.7, 4.9, 4.10, 4.12-4.14, 4.15 (harder), 4.16-4.21, 4.24, 5.1-5.9, 5.25 (hard), 5.27, 5.28; and the following problems from Hastie et al.: 2.1, 3.2, 3.5-3.7, 3.12, 3.19, 3.21, 3.23, 3.27-3.30, 4.2-4.8, 5.16, 6.2, 6.4, 6.6, 6.9, 6.11, 8.1, 8.2, 8.5, 8.7, 10.5a, 10.8, 10.12, 11.2-11.5, 11.7, 12.1, 12.2, 12.9, 12.10, 12.11 (harder), 13.1, 14.1, 14.2, 14.23 (harder)

Date    Topics
18-07-2016
  • Introduction to Machine Learning
  • Unannotated Slides
  • Annotated Slides
  • Chapter 1 of Hastie et al.
  • Homework: Intuitively analyse the machine learning problem of handwritten digit recognition (slides 10 and 11) taking cues from the analysis on slides 8 and 9.
21-07-2016
23-07-2016 (Optional Revision Lecture): contains solutions to most problems of Tutorial 0
25-07-2016
  • Basis functions/attributes, least squares regression, and the geometrical interpretation of its solution
  • Unannotated Slides
  • Annotated Slides
  • Sections 2.2, 3.1 and 3.2 of Hastie et al.
  • Homework: Understand the concept of column space and the geometrical interpretation of the solution to the least squares regression problem. What is polynomial regression on k independent variables v1, v2, ..., vk, and how would you achieve it using linear regression?
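A minimal sketch of the homework idea above, on toy data of our own (the variables, degree and sample sizes are arbitrary choices for illustration): degree-2 polynomial regression on k = 2 independent variables, achieved by ordinary linear regression on an expanded monomial basis, with the fitted values being the orthogonal projection of y onto the column space of the design matrix.

```python
import numpy as np

# Toy data: 50 samples of (v1, v2) and a quadratic target with noise.
rng = np.random.default_rng(0)
V = rng.normal(size=(50, 2))
y = 1.0 + 2.0 * V[:, 0] - 3.0 * V[:, 1] ** 2 + 0.1 * rng.normal(size=50)

# Basis expansion: phi(v) = [1, v1, v2, v1^2, v1*v2, v2^2].
# Polynomial regression in (v1, v2) is linear regression in these features.
Phi = np.column_stack([
    np.ones(len(V)), V[:, 0], V[:, 1],
    V[:, 0] ** 2, V[:, 0] * V[:, 1], V[:, 1] ** 2,
])

# Least squares solution; Phi @ w_hat is the orthogonal projection of y
# onto the column space of Phi.
w_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Geometrical interpretation: the residual is orthogonal to every column.
residual = y - Phi @ w_hat
print(np.abs(Phi.T @ residual).max())  # numerically ~0
```

The orthogonality check is exactly the normal equations Phi^T (y - Phi w) = 0, which is the geometric picture discussed in lecture.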
28-07-2016
  • Probabilistic Interpretation of Linear Regression
  • Unannotated Slides
  • Annotated Slides
  • Homework: Complete attempting problems from Tutorial 1
  • References: Section 3.1 (specifically until Section 3.1.3) of Pattern Recognition and Machine Learning by Christopher M. Bishop, and Sections 2.2, 3.1 and 3.2 of Hastie et al.
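The probabilistic interpretation from this lecture can be sketched on a toy example of our own (the data and noise level below are arbitrary): with Gaussian noise y = Phi w + eps, eps ~ N(0, sigma^2 I), the log-likelihood of w is, up to constants, the negative squared error, so the maximum likelihood estimate of w coincides with the least squares solution.

```python
import numpy as np

# Toy data: y = 0.5 - 1.5 x plus Gaussian noise with sigma = 0.2.
rng = np.random.default_rng(1)
Phi = np.column_stack([np.ones(40), rng.normal(size=40)])
y = Phi @ np.array([0.5, -1.5]) + 0.2 * rng.normal(size=40)

# Least squares solution = maximum likelihood estimate of w.
w_ml, *_ = np.linalg.lstsq(Phi, y, rcond=None)

def neg_log_likelihood(w, sigma2=0.04):
    # -log p(y | w) = N/2 log(2 pi sigma^2) + ||y - Phi w||^2 / (2 sigma^2)
    r = y - Phi @ w
    return 0.5 * len(y) * np.log(2 * np.pi * sigma2) + r @ r / (2 * sigma2)

# Perturbing the least squares solution can only increase the NLL.
print(neg_log_likelihood(w_ml) <= neg_log_likelihood(w_ml + 0.05))
```

Since the NLL depends on w only through the squared error, its minimizer over w is the least squares solution regardless of sigma^2.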
01-08-2016
04-08-2016
  • Conjugate Prior for (Multivariate) Gaussian, Posterior for (Multivariate) Gaussian, Bayesian Linear Regression
  • Unannotated Slides
  • Annotated Slides
  • Homework: Complete attempting problems from Tutorial 2. Also complete the derivation of the MAP estimate for Bayesian Linear Regression (last slide of the Annotated Slides)
  • References: Sections 2.1, 2.1.1 and 3.1.4 of Pattern Recognition and Machine Learning by Christopher M. Bishop, and Section 3.4.3 of Hastie et al.
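The MAP derivation from the homework can be checked numerically; below is a sketch in the notation of Bishop Section 3.1.4, with prior w ~ N(0, alpha^{-1} I) and noise precision beta (the values of alpha, beta and the data are arbitrary choices for illustration). Maximizing the posterior gives w_MAP = (Phi^T Phi + (alpha/beta) I)^{-1} Phi^T y, i.e. ridge regression with lambda = alpha/beta.

```python
import numpy as np

rng = np.random.default_rng(2)
Phi = np.column_stack([np.ones(30), rng.normal(size=30)])
y = Phi @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=30)

alpha, beta = 1.0, 100.0          # assumed prior / noise precisions
lam = alpha / beta

# MAP estimate = ridge regression solution with lambda = alpha / beta.
w_map = np.linalg.solve(Phi.T @ Phi + lam * np.eye(2), Phi.T @ y)

# Sanity check: the gradient of the negative log posterior,
#   beta * Phi^T (Phi w - y) + alpha * w,
# should vanish at w_MAP.
grad = beta * Phi.T @ (Phi @ w_map - y) + alpha * w_map
print(np.abs(grad).max())  # numerically ~0
```

As alpha -> 0 (a flat prior) the penalty vanishes and w_MAP reduces to the maximum likelihood / least squares solution.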
08-08-2016
18-08-2016
22-08-2016
25-08-2016
29-08-2016
01-09-2016
15-09-2016
20-09-2016
22-09-2016
  • Convergence of Perceptron Update Algorithm, Logistic Regression (Sigmoid Perceptron), From Maximum Likelihood to Minimum Cross Entropy
  • Unannotated Slides
  • Annotated Slides
  • Homework: Complete attempting problems from Tutorial 6. Derive the cross entropy minimization formulation for logistic regression parameter estimation from the maximum likelihood formulation
  • Reference: Sections 4.5.1, 4.4 and 11.4 of Hastie et al.
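The maximum likelihood / cross entropy equivalence from the homework can be verified numerically on toy numbers of our own choosing: for labels t_n in {0, 1} and sigmoid outputs p_n = sigma(w^T x_n), the Bernoulli likelihood of the data satisfies -log(likelihood) = cross entropy, so maximizing the likelihood is the same as minimizing the cross entropy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data and an arbitrary weight vector w (not a fitted model).
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(20), rng.normal(size=20)])
t = (rng.random(20) < 0.5).astype(float)
w = rng.normal(size=2)

p = sigmoid(X @ w)

# Bernoulli likelihood: prod_n p_n^{t_n} (1 - p_n)^{1 - t_n}
likelihood = np.prod(p ** t * (1 - p) ** (1 - t))

# Cross entropy: -sum_n [ t_n log p_n + (1 - t_n) log(1 - p_n) ]
cross_entropy = -np.sum(t * np.log(p) + (1 - t) * np.log(1 - p))

# The two objectives agree: -log(likelihood) == cross entropy.
print(np.isclose(-np.log(likelihood), cross_entropy))
```

Taking the log turns the product over data points into a sum, which is exactly the step that converts the likelihood into the (negated) cross entropy objective.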
26-09-2016
29-09-2016
03-10-2016
06-10-2016
10-10-2016
13-10-2016
17-10-2016
20-10-2016
24-10-2016
27-10-2016
03-11-2016