CS 337 and CS 335:
Artificial Intelligence and Machine Learning
This page serves as the primary resource for CS 337 (Artificial Intelligence and Machine Learning) and CS 335 (Artificial Intelligence and Machine Learning Lab).
Office: Room 220, New CSE Building
Vishwajeet Singh (E-mail: firstname.lastname@example.org).
Huzefa Chasmai (E-mail: email@example.com).
Rishabh Shah (E-mail: firstname.lastname@example.org).
Abhinav Goyal (E-mail: email@example.com).
Manas Bhargava (E-mail: firstname.lastname@example.org).
Ashwini Pale (E-mail: email@example.com).
Rakesh Khobragade (E-mail: firstname.lastname@example.org).
Dhananjay Kumar Sharma (E-mail: email@example.com).
Lectures will be held in LA 201 in Slot 6: 11.05 a.m. – 12.30
p.m. Wednesdays and Fridays. Lab sessions will be held in Software Lab
2, New CSE Building, during Slot L4: 2.00 p.m. – 4.55
p.m. Fridays. A portion of the lab sessions will be used for lectures
and assessments; the schedule will be announced in class
beforehand. If a lecture or test is scheduled during the lab session,
it will be held 2.00 p.m. – 3.25 p.m. in LH 301.
The instructor's office hours will immediately follow class
lectures and labs. Meetings can also be arranged by appointment.
Artificial Intelligence (AI) surrounds us today: in phones that
respond to voice commands, programs that beat humans at Chess and Go,
robots that assist surgeries, vehicles that drive in urban traffic,
and systems that recommend products to customers on e-commerce
platforms. This course aims to familiarise students with the breadth
of modern AI, to impart an understanding of the dramatic surge of AI
in the last decade, and to foster an appreciation for the distinctive
role that AI can play in shaping the future of our society. The
resurgence of AI has been facilitated in large part by the field of
machine learning (ML), whose essential elements will be introduced as
a part of this course.
The course will provide a historical perspective of the field of AI
and discuss its foundations in search, knowledge representation and
reasoning, and machine learning. A small selection of specialised
topics will also be taken up; these could include, for example, speech
and natural language processing, robotics, crowdsourcing, computer
vision, and multiagent systems. The theory and lab components will
proceed in step to equip students with the knowledge and skills to
design and apply solutions based on AI and ML.
Students interested in gaining more depth are encouraged to follow
this basic course with advanced ones on topics such as machine
learning, information retrieval and data mining, sequential decision
making, robotics, speech and natural language processing, computer
vision, and game theory.
CS 337 and CS 335 are core courses in the CSE undergraduate
programme. They can only be taken by CSE B.Tech. students in their
third (or higher) year. Other students are welcome to sit in on
lectures in CS 337, but may not formally register (whether for credit
or for audit) for either course.
CS 337 will have four class tests (each 20 marks) and an
end-semester examination (40 marks). The best three scores from the
class tests will contribute 60 marks towards the final grade; the
end-semester examination will contribute 40 marks towards the final
grade. Grades for CS 335 will be decided based on 8–10 lab
assignments, each worth 10–15 marks.
Students are expected to adhere to the highest standards of
integrity and academic honesty. Acts such as copying in the
examinations and sharing code or consulting on-line solutions for the
lab assignments will be dealt with strictly, in accordance with the
prescribed actions for academic malpractice.
Texts and References
Artificial Intelligence: A Modern Approach, Stuart J. Russell and
Peter Norvig, 3rd edition, Prentice Hall, 2010.
The Elements of Statistical Learning, Trevor Hastie, Robert
Tibshirani, and Jerome Friedman, 2nd edition, Springer, 2009.
This page will serve as the primary source of information regarding
CS 337 and CS 335, their schedules, and related announcements. The
Moodle pages for these courses will be used for sharing additional
resources for the lectures and assignments, and for related
record-keeping. E-mail is the best means of communicating with the
instructor; students must send e-mail with "[CS337]" in the subject
line, with a copy marked to the TAs.
January 4: Welcome; Introduction to the course; AI: past,
present, and future.
Reading: Chapter 1, Russell and Norvig
January 23: Learning.
Reading: Class Note 1.
Summary: Linear separability; Perceptron; Perceptron Learning Algorithm; Proof of convergence.
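The perceptron update covered above can be sketched in a few lines of Python. This is an illustrative sketch, not course-provided code; the function names and data layout are ours.

```python
# Minimal perceptron learning sketch: mistake-driven weight updates.
def perceptron_train(X, y, epochs=100):
    """X: list of feature tuples; y: labels in {-1, +1}.
    Returns weights w, with the bias stored as w[0]."""
    n = len(X[0])
    w = [0.0] * (n + 1)                        # w[0] is the bias term
    for _ in range(epochs):
        updated = False
        for x, label in zip(X, y):
            xa = (1.0,) + tuple(x)             # augment with bias input
            activation = sum(wi * xi for wi, xi in zip(w, xa))
            if label * activation <= 0:        # misclassified (or on boundary)
                w = [wi + label * xi for wi, xi in zip(w, xa)]
                updated = True
        if not updated:                        # no mistakes: converged
            break
    return w

def perceptron_predict(w, x):
    xa = (1.0,) + tuple(x)
    return 1 if sum(wi * xi for wi, xi in zip(w, xa)) > 0 else -1
```

On linearly separable data the loop is guaranteed to terminate, which is exactly the convergence result proved in class.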
January 25 (Morning): Learning.
Reading: Section 18.7, Russell and Norvig (2010).
Summary: Artificial neurons; Neural networks as universal function approximators and as parameterised functions.
January 25 (Afternoon): Learning.
Summary: Gradient descent; Calculation of error gradient.
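As a toy illustration of gradient descent (our own sketch, not course code), minimising f(x) = (x - 3)^2 using its gradient 2(x - 3):

```python
# Gradient descent sketch: repeatedly step against the gradient.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """grad: function returning the derivative at x; lr: learning rate."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)                  # x <- x - lr * f'(x)
    return x

# Example: minimise f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this learning rate each step shrinks the distance to the minimiser by a constant factor, so the iterates converge to x = 3.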
January 30: Learning.
Summary: Backpropagation algorithm; Stochastic gradient descent; Overfitting; Cross-validation.
February 1 (Morning): Learning.
Summary: Filtering operations on images; Convolutional Neural Networks; Convolutional layer; Max-pooling layer.
References: Filtering examples from AI shack, Notes from CS 231N at Stanford University, Notes from DeepGrid.
February 1 (Afternoon): Learning.
Reading: Section 18.3, Russell and Norvig (2010).
Summary: Decision trees.
February 6: Learning.
Reading: Wikipedia page on ROC Curve, Section 8.7, Hastie and Tibshirani (2009), Section 18.10, Russell and Norvig (2010).
Summary: Evaluation of learning algorithms (FPR, Precision, Recall, F1 score, ROC curve, AUC); Ensemble methods; Bagging.
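The evaluation metrics listed above follow directly from the confusion-matrix counts; a small sketch (ours, with made-up labels in the test) makes the definitions concrete:

```python
# Precision, recall, and F1 from true and predicted binary labels.
def prf1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0   # of predicted positives
    recall = tp / (tp + fn) if tp + fn else 0.0      # of actual positives
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)            # harmonic mean
    return precision, recall, f1
```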
February 8 (Morning): Learning.
Summary: Confusion matrix; Converting a weighted data set to an unweighted one; AdaBoost.
February 8 (Afternoon): Class Test 1.
February 13: Learning.
Reading: Section 18.6, Russell and Norvig (2010).
Summary: Linear regression; Regularisation; Ridge regression; Lasso.
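Ridge regression has the closed form w = (XᵀX + λI)⁻¹Xᵀy; a minimal NumPy sketch (ours, not course code):

```python
import numpy as np

# Closed-form ridge regression: w = (X^T X + lam * I)^(-1) X^T y.
def ridge_fit(X, y, lam=1.0):
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    # Solve the linear system rather than forming an explicit inverse.
    return np.linalg.solve(A, X.T @ y)
```

Setting lam = 0 recovers ordinary least squares; increasing lam shrinks the weights towards zero.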
February 15 (Morning): Learning.
Reading: Andrew Zisserman's note on logistic regression (ignore sections 2, 3, and 4); Section 18.6.4, Russell and Norvig (2010).
Summary: Convex functions; Logistic regression.
Reference: Wikipedia page on convex functions.
February 15 (Afternoon): Learning.
Reading: Class Note 2.
Summary: k-means clustering problem; k-means clustering algorithm; Proof of convergence.
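The alternation between assignment and update steps in the k-means algorithm can be sketched as follows (a 1-D illustration of our own; the course treats general vectors):

```python
import random

# Lloyd's k-means sketch on 1-D points.
def kmeans_1d(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)            # initialise from the data
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: (p - centers[i]) ** 2)
            clusters[j].append(p)
        # Update step: move each centre to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)
```

Each step can only decrease the total squared distance to the centres, which is the quantity used in the convergence proof from Class Note 2.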
February 20: Learning.
Reading: Section 18.8, Russell and Norvig (2010).
Summary: Instance-based methods.
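Instance-based methods defer all work to query time; a 1-nearest-neighbour sketch (ours, not course code) shows how little "training" is involved:

```python
# 1-nearest-neighbour classifier: "training" is just storing the data.
def nn_classify(train, query):
    """train: list of (feature_tuple, label) pairs.
    Returns the label of the stored point closest to the query."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(train, key=lambda item: dist2(item[0], query))[1]
```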
February 25: Class Test 2.
March 1 (Morning): Planning and Learning.
Reading: Slides (Section 3 not in syllabus).
Summary: Markov Decision Problems; Policies; Value functions; Bellman's Equations; Bellman's Optimality Equations.
March 1 (Afternoon): Planning and Learning.
Summary: Value Iteration; Planning and learning; Q-learning; Deep RL.
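Value Iteration repeatedly applies Bellman's Optimality Equations until the values stop changing; a small sketch of ours, with a made-up dictionary encoding of the MDP:

```python
# Value-iteration sketch. T[s][a] is a list of
# (probability, next_state, reward) triples; terminal states map to {}.
def value_iteration(T, gamma=0.9, tol=1e-8):
    V = {s: 0.0 for s in T}
    while True:
        delta = 0.0
        for s in T:
            if not T[s]:                       # terminal state: value stays 0
                continue
            # Bellman optimality backup: max over actions of expected return.
            best = max(sum(p * (r + gamma * V[s2]) for p, s2, r in T[s][a])
                       for a in T[s])
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:                        # converged to the fixed point
            return V
```

For a state with a deterministic self-loop of reward 1, the fixed point is 1 / (1 - gamma), matching the geometric-series argument from the lecture.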
March 6: Learning.
Reading: Sections 12, 12.1, 12.2, Hastie and Tibshirani (2009); Section 18.9, Russell and Norvig (2010).
Summary: Maximum-margin separator; Support Vector Machines; Kernel trick.
March 8: Search.
Reading: Chapter 3, Russell and Norvig (2010).
Summary: Illustrative search problems; Search problem
instances; Search tree; BFS, DFS, UCS.
March 13: Search.
Reading: Pieter Abbeel's illustration of A* search.
Summary: Completeness, optimality, time and memory requirements of BFS, DFS, UCS; Iterative deepening; Heuristics; A* search; Admissibility and consistency of heuristics.
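A* search on a grid with the Manhattan-distance heuristic (admissible and consistent for unit step costs) can be sketched as follows; this is our illustration, not course code:

```python
import heapq

# A* search sketch on a 4-connected grid with unit step costs.
def astar(start, goal, walls, width, height):
    def h(p):                                  # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start)]          # entries are (f = g + h, g, node)
    best_g = {start: 0}
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if node == goal:
            return g                           # cost of a cheapest path
        if g > best_g.get(node, float("inf")):
            continue                           # stale queue entry; skip it
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in walls:
                ng = g + 1
                if ng < best_g.get((nx, ny), float("inf")):
                    best_g[(nx, ny)] = ng
                    heapq.heappush(frontier, (ng + h((nx, ny)), ng, (nx, ny)))
    return None                                # goal unreachable
```

Because the heuristic is consistent, the first time the goal is popped its g-value is optimal.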
March 15 (Morning): Search.
Reading: Sections 5.1–5.5, Russell and Norvig (2010).
Summary: Game trees; Minimax search; Alpha-beta pruning.
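Minimax with alpha-beta pruning can be sketched over an explicit game tree; in this illustration of ours, a leaf is a number (MAX's utility) and an internal node is a list of children:

```python
# Alpha-beta pruning sketch over a game tree given as nested lists.
def alphabeta(node, maximizing=True, alpha=float("-inf"), beta=float("inf")):
    if not isinstance(node, list):             # leaf: return its utility
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break                          # prune remaining children
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break                          # prune remaining children
        return value
```

Pruning never changes the value at the root; it only skips subtrees that cannot affect it.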
March 15 (Afternoon): Search.
Reading: Pieter Abbeel's illustration of alpha-beta pruning.
Summary: Illustration of alpha-beta pruning; Chance nodes and expectiminimax search; Evaluation functions; Lookup tables.
March 20: Discussion on absenteeism (not in syllabus).
March 22: Search.
Reading: Section 4.1, Russell and Norvig (2010).
Summary: Traveling Salesperson Problem; Local search; Hill climbing; Genetic algorithms; Optimising soccer agents with evolutionary algorithms.
Reference: Urieli et al., 2011
March 27: Class Test 3.
March 29 (Morning): Probabilistic reasoning.
Reading: Chapter 13, Russell and Norvig (2010).
Summary: Limitations of Boolean logic in representing knowledge;
Requirement of consistency of beliefs with the axioms of
probability; Random variables; Joint distributions;
Marginalisation; Conditional probabilities.
March 29 (Afternoon): Probabilistic reasoning.
Summary: Bayes' rule; Independence; Conditional independence;
Introduction to Bayes Nets.
April 3: Probabilistic reasoning.
Reading: Sections 14, 14.1, 14.2, 14.4, 14.4.1, Russell and Norvig (2010); Class Note 3.
Summary: Semantics of Bayes Nets; Computing joint probabilities; Bayesian inference.
April 5 (Morning): Probabilistic reasoning.
Reading: Section 14.5, Russell and Norvig (2010).
Summary: Dynamic Bayes Nets; Random sampling; Rejection sampling; Sampling in Bayes Nets.
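Rejection sampling on a tiny two-node Bayes net A → B can be sketched as follows. The network and its CPTs here are made up for illustration; this is not course code.

```python
import random

# Rejection-sampling sketch for A -> B with (made-up) CPTs:
# P(A=1) = 0.4;  P(B=1 | A=1) = 0.9;  P(B=1 | A=0) = 0.2.
# Estimate P(A=1 | B=1) by sampling and discarding samples with B=0.
def estimate_a_given_b(n_samples=200_000, seed=0):
    rng = random.Random(seed)
    accepted = hits = 0
    for _ in range(n_samples):
        a = 1 if rng.random() < 0.4 else 0            # sample A from its prior
        p_b = 0.9 if a == 1 else 0.2
        b = 1 if rng.random() < p_b else 0            # sample B given A
        if b != 1:
            continue                                  # reject: evidence unmet
        accepted += 1
        hits += a
    return hits / accepted
```

The exact posterior by Bayes' rule is 0.36 / 0.48 = 0.75, and the estimate concentrates around it; note how many samples are wasted when the evidence is unlikely, which motivates likelihood weighting.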
April 5 (Afternoon): Probabilistic reasoning.
Summary: Likelihood weighting; Gibbs sampling; Markov blanket.
April 10: Probabilistic reasoning.
Reading: Pieter Abbeel's lecture on conditional independence and D-separation; Section 15.5.3, Russell and Norvig (2010).
Summary: D-separation; Particle filtering.
April 12: Probabilistic reasoning; Overview of topics in AI/ML.
Summary: Derivation of consistency of likelihood weighting,
particle filtering; List of topics in AI/ML not covered in this course.
April 15: Class Test 4.
April 28: End-semester Examination.
Lab Assignments and Schedule
Lab Assignments will be published on Tuesdays, and will be due
for submission by 11.55 p.m. the following Monday. It is expected
that students will attend the Friday lab sessions after reading the
published statement and making at least partial progress towards the
solution. The instructor and TAs will provide guidance as required
during the lab session.
Submissions must be uploaded to Moodle in the format specified. If
a submission is not made by the associated deadline, a "carry-over"
will be counted against the assignment. Assignments that are
carried over will only be evaluated after the student attends a
session with a TA or the instructor to explain their submission and
demonstrate its working. A special lab session will be announced to
evaluate carry-over assignments.
A student may carry over up to two lab assignments without any
penalty. A third carry-over will incur a penalty of 2 marks; a fourth
carry-over will incur a penalty of 4 marks; subsequent carry-overs
will incur a penalty of 6 marks.
Below is the schedule for lab assignments.
January 25: Welcome; Introduction to the course.
Lab Assignment 1 (12 marks). Published January 29, 2019. Due 11.55 p.m., February 4, 2019.
February 1: Lab Assignment 1.
Lab Assignment 2 (15 marks). Published February 5, 2019. Due 11.55 p.m., February 11, 2019.
February 8: Lab Assignment 2.
Lab Assignment 3 (10 marks). Published February 12, 2019. Due 11.55 p.m., February 18, 2019.
February 15: Lab Assignment 3.
Lab Assignment 4 (15 marks). Published February 19, 2019. Due 11.55 p.m., March 4, 2019.
March 1: Lab Assignment 4.
March 8: Carry-over session.
Lab Assignment 5 (10 marks). Published March 12, 2019. Due 11.55 p.m., March 18, 2019.
March 15: Lab Assignment 5.
Lab Assignment 6 (12 marks). Published March 19, 2019. Due 11.55 p.m., April 1, 2019.
March 22: Lab Assignment 6.
Lab Assignment 7 (15 marks). Published March 26, 2019. Due 11.55 p.m., April 1, 2019.
March 29: Lab Assignment 7.
Lab Assignment 8 (11 marks). Published April 2, 2019. Due 11.55 p.m., April 8, 2019.
April 5: Lab Assignment 8.
April 12: Carry-over session.