This page provides information on what is covered in each lecture, along with PDFs of the handwritten material from each class. It will be updated as the class progresses.

The course notes for the Spring 2015 offering of this course are available here. The course notes for the Autumn 2013 offering are available here.

The course notes are available from the course page. You can check out this link for a growing list of resource material (slides/videos/code/lecture notes, etc.) on optimisation in general, and convex optimisation in particular. Please let me know if you come across useful resource material that I could add to this list.
You can check out this link for a growing list of papers that we plan to cover in the assignments/projects or sometimes through discussions in the class. Please start looking at these papers for your seminar and/or projects.

Date Summary


Practice problem set: Please find here a growing list of practice problems, including some previous years' question papers and their solutions. In particular, do look at this set. You can also refer to this wiki for definitions of several concepts relevant to this course, connections between those concepts through theorems, and proofs of those theorems.

It is very important that you attempt the homework problems posted at the end of (almost) every lecture.

You can consider solving the following problems from Boyd (solutions to a few of them have been discussed in class or on Moodle): (A) 2.1 to 2.37 and 2.39. (B) 3.1 to 3.35 (questions involving quasi-convexity are optional) and 3.47 to 3.56. (C) 4.1 to 4.6, 4.8 to 4.29, 4.38 to 4.49, and 4.56, 4.61, 4.63. (D) 5.1, 5.2, 5.4, 5.5, 5.7, 5.9, 5.10 to 5.13, 5.14 to 5.20, 5.21 to 5.23, 5.26 to 5.31, 5.39, and 5.41 to 5.43. (E) 9.1 to 9.11 and 9.23 to 9.32 (9.12 to 9.22 are optional). (F) 11.1 to 11.12 and 11.15 to 11.18.


04.01.2018

  • Annotated classnotes and Non-annotated version
  • Reference: Section 4.1.1 (pages 213 to 214) of these notes.
  • You can refer to Linear Algebra Done Right for brushing up/reading up on linear algebra.
  • Suggested reading before next class: Calculus brush-up notes on principles for optimizing univariate functions
  • Suggested reading before next class: Linear Algebra Done Right for brushing up on linear algebra
  • Explain the optimization problem on slide 16 of the notes.
  • Present a geometric interpretation of the solution to the least squares optimization problem on slide 12 of the notes.
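
The geometric interpretation asked for above can be checked numerically: the least-squares residual is orthogonal to the column space of the matrix. A minimal sketch (the matrix A and vector b below are arbitrary illustrative data, not from the notes):

```python
import numpy as np

# Least squares: minimize ||Ax - b||^2 over x, with hypothetical data.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))
b = rng.standard_normal(5)

# Closed-form solution via the normal equations: A^T A x = A^T b.
x_star = np.linalg.solve(A.T @ A, A.T @ b)

# Geometric interpretation: the residual b - A x* is orthogonal to every
# column of A, i.e. A x* is the orthogonal projection of b onto range(A).
residual = b - A @ x_star
print(np.allclose(A.T @ residual, 0))  # True
```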

08.01.2018

  • Annotated classnotes and Non-annotated version
  • Reference: Refer to Section 4.1.3 (pages 216 to 231) of these notes.
  • Suggested reading before next class (part remaining from the previous class): pages 26 to 50 of the Calculus brush-up notes on principles for optimizing univariate functions
  • Prove the necessary and sufficient first-order conditions for (strictly) increasing functions of a single variable, and also characterize their local and global maxima and minima. Deadline: January 11, 2018.
  • What are the necessary and sufficient second-order conditions for (strictly) increasing functions of a single variable? Also characterize their local and global maxima and minima. Deadline: January 11, 2018.

11.01.2018

  • Annotated classnotes and Non-annotated version
  • Reference: Refer to Section 4.1.3 (pages 216 to 231) of these notes.
  • Mandatory Problem Solving: Before next class, solve the exercises from page 31 (slide 38) to page 53 of these notes, applying what was taught in the previous class. Deadline: January 15, 2018.
  • A more compact proof of the sufficiency of the slopeless interpretation of convexity, which assumes some comfort with left and right derivatives, is here.
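
The slopeless (first-order) characterization of convexity says a differentiable f is convex iff its graph lies above every tangent line. A small numerical sketch for the illustrative choice f(x) = x²:

```python
import numpy as np

# First-order characterization of convexity for f(x) = x^2:
# f(y) >= f(x) + f'(x) * (y - x) for all x, y.
f = lambda x: x**2
fprime = lambda x: 2 * x

# Check the inequality on a grid (small tolerance for floating point).
xs = np.linspace(-3, 3, 61)
ok = all(f(y) >= f(x) + fprime(x) * (y - x) - 1e-12
         for x in xs for y in xs)
print(ok)  # True
```

For x², the gap f(y) − f(x) − f′(x)(y − x) is exactly (y − x)², which makes the inequality transparent.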

15.01.2018

  • Annotated classnotes part 1 and Annotated classnotes part 2
  • Advanced Reference on maxima and minima: Stories About Maxima and Minima (Mathematical World) by Vladimir M. Tikhomirov
  • Advanced Reference on analysis for optimization: Variational Analysis by Rockafellar, R. Tyrrell, Wets, Roger J.-B.
  • Advanced Reference on convex analysis, spaces etc: Fundamentals of Convex Analysis, by Hiriart-Urruty, Jean-Baptiste, Lemarechal, Claude
  • Mandatory Problem Solving: Identify a topological space that is NOT a metric space, referring to slides 9, 10 and 11 of these notes. Deadline: January 17, 2018.
  • Mandatory Problem Solving: Identify a metric space that is NOT a normed space, referring to slides 10-13 of these notes. Deadline: January 17, 2018.
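
As a hint toward the second problem, a standard example (my illustration, not from the slides) is the discrete metric, which satisfies all metric axioms but fails the absolute homogeneity that any norm-induced metric must have:

```python
# The discrete metric d(x, y) = 0 if x == y else 1 is a valid metric on R,
# but it cannot come from a norm: a norm-induced metric must satisfy
# d(a*x, a*y) = |a| * d(x, y) (absolute homogeneity), which fails here.
def d(x, y):
    return 0.0 if x == y else 1.0

# Metric axioms hold on sample points...
assert d(1, 1) == 0 and d(1, 2) == d(2, 1) == 1
assert d(1, 3) <= d(1, 2) + d(2, 3)      # triangle inequality

# ...but homogeneity fails: scaling points by 5 should scale distances by 5.
print(d(5 * 1, 5 * 2) == 5 * d(1, 2))  # False: 1 vs 5
```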

18.01.2018

  • Annotated classnotes
  • Advanced Reference on topology, normed spaces, open sets, continuity etc: Geometry I - Basic ideas and concepts of differential geometry by R.V. Gamkrelidze, E. Primrose, D.V. Alekseevskij, V.V. Lychagin, A.M. Vinogradov
  • Mandatory Problem Solving: Provide an intuitive explanation of why, for norms other than the 2-norm, there is no notion of an inner product, referring to slide 23 of these notes. Deadline: January 22, 2018.
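
One concrete way into this problem (my illustration; the vectors below are arbitrary): any norm induced by an inner product must satisfy the parallelogram law, and among the p-norms only p = 2 does:

```python
import numpy as np

# An inner product induces a norm satisfying the parallelogram law:
#   ||x + y||^2 + ||x - y||^2 = 2||x||^2 + 2||y||^2.
# The 2-norm satisfies it; the 1-norm (for example) does not, which is
# one way to see that the 1-norm cannot come from an inner product.
x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def parallelogram_gap(p):
    n = lambda v: np.linalg.norm(v, p)
    return n(x + y)**2 + n(x - y)**2 - 2*n(x)**2 - 2*n(y)**2

print(abs(parallelogram_gap(2)) < 1e-12)   # True: 2-norm passes
print(abs(parallelogram_gap(1)) < 1e-12)   # False: 1-norm fails (gap is 4)
```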

22.01.2018

  • Annotated classnotes
  • Reference: Refer to Sections 3.7 until 3.9.4 (pages 175 to 187) of these notes.
  • More basic Reference: Refer to Sections 3.1 until 3.6 (pages 145 to 175) of these notes.
  • Advanced Reference on topology, normed spaces, open sets, continuity etc: Geometry I - Basic ideas and concepts of differential geometry by R.V. Gamkrelidze, E. Primrose, D.V. Alekseevskij, V.V. Lychagin, A.M. Vinogradov
  • Chapter 2 (Section 2.1) of Convex Optimization by Boyd and Vandenberghe
  • Mandatory Problem Solving: Prove that the inner product defined on pages 1 and 2 of these class notes can be expressed as a (Euclidean) dot product of two vectors in a possibly rotated space (see page 47 of the same notes). Deadline: January 25, 2018.
  • Mandatory Problem Solving: Present a hierarchical diagram depicting the different spaces described so far and their relationships in terms of specialization or generalization. Deadline: January 25, 2018.
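
A numerical hint for the first problem (my sketch; the matrix A below is an arbitrary symmetric positive-definite choice): an inner product of the form xᵀAy equals a Euclidean dot product after a linear change of coordinates, obtainable from the Cholesky factorization A = LLᵀ:

```python
import numpy as np

# For a symmetric positive-definite A, <x, y> = x^T A y is an inner product.
# With A = L L^T (Cholesky), it equals the Euclidean dot product of the
# transformed vectors L^T x and L^T y.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])           # hypothetical SPD matrix
L = np.linalg.cholesky(A)            # A == L @ L.T

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

lhs = x @ A @ y                      # inner product in the original space
rhs = (L.T @ x) @ (L.T @ y)          # Euclidean dot product after transform
print(np.isclose(lhs, rhs))  # True
```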

25.01.2018


29.01.2018


01.02.2018


05.02.2018


08.02.2018


12.02.2018


15.02.2018


19.02.2018


22.02.2018


05.03.2018


12.03.2018


15.03.2018

  • Annotated classnotes
  • Reference: Joint Wolfe conditions for line search described around equations (4.90) and (4.91) of the basic class notes
  • Reference: Boyd and Vandenberghe Sections 9.3.1 and Nocedal and Wright Section A.2 (subsection "Rates of Convergence"). Extra advanced reading: Nemirovski Section 5.4.2
  • Mandatory Problem Solving: Show that if f is itself quadratic, exact ray search gives an optimal solution to the minimization of f; see page 6, slide 104 of these notes. Deadline: March 19, 2018.
  • Mandatory Problem Solving: Explain why the step t corresponding to the next iterate in orange might satisfy the Armijo condition but not the second Strong Wolfe condition; see page 12, slide 107 of these notes. Deadline: March 19, 2018.
  • Mandatory Problem Solving: Explain, for the example quadratic program, why the step sizes might satisfy the Armijo condition but not the second Strong Wolfe condition; see page 12, slide 107 of these notes. Deadline: March 19, 2018.
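
A small sketch of the Armijo and strong Wolfe conditions for these problems (the quadratic f, starting point, and constants c1, c2 below are my illustrative choices, not from the notes). It shows a tiny step that satisfies sufficient decrease but fails the curvature condition:

```python
import numpy as np

# f(x) = 0.5 x^T x along the steepest-descent direction d = -grad f(x).
f = lambda x: 0.5 * x @ x
grad = lambda x: x

x = np.array([2.0, 1.0])
d = -grad(x)                      # descent direction
c1, c2 = 1e-4, 0.9                # typical line-search constants

def armijo(t):                    # sufficient decrease condition
    return f(x + t * d) <= f(x) + c1 * t * (grad(x) @ d)

def strong_wolfe_curvature(t):    # curvature condition (strong form)
    return abs(grad(x + t * d) @ d) <= c2 * abs(grad(x) @ d)

# For this quadratic the exact ray-search minimizer is t = 1.
print(armijo(1.0), strong_wolfe_curvature(1.0))    # True True
# A tiny step satisfies Armijo but fails the curvature condition.
print(armijo(1e-3), strong_wolfe_curvature(1e-3))  # True False
```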

19.03.2018

  • Annotated classnotes
  • References: Boyd and Vandenberghe Sections 9.3 and 9.4 and Nemirovski Section C.4 (appendix)
  • Reference: Joint Wolfe conditions for line search described around equations (4.90) and (4.91) of the basic class notes
  • Several graphical illustrations can be found in the matlab code available here.
  • An illustration of random-walk-based descent based on the MATLAB code available here, which is adapted from the MATLAB code accompanying the book Applied Optimization with MATLAB Programming
  • Understand the relation between Lipschitz continuity, continuity, uniform continuity, differentiability, continuous differentiability and so on.
  • Mandatory Problem Solving: Show that if f is itself quadratic, exact ray search gives an optimal solution to the minimization of f; see page 6, slide 104 of these notes. Deadline: March 22, 2018.
  • Mandatory Problem Solving: Study the plots of functions on pages 23-28 of these notes for Lipschitz continuity of the function or its gradient, etc. Deadline: March 22, 2018.
  • Mandatory Problem Solving: What is the connection between Lipschitz continuity of a function and Lipschitz continuity of its gradient? Does one imply the other? See page 28, slide 125 of these notes. Deadline: March 22, 2018.
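
A numerical hint for the last problem above (my example; the matrix Q is an arbitrary symmetric choice): for a quadratic, the gradient is Lipschitz continuous with constant equal to the largest eigenvalue, even though the function itself is not Lipschitz on all of Rⁿ:

```python
import numpy as np

# For f(x) = 0.5 x^T Q x with symmetric Q, the gradient Qx is Lipschitz
# with constant L = max eigenvalue of Q, although f itself is NOT
# Lipschitz on all of R^n (it grows quadratically).
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # hypothetical symmetric matrix
L = np.linalg.eigvalsh(Q).max()

rng = np.random.default_rng(1)
ok = True
for _ in range(1000):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    # ||grad f(x) - grad f(y)|| <= L ||x - y|| (small fp tolerance added)
    ok &= np.linalg.norm(Q @ x - Q @ y) <= L * np.linalg.norm(x - y) + 1e-12
print(ok)  # True
```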

22.03.2018

  • Annotated classnotes
  • References: Boyd and Vandenberghe Sections 9.3.1, 9.4, 9.5 and Nocedal and Wright Section A.2 (subsection "Rates of Convergence"). Extra advanced reading: Nemirovski Section 5.4.2
  • References: Best convergence rate bound for gradient (steepest) descent with constant step-size from these notes
  • Several graphical illustrations can be found in the matlab code available here.
  • Think of more examples of sequences that are R-convergent but not Q-convergent
  • Mandatory Problem Solving: Understand the second-order conditions for convexity and the proofs of the same on pages 30-35, slides 152-156 of these notes. Deadline: March 26, 2018.
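
The convergence-rate bound referenced above can be seen in a tiny experiment (my sketch; the diagonal quadratic and step size are illustrative choices): gradient descent with constant step 1/L on a strongly convex quadratic contracts the error by at least (1 − μ/L) per iteration:

```python
import numpy as np

# Gradient descent with constant step 1/L on f(x) = 0.5 x^T Q x.
Q = np.diag([1.0, 10.0])            # condition number kappa = 10
L, mu = 10.0, 1.0                   # largest / smallest eigenvalues of Q
step = 1.0 / L

x = np.array([1.0, 1.0])
errs = []
for _ in range(50):
    x = x - step * (Q @ x)          # gradient of f is Q x
    errs.append(np.linalg.norm(x))  # distance to the minimizer x* = 0

# Linear (geometric) convergence: each step contracts by <= 1 - mu/L = 0.9.
ratios = [errs[k + 1] / errs[k] for k in range(len(errs) - 1)]
print(max(ratios) <= 1 - mu / L + 1e-9)  # True
```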

26.03.2018


29.03.2018


02.04.2018

  • Annotated classnotes
  • References: Section 16.1 (and partly 16.2) of Nocedal and Wright (on Equality-Constrained Quadratic Programs)
  • References: Geometric motivation for Lagrange multipliers in Section 4.4.1 (page 282 onwards) of my lecture notes
  • References: Boyd and Vandenberghe beginning of Section 5.1.
  • References: For Projected and Generalized Gradient Descent, see L. Vandenberghe, Lecture Notes for EE 236C. Some advanced reading is here
  • Reference: Page 557, Problem 10.2 of Boyd and Vandenberghe
  • Reference: Sections 3.2, 3.2.3, 3.2.4 (Advanced) and 3.2.5 (Advanced) of Introductory Lectures on Convex Programming by Nesterov.
  • References: For solutions to SVM using different algorithms, including projected gradient descent, refer here.
  • References: Best convergence rate bound for gradient (steepest) descent with constant step-size from these notes
  • Mandatory Problem Solving: Understand the proof of the (order and rate of) convergence of accelerated generalized gradient descent, while also appreciating the conditions for convergence, on pages 17-23, slides 194-197 of these notes. Deadline: April 5, 2018.
  • Mandatory Problem Solving: Is the projected step of projected gradient descent always a descent direction? See page 30, slide 202 of these notes. Deadline: April 5, 2018.
  • Optional Problem Solving: Understand the geometry of projected descent on pages 43-47, slides 213-217 of these notes. Deadline: April 5, 2018.
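
A minimal projected gradient descent sketch in the spirit of the items above (the objective, constraint set, and step size are my illustrative choices): minimize a smooth function over the unit ball, projecting after each gradient step:

```python
import numpy as np

# Minimize f(x) = ||x - c||^2 over the unit ball {x : ||x|| <= 1}.
c = np.array([2.0, 2.0])            # unconstrained minimizer, outside the ball
grad = lambda x: 2 * (x - c)

def project_unit_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n   # Euclidean projection onto the ball

x = np.zeros(2)
for _ in range(200):
    x = project_unit_ball(x - 0.1 * grad(x))   # gradient step, then project

# The constrained optimum is the projection of c onto the ball: c / ||c||.
print(np.allclose(x, c / np.linalg.norm(c), atol=1e-6))  # True
```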

05.04.2018


Quiz 2, Matrix Completion and Generalized Gradient Descent


09.04.2018

  • Annotated classnotes
  • References: Geometric interpretation of duality in optimisation problems: Boyd and Vandenberghe Section 5.3, Page 10 to 13 of class notes, Section 4.4.3 (page 292 onwards) of my lecture notes.
  • References: Boyd and Vandenberghe Sections 5.1, 5.2. Nemirovski: Lecture Notes on Modern Convex Optimization Sections 1.4.5 (esp theorem 1.4.2). Advanced: To appreciate theorem of alternatives that is used to prove strong duality in linear and conic duality see Boyd and Vandenberghe Section 5.8 (especially page 263) and Nemirovski Sections 1.4.4 (extra reading for geometric interpretation of conic dual). You can read more about the Lasso and its dual here.
  • Mandatory Problem Solving: Complete your understanding of the geometric interpretation of the Lagrange dual and the duality gap on pages 34-36, that is, slides 254-255 of these notes. Deadline: April 12, 2018.
  • Mandatory Problem Solving: Derive the Lagrange dual for Quiz 1, Problem 1, reproduced on page 37, that is, slide 256 of these notes. Deadline: April 12, 2018.
  • Optional Problem Solving: How will the dual of equation (***) on page 14, slide 242 of these notes differ from the dual in equation (75) of the primal in (*) on page 10, slide 241 of the same set of slides? Deadline: April 12, 2018.
  • Optional Problem Solving: If f(x) and the gi(x) are convex, complete the proof that the set I will be convex, on page 24, slide 249 of these notes. Deadline: April 12, 2018.
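
A minimal duality sketch related to the items above (my toy example, not the quiz problem): for the convex problem "minimize x² subject to 1 − x ≤ 0", the Lagrangian is L(x, λ) = x² + λ(1 − x), minimizing over x gives the dual g(λ) = λ − λ²/4, and strong duality holds:

```python
import numpy as np

# Dual function of: minimize x^2 s.t. 1 - x <= 0.
# L(x, lam) = x^2 + lam*(1 - x) is minimized at x = lam/2, giving
# g(lam) = lam - lam^2 / 4.
g = lambda lam: lam - lam**2 / 4

lams = np.linspace(0, 4, 4001)      # grid over the dual-feasible lam >= 0
dual_opt = g(lams).max()            # maximized at lam = 2, value 1
primal_opt = 1.0                    # x = 1 is feasible and optimal

# Convex problem + Slater's condition => strong duality: zero duality gap.
print(np.isclose(dual_opt, primal_opt))  # True
```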

12.04.2018

  • Annotated classnotes
  • References: Geometric interpretation of duality in optimisation problems: Boyd and Vandenberghe Section 5.3, Page 10 to 13 of class notes, Section 4.4.3 (page 292 onwards) of my lecture notes.
  • References: Boyd and Vandenberghe Sections 5.1, 5.2. Nemirovski: Lecture Notes on Modern Convex Optimization Sections 1.4.5 (esp theorem 1.4.2). Advanced: To appreciate theorem of alternatives that is used to prove strong duality in linear and conic duality see Boyd and Vandenberghe Section 5.8 (especially page 263) and Nemirovski Sections 1.4.4 (extra reading for geometric interpretation of conic dual). You can read more about the Lasso and its dual here.
  • References: For Dual Ascent, Dual Decomposition, Augmented Lagrangian and ADMM, you can read these notes.
  • References: Section 2 of these notes by Bertsekas for weak duality and the min-max inequality, and Section 3 of the same notes for conditions for strong duality and analysis of saddle points.
  • References: For constraint qualifications in the specific cases of Linear and Conic Linear programs, read until lecture 10 on 13.2.2015 of the previous year's notes, or see Nemirovski's notes here. Also see some slides here that I have put together.
  • References: For SMO and SVM Optimization, you can refer to this link and this one. For solutions to SVM using different algorithms, including projected gradient descent, refer here.
  • Mandatory Problem Solving: Revisit the Support Vector Regression problem and see how the KKT conditions and convexity were used to infer a zero duality gap, and then derive the dual optimization problem, as mentioned on page 29, that is, slide 268 of these notes. You can read about several optimization algorithms applied to SVMs here. Deadline: April 16, 2018.
  • Optional Problem Solving: In the analysis of the sufficiency of the KKT conditions, what happens if the inequality constraints gi are quasi-convex instead of convex, as mentioned on page 23, that is, slide 266 of these notes?
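
As a small companion to the KKT discussion above (my toy example, not from the notes), the four KKT conditions can be verified numerically at a candidate optimum of a simple convex problem:

```python
import numpy as np

# KKT verification for: minimize (x1-2)^2 + (x2-2)^2 s.t. x1 + x2 - 2 <= 0.
# Candidate optimum: x* = (1, 1) with multiplier lam* = 2.
x = np.array([1.0, 1.0])
lam = 2.0

grad_f = 2 * (x - 2)                 # gradient of the objective at x*
grad_g = np.array([1.0, 1.0])        # gradient of the constraint
g = x.sum() - 2                      # constraint value at x*

stationarity = np.allclose(grad_f + lam * grad_g, 0)   # grad L = 0
primal_feasible = g <= 0
dual_feasible = lam >= 0
comp_slackness = np.isclose(lam * g, 0)                # lam * g(x*) = 0

# The problem is convex, so the KKT conditions are also sufficient here.
print(stationarity and primal_feasible and dual_feasible and comp_slackness)
```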

16.04.2018


19.04.2018