ORF523

Convex and Conic Optimization    Spring 2021, Princeton University (graduate course)


(This is the Spring 2021 version of this course. For previous versions, click here.)

Useful links
References
  • A. Ben-Tal and A. Nemirovski, Lecture Notes on Modern Convex Optimization [link]
  • S. Boyd and L. Vandenberghe, Convex Optimization [link]
  • M. Laurent and F. Vallentin, Semidefinite Optimization [link]
  • R. Vanderbei, Linear Programming: Foundations and Extensions [link]
Lectures

The lecture notes below summarize most of what I cover on the whiteboard during class. Please complement them with your own notes.
Some lectures take one class session to cover; others take two.

  • Lecture 1: A taste of P and NP: scheduling on Doodle, maximum cliques, and the Shannon capacity of a graph.
    [pdf]
  • Lecture 2: Mathematical background.
    [pdf]

  • Lecture 3: Local and global minima, optimality conditions, the AM-GM inequality, least squares.
    [pdf]
     
  • Lecture 4: Convex sets and functions, epigraphs, quasiconvex functions, convex hulls, Carathéodory's theorem, convex optimization problems.
    [pdf], 
    [cvx_examples.m]

  • Lecture 5: Separating hyperplane theorems, the Farkas lemma, and strong duality of linear programming.
    [pdf]

  • Lecture 6: Bipartite matching, minimum vertex cover, König's theorem, totally unimodular matrices and integral polyhedra.
    [pdf]

  • Lecture 7: Characterizations of convex functions, strict and strong convexity, optimality conditions for convex problems.
    [pdf]
      
  • Lecture 8: Convexity-preserving rules, convex envelopes, support vector machines.
    [pdf]
      
  • Lecture 9: LP, QP, QCQP, SOCP, SDP.
    [pdf]

  • Lecture 10: Some applications of SDP in dynamical systems and eigenvalue optimization.
    [pdf]
  • Lecture 11: Some applications of SDP in combinatorial optimization: stable sets, the Lovász theta function, and the Shannon capacity of graphs.
    [pdf]

  • Lecture 12: Nonconvex quadratic optimization and its SDP relaxation, the S-Lemma.
    [pdf]

  • Lecture 13: Computational complexity in numerical optimization.
    [pdf]

  • Lecture 14: Complexity of local optimization, the Motzkin-Straus theorem, matrix copositivity.
    [pdf]

  • Lecture 15: Sum of squares programming and relaxations for polynomial optimization.
    [pdf]
    [YALMIP_Demos]

  • Lecture 16: Robust optimization.
    [pdf]

  • Guest lecture: Introduction to optimal control. (William Pierson Field Lecture by Sumeet Singh, Google Brain.)
    [pdf]

  • Lecture 17: Convex relaxations for NP-hard problems with worst-case approximation guarantees.
    [pdf]

  • Lecture 18: Approximation algorithms (ctnd.), limits of computation, concluding remarks.
    [pdf]


Problem sets and exams

Solutions are posted on Blackboard. 

  • Homework 1: Image compression and SVD, matrix norms, existence of optimal solutions, descent directions, dual and induced norms, properties of positive semidefinite matrices.
    [pdf] [conway.jpg]
     
  • Homework 2: Convex analysis true/false questions, optimal control, theory-applications split in a course.
    [pdf]

     
  • Homework 3: Support vector machines (Hillary or Bernie?), norms defined by convex sets, totally unimodular matrices, radiation treatment planning.
    [pdf]
    [Hillary_vs_Bernie]

  • Practice midterms.
    See Blackboard.

  • Midterm:
    [pdf]

  • Homework 4: A nuclear program for peaceful reasons, distance geometry, stability of a pair of matrices, SDPs with rational data and irrational feasible solutions.
    [pdf]
     
  • Homework 5: The Lovász sandwich theorem, SDP and LP relaxations for the stable set problem, Shannon capacity.
    [pdf] [Graph.mat]

  • Homework 6: Equivalence of search and decision, complexity of rank-constrained SDPs, monotone and convex regression with SOS optimization, Zoom's stock and distributionally robust optimization.
    [pdf] [regression_data.mat]

  • Practice final exams.
    See Blackboard.

  • Final exam:
    [pdf], [dependency_matrix.mat]