CS 7880: Special Topics in Theoretical Computer Science
From Convex Analysis to Learning, Prediction, and Elicitation
Fall 2025
Instructor: Lunjia Hu
Location: Shillman Hall 325
Time: Tuesdays 11:45 AM – 01:25 PM and Thursdays 02:50 PM – 04:30 PM
Workload
2–3 homework assignments with about 3 questions each; reading and presenting classic and recent research papers; leading and participating in research discussions. No exams.
Topics
Convex analysis basics: hyperplane separation theorem, minimax theorem, Lagrange duality
Online learning and optimization: experts problem, online convex optimization, multiplicative weights, mirror descent
Decision theory: calibration, multicalibration, omniprediction
Complexity theory: regularity lemma, pseudorandomness and pseudoentropy, hardcore lemma and dense model theorem
Economics / game theory: proper scoring rules, property elicitation
Overview
This special topics course introduces the basics of convex analysis and explores its wide-ranging applications across optimization algorithms, machine learning, computational economics, and complexity theory. We begin with the geometrically intuitive yet remarkably powerful hyperplane separation theorem, using it to establish the well-known and widely used minimax theorems and Lagrange duality.

We then show how concepts from convex analysis (such as convex conjugation and the Fenchel-Young divergence) illuminate the algorithmic ideas behind the mirror descent framework, which encompasses the celebrated multiplicative weights algorithm as a special case and yields constructive proofs of the minimax and duality theorems. We discuss a broad spectrum of applications of mirror descent (boosting, no-regret online learning, online calibration, Blackwell’s approachability, the regularity lemma, multicalibration, omniprediction, and pseudoentropy characterizations), spanning areas from learning, probabilistic forecasting, uncertainty quantification, differential privacy, and algorithmic fairness to pseudorandomness, complexity theory, and game theory. We also discuss the fundamental role of convex analysis in economic theory, particularly in the study of proper scoring rules and property elicitation.

In the second half of the course, students will lead discussions on research topics in computer science, statistics, and economics that use convex analysis. These may include algorithms with predictions, multi-distribution learning, Blackwell’s informativeness theorem, isotonic regression, generalized linear models, single-index models, the dense model theorem, metric entropy duality, and more.
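To make the experts problem concrete, here is a minimal sketch of the multiplicative weights algorithm on a toy loss sequence (the step size and the example losses are illustrative choices, not taken from the course notes):

```python
import math

def multiplicative_weights(losses, eta):
    """Run multiplicative weights on a sequence of loss vectors
    (one entry per expert, each loss in [0, 1]).
    Returns the learner's total expected loss."""
    n = len(losses[0])
    weights = [1.0] * n
    total_loss = 0.0
    for loss in losses:
        z = sum(weights)
        probs = [w / z for w in weights]  # play the normalized weight vector
        total_loss += sum(p * l for p, l in zip(probs, loss))
        # exponentially downweight experts in proportion to their loss
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, loss)]
    return total_loss

# Toy instance: expert 0 always incurs loss 1, expert 1 always incurs loss 0.
T, n = 1000, 2
losses = [[1.0, 0.0]] * T
eta = math.sqrt(math.log(n) / T)  # a standard step-size choice
regret = multiplicative_weights(losses, eta) - 0.0  # best expert's loss is 0
# regret stays O(sqrt(T log n)), far below the trivial bound of T
```

On this instance the learner quickly concentrates its weight on the better expert, so its cumulative loss (and hence regret) grows only like sqrt(T log n) rather than linearly in T.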
Notes
- Notes 1 Hyperplane Separation Theorems
- Notes 2 Minimax Theorems
- Notes 3 Lagrange Duality and Slater’s Condition
- Notes 4 No-Regret Online Learning (Experts Problem and Online Linear Optimization)
- Notes 5 Saddle Points, KKT Conditions, and Constructive Minimax Theorem
- Notes 6 Follow the Regularized Leader, Multiplicative Weights
- Notes 7 Convex Conjugation and Fenchel-Young Divergence
- Notes 8 Regularity Lemma
- Notes 9 Blackwell Approachability and Online Calibration
- Notes 10 Proper Scoring Rules, Revelation Principle, Generalized Linear Models
- Notes 11 Pseudoentropy Characterizations
- Notes 12 Gradient Boosting, AdaBoost
Homeworks
Resources
Jacob Abernethy. Prediction and Learning: It’s Only a Game
Sergiu Hart. Calibrated Forecasts: The Minimax Proof
Jacob Abernethy, Peter L. Bartlett, Elad Hazan. Blackwell Approachability and No-Regret Learning are Equivalent
Moses Charikar, Alexandra Lassota, Prasanna Ramakrishnan, Adrian Vetta, Kangning Wang. Six Candidates Suffice to Win a Voter Majority
Maxwell Fishelson, Noah Golowich, Mehryar Mohri, Jon Schneider. High-Dimensional Calibration from Swap Regret
Schedule
- Lecture 1 (Sep 4, 02:50 PM – 04:30 PM): Guest lecture by Yifan Wu on the multiplicative weights algorithm.
- Lecture 2 (Sep 9, 11:45 AM – 01:25 PM): Minimax theorems and hyperplane separation theorems.
Extended reading: Hahn-Banach theorem (hyperplane separation in infinite dimensions, see Theorems E.4–E.6)
- Lecture 3 (Sep 11, 02:50 PM – 04:30 PM): Proving minimax theorems. Yao’s minimax principle. Lagrange duality. Online calibration.
- Lecture 4 (Sep 16, 11:45 AM – 01:25 PM): Experts problem. Online linear optimization.
Fun reading: intransitive dice (motivated by this STOC 2025 paper that uses the minimax theorem)
- Lecture 5 (Sep 18, 02:50 PM – 04:30 PM): Follow the Regularized Leader. Multiplicative weights.
Extended reading: Computing Optimal Regularizers for Online Linear Optimization
(Note: added Remark 2 to Notes 6 to connect our regret bound to another popular bound)
- Lecture 6 (Sep 23, 11:45 AM – 01:25 PM): Convex conjugation and Fenchel-Young divergence
- Lecture 7 (Sep 25, 02:50 PM – 04:30 PM): Saddle points
- Lecture 8 (Sep 30, 11:45 AM – 01:25 PM): Constructive Minimax Theorem, Regularity Lemma
Extended reading: Structure and randomness in combinatorics (this is Terence Tao’s tutorial at FOCS 2007)
Terence Tao’s blog post on Szemerédi’s graph regularity lemma
- Lecture 9 (Oct 2, 02:50 PM – 04:30 PM): Advanced forms of the regularity lemma
- Lecture 10 (Oct 7, 11:45 AM – 01:25 PM): Indistinguishability notions in multi-group fairness
- Lecture 11 (Oct 9, 02:50 PM – 04:30 PM): Online calibration, Blackwell’s approachability
- Lecture 12 (Oct 14, 11:45 AM – 01:25 PM): Proper scoring rules, revelation principle
- Lecture 13 (Oct 16, 02:50 PM – 04:30 PM): Omniprediction from indistinguishability
Reading: Blackwell’s Informativeness Theorem Using Diagrams
- Lecture 14 (Oct 21, 11:45 AM – 01:25 PM): Pseudoentropy characterizations
- Lecture 15 (Oct 23, 02:50 PM – 04:30 PM): Gradient boosting, AdaBoost
- Lecture 16 (Oct 28, 11:45 AM – 01:25 PM): Student presentation: New Proofs of the Green–Tao–Ziegler Dense Model Theorem: An Exposition
Extended reading: Section 9 (Progressions in Sparse Pseudorandom Sets) of Yufei Zhao’s book
- Lecture 17 (Oct 30, 02:50 PM – 04:30 PM): Student presentation: Complexity-Theoretic Implications of Multicalibration
Blog post by Luca Trevisan: The Impagliazzo Hard-Core-Set Theorem
Extended reading: How Global Calibration Strengthens Multiaccuracy
- Lecture 18 (Nov 4, 11:45 AM – 01:25 PM): Student presentation: Six Candidates Suffice to Win a Voter Majority
- Lecture 19 (Nov 6, 02:50 PM – 04:30 PM): Student presentation: Property Elicitation
- NO CLASS on Nov 11 (Veterans Day)
- Lecture 20 (Nov 13, 02:50 PM – 04:30 PM): Student presentation: Blackwell’s Informativeness Theorem Using Diagrams
- Lecture 21 (Nov 18, 11:45 AM – 01:25 PM): Student presentation: The spectral proof of the Szemerédi regularity lemma, Structure and randomness in combinatorics
- Lecture 22 (Nov 20, 02:50 PM – 04:30 PM): Student presentation: Stronger Calibration Lower Bounds via Sidestepping