Optimization for Machine Learning (Algorithms & Theory)
This is a theory course: the emphasis is on proofs and convergence analysis, and no programming is required.
The focus of the seminar is on scalable first-order methods for continuous optimization, such as stochastic gradient descent, and on their connections to and applications in machine learning.
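As a concrete illustration (notation mine, not taken from the seminar materials), the prototypical first-order method is stochastic gradient descent, which minimizes a finite-sum objective by stepping along a sampled gradient:

```latex
% Objective: f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x).
% At step t, sample an index i_t uniformly from \{1,\dots,n\} and update
% with step size \eta_t > 0:
x_{t+1} = x_t - \eta_t \, \nabla f_{i_t}(x_t)
```

Since $\mathbb{E}[\nabla f_{i_t}(x_t)] = \nabla f(x_t)$, each step follows the full gradient in expectation while costing only a single component evaluation.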
The talks, given by Master's students, will cover chapters of the following books:
- Optimization for Machine Learning. Sra, Nowozin, and Wright (eds.)
- Convex Optimization: Algorithms and Complexity. Bubeck