Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
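As a minimal illustration of the iterative strategy this summary refers to, the sketch below implements plain gradient descent with a fixed step size; the quadratic objective, the step size, and the iteration count are illustrative assumptions, not details taken from the lecture.

```python
import numpy as np

def gradient_descent(grad_f, x0, step_size=0.1, num_iters=100):
    """Plain gradient descent: x_{k+1} = x_k - step_size * grad_f(x_k).

    For an L-smooth convex objective with step_size <= 1/L, the gap
    f(x_k) - f(x*) decays at the classical O(1/k) rate; for non-convex
    smooth problems one only gets convergence to a stationary point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - step_size * grad_f(x)
    return x

# Example (assumed objective): f(x) = ||x - 1||^2 / 2, gradient x - 1,
# minimized at the all-ones vector.
x_star = gradient_descent(lambda x: x - 1.0, x0=np.zeros(3))
```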
Explores convex optimization, focusing on minimizing a convex function over a convex set and on the role of continuous-time processes in analyzing convergence rates.
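For reference, the constrained problem this summary points to, together with the gradient-flow dynamics that are a common continuous-time process for studying rates (an assumption about what the lecture means by "continuous-time processes"), can be written as:

```latex
% Constrained convex minimization over a convex set C:
\min_{x \in C} f(x), \qquad f \text{ convex}, \quad C \subseteq \mathbb{R}^n \text{ convex and closed}.

% If the continuous-time process is the (unconstrained) gradient flow,
% the classical convex convergence rate is:
\dot{x}(t) = -\nabla f\bigl(x(t)\bigr), \qquad
f\bigl(x(t)\bigr) - f(x^\star) \le \frac{\|x(0) - x^\star\|^2}{2t}.
```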
Explores KKT conditions in convex optimization, covering dual problems, logarithmic constraints, least squares, matrix functions, and suboptimality of covering ellipsoids.
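For reference, the KKT conditions for a convex program in standard form (with generic placeholders $f$, $g_i$, $h_j$, not the lecture's specific examples) are:

```latex
% Convex program: minimize f(x) subject to g_i(x) <= 0, h_j(x) = 0 (h_j affine).
% KKT conditions at a primal-dual pair (x^\star, \lambda^\star, \nu^\star):
\begin{aligned}
g_i(x^\star) \le 0, \quad h_j(x^\star) &= 0 && \text{(primal feasibility)} \\
\lambda_i^\star &\ge 0 && \text{(dual feasibility)} \\
\lambda_i^\star \, g_i(x^\star) &= 0 && \text{(complementary slackness)} \\
\nabla f(x^\star) + \sum_i \lambda_i^\star \nabla g_i(x^\star)
  + \sum_j \nu_j^\star \nabla h_j(x^\star) &= 0 && \text{(stationarity)}
\end{aligned}
% Under convexity and a constraint qualification (e.g. Slater's condition),
% these conditions are necessary and sufficient for optimality.
```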