Covers the Branch & Bound algorithm for efficiently exploring the space of feasible solutions, and discusses LP relaxation, portfolio optimization, nonlinear programming, and related optimization problems.
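The interplay between Branch & Bound and LP relaxation mentioned above can be sketched on a tiny example. The following is an illustrative 0/1 knapsack solver (the problem instance and the greedy fractional bound are assumptions for demonstration, not taken from the text): the fractional relaxation gives an upper bound that lets entire branches be pruned when they cannot beat the incumbent solution.

```python
def lp_bound(i, value, weight, items, capacity):
    """Fractional (LP) relaxation bound: best value reachable from a
    partial solution where items[i:] are still undecided.  Valid because
    items are sorted by value density."""
    bound, room = value, capacity - weight
    for v, w in items[i:]:
        if w <= room:
            bound, room = bound + v, room - w
        else:
            bound += v * room / w  # take a fraction of the item
            break
    return bound

def branch_and_bound(items, capacity):
    # Sort by value density so the greedy fractional bound is tight.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def explore(i, value, weight):
        nonlocal best
        if weight > capacity:
            return  # infeasible branch
        best = max(best, value)
        if i == len(items) or lp_bound(i, value, weight, items, capacity) <= best:
            return  # prune: the relaxation cannot beat the incumbent
        v, w = items[i]
        explore(i + 1, value + v, weight + w)  # branch: take item i
        explore(i + 1, value, weight)          # branch: skip item i

    explore(0, 0, 0)
    return best
```

For instance, `branch_and_bound([(60, 10), (100, 20), (120, 30)], 50)` returns 220, pruning the subtree rooted at "skip the two densest items" without enumerating it.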
Discusses optimization techniques in machine learning, focusing on stochastic gradient descent and its application to constrained and non-convex problems.
Discusses stochastic gradient descent in non-convex optimization, focusing on convergence rates and the challenges that arise in machine learning.
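The stochastic gradient descent method discussed above can be illustrated with a minimal sketch. The problem (a one-parameter least-squares fit), step size, and epoch count below are illustrative assumptions: each update uses the gradient of a single randomly drawn sample, so the iterates take noisy steps that nonetheless drift toward the minimizer.

```python
import random

def sgd(data, lr=0.01, epochs=200, seed=0):
    """Minimal SGD on min_w E[(w*x - y)^2] over the given samples.
    One parameter, one sample per update; lr and epochs are illustrative."""
    rng = random.Random(seed)
    data = list(data)  # avoid mutating the caller's list
    w = 0.0
    for _ in range(epochs):
        rng.shuffle(data)          # stochastic: random sample order
        for x, y in data:
            grad = 2 * (w * x - y) * x  # gradient of this sample's loss
            w -= lr * grad              # noisy step toward the minimum
    return w

# Samples generated from y = 3x; SGD should recover w close to 3.
samples = [(x, 3.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]
```

On this convex one-dimensional example SGD converges cleanly; the non-convex setting the text refers to keeps the same update rule but loses such guarantees, which is where the convergence-rate analysis becomes subtle.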