Lecture
Optimization without Constraints: Gradient Method
Related lectures (27)
Gradient Descent: Covers the concept of gradient descent in scalar cases, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
Newton's Method: Optimization Techniques: Explores optimization techniques like gradient descent, line search, and Newton's method for efficient problem-solving.
Optimization Methods: Covers optimization methods without constraints, including gradient and line search in the quadratic case.
Proximal and Subgradient Descent: Optimization Techniques: Discusses proximal and subgradient descent methods for optimization in machine learning.
Coordinate Descent: Efficient Optimization Techniques: Covers coordinate descent, a method for optimizing functions by updating one coordinate at a time.
Quasi-Newton Methods: Introduces Quasi-Newton methods for optimization, explaining their advantages over traditional approaches like Gradient Descent and Newton's Method.
Optimality of Convergence Rates: Accelerated/Stochastic Gradient Descent: Covers the optimality of convergence rates in accelerated and stochastic gradient descent methods for non-convex optimization problems.
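The gradient method named in the lecture title can be sketched in a few lines: repeatedly step against the gradient until it is nearly zero. This is a minimal illustrative sketch, not code from the course; the test function f(x) = (x - 3)^2, the step size, and the tolerances are all assumptions chosen for the example.

```python
def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a scalar function by iterating x <- x - step * grad(x)
    until the gradient magnitude falls below tol."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x = x - step * g
    return x

# Example (assumed): f(x) = (x - 3)^2 has gradient f'(x) = 2(x - 3),
# so the iteration should converge near the minimizer x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

For this quadratic, each iteration contracts the error by a constant factor (here x_{k+1} = 0.8 x_k + 0.6), which is the geometric convergence typical of gradient descent with a fixed, sufficiently small step.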
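Newton's method, covered in the second related lecture, replaces the fixed step with a curvature-scaled one: in one dimension, x <- x - f'(x)/f''(x). The sketch below is illustrative only; the test function f(x) = x^4 - 2x^2 and the starting point are assumptions, not taken from the course.

```python
def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Find a stationary point by Newton's iteration on the gradient:
    x <- x - grad(x) / hess(x), stopping when the gradient is tiny."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x = x - g / hess(x)
    return x

# Example (assumed): f(x) = x^4 - 2x^2, so f'(x) = 4x^3 - 4x and
# f''(x) = 12x^2 - 4. Started at x0 = 1.5, the iteration converges
# to the local minimizer x = 1.
x_star = newton_minimize(lambda x: 4 * x**3 - 4 * x,
                         lambda x: 12 * x**2 - 4,
                         x0=1.5)
```

Unlike gradient descent's geometric rate, Newton's method converges quadratically near a solution, which is the advantage the quasi-Newton lecture builds on while avoiding explicit second derivatives.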