MATH-512: Optimization on manifolds
Graph Chatbot
Lectures in this course (95)
Manopt: Optimization on Manifolds
Introduces Manopt, a toolbox for optimization on manifolds, covering gradient and Hessian checks, solver calls, and manual caching.
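The gradient check mentioned here can be illustrated with a small NumPy sketch (an illustrative reimplementation of the idea behind Manopt's `checkgradient`, not Manopt's actual code). The cost, manifold, and retraction below are my own choices: f(x) = xᵀAx on the unit sphere, with the metric-projection retraction.

```python
import numpy as np

# Illustrative finite-difference gradient check on the sphere S^{n-1}
# (a sketch of what a tool like checkgradient does, not Manopt's code).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2

def cost(x):
    return x @ A @ x

def rgrad(x):
    # Riemannian gradient: project the Euclidean gradient 2Ax
    # onto the tangent space T_x S^{n-1}.
    g = 2 * A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Metric-projection retraction: step, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(n); x /= np.linalg.norm(x)
v = rng.standard_normal(n); v -= (x @ v) * x   # make v tangent at x

# If rgrad is correct, f(R_x(tv)) - f(x) - t <rgrad(x), v> = O(t^2):
# shrinking t by 10x should shrink the error by roughly 100x.
errors = [abs(cost(retract(x, t * v)) - cost(x) - t * (rgrad(x) @ v))
          for t in (1e-2, 1e-3, 1e-4)]
print(errors)
```

Plotting these errors against t on a log-log scale and reading off a slope of 2 is exactly the visual check the toolbox automates.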
Comparing Tangent Vectors: Three Reasons Why
Motivates why optimization algorithms and finite-difference approximations need to compare tangent vectors rooted at different points.
Manopt: Optimization Toolbox for Manifolds
Introduces Manopt, a toolbox for optimization on manifolds, focusing on solving optimization problems on smooth manifolds using the Matlab version.
Comparing Tangent Vectors: Parallel Transport
Explores comparing tangent vectors and parallel transport on manifolds.
Comparing Tangent Vectors: Parallel Transport
Explores the definition, existence, and uniqueness of parallel transport of tangent vectors on manifolds.
Comparing Tangent Vectors: Parallel Transport
Explores parallel transport along loops and covariant derivatives induced by metrics.
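On the sphere, parallel transport along the minimizing geodesic has a well-known closed form, which makes the abstract definition concrete. This sketch uses that standard formula (my own illustration, not code from the course), valid whenever x and y are not antipodal:

```python
import numpy as np

# Parallel transport on S^{n-1} along the geodesic from x to y
# (standard closed-form formula; requires x and y not antipodal).
def transport(x, y, v):
    return v - (v @ y) / (1.0 + x @ y) * (x + y)

rng = np.random.default_rng(1)
x = rng.standard_normal(4); x /= np.linalg.norm(x)
y = rng.standard_normal(4); y /= np.linalg.norm(y)
v = rng.standard_normal(4); v -= (x @ v) * x   # tangent at x

w = transport(x, y, v)
# Parallel transport is a linear isometry between tangent spaces:
print(abs(w @ y))                              # tangency at y (~ 0)
print(abs(np.linalg.norm(w) - np.linalg.norm(v)))  # norm preserved (~ 0)
```

Note that a vector orthogonal to the plane spanned by x and y is left untouched by this formula, matching the geometric picture of rotating v within that plane.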
Transporters: a proxy for parallel transport
Explores transporters as a practical alternative to parallel transport, discussing minimal requirements, examples with matrices, pragmatic choices, and optimization algorithms.
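The simplest transporter on an embedded submanifold is orthogonal projection onto the target tangent space. A minimal sketch on the sphere (my own illustration of the idea, not the lecture's code):

```python
import numpy as np

# Projection transporter on S^{n-1}: drop v into T_y by removing
# its component along y. Cheap, but not an isometry in general.
def transporter(y, v):
    return v - (v @ y) * y

rng = np.random.default_rng(2)
x = rng.standard_normal(4); x /= np.linalg.norm(x)
y = rng.standard_normal(4); y /= np.linalg.norm(y)
v = rng.standard_normal(4); v -= (x @ v) * x   # tangent at x

w = transporter(y, v)
print(abs(w @ y))  # lands in T_y (~ 0); norm may shrink, unlike parallel transport
```

Orthogonal projection never increases the norm, which is often an acceptable price for avoiding the cost of true parallel transport inside optimization algorithms.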
Distance and geodesics: Distance, geodesics and complete manifolds
Covers the concept of distance induced by the Riemannian metric on manifolds.
Distance, geodesics and complete manifolds: Complete manifolds
Explores distance, geodesics, and complete manifolds, emphasizing the existence of minimizing geodesics and the concept of metric completeness.
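The sphere again gives closed forms for both notions: the Riemannian distance is the arc length arccos⟨x, y⟩, and the minimizing geodesic rotates x toward y in their common plane. A sketch using these standard formulas (my own illustration):

```python
import numpy as np

# Riemannian distance and minimizing geodesic on S^{n-1}.
def dist(x, y):
    return np.arccos(np.clip(x @ y, -1.0, 1.0))

def geodesic(x, y, t):
    # gamma(0) = x, gamma(1) = y, constant speed dist(x, y).
    d = dist(x, y)
    u = y - (x @ y) * x          # direction toward y in T_x
    u /= np.linalg.norm(u)
    return np.cos(t * d) * x + np.sin(t * d) * u

rng = np.random.default_rng(3)
x = rng.standard_normal(3); x /= np.linalg.norm(x)
y = rng.standard_normal(3); y /= np.linalg.norm(y)

print(np.linalg.norm(geodesic(x, y, 0.0) - x))          # endpoint check
print(np.linalg.norm(geodesic(x, y, 1.0) - y))          # endpoint check
print(dist(x, geodesic(x, y, 0.5)) - 0.5 * dist(x, y))  # distances add along a minimizer
```

The last line is the minimizing property in action: the midpoint of the geodesic sits at exactly half the distance, which fails for a non-minimizing curve.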
Taylor expansions: second order
Explores Taylor expansions and retractions on Riemannian manifolds, emphasizing second-order approximations and covariant derivatives.
Retractions: second order
Covers second-order retractions in optimization on manifolds, focusing on smooth curves and their relation to the gradient and Hessian of a function.
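On the sphere, the metric-projection retraction R_x(v) = (x+v)/‖x+v‖ is second order: it agrees with the exponential map (the geodesic) up to O(‖v‖³). A quick numerical check of that agreement (my own sketch, assuming the standard sphere exponential map):

```python
import numpy as np

# Compare the projection retraction against the exponential map on S^2.
def retract(x, v):
    y = x + v
    return y / np.linalg.norm(y)

def expmap(x, v):
    nv = np.linalg.norm(v)
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

rng = np.random.default_rng(4)
x = rng.standard_normal(3); x /= np.linalg.norm(x)
v = rng.standard_normal(3); v -= (x @ v) * x   # tangent at x

# Second-order agreement: error should shrink like t^3,
# i.e. ~1000x when t shrinks by 10x.
errors = [np.linalg.norm(retract(x, t * v) - expmap(x, t * v))
          for t in (1e-1, 1e-2)]
print(errors)
```

A first-order retraction would only guarantee a t² decay here; the observed t³ rate is what distinguishes second-order retractions.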
Gradient and Hessian of Pullback
Explores the connection between Riemannian and Euclidean gradients and Hessians at critical points.
Optimality conditions: second order
Explores necessary and sufficient optimality conditions for local minima on manifolds, focusing on second-order critical points.
Newton's method: Optimization on manifolds
Explores Newton's method for optimizing functions on manifolds using second-order information and discusses its drawbacks and fixes.
Computing the Newton Step: Matrix-Based Approaches
Explores matrix-based approaches for computing the Newton step on a Riemannian manifold.
Computing the Newton Step: GD as a Matrix-Free Way
Presents gradient descent on the Newton subproblem as a matrix-free way of computing the Newton step, in contrast with matrix-based approaches.
Computing the Newton Step: From GD to CG
Covers the transition from Gradient Descent to Conjugate Gradients, highlighting the efficiency of CG over GD in optimization on manifolds.
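The point of CG in this setting is that it only needs the Hessian as a black-box linear operator v ↦ Hv, never as a stored matrix. A minimal sketch of plain conjugate gradients applied to the Newton system Hp = −g (generic CG, written by me; the symbols are illustrative, not the course's code):

```python
import numpy as np

# Matrix-free CG for H p = -g, with H given only as a callable hess(v).
def cg(hess, g, tol=1e-10, maxiter=100):
    p = np.zeros_like(g)
    r = -g - hess(p)             # residual of H p = -g
    d = r.copy()
    for _ in range(maxiter):
        Hd = hess(d)
        alpha = (r @ r) / (d @ Hd)
        p += alpha * d
        r_new = r - alpha * Hd
        if np.linalg.norm(r_new) < tol:
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return p

rng = np.random.default_rng(5)
n = 6
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)      # symmetric positive definite stand-in
g = rng.standard_normal(n)

p = cg(lambda v: H @ v, g)
print(np.linalg.norm(H @ p + g))  # ~ 0: p solves the Newton system
```

In exact arithmetic CG terminates in at most n iterations, whereas gradient descent on the same quadratic model only converges linearly, which is the efficiency gap the lecture highlights.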
Trust Region Methods: Why, with an Example
Introduces trust region methods and presents an example of Max-Cut Burer-Monteiro rank 2 optimization.
Trust Region Methods: Coming up with the Algorithm
Explores trust region methods, emphasizing the importance of considering subproblems at each iteration.
Trust region methods: Global convergence with minimal effort
Explores trust region methods for global convergence with minimal effort in optimization on manifolds.
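The mechanism behind these global convergence arguments is the acceptance test: compare the actual decrease of f to the decrease predicted by the model, then grow or shrink the radius accordingly. A generic sketch of that bookkeeping (illustrative thresholds of my own choosing, not Manopt's exact rules):

```python
# Trust-region step acceptance and radius update, driven by the ratio
# rho = (actual decrease) / (model-predicted decrease).
def update(f_x, f_new, model_decrease, radius, step_norm):
    rho = (f_x - f_new) / model_decrease
    if rho < 0.25:
        radius = 0.25 * radius           # model fit poor: shrink the region
    elif rho > 0.75 and abs(step_norm - radius) < 1e-12:
        radius = 2.0 * radius            # good fit at the boundary: grow it
    accept = rho > 0.1                   # only accept genuinely useful steps
    return accept, radius

# A step that achieved most of its predicted decrease while hitting
# the boundary: accepted, and the radius doubles.
accept, radius = update(f_x=1.0, f_new=0.5, model_decrease=0.6,
                        radius=1.0, step_norm=1.0)
print(accept, radius)
```

Global convergence proofs lean on exactly this logic: rejected steps shrink the radius until the model is trustworthy, so the method can never stall at a non-critical point.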
Page 4 of 5