Explores transporters as a practical alternative to parallel transport, discussing minimal requirements, examples with matrices, pragmatic choices, and optimization algorithms.
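As a concrete illustration of a transporter (my sketch, not taken from the text): on the unit sphere, one can move a tangent vector from the tangent space at x to the tangent space at y by orthogonal projection. This satisfies the minimal requirements of a transporter (linear in the vector, smooth in the points, identity when y = x) without computing parallel transport:

```python
import numpy as np

def proj(y, v):
    """Orthogonal projection of v onto the tangent space of the
    unit sphere at y, i.e. (I - y y^T) v."""
    return v - np.dot(y, v) * y

def transport(x, y, v):
    """Projection-based transporter: take a tangent vector v at x
    and return its projection onto the tangent space at y."""
    return proj(y, v)

# Usage: move a tangent vector between two points on the sphere.
rng = np.random.default_rng(0)
x = rng.normal(size=3); x /= np.linalg.norm(x)
y = rng.normal(size=3); y /= np.linalg.norm(y)
v = proj(x, rng.normal(size=3))   # tangent vector at x
w = transport(x, y, v)            # tangent vector at y
print(np.dot(y, w))               # ~0: w is indeed tangent at y
```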
Explores why differentiating vector fields on manifolds requires care and presents the covariant derivative as the correct tool, emphasizing its role in going beyond first-order information.
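For embedded submanifolds, the standard construction (stated here as background, in my own notation) is to differentiate the vector field in the ambient space and project the result back onto the tangent space:

```latex
% Covariant derivative of a vector field V on an embedded
% submanifold M of R^n: differentiate in the ambient space,
% then project onto the tangent space at x.
\[
  \nabla_u V(x) \;=\; \mathrm{Proj}_{T_x M}\!\big(\mathrm{D}V(x)[u]\big),
  \qquad u \in T_x M .
\]
% This connection is what makes second-order objects such as the
% Riemannian Hessian well defined, i.e., going beyond first order.
```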
Explores geodesic convexity as the extension of convexity to optimization on manifolds, emphasizing that it preserves the key property that local minima are global minima.
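For reference, the standard definition being extended (not quoted from the text): a function is geodesically convex when it is convex along every geodesic segment:

```latex
% f : M -> R is geodesically convex if, for every geodesic
% segment gamma : [0, 1] -> M,
\[
  f(\gamma(t)) \;\le\; (1 - t)\, f(\gamma(0)) + t\, f(\gamma(1)),
  \qquad t \in [0, 1].
\]
% Exactly as in the Euclidean case, this implies that any local
% minimizer of f is a global minimizer.
```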
Explores optimization methods such as gradient descent and subgradient methods for training machine learning models, including adaptive techniques such as Adam.
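To make the update rules concrete, here is a generic sketch (my notation and hyperparameter defaults, not the text's) of a plain gradient-descent step alongside the standard Adam update of Kingma and Ba:

```python
import numpy as np

def gd_step(x, grad, lr=0.1):
    """Plain gradient descent: step against the gradient."""
    return x - lr * grad

class Adam:
    """Standard Adam update: exponential moving averages of the
    gradient and its square, with bias correction."""
    def __init__(self, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = self.v = None
        self.t = 0

    def step(self, x, grad):
        if self.m is None:
            self.m = np.zeros_like(x)
            self.v = np.zeros_like(x)
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad**2
        m_hat = self.m / (1 - self.b1**self.t)   # bias correction
        v_hat = self.v / (1 - self.b2**self.t)
        return x - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# Usage: minimize f(x) = ||x||^2, whose gradient is 2x.
opt = Adam(lr=0.1)
x = np.ones(4)
for _ in range(200):
    x = opt.step(x, 2 * x)
```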
Discusses optimization techniques in machine learning, focusing on stochastic gradient descent and its applications in constrained and non-convex problems.
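A minimal sketch of stochastic gradient descent with a projection step for a constrained problem (the least-squares objective, unit-ball constraint, and all names here are illustrative assumptions, not from the text):

```python
import numpy as np

def projected_sgd(grad_sample, project, x0, n_steps=1000, lr=0.01, seed=0):
    """SGD for min f(x) s.t. x in C: at each step, take a stochastic
    gradient on one sampled data point, then project back onto C."""
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(n_steps):
        g = grad_sample(x, rng)    # unbiased gradient estimate
        x = project(x - lr * g)    # gradient step, then projection
    return x

# Illustrative problem: least squares restricted to the unit ball.
A = np.random.default_rng(1).normal(size=(100, 5))
b = A @ np.ones(5) * 0.1

def grad_sample(x, rng):
    i = rng.integers(len(A))       # sample one data point
    return 2 * A[i] * (A[i] @ x - b[i])

def project(x):                    # projection onto the unit ball
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n

x_star = projected_sgd(grad_sample, project, np.zeros(5))
```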