Conjugate Gradient Method: Explores the Conjugate Gradient method for solving linear systems and introduces Quasi-Newton methods and rank-2 updates.
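As a sketch of the technique this entry covers (not the lecture's own code), the Conjugate Gradient method for a symmetric positive-definite system Ax = b can be written as:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive-definite A via Conjugate Gradient."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # residual
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# small SPD example (illustrative values, not from the lecture)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the iteration terminates in at most n steps, which is why it is usually stopped by the residual tolerance instead.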
Convergence of the Method: Covers the convergence of the method and the importance of adapting time steps for accurate approximations.
Optimization Methods: Covers unconstrained and constrained optimization, optimal control, neural networks, and global optimization methods.
Conjugate Gradient Optimization: Explores Conjugate Gradient optimization, covering the quadratic and nonlinear cases, Wolfe conditions, BFGS, CG algorithms, and matrix symmetry.
Optimization Methods: Covers optimization methods without constraints, including gradient descent and line search in the quadratic case.
Multistep Methods: Covers multistep methods for solving differential equations, focusing on stability conditions and examples.
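A minimal sketch of one such multistep scheme, the two-step Adams-Bashforth method (the Euler bootstrap and the test problem y' = -y are illustrative choices, not taken from the lecture):

```python
import math

def adams_bashforth2(f, t0, y0, h, n_steps):
    """Two-step Adams-Bashforth; first step bootstrapped with explicit Euler."""
    t, y = t0, y0
    f_prev = f(t, y)
    y = y + h * f_prev           # bootstrap: one Euler step
    t += h
    for _ in range(n_steps - 1):
        f_curr = f(t, y)
        y = y + h * (1.5 * f_curr - 0.5 * f_prev)  # AB2 update
        f_prev = f_curr
        t += h
    return y

# y' = -y, y(0) = 1  →  y(1) ≈ exp(-1)
y1 = adams_bashforth2(lambda t, y: -y, 0.0, 1.0, 0.001, 1000)
```

Unlike one-step schemes, AB2 reuses the previous derivative evaluation, so each step costs a single new evaluation of f.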
Quasi-Newton Optimization: Covers gradient line-search methods and optimization techniques with an emphasis on Wolfe conditions and positive definiteness.
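The ideas named in this entry can be sketched together: a BFGS iteration with a backtracking (Armijo sufficient-decrease) line search, where the curvature check keeps the inverse-Hessian approximation positive definite. This is a generic sketch, not the lecture's code; the quadratic test problem is an illustrative choice.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Quasi-Newton minimization with the BFGS inverse-Hessian update."""
    x = x0.astype(float)
    n = len(x)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # backtracking line search on the Armijo condition
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition: keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# quadratic test: minimize 0.5 x^T A x - b^T x, whose minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
xmin = bfgs(lambda x: 0.5 * x @ A @ x - b @ x,
            lambda x: A @ x - b, np.zeros(2))
```

A full Wolfe line search would also enforce a curvature condition on the step; the `sy > 0` guard above plays the same role of protecting positive definiteness.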
Runge-Kutta Methods: Explains the Runge-Kutta methods, particularly the explicit scheme of order 4 (ERK4), and how to optimize parameters for accuracy.
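The classical ERK4 scheme mentioned here can be sketched as follows (the test problem y' = -y is an illustrative choice, not from the lecture):

```python
import math

def erk4_step(f, t, y, h):
    """One step of the classical explicit Runge-Kutta order-4 scheme."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

def erk4(f, t0, y0, t_end, n_steps):
    """Integrate y' = f(t, y) from t0 to t_end with n_steps ERK4 steps."""
    h = (t_end - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        y = erk4_step(f, t, y, h)
        t += h
    return y

# y' = -y, y(0) = 1  →  y(1) ≈ exp(-1)
y1 = erk4(lambda t, y: -y, 0.0, 1.0, 1.0, 100)
```

The four stage evaluations per step buy a global error of order h^4, which is why halving h reduces the error by roughly a factor of 16.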