Series acceleration
Related lectures (32)
Numerical Differentiation: Richardson Extrapolation
Covers Richardson extrapolation for reducing the error of numerical differentiation.
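The technique this lecture names can be illustrated briefly. A minimal sketch (not taken from the lecture itself): combining central-difference estimates at step sizes h and h/2 cancels the leading O(h^2) error term, giving an O(h^4) approximation of the derivative.

```python
import math

def central_diff(f, x, h):
    # Second-order central difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson_diff(f, x, h):
    # Richardson extrapolation: combine the h and h/2 estimates to
    # cancel the O(h^2) error term, leaving an O(h^4) approximation.
    d_h = central_diff(f, x, h)
    d_h2 = central_diff(f, x, h / 2)
    return (4 * d_h2 - d_h) / 3

# Example: f = sin, f'(0) = cos(0) = 1.
approx = richardson_diff(math.sin, 0.0, 0.1)
```

With h = 0.1, the extrapolated estimate is several orders of magnitude closer to the true derivative than either central difference alone.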
Numerical Integration: Simpson's Rule
Introduces Simpson's rule for numerical integration and Richardson extrapolation for accuracy improvement.
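Simpson's rule, which this lecture introduces, fits in a few lines. A minimal sketch (an illustration, not the lecture's own code): the composite rule weights interior nodes 4 and 2 alternately and is exact for polynomials up to degree three.

```python
def simpson(f, a, b, n):
    # Composite Simpson's rule on [a, b] with n subintervals (n even).
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        # Odd-indexed nodes get weight 4, even-indexed nodes weight 2.
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3
```

For instance, integrating x^3 over [0, 1] with only 4 subintervals already reproduces the exact value 1/4, since the rule is exact for cubics.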
Strong Convexity and Convergence Rates
Explores strong convexity's role in faster convergence rates for optimization algorithms.
Generalized Optimistic Methods for Convex-Concave Saddle Point Problems
Presents a generalized optimistic framework for solving convex-concave saddle point problems, extending to higher-order methods.
Stochastic Differential Equations: Mean-Field Inference
Explores inference for stochastic differential equations, focusing on numerical methods and convergence analysis.
Computation, Verification and Validation
Explains verification and validation in CFD simulations, focusing on Richardson extrapolation and accuracy checking.
Markov Chains: Convergence and Spectral Gap
Explores Markov chain convergence, spectral gap, and acceleration techniques for faster convergence.
Optimality of Convergence Rate: Acceleration in Gradient Descent
Explores the optimality of convergence rate in gradient descent and acceleration techniques for convex and non-convex problems.
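The acceleration technique referenced here can be sketched concretely. A minimal example, assuming the standard Nesterov momentum schedule (this is an illustration, not the lecture's own material): for an L-smooth convex objective, the accelerated method attains an O(1/k^2) error bound versus O(1/k) for plain gradient descent.

```python
def nesterov(grad, x0, L, steps):
    # Nesterov's accelerated gradient method for an L-smooth convex f.
    # Gradient steps are taken at an extrapolated point y, not at x.
    x = y = x0
    t = 1.0
    for _ in range(steps):
        x_new = y - grad(y) / L          # gradient step with step size 1/L
        t_new = (1 + (1 + 4 * t * t) ** 0.5) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Example: minimize f(x) = x^2 / 2 (gradient x), with a conservative
# smoothness estimate L = 4.
result = nesterov(lambda x: x, 10.0, 4.0, 200)
```

After k steps the standard guarantee is f(x_k) - f* <= 2 L ||x0 - x*||^2 / (k+1)^2, so the iterate above is driven close to the minimizer at 0.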
Accelerated Algorithms for Monotone Inclusions
Quoc Tran-Dinh presents accelerated algorithms for monotone inclusions, covering optimization models, challenges, and new algorithms.
Convergence Methods: Corrections and Precision
Discusses corrections to old videos on convergence methods and precision.
Error Estimation and Numerical Integration
Explores error estimation in numerical integration and its applications in forecasting, emphasizing the Romberg method and Richardson extrapolation.
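The Romberg method mentioned in this lecture combines the two ideas recurring in this list: trapezoidal integration refined by repeated Richardson extrapolation. A minimal sketch (an illustration, not the lecture's code):

```python
import math

def romberg(f, a, b, levels):
    # Romberg integration: column 0 holds trapezoidal estimates with
    # 1, 2, 4, ... subintervals; each further column is a Richardson
    # extrapolation of the previous one.
    R = [[0.0] * levels for _ in range(levels)]
    h = b - a
    R[0][0] = h * (f(a) + f(b)) / 2
    for i in range(1, levels):
        h /= 2
        # Refine the trapezoid estimate, reusing the previous sum and
        # adding only the new midpoints.
        s = sum(f(a + (2 * k - 1) * h) for k in range(1, 2 ** (i - 1) + 1))
        R[i][0] = R[i - 1][0] / 2 + h * s
        for j in range(1, i + 1):
            R[i][j] = R[i][j - 1] + (R[i][j - 1] - R[i - 1][j - 1]) / (4 ** j - 1)
    return R[levels - 1][levels - 1]

# Example: integral of sin over [0, pi] equals 2.
estimate = romberg(math.sin, 0.0, math.pi, 5)
```

With only 5 levels (16 subintervals at the finest), the estimate agrees with the exact value 2 to high accuracy, far beyond what the plain trapezoidal rule achieves at that resolution.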
Numerical Analysis: Nonlinear Equations
Explores the numerical analysis of nonlinear equations, focusing on convergence criteria and methods like bisection and fixed-point iteration.
Gradient Descent: Early Stopping and Stochastic Gradient Descent
Explains gradient descent with early stopping and stochastic gradient descent to optimize model training and prevent overfitting.
Taylor Series: Convergence and Applications
Explores Taylor series development, convergence criteria, and numerical applications.
Momentum methods and nonlinear CG
Explores gradient descent with memory, momentum methods, conjugate gradients, and nonlinear CG on manifolds.
Wave Equation: Numerical Methods
Explores numerical methods for solving the wave equation, discussing stability conditions and convergence rates.
Convergence of Adaptive Langevin using hypocoercivity
Covers the convergence of Adaptive Langevin dynamics using hypocoercive techniques and explores the Central Limit Theorem.
Newton's Method: Convergence and Quadratic Convergence
Explores the convergence of Newton's method and its quadratic convergence property (Theorem 8.4).
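Newton's method itself is short enough to sketch. A minimal illustration (not the lecture's material): near a simple root the error is roughly squared at each step, so the number of correct digits doubles per iteration.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    # Newton iteration: x_{k+1} = x_k - f(x_k) / f'(x_k).
    # Near a simple root, convergence is quadratic.
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

# Example: square root of 2 as the root of f(x) = x^2 - 2.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Starting from x0 = 1, the iterates 1.5, 1.41667, 1.4142157, ... reach machine precision in about five steps, consistent with quadratic convergence.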
Value Iteration Acceleration: PID and Operator Splitting
Explores accelerating the Value Iteration algorithm using control theory and matrix splitting techniques to achieve faster convergence.
Numerical Integration: Lagrange Interpolation, Simpson Rules
Explains Lagrange interpolation for numerical integration and introduces Simpson's rules.