Introduces Newton's method for iteratively solving non-linear equations, highlighting its rapid (typically quadratic) convergence but also its potential failure to converge in some cases.
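The iteration described above can be sketched as follows; this is a minimal illustration, not the lecture's own code, and the function names and tolerances are assumptions.

```python
import math

def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: iterate x_{k+1} = x_k - f(x_k) / f'(x_k).

    Converges quadratically near a simple root, but may diverge or
    cycle for a poor starting point or vanishing derivative.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # residual small enough: accept x as a root
            return x
        x = x - fx / fprime(x)  # Newton update step
    return x

# Example: solve x^2 - 2 = 0 starting from x0 = 1 (root is sqrt(2))
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

Starting from `x0 = 1.0`, the iterates approach `sqrt(2) ≈ 1.41421356` within a handful of steps, roughly doubling the number of correct digits per iteration.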
Covers the concept of gradient descent in the scalar case, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
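A minimal scalar sketch of the update rule `x ← x − η f'(x)`; the step size and example function are assumptions for illustration.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Scalar gradient descent: repeatedly step against the gradient.

    lr is the (fixed) step size; too large a value can overshoot
    and diverge, too small a value converges slowly.
    """
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move in the negative-gradient direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
# The minimizer is x* = 3.
xmin = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With `lr=0.1` each step multiplies the error by `0.8`, so 100 steps bring `x` very close to the minimizer at 3.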
Covers the fixed point theorem and the convergence of Newton's method, emphasizing the importance of function choice and derivative behavior for successful iteration.
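Fixed-point iteration `x ← g(x)` can be sketched as below; it converges when `|g'(x*)| < 1` near the fixed point, which is the condition the convergence analysis of Newton's method builds on. The example map `g = cos` is an assumption chosen for illustration.

```python
import math

def fixed_point(g, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{k+1} = g(x_k) until successive values agree to tol.

    Converges when g is a contraction near the fixed point,
    i.e. |g'(x*)| < 1; otherwise the iterates can wander or diverge.
    """
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:  # successive iterates have settled
            return x_new
        x = x_new
    return x

# g(x) = cos(x) satisfies |g'(x)| = |sin(x)| < 1 near its fixed point,
# so the iteration converges to x* ≈ 0.739085 (where cos(x*) = x*).
xstar = fixed_point(math.cos, 1.0)
```

The choice of `g` matters: solving `f(x) = 0` via `x = g(x)` succeeds only if `g` is chosen so its derivative is small at the solution, which is exactly why Newton's method (whose iteration map has zero derivative at a simple root) converges so quickly.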