Covers optimization in machine learning, focusing on gradient descent for linear and logistic regression, stochastic gradient descent, and practical considerations.
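Since this summary names gradient descent for linear regression and its stochastic variant, a minimal self-contained sketch may help; all function names, hyperparameters, and data here are illustrative, not taken from the source.

```python
import random

def gradient_descent(xs, ys, lr=0.05, epochs=1000):
    """Fit y ~ w*x + b by batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE with respect to w and b, averaged over all data.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def sgd(xs, ys, lr=0.01, epochs=500, seed=0):
    """Same model, but update on one randomly chosen example at a time."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            err = w * xs[i] + b - ys[i]
            w -= lr * 2 * err * xs[i]
            b -= lr * 2 * err
    return w, b

# Toy data generated exactly from y = 2x + 1; both variants should recover it.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
```

The batch version uses the exact gradient of the full objective each step, while SGD trades gradient accuracy for cheap per-example updates, which is the usual practical consideration at scale.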
Covers the history and fundamental concepts of neural networks, including the mathematical model of a neuron, gradient descent, and the multilayer perceptron.
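The mathematical model of a neuron mentioned here is a weighted sum passed through an activation function; a minimal sketch (the sigmoid choice and the AND-gate weights below are illustrative assumptions, not from the source):

```python
import math

def neuron(inputs, weights, bias):
    """Classic neuron model: weighted sum of inputs plus a bias,
    squashed by a sigmoid activation into the range (0, 1)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With large positive weights and a bias of -1.5 "units", the neuron
# behaves like a soft AND gate: output near 1 only when both inputs fire.
and_like = lambda a, b: neuron([a, b], [10.0, 10.0], -15.0)
```

Stacking layers of such units gives the multilayer perceptron the summary refers to.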
Explores the history, models, training, convergence, and limitations of neural networks, including the backpropagation algorithm and the universal approximation theorem.
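Backpropagation computes gradients of the loss by applying the chain rule layer by layer. A minimal sketch for a one-hidden-layer network, verified against finite differences; the network shape, parameter names, and numbers are illustrative assumptions, not from the source:

```python
import math

def net(x, p):
    """Forward pass: 1 input -> 2 tanh hidden units -> 1 linear output."""
    w1a, w1b, b1a, b1b, w2a, w2b, b2 = p
    ha = math.tanh(w1a * x + b1a)
    hb = math.tanh(w1b * x + b1b)
    return w2a * ha + w2b * hb + b2

def backprop(x, y, p):
    """Gradient of loss = (net(x) - y)^2 w.r.t. each parameter in p,
    obtained by propagating dL/dout backward through the chain rule."""
    w1a, w1b, b1a, b1b, w2a, w2b, b2 = p
    ha = math.tanh(w1a * x + b1a)
    hb = math.tanh(w1b * x + b1b)
    out = w2a * ha + w2b * hb + b2
    dout = 2 * (out - y)          # dL/d(output)
    dha = dout * w2a              # chain through the output weights
    dhb = dout * w2b
    dza = dha * (1 - ha * ha)     # tanh'(z) = 1 - tanh(z)^2
    dzb = dhb * (1 - hb * hb)
    # Same ordering as p: [w1a, w1b, b1a, b1b, w2a, w2b, b2]
    return [dza * x, dzb * x, dza, dzb, dout * ha, dout * hb, dout]

def numeric_grad(x, y, p, eps=1e-6):
    """Finite-difference check: perturb each parameter and re-run the net."""
    g = []
    for i in range(len(p)):
        q = list(p); q[i] += eps
        r = list(p); r[i] -= eps
        g.append(((net(x, q) - y) ** 2 - (net(x, r) - y) ** 2) / (2 * eps))
    return g

p = [0.5, -0.3, 0.1, 0.2, 0.8, -0.6, 0.05]
analytic = backprop(1.3, 0.7, p)
numeric = numeric_grad(1.3, 0.7, p)
```

Checking the analytic gradients against finite differences is the standard sanity test for a backpropagation implementation before using those gradients for training.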