Covers the history and fundamental concepts of neural networks, including the mathematical model of a neuron, gradient descent, and the multilayer perceptron.
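Since the section introduces gradient descent, a minimal sketch may help fix the idea; the quadratic objective, learning rate, and step count below are illustrative assumptions, not from the source.

```python
# Minimal sketch: gradient descent minimizing f(w) = (w - 3)^2,
# illustrating the update rule w <- w - lr * f'(w).
# Objective, lr, and steps are illustrative choices.

def gradient_descent(lr=0.1, steps=100, w=0.0):
    for _ in range(steps):
        grad = 2 * (w - 3.0)  # derivative of (w - 3)^2
        w -= lr * grad        # step against the gradient
    return w

print(gradient_descent())  # converges toward the minimum at w = 3
```

The same update rule extends to a neuron's weights, with the gradient computed from the loss on training data.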
Explores learning from interconnected data with graphs, covering modern ML research goals, pioneering methods, interdisciplinary applications, and democratization of graph ML.
Delves into the trade-off between model flexibility and generalization through the bias-variance decomposition of error, illustrated with polynomial regression, KNN, and the curse of dimensionality.
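To make the KNN method mentioned above concrete, here is a minimal sketch of k-nearest-neighbors classification; the toy 2-D dataset and the choice k=3 are illustrative assumptions.

```python
# Minimal KNN classifier sketch: predict the majority label among
# the k training points closest to the query (Euclidean distance).
from collections import Counter
import math

def knn_predict(train, query, k=3):
    # train: list of ((x, y), label) pairs
    by_dist = sorted(train, key=lambda p: math.dist(p[0], query))
    labels = [label for _, label in by_dist[:k]]
    return Counter(labels).most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # nearest neighbors are all "a"
```

Small k yields a flexible, low-bias/high-variance fit; large k smooths predictions, trading variance for bias — the decomposition the section discusses.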