Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and network architectures like LeNet, AlexNet, and VGG-16.
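To make the back-propagation idea concrete, here is a minimal sketch (illustrative only, not the course's reference code): a one-hidden-layer network with sigmoid activations and squared-error loss, trained on XOR by applying the chain rule layer by layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy task: learn XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights

def forward(X):
    h = sigmoid(X @ W1)        # hidden activations
    return h, sigmoid(h @ W2)  # network output

_, out0 = forward(X)
loss0 = float(np.mean((out0 - y) ** 2))  # loss before training

for _ in range(5000):
    h, out = forward(X)
    # Backward pass: chain rule, from the output layer back to the input.
    d_out = (out - y) * out * (1 - out)   # error at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated to the hidden layer
    W2 -= 0.5 * h.T @ d_out               # gradient-descent update, step size 0.5
    W1 -= 0.5 * X.T @ d_h

_, out = forward(X)
loss = float(np.mean((out - y) ** 2))     # loss after training
```

Architectures like LeNet, AlexNet, and VGG-16 are trained with exactly this procedure, only with convolutional layers, many more parameters, and other loss functions.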
Covers CNNs, RNNs, SVMs, and other supervised learning methods, emphasizing regularization tuning and informed modeling decisions in machine learning.
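A small sketch of what "tuning regularization" means in practice (my own illustration, under the assumption of a ridge-regression setting): fit with several regularization strengths and pick the one with the lowest validation error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic setup: fewer training samples (30) than features (40), so an
# almost-unregularized fit will overfit the noise.
n_train, n_val, d = 30, 100, 40
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n_train, d))
y_train = X_train @ w_true + rng.normal(scale=2.0, size=n_train)
X_val = rng.normal(size=(n_val, d))
y_val = X_val @ w_true + rng.normal(scale=2.0, size=n_val)

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

candidates = [1e-4, 1e-2, 1.0, 10.0, 100.0]
val_err = {}
for lam in candidates:
    w = ridge_fit(X_train, y_train, lam)
    val_err[lam] = float(np.mean((X_val @ w - y_val) ** 2))

best = min(val_err, key=val_err.get)  # lambda with lowest validation error
```

The same select-by-validation loop applies unchanged to dropout rates, weight decay in CNNs, or the `C` parameter of an SVM.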
Covers Convolutional Neural Networks, including layers, training strategies, standard architectures, tasks like semantic segmentation, and deep learning tricks.
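The core operation inside a convolutional layer can be shown in a few lines. This is a hypothetical sketch (not course code): a valid-mode 2D cross-correlation, applied with a hand-written vertical-edge filter to a toy image.

```python
import numpy as np

def conv2d(image, kernel):
    # Valid cross-correlation: slide the kernel over the image and take
    # the elementwise product-and-sum at each position.
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

# A 5x5 image whose right half is bright: it contains one vertical edge.
img = np.zeros((5, 5))
img[:, 2:] = 1.0

# Vertical-edge filter: responds where left and right columns differ.
edge = np.array([[1, 0, -1],
                 [1, 0, -1],
                 [1, 0, -1]], dtype=float)

resp = conv2d(img, edge)  # 3x3 response map, strongest along the edge
```

In a trained CNN the kernel entries are learned parameters rather than a fixed edge filter, and many such filters run in parallel to form one layer's output channels.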
Examines deep learning in high dimensions: data representation, classification performance on high-dimensional data, the curse of dimensionality, and the neural tangent kernel.
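One facet of the curse of dimensionality can be demonstrated numerically (an illustrative sketch of my own, not material from the course): for points sampled uniformly in the unit cube, pairwise distances concentrate around their mean as the dimension grows, so "nearest" and "farthest" neighbors become nearly indistinguishable.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_spread(d, n=200):
    # Sample n points uniformly in [0,1]^d and return the relative spread
    # (std / mean) of all pairwise Euclidean distances.
    X = rng.random((n, d))
    sq = (X ** 2).sum(axis=1)
    # Squared distances via ||x||^2 + ||y||^2 - 2 x.y (avoids a huge tensor).
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d2 = np.maximum(d2, 0.0)                      # clip tiny negative rounding
    dist = np.sqrt(d2[np.triu_indices(n, k=1)])   # upper triangle: each pair once
    return float(dist.std() / dist.mean())

low_dim = distance_spread(2)      # distances vary a lot in 2 dimensions
high_dim = distance_spread(1000)  # distances concentrate in 1000 dimensions
```

This concentration of distances is one reason classical nearest-neighbor reasoning degrades in high dimension, and part of why the representations deep networks learn matter.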