Covers PCA and LDA for dimensionality reduction, explaining variance maximization, the associated eigenvector problem, and the advantages of Kernel PCA for nonlinear data.
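The variance-maximization view of PCA can be sketched in a few lines: center the data, eigendecompose the covariance matrix, and project onto the leading eigenvector. This is a minimal NumPy illustration on hypothetical synthetic data, not an implementation from the section itself.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2-D data with one dominant direction of variance
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# Center the data, then eigendecompose the sample covariance matrix
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # sort descending: largest variance first
components = eigvecs[:, order]

# Project onto the top principal component (the maximum-variance direction);
# the variance of the projection equals the top eigenvalue.
Z = Xc @ components[:, :1]
print(Z.shape)
```

The variance of the projected data `Z` equals the largest eigenvalue of the covariance matrix, which is exactly the "variance maximization as an eigenvector problem" connection the section describes.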
Delves into the spectral bias of polynomial neural networks, analyzing how this bias affects the rate at which different frequencies are learned and discussing experimental results.
Explores non-linear SVMs that use kernel functions to separate data in higher-dimensional spaces, applying the kernel trick during training to avoid computing explicit feature transformations.
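The kernel trick mentioned above can be verified directly for a degree-2 polynomial kernel: the kernel value equals the inner product of explicitly mapped feature vectors, so training never needs to construct those features. A minimal sketch (the feature map shown is for 2-D inputs only, for comparison purposes):

```python
import numpy as np

def poly_kernel(x, y, degree=2):
    """Polynomial kernel: implicit inner product in a higher-dim space."""
    return (x @ y + 1.0) ** degree

def explicit_map(x):
    """Explicit degree-2 feature map for 2-D input (for comparison only)."""
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     x1 * x1, x2 * x2,
                     np.sqrt(2) * x1 * x2])

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])

# The kernel evaluates the 6-dimensional inner product without ever
# forming the 6-dimensional vectors.
k = poly_kernel(x, y)
e = explicit_map(x) @ explicit_map(y)
print(np.isclose(k, e))  # True
```

For an RBF kernel the implicit feature space is infinite-dimensional, so the kernel evaluation is not just a shortcut but the only feasible computation.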