Covers information measures such as entropy, Kullback-Leibler divergence, and the data processing inequality, along with probability kernels and mutual information.
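As a minimal sketch of the first two measures, the following computes Shannon entropy and KL divergence for discrete distributions; the function names and example distributions are illustrative, not taken from the source material.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    nz = p > 0  # convention: 0 * log 0 = 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q > 0 wherever p > 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

fair = [0.5, 0.5]      # fair coin: entropy is exactly 1 bit
biased = [0.9, 0.1]    # biased coin: lower entropy, positive KL to fair
```

Note that KL divergence is zero iff the two distributions are identical, and is asymmetric in its arguments.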
Covers kernel density estimation, focusing on bandwidth selection, the curse of dimensionality, the bias-variance tradeoff, and parametric versus nonparametric models.
Explores nonparametric estimation, using kernel density estimators to estimate distribution functions and parameters, and emphasizing bandwidth selection for optimal accuracy.
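The kernel density estimator with a data-driven bandwidth can be sketched as follows; this is an illustrative implementation using a Gaussian kernel and Silverman's rule of thumb, which the source lectures may or may not use.

```python
import numpy as np

def gaussian_kde(samples, x, bandwidth=None):
    """Evaluate a Gaussian kernel density estimate at points x."""
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    if bandwidth is None:
        # Silverman's rule of thumb for a Gaussian kernel (one assumed choice)
        bandwidth = 1.06 * samples.std(ddof=1) * n ** (-1 / 5)
    # Standardized distances between every query point and every sample
    u = (np.asarray(x, dtype=float)[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    # Average of kernel bumps, rescaled so the estimate integrates to 1
    return kernel.sum(axis=1) / (n * bandwidth)

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)
grid = np.linspace(-6.0, 6.0, 1201)
density = gaussian_kde(data, grid)
```

A too-small bandwidth produces a spiky, high-variance estimate; a too-large one oversmooths and biases the result, which is the bias-variance tradeoff the blurb refers to.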
Introduces kernel methods such as support vector machines and kernel regression, covering the margin, the curse of dimensionality, and Gaussian process regression.