Explores Ridge and Lasso regression for regularizing machine learning models, emphasizing hyperparameter tuning and visualization of the learned coefficients.
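A minimal sketch of the Ridge/Lasso comparison, assuming scikit-learn and a synthetic dataset (the data, alpha grid, and feature counts here are illustrative, not from the session itself):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

# Synthetic data: 10 features, only 3 truly informative.
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

# Sweep the regularization strength alpha and record the coefficient paths,
# which is what one would plot to visualize shrinkage.
alphas = np.logspace(-2, 2, 20)
ridge_coefs = np.array([Ridge(alpha=a).fit(X, y).coef_ for a in alphas])
lasso_coefs = np.array([Lasso(alpha=a).fit(X, y).coef_ for a in alphas])

# At large alpha, Lasso drives uninformative coefficients exactly to zero,
# while Ridge only shrinks them toward zero.
n_zeroed = int((np.abs(lasso_coefs[-1]) < 1e-8).sum())
print(n_zeroed)
```

Plotting each coefficient against `alphas` (e.g. with matplotlib) gives the familiar regularization-path figure: Ridge paths shrink smoothly, Lasso paths hit zero.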
Examines overfitting, cross-validation, and regularization in machine learning, emphasizing how model complexity interacts with the choice of regularization strength.
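The cross-validation workflow for choosing a regularization strength can be sketched as follows; this assumes scikit-learn, and the dataset, fold count, and alpha grid are placeholders rather than the session's actual setup:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# More features than is comfortable for 60 samples, so regularization matters.
X, y = make_regression(n_samples=60, n_features=30, n_informative=5,
                       noise=10.0, random_state=1)

# Mean 5-fold CV score (R^2) for each candidate regularization strength.
alphas = np.logspace(-3, 3, 13)
cv_scores = [cross_val_score(Ridge(alpha=a), X, y, cv=5).mean()
             for a in alphas]

# Pick the alpha with the best held-out performance.
best_alpha = float(alphas[int(np.argmax(cv_scores))])
print(best_alpha)
```

The same loop works for Lasso or any estimator with an `alpha` knob; `RidgeCV`/`LassoCV` wrap this pattern in a single fitted object.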
Delves into the bias-variance trade-off and error decomposition, relating model flexibility to polynomial regression, KNN, and the curse of dimensionality.
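The flexibility trade-off can be made concrete with polynomial regression of increasing degree; a small sketch, assuming scikit-learn and an invented sine-wave dataset (degrees and noise level are illustrative):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 30))[:, None]
y = np.sin(x).ravel() + rng.normal(0.0, 0.3, 30)

# Fit polynomials of increasing degree and record the training error.
train_err = {}
for degree in (1, 3, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x, y)
    train_err[degree] = mean_squared_error(y, model.predict(x))

# Training error keeps falling as flexibility grows (lower bias), but the
# degree-15 fit chases noise: its held-out error would be worse (higher
# variance), which is the bias-variance trade-off in miniature.
print(train_err)
```

The same experiment with KNN (varying `n_neighbors`) shows the mirror image: small k is flexible and high-variance, large k is smooth and high-bias.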
Reviews core machine learning concepts, including supervised learning, classification vs. regression, linear models, kernel functions, support vector machines, dimensionality reduction, deep generative models, and cross-validation.