Explores Ridge and Lasso Regression for regularization in machine learning models, emphasizing hyperparameter tuning and visualization of parameter coefficients.
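The shrinkage behavior mentioned above can be sketched with ridge's closed-form solution; this is a minimal NumPy illustration on synthetic data (the data, the `ridge_fit` helper, and the penalty values are illustrative assumptions, not from the source):

```python
import numpy as np

# Synthetic data (illustrative): 5 features, two of which are irrelevant.
rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
true_w = np.array([3.0, 0.0, -2.0, 0.0, 1.0])
y = X @ true_w + 0.1 * rng.normal(size=n)

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam*I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Coefficient norms shrink monotonically as the penalty lambda grows,
# which is what a coefficient-path plot would visualize.
norms = [np.linalg.norm(ridge_fit(X, y, lam)) for lam in (0.0, 1.0, 100.0)]
```

Sweeping `lam` over a grid and plotting each coefficient against it reproduces the coefficient-path visualization the summary refers to.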
Introduces Lasso regularization and its application to the MNIST dataset, emphasizing feature selection and practical exercises on gradient descent implementation.
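A gradient-descent-style Lasso solver can be sketched with proximal gradient descent (ISTA); this uses a small synthetic stand-in rather than MNIST, and the helper names and penalty value are assumptions for illustration:

```python
import numpy as np

# Synthetic stand-in for a real dataset: two features are truly irrelevant.
rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
true_w = np.array([3.0, 0.0, -2.0, 0.0, 1.0])
y = X @ true_w + 0.1 * rng.normal(size=n)

def soft_threshold(v, t):
    """Proximal operator of the L1 norm: shrink toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """ISTA on (1/2n)||Xw - y||^2 + lam*||w||_1."""
    n = X.shape[0]
    w = np.zeros(X.shape[1])
    step = n / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n     # gradient of the smooth loss
        w = soft_threshold(w - step * grad, lam * step)
    return w

w_lasso = lasso_ista(X, y, lam=0.2)
```

The soft-thresholding step is what drives coefficients of irrelevant features exactly to zero, giving the feature selection highlighted in the summary.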
Explores overfitting, regularization, and cross-validation in machine learning, emphasizing the role of model complexity and comparing different cross-validation methods.
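The interplay of model complexity and cross-validation can be illustrated with k-fold selection of a polynomial degree; this NumPy sketch uses assumed synthetic data and a hypothetical `cv_mse` helper:

```python
import numpy as np

# Synthetic regression problem (illustrative): noisy samples of sin(pi*x).
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 60)
y = np.sin(np.pi * x) + 0.2 * rng.normal(size=60)

def cv_mse(degree, k=5):
    """Mean held-out MSE of a degree-`degree` polynomial fit over k folds."""
    idx = np.arange(len(x))  # deterministic split; x is already random
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        coef = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coef, x[test])
        errs.append(np.mean((pred - y[test]) ** 2))
    return float(np.mean(errs))

# A constant model underfits; a cubic tracks sin(pi*x) much more closely.
err_const, err_cubic = cv_mse(0), cv_mse(3)
```

Scanning `cv_mse` over a range of degrees traces the classic U-shaped validation curve: error falls as complexity grows, then rises again once the model starts fitting noise.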
Explores supervised learning in financial econometrics, covering linear regression, model fitting, potential problems, basis functions, subset selection, cross-validation, regularization, and random forests.
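Of the topics listed, subset selection lends itself to a compact sketch; this is greedy forward stepwise selection in NumPy (the synthetic data and `forward_select` helper are assumptions for illustration):

```python
import numpy as np

# Synthetic data (illustrative): only features 0 and 3 matter.
rng = np.random.default_rng(3)
n, d = 120, 6
X = rng.normal(size=(n, d))
true_w = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0])
y = X @ true_w + 0.1 * rng.normal(size=n)

def forward_select(X, y, k):
    """Greedily add the feature that most reduces the residual sum of squares."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            w = np.linalg.lstsq(X[:, cols], y, rcond=None)[0]
            r = y - X[:, cols] @ w
            rss = float(r @ r)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

chosen = forward_select(X, y, k=2)
```

In practice the subset size `k` would itself be chosen by cross-validation, tying this back to the other methods the summary lists.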
Explores Stochastic Gradient Descent with Averaging, compares it with plain Gradient Descent, and discusses challenges in non-convex optimization and sparse recovery techniques.
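SGD with averaging can be sketched as constant-step SGD on least squares with a Polyak-Ruppert running average of the iterates; the data, step size, and `sgd_averaged` helper below are illustrative assumptions:

```python
import numpy as np

# Synthetic least-squares problem (illustrative).
rng = np.random.default_rng(2)
n, d = 500, 3
X = rng.normal(size=(n, d))
w_star = np.array([1.0, -2.0, 0.5])
y = X @ w_star + 0.5 * rng.normal(size=n)

def sgd_averaged(X, y, lr=0.05, n_epochs=5):
    """Constant-step SGD with a Polyak-Ruppert running average of iterates."""
    w = np.zeros(X.shape[1])
    w_avg = np.zeros_like(w)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(X.shape[0]):
            grad = (X[i] @ w - y[i]) * X[i]  # single-sample gradient
            w -= lr * grad
            t += 1
            w_avg += (w - w_avg) / t         # running mean of all iterates
    return w, w_avg

w_last, w_avg = sgd_averaged(X, y)
err_last = np.linalg.norm(w_last - w_star)
err_avg = np.linalg.norm(w_avg - w_star)
```

With a constant step size the last iterate keeps fluctuating in a noise ball around the optimum, while the averaged iterate smooths out that fluctuation, which is the usual motivation for iterate averaging.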