Covers regression diagnostics for linear models, emphasizing the importance of checking assumptions and identifying outliers and influential observations.
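The diagnostics above can be sketched numerically: a minimal numpy example (synthetic data, not from the lecture) that computes leverage from the hat matrix and Cook's distance to flag an influential observation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(30), rng.normal(size=30)])  # intercept + one feature
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=30)
y[0] += 10.0  # inject an outlier at index 0

# Hat matrix H = X (X^T X)^{-1} X^T; its diagonal gives leverage
H = X @ np.linalg.inv(X.T @ X) @ X.T
leverage = np.diag(H)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
p = X.shape[1]
mse = resid @ resid / (len(y) - p)

# Cook's distance: how much each point influences the fitted coefficients
cooks = resid**2 / (p * mse) * leverage / (1 - leverage) ** 2
print(int(np.argmax(cooks)))  # the injected outlier should dominate
```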
Reviews machine learning concepts, including supervised learning, classification versus regression, linear models, kernel functions, support vector machines, dimensionality reduction, deep generative models, and cross-validation.
Covers linear models, including regression, derivatives, gradients, hyperplanes, and the transition from regression to classification, with a focus on risk minimization and evaluation metrics.
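Risk minimization for a linear model can be made concrete: a small sketch (synthetic data, assumed setup) where the empirical risk R(w) = (1/n)||Xw - y||² is minimized by solving the normal equations, and the gradient at the minimizer is checked to vanish.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([0.5, -1.0, 2.0])
y = X @ w_true + rng.normal(scale=0.1, size=100)

# Minimizer of the empirical risk solves X^T X w = X^T y
w_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient of the risk, (2/n) X^T (Xw - y), is numerically zero at w_hat
grad = 2 / len(y) * X.T @ (X @ w_hat - y)
print(np.allclose(grad, 0, atol=1e-10))
```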
Explores Ridge and Lasso Regression for regularization in machine learning models, emphasizing hyperparameter tuning and visualization of parameter coefficients.
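Ridge regression has a closed form that makes the role of the regularization hyperparameter visible: a minimal numpy sketch (synthetic data, hypothetical alpha values) showing that larger alpha shrinks the coefficient norm.

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: w = (X^T X + alpha I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))
y = X @ np.array([3.0, -2.0, 0.0, 1.0]) + rng.normal(scale=0.1, size=50)

# Larger alpha pulls the coefficients toward zero
for alpha in (0.0, 1.0, 100.0):
    w = ridge_fit(X, y, alpha)
    print(alpha, round(float(np.linalg.norm(w)), 3))
```

Lasso has no closed form (its L1 penalty is non-smooth), which is one reason the subgradient and coordinate-descent machinery from the optimization lecture matters.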
Explores optimization methods such as gradient descent and subgradient methods for training machine learning models, including adaptive techniques like Adam.
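The two optimizer families can be compared on a toy least-squares problem: a hedged sketch (synthetic data; hyperparameters are illustrative choices, not the lecture's) of plain gradient descent next to Adam with bias-corrected moment estimates.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.5, -0.5]) + rng.normal(scale=0.1, size=200)

def grad(w):
    # Gradient of the mean squared error for a linear model
    return 2 / len(y) * X.T @ (X @ w - y)

# Plain gradient descent with a fixed step size
w = np.zeros(2)
for _ in range(500):
    w -= 0.1 * grad(w)

# Adam: first/second moment estimates with bias correction
w_adam, m, v = np.zeros(2), np.zeros(2), np.zeros(2)
beta1, beta2, lr, eps = 0.9, 0.999, 0.02, 1e-8
for t in range(1, 2001):
    g = grad(w_adam)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w_adam -= lr * m_hat / (np.sqrt(v_hat) + eps)

print(np.round(w, 2), np.round(w_adam, 2))  # both near the true weights
```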