Covers linear models, including regression, derivatives, gradients, hyperplanes, and the transition to classification, with a focus on risk minimization and evaluation metrics.
Covers a review of machine learning concepts, including supervised learning, classification vs regression, linear models, kernel functions, support vector machines, dimensionality reduction, deep generative models, and cross-validation.
Covers the basics of linear regression in machine learning, exploring its applications in predicting outcomes like birth weight and analyzing relationships between variables.
Covers regression diagnostics for linear models, emphasizing the importance of checking assumptions and identifying outliers and influential observations.
Covers linear regression, from OLS fundamentals through heteroskedasticity, autocorrelation, instrumental variables, maximum likelihood estimation, and time series analysis, closing with practical advice.
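As a quick illustration of the OLS estimation these summaries refer to, here is a minimal sketch in Python using NumPy; the data is synthetic and the coefficients (intercept 2, slope 3) are hypothetical choices for the example:

```python
import numpy as np

# Synthetic data from a hypothetical linear model: y = 2 + 3x + noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 3.0 * x + rng.normal(0, 1, size=200)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# OLS estimate beta = argmin ||y - X beta||^2, solved by least squares
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta)  # estimated [intercept, slope], close to [2, 3]
```

In practice one would use a library such as statsmodels, which also reports the standard errors and diagnostics (heteroskedasticity, autocorrelation) discussed above, but the closed-form least-squares fit shown here is the core computation.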