Explores linear regression from a statistical inference perspective, covering probabilistic models, ground truth, labels, and maximum likelihood estimators.
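A minimal sketch of the idea behind this section, assuming numpy is available: under a linear model with Gaussian noise, the maximum likelihood estimator of the coefficients coincides with the least-squares solution. The data sizes, seed, and true parameters below are illustrative, not from the source.

```python
import numpy as np

# Illustrative setup: y = X @ beta + Gaussian noise.
# Maximizing the Gaussian likelihood over beta is equivalent to
# minimizing the sum of squared residuals, so the MLE has the
# closed form (X'X)^{-1} X'y.
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
beta_true = np.array([1.5, -2.0, 0.5])       # "ground truth" parameters
y = X @ beta_true + rng.normal(scale=0.3, size=n)   # observed labels

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # MLE = least squares

# The MLE of the noise variance divides by n (not the unbiased n - d).
sigma2_hat = np.mean((y - X @ beta_hat) ** 2)
print(beta_hat, sigma2_hat)
```

With enough data, `beta_hat` lands close to `beta_true` and `sigma2_hat` close to the true noise variance (0.09 here).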
Covers Likelihood Ratio Tests, their optimality, and extensions in hypothesis testing, including Wilks' Theorem and their relationship with Confidence Intervals.
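A toy illustration of a likelihood ratio test, assuming numpy: for i.i.d. N(mu, 1) data we test H0: mu = 0 against an unrestricted mu. By Wilks' Theorem, 2 log(LR) is asymptotically chi-squared with one degree of freedom under H0; the sample size, seed, and true mean below are illustrative.

```python
import numpy as np

# i.i.d. N(mu, 1) data; true mu = 0.2, so H0: mu = 0 is false.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(loc=0.2, size=n)

mu_hat = x.mean()  # unrestricted MLE

# For N(mu, 1), loglik(mu) = -0.5 * sum((x - mu)^2) + const, so
# 2 log(LR) = sum(x^2) - sum((x - mu_hat)^2)  (= n * mu_hat^2).
lr_stat = np.sum(x ** 2) - np.sum((x - mu_hat) ** 2)

# 3.841 is the 95% quantile of chi-squared with 1 degree of freedom.
reject = lr_stat > 3.841
print(lr_stat, reject)
```

Here the statistic is large and H0 is rejected, as expected since the data were generated with a nonzero mean.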
Explores constructing confidence regions, inverting hypothesis tests, and the pivotal method, emphasizing the importance of likelihood methods in statistical inference.
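A short sketch of the pivotal method, assuming numpy: for i.i.d. N(mu, 1) data, sqrt(n) * (xbar - mu) is a pivot whose N(0, 1) distribution does not depend on mu, so inverting P(|pivot| <= 1.96) = 0.95 yields a 95% confidence interval. The sample size, seed, and true mean are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
mu_true = 1.0
x = rng.normal(loc=mu_true, size=n)

xbar = x.mean()
# 1.96 is the 97.5% quantile of the standard normal.
half_width = 1.96 / np.sqrt(n)
ci = (xbar - half_width, xbar + half_width)

# Equivalently (test inversion): the interval is the set of mu0 that a
# level-0.05 two-sided test of H0: mu = mu0 would NOT reject.
print(ci)
```

The same interval can thus be read either as an inverted family of tests or as a pivot-based construction.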
Explores the consistency and asymptotic properties of the Maximum Likelihood Estimator, including challenges in proving its consistency and constructing MLE-like estimators.
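A small simulation, assuming numpy, illustrating what consistency means in practice: for Exponential data with rate parameter lambda, the MLE is 1 / xbar, and its error shrinks as the sample size grows. The rate, seed, and sample sizes below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
rate_true = 2.0

errors = []
for n in (100, 10_000):
    # numpy parameterizes the exponential by scale = 1 / rate.
    x = rng.exponential(scale=1 / rate_true, size=n)
    rate_hat = 1.0 / x.mean()          # MLE of the rate parameter
    errors.append(abs(rate_hat - rate_true))

print(errors)  # the error typically shrinks as n grows
```

Consistency guarantees this shrinkage in probability; proving it in general (and building MLE-like estimators when the exact MLE is intractable) is where the section's technical challenges lie.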
Covers the Likelihood Ratio Test in choice models, comparing restricted and unrestricted models and benchmarking alternative model specifications against each other.
Explores Gaussian Mixture Models for data classification, focusing on denoising signals and recovering the original data via maximum likelihood and maximum a posteriori approaches.
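A toy denoising sketch in the spirit of this section, assuming numpy: the signal takes values -1 or +1 with equal probability (a two-component mixture prior) and is observed through additive Gaussian noise. The MAP estimate is sign(y), and the posterior mean is tanh(y / sigma^2); the noise level, seed, and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 0.5
n = 1000
s = rng.choice([-1.0, 1.0], size=n)           # hidden signal
y = s + rng.normal(scale=sigma, size=n)       # noisy observations

# Posterior: P(s = +1 | y) is a logistic function of y, giving
# E[s | y] = tanh(y / sigma^2) and MAP estimate sign(y).
s_map = np.sign(y)                 # maximum a posteriori estimate
s_pm = np.tanh(y / sigma ** 2)     # posterior mean (minimizes squared error)

map_error_rate = np.mean(s_map != s)
print(map_error_rate)
```

The MAP rule minimizes the misclassification rate, while the posterior mean minimizes mean squared error; the two estimates coincide in sign but differ in how confident they are near y = 0.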