Explores the integration of machine learning into discrete choice models, emphasizing the importance of theory constraints and hybrid modeling approaches.
Introduces Bayesian estimation, covering classical versus Bayesian inference, conjugate priors, MCMC methods, and practical examples such as temperature estimation and choice modeling.
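As a concrete illustration of conjugate priors (a sketch of the standard Beta-Bernoulli example, not code drawn from the lectures themselves): with a Beta(a, b) prior on a success probability, observing Bernoulli data yields a posterior that is again Beta, obtained by simply adding the observed counts to the prior parameters.

```python
def beta_bernoulli_update(a, b, successes, failures):
    """Posterior Beta parameters after observing Bernoulli data.

    Prior Beta(a, b); posterior is Beta(a + successes, b + failures).
    """
    return a + successes, b + failures

# Uniform prior Beta(1, 1); observe 7 successes and 3 failures.
a_post, b_post = beta_bernoulli_update(1, 1, 7, 3)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8 4 0.666...
```

Because the posterior stays in the Beta family, no numerical integration or MCMC is needed; sampling methods become necessary once the prior and likelihood are no longer conjugate.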
Explores the consistency and asymptotic properties of the Maximum Likelihood Estimator (MLE), including the challenges of proving consistency and of constructing MLE-like estimators.
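Consistency can be seen numerically in a simple simulation (an illustrative sketch, not an example from the text): for exponential data with rate λ, the MLE is the reciprocal of the sample mean, and it concentrates around the true rate as the sample size grows.

```python
import random

random.seed(0)
TRUE_RATE = 2.0  # assumed true parameter for this illustration

def mle_rate(samples):
    # MLE of the exponential rate: n / sum(x_i), i.e. 1 / sample mean.
    return len(samples) / sum(samples)

# Estimates tighten around TRUE_RATE as n increases (consistency).
for n in (10, 1000, 100000):
    data = [random.expovariate(TRUE_RATE) for _ in range(n)]
    print(n, mle_rate(data))
```

The simulation shows convergence in probability, not a proof; the analytical difficulties the section discusses arise precisely where such clean closed-form MLEs are unavailable.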
Explores causal discovery using latent variable models, emphasizing the challenges and solutions in inferring causal relationships from non-Gaussian data.
Explores constructing confidence regions, inverting hypothesis tests, and the pivotal method, emphasizing the importance of likelihood methods in statistical inference.
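A minimal sketch of the pivotal method (illustrative only; the numbers and function below are my own, not from the text): for a normal mean with known variance, the pivot Z = (x̄ − μ)/(σ/√n) is N(0, 1) regardless of μ, so inverting P(−1.96 ≤ Z ≤ 1.96) = 0.95 gives a 95% confidence interval for μ.

```python
import math

def normal_mean_ci(xbar, sigma, n, z=1.96):
    """95% CI for a normal mean with known sigma, via the Z pivot."""
    half_width = z * sigma / math.sqrt(n)
    return xbar - half_width, xbar + half_width

# Hypothetical data summary: sample mean 5.0, sigma 2.0, n = 100.
lo, hi = normal_mean_ci(xbar=5.0, sigma=2.0, n=100)
print(lo, hi)  # (4.608, 5.392)
```

The same inversion logic underlies confidence regions built from likelihood-ratio tests: the region is the set of parameter values the test fails to reject.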