Lecture
Intro to Quantum Sensing: Parameter Estimation and Fisher Information
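As a pointer to the central ideas named in this lecture and the related ones below (Fisher information, the Cramér-Rao bound, maximum likelihood), here is a minimal sketch in the classical setting — not material from the lecture itself. It assumes a Bernoulli(p) model, where the Fisher information per observation is I(p) = 1/(p(1-p)), and checks numerically that the variance of the MLE (the sample mean, which is unbiased here) sits near the Cramér-Rao lower bound 1/(n·I(p)):

```python
import random

def fisher_info_bernoulli(p):
    # Fisher information of a single Bernoulli(p) observation: I(p) = 1 / (p(1-p))
    return 1.0 / (p * (1.0 - p))

def empirical_mle_variance(p, n, trials, seed=0):
    # Empirical variance of the MLE (sample mean) across repeated experiments
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        xs = [1 if rng.random() < p else 0 for _ in range(n)]
        estimates.append(sum(xs) / n)
    mean = sum(estimates) / trials
    return sum((e - mean) ** 2 for e in estimates) / trials

p, n = 0.3, 1000
crb = 1.0 / (n * fisher_info_bernoulli(p))  # Cramér-Rao bound: p(1-p)/n
emp = empirical_mle_variance(p, n, trials=2000)
print(f"CRB = {crb:.6f}, empirical MLE variance = {emp:.6f}")
```

Since the sample mean is unbiased and efficient for this model, the empirical variance matches the bound up to Monte Carlo error; quantum sensing generalizes this picture by maximizing Fisher information over measurement choices.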
Related lectures (32)
Basic Principles of Point Estimation
Explores the method of moments, the bias-variance tradeoff, consistency, the plug-in principle, and the likelihood principle in point estimation.
Estimation and Confidence Intervals
Explores bias, variance, and confidence intervals in parameter estimation using examples and distributions.
Parameter Estimation: Detection & Estimation
Covers the concepts of parameter estimation, including unbiased estimators and Fisher information.
Statistical Models and Parameter Estimation
Explores statistical models, parameter estimation, and sampling distributions in probability and statistics.
Sampling Distributions: Estimators and Variance
Covers estimation of parameters, MSE, Fisher information, and the Rao-Blackwell Theorem.
Estimators and Confidence Intervals
Explores bias, variance, unbiased estimators, and confidence intervals in statistical estimation.
Confidence Intervals: Gaussian Estimation
Explores confidence intervals, Gaussian estimation, Cramér-Rao inequality, and Maximum Likelihood Estimators.
Estimators and Bias
Explores estimators, bias, and efficiency in statistics, emphasizing the trade-off between bias and variability.
Statistical Theory: Cramér-Rao Bound & Hypothesis Testing
Explores the Cramér-Rao bound, hypothesis testing, and optimality in statistical theory.
Optimality in Decision Theory: Unbiased Estimation
Explores optimality in decision theory and unbiased estimation, emphasizing sufficiency, completeness, and lower bounds for risk.
Maximum Likelihood: Estimation and Inference
Introduces maximum likelihood estimation, discussing its properties and applications in statistical analysis.
Statistical Estimation
Explores statistical estimation, comparing estimators based on mean and variance, and delving into mean squared error and Cramér-Rao bound.
Bias, Variance, Consistency, MLE
Covers bias, variance, mean squared error, consistency, and maximum likelihood estimation in the Poisson model.
Probabilistic Models for Linear Regression
Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.
Estimation: Linear Estimator
Explores linear estimation, optimality criteria, and the orthogonality principle for choosing good estimators.
Sampling Distributions: Estimation
Explores sampling distributions, estimation methods, and consistency in parameter estimation.
Fisher Information, Cramér-Rao Inequality, MLE
Explains Fisher information, Cramér-Rao inequality, and MLE properties, including invariance and asymptotics.
Estimation: Measures of Performance
Explores estimation measures of performance, including the Cramér-Rao bound and maximum likelihood estimation.
The Stein Phenomenon and Superefficiency
Explores the Stein Phenomenon, showcasing the benefits of bias in high-dimensional statistics and the superiority of the James-Stein Estimator over the Maximum Likelihood Estimator.
Sampling Distributions: Theory and Applications
Explores sampling distributions, estimators' properties, and statistical measures for data science applications.