Cross-covariance matrix
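The page itself gives no definition, so as a minimal illustration: the cross-covariance matrix of two column-sampled data sets X (n x p) and Y (n x q) is the p x q matrix of covariances between each column of X and each column of Y. A short NumPy sketch (the function name and data are illustrative, not from the page):

```python
import numpy as np

def cross_covariance(X, Y):
    """Sample cross-covariance between columns of X (n x p) and Y (n x q).

    Returns a p x q matrix whose (i, j) entry is Cov(X[:, i], Y[:, j]).
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    n = X.shape[0]
    Xc = X - X.mean(axis=0)      # center each column
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / (n - 1)   # unbiased (n - 1) normalization

# Toy data: Y depends on the first two columns of X
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Y = X[:, :2] + 0.1 * rng.standard_normal((500, 2))
C = cross_covariance(X, Y)
print(C.shape)  # (3, 2)
```

Note that the cross-covariance block also appears inside the joint covariance matrix of the stacked data, which gives an easy consistency check against `np.cov`.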
Related lectures (31)
Principal Components: Properties & Applications
Explores principal components, covariance, correlation, choice, and applications in data analysis.
Gaussian Mixture Models: Likelihood and Covariance Matrix
Explores statistical independence, Gaussian Mixture Models, and fitting data with Gaussian functions.
Multivariate Statistics: Wishart and Hotelling T²
Explores the Wishart distribution, properties of Wishart matrices, and the Hotelling T² distribution, including the two-sample Hotelling T² statistic.
PCA: Derivation and Optimization
Covers the derivation of PCA projection, error minimization, and eigenvector optimization.
Multivariate Statistics: Normal Distribution
Covers the multivariate normal distribution, properties, and sampling methods.
Principal Component Analysis: Understanding Data Structure
Explores Principal Component Analysis, dimensionality reduction, data quality assessment, and error rate control.
Max-Stable Models: Smith and Schlather
Covers the Smith and Schlather max-stable models, exploring their validity and interpretation.
Fluctuation-dissipation relations for reversible diffusions
Covers linear response, steady states, Girsanov transforms, and covariance limits in reversible diffusions.
Extreme Value Models: Technical Derivation
Explores the technical derivation and properties of Multivariate Extreme Value models.
Estimating the Term Structure: Principal Component Analysis
Covers Principal Component Analysis for yield curve shape estimation and dimension reduction in interest rate models.
Principal Component Analysis: Covariance Matrix and Eigenvalues
Explores Principal Component Analysis focusing on maximizing variance and finding eigenvalues.
Linear Dimensionality Reduction: PCA and LDA
Explores PCA and LDA for linear dimensionality reduction in data, emphasizing clustering and class separation techniques.
Stable Laws: Lindeberg-Feller Theorem
Covers the Lindeberg-Feller theorem, discussing characteristic functions, moment problems, and the Central Limit Theorem.
Fitting Data with a Single Gaussian Function
Explains Gaussian functions, modeling data, likelihood function, and maximum likelihood optimization.
Covariance Cleaning and Estimators
Explores covariance matrix cleaning, optimal estimators, and rotationally invariant methods for portfolio optimization.
Oja's Rule
Covers Oja's rule in Neurorobotics, focusing on learning eigenvectors and eigenvalues for capturing maximal variance.
Gaussian Discriminant Analysis
Covers Gaussian discriminant analysis, log-likelihood, supervised learning, and logistic regression.
Multivariate Normal Distribution
Covers the multivariate normal distribution, moment-generating function, and combinatorics.
Estimating R: Moments and Covariance
Covers the estimation of R, focusing on moments and covariance.
Gaussian Random Vectors
Explores Gaussian random vectors and their statistical properties, emphasizing the importance of fully specifying those properties for complex-valued random vectors.
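Several of the lectures above (on PCA derivation, covariance eigenvalues, and dimensionality reduction) build on the same core computation: diagonalizing the sample covariance matrix and projecting onto the top eigenvectors. A minimal NumPy sketch of that idea, under the usual assumptions (centered data, unbiased covariance); the function name and toy data are illustrative:

```python
import numpy as np

def pca(X, k):
    """Project X (n x d) onto its top-k principal components.

    The components are eigenvectors of the sample covariance matrix,
    ordered by decreasing eigenvalue (i.e., explained variance).
    """
    Xc = X - X.mean(axis=0)                  # center the data
    cov = Xc.T @ Xc / (X.shape[0] - 1)       # sample covariance, d x d
    vals, vecs = np.linalg.eigh(cov)         # eigh: ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]       # top-k by variance
    return Xc @ vecs[:, order], vals[order]

# Toy data with correlated columns
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))
Z, variances = pca(X, 2)
print(Z.shape)  # (200, 2)
```

By construction the projected coordinates are uncorrelated, and the sample variance of each coordinate equals the corresponding eigenvalue, which is the "maximizing variance" property these lectures derive.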
Page 1 of 2