Markov Chains: Homogeneous Processes and Stationary Distributions
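The lecture's topic, stationary distributions of homogeneous (time-invariant) Markov chains, can be sketched numerically. A distribution π is stationary when πP = π. The 3-state transition matrix below is a made-up example, not taken from the lecture; the fixed point is found by power iteration:

```python
import numpy as np

# Transition matrix of a homogeneous Markov chain on 3 states (hypothetical example).
# Rows sum to 1; P[i, j] = Pr(X_{n+1} = j | X_n = i), independent of n.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

def stationary_distribution(P, tol=1e-12, max_iter=10_000):
    """Power iteration: repeatedly apply pi <- pi @ P until the fixed point pi P = pi."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])  # start from the uniform distribution
    for _ in range(max_iter):
        nxt = pi @ P
        if np.linalg.norm(nxt - pi, 1) < tol:
            return nxt
        pi = nxt
    return pi

pi = stationary_distribution(P)
print(pi)        # a probability vector: non-negative, sums to 1
print(pi @ P)    # unchanged by one more step, since pi P = pi
```

Because this chain is irreducible and aperiodic (all entries of P are positive), the stationary distribution is unique and the iteration converges from any starting distribution.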
Related lectures (29)
Limiting Distribution and Ergodic Theorem
Explores limiting distribution in Markov chains and the implications of ergodicity and aperiodicity on stationary distributions.
Probability & Stochastic Processes
Covers applied probability, stochastic processes, Markov chains, rejection sampling, and Bayesian inference methods.
Coupling of Markov Chains: Ergodic Theorem
Explores the coupling of Markov chains and the proof of the ergodic theorem, emphasizing distribution convergence and chain properties.
Markov Chains: Transition Probabilities
Explores Markov chains, transition matrices, distribution, and random walks.
Theory of MCMC
Covers the theory of Markov Chain Monte Carlo (MCMC) sampling and discusses convergence conditions, transition matrix choice, and target distribution evolution.
Markov Chains: Ergodicity and Stationary Distribution
Explores ergodicity and stationary distribution in Markov chains, emphasizing convergence properties and unique distributions.
Markov Chains: Ergodic Chains Examples
Covers stochastic models for communications, focusing on discrete-time Markov chains.
Ergodic Theory: Markov Chains
Explores ergodic theory in Markov chains, discussing irreducibility and unique stationary distributions.
Stochastic Processes: Markov Chains
Covers stochastic processes, focusing on Markov chains and their applications in real-world scenarios.
Markov Chains: PageRank Algorithm
Explores the PageRank algorithm within Markov chains, emphasizing ergodicity and convergence for web page ranking.
Continuous-Time Markov Chains: Reversible Chains
Covers reversible continuous-time Markov chains and their properties.
Markov Chains and Algorithm Applications
Covers Markov chains and their applications in algorithms, focusing on Markov Chain Monte Carlo sampling and the Metropolis-Hastings algorithm.
Invariant Distributions: Markov Chains
Explores invariant distributions for Markov Chains, emphasizing uniqueness and implications in communicating classes.
Elements of Statistics: Probability, Distributions, and Estimation
Covers probability theory, distributions, and estimation in statistics, emphasizing accuracy, precision, and resolution of measurements.
Lévy Flights and Central Limit Theorem
Covers Lévy flights, the Central Limit Theorem, and the mesoscopic master equation with transition rates in an insurance system.
Markov Chains: Applications and Sampling Methods
Covers the basics of Markov chains and their algorithmic applications.
Lower Bound on Total Variation Distance
Explores the lower bound on total variation distance in Markov chains and its implications on mixing time.
Applied Probability & Stochastic Processes
Covers applied probability, Markov chains, and stochastic processes, including transition matrices, eigenvalues, and communication classes.
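Several entries above (PageRank, ergodicity, convergence) fit one pattern: PageRank is the stationary distribution of a "random surfer" Markov chain made ergodic by damping. A minimal sketch on a hypothetical 4-page link graph, not from any of the listed lectures:

```python
import numpy as np

# Toy link graph (hypothetical): A[i, j] = 1 if page i links to page j.
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

def pagerank(A, d=0.85, tol=1e-10):
    """PageRank as the stationary distribution of the damped surfer chain:
    with probability d follow a uniformly random outgoing link,
    otherwise jump to a uniformly random page."""
    n = A.shape[0]
    # Row-normalize into a transition matrix (this toy graph has no dangling pages).
    P = A / A.sum(axis=1, keepdims=True)
    # Damping makes every entry positive, so the chain is irreducible and
    # aperiodic, guaranteeing a unique stationary distribution.
    G = d * P + (1 - d) / n
    r = np.full(n, 1.0 / n)
    while True:
        nxt = r @ G
        if np.linalg.norm(nxt - r, 1) < tol:
            return nxt
        r = nxt

ranks = pagerank(A)
print(ranks)  # sums to 1; page 2, with the most in-links, ranks highest
```

The damping factor d = 0.85 is the value commonly quoted for PageRank; smaller d speeds convergence at the cost of flattening the ranking toward uniform.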