Discrete-Time Markov Chains: Definitions
Related lectures (32)
Markov Chains: Ergodicity and Stationary Distribution
Explores ergodicity and stationary distribution in Markov chains, emphasizing convergence properties and unique distributions.
Markov Chains: Challenges and Surprises
Explores challenges and surprises in Markov chains, analyzing convergence, state grouping, and transition probabilities.
Markov Chains: Properties and Expectations
Explores Markov chains' properties, expectations, and recurrence in Poisson processes.
Probability Inequalities
Explores probability inequalities, convergence types, and moment generating functions for distribution approximation.
Discrete-Time Markov Chains: Absorbing Chains Examples
Explores examples of absorbing chains in discrete-time Markov chains, focusing on transition probabilities.
Probability and Statistics
Delves into probability, statistics, paradoxes, and random variables, showcasing their real-world applications and properties.
Markov Chains: Stationary Distributions
Explores Markov chains and stationary distributions, emphasizing the importance of identifying them for improving convergence.
Markov Chain Games
Explores Markov chain games, hitting probabilities, and expected hitting times in a target set.
Markov Chain Analysis
Delves into Markov chains by analyzing a scenario with two fleas moving in opposite directions, exploring transition matrices and probabilities over time.
Introduction to Convexity
Introduces the key concepts of convexity and its applications in different fields.
Hitting Times: Examples
Explores hitting times in Markov chains through various examples and scenarios.
Markov Chains: Recurrence and Transience
Explores first passage times, strong Markov property, and state recurrence/transience in Markov chains.
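Several of the lectures above revolve around transition matrices, stationary distributions, and convergence. As a minimal self-contained sketch (not taken from any specific lecture; the matrix values are illustrative assumptions), the stationary distribution of an ergodic two-state chain can be approximated by repeatedly applying the transition matrix:

```python
import numpy as np

# Illustrative two-state discrete-time Markov chain.
# P[i][j] = probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Rows of a stochastic matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# The distribution after n steps, starting from pi0, is pi0 @ P^n.
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ P

# For an ergodic chain this converges to the stationary distribution,
# i.e. the left eigenvector of P with eigenvalue 1 (pi = pi @ P).
print(pi)  # ≈ [5/6, 1/6] for this particular P
```

Solving `pi = pi @ P` directly for this matrix gives the exact stationary distribution (5/6, 1/6), which the iteration approaches geometrically fast since the chain's second eigenvalue is 0.4.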