Lecture
Discrete-Time Markov Chains: Definitions
Related lectures (32)
Discrete-Time Markov Chains: Definitions
Covers the definitions and state probabilities of discrete-time Markov chains.
Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of discrete-time Markov chains.
Birth & Death Chains: Analysis & Probabilities
Explores birth and death chains, hitting probabilities, and expected game durations in Markov chains.
Stochastic Models for Communications: Discrete-Time Markov Chains - First Passage Time
Explores discrete-time Markov chains, emphasizing first passage time probabilities and minimal solutions.
Stochastic Models for Communications: Discrete-Time Markov Chains - Absorption Time
Discusses discrete-time Markov chains and absorption time in communication systems.
Markov Chains: Absorbing Classes
Explores Markov chains with absorbing classes through exercises on transition matrices and expected values.
Stochastic Models: Absorbing Markov Chains Examples
Covers examples of absorbing Markov chains in discrete time.
Hidden Markov Models: Primer
Introduces Hidden Markov Models, explaining the basic problems and algorithms like Forward-Backward, Viterbi, and Baum-Welch, with a focus on Expectation-Maximization.
Stochastic Models for Communications
Covers stochastic models for communications, focusing on random variables, Markov chains, Poisson processes, and probability calculations.
Probability & Stochastic Processes
Covers applied probability, stochastic processes, Markov chains, rejection sampling, and Bayesian inference methods.
Sunny Rainy Source: Markov Model
Explores a first-order Markov model using a sunny-rainy source example, demonstrating how past events influence future outcomes.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of continuous-time Markov chains.
Markov Chains: Theory and Applications
Covers the theory and applications of Markov chains in modeling random phenomena and decision-making under uncertainty.
Lindblad equation
Covers the interpretation of the Lindblad equation and its unitary part in quantum gases.
Bonus-Malus System: Transition Probabilities
Explores the bonus-malus system for insurance premiums and Markov chain transition probabilities.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers definitions and state probabilities of continuous-time Markov chains for communications.
NISQ and IBM Q
Explores NISQ devices and IBM Q, covering noisy quantum circuits, qubit technologies, and quantum algorithm development.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of continuous-time Markov chains.
Expected Number of Visits in State
Covers the criterion for recurrence in infinite chains based on the expected number of visits in a state.
Conditional Expectation: Grouping Lemma
Explores conditional expectation, the grouping lemma, and the law of large numbers.
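The recurring topic in the lectures above, state probabilities of a discrete-time Markov chain, can be sketched numerically. The sketch below uses an illustrative two-state sunny/rainy chain (the transition probabilities are made up for the example, not taken from any lecture) and iterates the update pi_{n+1} = pi_n P:

```python
import numpy as np

# Illustrative two-state weather chain: state 0 = sunny, state 1 = rainy.
# P[i, j] = Pr(next state = j | current state = i); rows sum to 1.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

pi = np.array([1.0, 0.0])  # start in the sunny state with probability 1

# State probabilities evolve as pi_{n+1} = pi_n P (row vector times matrix).
for _ in range(50):
    pi = pi @ P

print(pi)  # approaches the stationary distribution (2/3, 1/3) for this chain
```

For this chain the stationary distribution solves pi = pi P, giving pi = (2/3, 1/3); because the chain is irreducible and aperiodic, the iterates converge to it from any initial distribution.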