Discrete-Time Markov Chains: Definitions
Related lectures (32)
Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of discrete-time Markov chains.
Birth & Death Chains: Analysis & Probabilities
Explores birth and death chains, hitting probabilities, and expected game durations in Markov chains.
Stochastic Models for Communications: Discrete-Time Markov Chains - First Passage Time
Explores discrete-time Markov chains, emphasizing first passage time probabilities and minimal solutions.
Stochastic Models for Communications: Discrete-Time Markov Chains - Absorption Time
Discusses discrete-time Markov chains and absorption time in communication systems.
Stochastic Models: Absorbing Markov Chains Examples
Covers examples of absorbing Markov chains in discrete time.
Hidden Markov Models: Primer
Introduces Hidden Markov Models, explaining the basic problems and algorithms like Forward-Backward, Viterbi, and Baum-Welch, with a focus on Expectation-Maximization.
Markov Chains: Absorbing Classes
Explores Markov chains with absorbing classes through exercises on transition matrices and expected values.
Stochastic Models for Communications
Covers stochastic models for communications, focusing on random variables, Markov chains, Poisson processes, and probability calculations.
Sunny Rainy Source: Markov Model
Explores a first-order Markov model using a sunny-rainy source example, demonstrating how past events influence future outcomes.
Probability & Stochastic Processes
Covers applied probability, stochastic processes, Markov chains, rejection sampling, and Bayesian inference methods.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of continuous-time Markov chains.
Markov Chains: Theory and Applications
Covers the theory and applications of Markov chains in modeling random phenomena and decision-making under uncertainty.
Lindblad equation
Covers the interpretation of the Lindblad equation and its unitary part in quantum gases.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers definitions and state probabilities of continuous-time Markov chains for communications.
NISQ and IBM Q
Explores NISQ devices and IBM Q, covering noisy quantum circuits, qubit technologies, and quantum algorithm development.
Discrete-Time Markov Chains: Absorbing Chains Examples
Explores examples of absorbing chains in discrete-time Markov chains, focusing on transition probabilities.
Markov Chains: Ergodicity and Stationary Distribution
Explores ergodicity and stationary distribution in Markov chains, emphasizing convergence properties and unique distributions.
Expected Number of Visits in State
Covers the criterion for recurrence in infinite chains based on the expected number of visits in a state.