Continuous-Time Markov Chains: Definitions and State Probabilities
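For a continuous-time Markov chain with generator matrix Q, the state-probability row vector satisfies the forward equation p'(t) = p(t) Q, so p(t) = p(0) exp(Qt). A minimal sketch of this computation, using a hypothetical 2-state chain (rates a, b are illustrative values, and the Taylor-series exponential is a simple stand-in for a production matrix-exponential routine):

```python
import numpy as np

def expm_taylor(A, terms=40):
    """Matrix exponential via truncated Taylor series (adequate for small ||A||)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# Hypothetical 2-state generator: leave state 0 at rate a, leave state 1 at rate b.
a, b = 2.0, 1.0
Q = np.array([[-a,  a],
              [ b, -b]])   # rows of a generator sum to zero

p0 = np.array([1.0, 0.0])          # start in state 0 with probability 1
t = 0.5
pt = p0 @ expm_taylor(Q * t)       # p(t) = p(0) exp(Qt)

# Closed form for the 2-state chain, as a cross-check:
pi1 = a / (a + b)                  # stationary probability of state 1
p1_exact = pi1 + (p0[1] - pi1) * np.exp(-(a + b) * t)
```

Because each row of Q sums to zero, each row of exp(Qt) sums to one, so `pt` remains a probability distribution for every t.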
Related lectures (31)
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of continuous-time Markov chains.
Discrete-Time Markov Chains: Definitions
Covers the definitions and state probabilities of discrete-time Markov chains.
Markov Chains: Absorbing Classes
Explores Markov chains with absorbing classes through exercises on transition matrices and expected values.
Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of discrete-time Markov chains.
Stochastic Models for Communications
Covers stochastic models for communications, focusing on random variables, Markov chains, Poisson processes, and probability calculations.
Probability & Stochastic Processes
Covers applied probability, stochastic processes, Markov chains, rejection sampling, and Bayesian inference methods.
Hidden Markov Models: Primer
Introduces Hidden Markov Models, explaining the basic problems and algorithms like Forward-Backward, Viterbi, and Baum-Welch, with a focus on Expectation-Maximization.
Stochastic Models: Absorbing Markov Chains Examples
Covers examples of absorbing Markov chains in discrete time.
Markov Chains: Theory and Applications
Covers the theory and applications of Markov chains in modeling random phenomena and decision-making under uncertainty.
Stochastic Models for Communications: Discrete-Time Markov Chains - First Passage Time
Explores discrete-time Markov chains, emphasizing first passage time probabilities and minimal solutions.
Birth & Death Chains: Analysis & Probabilities
Explores birth and death chains, hitting probabilities, and expected game durations in Markov chains.
Stochastic Models for Communications: Discrete-Time Markov Chains - Absorption Time
Discusses discrete-time Markov chains and absorption time in communication systems.
Sunny Rainy Source: Markov Model
Explores a first-order Markov model using a sunny-rainy source example, demonstrating how past events influence future outcomes.
Bonus Malus System: Transition Probabilities
Explores the Bonus Malus system for insurance premiums and Markov chain transition probabilities.
Markov Chains: Properties and Expectations
Explores Markov chains' properties, expectations, and recurrence in Poisson processes.
Lindblad equation
Covers the interpretation of the Lindblad equation and its unitary part in quantum gases.
Markov Chain Games
Explores Markov chain games, hitting probabilities, and expected hitting times in a target set.
Expected Number of Visits in State
Covers the criterion for recurrence in infinite chains based on the expected number of visits in a state.