Lecture: Shannon
Related lectures (27)
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69.
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and its efficiency in creating unique binary codes for letters.
Stationary Sources: Properties and Entropy
Explores stationary sources, entropy, regularity, and coding efficiency, including a challenging problem with billiard balls.
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Information Theory: Quantifying Messages and Source Entropy
Covers quantifying information in messages, source entropy, common information, and communication channel capacity.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
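Several of the lectures above center on Shannon entropy, including one that works through a concrete entropy calculation. As an illustrative aside, not drawn from any lecture's own materials, a minimal Python sketch of that calculation; the symbol probabilities here are made up for the example:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Zero-probability symbols contribute nothing and are skipped.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical 7-symbol source (probabilities sum to 1)
probs = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]
print(shannon_entropy(probs))  # 2.625 bits
```

A uniform source over n symbols gives the maximum entropy log2(n), e.g. 2.0 bits for four equiprobable symbols; skewed distributions like the one above fall below that bound.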
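One lecture above covers the Shannon-Fano algorithm for building prefix codes. A small sketch of the standard recursive construction, again illustrative rather than taken from the lecture (the symbols and probabilities are invented): sort symbols by descending probability, split the list where the two halves' total probabilities are closest, prepend 0 to the left half and 1 to the right, and recurse.

```python
def shannon_fano(symbols):
    """Build a Shannon-Fano prefix code.

    symbols: list of (symbol, probability) pairs.
    Returns a dict mapping each symbol to its binary code string.
    """
    codes = {}

    def build(group, prefix):
        if len(group) == 1:
            codes[group[0][0]] = prefix or "0"
            return
        total = sum(p for _, p in group)
        # Choose the split point that makes the two halves' masses closest.
        best, split, acc = float("inf"), 1, 0.0
        for i, (_, p) in enumerate(group[:-1], 1):
            acc += p
            diff = abs(2 * acc - total)
            if diff < best:
                best, split = diff, i
        build(group[:split], prefix + "0")
        build(group[split:], prefix + "1")

    build(sorted(symbols, key=lambda sp: -sp[1]), "")
    return codes

# Hypothetical 4-letter source
print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

Because each recursive split assigns disjoint prefixes to the two halves, no codeword is a prefix of another, which is what makes the resulting binary codes uniquely decodable.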