Information Theory and Coding
Related lectures (30)
Information Theory and Coding
Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Source Coding and Prefix-Free Codes
Covers source coding, injective codes, prefix-free codes, and Kraft's inequality.
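Kraft's inequality, named in the entry above, states that a prefix-free binary code with codeword lengths l_1, …, l_n exists if and only if the sum of 2^(-l_i) is at most 1. A minimal sketch of the check (illustrative, not taken from the lecture):

```python
def kraft_sum(lengths):
    """Return the Kraft sum for a set of binary codeword lengths."""
    return sum(2.0 ** -l for l in lengths)

# Lengths of the prefix-free code {0, 10, 110, 111} meet the bound with equality:
assert kraft_sum([1, 2, 3, 3]) == 1.0
# Lengths [1, 1, 2] violate the inequality, so no prefix-free binary code has them:
assert kraft_sum([1, 1, 2]) > 1.0
```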
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and future related courses.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
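The Huffman procedure mentioned above builds an optimal prefix-free code by repeatedly merging the two least probable symbols. A hedged sketch of one common way to compute the resulting codeword lengths (the function name and heap-based bookkeeping are illustrative choices, not the lecture's notation):

```python
import heapq

def huffman_code_lengths(freqs):
    """Return optimal prefix-free codeword lengths for the given frequencies."""
    # Heap entries: (weight, tiebreak id, symbol indices in this subtree).
    heap = [(w, i, [i]) for i, w in enumerate(freqs)]
    heapq.heapify(heap)
    lengths = [0] * len(freqs)
    tiebreak = len(freqs)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        # Each merge pushes every symbol in both subtrees one level deeper.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

# Dyadic source: codeword lengths match -log2(p), average length equals entropy.
assert huffman_code_lengths([0.5, 0.25, 0.125, 0.125]) == [1, 2, 3, 3]
```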
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Theory Basics
Introduces information theory basics, including entropy, independence, and binary entropy function.
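The binary entropy function referenced above measures the uncertainty of a biased coin flip in bits. A small sketch (the function name is an assumption for illustration):

```python
import math

def binary_entropy(p):
    """h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Uncertainty peaks at a fair coin: exactly one bit.
assert binary_entropy(0.5) == 1.0
```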
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Source Coding Theorem: Fundamentals and Models
Covers the Source Coding Theorem, source models, entropy, regular sources, and examples.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and conditioning's impact on entropy reduction.
Mutual Information and Entropy
Explores mutual information and entropy calculation between random variables.
Information Theory: Entropy and Capacity
Covers concepts of entropy, Gaussian distributions, and channel capacity with constraints.
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and its efficiency in creating unique binary codes for letters.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Data Compression and Shannon's Theorem: Shannon's Theorem Demonstration
Covers the demonstration of Shannon's theorem, focusing on data compression.
Universal Source Coding
Covers the Lempel-Ziv universal coding algorithm and invertible finite state machines in information theory.
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.