Source Coding: Compression
Related lectures (26)
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Compression: Kraft Inequality
Explains compression and the Kraft inequality for codes and sequences; a short numerical check of the inequality appears in the sketch after this list.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Stochastic Processes: Sequences and Compression
Explores compression in stochastic processes through injective codes and prefix-free codes.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, with applications to optimizing codeword lengths and to conditional entropy; the Huffman construction is sketched in code after this list.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy.
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69 (the sketch after this list shows the same calculation for a made-up distribution).
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and how it assigns uniquely decodable binary codewords to individual letters.
Data Compression and Shannon's Theorem Summary
Summarizes Shannon's theorem, emphasizing the importance of entropy in data compression.
Kraft-McMillan Theorem
Explores the Kraft-McMillan theorem, which shows that any codeword lengths achievable by a uniquely decodable code can also be achieved by a prefix-free code.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Shannon's Theorem
Introduces Shannon's Theorem on binary codes, entropy, and data compression limits.
Compression: Strong Connection and Prefix-Free Codes
Explores the relationship between code word length and probability distribution, focusing on designing prefix-free codes for efficient compression.
Compression: Prefix-free Codes
Explains how to design efficient prefix-free codes for compression.
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and pointers to related follow-up courses.
Conditional Entropy: Huffman Coding
Explores conditional entropy and Huffman coding for efficient data compression techniques.
Compression: Prediction
Covers the concepts of compression and prediction using prefix-free codes and distributions.
Data Compression and Shannon's Theorem: Lossy Compression
Explores data compression, including lossless methods and the necessity of lossy compression for real numbers and signals.
Compression
Covers the concept of compression and constructing prefix-free codes based on given information.
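
Two quantities recur throughout the lectures above: the Shannon entropy H(X) = -Σ p(x) log2 p(x), which lower-bounds the average codeword length of any uniquely decodable code, and the Kraft inequality Σ 2^(-l_i) ≤ 1, which the codeword lengths of any binary prefix-free code must satisfy. The following minimal Python sketch computes both; the distribution and the code lengths are made-up illustrations, not values taken from any lecture.

```python
# Minimal sketch of entropy and the Kraft inequality (illustrative values only).
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def satisfies_kraft(lengths):
    """Check the Kraft inequality sum 2^(-l) <= 1 for binary codeword lengths."""
    return sum(2.0 ** -l for l in lengths) <= 1.0

# Hypothetical source distribution over four letters.
p = [0.5, 0.25, 0.125, 0.125]
print(f"H(X) = {entropy(p):.2f} bits")   # 1.75 bits

# Codeword lengths of the prefix-free code {0, 10, 110, 111}.
print(satisfies_kraft([1, 2, 3, 3]))     # True: 1/2 + 1/4 + 1/8 + 1/8 = 1
```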
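Several entries also mention Huffman coding. Below is a compact sketch of the standard greedy construction (repeatedly merge the two least probable subtrees); the heapq-based implementation, symbol names, and distribution are illustrative choices, not code from the lectures.

```python
# Minimal Huffman-coding sketch: repeatedly merge the two least probable nodes.
import heapq
from itertools import count

def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} for a dict of probabilities."""
    tiebreak = count()
    # Each heap entry: (probability, insertion counter, {symbol: partial codeword}).
    # The counter breaks probability ties so the dicts are never compared.
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)  # least probable subtree
        p2, _, code2 = heapq.heappop(heap)  # second least probable subtree
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (up to relabeling)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(f"average length = {avg_len:.2f} bits")  # 1.75, matching H(X) for this dyadic source
```

For this dyadic distribution the average codeword length equals the entropy exactly; in general Huffman coding achieves an average length within one bit of the entropy.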