Information Theory: Source Coding & Channel Coding
Related lectures (27)
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.
Compression: Kraft Inequality
Explains compression and Kraft inequality in codes and sequences.
Information Theory and Coding
Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences.
Stochastic Processes: Sequences and Compression
Explores compression in stochastic processes through injective codes and prefix-free codes.
Source Coding and Prefix-Free Codes
Covers source coding, injective codes, prefix-free codes, and Kraft's inequality.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Information Theory Basics
Introduces information theory basics, including entropy, independence, and binary entropy function.
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
Compression: Prefix-free Codes
Explains how to design efficient prefix-free codes for compression.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy.
Probability and Statistics
Delves into probability, statistics, paradoxes, and random variables, showcasing their real-world applications and properties.
Entropy and Information Theory
Explores entropy, uncertainty, coding theory, and data compression applications.
Probability Distributions in Environmental Studies
Explores probability distributions for random variables in air pollution and climate change studies, covering descriptive and inferential statistics.
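Two themes recur throughout the lectures above: Huffman coding and the Kraft inequality. As a minimal illustrative sketch (not taken from any of the listed lectures), the following builds a binary Huffman code for a small distribution and checks the Kraft sum, which is at most 1 for any prefix-free code:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a symbol -> frequency map.
    Returns a dict mapping each symbol to its bit string."""
    if len(freqs) == 1:
        # Degenerate case: a single symbol still needs one bit.
        return {next(iter(freqs)): "0"}
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, prefixing codes with 0 / 1.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

def kraft_sum(code):
    """Kraft sum: sum of 2^(-length); <= 1 for any prefix-free code."""
    return sum(2 ** -len(w) for w in code.values())

# Dyadic example: expected codeword length equals the entropy (1.75 bits).
freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs)
```

For a dyadic distribution like this one, Huffman coding is exactly optimal: the average length meets the entropy bound, and the Kraft sum is exactly 1.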