Advanced Information Theory: Random Binning
Related lectures (27)
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Compression: Kraft Inequality
Explains data compression and the Kraft inequality for codes and sequences.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Variational Formulation: Information Measures
Explores variational formulation for measuring information content and divergence between probability distributions.
Coding Theorem: Proof and Properties
Covers the proof and properties of the coding theorem, focusing on the choice of codeword lengths l(x) and the achievable rate.
Uniform Integrability and Convergence
Explores uniform integrability, convergence theorems, and the importance of bounded sequences in understanding the convergence of random variables.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and future related courses.
Channel Coding: Convolutional Codes
Explores channel coding with a focus on convolutional codes, emphasizing error detection, correction, and decoding processes.
Source Coding: Compression
Covers entropy, source coding, encoding maps, decodability, prefix-free codes, and the Kraft-McMillan inequality.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Geodesic Convexity: Theory and Applications
Explores geodesic convexity in metric spaces and its applications, discussing properties and the stability of inequalities.
Information Theory: Source Coding
Covers source coding, typical sequences, stationarity, and efficient encoding in information theory.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Optimal Transport: Rockafellar Theorem
Explores the Rockafellar Theorem in optimal transport, focusing on c-cyclical monotonicity and convex functions.
Channel Coding and BICM (LLRs)
Explores channel coding, BICM, and LLRs in wireless communication systems, emphasizing the importance of error detection and correction.
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.
Correlations of the Liouville function
Explores correlations of the Liouville function along deterministic and independent sequences, covering key concepts and theorems.
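Several of the lectures above (prefix-free codes, the Kraft inequality, entropy, and Huffman coding) revolve around the same pair of computations: the Shannon entropy of a source and the Kraft sum of a candidate code. A minimal sketch of both, in plain Python (the example distribution and codewords are illustrative, not taken from any of the listed lectures):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft sum: sum over codewords of 2^(-length).
    A binary prefix-free code with these lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

# An example binary prefix-free code for four symbols (hypothetical data).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}
probs = [0.5, 0.25, 0.125, 0.125]

lengths = [len(w) for w in code.values()]
H = entropy(probs)                               # 1.75 bits
L = sum(p * l for p, l in zip(probs, lengths))   # expected codeword length

# This code is optimal for the dyadic distribution above: L equals H,
# and the Kraft inequality holds with equality.
assert kraft_sum(lengths) <= 1.0
```

For this dyadic distribution the expected length L matches the entropy H exactly; for general distributions Huffman coding gets within one bit of the entropy, which is the content of the source coding results listed above.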