Error exponent
Categories: Applied sciences > Information engineering > Information theory > Channel capacity
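The error exponent measures how fast the block error probability decays with block length: P_e ≈ e^{-nE}. A minimal Python sketch, using an n-fold repetition code over a binary symmetric channel with majority decoding (the channel model and the crossover probability p = 0.1 are illustrative assumptions, not taken from this page); for this scheme the exponent is the Kullback–Leibler divergence D(1/2‖p) = −ln(2√(p(1−p))):

```python
import math

def majority_error(n, p):
    """Exact probability that majority decoding fails for an n-fold
    repetition code over a BSC with crossover probability p (n odd):
    the channel must flip more than half of the n transmitted bits."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

def exponent_estimate(n, p):
    # empirical error exponent at block length n: -(1/n) ln P_e
    return -math.log(majority_error(n, p)) / n

p = 0.1
# theoretical exponent of this scheme: D(1/2 || p) = -ln(2*sqrt(p*(1-p)))
E = -math.log(2 * math.sqrt(p * (1 - p)))
for n in (11, 51, 201):
    print(n, round(exponent_estimate(n, p), 4), round(E, 4))
```

As n grows, the empirical exponent approaches E from above; the gap is the usual O((log n)/n) sub-exponential correction.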
Graph Chatbot
Related lectures (12)
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and an outlook on related follow-up courses.
Multi-user Gaussian Channels with Noisy Feedback
Delves into challenges and opportunities of multi-user Gaussian channels with noisy feedback, presenting a new mathematical framework.
Information Theory and Coding: Source Coding
Covers source coding, encoder design, and error probability analysis in information theory and coding.
Information Theoretic Security: Wiretap Channel
Explores secret key generation in the Wiretap Channel model and achievable key rate-leakage pairs.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Communication Channels: Encoding and Decoding
Explores encoding and decoding techniques in communication systems, focusing on fundamental limits and mutual information computations.
Error Correction Codes: Basics
Introduces erasure and error channels, Hamming distance, and error correction codes.
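A code with minimum Hamming distance d can correct up to ⌊(d−1)/2⌋ errors under nearest-codeword decoding. A minimal sketch using the (3,1) repetition code (the codebook and test words are illustrative):

```python
def hamming_distance(a, b):
    """Number of positions where two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

# (3,1) repetition code: minimum distance 3, so it corrects
# floor((3-1)/2) = 1 bit error per block
codebook = ["000", "111"]

def decode(word):
    # minimum-distance (nearest codeword) decoding
    return min(codebook, key=lambda c: hamming_distance(c, word))
```

For example, a single flip such as "010" decodes back to "000", while two flips push the received word closer to the wrong codeword.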
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Huffman Coding: Optimal Prefix-Free Codes
Explores Huffman coding, demonstrating its optimality in average codeword length and prefix-free property.
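The standard Huffman construction repeatedly merges the two least-probable subtrees, prepending a bit to each side. A compact sketch (the symbol frequencies below are illustrative; a single-symbol source, which would get the empty codeword, is not handled specially):

```python
import heapq

def huffman(freqs):
    """Build a binary Huffman code for a symbol -> frequency map.
    Returns a dict mapping each symbol to its bitstring."""
    # (frequency, unique tiebreaker, partial code table) triples;
    # the tiebreaker keeps heap comparisons away from the dicts
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least-probable subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

code = huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

With these dyadic frequencies the average codeword length, 0.5·1 + 0.25·2 + 2·0.125·3 = 1.75 bits, equals the source entropy, and the code is prefix-free by construction.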
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Stationary Sources: Properties and Entropy
Explores stationary sources, entropy, regularity, and coding efficiency, including a challenging problem with billiard balls.
Information Theory: Sampling, Quantization, and Communication Systems
Explores nonuniform sampling, quantization, noise challenges, and communication theories.