Information Measures - Covers information measures like entropy, Kullback-Leibler divergence, and the data processing inequality, along with probability kernels and mutual information.
Information Measures: Part 2 - Covers information measures like entropy, joint entropy, and mutual information in information theory and data processing.
Generalization Error - Discusses mutual information, the data processing inequality, and properties related to leakage in discrete systems.
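As a small illustration of the entropy and Kullback-Leibler divergence covered in this lecture, here is a minimal Python sketch (the function names `entropy` and `kl_divergence` are mine, and base-2 logarithms are assumed so results are in bits):

```python
import math

def entropy(p):
    # Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    # Terms with p_i = 0 contribute nothing (0 * log 0 := 0).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) = sum_i p_i * log2(p_i / q_i), in bits.
    # Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                        # 1.0
# D(p || q) is nonnegative and zero only when p = q.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))      # 0.0
```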
Information Measures: Part 1 - Covers information measures, tail bounds, sub-Gaussian and sub-Poisson random variables, an independence proof, and conditional expectation.
Information Measures - Covers variational representations and information measures such as entropy and mutual information.
Lecture: Shannon - Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Mutual Information: Continued - Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.
Achievable Rate & Capacity - Explores achievable rate, channel capacity, spectral efficiency, and fading channels in wireless communication systems.
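To illustrate how mutual information quantifies statistical dependence, a minimal sketch that computes I(X;Y) from a joint pmf via I(X;Y) = sum p(x,y) log2 [p(x,y) / (p(x)p(y))] (the function name `mutual_information` and the 2-D-list input format are my choices):

```python
import math

def mutual_information(joint):
    # I(X;Y) in bits, given the joint pmf p(x,y) as a 2-D list of floats.
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Perfectly correlated bits: I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))    # 1.0
# Independent uniform bits: I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Zero mutual information holds exactly when the joint factors into the product of the marginals, which is the independence criterion the lecture relies on.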
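Two standard capacity formulas behind this lecture can be sketched in a few lines: the binary symmetric channel capacity C = 1 - H_b(p) and the real AWGN channel capacity C = (1/2) log2(1 + SNR). This is a minimal sketch with my own function names, not code from the lecture:

```python
import math

def binary_entropy(p):
    # Binary entropy H_b(p) in bits; H_b(0) = H_b(1) = 0 by convention.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p:
    # C = 1 - H_b(p) bits per channel use.
    return 1.0 - binary_entropy(p)

def awgn_capacity(snr):
    # Capacity of a real-valued AWGN channel at a given linear SNR:
    # C = 0.5 * log2(1 + SNR) bits per channel use.
    return 0.5 * math.log2(1.0 + snr)

print(bsc_capacity(0.0))   # 1.0 (noiseless binary channel)
print(bsc_capacity(0.5))   # 0.0 (output independent of input)
print(awgn_capacity(1.0))  # 0.5 (SNR of 0 dB)
```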