Covers information measures such as entropy, Kullback-Leibler divergence, and the data-processing inequality, along with probability kernels and mutual information.
Explores maximal correlation, properties of mutual information, Rényi's information measures, and the mathematical foundations of information theory.
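As a concrete illustration of the core measures listed above, here is a minimal Python sketch for discrete distributions; the function names and the small examples are illustrative, not drawn from the source:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits for a discrete distribution p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information I(X; Y) from a joint pmf given as a 2-D list,
    computed as D(p_XY || p_X p_Y)."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    return sum(
        pxy * log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))                                # 1.0
# For an independent joint pmf, mutual information is zero.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
```

Mutual information expressed as a KL divergence between the joint and the product of marginals is what ties these quantities together, and it is the form in which properties like the data-processing inequality are usually proved.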