Data Compression and Shannon's Theorem: Entropy Calculation Example
Description
This lecture works through an entropy calculation on an example text: it determines each letter's frequency of appearance and the corresponding probabilities, arriving at an entropy of 2.69 bits per symbol against an average code length of 2.75 bits per symbol.
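As a minimal illustration of this kind of calculation (not the lecture's own example, whose source text is not reproduced here), the following Python sketch computes the Shannon entropy H = -sum(p(x) * log2 p(x)) of a string's letter distribution; the string assigned to sample is a hypothetical placeholder.

from collections import Counter
from math import log2

def entropy(text):
    """Shannon entropy, in bits per symbol, of the letter distribution of text."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical example string; substitute the text whose letters you want to analyze.
sample = "abracadabra"
print(f"H = {entropy(sample):.2f} bits/symbol")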
Jean-Cédric Chappelier is currently a senior researcher and lecturer in Computational Linguistics at the Ecole Polytechnique Fédérale de Lausanne (Switzerland). He holds an M.Sc. and a Ph.D. in Computer Science from the Ecole Nationale Supérieure des Télécommunications de Paris (France).
The objective of this course is to introduce students to algorithmic thinking, to familiarize them with the fundamentals of Computer Science, and to develop a first competence in programming (
Discusses entropy, data compression, and Huffman coding, emphasizing how Huffman codes minimize the expected codeword length, and introduces conditional entropy.
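To make the Huffman coding part concrete, here is a short, self-contained Python sketch that builds a Huffman code from letter frequencies and reports the resulting average code length; the input string is again a hypothetical placeholder, not the lecture's example.

import heapq
from collections import Counter

def huffman_codes(text):
    """Return a dict mapping each letter of text to its Huffman codeword."""
    # Heap entries: (frequency, unique tiebreaker, {symbol: codeword-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent subtrees, prefixing their codewords.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"  # hypothetical placeholder input
codes = huffman_codes(text)
counts = Counter(text)
avg = sum(counts[s] * len(w) for s, w in codes.items()) / len(text)
print(codes)
print(f"average code length = {avg:.2f} bits/symbol")

By Shannon's source coding theorem, the average code length printed here is always at least the entropy of the letter distribution, which is exactly the relationship the lecture's 2.69 versus 2.75 figures illustrate.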