
Understanding the Source Coding Theorem: A Talk on Shannon’s Entropy


If you have a question about this talk, please contact Matthew Ireland.

Online only

How many bits do we need to encode a sequence of English characters? Can we do better by considering the relative frequency of each character? Is there a theoretical limit to how compactly we can encode data with negligible risk of information loss?

In this talk, we first define Shannon's entropy, which quantifies the unpredictability of a sequence of random variables. We will also explore the Source Coding Theorem, which provides an operational definition of entropy by establishing a fundamental limit on the compressibility of information. Finally, we will prove this theorem using some surprisingly simple results from probability theory.
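To make the idea concrete, here is a minimal sketch (not from the talk itself) of how Shannon's entropy can be computed for a string of characters. A fixed-length code for k distinct symbols needs ceil(log2 k) bits per character, while the Source Coding Theorem says the average code length cannot fall below the entropy H(X); the sample text and variable names below are illustrative assumptions.

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits per character,
    using the empirical character frequencies of `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Illustrative sample text (an assumption, not data from the talk).
sample = "the quick brown fox jumps over the lazy dog"
k = len(set(sample))                      # number of distinct symbols
fixed_bits = log2(k)                      # bits/char before rounding up
h = shannon_entropy(sample)               # theoretical lower bound, bits/char
print(f"{k} distinct symbols, entropy {h:.3f} bits/char (fixed code: {fixed_bits:.3f})")
```

Because the character frequencies are non-uniform, the entropy comes out strictly below log2 of the alphabet size, which is exactly the gap a good compressor exploits.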

This talk is part of the Churchill CompSci Talks series.

