
Continuity of mutual information and the entropy-power inequality


If you have a question about this talk, please contact Berestycki.

Consider an additive-noise channel of information transmission with continuous noise. The talk will focus on sufficient conditions for continuity of the input-output mutual information at large and small signal-to-noise ratios. This result leads to Shannon's entropy-power inequality, a far-reaching generalisation of Minkowski's inequality from classical analysis.
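As a hedged illustration (not part of the talk), the entropy-power inequality states that for independent random variables X and Y, N(X + Y) >= N(X) + N(Y), where N(X) = exp(2 h(X)) / (2 pi e) and h is differential entropy in nats. The sketch below checks it in two cases with closed-form entropies: two independent Uniform[0,1] variables (whose sum is triangular on [0,2] with h = 1/2), and independent Gaussians, for which the inequality holds with equality.

```python
import math

def entropy_power(h):
    """Entropy power N(X) = exp(2 h(X)) / (2 pi e), with h in nats."""
    return math.exp(2.0 * h) / (2.0 * math.pi * math.e)

# Closed-form differential entropies (nats):
# Uniform on [0, 1]:                          h = 0
# Triangular on [0, 2] (sum of two of them):  h = 1/2
h_uniform = 0.0
h_sum = 0.5

# EPI: N(X + Y) >= N(X) + N(Y); here it reduces to e >= 2.
assert entropy_power(h_sum) >= 2 * entropy_power(h_uniform)

# Gaussian case: h(N(0, s^2)) = (1/2) ln(2 pi e s^2), so N(X) = s^2,
# and independent variances add -- the EPI holds with equality.
s1_sq, s2_sq = 1.0, 4.0
h1 = 0.5 * math.log(2 * math.pi * math.e * s1_sq)
h2 = 0.5 * math.log(2 * math.pi * math.e * s2_sq)
h12 = 0.5 * math.log(2 * math.pi * math.e * (s1_sq + s2_sq))
assert math.isclose(entropy_power(h12),
                    entropy_power(h1) + entropy_power(h2))
```

The Gaussian equality case is what makes the analogy with Minkowski's inequality tight: among distributions of fixed entropy power, Gaussians extremise the entropy of the sum.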

This talk is part of the Probability series.



© 2006-2025 Talks.cam, University of Cambridge.