

## Analytic information theory and beyond

- Szpankowski, W (Purdue)
- Thursday 03 June 2010, 15:00-16:00
- Seminar Room 1, Newton Institute.
If you have a question about this talk, please contact Mustapha Amrani.

Stochastic Processes in Communication Sciences

Analytic information theory aims at studying problems of information theory using the analytic techniques of computer science and combinatorics. Following Hadamard's precept, we tackle these problems by complex-analysis methods such as generating functions, the Mellin transform, Fourier series, the saddle point method, analytic poissonization and depoissonization, and singularity analysis. This approach lies at the crossroads of computer science and information theory.

In this talk, we concentrate on one facet of information theory (source coding, better known as data compression), namely the redundancy rate problem. The redundancy rate problem for a class of sources is the determination of how far the actual code length exceeds the optimal (ideal) code length. In the minimax scenario one finds the maximal redundancy over all sources within a certain class, while in the average scenario one computes the average redundancy over all possible sources. The redundancy rate problem is typical of a situation where second-order asymptotics play a crucial role, since the leading term of the optimal code length is known to be the entropy of the source. This makes it an ideal candidate for "analytic information theory". We survey our recent results on the redundancy rate problem obtained by analytic methods. In particular, we present our findings on the redundancy of Shannon codes and Huffman codes, and on the minimax redundancy for memoryless sources, Markov sources, and renewal processes.

In the second part of the talk, we discuss the limitations of classical Shannon information theory and argue that there is a need for a post-Shannon information theory.

This talk is part of the Isaac Newton Institute Seminar Series.
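As a minimal illustration of the redundancy notion discussed in the abstract (not the speaker's own methods): for a known memoryless source, a Shannon code assigns symbol *i* a codeword of length ⌈−log₂ pᵢ⌉, so its expected length exceeds the entropy by less than one bit. The sketch below computes this per-symbol redundancy for an example distribution; the distribution and function names are illustrative assumptions.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def shannon_code_lengths(p):
    """Shannon code: symbol with probability q gets length ceil(-log2 q)."""
    return [math.ceil(-math.log2(q)) for q in p]

def redundancy(p):
    """Expected code length minus entropy (the per-symbol redundancy)."""
    lengths = shannon_code_lengths(p)
    avg_len = sum(q * l for q, l in zip(p, lengths))
    return avg_len - entropy(p)

# Illustrative memoryless binary source with P(0) = 0.9, P(1) = 0.1.
p = [0.9, 0.1]
print(f"entropy    = {entropy(p):.4f} bits")
print(f"redundancy = {redundancy(p):.4f} bits")  # always in [0, 1) for a Shannon code
```

The talk's subject, the redundancy *rate* problem, concerns the far subtler second-order behaviour of this excess (in the minimax or average sense over a whole class of unknown sources), not the simple single-source bound computed here.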
## This talk is included in these lists:

- All CMS events
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 1, Newton Institute