Rényi Bounds on Information Combining

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Documents

  • Fulltext: Accepted author manuscript, 348 KB, PDF document

Author

  • Christoph Hirche

Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. Arguably the most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
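As a quick numerical illustration of the quantities involved, here is a minimal sketch that computes Arimoto's conditional Rényi entropy (one of the four definitions from the literature that the paper considers) before and after the binary-addition (check-node) combination, for uniform bits observed through binary symmetric channels. The BSC setup and the helper names (`bsc_joint`, `xor_combined_joint`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (in bits) of a probability vector p; Shannon limit at alpha = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

def arimoto_conditional_renyi(p_xy, alpha):
    """Arimoto's conditional Rényi entropy of a joint pmf p_xy (rows: x, cols: y):
        H_a(X|Y) = a/(1-a) * log2( sum_y ( sum_x p(x,y)^a )^(1/a) )."""
    p_xy = np.asarray(p_xy, dtype=float)
    if np.isclose(alpha, 1.0):
        # Shannon conditional entropy H(X|Y) as the alpha -> 1 limit
        p_y = p_xy.sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            cond = np.where(p_xy > 0, p_xy / p_y, 1.0)
        return float(-np.sum(p_xy * np.log2(cond)))
    inner = np.sum(p_xy ** alpha, axis=0) ** (1.0 / alpha)
    return float(alpha / (1.0 - alpha) * np.log2(np.sum(inner)))

def bsc_joint(p):
    """Joint pmf of (X, Y) for a uniform bit X observed through a BSC(p)."""
    return np.array([[(1 - p) / 2, p / 2],
                     [p / 2, (1 - p) / 2]])

def xor_combined_joint(p1, p2):
    """Joint pmf of (X1 XOR X2, (Y1, Y2)) for two independent uniform bits
    observed through BSC(p1) and BSC(p2) -- the check-node combination."""
    joint = np.zeros((2, 4))
    for x1 in (0, 1):
        for x2 in (0, 1):
            for y1 in (0, 1):
                for y2 in (0, 1):
                    pr = 0.25  # uniform, independent inputs
                    pr *= (1 - p1) if y1 == x1 else p1
                    pr *= (1 - p2) if y2 == x2 else p2
                    joint[x1 ^ x2, 2 * y1 + y2] += pr
    return joint

alpha, p1, p2 = 2.0, 0.1, 0.2
h1 = arimoto_conditional_renyi(bsc_joint(p1), alpha)
h2 = arimoto_conditional_renyi(bsc_joint(p2), alpha)
h_xor = arimoto_conditional_renyi(xor_combined_joint(p1, p2), alpha)
# For BSCs the XOR pair behaves like a BSC with the binary-convolved
# crossover probability p1*(1-p2) + p2*(1-p1); cross-check against it.
p_star = p1 * (1 - p2) + p2 * (1 - p1)
print(f"H_a(X1|Y1) = {h1:.4f}, H_a(X2|Y2) = {h2:.4f}")
print(f"H_a(X1+X2|Y1,Y2) = {h_xor:.4f} (BSC check: {renyi_entropy([p_star, 1 - p_star], alpha):.4f})")
```

The final line cross-checks the brute-force computation against the known BSC behavior: combining two BSC observations at a check node is equivalent to a single BSC with the convolved crossover probability, so the combined conditional Rényi entropy always exceeds each individual one.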

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: IEEE
Publication date: 2020
Pages: 2297-2302
Article number: 9174256
ISBN (Electronic): 9781728164328
DOIs
Publication status: Published - 2020
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 Jul 2020 - 26 Jul 2020

Conference

Conference: 2020 IEEE International Symposium on Information Theory, ISIT 2020
Country: United States
City: Los Angeles
Period: 21/07/2020 - 26/07/2020
Sponsor: IEEE Information Theory Society, The Institute of Electrical and Electronics Engineers

ID: 256725137