Rényi Bounds on Information Combining

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Rényi Bounds on Information Combining. / Hirche, Christoph.

2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. IEEE, 2020. p. 2297-2302, Article 9174256.

Harvard

Hirche, C 2020, Rényi Bounds on Information Combining. in 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings., 9174256, IEEE, pp. 2297-2302, 2020 IEEE International Symposium on Information Theory, ISIT 2020, Los Angeles, United States, 21/07/2020. https://doi.org/10.1109/ISIT44484.2020.9174256

APA

Hirche, C. (2020). Rényi Bounds on Information Combining. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings (pp. 2297-2302). [9174256] IEEE. https://doi.org/10.1109/ISIT44484.2020.9174256

Vancouver

Hirche C. Rényi Bounds on Information Combining. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. IEEE. 2020. p. 2297-2302. Article 9174256. https://doi.org/10.1109/ISIT44484.2020.9174256

Author

Hirche, Christoph. / Rényi Bounds on Information Combining. 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. IEEE, 2020. pp. 2297-2302

BibTeX

@inproceedings{c873109723b84bbba93d87bf3d02bf94,
title = "R{\'e}nyi Bounds on Information Combining",
abstract = "Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding.In this work we will generalize the concept to R{\'e}nyi entropies. We give optimal bounds on the conditional R{\'e}nyi entropy after combination, based on a certain convexity or concavity property and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional R{\'e}nyi entropy, we consider four different versions from the literature.Finally, we discuss the application of these bounds to the polarization of R{\'e}nyi entropies under polar codes.",
author = "Christoph Hirche",
year = "2020",
doi = "10.1109/ISIT44484.2020.9174256",
language = "English",
pages = "2297--2302",
booktitle = "2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings",
publisher = "IEEE",
note = "2020 IEEE International Symposium on Information Theory, ISIT 2020 ; Conference date: 21-07-2020 Through 26-07-2020",

}
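
For context, the quantities named in the abstract can be stated concretely. The sketch below is background in standard notation, not material from the paper itself: the Rényi entropy of order α, and the classical Shannon-case information-combining bounds that the paper generalizes.

% Rényi entropy of order \alpha; Shannon entropy is the limit \alpha \to 1.
H_\alpha(X) = \frac{1}{1-\alpha} \log_2 \sum_x P_X(x)^\alpha

% Elementary combining: uniform binary X_1, X_2 with side information
% Y_1, Y_2 are combined into X_1 \oplus X_2 (an XOR, i.e. the classical
% action of a CNOT). With h the binary entropy function and the binary
% convolution a \ast b = a(1-b) + (1-a)b, the classical (\alpha = 1)
% bounds read
h\!\left( h^{-1}(H(X_1|Y_1)) \ast h^{-1}(H(X_2|Y_2)) \right)
  \;\le\; H(X_1 \oplus X_2 \mid Y_1 Y_2)
  \;\le\; H(X_1|Y_1) + H(X_2|Y_2) - H(X_1|Y_1)\, H(X_2|Y_2)

% The lower bound is Mrs. Gerber's Lemma (binary symmetric channels are
% extremal); the upper bound is attained by binary erasure channels.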

RIS

TY - GEN

T1 - Rényi Bounds on Information Combining

AU - Hirche, Christoph

PY - 2020

Y1 - 2020

N2 - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

AB - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

U2 - 10.1109/ISIT44484.2020.9174256

DO - 10.1109/ISIT44484.2020.9174256

M3 - Article in proceedings

AN - SCOPUS:85090404332

SP - 2297

EP - 2302

BT - 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings

PB - IEEE

T2 - 2020 IEEE International Symposium on Information Theory, ISIT 2020

Y2 - 21 July 2020 through 26 July 2020

ER -
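
As a quick numerical illustration of the elementary combining operation described in the abstract (the XOR/CNOT of two binary variables), here is a minimal, self-contained Python sketch. It is not code from the paper; the function names and parameter values are illustrative, and it treats only the unconditional case, whereas the paper's bounds concern conditional Rényi entropies.

import math

def renyi_entropy(probs, alpha):
    # H_alpha(X) = (1 / (1 - alpha)) * log2( sum_x p(x)^alpha );
    # Shannon entropy is recovered as the limit alpha -> 1.
    if alpha == 1.0:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def xor_distribution(p, q):
    # X1 ~ Bern(p), X2 ~ Bern(q) independent; then
    # P(X1 xor X2 = 1) = p(1-q) + (1-p)q, the binary convolution of p and q.
    r = p * (1 - q) + (1 - p) * q
    return [1 - r, r]

p, q, alpha = 0.11, 0.3, 2.0  # illustrative values
h1 = renyi_entropy([1 - p, p], alpha)
h2 = renyi_entropy([1 - q, q], alpha)
h12 = renyi_entropy(xor_distribution(p, q), alpha)
# XOR pushes the bias toward 1/2, so the combined entropy dominates:
# H_alpha(X1 xor X2) >= max(H_alpha(X1), H_alpha(X2)).
print(f"H_a(X1)={h1:.4f}  H_a(X2)={h2:.4f}  H_a(X1^X2)={h12:.4f}")

Running the sketch shows the combined variable carrying more Rényi entropy than either input, the unconditional analogue of the combining behaviour the paper bounds.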

ID: 256725137