An Entropy Power Inequality for Dependent Variables

Journal Title

IEEE Transactions on Information Theory

Abstract

The entropy power inequality for independent random variables is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Very few extensions of the entropy power inequality have been developed for settings with dependence. We address this gap in the literature by developing entropy power inequalities for dependent random variables. In particular, we highlight the role of log-supermodularity in delivering sufficient conditions for an entropy power inequality stated using conditional entropies.
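For context, the classical entropy power inequality referenced in the abstract can be stated as follows (a sketch of the standard independent-variable form; the conditional-entropy version developed in the article is not reproduced here):

```latex
% Classical entropy power inequality (Shannon) for independent
% R^d-valued random vectors X and Y with differential entropies
% h(X), h(Y). The entropy power of X is
\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/d},
\]
% and the inequality asserts superadditivity of entropy power
% under independent summation:
\[
  N(X + Y) \;\ge\; N(X) + N(Y)
  \qquad \text{for independent } X,\, Y.
\]
```

The article's contribution is to identify sufficient conditions, notably log-supermodularity of the joint density, under which an inequality of this type survives when the independence assumption is dropped.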

Description

This article was originally published in IEEE Transactions on Information Theory. The version of record is available at: https://doi.org/10.1109/TIT.2024.3385728. © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article will be embargoed until 04/05/2026.

Citation

M. Madiman, J. Melbourne and C. Roberto, "An Entropy Power Inequality for Dependent Variables," in IEEE Transactions on Information Theory, doi: 10.1109/TIT.2024.3385728.
