An Entropy Power Inequality for Dependent Variables

Author(s): Madiman, Mokshay
Author(s): Melbourne, James
Author(s): Roberto, Cyril
Date Accessioned: 2024-05-15T18:35:49Z
Date Available: 2024-05-15T18:35:49Z
Publication Date: 2024-04-05
Description: This article was originally published in IEEE Transactions on Information Theory. The version of record is available at: https://doi.org/10.1109/TIT.2024.3385728. © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article will be embargoed until 04/05/2026.
Abstract: The entropy power inequality for independent random variables is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Very few extensions of the entropy power inequality have been developed for settings with dependence. We address this gap in the literature by developing entropy power inequalities for dependent random variables. In particular, we highlight the role of log-supermodularity in delivering sufficient conditions for an entropy power inequality stated using conditional entropies.
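For orientation, the entropy power inequality referenced in the abstract is, in its classical form for independent variables, the Shannon–Stam inequality sketched below; this is standard background only, and the paper's conditional-entropy version for dependent variables is not reproduced here.

% Classical EPI (Shannon–Stam), stated as background; not the paper's dependent-variable result.
\[
  N(X + Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) := \frac{1}{2\pi e}\, e^{\tfrac{2}{n} h(X)},
\]
where $X$ and $Y$ are independent $\mathbb{R}^n$-valued random vectors with densities, $h$ denotes differential entropy, and equality holds if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.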
Sponsor: Research supported by the Labex MME-DII funded by ANR, reference ANR-11-LBX-0023-01, the Fondation Simone et Cino del Duca and the FP2M federation (CNRS FR 2036).
Citation: M. Madiman, J. Melbourne and C. Roberto, "An Entropy Power Inequality for Dependent Variables," in IEEE Transactions on Information Theory, doi: 10.1109/TIT.2024.3385728.
ISSN: 1557-9654
URL: https://udspace.udel.edu/handle/19716/34408
Language: en_US
Publisher: IEEE Transactions on Information Theory
Keywords: Entropy Power Inequality
Keywords: Dependent Variables
Keywords: log-supermodular
Keywords: submodular functions
Keywords: Fisher information
Title: An Entropy Power Inequality for Dependent Variables
Type: Article
Files
Original bundle
Name: An Entropy Power Inequality for Dependent Variables.pdf
Size: 1.04 MB
Format: Adobe Portable Document Format
Description: Main article