Authors: Madiman, Mokshay; Melbourne, James; Roberto, Cyril

Date Accessioned: 2024-05-15
Date Available: 2024-05-15
Date Issued: 2024-04-05

Citation: M. Madiman, J. Melbourne and C. Roberto, "An Entropy Power Inequality for Dependent Variables," in IEEE Transactions on Information Theory, doi: 10.1109/TIT.2024.3385728.

ISSN: 1557-9654

URI: https://udspace.udel.edu/handle/19716/34408

Description: This article was originally published in IEEE Transactions on Information Theory. The version of record is available at: https://doi.org/10.1109/TIT.2024.3385728. © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article will be embargoed until 04/05/2026.

Abstract: The entropy power inequality for independent random variables is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Very few extensions of the entropy power inequality have been developed for settings with dependence. We address this gap in the literature by developing entropy power inequalities for dependent random variables. In particular, we highlight the role of log-supermodularity in delivering sufficient conditions for an entropy power inequality stated using conditional entropies.

Language: en-US

Keywords: Entropy Power Inequality; Dependent Variables; log-supermodular; submodular functions; Fisher information

Title: An Entropy Power Inequality for Dependent Variables

Type: Article
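Note: the abstract refers to the classical entropy power inequality for independent random variables. For reference, a sketch of that standard statement is given below in conventional notation (h for differential entropy, X and Y for random vectors in R^n; this notation is not part of the record). The conditional-entropy inequality developed in the article itself is stated only in the full text and is not reproduced here.

% Classical (Shannon-Stam) entropy power inequality, standard statement only;
% not the article's dependent-variable version.
% For independent random vectors X, Y in R^n with densities and finite
% differential entropies h(X), h(Y):
\[
  e^{\tfrac{2}{n} h(X+Y)} \;\ge\; e^{\tfrac{2}{n} h(X)} + e^{\tfrac{2}{n} h(Y)},
\]
% equivalently N(X+Y) >= N(X) + N(Y), where N(X) := (2\pi e)^{-1} e^{2h(X)/n}
% is the entropy power; equality holds iff X and Y are Gaussian with
% proportional covariance matrices.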