Abstract

We present a strengthened entropy power inequality (EPI) for the case where one of the random summands is Gaussian. The sharpening is closely related to strong data processing for Gaussian channels and generalizes Costa's EPI. This leads to new reverse convolution inequalities and, as a corollary, establishes stability of the Gaussian logarithmic Sobolev inequality in terms of entropy and/or Fisher information jumps under rescaled convolution. Applications to network information theory are also discussed, including a short proof of the converse for the two-encoder quadratic Gaussian source coding problem.
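
For orientation, recall the standard statements that the strengthened inequality refines (these are the classical results, not the paper's new bound). The EPI of Shannon and Stam says that for independent random vectors $X, Y$ on $\mathbb{R}^n$ with densities,
\[
N(X+Y) \;\geq\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
where $h(\cdot)$ denotes differential entropy. Costa's EPI states that for $Z \sim \mathcal{N}(0, \mathrm{I}_n)$ independent of $X$, the map $t \mapsto N(X + \sqrt{t}\,Z)$ is concave on $[0,\infty)$.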