Abstract

The cut-set bound developed by Cover and El Gamal in 1979 has since remained the best known upper bound on the capacity of the Gaussian relay channel. We develop a new explicit upper bound on the capacity of the Gaussian primitive relay channel which is tighter than the cut-set bound. Combined with a simple tensorization argument, this result also implies that the current capacity approximations for Gaussian relay networks, whose gap to the cut-set bound grows linearly in the number of nodes, are order-optimal, and it yields a lower bound on the pre-constant. Our approach differs significantly from the standard information-theoretic approach for proving upper bounds on the capacity of multi-user channels. We use measure concentration to analyze the probabilistic geometric relations between the typical sets of the n-letter random variables associated with a reliable code, which translate to new entropy inequalities between the random variables involved in the problem.
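
For concreteness, the following is a minimal sketch of the benchmark referred to above, written for the symmetric Gaussian primitive relay channel under assumed notation not fixed by the abstract: the source X has power P, the relay observes Z = X + N_2 and the destination observes Y = X + N_1 with independent Gaussian noises of variance N, and the relay is connected to the destination by a noiseless link of rate C_0. In this setting the cut-set bound takes the form
\[
C \;\le\; \max_{p(x)} \min\bigl\{ I(X;Y,Z),\; I(X;Y) + C_0 \bigr\}
\;=\; \min\Bigl\{ \tfrac{1}{2}\log\bigl(1 + \tfrac{2P}{N}\bigr),\; \tfrac{1}{2}\log\bigl(1 + \tfrac{P}{N}\bigr) + C_0 \Bigr\},
\]
where the second equality uses the fact that a Gaussian input simultaneously maximizes both the multiple-access cut term I(X;Y,Z) and the broadcast cut term I(X;Y) + C_0 under the power constraint.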