Abstract

Dvoretzky's theorem tells us that if we put an arbitrary norm on n-dimensional Euclidean space, then no matter what that normed space is like, if we pass to subspaces of dimension about log(n), the space looks pretty much Euclidean. A related probabilistic/measure-theoretic phenomenon has long been observed: the (one-dimensional) marginals of many natural high-dimensional probability distributions look approximately Gaussian. A natural question is whether this phenomenon persists for k-dimensional marginals with k growing with n, and if so, for how large a k. In this talk I will discuss a result showing that the phenomenon does indeed persist as long as k is less than 2 log(n) / log(log(n)), and that this bound is sharp.
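A minimal numerical sketch of the one-dimensional case, not taken from the talk: it samples from one natural high-dimensional distribution (uniform on the sphere of radius sqrt(n) in R^n, scaled so each coordinate has variance about 1) and checks that a random one-dimensional marginal is close to standard Gaussian. The choice of distribution, sample sizes, and the Kolmogorov-Smirnov comparison are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1000   # ambient dimension
N = 20000  # number of sample points

# Sample uniformly from the sphere of radius sqrt(n) in R^n by
# normalizing Gaussian vectors; the scaling makes each coordinate
# have variance roughly 1.
X = rng.standard_normal((N, n))
X *= np.sqrt(n) / np.linalg.norm(X, axis=1, keepdims=True)

# A random unit direction gives a one-dimensional marginal.
theta = rng.standard_normal(n)
theta /= np.linalg.norm(theta)
marginal = X @ theta

# Compare the marginal with the standard Gaussian; a small
# Kolmogorov-Smirnov statistic indicates approximate Gaussianity.
ks = stats.kstest(marginal, "norm")
print(f"KS statistic: {ks.statistic:.4f}")
```

For this distribution the one-dimensional marginals are very nearly Gaussian already for moderate n; the content of the result discussed in the talk is how far this survives for k-dimensional marginals as k grows with n.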
