Abstract

Differentially private algorithms often need to select the best among many candidate options. Classical approaches to this selection problem, such as the Exponential Mechanism, require that each candidate's goodness, measured by a real-valued score function, changes by only a small amount when one person's data changes. In many applications, such as hyperparameter optimization, this stability assumption is much too strong. I will talk about recent work where we consider the selection problem under a much weaker stability assumption on the candidates, namely that the score functions are themselves differentially private. Under this assumption, I will describe algorithms that are near-optimal along the three relevant dimensions: privacy, utility, and computational efficiency.
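For context, the classical Exponential Mechanism mentioned above can be sketched as follows. This is a minimal illustrative implementation, not code from the work being presented; the function name and the unit-sensitivity default are assumptions for the example. It samples candidate i with probability proportional to exp(epsilon * score_i / (2 * sensitivity)), which is exactly the low-sensitivity requirement the talk relaxes:

```python
import math
import random

def exponential_mechanism(scores, epsilon, sensitivity=1.0):
    """Sample an index with probability proportional to
    exp(epsilon * score / (2 * sensitivity)).

    `sensitivity` is the classical stability assumption: no score may
    change by more than this amount when one person's data changes.
    """
    m = max(scores)  # shift scores for numerical stability
    weights = [math.exp(epsilon * (s - m) / (2.0 * sensitivity))
               for s in scores]
    return random.choices(range(len(scores)), weights=weights)[0]
```

With a large privacy parameter the mechanism almost always returns the top-scoring candidate; as epsilon shrinks, the output distribution flattens toward uniform, which is the privacy-utility trade-off at play.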

Joint work with Jingcheng Liu.

Video Recording