Private federated learning and statistics systems often use locally differentially private (LDP) algorithms to randomize user data. The outputs of these local randomizers can be aggregated to achieve stronger central privacy, using cryptographically secure approaches such as Secure Aggregation or PRIO. In high-dimensional settings, however, the cost of communication can become a bottleneck, and no communication-efficient algorithms are known for several basic estimation tasks. This is in contrast to the standard LDP setting, where recent work has shown that locally private randomizers can be compressed to just $\Otilde(\eps)$ bits without loss in utility. Unfortunately, the resulting compressed randomizers are not compatible with the aforementioned secure aggregation machinery, forcing a choice between low communication and efficient secure aggregation. In this work, we address this problem via {\em secret-shareable compression}: a general technique that converts an LDP randomizer into a low-communication randomizer that supports efficient aggregation of compressed secret shares. We instantiate our approach for the frequency estimation problem, obtaining a lossless secret-shareable compression to $\Otilde(\exp(\eps))$ bits. We show similar results for the problem of mean estimation in $\ell_2$. Together, our results lead to efficient algorithms for differentially private aggregation under minimal trust assumptions.
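As a toy illustration of the pipeline the opening sentences describe (and not the paper's construction), the sketch below runs a basic $\eps$-LDP randomized-response randomizer and additively secret-shares each output modulo a prime, so two non-colluding servers can each sum their shares and reveal only the aggregate of the randomized bits. All names (`Q`, `randomized_response`, `share`, `aggregate`) and the choice of modulus are illustrative assumptions.

```python
import math
import random

# Illustrative prime modulus for additive secret sharing (assumed, not
# taken from the paper).
Q = 2_147_483_647

def randomized_response(bit, eps, rng):
    """eps-LDP randomized response: keep the bit w.p. e^eps / (e^eps + 1)."""
    keep_prob = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if rng.random() < keep_prob else 1 - bit

def share(value, rng):
    """Split a value into two additive shares mod Q."""
    s0 = rng.randrange(Q)
    s1 = (value - s0) % Q
    return s0, s1

def aggregate(shares0, shares1):
    """Each server sums its own shares; recombining the two sums
    reveals only the total of the randomized values."""
    t0 = sum(shares0) % Q
    t1 = sum(shares1) % Q
    return (t0 + t1) % Q

rng = random.Random(0)
bits = [1, 0, 1, 1, 0]
noisy = [randomized_response(b, eps=1.0, rng=rng) for b in bits]
pairs = [share(y, rng) for y in noisy]
total = aggregate([p[0] for p in pairs], [p[1] for p in pairs])
assert total == sum(noisy)  # the servers learn the sum, not individual bits
```

Note that each share here costs a full field element per user; the abstract's point is precisely that naive compression of the randomizer breaks this share-and-sum structure, which secret-shareable compression is designed to preserve at low communication cost.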