Talks
Fall 2014

Communication Efficient Distributed Optimization for Machine Learning

Tuesday, December 16th, 2014, 11:10 am – 11:35 am


Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. We propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Spark. In our experiments, we find that, compared to state-of-the-art mini-batch versions of the SGD and SDCA algorithms, COCOA converges to the same 0.001-accurate solution quality on average 25× more quickly.

This is joint work with Virginia Smith, Martin Takáč, Jonathan Terhorst, Sanjay Krishnan, Thomas Hofmann, and Michael I. Jordan.
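
The announcement contains no pseudocode; purely as an illustration of the local-computation idea described in the abstract, the following is a minimal single-process NumPy sketch of a CoCoA-style outer round for the hinge-loss SVM dual, not the authors' Spark implementation. The function names, the use of SDCA as the local solver, and the 1/K averaging of updates are assumptions made to keep the sketch short.

```python
import numpy as np

def local_sdca(X_k, y_k, alpha_k, w, lam, n, local_iters, rng):
    """Approximately solve the local dual subproblem: a few SDCA steps on this
    partition's coordinates (hinge-loss SVM dual), holding all other machines'
    dual variables fixed.  Returns only the local changes."""
    delta_alpha = np.zeros_like(alpha_k)
    delta_w = np.zeros_like(w)
    for _ in range(local_iters):
        i = rng.integers(len(y_k))
        x_i, y_i = X_k[i], y_k[i]
        a_old = alpha_k[i] + delta_alpha[i]
        # Closed-form single-coordinate SDCA update, clipped to [0, 1].
        grad = y_i * (x_i @ (w + delta_w)) - 1.0
        step = lam * n * grad / max(x_i @ x_i, 1e-12)
        a_new = np.clip(a_old - step, 0.0, 1.0)
        delta_alpha[i] += a_new - a_old
        delta_w += (a_new - a_old) * y_i * x_i / (lam * n)
    return delta_alpha, delta_w

def cocoa_round(partitions, alphas, w, lam, n, local_iters, rng):
    """One outer round of a CoCoA-style method (sketch)."""
    K = len(partitions)
    # Every partition works from the same snapshot of w (in parallel in practice).
    updates = [local_sdca(X_k, y_k, alphas[k], w, lam, n, local_iters, rng)
               for k, (X_k, y_k) in enumerate(partitions)]
    # Single reduce step: only the d-dimensional delta-w vectors are communicated.
    for k, (d_alpha, d_w) in enumerate(updates):
        alphas[k] += d_alpha / K   # conservative averaging of updates (1/K)
        w = w + d_w / K
    return w
```

Running `cocoa_round` repeatedly drives w toward the regularized SVM optimum while each machine communicates only one vector per round, regardless of how many local coordinate updates it performs; this is the contrast with mini-batch SGD/SDCA methods, which communicate after every small batch.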