Abstract

The speed and scalability of algorithms for problems on large datasets depend on (i) their locality, i.e., the amount of data the algorithm moves around, and (ii) their degree of parallelism. A cost model that quantifies complexity along these two dimensions would therefore be useful for a systematic study and comparison of algorithms.

This talk will present an overview of cost models for locality and parallelism, including models that are program-centric, i.e., agnostic to the choice of parallel machine. We will discuss how costs in these models map to performance on parallel machines. To demonstrate their use, we will provide illustrative examples of algorithm analyses (upper bounds) and lower bounds for problems.
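
To give a sense of what such program-centric analysis looks like, below is a minimal sketch (not taken from the talk) of a divide-and-conquer reduction annotated with its work, span, and cache-oblivious locality costs; the function name reduce_sum and the cache parameters M and B are illustrative assumptions, not part of the abstract.

```python
# A minimal sketch of program-centric cost analysis: a recursive
# divide-and-conquer summation analyzed in the work-span model and,
# for locality, in a cache-oblivious style with cache size M and line size B.

def reduce_sum(a, lo, hi):
    """Sum a[lo:hi] by divide and conquer.

    Work:  W(n) = 2*W(n/2) + O(1) = O(n)
    Span:  D(n) = D(n/2) + O(1)   = O(log n)  (the two halves can run in parallel)
    Cache misses (cache-oblivious, sequential): Q(n; M, B) = O(n/B + 1),
    since once a subproblem fits in a cache of size M it incurs no further misses.
    """
    n = hi - lo
    if n == 0:
        return 0
    if n == 1:
        return a[lo]
    mid = lo + n // 2
    # In a parallel runtime these two calls would be spawned in parallel;
    # the work-span model sums their work and takes the max of their spans.
    left = reduce_sum(a, lo, mid)
    right = reduce_sum(a, mid, hi)
    return left + right


if __name__ == "__main__":
    data = list(range(1_000_000))
    print(reduce_sum(data, 0, len(data)))  # 499999500000
```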

We hope this talk will start a discussion on the suitability of different cost models for problems in the inference and optimization domains. Such a basis for comparison would help identify good algorithms and highlight the need for new ones where optimal algorithms do not yet exist.
