I will discuss how tools from statistical physics used to analyze partition functions, such as Peierls arguments and the cluster expansion, can be used to solve seemingly unrelated distributed computing problems about programmable matter. Programmable matter is a material or substance that has the ability to change its features in a programmable, distributed way; examples are diverse and include robot swarms and smart materials. We study an abstraction of programmable matter where particles independently move on a lattice according to simple, local algorithms. We want to design these algorithms so that the system has a desired collective behavior, such as compression of the particles into a shape with small perimeter or separation of differently colored particles.

In our stochastic approach, we describe a desired collective behavior using an energy function; design a Markov chain that uses local moves and converges to the Gibbs distribution for this energy function; and then turn the Markov chain into an asynchronous distributed algorithm that each particle can execute independently. To prove our algorithms are correct, we must show this Gibbs distribution has the desired properties with high probability, which is equivalent to bounding the ratio of two polynomials (partition functions). Our previous work on the compression problem used Peierls arguments to do this, and more recent work on the separation problem necessitated the introduction of the cluster expansion. The key feature of the cluster expansion we use is that we can separate partition functions into volume and surface terms that we can analyze separately.

Joint work with Joshua J. Daymude, Cem Gokmen, Dana Randall, and Andrea Richa.
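The pipeline above (energy function, then a local-move Markov chain converging to the corresponding Gibbs distribution) can be illustrated with a toy Metropolis sketch for compression. This is a minimal illustration under simplifying assumptions it introduces itself: a square lattice rather than the triangular lattice used for programmable matter, a global connectivity check in place of the local conditions the actual distributed algorithm uses, and an energy that rewards adjacent particle pairs, so the stationary distribution is proportional to `lam**edges`. None of the names or parameters here come from the authors' implementation.

```python
import random

def neighbors(p):
    # 4-neighborhood on the square lattice (a simplification;
    # the programmable matter model uses the triangular lattice)
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def edge_count(particles):
    # number of adjacent particle pairs, each counted once;
    # higher edge count = smaller perimeter = more compressed
    return sum(1 for p in particles for q in neighbors(p) if q in particles) // 2

def is_connected(particles):
    # global BFS connectivity check; the real distributed algorithm
    # enforces connectivity with purely local conditions instead
    start = next(iter(particles))
    seen, frontier = {start}, [start]
    while frontier:
        for q in neighbors(frontier.pop()):
            if q in particles and q not in seen:
                seen.add(q)
                frontier.append(q)
    return len(seen) == len(particles)

def metropolis_step(particles, lam):
    # Propose moving a random particle to a random empty neighbor;
    # accept with probability min(1, lam**(new_edges - old_edges)),
    # so the chain converges to the Gibbs distribution ~ lam**edges.
    p = random.choice(sorted(particles))
    q = random.choice(neighbors(p))
    if q in particles:
        return
    old = edge_count(particles)
    particles.remove(p)
    particles.add(q)
    accept = (is_connected(particles)
              and random.random() < min(1.0, lam ** (edge_count(particles) - old)))
    if not accept:  # reject: undo the move
        particles.remove(q)
        particles.add(p)

# toy run: a line of 8 particles tends to compress when lam > 1
config = {(i, 0) for i in range(8)}
random.seed(0)
for _ in range(20000):
    metropolis_step(config, lam=4.0)
```

The bias parameter `lam` plays the role of the energy function's temperature: `lam > 1` favors configurations with many particle-particle edges (compression), while `lam < 1` favors sprawling configurations.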