I will give an overview of some of my work on dynamic asynchronous algorithms and systems for graph-parallel machine learning. I will begin with background motivation on the need for structured probabilistic reasoning and a review of the traditional tools and techniques used in this domain. I will then introduce our work on inference in probabilistic graphical models. In particular, I will present our Splash BP and Splash Gibbs sampling algorithms and show how adaptive asynchronous parallelization can be used to accelerate convergence and preserve sequential statistical properties in parallel and distributed settings. Motivated by our work on these algorithms, I will provide an overview of GraphLab: a high-level abstraction and system that supports the design and implementation of dynamic asynchronous graph-parallel algorithms.