Abstract
First-order model counting recently emerged as a computational tool for high-level probabilistic reasoning. It is concerned with counting the satisfying assignments of sentences in first-order logic, and it upgrades the successful propositional model counting approaches to probabilistic reasoning. We give an overview of model counting as it is applied in statistical relational learning, probabilistic programming, probabilistic databases, and hybrid reasoning. A short tutorial illustrates the principles behind these solvers. Finally, we show that first-order model counting is a fundamentally different problem from the propositional counting techniques that inspired it.