Abstract

The success of deep learning is inextricably linked to gradient-based optimization and to differentiable programming in general. In scientific applications, merging machine learning models with physics-based simulators is particularly compelling. ML surrogates can replace or boost expensive simulators, and can also reduce the gap between (necessarily) incomplete models and experimental data. Physics-derived concepts and invariances add inductive bias to otherwise black-box models, improving their expressivity, learning efficiency, or generalization.

Here, we will describe research examples in the area of molecular simulations in which ML surrogate functions, and in particular their gradients accessed through differentiable programming, are exploited. The algorithmic applications include active learning of machine learning potentials for ground and excited states, adversarial attacks on differentiable uncertainty estimates, learning of data-driven collective variables for enhanced sampling simulations, coarse-graining and backmapping of all-atom simulations, and interpolation of differentiable alchemical atom types for thermodynamic integration.
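To make the central idea concrete, the sketch below shows how differentiable programming yields forces "for free" from an ML potential: the energy is a scalar function of atomic positions, and automatic differentiation supplies its negative gradient. This is a minimal illustration in JAX assuming a toy pairwise surrogate; the model, its parameters, and the function names are hypothetical placeholders, not the actual potentials discussed in the talk.

```python
import jax
import jax.numpy as jnp

def energy(params, positions):
    """Toy ML surrogate: scalar energy from pairwise distances (hypothetical model)."""
    n = positions.shape[0]
    diffs = positions[:, None, :] - positions[None, :, :]
    # Add the identity on the diagonal so sqrt stays differentiable at zero self-distance.
    dists = jnp.sqrt(jnp.sum(diffs**2, axis=-1) + jnp.eye(n))
    w, b = params
    pair_energies = jnp.tanh(w * dists + b)
    mask = 1.0 - jnp.eye(n)                       # exclude self-interactions
    return 0.5 * jnp.sum(mask * pair_energies)    # count each pair once

# Forces come for free: the negative gradient of the energy w.r.t. positions.
force_fn = jax.grad(energy, argnums=1)

params = (jnp.array(0.5), jnp.array(-1.0))
positions = jnp.array([[0.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
forces = -force_fn(params, positions)  # shape (3, 3): one force vector per atom
```

The same mechanism extends to the applications listed above, e.g. taking gradients of an uncertainty estimate with respect to atomic coordinates to generate adversarial configurations, or differentiating through a learned collective variable for biased sampling.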
