
Machine Learning for Interatomic Potentials

Synopsis

The two-and-a-half-day Machine Learning for Molecular Simulation activities consist of introductory lectures, practicals, and advanced seminars. The first two days are structured similarly: each starts with a lecture giving a high-level overview and some theoretical background, followed by an introduction to the practicals. Students will use Jupyter notebooks hosted on the Deepnote.com cloud platform. The practicals introduce popular methods for fitting interatomic potentials: the Gaussian Approximation Potential (GAP), the Atomic Cluster Expansion (ACE), and the Message Passing Neural Network for Fast and Accurate Force Fields (MACE). On the first day, the practical covers training simple force fields on fixed data sets, such as small molecules and solid materials. On the second day we introduce neural network potentials and explore the stability of trained force fields beyond simple error quantification. On the last day we touch on iterative training and active learning protocols.

Lecture 1 General Introduction to ML: motivation/context, history, outline

Lecture 2 Representation, Atomic Descriptors

Lecture 3 Kernel/Linear models: GAP/ACE

Workshop 1 (2.5 h) (GAP or ACE)
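To give a flavour of what the Workshop 1 notebooks involve, below is a minimal, self-contained sketch of a GAP-style kernel fit of total energies on a fixed data set. The toy descriptor (sorted inverse pair distances) and the EMT reference energies are stand-ins for illustration only; the workshop itself uses proper descriptors and the GAP or ACE codes rather than this hand-rolled fit.

```python
# Minimal sketch of a GAP-style (kernel ridge regression) energy fit on a
# fixed set of small molecules. EMT energies stand in for reference data.
import numpy as np
from ase.build import molecule
from ase.calculators.emt import EMT  # cheap stand-in for a reference method

def descriptor(atoms, size=15):
    """Toy per-structure descriptor: sorted inverse pair distances, zero-padded."""
    d = atoms.get_all_distances()
    inv = np.sort(1.0 / d[np.triu_indices(len(atoms), k=1)])[::-1]
    out = np.zeros(size)
    n = min(len(inv), size)
    out[:n] = inv[:n]
    return out

# Fixed training set: a few small molecules with reference energies.
structures = [molecule(name) for name in ("H2O", "NH3", "CH4", "C2H6", "CO2")]
for s in structures:
    s.calc = EMT()
X = np.array([descriptor(s) for s in structures])
y = np.array([s.get_potential_energy() for s in structures])

def kernel(A, B, sigma=1.0):
    """Gaussian (squared-exponential) kernel between rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / sigma**2)

# Kernel ridge regression: alpha = (K + lambda * I)^-1 y
K = kernel(X, X)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(y)), y)

# Predict the energy of a held-out structure.
test = molecule("CH3OH")
test.calc = EMT()
pred = kernel(descriptor(test)[None, :], X) @ alpha
print(f"predicted {pred[0]:.3f} eV   reference {test.get_potential_energy():.3f} eV")
```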

Lecture 4 Neural Networks: MACE

Lecture 5 Beyond RMSE, MD stability

Lecture 6 Training workflows

Workshop 2 (2.5 h) (MACE, issues with stability)
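Workshop 2 looks at how a force field behaves in molecular dynamics rather than only at its test-set errors. The following is a minimal sketch of such a stability check, assuming ASE with EMT as a stand-in calculator; in the workshop a trained MACE model would be attached in its place.

```python
# Run short NVT dynamics and monitor simple physical observables
# (temperature, maximum force, minimum interatomic distance) to flag
# typical symptoms of an unstable or extrapolating force field.
import numpy as np
from ase import units
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.md.langevin import Langevin
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution

atoms = bulk("Cu", "fcc", a=3.6) * (3, 3, 3)
atoms.calc = EMT()  # stand-in for a trained ML potential
MaxwellBoltzmannDistribution(atoms, temperature_K=600)

dyn = Langevin(atoms, timestep=2 * units.fs, temperature_K=600, friction=0.002)

def monitor():
    """Print observables and warn if the trajectory looks unphysical."""
    d = atoms.get_all_distances(mic=True)
    d_min = d[np.triu_indices(len(atoms), k=1)].min()
    f_max = np.abs(atoms.get_forces()).max()
    print(f"T = {atoms.get_temperature():7.1f} K   "
          f"max |F| = {f_max:6.2f} eV/Å   min d = {d_min:5.2f} Å")
    if d_min < 1.0 or f_max > 50.0:
        print("warning: trajectory looks unphysical (possible extrapolation)")

dyn.attach(monitor, interval=100)
dyn.run(1000)
```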

Lecture 7 Error quantification / active learning

Workshop 3 (2 h) (active learning)
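As a schematic of the kind of protocol touched on in Lecture 7 and Workshop 3, here is a sketch of a committee-based active-learning loop. The callables train_model, run_md, and label_with_reference are hypothetical placeholders to be supplied by the user (for example a MACE training run, an ASE MD trajectory, and a reference single-point calculation), and committee force disagreement is one common selection criterion, not necessarily the one used in the workshop.

```python
import numpy as np

def active_learning_loop(initial_dataset, train_model, run_md, label_with_reference,
                         n_iterations=5, committee_size=4, force_threshold=0.1):
    """Generic committee-based active-learning driver.

    train_model(dataset, seed)    -> model with a .predict_forces(atoms) method
    run_md(model, n_steps)        -> list of sampled configurations
    label_with_reference(atoms)   -> configuration with reference energy/forces
    (all three callables are hypothetical placeholders supplied by the user)
    """
    dataset = list(initial_dataset)
    for iteration in range(n_iterations):
        # 1. Train a committee of models that differ only in their random seed.
        committee = [train_model(dataset, seed=s) for s in range(committee_size)]

        # 2. Explore configuration space with MD driven by one committee member.
        samples = run_md(committee[0], n_steps=10_000)

        # 3. Keep configurations where the committee disagrees on the forces.
        selected = []
        for atoms in samples:
            forces = np.array([m.predict_forces(atoms) for m in committee])
            disagreement = forces.std(axis=0).max()  # largest per-component std, eV/Å
            if disagreement > force_threshold:
                selected.append(atoms)

        # 4. Label the selected configurations with the reference method
        #    and grow the training set before the next iteration.
        dataset += [label_with_reference(atoms) for atoms in selected]
        print(f"iteration {iteration}: added {len(selected)} configurations")
    return dataset
```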