Publication:
Equivariant Deep Learning Interatomic Potentials

Date

2023-06-01

Citation

Batzner, Simon Lutz. 2023. Equivariant Deep Learning Interatomic Potentials. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.

Abstract

Simulating the quantum-mechanical behavior of matter is at the core of the computational design of novel molecules and materials. First-principles methods provide high fidelity but scale poorly with the number of electrons, greatly limiting the accessible length- and time-scales. This effect is particularly pronounced in molecular dynamics, where the short required integration time steps mean that millions or even billions of steps are needed to compute structural and kinetic observables. Machine learning interatomic potentials aim to move past this dilemma by learning to regress the energies, forces, and stresses of accurate first-principles calculations, providing an energy model that scales linearly with the number of atoms. Representations of the atomistic geometry must obey the symmetries of 3D space: translations, rotations, and reflections, which together comprise the Euclidean group E(3). For decades, interatomic potentials have represented the atomistic geometry through invariants of the geometry, typically distances and angles, thereby automatically satisfying the required symmetries. In this thesis, I present efforts to generalize these invariant representations to equivariant ones that operate directly on the relative interatomic positions and more faithfully represent the atomistic geometry. I demonstrate that this leads to large improvements in generalization, robustness, and transferability, as well as order-of-magnitude improvements in the sample efficiency of the learned potentials. I then demonstrate how to move from atom-centered message-passing interatomic potentials, the dominant approach in deep learning interatomic potentials, to strictly local deep learning interatomic potentials. These retain the accuracy and transferability of message-passing potentials but, owing to their local nature, can be scaled to large-scale simulations. I demonstrate this scalability on a molecular dynamics simulation of more than 100 million atoms. The work demonstrates how leveraging the symmetry of 3D space can lead to fundamental advances in machine learning interatomic potentials.
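
As a minimal illustration of the invariance/equivariance distinction described in the abstract (a NumPy/SciPy sketch with hypothetical toy positions, not code from the thesis), the snippet below checks that pairwise distances are unchanged under a random rotation, while relative interatomic position vectors co-rotate with it:

import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical toy configuration: four atomic positions in 3D.
positions = np.random.rand(4, 3)

# Relative interatomic position vectors r_ij = r_j - r_i and their norms.
rel = positions[None, :, :] - positions[:, None, :]
dists = np.linalg.norm(rel, axis=-1)

# Apply a random rotation R from SO(3) to every atomic position.
R = Rotation.random().as_matrix()
positions_rot = positions @ R.T
rel_rot = positions_rot[None, :, :] - positions_rot[:, None, :]
dists_rot = np.linalg.norm(rel_rot, axis=-1)

# Invariance: distances (and angles) are unchanged by the rotation.
assert np.allclose(dists, dists_rot)

# Equivariance: the relative position vectors transform with the same R.
# An equivariant network's internal features transform analogously,
# rather than being reduced to rotation-invariant scalars up front.
assert np.allclose(rel_rot, rel @ R.T)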

Keywords

Applied mathematics

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.
