Committee neural network potentials control generalization errors and enable active learning

Publication at Faculty of Mathematics and Physics |
2020

Abstract

It is well known in the field of machine learning that committee models improve accuracy, provide generalization error estimates, and enable active learning strategies. In this work, we adapt these concepts to interatomic potentials based on artificial neural networks.

Instead of a single model, multiple models that share the same atomic environment descriptors yield an average that outperforms its individual members as well as a measure of the generalization error in the form of the committee disagreement. We not only use this disagreement to identify the most relevant configurations to build up the model's training set in an active learning procedure but also monitor and bias it during simulations to control the generalization error.
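The committee average and disagreement described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `models` stands in for any list of trained potentials that share the same atomic environment descriptors, and the disagreement is taken as the standard deviation across their predictions.

```python
import numpy as np

def committee_predict(models, x):
    """Return the committee-averaged prediction and the committee
    disagreement (standard deviation across members), which serves
    as an estimate of the generalization error.

    `models` is an illustrative list of callables mapping a
    configuration (descriptor vector) to a prediction.
    """
    preds = np.array([m(x) for m in models])
    mean = preds.mean(axis=0)          # committee average
    disagreement = preds.std(axis=0)   # error estimate
    return mean, disagreement
```

A large disagreement flags configurations where the committee members extrapolate differently, i.e. where the training set does not constrain the model well.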

This facilitates the adaptive development of committee neural network potentials and their training sets while keeping the number of ab initio calculations to a minimum. To illustrate the benefits of this methodology, we apply it to the development of a committee model for water in the condensed phase.
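The active-learning step that keeps ab initio calculations to a minimum can be sketched as selecting, from a pool of candidate configurations, only those with the highest committee disagreement for reference labeling. The function and its signature are hypothetical, shown under the assumption of scalar per-configuration predictions:

```python
import numpy as np

def select_by_disagreement(candidates, committee, n_add):
    """Rank candidate configurations by committee disagreement and
    return the `n_add` most uncertain ones; only these would be sent
    to expensive ab initio reference calculations and added to the
    training set. Names are illustrative, not the published code.
    """
    disagreements = np.array(
        [np.std([m(c) for m in committee]) for c in candidates]
    )
    # indices sorted from largest to smallest disagreement
    order = np.argsort(disagreements)[::-1]
    return [candidates[i] for i in order[:n_add]]
```

Iterating this selection, labeling, and retraining loop expands the training set only where the current committee is least reliable.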

Starting from a single reference ab initio simulation, we use active learning to expand into new state points and to describe the quantum nature of the nuclei. The final model, trained on 814 reference calculations, yields excellent results under a range of conditions, from liquid water at ambient and elevated temperatures and pressures to different phases of ice and the air-water interface, all including nuclear quantum effects.

This approach to committee models will enable the systematic development of robust machine learning models for a broad range of systems.