A new article titled "Uncertainty quantification in molecular simulations with dropout neural network potentials" by Mingjian Wen and Ellad Tadmor has been published in npj Computational Materials. The article describes a rigorous and efficient approach based on dropout regularization for estimating uncertainty in the predictions of artificial neural network potentials. This leads to a new class of "Dropout Uncertainty Neural Network" (DUNN) potentials. A model driver for DUNN potentials is available here.
From the abstract:
"Machine learning interatomic potentials (IPs) can provide accuracy close to that of first-principles methods, such as density functional theory (DFT), at a fraction of the computational cost. This greatly extends the scope of accurate molecular simulations, providing opportunities for quantitative design of materials and devices on scales hitherto unreachable by DFT methods. However, machine learning IPs have a basic limitation in that they lack a physical model for the phenomena being predicted and therefore have unknown accuracy when extrapolating outside their training set. In this paper, we propose a class of Dropout Uncertainty Neural Network (DUNN) potentials that provide rigorous uncertainty estimates that can be understood from both Bayesian and frequentist statistics perspectives. As an example, we develop a DUNN potential for carbon and show how it can be used to predict uncertainty for static and dynamical properties, including stress and phonon dispersion in graphene. We demonstrate two approaches to propagate uncertainty in the potential energy and atomic forces to predicted properties. In addition, we show that DUNN uncertainty estimates can be used to detect configurations outside the training set, and in some cases, can serve as a predictor for the accuracy of a calculation."
The full article is available here.