Dual-Level Training of Gaussian Processes with Physically Inspired Priors for Geometry Optimizations
- Chong Teng, Department of Chemistry, Boston College, Chestnut Hill, Massachusetts 02467, United States
- Yang Wang, Department of Chemistry, Boston College, Chestnut Hill, Massachusetts 02467, United States
- Daniel Huang, Department of Computer Science, San Francisco State University, San Francisco, California 94132, United States
- Katherine Martin, Department of Chemistry, Boston College, Chestnut Hill, Massachusetts 02467, United States
- Jean-Baptiste Tristan* (Email: [email protected]), Department of Computer Science, Boston College, Chestnut Hill, Massachusetts 02467, United States
- Junwei Lucas Bao* (Email: [email protected]), Department of Chemistry, Boston College, Chestnut Hill, Massachusetts 02467, United States
Abstract

Gaussian process (GP) regression has recently emerged as an effective method for molecular geometry optimization. The prior mean function is one of the crucial components of a GP. We design and validate two types of physically inspired prior mean functions: force-field-based priors and posterior-type priors. In this work, we implement a dual-level training (DLT) optimizer for the posterior-type priors. DLT optimizers can be considered a class of optimization algorithms within the delta-machine-learning paradigm, albeit with several major differences from previously proposed algorithms in that paradigm. In the first level of the DLT, we incorporate classical mechanical descriptions of the equilibrium geometries into the prior function, which enhances the performance of the GP optimizer relative to one using a constant (or zero) prior. In the second level, we use the surrogate potential energy surfaces (PESs), which incorporate the physics learned in the first-level training, as the prior function to further refine the model performance. We find that the force-field-based priors and posterior-type priors reduce the overall number of optimization steps by a factor of 2–3 compared to the limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) optimizer as well as to the constant-prior GP optimizer proposed in previous works. We also demonstrate the potential of recovering the real PESs with a GP using a force-field prior. This work shows the importance of including domain knowledge as an ingredient of the GP, which offers a potentially robust learning model for molecular geometry optimization and for exploring molecular PESs.
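The central idea above can be illustrated with a minimal sketch: a GP whose prior mean is a cheap physically motivated surrogate (here, a hypothetical harmonic model standing in for a force field), so the GP regresses only the residual between the true PES and the prior. This is a toy illustration of the delta-learning-style setup described in the abstract, not the authors' implementation; the kernel, length scale, and the 1-D "PES" are assumptions chosen for simplicity.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(X_train, y_train, X_test, prior_mean, noise=1e-8):
    """GP posterior mean with a non-constant prior mean function.

    The GP models y - m(x); the prediction adds m(x) back, so far from
    the training data the model falls back to the physical prior.
    """
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train)
    resid = y_train - prior_mean(X_train)        # delta-learning target
    alpha = np.linalg.solve(K, resid)
    return prior_mean(X_test) + K_s @ alpha

# Toy 1-D "true PES" (double well) and a crude harmonic prior --
# both hypothetical stand-ins for an ab initio PES and a force field.
true_pes = lambda X: (X[:, 0] ** 2 - 1.0) ** 2
harmonic_prior = lambda X: 0.5 * X[:, 0] ** 2

X_train = np.linspace(-1.5, 1.5, 8).reshape(-1, 1)
y_train = true_pes(X_train)
X_test = np.array([[0.5], [1.0]])
pred = gp_posterior_mean(X_train, y_train, X_test, harmonic_prior)
```

Because the prior already captures the coarse shape of the surface, the residual the GP must learn is smoother and smaller in magnitude, which is the intuition behind replacing a constant prior with a physically inspired one.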
Cited By
This article is cited by 1 publication.
- Chong Teng, Daniel Huang, Junwei Lucas Bao. A spur to molecular geometry optimization: Gradient-enhanced universal kriging with on-the-fly adaptive ab initio prior mean functions in curvilinear coordinates. The Journal of Chemical Physics 2023, 158 (2), 024112. https://doi.org/10.1063/5.0133675