Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning
Yao-Kun Lei, K. Yagi, Yuji Sugita

Affiliations: Theoretical Molecular Science Laboratory, Wako, Japan; Computational Biophysics Research Team, Kobe, Japan; Laboratory for Biomolecular Function Simulation, Kobe, Japan
Publication type: Journal Article
Publication date: 2025-03-02
Scimago: Q1
WoS: Q1
SJR: 1.482
CiteScore: 9.8
Impact factor: 5.5
ISSN: 1549-9618, 1549-9626
Abstract
Machine learning (ML) methods have emerged as an efficient surrogate for high-level electronic structure theory, offering both precision and computational efficiency. However, the vast conformational and chemical space remains challenging when constructing a general force field: training data sets typically cover only a limited region of this space, resulting in poor extrapolation performance. Traditional strategies address this problem by retraining models from scratch on the combined old and new data sets. In addition, model transferability is crucial for constructing a general force field. Existing ML force fields, designed for closed systems with no external environmental potential, exhibit limited transferability to complex condensed-phase systems such as enzymatic reactions, resulting in inferior performance and high memory costs. Our ML/MM model, based on a Taylor expansion of the electrostatic operator, showed high transferability between reactions in several simple solvents. This work extends the strategy to enzymatic reactions to explore transferability between more complex heterogeneous environments. We also apply continual learning strategies based on memory data sets to enable autonomous, on-the-fly training on a continuous stream of new data. By combining these two methods, we can efficiently construct a force field applicable to chemical reactions in various environmental media.
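The memory-based continual learning the abstract describes (training on a stream of new configurations while replaying stored old ones, rather than retraining from scratch) can be sketched in a minimal, purely illustrative form. The names (`ReplayMemory`, `continual_step`) and the reservoir-sampling memory policy are assumptions for illustration, not the authors' actual implementation.

```python
import random


class ReplayMemory:
    """Fixed-size memory of past training samples, filled by reservoir sampling
    so that every sample seen so far has an equal chance of being retained."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.seen = 0          # total samples ever offered to the memory
        self.buffer = []       # retained samples, at most `capacity` of them
        self.rng = random.Random(seed)

    def add(self, sample):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(sample)
        else:
            # replace a random slot with probability capacity / seen
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = sample


def continual_step(train_fn, new_batch, memory, replay_size):
    """One continual-learning update: train on the new batch mixed with
    samples replayed from memory, then store the new batch in memory."""
    replay = memory.rng.sample(memory.buffer,
                               min(replay_size, len(memory.buffer)))
    train_fn(new_batch + replay)   # user-supplied model update routine
    for sample in new_batch:
        memory.add(sample)
```

In this sketch `train_fn` stands in for whatever gradient update the potential uses; the point is only that each update sees a mixture of new and remembered data, which is what counteracts forgetting of previously learned regions of chemical space.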
Metrics
Total citations: 4
Cite this (GOST)
Lei Y. et al. Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning // Journal of Chemical Theory and Computation. 2025. Vol. 21. No. 5. pp. 2695-2711.
GOST, all authors (up to 50)
Lei Y., Yagi K., Sugita Y. Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning // Journal of Chemical Theory and Computation. 2025. Vol. 21. No. 5. pp. 2695-2711.
Cite this (RIS)
TY - JOUR
DO - 10.1021/acs.jctc.4c01393
UR - https://pubs.acs.org/doi/10.1021/acs.jctc.4c01393
TI - Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning
T2 - Journal of Chemical Theory and Computation
AU - Lei, Yao-Kun
AU - Yagi, K.
AU - Sugita, Yuji
PY - 2025
DA - 2025/03/02
PB - American Chemical Society (ACS)
SP - 2695-2711
IS - 5
VL - 21
SN - 1549-9618
SN - 1549-9626
ER -
Cite this (BibTeX, up to 50 authors)
@article{2025_Lei,
author = {Yao-Kun Lei and K. Yagi and Yuji Sugita},
title = {Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning},
journal = {Journal of Chemical Theory and Computation},
year = {2025},
volume = {21},
publisher = {American Chemical Society (ACS)},
month = {mar},
url = {https://pubs.acs.org/doi/10.1021/acs.jctc.4c01393},
number = {5},
pages = {2695--2711},
doi = {10.1021/acs.jctc.4c01393}
}
Cite this (MLA)
Lei, Yao-Kun, et al. “Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning.” Journal of Chemical Theory and Computation, vol. 21, no. 5, Mar. 2025, pp. 2695-2711. https://pubs.acs.org/doi/10.1021/acs.jctc.4c01393.