
Kohn-Sham equations as regularizer: building prior knowledge into machine-learned physics


If you have a question about this talk, please contact Bingqing Cheng.

Including prior knowledge is important for building effective machine learning models in physics, and it is usually done explicitly, by adding loss terms or constraining model architectures. Prior knowledge embedded in the physics computation itself has received far less attention. We treat the Kohn-Sham self-consistent calculation as a differentiable program and show that solving the Kohn-Sham equations while training neural networks for the exchange-correlation functional provides an implicit regularization that greatly improves generalization. Our results serve as a proof of principle for rethinking computational physics within this emerging paradigm of scientific computing, driven by rapid developments in automatic differentiation libraries, hardware accelerators, and deep learning.
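The core idea, differentiating through the Kohn-Sham self-consistent-field (SCF) loop so that a loss on the converged density trains the neural exchange-correlation functional end to end, can be sketched in a few lines of JAX. The sketch below is a toy stand-in, not the speakers' implementation: the 1D grid, finite-difference kinetic operator, small MLP "XC" potential, fixed number of mixing iterations, and the reference density are all assumptions made for brevity.

```python
# Minimal sketch (toy, not the authors' code): backpropagate through an
# unrolled SCF loop so that gradients of a density-matching loss reach the
# neural network standing in for the exchange-correlation potential.
import jax
import jax.numpy as jnp

N_GRID = 32                         # toy 1D real-space grid
N_ITER = 10                         # number of unrolled SCF iterations
x = jnp.linspace(-5.0, 5.0, N_GRID)
dx = x[1] - x[0]

def kinetic():
    """Finite-difference second-derivative operator (toy kinetic term)."""
    main = jnp.full(N_GRID, 2.0)
    off = jnp.full(N_GRID - 1, -1.0)
    return (jnp.diag(main) + jnp.diag(off, 1) + jnp.diag(off, -1)) / dx**2

def xc_potential(params, density):
    """Tiny MLP mapping the local density to an 'XC' potential (a stand-in)."""
    h = jnp.tanh(density[:, None] * params["w1"] + params["b1"])
    return (h @ params["w2"] + params["b2"]).squeeze(-1)

def scf(params, v_ext, n_electrons=2):
    """Unrolled SCF loop: diagonalize, rebuild density, mix, repeat."""
    density = jnp.ones(N_GRID) / (N_GRID * dx)
    T = kinetic()
    for _ in range(N_ITER):
        v_eff = v_ext + xc_potential(params, density)
        H = T + jnp.diag(v_eff)
        _, orbitals = jnp.linalg.eigh(H)
        occ = orbitals[:, : n_electrons // 2]           # doubly occupied orbitals
        new_density = 2.0 * jnp.sum(occ**2, axis=1) / dx
        density = 0.5 * density + 0.5 * new_density     # simple linear mixing
    return density

def loss(params, v_ext, target_density):
    """Match the converged density to a reference density."""
    return jnp.mean((scf(params, v_ext) - target_density) ** 2)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = {
    "w1": 0.1 * jax.random.normal(k1, (1, 16)),
    "b1": jnp.zeros(16),
    "w2": 0.1 * jax.random.normal(k2, (16, 1)),
    "b2": jnp.zeros(1),
}

v_ext = 0.5 * x**2                                       # harmonic external potential
target = jnp.exp(-x**2) / jnp.sum(jnp.exp(-x**2) * dx)   # placeholder reference density

# Gradients flow through every SCF iteration, which is the implicit regularization
# discussed in the abstract: the network is only ever evaluated inside the solver.
grads = jax.grad(loss)(params, v_ext, target)
print(jax.tree_util.tree_map(jnp.shape, grads))
```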

This talk is part of the Machine learning in Physics, Chemistry and Materials discussion group (MLDG) series.
