Self-learning Monte Carlo method with equivariant Transformer
- Speaker: Yuki Nagai (University of Tokyo)
- Date & Time: Wednesday 26 March 2025, 14:45-15:15
- Venue: DAMTP, MR11
Abstract
Machine learning and deep learning have revolutionized computational physics, particularly the simulation of complex systems. Equivariance is essential for simulating physical systems because it imposes a strong inductive bias on the probability distribution described by a machine learning model. However, imposing symmetry on the model can sometimes lead to poor acceptance rates in self-learning Monte Carlo (SLMC). Here, we introduce a symmetry-equivariant attention mechanism for SLMC, which can be systematically improved. We evaluate our architecture on a spin-fermion model (i.e., the double-exchange model) on a two-dimensional lattice. Our results show that the proposed method overcomes the poor acceptance rates of linear models and exhibits a scaling law similar to that of large language models, with model quality monotonically increasing with the number of layers [1]. Our work paves the way for the development of more accurate and efficient machine-learning-assisted Monte Carlo algorithms for simulating complex physical systems.
[1] Y. Nagai and A. Tomiya, J. Phys. Soc. Jpn. 93, 114007 (2024).
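For context, the core of any SLMC scheme is to drive proposals with a cheap learned effective model and then restore exactness with a Metropolis correction against the true energy. The following is a minimal, model-agnostic sketch of that two-step update, assuming abstract `energy`, `energy_eff`, and `propose` callables; it is illustrative only and is not the equivariant-Transformer implementation discussed in the talk.

```python
import math
import random

def slmc_step(x, energy, energy_eff, propose, beta=1.0, inner_steps=10):
    """One self-learning Monte Carlo step (toy sketch).

    A short inner Metropolis chain is run with the inexpensive
    effective model `energy_eff`; the resulting configuration is then
    accepted or rejected against the true `energy` with
        A = min(1, exp(-beta * [(E(y) - E(x)) - (E_eff(y) - E_eff(x))])),
    which makes the overall chain sample exp(-beta * E) exactly.
    """
    y = x
    for _ in range(inner_steps):
        z = propose(y)
        # standard Metropolis accept/reject on the effective model
        if random.random() < math.exp(min(0.0, -beta * (energy_eff(z) - energy_eff(y)))):
            y = z
    # exact correction step: ratio of true and effective Boltzmann weights
    log_a = -beta * ((energy(y) - energy(x)) - (energy_eff(y) - energy_eff(x)))
    return y if math.log(random.random()) < log_a else x
```

The better the effective model approximates the true energy differences, the closer `log_a` stays to zero and the higher the acceptance rate of the correction step, which is exactly the quantity the talk's equivariant attention architecture is designed to improve.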
Series: This talk is part of the DAMTP Data Intensive Science Seminar series.