DFT is so 2008?
- 👤 Speaker: Aaron Kaplan, Lawrence Berkeley National Laboratory (USA)
- 📅 Date & Time: Monday 09 March 2026, 14:00 - 14:30
- 📍 Venue: https://zoom.us/j/92447982065?pwd=RkhaYkM5VTZPZ3pYSHptUXlRSkppQT09
Abstract
Although DFT has long been the workhorse method of computational chemistry, materials science, and physics, all beasts of burden eventually need a break. Machine learning interatomic potentials (MLIPs) have recently begun to bridge the divide between chemistry-dependent classical interatomic potentials and DFT. Often covering nearly the entire periodic table, an MLIP's accuracy depends heavily on its training data. Extant MLIP-ready datasets, such as the Materials Project, tend to be noisy and favor basins on the potential energy surface. More recent efforts to purpose-build datasets for training MLIPs have favored maximalist approaches, possibly leading to dataset duplication. In my talk, I'll give a broad overview of the accuracy and computational considerations that enter into the construction of MLIP datasets, including relevant background on DFT. I'll then discuss two recent dataset generation efforts I've contributed to, MatPES and MP-ALOE, which have sought to maximize dataset information density and accuracy. I'll finish with an outlook on where these efforts lead.
Series
This talk is part of the Lennard-Jones Centre series.
Included in Lists
- Hanchen DaDaDash
- https://zoom.us/j/92447982065?pwd=RkhaYkM5VTZPZ3pYSHptUXlRSkppQT09
- Lennard-Jones Centre