Machine learning Hubbard parameters with equivariant neural networks

We have developed a machine learning model that dramatically speeds up the calculation of Hubbard parameters, which are critical for accurately modeling materials containing transition-metal or rare-earth elements.

Scientists at EPFL and the Paul Scherrer Institute have demonstrated that machine learning can significantly reduce the time and computational cost of density-functional theory with extended Hubbard functionals (DFT + U + V). This widely used method enables the accurate simulation of complex materials containing transition-metal or rare-earth elements by mitigating self-interaction errors inherent in semi-local functionals, particularly in systems with partially-filled d and f electronic states.

Achieving high accuracy with DFT + U + V depends on the precise determination of the on-site U and inter-site V Hubbard parameters. Traditionally, these parameters are either obtained through semi-empirical tuning—requiring prior knowledge—or through predictive but computationally expensive first-principles calculations. To address this challenge, the research team leveraged a recently developed class of neural networks called equivariant neural networks (ENNs), which offer a novel approach to learning and predicting Hubbard parameters with minimal computational overhead.

A New Machine Learning Approach

The team's machine learning model employs atomic occupation matrices as descriptors. These matrices directly capture the electronic structure, local chemical environment, and oxidation states of the system under investigation. The model is trained to predict Hubbard parameters computed self-consistently using iterative linear-response calculations within density-functional perturbation theory (DFPT) and structural relaxations.
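To make the descriptor concrete, here is a toy sketch (not the authors' code) of how an atomic occupation matrix is assembled: for a d shell it is the 5×5 Hermitian matrix n_{mm'} = Σ_v f_v ⟨ψ_v|φ_m⟩⟨φ_m'|ψ_v⟩, built from projections of occupied Kohn–Sham states onto localized atomic orbitals. The state counts, projections, and occupations below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_orb = 8, 5                       # occupied states, d-shell orbitals (hypothetical)
proj = rng.normal(size=(n_states, n_orb))    # stand-in projections <psi_v|phi_m>, real for simplicity
f = np.ones(n_states)                        # occupations f_v

# n_{mm'} = sum_v f_v <psi_v|phi_m> <phi_m'|psi_v>
occ = np.einsum("v,vm,vn->mn", f, proj, proj)
assert np.allclose(occ, occ.T)               # Hermitian (symmetric for real projections)

# The trace gives the total shell occupation, an indicator of oxidation state.
shell_occupation = np.trace(occ)
```

Because the matrix encodes how electrons populate the localized orbitals, it reflects exactly the chemistry (oxidation state, ligand environment) that the Hubbard parameters depend on.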

The researchers trained two separate models, one for the U parameter and another for the V parameter, so that each can be applied independently. Using a dataset of 12 materials with varying crystal structures and compositions, the models achieved impressive results, with mean absolute relative errors of just 3% for U and 5% for V. Notably, these models also performed well in predicting downstream properties such as magnetic moments and voltages, demonstrating their robustness and practical applicability.
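For readers unfamiliar with the metric, the mean absolute relative error quoted above can be computed as follows; this is our reading of the metric, and the U values used in the example are hypothetical, not data from the paper.

```python
import numpy as np

def mare(reference, predicted):
    """Mean absolute relative error: mean of |pred - ref| / |ref|."""
    reference = np.asarray(reference, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs(predicted - reference) / np.abs(reference)))

# Hypothetical DFPT reference vs. ML-predicted U values, in eV
u_ref = [4.50, 5.10, 3.80]
u_pred = [4.60, 5.00, 3.90]
print(f"MARE = {mare(u_ref, u_pred):.1%}")
```

A 3% relative error on a U of ~5 eV corresponds to roughly 0.15 eV, well within the spread typically seen between different first-principles protocols for computing Hubbard parameters.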

Key Advantages of the Machine Learning Model

The new approach replaces computationally demanding DFPT calculations while maintaining high accuracy for most practical applications. The model incorporates three crucial elements:

  1. Atomic occupation matrices within the DFT + U + V framework, which serve as descriptors of the material’s geometry, oxidation states, and local chemical environment.

  2. DFPT-based Hubbard parameters, which act as training targets.

  3. Interatomic distances, which provide additional structural information.

By utilizing equivariant neural networks, the model leverages the inherent O(3) group structure of the occupation matrices, ensuring excellent performance even with scarce training data. This guarantees both high accuracy and strong transferability, making the model suitable for diverse materials with ionic, covalent, and mixed ionic-covalent interactions.
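The symmetry argument can be illustrated with a minimal sketch (not the authors' model): under a rotation of the crystal, an occupation matrix in a real-orbital basis transforms by conjugation, n → D n Dᵀ, where D is the orthogonal orbital representation of the rotation. A scalar target such as U must be unchanged by this, so any model built from invariants of n, here traces of its powers with made-up weights, automatically respects the symmetry. The random orthogonal D below is a stand-in for a proper Wigner rotation matrix; conjugation invariance of traces holds for any orthogonal matrix, so the check still demonstrates the point.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_invariant_model(n):
    # Hypothetical invariant features: Tr(n), Tr(n^2), Tr(n^3),
    # combined with stand-in "learned" weights.
    feats = np.array([np.trace(np.linalg.matrix_power(n, k)) for k in (1, 2, 3)])
    weights = np.array([0.5, -0.1, 0.02])
    return float(feats @ weights)

n = rng.normal(size=(5, 5))
n = 0.5 * (n + n.T)                           # symmetric toy occupation matrix
D, _ = np.linalg.qr(rng.normal(size=(5, 5)))  # random orthogonal matrix (rotation stand-in)

# Rotating the environment (n -> D n D^T) leaves the predicted scalar unchanged.
assert np.isclose(toy_invariant_model(D @ n @ D.T), toy_invariant_model(n))
```

Real equivariant networks such as those used here go beyond hand-picked trace invariants, but the guarantee they provide is the same: predictions cannot depend on the arbitrary orientation of the crystal, which is one reason they generalize well from small training sets.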

Implications for Materials Science

By circumventing computationally expensive self-consistent DFT or DFPT calculations, this machine learning model accelerates the prediction of Hubbard parameters with negligible computational cost. Given its robust transferability, the model has significant implications for materials discovery and design, enabling high-throughput calculations across various technological applications, including energy storage and quantum materials.

This study represents a significant step forward in integrating machine learning and electronic structure theory, providing an efficient and accurate alternative to traditional methods. Equivariant neural networks have already demonstrated state-of-the-art accuracy and transferability in machine-learned interatomic potentials, and this work is the first to incorporate electronic-structure degrees of freedom as explicit features in solids.

Explore More

Interested in the details? Read our full paper in npj Computational Materials: https://www.nature.com/articles/s41524-024-01501-5

Additionally, check out the NCCR-MARVEL highlight, where the story behind this paper is featured: https://nccr-marvel.ch/highlights/machine-learning-hubbard

Let’s accelerate materials science together! 
