Cross Learning between Electronic Structure Theories for Unifying Molecular, Surface, and Inorganic Crystal Foundation Force Fields
Paper: [arXiv:2510.25380](https://arxiv.org/abs/2510.25380)
MACE-MH-1 is a foundation machine-learning interatomic potential (MLIP) that bridges molecular, surface, and materials chemistry through cross-domain learning. It exposes multiple task-specific heads trained on different levels of theory (see the table below).
For more details, please refer to the paper, GitHub repository, and MACE foundations.
Materials property benchmarks:

| Benchmark | Metric | MACE-MH-1 | ORB-v3 | UMA-S-1.1 |
|---|---|---|---|---|
| Phonon BZ | MAE (K) | 5 | 15 | 9 |
| Phonon ωavg | MAE (K) | 3 | 5 | 4 |
| Phonon ωmin | MAE (K) | 11 | 29 | 21 |
| Phonon ωmax | MAE (K) | 12 | 12 | 11 |
| Entropy (300 K) | MAE (J/(mol·K)) | 8 | 13 | 7 |
| Helmholtz Free Energy (300 K) | MAE (kJ/mol) | 2 | 3 | 2 |
| Heat Capacity | MAE (J/(mol·K)) | 3 | 4 | 3 |
| Bulk Modulus | MAE (GPa) | 12.49 | 7.18 | 14.33 |
| Shear Modulus | MAE (GPa) | 7.95 | 8.03 | 8.18 |
| Thermal Conductivity | RMSE (W/(m·K)) | 0.24 | 0.21 | 0.20 |
Molecular crystal benchmarks:

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| X23 Formation Energy | MAE (kJ/mol) | 15.82 | 28.76 | 27.99 |
| Ice Polymorphs (DMC) | MAE (meV) | 11.23 | 138.44 | 310.82 |
Surface and adsorption benchmarks:

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3-D3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| S24 Adsorption | MAE (eV) | 0.095 | 0.174 | 0.329 |
| OC20 Adsorption | MAE (eV) | 0.138 | 0.159 | 0.172 |
| OC20 Correlation | Pearson's r | 0.98 | 0.974 | 0.97 |
Molecular benchmarks:

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3-D3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| Wiggle150 | MAE (kcal/mol) | 4.80 | 7.65 | 6.60 |
| GMTKN55 Overall | WTMAD (kcal/mol) | 11.23 | 22.30 | 30.83 |
| PLF547 (proteins) | MAE (kcal/mol) | 0.626 | 1.829 | 2.935 |
| S30L (host-guest) | MAE (kcal/mol) | 10.13 | 13.64 | 15.14 |
Physical consistency tests:

| Test | Metric | MACE-MH-1 | ORB-v3 | UMA-S-1.1 |
|---|---|---|---|---|
| Slab Extensivity | Δ (meV) | 0.0 | -709.7 | -453.8 |
| H-Atom Additivity | max \|ΔF\| (meV/Å) | 0.0 | 61.65 | 969.2 |
| Diatomic Force Flips | Mean count | 2.09 | 2.91 | 10.73 |
| Diatomic Minima | Mean count | 1.42 | 1.62 | 4.82 |
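The last two rows probe dimer dissociation curves for unphysical artifacts (force sign changes and spurious energy minima). Below is a minimal sketch of such a scan with ASE, assuming a calculator set up as in the quickstart that follows; `count_minima` is a hypothetical helper and a simplified stand-in for the paper's full protocol:

```python
import numpy as np
from ase import Atoms

def count_minima(calc, symbol="H", dists=np.linspace(0.5, 6.0, 100)):
    """Scan a homonuclear dimer and count interior energy minima.

    A clean potential should show exactly one minimum; extra interior
    minima indicate unphysical wiggles in the dissociation curve.
    """
    energies = []
    for d in dists:
        dimer = Atoms(2 * symbol, positions=[[0, 0, 0], [0, 0, d]])
        dimer.calc = calc
        energies.append(dimer.get_potential_energy())
    e = np.array(energies)
    # An interior local minimum is lower than both of its neighbours.
    return int(np.sum((e[1:-1] < e[:-2]) & (e[1:-1] < e[2:])))
```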
Install MACE from PyPI:

```bash
pip install mace-torch
```
```python
from mace.calculators import mace_mp
from ase import Atoms

# Load the MACE-MH-1 model (using the default OMAT/PBE head).
# Replace the placeholder with the path to the downloaded model file.
calc = mace_mp(
    model="<path-to-MACE-MH-1-model>",
    default_dtype="float64",
    device="cuda",
    head="omat_pbe",
)

# Create an example structure (a toy water geometry)
atoms = Atoms("H2O", positions=[[0, 0, 0], [0, 0, 1], [0, 1, 0]])
atoms.calc = calc

# Calculate energy and forces
energy = atoms.get_potential_energy()
forces = atoms.get_forces()
print(f"Energy: {energy} eV")
print(f"Forces:\n{forces}")
```
MACE-MH-1 contains multiple task-specific heads trained on different levels of theory:
| Head Name | Level of Theory | Best For | Access |
|---|---|---|---|
| omat_pbe (default) | PBE/PBE+U | General materials; balanced performance across tasks | `head="omat_pbe"` |
| omol | ωB97M-VV10 | Molecular systems, organic and organometallic chemistry (trained on 1% of the OMol data) | `head="omol"` |
| spice_wB97M | ωB97M-D3(BJ) | Molecular systems and organic chemistry | `head="spice_wB97M"` |
| rgd1_b3lyp | B3LYP | Reaction chemistry | `head="rgd1_b3lyp"` |
| oc20_usemppbe | PBE | Surface catalysis, adsorbates | `head="oc20_usemppbe"` |
| matpes_r2scan | r²SCAN meta-GGA | High-accuracy materials | `head="matpes_r2scan"` |
By default, the OMAT head (PBE) is used, which provides the best cross-domain performance.
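To target a different level of theory, pass the corresponding head name when loading the model, exactly as in the quickstart. A minimal sketch selecting the molecular ωB97M-D3(BJ) head (the model path is a placeholder):

```python
from mace.calculators import mace_mp

# Select the spice_wB97M head for molecular / organic chemistry tasks.
calc_mol = mace_mp(
    model="<path-to-MACE-MH-1-model>",  # placeholder path
    default_dtype="float64",
    device="cuda",
    head="spice_wB97M",  # any head name from the table above
)
```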
If you use MACE-MH-1 in your research, please cite:
```bibtex
@article{batatia2025crosslearning,
  title={Cross Learning between Electronic Structure Theories for Unifying Molecular, Surface, and Inorganic Crystal Foundation Force Fields},
  author={Batatia, Ilyes and Lin, Chen and Hart, Joseph and Kasoar, Elliott and Elena, Alin M. and Norwood, Sam Walton and Wolf, Thomas and Cs{\'a}nyi, G{\'a}bor},
  journal={arXiv preprint arXiv:2510.25380},
  year={2025}
}

@article{batatia2022mace,
  title={MACE: Higher order equivariant message passing neural networks for fast and accurate force fields},
  author={Batatia, Ilyes and Kovacs, David Peter and Simm, Gregor and Ortner, Christoph and Cs{\'a}nyi, G{\'a}bor},
  journal={Advances in Neural Information Processing Systems},
  volume={35},
  pages={11423--11436},
  year={2022}
}
```
This model is released under the Academic Software License (ASL).
This work was supported by computational resources from: