mace-torch

graph-pes supports the conversion of arbitrary mace-torch models to GraphPESModel objects via the MACEWrapper class.
We also provide convenience functions to access the recently trained MACE-MP and MACE-OFF “foundation” models, as well as the GO-MACE-23 and Egret-1 series of models.

You can use all of these models in the same way as any other GraphPESModel, either via the Python API:
from graph_pes.interfaces import mace_mp
model = mace_mp("medium-0b3")
model.predict_energy(graph)
or within a graph-pes-train configuration file:
model:
  +mace_off:
    model: small
If you use any mace-torch models in your work, please visit the mace-torch repository and cite the following:
@inproceedings{Batatia2022mace,
  title = {{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
  author = {Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
  booktitle = {Advances in Neural Information Processing Systems},
  editor = {Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year = {2022},
  url = {https://openreview.net/forum?id=YPpSngE-ZU}
}

@misc{Batatia2022Design,
  title = {The Design Space of E(3)-Equivariant Atom-Centered Interatomic Potentials},
  author = {Batatia, Ilyes and Batzner, Simon and Kov{\'a}cs, D{\'a}vid P{\'e}ter and Musaelian, Albert and Simm, Gregor N. C. and Drautz, Ralf and Ortner, Christoph and Kozinsky, Boris and Cs{\'a}nyi, G{\'a}bor},
  year = {2022},
  number = {arXiv:2205.06643},
  eprint = {2205.06643},
  eprinttype = {arxiv},
  doi = {10.48550/arXiv.2205.06643},
  archiveprefix = {arXiv}
}
Installation
To install graph-pes with support for MACE models, you need to install the mace-torch package. We recommend doing this in a new environment:
conda create -n graph-pes-mace python=3.10
conda activate graph-pes-mace
pip install mace-torch graph-pes
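A quick sanity check that the installation worked is to import one of the interface functions described below; if either mace-torch or graph-pes is missing or broken, this import will fail:

# should complete without an ImportError
from graph_pes.interfaces import mace_mp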
Interface
- graph_pes.interfaces.mace_mp(model='small', precision=None)

Download a MACE-MP model and convert it for use with graph-pes. Internally, we use the foundation_models functionality from the mace-torch package.

Please cite the following if you use this model:
- MACE-MP by Ilyes Batatia, Philipp Benner, Yuan Chiang, Alin M. Elena, Dávid P. Kovács, Janosh Riebesell, et al., 2023, arXiv:2401.00096
- MACE-Universal by Yuan Chiang, 2023, Hugging Face, Revision e5ebd9b, DOI: 10.57967/hf/1202, URL: https://huggingface.co/cyrusyc/mace-universal
- Matbench Discovery by Janosh Riebesell, Rhys EA Goodall, Philipp Benner, Yuan Chiang, Alpha A Lee, Anubhav Jain, Kristin A Persson, 2023, arXiv:2308.14920
As of 10th April 2025, the following models are available:
["small", "medium", "large", "medium-mpa-0", "small-0b", "medium-0b", "small-0b2", "medium-0b2", "medium-0b3", "large-0b2", "medium-omat-0"]
- Parameters:
model (str) – The size of the MACE-MP model to download.
precision (Literal['float32', 'float64'] | None) – The precision of the model. If None, the default precision of torch will be used (you can set this when using graph-pes-train via general/torch/dtype).
- Return type: GraphPESModel
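By analogy with the +mace_off example at the top of this page, you should also be able to request a MACE-MP model directly from a graph-pes-train configuration file. A minimal sketch, assuming the +mace_mp key follows the same pattern as +mace_off, with both options mirroring the function signature above:

model:
  +mace_mp:
    model: medium-0b3
    precision: float64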
- graph_pes.interfaces.mace_off(model, precision=None)

Download a MACE-OFF model and convert it for use with graph-pes.

If you use this model, please cite the relevant paper by Kovacs et al., arXiv:2312.15211.
- Parameters:
model (Literal['small', 'medium', 'large']) – The size of the MACE-OFF model to download.
precision (Literal['float32', 'float64'] | None) – The precision of the model.
- Return type: GraphPESModel
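As a usage sketch mirroring the mace_mp example at the top of this page (the graph object is assumed to be an atomic graph you have already constructed):

from graph_pes.interfaces import mace_off

# download the medium MACE-OFF model at double precision
model = mace_off("medium", precision="float64")
model.predict_energy(graph)  # graph: a pre-built atomic graph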
- graph_pes.interfaces.go_mace_23(precision=None)

Download the GO-MACE-23 model and convert it for use with graph-pes.

Note: this model is only for use on structures containing Carbon, Hydrogen and Oxygen. Attempting to use it on structures with other elements will raise an error.

If you use this model, please cite the following:

@article{El-Machachi-24,
  title = {Accelerated {{First-Principles Exploration}} of {{Structure}} and {{Reactivity}} in {{Graphene Oxide}}},
  author = {{El-Machachi}, Zakariya and Frantzov, Damyan and Nijamudheen, A. and Zarrouk, Tigany and Caro, Miguel A. and Deringer, Volker L.},
  year = {2024},
  journal = {Angewandte Chemie International Edition},
  volume = {63},
  number = {52},
  pages = {e202410088},
  doi = {10.1002/anie.202410088},
}
- Return type: GraphPESModel
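As a hypothetical smoke test within the model's C/H/O domain, you could pair go_mace_23 with the GraphPESCalculator used in the MACEWrapper example below; this sketch assumes GraphPESCalculator follows the standard ASE calculator protocol, as its calculate(...) call there suggests:

from ase.build import molecule
from graph_pes.interfaces import go_mace_23
from graph_pes.utils.calculator import GraphPESCalculator

model = go_mace_23()

# water contains only hydrogen and oxygen, so it lies safely
# within the model's C/H/O domain
atoms = molecule("H2O")
atoms.calc = GraphPESCalculator(model)
print(atoms.get_potential_energy())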
- graph_pes.interfaces.egret(model='egret-1')

Download an Egret model and convert it for use with graph-pes.

Use the egret-1 model via the Python API:

from graph_pes.interfaces._mace import egret
model = egret("egret-1")

or fine-tune it on your own data using the graph-pes-train command:

model:
  +egret: {model: "egret-1e"}
data: ... # etc.

If you use this model, please cite the following:

@misc{Mann-25-04,
  title = {Egret-1: {{Pretrained Neural Network Potentials For Efficient}} and {{Accurate Bioorganic Simulation}}},
  author = {Mann, Elias L. and Wagen, Corin C. and Vandezande, Jonathon E. and Wagen, Arien M. and Schneider, Spencer C.},
  year = {2025},
  number = {arXiv:2504.20955},
  doi = {10.48550/arXiv.2504.20955},
}

As of 1st May 2025, the following models are available:
["egret-1", "egret-1t", "egret-1e"]

- Parameters:
model (str) – The model to download.
- Return type: GraphPESModel
- class graph_pes.interfaces._mace.MACEWrapper(model)

Converts any MACE model from the mace-torch package into a GraphPESModel. You can use this to drive MD using LAMMPS, fine-tune MACE models, or use any other functionality that graph-pes provides.

- Parameters:
model (torch.nn.Module) – The MACE model to wrap.

Examples

>>> mace_torch_model = ...  # create your MACE model any-which way
>>> from graph_pes.interfaces._mace import MACEWrapper
>>> graph_pes_model = MACEWrapper(mace_torch_model)  # convert to graph-pes
>>> graph_pes_model.predict_energy(graph)
torch.Tensor([123.456])
>>> from graph_pes.utils.calculator import GraphPESCalculator
>>> calculator = GraphPESCalculator(graph_pes_model)
>>> calculator.calculate(ase_atoms)