mace-torch

graph-pes supports the conversion of arbitrary mace-torch models to GraphPESModel objects via the MACEWrapper class.

We also provide convenience functions for downloading the MACE-MP and MACE-OFF “foundation” models, as well as the GO-MACE-23 model.

If you use any mace-torch models in your work, please visit the mace-torch repository and cite the following:

@inproceedings{Batatia2022mace,
    title = {{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
    author = {Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
    booktitle = {Advances in Neural Information Processing Systems},
    editor = {Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
    year = {2022},
    url = {https://openreview.net/forum?id=YPpSngE-ZU}
}

@misc{Batatia2022Design,
    title = {The Design Space of E(3)-Equivariant Atom-Centered Interatomic Potentials},
    author = {Batatia, Ilyes and Batzner, Simon and Kov{\'a}cs, D{\'a}vid P{\'e}ter and Musaelian, Albert and Simm, Gregor N. C. and Drautz, Ralf and Ortner, Christoph and Kozinsky, Boris and Cs{\'a}nyi, G{\'a}bor},
    year = {2022},
    number = {arXiv:2205.06643},
    eprint = {2205.06643},
    eprinttype = {arxiv},
    doi = {10.48550/arXiv.2205.06643},
    archiveprefix = {arXiv}
}

Installation

To install graph-pes with support for MACE models, you need to install the mace-torch package. We recommend doing this in a new environment:

conda create -n graph-pes-mace python=3.10
conda activate graph-pes-mace
pip install mace-torch graph-pes
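
You can then check that both packages import correctly (a quick sanity check; note that mace-torch installs under the import name mace):

python -c "import mace; import graph_pes"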

Interface

graph_pes.interfaces.mace_mp(model, precision=None)[source]

Download a MACE-MP model and convert it for use with graph-pes.

Internally, we use the foundation_models functionality from the mace-torch package.

Please cite the following if you use this model:

  • MACE-MP by Ilyes Batatia, Philipp Benner, Yuan Chiang, Alin M. Elena, Dávid P. Kovács, Janosh Riebesell, et al., 2023, arXiv:2401.00096

  • MACE-Universal by Yuan Chiang, 2023, Hugging Face, Revision e5ebd9b, DOI: 10.57967/hf/1202, URL: https://huggingface.co/cyrusyc/mace-universal

  • Matbench Discovery by Janosh Riebesell, Rhys EA Goodall, Philipp Benner, Yuan Chiang, Alpha A Lee, Anubhav Jain, Kristin A Persson, 2023, arXiv:2308.14920

Parameters:
  • model (Literal['small', 'medium', 'large']) – The size of the MACE-MP model to download.

  • precision (Literal['float32', 'float64'] | None) – The precision of the model. If None, torch's default precision is used (when using graph-pes-train, you can set this via the general/torch/dtype option).

Return type:

MACEWrapper
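
For example, to load the medium MACE-MP model at double precision (a minimal sketch; the checkpoint is downloaded on first use, so an internet connection is required):

>>> from graph_pes.interfaces import mace_mp
>>> model = mace_mp("medium", precision="float64")  # downloads and converts the checkpoint
>>> # model is a MACEWrapper, usable anywhere a GraphPESModel is expected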

graph_pes.interfaces.mace_off(model, precision=None)[source]

Download a MACE-OFF model and convert it for use with graph-pes.

If you use this model, please cite the relevant paper by Kovács et al., arXiv:2312.15211.

Parameters:
  • model (Literal['small', 'medium', 'large']) – The size of the MACE-OFF model to download.

  • precision (Literal['float32', 'float64'] | None) – The precision of the model.

Return type:

MACEWrapper
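
For example (a minimal sketch, analogous to mace_mp above):

>>> from graph_pes.interfaces import mace_off
>>> model = mace_off("small")  # downloads the small MACE-OFF checkpoint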

graph_pes.interfaces.go_mace_23(precision=None)[source]

Download the GO-MACE-23 model and convert it for use with graph-pes.

Note

This model is only intended for use on structures containing carbon, hydrogen, and oxygen. Attempting to use it on structures containing other elements will raise an error.

If you use this model, please cite the following:

@article{El-Machachi-24,
    title = {Accelerated {{First-Principles Exploration}} of {{Structure}} and {{Reactivity}} in {{Graphene Oxide}}},
    author = {{El-Machachi}, Zakariya and Frantzov, Damyan and Nijamudheen, A. and Zarrouk, Tigany and Caro, Miguel A. and Deringer, Volker L.},
    year = {2024},
    journal = {Angewandte Chemie International Edition},
    volume = {63},
    number = {52},
    pages = {e202410088},
    doi = {10.1002/anie.202410088},
}

Return type:

MACEWrapper
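
For example (a minimal sketch; remember that GO-MACE-23 only supports structures containing C, H, and O):

>>> from graph_pes.interfaces import go_mace_23
>>> model = go_mace_23(precision="float32")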

class graph_pes.interfaces._mace.MACEWrapper(model)[source]

Converts any MACE model from the mace-torch package into a GraphPESModel.

You can use this to drive MD using LAMMPS, fine-tune MACE models, or access any other functionality that graph-pes provides.

Parameters:

model (torch.nn.Module) – The MACE model to wrap.

Examples

>>> mace_torch_model = ...  # create your MACE model however you like
>>> from graph_pes.interfaces._mace import MACEWrapper
>>> graph_pes_model = MACEWrapper(mace_torch_model)  # convert to graph-pes
>>> graph_pes_model.predict_energy(graph)
tensor([123.456])
>>> from graph_pes.utils.calculator import GraphPESCalculator
>>> calculator = GraphPESCalculator(graph_pes_model)
>>> calculator.calculate(ase_atoms)
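
Since GraphPESCalculator exposes the standard ASE calculate interface (as the last line above suggests), you can also attach it to an Atoms object and use the usual ASE methods. A sketch, assuming ase is installed; molecule("H2O") is just an illustrative structure:

>>> from ase.build import molecule
>>> ase_atoms = molecule("H2O")     # any ASE Atoms object
>>> ase_atoms.calc = calculator     # attach the wrapped model as the calculator
>>> ase_atoms.get_potential_energy()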