orb-models¶
graph-pes supports the conversion of arbitrary orb-models models to GraphPESModel objects via the OrbWrapper class. Use the orb_model() function to load a pre-trained orb-models model and convert it into a GraphPESModel. You can then use this model in the same way as any other GraphPESModel, for instance by fine-tuning it or using it to run MD via torch-sim, ASE or LAMMPS:
from graph_pes.interfaces import orb_model
from graph_pes import GraphPESModel
model = orb_model()
assert isinstance(model, GraphPESModel)
# do stuff ...
You can also reference the orb_model() function in your training configs for graph-pes-train:
model:
  +orb_model:
    name: orb-v3-direct-20-omat
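A config like the one above can be extended into a full fine-tuning run. The sketch below is illustrative only: the model section matches the documented syntax, but the data keys are hypothetical placeholders — consult the graph-pes-train documentation for the actual config schema:

```yaml
model:
  +orb_model:
    name: orb-v3-direct-20-omat

# hypothetical data section — the keys below are placeholders,
# not the real graph-pes-train schema
data:
  train: my-training-structures.xyz
  valid: my-validation-structures.xyz
```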
If you use any orb-models models in your work, please visit the orb-models repository and cite the following:
@misc{rhodes2025orbv3atomisticsimulationscale,
title={Orb-v3: atomistic simulation at scale},
author={
Benjamin Rhodes and Sander Vandenhaute and Vaidotas Šimkus
and James Gin and Jonathan Godwin and Tim Duignan and Mark Neumann
},
year={2025},
eprint={2504.06231},
archivePrefix={arXiv},
primaryClass={cond-mat.mtrl-sci},
url={https://arxiv.org/abs/2504.06231},
}
@misc{neumann2024orbfastscalableneural,
title={Orb: A Fast, Scalable Neural Network Potential},
author={
Mark Neumann and James Gin and Benjamin Rhodes
and Steven Bennett and Zhiyi Li and Hitarth Choubisa
and Arthur Hussey and Jonathan Godwin
},
year={2024},
eprint={2410.22570},
archivePrefix={arXiv},
primaryClass={cond-mat.mtrl-sci},
url={https://arxiv.org/abs/2410.22570},
}
Installation¶
To install graph-pes with support for orb-models models, you need to install the orb-models package alongside graph-pes. We recommend doing this in a new environment:
conda create -n graph-pes-orb python=3.10
conda activate graph-pes-orb
pip install graph-pes orb-models
Interface¶
- graph_pes.interfaces.orb_model(name='orb-v3-direct-20-omat')[source]¶

  Load a pre-trained Orb model and convert it into a GraphPESModel.

  See the orb-models repository for more information on the available models. As of 2025-04-11, the following are available:
"orb-v3-conservative-20-omat"
"orb-v3-conservative-inf-omat"
"orb-v3-direct-20-omat"
"orb-v3-direct-inf-omat"
"orb-v3-conservative-20-mpa"
"orb-v3-conservative-inf-mpa"
"orb-v3-direct-20-mpa"
"orb-v3-direct-inf-mpa"
"orb-v2"
"orb-d3-v2"
"orb-d3-sm-v2"
"orb-d3-xs-v2"
"orb-mptraj-only-v2"
- Parameters:
  name (str) – The name of the model to load.
- Return type:
  GraphPESModel
- class graph_pes.interfaces._orb.OrbWrapper(orb)[source]¶

  A wrapper around an orb-models model that converts it into a GraphPESModel.

  - Parameters:
    orb (torch.nn.Module) – The orb-models model to wrap.
- property orb_model: DirectForcefieldRegressor | ConservativeForcefieldRegressor¶

  Access the underlying orb-models model.

  One use case of this is to use graph-pes's fine-tuning functionality to adapt an existing orb-models model to a new dataset. You can then re-extract the underlying orb-models model via this property and use it in other orb-models workflows.
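The wrap-then-re-extract pattern behind this property can be illustrated with a plain-Python sketch. The classes and method names below are purely illustrative stand-ins, not the actual OrbWrapper or orb-models implementation:

```python
# Illustrative sketch of the wrapper pattern (toy classes, not the
# real OrbWrapper): the wrapper adapts the wrapped model's interface,
# while a read-only property keeps the original model reachable so it
# can be re-extracted after, e.g., fine-tuning.

class ToyOrbModel:
    """Stands in for an orb-models model in this sketch."""

    def predict(self, x):
        return 2 * x


class ToyWrapper:
    """Adapts a ToyOrbModel to a different calling convention."""

    def __init__(self, orb):
        self._orb = orb

    def forward(self, x):
        # delegate to the wrapped model
        return self._orb.predict(x)

    @property
    def orb_model(self):
        # re-extract the underlying model, e.g. after fine-tuning
        return self._orb


wrapper = ToyWrapper(ToyOrbModel())
wrapper.forward(3)          # delegates to ToyOrbModel.predict
original = wrapper.orb_model  # the wrapped model, unchanged
```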