Optimizers¶
- class graph_pes.training.opt.Optimizer(name, **kwargs)[source]¶
A factory class for delayed instantiation of torch.optim.Optimizer objects.
The generated optimizer splits the parameters of the model into two groups:
- “non-decayable” parameters: all parameters returned by the non_decayable_parameters() method of the model.
- “normal” parameters: the remaining model parameters.
Unsurprisingly, any specified weight decay is applied only to the “normal” model parameters.
As an example, the per-element energy offset parameters of LearnableOffset models represent the arbitrary zero points of energies for different elements: it doesn’t make sense to push these towards zero during training.
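To make the grouping concrete, here is a minimal sketch of the two-group pattern in plain PyTorch (not the library’s internal code). It assumes model exposes the non_decayable_parameters() method described above; the choice of AdamW and the weight_decay value are illustrative only:
>>> import torch
>>> # collect the "non-decayable" parameters and track them by identity
>>> non_decayable = list(model.non_decayable_parameters())
>>> non_decayable_ids = {id(p) for p in non_decayable}
>>> normal = [p for p in model.parameters() if id(p) not in non_decayable_ids]
>>> # weight decay is applied only to the "normal" group
>>> optimizer = torch.optim.AdamW(
...     [
...         {"params": non_decayable, "weight_decay": 0.0},
...         {"params": normal, "weight_decay": 0.01},
...     ],
...     lr=1e-3,
... )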
Note
We use delayed instantiation of optimizers when configuring our training runs to allow for arbitrary changes to the model and its parameters during the pre_fit_all_components method.
- Parameters:
name (str | type[torch.optim.Optimizer]) – The name of the torch.optim.Optimizer class to use, e.g. "Adam" or "SGD". Alternatively, provide the type of any subclass of torch.optim.Optimizer.
**kwargs – Additional keyword arguments to pass to the specified optimizer’s constructor.
Examples
Pass a named optimiser:
>>> from graph_pes.training.opt import Optimizer
>>> optimizer_factory = Optimizer("AdamW", lr=1e-3)
>>> optimizer_instance = optimizer_factory(model)
Or pass the optimiser class directly:
>>> from torch.optim import SGD
>>> optimizer_factory = Optimizer(SGD, lr=1e-3)
>>> optimizer_instance = optimizer_factory(model)
Pseudo-code excerpt from graph-pes-train logic:
>>> from graph_pes.training.opt import Optimizer
>>> from graph_pes.models import LennardJones
>>> ...
>>> optimizer_factory = Optimizer("AdamW", lr=1e-3)
>>> model = LennardJones()
>>> model.pre_fit(train_loader)
>>> optimizer_instance = optimizer_factory(model)
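Because all keyword arguments are forwarded to the chosen optimizer’s constructor, weight decay can be requested through the factory in the same way; as noted above, it is applied only to the “normal” parameter group. The weight_decay value below is illustrative:
>>> from graph_pes.training.opt import Optimizer
>>> # lr and weight_decay are forwarded to torch.optim.AdamW;
>>> # decay only affects the "normal" parameter group
>>> optimizer_factory = Optimizer("AdamW", lr=1e-3, weight_decay=0.01)
>>> optimizer_instance = optimizer_factory(model)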
Schedulers¶
- class graph_pes.training.opt.LRScheduler(name, **kwargs)[source]¶
A factory class for delayed instantiation of torch.optim.lr_scheduler.LRScheduler objects.
- Parameters:
name (str | type[torch.optim.lr_scheduler.LRScheduler]) – The name of the torch.optim.lr_scheduler.LRScheduler class to use, e.g. "ReduceLROnPlateau". Alternatively, provide any subclass of torch.optim.lr_scheduler.LRScheduler.
**kwargs – Additional keyword arguments to pass to the specified scheduler’s constructor.
Examples
>>> from graph_pes.training.opt import LRScheduler
>>> ...
>>> scheduler_factory = LRScheduler(
...     "LambdaLR", lr_lambda=lambda epoch: 0.95 ** epoch
... )
>>> scheduler_instance = scheduler_factory(optimizer)
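The "ReduceLROnPlateau" name mentioned above works in the same way; the factor and patience keyword arguments below are illustrative and are simply forwarded to the scheduler’s constructor:
>>> from graph_pes.training.opt import LRScheduler
>>> scheduler_factory = LRScheduler(
...     "ReduceLROnPlateau", factor=0.5, patience=10
... )
>>> scheduler_instance = scheduler_factory(optimizer)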