topobench.optimizer package#

Init file for optimizer module.

class TBOptimizer(optimizer_id, parameters, scheduler=None)#

Bases: AbstractOptimizer

Optimizer class that manages both the optimizer and the scheduler, fully compatible with torch.optim classes.

Parameters:
optimizer_id : str

Name of the torch optimizer class to be used.

parameters : dict

Parameters to be passed to the optimizer.

scheduler : dict, optional

Scheduler id and parameters to be used. Default is None.

__init__(optimizer_id, parameters, scheduler=None)#
configure_optimizer(model_parameters)#

Configure the optimizer and scheduler.

Acts as a wrapper that provides the LightningTrainer module with the required config dict when it calls TBModel's configure_optimizers() method.

Parameters:
model_parameters : dict

The model parameters.

Returns:
dict

The optimizer and scheduler configuration.
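To illustrate the wrapper pattern described above, here is a minimal, self-contained sketch of the documented interface. It runs without torch: the DummySGD and DummyStepLR classes, the two registries, and the scheduler dict keys (scheduler_id, scheduler_params) are stand-ins invented for illustration and are not part of topobench; the real class resolves names against torch.optim instead.

```python
# Stand-in classes mimicking torch.optim.SGD and
# torch.optim.lr_scheduler.StepLR (hypothetical, for illustration only).
class DummySGD:
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr

class DummyStepLR:
    def __init__(self, optimizer, step_size=10, gamma=0.1):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma

# Name-to-class registries standing in for torch.optim lookups.
OPTIMIZERS = {"SGD": DummySGD}
SCHEDULERS = {"StepLR": DummyStepLR}

class SketchTBOptimizer:
    """Sketch mirroring TBOptimizer(optimizer_id, parameters, scheduler=None)."""

    def __init__(self, optimizer_id, parameters, scheduler=None):
        self.optimizer_id = optimizer_id
        self.parameters = parameters
        self.scheduler = scheduler

    def configure_optimizer(self, model_parameters):
        # Instantiate the optimizer from its string id and parameters.
        optimizer = OPTIMIZERS[self.optimizer_id](
            model_parameters, **self.parameters
        )
        # Assemble the Lightning-style configuration dict.
        config = {"optimizer": optimizer}
        if self.scheduler is not None:
            sched_cls = SCHEDULERS[self.scheduler["scheduler_id"]]
            config["lr_scheduler"] = sched_cls(
                optimizer, **self.scheduler["scheduler_params"]
            )
        return config

opt = SketchTBOptimizer(
    "SGD",
    {"lr": 0.1},
    scheduler={"scheduler_id": "StepLR",
               "scheduler_params": {"step_size": 5}},
)
config = opt.configure_optimizer([0.0, 1.0])
print(sorted(config))  # ['lr_scheduler', 'optimizer']
```

When the scheduler argument is None, the returned dict contains only the "optimizer" key, which matches the accepted return forms of Lightning's configure_optimizers().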

Submodules#