topobench.optimizer.optimizer module#

Optimizer class responsible for managing both the optimizer and the scheduler.

class AbstractOptimizer#

Bases: ABC

Abstract base class for optimizer manager classes.

abstract configure_optimizer(model_parameters)#

Configure the optimizer and scheduler.

Act as a wrapper.

Parameters:
model_parameters : dict

The model parameters.
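A concrete manager must implement configure_optimizer. The following is a minimal sketch of that contract; the ABC below is a stand-in mirroring the documented interface, and ConstantLROptimizer is a hypothetical subclass, not part of topobench:

```python
from abc import ABC, abstractmethod


class AbstractOptimizerSketch(ABC):
    """Stand-in mirroring the documented AbstractOptimizer interface."""

    @abstractmethod
    def configure_optimizer(self, model_parameters):
        """Configure the optimizer and scheduler."""


class ConstantLROptimizer(AbstractOptimizerSketch):
    """Hypothetical subclass: a fixed learning rate, no scheduler."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def configure_optimizer(self, model_parameters):
        # A real implementation would build a torch.optim optimizer here;
        # this sketch only returns the configuration dictionary.
        return {"optimizer": {"params": list(model_parameters), "lr": self.lr}}


manager = ConstantLROptimizer(lr=0.1)
config = manager.configure_optimizer([0.0, 1.0])
```

Because configure_optimizer is declared abstract, instantiating the base class directly raises a TypeError; only subclasses that implement it can be constructed.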

class Any(*args, **kwargs)#

Bases: object

Special type indicating an unconstrained type.

  • Any is compatible with every type.

  • Any is assumed to have all methods.

  • All values are assumed to be instances of Any.

Note that all the above statements are true from the point of view of static type checkers. At runtime, Any should not be used with instance checks.
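In practice this means a value annotated as Any passes static type checks for any assignment or operation, while using Any in an instance check is rejected at runtime with a TypeError. A short illustration using only the standard typing module:

```python
from typing import Any

value: Any = 1          # Any is compatible with every type...
value = "a string"      # ...so re-assigning a different type is accepted
length = len(value)     # Any is assumed to support any operation

# At runtime, Any must not be used with instance checks:
try:
    isinstance(1, Any)
    runtime_check_rejected = False
except TypeError:
    runtime_check_rejected = True
```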

class TBOptimizer(optimizer_id, parameters, scheduler=None)#

Bases: AbstractOptimizer

Optimizer class that manages both the optimizer and the scheduler, fully compatible with torch.optim classes.

Parameters:
optimizer_id : str

Name of the torch optimizer class to be used.

parameters : dict

Parameters to be passed to the optimizer.

scheduler : dict, optional

Scheduler id and parameters to be used. Default is None.
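Since the class is documented as fully compatible with torch.optim, optimizer_id presumably names a torch.optim class (e.g. "SGD" or "Adam") that is resolved and instantiated lazily. The self-contained sketch below mimics that likely name-to-class resolution; StubSGD and the optim namespace are stand-ins so the example runs without torch installed, and the exact resolution logic inside the real TBOptimizer is an assumption:

```python
import types


class StubSGD:
    """Stand-in for torch.optim.SGD so the sketch runs without torch."""

    def __init__(self, params, lr=0.01):
        self.param_groups = [{"params": list(params), "lr": lr}]


# Stand-in for the torch.optim namespace; the real class would resolve
# optimizer_id against torch.optim instead.
optim = types.SimpleNamespace(SGD=StubSGD)


class TBOptimizerSketch:
    """Sketch of TBOptimizer: store the config now, build the optimizer later."""

    def __init__(self, optimizer_id, parameters, scheduler=None):
        self.optimizer_cls = getattr(optim, optimizer_id)  # name -> class
        self.parameters = parameters
        self.scheduler = scheduler

    def configure_optimizer(self, model_parameters):
        optimizer = self.optimizer_cls(model_parameters, **self.parameters)
        config = {"optimizer": optimizer}
        if self.scheduler is not None:
            config["lr_scheduler"] = self.scheduler
        return config


manager = TBOptimizerSketch("SGD", {"lr": 0.1})
```

Deferring construction this way is necessary because the model's parameters are not available when the manager is configured; they are supplied later, when configure_optimizer is called.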

__init__(optimizer_id, parameters, scheduler=None)#
configure_optimizer(model_parameters)#

Configure the optimizer and scheduler.

Act as a wrapper providing the Lightning Trainer with the required config dict when it calls TBModel’s configure_optimizers() method.

Parameters:
model_parameters : dict

The model parameters.

Returns:
dict

The optimizer and scheduler configuration.
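The returned dict is what Lightning's configure_optimizers() hook expects. A hedged sketch of how a model presumably delegates that hook to the manager; DummyManager and TinyModel are illustrative stand-ins, not the actual TBModel API:

```python
class DummyManager:
    """Illustrative manager returning a Lightning-style config dict."""

    def configure_optimizer(self, model_parameters):
        return {"optimizer": {"params": list(model_parameters), "lr": 0.01}}


class TinyModel:
    """Hypothetical model delegating its Lightning hook to the manager."""

    def __init__(self, optimizer_manager):
        self.optimizer_manager = optimizer_manager
        self._weights = [0.0, 0.0, 0.0]

    def parameters(self):
        return iter(self._weights)

    def configure_optimizers(self):
        # Lightning calls this hook; the manager supplies the config dict.
        return self.optimizer_manager.configure_optimizer(self.parameters())


model = TinyModel(DummyManager())
config = model.configure_optimizers()
```

This split keeps optimizer and scheduler choices in configuration (id plus parameter dicts) while the model only forwards its parameters, which is the apparent motivation for the wrapper design described above.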