Param Schedulers

class classy_vision.optim.param_scheduler.ClassyParamScheduler(update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval = <UpdateInterval.EPOCH: 'epoch'>)

Base class for Classy parameter schedulers.

update_interval

Specifies how often to update each parameter (before each epoch or each batch)

WHERE_EPSILON = 1e-06
__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed
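For illustration, the mapping from epoch progress to `where` can be sketched as follows (`compute_where` is a hypothetical helper, not part of the library):

```python
def compute_where(current_epoch: float, num_epochs: int) -> float:
    """Map training progress to a float in [0, 1).

    Hypothetical sketch of how a training loop might compute the
    `where` value passed to a scheduler's __call__.
    """
    return current_epoch / num_epochs
```

A 120-epoch run and a 90-epoch run then reach the same `where` at the same relative point in training, so the same scheduler config works for both.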

__init__(update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval = <UpdateInterval.EPOCH: 'epoch'>)

Constructor for ClassyParamScheduler

Parameters

update_interval – Specifies the frequency of the param updates

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.classy_vision_param_scheduler.ClassyParamScheduler

Instantiates a ClassyParamScheduler from a configuration.

Parameters

config – A configuration for the ClassyParamScheduler.

Returns

A ClassyParamScheduler instance.

class classy_vision.optim.param_scheduler.CompositeParamScheduler(schedulers: Sequence[classy_vision.optim.param_scheduler.classy_vision_param_scheduler.ClassyParamScheduler], lengths: Sequence[float], update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval, interval_scaling: Sequence[classy_vision.optim.param_scheduler.composite_scheduler.IntervalScaling])

Composite parameter scheduler composed of intermediate schedulers. Takes a list of schedulers and a list of lengths corresponding to the fraction of training each scheduler should run for. Schedulers are run in order. All values in lengths must sum to 1.0.

Each scheduler also has a corresponding interval scale. If the interval scale is 'fixed', the intermediate scheduler is run without any rescaling of time. If the interval scale is 'rescaled', the intermediate scheduler is run such that it starts and ends at the same values as it would if it were the only scheduler. The default is 'rescaled' for all schedulers.

Example

update_interval = "step"
schedulers = [
  {"name": "constant", "value": 0.42},
  {"name": "cosine_decay", "start_lr": 0.42, "end_lr": 0.0001}
]
interval_scaling = ['rescaled', 'rescaled']
lengths = [0.3, 0.7]

The parameter value will be 0.42 for the first [0%, 30%) of steps, and then will cosine decay from 0.42 to 0.0001 for [30%, 100%) of training.
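The 'rescaled' behavior in this example can be sketched in plain Python (`constant`, `cosine_decay`, and `composite` are hypothetical stand-ins; the real logic lives inside CompositeParamScheduler):

```python
import math

def constant(where):
    # Stands in for {"name": "constant", "value": 0.42}
    return 0.42

def cosine_decay(where):
    # Stands in for the cosine decay from 0.42 to 0.0001, assuming
    # the standard cosine annealing formula.
    start_lr, end_lr = 0.42, 0.0001
    return end_lr + 0.5 * (start_lr - end_lr) * (1 + math.cos(math.pi * where))

def composite(where, schedulers=(constant, cosine_decay), lengths=(0.3, 0.7)):
    # Find the active sub-scheduler; with 'rescaled' scaling, remap
    # `where` so the sub-scheduler sees the full [0, 1) range.
    start = 0.0
    for scheduler, length in zip(schedulers, lengths):
        if where < start + length:
            return scheduler((where - start) / length)
        start += length
    return schedulers[-1](1.0)
```

At where = 0.3 the cosine scheduler takes over at its own rescaled where = 0.0, so it starts from 0.42 exactly as if it were the only scheduler.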

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(schedulers: Sequence[classy_vision.optim.param_scheduler.classy_vision_param_scheduler.ClassyParamScheduler], lengths: Sequence[float], update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval, interval_scaling: Sequence[classy_vision.optim.param_scheduler.composite_scheduler.IntervalScaling])

Constructor for CompositeParamScheduler

Parameters

schedulers – A sequence of schedulers to run, in order

lengths – The fraction of training each scheduler covers; must sum to 1.0

update_interval – Specifies the frequency of the param updates

interval_scaling – How each scheduler's time interval is scaled ('fixed' or 'rescaled')

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.composite_scheduler.CompositeParamScheduler

Instantiates a CompositeParamScheduler from a configuration.

Parameters

config – A configuration for a CompositeParamScheduler. See __init__() for parameters expected in the config.

Returns

A CompositeParamScheduler instance.

class classy_vision.optim.param_scheduler.ConstantParamScheduler(value: float)

Returns a constant value for an optimizer param.

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(value: float)

Constructor for ConstantParamScheduler

Parameters

value – The constant value to return

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.constant_scheduler.ConstantParamScheduler

Instantiates a ConstantParamScheduler from a configuration.

Parameters

config – A configuration for a ConstantParamScheduler. See __init__() for parameters expected in the config.

Returns

A ConstantParamScheduler instance.

class classy_vision.optim.param_scheduler.CosineParamScheduler(start_lr: float, end_lr: float)

Changes the param value after every epoch based on a cosine schedule. Can be used for either cosine decay or cosine warmup schedules based on start and end values.

Example

start_lr: 0.1
end_lr: 0.0001
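Assuming the standard cosine annealing formula (a sketch; the library's exact implementation may differ), the schedule for this example is:

```python
import math

def cosine_param(where: float, start_lr: float = 0.1, end_lr: float = 0.0001) -> float:
    # Standard cosine annealing from start_lr to end_lr over training.
    # Swapping start_lr < end_lr yields a cosine warmup instead.
    return end_lr + 0.5 * (start_lr - end_lr) * (1 + math.cos(math.pi * where))
```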
__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(start_lr: float, end_lr: float)

Constructor for CosineParamScheduler

Parameters

start_lr – The param value at the start of training

end_lr – The param value at the end of training

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.cosine_scheduler.CosineParamScheduler

Instantiates a CosineParamScheduler from a configuration.

Parameters

config – A configuration for a CosineParamScheduler. See __init__() for parameters expected in the config.

Returns

A CosineParamScheduler instance.

class classy_vision.optim.param_scheduler.LinearParamScheduler(start_lr: float, end_lr: float)

Linearly interpolates parameter between start_lr and end_lr. Can be used for either warmup or decay based on start and end values.

Example

start_lr: 0.0001
end_lr: 0.01

Corresponds to a linearly increasing schedule with values in [0.0001, 0.01)
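The interpolation is straightforward; a sketch with a hypothetical helper:

```python
def linear_param(where: float, start_lr: float = 0.0001, end_lr: float = 0.01) -> float:
    # Linear interpolation between start_lr and end_lr as training
    # progresses; start_lr > end_lr gives a decay schedule instead.
    return start_lr + where * (end_lr - start_lr)
```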

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(start_lr: float, end_lr: float)

Constructor for LinearParamScheduler

Parameters

start_lr – The param value at the start of training

end_lr – The param value at the end of training

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.linear_scheduler.LinearParamScheduler

Instantiates a LinearParamScheduler from a configuration.

Parameters

config – A configuration for a LinearParamScheduler. See __init__() for parameters expected in the config.

Returns

A LinearParamScheduler instance.

class classy_vision.optim.param_scheduler.MultiStepParamScheduler(values, num_epochs: int, update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval, milestones: Optional[List[int]] = None)

Takes a predefined schedule for a param value, and a list of epochs, each marking the (excluded) upper boundary of a range.

Example

values: [0.1, 0.01, 0.001, 0.0001]
milestones = [30, 60, 80]
num_epochs = 120

The param value will then be 0.1 for epochs 0-29, 0.01 for epochs 30-59, 0.001 for epochs 60-79, and 0.0001 for epochs 80-119. Note that the length of values must equal the length of milestones plus one.
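The lookup this example describes can be sketched as follows (a hypothetical helper; the real class also validates the config):

```python
import bisect

def multi_step_param(where, values=(0.1, 0.01, 0.001, 0.0001),
                     milestones=(30, 60, 80), num_epochs=120):
    # Convert training progress back to an epoch, then find which
    # milestone range it falls in; milestones are excluded upper bounds.
    epoch = where * num_epochs
    return values[bisect.bisect_right(milestones, epoch)]
```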

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(values, num_epochs: int, update_interval: classy_vision.optim.param_scheduler.classy_vision_param_scheduler.UpdateInterval, milestones: Optional[List[int]] = None)

Constructor for MultiStepParamScheduler

Parameters

values – The param values to use for each range

num_epochs – The total number of epochs

update_interval – Specifies the frequency of the param updates

milestones – The epochs marking the (excluded) upper boundary of each range

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.multi_step_scheduler.MultiStepParamScheduler

Instantiates a MultiStepParamScheduler from a configuration.

Parameters

config – A configuration for a MultiStepParamScheduler. See __init__() for parameters expected in the config.

Returns

A MultiStepParamScheduler instance.

class classy_vision.optim.param_scheduler.PolynomialDecayParamScheduler(base_lr, power)

Decays the param value after every epoch according to a polynomial function with a fixed power.

Example

base_lr: 0.1
power: 0.9

Then the param value will be 0.1 for epoch 0, 0.099 for epoch 1, and so on.
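Assuming the common base_lr * (1 - where) ** power form (a sketch, but consistent with the 0.1 → 0.099 example above):

```python
def polynomial_decay_param(where: float, base_lr: float = 0.1, power: float = 0.9) -> float:
    # Polynomial decay: starts at base_lr and decays toward 0 as
    # `where` approaches 1.
    return base_lr * (1 - where) ** power
```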

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(base_lr, power)

Constructor for PolynomialDecayParamScheduler

Parameters

base_lr – The initial param value

power – The fixed power of the polynomial

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.polynomial_decay_scheduler.PolynomialDecayParamScheduler

Instantiates a PolynomialDecayParamScheduler from a configuration.

Parameters

config – A configuration for a PolynomialDecayParamScheduler. See __init__() for parameters expected in the config.

Returns

A PolynomialDecayParamScheduler instance.

class classy_vision.optim.param_scheduler.StepParamScheduler(num_epochs: Union[int, float], values: List[float])

Takes a fixed schedule for a param value. If the length of the fixed schedule is less than the number of epochs, then the epochs are divided evenly among the param schedule.

Example

values: [0.1, 0.01, 0.001, 0.0001]
num_epochs = 120

Then the param value will be 0.1 for epochs 0-29, 0.01 for epochs 30-59, 0.001 for epochs 60-89, and 0.0001 for epochs 90-119.
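A sketch of this even division (hypothetical helper):

```python
def step_param(where: float, values=(0.1, 0.01, 0.001, 0.0001)) -> float:
    # Divide training evenly among the listed values; each value
    # covers a 1/len(values) slice of training.
    index = min(int(where * len(values)), len(values) - 1)
    return values[index]
```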

__call__(where: float)

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(num_epochs: Union[int, float], values: List[float])

Constructor for StepParamScheduler

Parameters

num_epochs – The total number of epochs

values – The param values to divide the epochs among

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.step_scheduler.StepParamScheduler

Instantiates a StepParamScheduler from a configuration.

Parameters

config – A configuration for a StepParamScheduler. See __init__() for parameters expected in the config.

Returns

A StepParamScheduler instance.

class classy_vision.optim.param_scheduler.UpdateInterval

Enum for specifying update frequency for scheduler.

EPOCH

Update param before each epoch

Type

str

STEP

Update param before each optimizer step

Type

str

EPOCH = 'epoch'
STEP = 'step'
class classy_vision.optim.param_scheduler.IntervalScaling

Enum specifying how a sub-scheduler's time interval is scaled within a CompositeParamScheduler ('fixed' or 'rescaled').

FIXED = 2
RESCALED = 1
class classy_vision.optim.param_scheduler.StepWithFixedGammaParamScheduler(base_lr, num_decays, gamma, num_epochs)

Decays the param value by gamma at equally spaced intervals so that the specified total number of decays occurs over training.

Example

base_lr: 0.1
gamma: 0.1
num_decays: 3
num_epochs: 120

Then the param value will be 0.1 for epochs 0-29, 0.01 for epochs 30-59, 0.001 for epochs 60-89, and 0.0001 for epochs 90-119.
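Equivalently (a sketch): num_decays decays split training into num_decays + 1 equal phases, and each decay multiplies the value by gamma.

```python
def step_with_fixed_gamma_param(where: float, base_lr: float = 0.1,
                                num_decays: int = 3, gamma: float = 0.1) -> float:
    # num_decays decays divide training into num_decays + 1 equal
    # phases; each phase multiplies the previous value by gamma.
    phase = min(int(where * (num_decays + 1)), num_decays)
    return base_lr * gamma ** phase
```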

__call__(where: float) → float

Get the param value for a given point in training.

For Classy Vision we update params (such as learning rate) based on the percent progress of training completed. This allows a scheduler to be agnostic to the exact specifications of a particular run (e.g. 120 epochs vs 90 epochs).

Parameters

where – A float in [0, 1) that represents how far training has progressed

__init__(base_lr, num_decays, gamma, num_epochs)

Constructor for StepWithFixedGammaParamScheduler

Parameters

base_lr – The initial param value

num_decays – The total number of decays to apply

gamma – The multiplicative decay factor

num_epochs – The total number of epochs

classmethod from_config(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.step_with_fixed_gamma_scheduler.StepWithFixedGammaParamScheduler

Instantiates a StepWithFixedGammaParamScheduler from a configuration.

Parameters

config – A configuration for a StepWithFixedGammaParamScheduler. See __init__() for parameters expected in the config.

Returns

A StepWithFixedGammaParamScheduler instance.

classy_vision.optim.param_scheduler.build_param_scheduler(config: Dict[str, Any]) → classy_vision.optim.param_scheduler.classy_vision_param_scheduler.ClassyParamScheduler

Builds a ClassyParamScheduler from a config.

This assumes a "name" key in the config which is used to determine what param scheduler class to instantiate. For instance, a config {"name": "my_scheduler", "foo": "bar"} will find a class that was registered as "my_scheduler" (see register_param_scheduler()) and call .from_config on it.

classy_vision.optim.param_scheduler.register_param_scheduler(name)

Registers a ClassyParamScheduler subclass.

This decorator allows Classy Vision to instantiate a subclass of ClassyParamScheduler from a configuration file, even if the class itself is not part of the Classy Vision framework. To use it, apply this decorator to a ClassyParamScheduler subclass, like this:

@register_param_scheduler('my_scheduler')
class MyParamScheduler(ClassyParamScheduler):
    ...

To instantiate a param scheduler from a configuration file, see build_param_scheduler().
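The registry mechanism behind these two functions can be sketched as follows (a simplified stand-in; the real Classy Vision registry includes validation and duplicate-name checks):

```python
# Hypothetical miniature of the register/build pattern.
PARAM_SCHEDULER_REGISTRY = {}

def register_param_scheduler(name):
    def decorator(cls):
        PARAM_SCHEDULER_REGISTRY[name] = cls
        return cls
    return decorator

def build_param_scheduler(config):
    # Look up the class registered under config["name"] and delegate
    # construction to its from_config classmethod.
    return PARAM_SCHEDULER_REGISTRY[config["name"]].from_config(config)

@register_param_scheduler("my_scheduler")
class MyParamScheduler:
    def __init__(self, foo):
        self.foo = foo

    @classmethod
    def from_config(cls, config):
        return cls(config["foo"])

scheduler = build_param_scheduler({"name": "my_scheduler", "foo": "bar"})
```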