Hooks

class classy_vision.hooks.CheckpointHook(checkpoint_folder: str, input_args: Optional[Any] = None, phase_types: Optional[Collection[str]] = None, checkpoint_period: int = 1)

Hook to checkpoint a model’s task.

Saves the checkpoints in checkpoint_folder.

__init__(checkpoint_folder: str, input_args: Optional[Any] = None, phase_types: Optional[Collection[str]] = None, checkpoint_period: int = 1) None

The constructor method of CheckpointHook.

Parameters
  • checkpoint_folder – Folder to store checkpoints in

  • input_args – Any arguments to save about the runtime setup. For example, it is useful to store the config that was used to instantiate the model.

  • phase_types – If phase_types is specified, only checkpoint on those phase types. Each item in phase_types must be either “train” or “test”. If not specified, it is set to checkpoint after “train” phases.

  • checkpoint_period – Checkpoint at the end of every checkpoint_period matching phases (default: 1)
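The interplay between phase_types and checkpoint_period can be sketched with a hypothetical helper that mirrors the documented behavior (illustrative only, not part of the actual API):

```python
# Hypothetical helper sketching the checkpointing condition described
# above; not part of the Classy Vision API.
def should_checkpoint(phase_type, phases_completed,
                      phase_types=("train",), checkpoint_period=1):
    """Return True when a checkpoint should be written after a phase.

    phases_completed counts finished phases of matching types (1-based).
    """
    if phase_type not in phase_types:
        return False
    return phases_completed % checkpoint_period == 0

# With the defaults, every completed "train" phase is checkpointed:
assert should_checkpoint("train", 1) is True
# "test" phases are skipped unless listed in phase_types:
assert should_checkpoint("test", 1) is False
# With checkpoint_period=2, only every second train phase is saved:
assert should_checkpoint("train", 1, checkpoint_period=2) is False
assert should_checkpoint("train", 2, checkpoint_period=2) is True
```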

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Checkpoint the task every checkpoint_period phases.

We do not necessarily checkpoint the task at the end of every phase.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Called at the start of training.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

class classy_vision.hooks.ClassyHook

Base class for hooks.

Hooks allow injecting behavior at different points of the training loop, listed below in chronological order:

on_start -> on_phase_start -> on_step -> on_phase_end -> on_end

Deriving classes should call super().__init__() and store any state in self.state. Any state added to this property must be serializable. For example:

class MyHook(ClassyHook):
    def __init__(self, a, b):
        super().__init__()
        self.state.a = [1, 2, 3]
        self.state.b = "my_hook"
        # the following line is not allowed, since lambdas are not serializable
        # self.state.my_lambda = lambda x: x ** 2
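The call order listed above can be illustrated with a toy driver loop (a simplified stand-in for the real trainer, assuming a single hook and a no-op task):

```python
# Toy driver sketching the hook lifecycle; not the actual trainer loop.
calls = []

class RecordingHook:
    def on_start(self, task): calls.append("on_start")
    def on_phase_start(self, task): calls.append("on_phase_start")
    def on_step(self, task): calls.append("on_step")
    def on_phase_end(self, task): calls.append("on_phase_end")
    def on_end(self, task): calls.append("on_end")

def run(task, hook, num_phases=1, steps_per_phase=2):
    hook.on_start(task)
    for _ in range(num_phases):
        hook.on_phase_start(task)
        for _ in range(steps_per_phase):
            hook.on_step(task)  # fires after each optimizer update
        hook.on_phase_end(task)
    hook.on_end(task)

run(task=None, hook=RecordingHook())
assert calls == ["on_start", "on_phase_start", "on_step", "on_step",
                 "on_phase_end", "on_end"]
```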
__init__()
get_classy_state() Dict[str, Any]

Get the state of the ClassyHook.

The returned state is used for checkpointing.

Returns

A state dictionary containing the state of the hook.

classmethod name() str

Returns the name of the class.

abstract on_end(task) None

Called at the end of training.

abstract on_phase_end(task) None

Called at the end of each phase (epoch).

abstract on_phase_start(task) None

Called at the start of each phase.

abstract on_start(task) None

Called at the start of training.

abstract on_step(task) None

Called each time after parameters have been updated by the optimizer.

set_classy_state(state_dict: Dict[str, Any]) None

Set the state of the ClassyHook.

Parameters

state_dict – The state dictionary. Must be the output of a call to get_classy_state().

This is used to load the state of the hook from a checkpoint.
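The get_classy_state / set_classy_state round trip can be sketched with a toy stand-in class (not the real ClassyHook implementation):

```python
import copy

# Minimal stand-in illustrating the get/set_classy_state contract.
class ToyHook:
    def __init__(self):
        self.state = {"seen_phases": 0}

    def get_classy_state(self):
        # The returned state must be serializable so it can be checkpointed.
        return copy.deepcopy(self.state)

    def set_classy_state(self, state_dict):
        # state_dict must come from a previous get_classy_state() call.
        self.state = copy.deepcopy(state_dict)

hook = ToyHook()
hook.state["seen_phases"] = 5
checkpoint = hook.get_classy_state()

restored = ToyHook()
restored.set_classy_state(checkpoint)
assert restored.state["seen_phases"] == 5
```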

class classy_vision.hooks.ClassyHookFunctions(value)

Enumeration of all the hook functions in the ClassyHook class.

class classy_vision.hooks.ExponentialMovingAverageModelHook(decay: float, consider_bn_buffers: bool = True, device: str = 'gpu')

Hook which keeps track of the exponential moving average (EMA) of the model’s parameters and applies the EMA parameters to the model during test phases.

Saving the state on the CPU saves GPU memory, but makes training slower, since the model parameters need to be moved to the CPU before averaging.

Note

This hook stores two additional copies of the model’s parameters, which will increase memory usage significantly.

__init__(decay: float, consider_bn_buffers: bool = True, device: str = 'gpu') None

The constructor method of ExponentialMovingAverageModelHook.

Parameters
  • decay – EMA decay factor, should be in [0, 1]. A decay of 0 corresponds to always using the latest value (no EMA) and a decay of 1 corresponds to not updating weights after initialization.

  • consider_bn_buffers – Whether to apply EMA to batch norm buffers

  • device – Device to store the model state.
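The per-parameter update implied by the decay parameter can be sketched as follows (an illustrative formula only, not the hook's exact implementation):

```python
# EMA update applied per parameter value:
#   ema = decay * ema + (1 - decay) * param
def ema_update(ema_value, new_value, decay):
    return decay * ema_value + (1 - decay) * new_value

# decay = 0 always takes the latest value (no averaging):
assert ema_update(10.0, 2.0, decay=0.0) == 2.0
# decay = 1 never moves away from the initial value:
assert ema_update(10.0, 2.0, decay=1.0) == 10.0
# Intermediate decays interpolate between the two:
assert abs(ema_update(10.0, 2.0, decay=0.9) - 9.2) < 1e-9
```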

get_model_state_iterator(model: torch.nn.modules.module.Module) Iterable[Tuple[str, Any]]

Get an iterator over the model state to apply EMA to.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Called at the end of each phase (epoch).

on_phase_start(task) None

Called at the start of each phase.

on_start(task) None

Called at the start of training.

on_step(task) None

Called each time after parameters have been updated by the optimizer.

set_model_state(task, use_ema: bool) None

Depending on use_ema, set the appropriate state for the model.

class classy_vision.hooks.LossLrMeterLoggingHook(log_freq: Optional[int] = None)

Logs the loss, optimizer LR, and meters at the end of each phase.

__init__(log_freq: Optional[int] = None) None

The constructor method of LossLrMeterLoggingHook.

Parameters

log_freq – If specified, also logs every log_freq batches.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Log the loss, optimizer LR, and meters for the phase.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Called at the start of training.

on_step(task) None

Log the LR every log_freq batches, if log_freq is not None.

class classy_vision.hooks.ModelComplexityHook

Logs the number of parameters and the FLOPs and activations of the model’s forward pass.

__init__() None
on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Measure the number of parameters, FLOPs, and activations.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

class classy_vision.hooks.ModelTensorboardHook(tb_writer)

Shows the model graph in TensorBoard (https://www.tensorflow.org/tensorboard).

__init__(tb_writer) None

The constructor method of ModelTensorboardHook.

Parameters
  • tb_writer – Tensorboard SummaryWriter instance, or None (see <https://tensorboardx.readthedocs.io/en/latest/tensorboard.html#tensorboardX.SummaryWriter>)

classmethod from_config(config: Dict[str, Any]) classy_vision.hooks.model_tensorboard_hook.ModelTensorboardHook

The config is expected to include the key “summary_writer” with arguments that correspond to those listed at <https://tensorboardx.readthedocs.io/en/latest/tensorboard.html#tensorboardX.SummaryWriter>.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Plot the model on Tensorboard.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

class classy_vision.hooks.OutputCSVHook(folder, id_key='id', delimiter='\t')
__init__(folder, id_key='id', delimiter='\t') None
on_end(task) None

Called at the end of training.

on_phase_end(task) None

Called at the end of each phase (epoch).

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Called at the start of training.

on_step(task) None

Saves the output of the model to a CSV file.

This hook assumes the dataset provides an “id” key. It also expects the task to provide an output of shape (B, C) where B is the batch size and C is the number of classes. Targets can be either one-hot encoded or single numbers.
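A hypothetical sketch of the resulting CSV layout, assuming one row per sample keyed by the dataset's "id", with an argmax prediction taken over the (B, C) output (column names and row format are illustrative, not the hook's exact output):

```python
import csv
import io

# Hypothetical CSV writer sketching one row per sample.
def write_output_csv(ids, outputs, targets, delimiter="\t"):
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow(["id", "prediction", "target"])
    for sample_id, scores, target in zip(ids, outputs, targets):
        # prediction = index of the highest of the C class scores
        prediction = max(range(len(scores)), key=scores.__getitem__)
        writer.writerow([sample_id, prediction, target])
    return buf.getvalue()

csv_text = write_output_csv(
    ids=["img_0", "img_1"],
    outputs=[[0.1, 0.9], [0.8, 0.2]],  # shape (B=2, C=2)
    targets=[1, 0],
)
assert "img_0\t1\t1" in csv_text
```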

class classy_vision.hooks.PreciseBatchNormHook(num_samples: int, cache_samples: bool = False)

Hook to compute precise batch normalization statistics.

Batch norm stats are calculated and updated during training, when the weights are also changing, which makes the calculations imprecise. This hook recomputes the batch norm stats at the end of a train phase to make them more precise. See fvcore’s documentation for more information.

__init__(num_samples: int, cache_samples: bool = False) None

The constructor method of PreciseBatchNormHook.

Caches the required number of samples on the CPU during train phases.

Parameters
  • num_samples – Number of samples to calculate the batch norm stats per replica

  • cache_samples – If True, we cache samples during the train phase. This avoids re-creating data loaders, but consumes more memory. If False, we re-create the data loader at the end of the phase, which can be slow for large datasets, but saves memory.
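Why recomputed statistics are more precise can be illustrated with plain running averages (a toy pure-Python stand-in, not the fvcore routine):

```python
# During training, batch norm stats are tracked as an exponential moving
# average while the weights keep changing, so they lag behind the stats
# of the final model. Recomputing them in a fresh pass is exact.
def running_mean(batch_means, momentum=0.1):
    mean = 0.0
    for m in batch_means:
        mean = (1 - momentum) * mean + momentum * m
    return mean

def precise_mean(batch_means):
    return sum(batch_means) / len(batch_means)

# Suppose every batch under the *final* weights has mean 4.0:
batches = [4.0] * 10
# The EMA started from 0.0 still underestimates after 10 batches...
assert running_mean(batches) < precise_mean(batches)
# ...while the recomputed mean is exact:
assert precise_mean(batches) == 4.0
```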

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Called at the end of each phase (epoch).

on_phase_start(task) None

Called at the start of each phase.

on_start(task) None

Called at the start of training.

on_step(task) None

Called each time after parameters have been updated by the optimizer.

class classy_vision.hooks.ProfilerHook

Hook to profile a model and show runtime information, such as the time breakdown in milliseconds of the forward/backward pass.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Profile the forward pass.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

class classy_vision.hooks.ProgressBarHook

Displays a progress bar to show progress in processing batches.

__init__() None

The constructor method of ProgressBarHook.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Clear the progress bar at the end of the phase.

on_phase_start(task) None

Create and display a progress bar with 0 progress.

on_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_step(task) None

Update the progress bar with the batch size.

class classy_vision.hooks.TensorboardPlotHook(tb_writer, log_period: int = 10)

Hook for writing the losses, learning rates and meters to TensorBoard (https://www.tensorflow.org/tensorboard).

Global steps are counted in terms of the number of samples processed.

__init__(tb_writer, log_period: int = 10) None

The constructor method of TensorboardPlotHook.

Parameters
  • tb_writer – Tensorboard SummaryWriter instance, or None (see <https://tensorboardx.readthedocs.io/en/latest/tensorboard.html#tensorboardX.SummaryWriter>)

classmethod from_config(config: Dict[str, Any]) classy_vision.hooks.tensorboard_plot_hook.TensorboardPlotHook

The config is expected to include the key “summary_writer” with arguments that correspond to those listed at <https://tensorboardx.readthedocs.io/en/latest/tensorboard.html#tensorboardX.SummaryWriter>.

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Add the losses and learning rates to tensorboard.

on_phase_start(task) None

Initialize losses and learning_rates.

on_start(task) None

Called at the start of training.

on_step(task) None

Store the observed learning rates.

class classy_vision.hooks.TorchscriptHook(torchscript_folder: str, use_trace: bool = True, trace_strict: bool = True, device: str = 'cpu')

Hook to convert a task’s model into TorchScript.

Saves the TorchScript models in torchscript_folder.

__init__(torchscript_folder: str, use_trace: bool = True, trace_strict: bool = True, device: str = 'cpu') None

The constructor method of TorchscriptHook.

Parameters
  • torchscript_folder – Folder to store torch scripts in.

  • use_trace – Set to True for tracing and False for scripting (default: True).

  • trace_strict – Whether to run the tracer in strict mode (default: True). Only turn this off when you want the tracer to record your mutable container types (currently list/dict) and you are sure that the container you are using is a constant structure and does not get used as a control flow (if, for) condition.

  • device – Move the model to this device before saving.

on_end(task) None

Save the model as TorchScript at the end of training/testing.

on_phase_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(task) None

Called at the start of training.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

class classy_vision.hooks.VisdomHook(server: str, port: str, env: str = 'main', title_suffix: str = '')

Plots metrics onto Visdom.

Visdom is a flexible tool for creating, organizing, and sharing visualizations of live, rich data. It supports Python.

__init__(server: str, port: str, env: str = 'main', title_suffix: str = '') None
Parameters
  • server – host name of the visdom server

  • port – port of visdom server, such as 8097

  • env – environment of visdom

  • title_suffix – suffix that will be appended to the title

on_end(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_phase_end(task) None

Plot the metrics on visdom.

on_phase_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_start(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

on_step(*args, **kwargs) None

Derived classes can set their hook functions to this.

This is useful if they want those hook functions to not do anything.

classy_vision.hooks.build_hook(hook_config: Dict[str, Any])

Builds a ClassyHook from a config.

This assumes a ‘name’ key in the config which is used to determine what hook class to instantiate. For instance, a config {“name”: “my_hook”, “foo”: “bar”} will find a class that was registered as “my_hook” (see register_hook()) and call .from_config on it.

classy_vision.hooks.register_hook(name, bypass_checks=False)

Registers a ClassyHook subclass.

This decorator allows Classy Vision to instantiate a subclass of ClassyHook from a configuration file, even if the class itself is not part of the base Classy Vision framework. To use it, apply this decorator to a ClassyHook subclass, like this:

@register_hook('custom_hook')
class CustomHook(ClassyHook):
    ...

To instantiate a hook from a configuration file, see build_hook().
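Together, register_hook() and build_hook() implement a name-based registry. A simplified pure-Python sketch of the pattern (not the actual Classy Vision implementation):

```python
# Simplified registry sketch; the real register_hook/build_hook live in
# classy_vision.hooks and do additional validation.
HOOK_REGISTRY = {}

def register_hook(name):
    def wrapper(cls):
        HOOK_REGISTRY[name] = cls
        return cls
    return wrapper

def build_hook(hook_config):
    # "name" selects the registered class; the rest goes to from_config.
    cls = HOOK_REGISTRY[hook_config["name"]]
    return cls.from_config(hook_config)

@register_hook("my_hook")
class MyHook:
    def __init__(self, foo):
        self.foo = foo

    @classmethod
    def from_config(cls, config):
        return cls(foo=config["foo"])

hook = build_hook({"name": "my_hook", "foo": "bar"})
assert isinstance(hook, MyHook) and hook.foo == "bar"
```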