Losses

class classy_vision.losses.BarronLoss(alpha, size_average, c)

This implements the Barron loss.

__init__(alpha, size_average, c)

Constructor for BarronLoss. alpha is the shape parameter controlling robustness, c is the scale parameter, and size_average controls whether the loss is averaged over elements.

forward(prediction, target)

Compute the loss for the provided sample.

Refer to torch.nn.Module for more details.

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.barron_loss.BarronLoss

Instantiates a BarronLoss from a configuration.

Parameters

config – A configuration for a BarronLoss. See __init__() for parameters expected in the config.

Returns

A BarronLoss instance.
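The Barron loss is the general robust loss from Barron's "A General and Adaptive Robust Loss Function", parameterized by a shape parameter alpha and a scale c. The exact classy_vision implementation is not reproduced here; the following is a pure-Python sketch of the per-element formula and its standard limiting cases (the function name barron_rho is illustrative, not part of the API):

```python
import math

def barron_rho(x, alpha, c):
    """Barron's general robust loss for a single residual x.

    alpha controls robustness (2 -> L2, 1 -> smooth-L1-like, 0 -> Cauchy),
    c is the scale at which the loss transitions from quadratic behavior.
    The alpha = 2 and alpha = 0 branches are the limits of the general
    expression, which is singular at those values.
    """
    z = (x / c) ** 2
    if alpha == 2.0:   # limit alpha -> 2: ordinary L2 loss
        return 0.5 * z
    if alpha == 0.0:   # limit alpha -> 0: Cauchy / Lorentzian loss
        return math.log(0.5 * z + 1.0)
    b = abs(alpha - 2.0)
    return (b / alpha) * ((z / b + 1.0) ** (alpha / 2.0) - 1.0)
```

For alpha = 2 this reduces to the familiar 0.5 * (x/c)**2, and lowering alpha progressively down-weights large residuals.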

class classy_vision.losses.ClassyLoss

Base class to calculate the loss during training.

This implementation of torch.nn.Module allows building the loss object from a configuration file.

__init__()

Constructor for ClassyLoss.

forward(output, target)

Compute the loss for the provided sample.

Refer to torch.nn.Module for more details.

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.classy_loss.ClassyLoss

Instantiates a ClassyLoss from a configuration.

Parameters

config – A configuration for a ClassyLoss.

Returns

A ClassyLoss instance.

get_classy_state() → Dict[str, Any]

Get the state of the ClassyLoss.

The returned state is used for checkpointing. Note that most losses are stateless and do not need to save any state.

Returns

A state dictionary containing the state of the loss.

get_optimizer_params(bn_weight_decay=False)

Gets optimizer params.

The default implementation is very simple. Most losses have no learned parameters, so this is rarely needed.

has_learned_parameters() → bool

Does this loss have learned parameters?

set_classy_state(state: Dict[str, Any]) → None

Set the state of the ClassyLoss.

Parameters

state_dict – The state dictionary. Must be the output of a call to get_classy_state().

This is used to load the state of the loss from a checkpoint. Note that most losses are stateless and do not need to load any state.

class classy_vision.losses.LabelSmoothingCrossEntropyLoss(ignore_index, reduction, smoothing_param)

__init__(ignore_index, reduction, smoothing_param)

Initializer for the label-smoothed cross-entropy loss. Label smoothing decreases the gap between output scores and encourages generalization. Targets provided to forward can be one-hot vectors (NxC) or class indices (Nx1).

Config params: 'weight': weight of sample (not yet implemented), 'ignore_index': samples with this target value are ignored for the loss (optional), 'smoothing_param': value to be added to each target entry before normalization

compute_valid_targets(target, classes)

This function takes one-hot or index target vectors and computes valid one-hot target vectors, based on the ignore_index value.

forward(output, target)

Compute the loss for the provided sample.

Refer to torch.nn.Module for more details.

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.label_smoothing_loss.LabelSmoothingCrossEntropyLoss

Instantiates a LabelSmoothingCrossEntropyLoss from a configuration.

Parameters

config – A configuration for a LabelSmoothingCrossEntropyLoss. See __init__() for parameters expected in the config.

Returns

A LabelSmoothingCrossEntropyLoss instance.

smooth_targets(valid_targets, classes)

This function takes valid one-hot target vectors (no ignore values present) and computes smoothed, normalized target vectors according to the loss's smoothing parameter.
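The smoothing step described above can be sketched in a few lines of pure Python: add smoothing_param to every entry of a one-hot target row, then renormalize so the row remains a valid distribution. The helper name smooth_one_hot is illustrative, not the library's API:

```python
def smooth_one_hot(one_hot, smoothing_param):
    """Add smoothing_param to every entry of a one-hot target row and
    renormalize so the row still sums to 1."""
    smoothed = [v + smoothing_param for v in one_hot]
    total = sum(smoothed)
    return [v / total for v in smoothed]
```

For example, smoothing the one-hot row [0, 0, 1, 0] with smoothing_param=0.1 yields [0.1, 0.1, 1.1, 0.1] / 1.4, so the true class keeps most of the mass while every other class receives a small positive target.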

class classy_vision.losses.MultiOutputSumLoss(loss)

Applies the provided loss to the list of outputs (or single output) and sums up the losses.

__init__(loss) → None

Constructor for MultiOutputSumLoss.

forward(output, target)

Compute the loss for the provided sample.

Refer to torch.nn.Module for more details.

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.multi_output_sum_loss.MultiOutputSumLoss

Instantiates a MultiOutputSumLoss from a configuration.

Parameters

config – A configuration for a MultiOutputSumLoss. See __init__() for parameters expected in the config.

Returns

A MultiOutputSumLoss instance.
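The behavior of MultiOutputSumLoss can be sketched without the framework: apply the wrapped loss to each output (wrapping a lone output in a list) and sum the results. This is a pure-Python illustration of the pattern, not the library's implementation:

```python
def multi_output_sum(loss_fn, outputs, target):
    """Apply loss_fn to each output (wrapping a single output in a
    list) and return the sum of the per-output losses."""
    if not isinstance(outputs, list):
        outputs = [outputs]
    return sum(loss_fn(o, target) for o in outputs)
```

This is useful for models with auxiliary heads, where each head's prediction is scored against the same target and the losses are accumulated.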

class classy_vision.losses.SoftTargetCrossEntropyLoss(ignore_index, reduction, normalize_targets)

__init__(ignore_index, reduction, normalize_targets)

Initializer for the soft-target cross-entropy loss. This allows the targets for the cross-entropy loss to be multi-label.

Config params: 'weight': weight of sample (not yet implemented), 'ignore_index': samples with this target value are ignored for the loss (optional), 'reduction': specifies the reduction to apply to the output (optional)

forward(output, target)

For N examples and C classes:

- output: N x C raw outputs (without softmax/sigmoid applied)
- target: N x C or N corresponding targets

Target elements set to ignore_index contribute 0 loss.

Samples where all entries are ignore_index do not contribute to the loss reduction.
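The core computation is a cross entropy between the log-softmax of the raw outputs and a (normalized) soft target distribution. Below is a pure-Python sketch for a single sample; it omits the ignore_index handling described above for brevity, and the function name is illustrative:

```python
import math

def soft_target_cross_entropy(logits, target, normalize_targets=True):
    """Cross entropy between raw logits (no softmax applied) and a
    soft target distribution, for a single sample."""
    # Log-softmax with a max-shift for numerical stability.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]
    if normalize_targets:
        s = sum(target)
        target = [t / s for t in target]
    return -sum(t * lp for t, lp in zip(target, log_probs))
```

With a one-hot target this reduces to ordinary cross entropy; with a multi-label target the loss is the expected negative log-probability under the target distribution.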

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.soft_target_cross_entropy_loss.SoftTargetCrossEntropyLoss

Instantiates a SoftTargetCrossEntropyLoss from a configuration.

Parameters

config – A configuration for a SoftTargetCrossEntropyLoss. See __init__() for parameters expected in the config.

Returns

A SoftTargetCrossEntropyLoss instance.

class classy_vision.losses.SumArbitraryLoss(losses: List[classy_vision.losses.classy_loss.ClassyLoss], weights: Optional[torch.Tensor] = None)

Sums a collection of (weighted) torch.nn losses.

NOTE: this applies all the losses to the same output and does not support taking a list of outputs as input.

__init__(losses: List[classy_vision.losses.classy_loss.ClassyLoss], weights: Optional[torch.Tensor] = None) → None

Constructor for SumArbitraryLoss.

forward(prediction, target)

Compute the loss for the provided sample.

Refer to torch.nn.Module for more details.

classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.sum_arbitrary_loss.SumArbitraryLoss

Instantiates a SumArbitraryLoss from a configuration.

Parameters

config – A configuration for a SumArbitraryLoss. See __init__() for parameters expected in the config.

Returns

A SumArbitraryLoss instance.
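SumArbitraryLoss applies every wrapped loss to the same (output, target) pair and returns their weighted sum, with weights defaulting to 1. A pure-Python sketch of that behavior (the function name is illustrative, not the library's API):

```python
def sum_arbitrary(losses, output, target, weights=None):
    """Apply every loss in `losses` to the same (output, target) pair
    and return the weighted sum; weights default to 1 per loss."""
    if weights is None:
        weights = [1.0] * len(losses)
    return sum(w * fn(output, target) for w, fn in zip(weights, losses))
```

Contrast this with MultiOutputSumLoss above: here several losses score one output, rather than one loss scoring several outputs.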

classy_vision.losses.build_loss(config)

Builds a ClassyLoss from a config.

This assumes a 'name' key in the config which is used to determine which loss class to instantiate. For instance, a config {"name": "my_loss", "foo": "bar"} will find a class that was registered as "my_loss" (see register_loss()) and call .from_config on it.

In addition to losses registered with register_loss(), we also support instantiating losses available in the torch.nn.modules.loss module. Any keys in the config will get expanded to parameters of the loss constructor. For instance, the following call will instantiate a torch.nn.modules.CrossEntropyLoss:

build_loss({"name": "CrossEntropyLoss", "reduction": "sum"})

classy_vision.losses.register_loss(name)

Registers a ClassyLoss subclass.

This decorator allows Classy Vision to instantiate a subclass of ClassyLoss from a configuration file, even if the class itself is not part of the Classy Vision framework. To use it, apply this decorator to a ClassyLoss subclass, like this:

@register_loss("my_loss")
class MyLoss(ClassyLoss):
    ...

To instantiate a loss from a configuration file, see build_loss().
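The register/build pattern is a simple name-to-class registry. The following minimal stand-in illustrates the mechanics in pure Python; the registry variable and the MyLoss class are illustrative, not classy_vision internals:

```python
# A minimal stand-in for the register_loss / build_loss pattern.
LOSS_REGISTRY = {}

def register_loss(name):
    """Decorator that records a class in the registry under `name`."""
    def decorator(cls):
        LOSS_REGISTRY[name] = cls
        return cls
    return decorator

def build_loss(config):
    """Look up the class named by config['name'] and build it from
    the config via its from_config classmethod."""
    cls = LOSS_REGISTRY[config["name"]]
    return cls.from_config(config)

@register_loss("my_loss")
class MyLoss:
    def __init__(self, scale):
        self.scale = scale

    @classmethod
    def from_config(cls, config):
        # Any extra config keys become constructor parameters.
        return cls(scale=config.get("scale", 1.0))
```

Because registration happens at import time via the decorator, a config file can name losses defined outside the framework, as long as their modules are imported before build_loss is called.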