Losses¶
- class classy_vision.losses.BarronLoss(alpha, size_average, c)¶
This implements the Barron loss.
- __init__(alpha, size_average, c)¶
Constructor for BarronLoss.
- forward(prediction, target)¶
Compute the loss for the provided sample.
Refer to torch.nn.Module for more details.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.barron_loss.BarronLoss¶
Instantiates a BarronLoss from a configuration.
- Parameters
config – A configuration for a BarronLoss. See __init__() for parameters expected in the config.
- Returns
A BarronLoss instance.
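BarronLoss refers to the general robust loss of Barron (2019), in which alpha controls the robustness of the loss and c its scale. The exact tensor-level implementation and size averaging are handled by the library; the scalar formula it is built on can be sketched as follows (a pure-Python sketch of the math only, not the library code):

```python
import math

def barron_loss(x, alpha, c):
    """General robust loss rho(x, alpha, c) for a scalar residual x.

    alpha controls robustness (2 = L2, 1 = smoothed L1, 0 = Cauchy/log);
    c is a scale parameter. This is a sketch of the underlying formula --
    the library version operates on tensors and applies size averaging.
    """
    z = (x / c) ** 2
    if alpha == 2.0:  # limit case: ordinary L2 loss
        return 0.5 * z
    if alpha == 0.0:  # limit case: Cauchy / log loss
        return math.log(0.5 * z + 1.0)
    b = abs(alpha - 2.0)
    return (b / alpha) * ((z / b + 1.0) ** (alpha / 2.0) - 1.0)
```

With alpha=1 this reduces to a Charbonnier-style smoothed L1; with alpha=2 it is plain L2.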
- class classy_vision.losses.ClassyLoss¶
Base class to calculate the loss during training.
This implementation of torch.nn.Module allows building the loss object from a configuration file.
- __init__()¶
Constructor for ClassyLoss.
- forward(output, target)¶
Compute the loss for the provided sample.
Refer to torch.nn.Module for more details.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.classy_loss.ClassyLoss¶
Instantiates a ClassyLoss from a configuration.
- Parameters
config – A configuration for a ClassyLoss.
- Returns
A ClassyLoss instance.
- get_classy_state() → Dict[str, Any]¶
Get the state of the ClassyLoss.
The returned state is used for checkpointing. Note that most losses are stateless and do not need to save any state.
- Returns
A state dictionary containing the state of the loss.
- set_classy_state(state: Dict[str, Any]) → None¶
Set the state of the ClassyLoss.
- Parameters
state – The state dictionary. Must be the output of a call to get_classy_state().
This is used to load the state of the loss from a checkpoint. Note that most losses are stateless and do not need to load any state.
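The interface a ClassyLoss subclass implements can be sketched without torch. The stand-in base class below is hypothetical (the real ClassyLoss extends torch.nn.Module), but the method names and their default behaviors mirror the API described above:

```python
from typing import Any, Dict

class SketchClassyLoss:
    """Hypothetical stand-in for ClassyLoss (the real one extends torch.nn.Module)."""

    @classmethod
    def from_config(cls, config: Dict[str, Any]) -> "SketchClassyLoss":
        # One common pattern: build the subclass directly from config keys.
        return cls(**config)

    def forward(self, output, target):
        raise NotImplementedError

    def get_classy_state(self) -> Dict[str, Any]:
        # Most losses are stateless, so the default state is empty.
        return {}

    def set_classy_state(self, state: Dict[str, Any]) -> None:
        # Stateless losses have nothing to restore.
        pass

class ScaledL1Loss(SketchClassyLoss):
    """Toy subclass: mean absolute error scaled by a configurable weight."""

    def __init__(self, weight: float = 1.0):
        self.weight = weight

    def forward(self, output, target):
        n = len(output)
        return self.weight * sum(abs(o - t) for o, t in zip(output, target)) / n

loss = ScaledL1Loss.from_config({"weight": 2.0})
```

The `ScaledL1Loss` name and its `weight` parameter are illustrative, not part of the library.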
- class classy_vision.losses.LabelSmoothingCrossEntropyLoss(ignore_index=-100, reduction='mean', smoothing_param=None)¶
- __init__(ignore_index=-100, reduction='mean', smoothing_param=None)¶
Initializer for the label-smoothed cross-entropy loss. This decreases the gap between output scores and encourages generalization. Targets provided to forward can be one-hot vectors (N x C) or class indices (N x 1).
This normalizes the targets to a sum of 1 based on the total count of positive targets for a given sample before applying label smoothing.
- Parameters
ignore_index – sample should be ignored for loss if the class is this value
reduction – specifies reduction to apply to the output
smoothing_param – value to be added to each target entry
- compute_valid_targets(target, classes)¶
This function takes one-hot or index target vectors and computes valid one-hot target vectors, based on the ignore_index value.
- forward(output, target)¶
Compute the loss for the provided sample.
Refer to torch.nn.Module for more details.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.label_smoothing_loss.LabelSmoothingCrossEntropyLoss¶
Instantiates a LabelSmoothingCrossEntropyLoss from a configuration.
- Parameters
config – A configuration for a LabelSmoothingCrossEntropyLoss. See __init__() for parameters expected in the config.
- Returns
A LabelSmoothingCrossEntropyLoss instance.
- smooth_targets(valid_targets, classes)¶
This function takes valid one-hot target vectors (no ignore values present) and computes smoothed, normalized target vectors according to the loss's smoothing parameter.
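The smoothing step described above can be sketched on a plain list: the smoothing_param is added to every entry of a (already normalized) target vector, and the result is renormalized to sum to 1. This is a sketch of the math under that assumption; the library's exact renormalization may differ slightly:

```python
def smooth_targets(one_hot, smoothing_param):
    """Add smoothing_param to every entry of a target vector and renormalize.

    Assumes `one_hot` already sums to 1 (the loss normalizes targets first).
    Sketch of the smoothing math only, not the library implementation.
    """
    smoothed = [t + smoothing_param for t in one_hot]
    total = sum(smoothed)
    return [s / total for s in smoothed]
```

For example, a 4-class one-hot target [1, 0, 0, 0] with smoothing_param=0.1 becomes [1.1, 0.1, 0.1, 0.1] / 1.4, so some probability mass is moved off the positive class onto every other class.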
- class classy_vision.losses.MultiOutputSumLoss(loss)¶
Applies the provided loss to the list of outputs (or single output) and sums up the losses.
- forward(output, target)¶
Compute the loss for the provided sample.
Refer to torch.nn.Module for more details.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.multi_output_sum_loss.MultiOutputSumLoss¶
Instantiates a MultiOutputSumLoss from a configuration.
- Parameters
config – A configuration for a MultiOutputSumLoss. See __init__() for parameters expected in the config.
- Returns
A MultiOutputSumLoss instance.
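The behavior of MultiOutputSumLoss — apply one wrapped loss to every element of a list of outputs (or to a single output) and sum the results — can be sketched with a plain callable standing in for the wrapped ClassyLoss:

```python
def multi_output_sum_loss(loss_fn, output, target):
    """Apply loss_fn to each output in a list (or a single output) and sum.

    loss_fn is any callable standing in for the wrapped ClassyLoss;
    this is a sketch of the wrapper's behavior, not the library code.
    """
    if not isinstance(output, list):
        output = [output]  # single outputs are treated as a one-element list
    return sum(loss_fn(o, target) for o in output)
```

This is useful for models with auxiliary heads, where each head's output should be scored against the same target.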
- class classy_vision.losses.SoftTargetCrossEntropyLoss(ignore_index=-100, reduction='mean', normalize_targets=True)¶
- __init__(ignore_index=-100, reduction='mean', normalize_targets=True)¶
Initializer for the soft target cross-entropy loss. This allows the targets for the cross-entropy loss to be multilabel.
- Parameters
ignore_index – sample should be ignored for loss if the class is this value
reduction – specifies reduction to apply to the output
normalize_targets – whether the targets should be normalized to a sum of 1 based on the total count of positive targets for a given sample
- forward(output, target)¶
For N examples and C classes:
- output: N x C raw outputs (without softmax/sigmoid applied)
- target: N x C or N corresponding targets
Target elements set to ignore_index contribute 0 loss.
Samples where all entries are ignore_index do not contribute to the loss reduction.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.soft_target_cross_entropy_loss.SoftTargetCrossEntropyLoss¶
Instantiates a SoftTargetCrossEntropyLoss from a configuration.
- Parameters
config – A configuration for a SoftTargetCrossEntropyLoss. See __init__() for parameters expected in the config.
- Returns
A SoftTargetCrossEntropyLoss instance.
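The per-sample math behind this loss is a cross entropy between the (optionally normalized) soft targets and the log-softmax of the raw outputs, with ignore_index entries contributing zero. A pure-Python sketch of one sample, under those assumptions (the library version is vectorized over the batch and applies the configured reduction):

```python
import math

def soft_target_cross_entropy(output, target, ignore_index=-100,
                              normalize_targets=True):
    """Per-sample soft-target cross entropy on plain lists (sketch of the math).

    output: raw scores for one sample (no softmax applied).
    target: soft/one-hot targets for the same sample; entries equal to
    ignore_index contribute zero loss.
    """
    # Numerically stable log-softmax of the raw outputs.
    m = max(output)
    log_z = m + math.log(sum(math.exp(o - m) for o in output))
    log_probs = [o - log_z for o in output]

    # Drop ignored entries, then optionally normalize targets to sum to 1.
    valid = [(t, lp) for t, lp in zip(target, log_probs) if t != ignore_index]
    total = sum(t for t, _ in valid)
    if normalize_targets and total > 0:
        valid = [(t / total, lp) for t, lp in valid]
    return -sum(t * lp for t, lp in valid)
```

For a one-hot target over two equal raw scores the loss is log 2, the entropy of a fair coin, as expected.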
- class classy_vision.losses.SumArbitraryLoss(losses: List[classy_vision.losses.classy_loss.ClassyLoss], weights: Optional[torch.Tensor] = None)¶
Sums a collection of (weighted) torch.nn losses.
NOTE: this applies all the losses to the same output and does not support taking a list of outputs as input.
- __init__(losses: List[classy_vision.losses.classy_loss.ClassyLoss], weights: Optional[torch.Tensor] = None) → None¶
Constructor for SumArbitraryLoss.
- forward(prediction, target)¶
Compute the loss for the provided sample.
Refer to torch.nn.Module for more details.
- classmethod from_config(config: Dict[str, Any]) → classy_vision.losses.sum_arbitrary_loss.SumArbitraryLoss¶
Instantiates a SumArbitraryLoss from a configuration.
- Parameters
config – A configuration for a SumArbitraryLoss. See __init__() for parameters expected in the config.
- Returns
A SumArbitraryLoss instance.
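The combination SumArbitraryLoss performs — every wrapped loss applied to the same output, each scaled by its weight, then summed — can be sketched with plain callables standing in for the ClassyLoss instances:

```python
def sum_arbitrary_loss(losses, output, target, weights=None):
    """Weighted sum of several loss callables applied to the same output.

    losses: list of callables standing in for ClassyLoss instances;
    weights: optional per-loss weights (defaults to 1.0 each).
    Sketch of the behavior only, not the library implementation.
    """
    if weights is None:
        weights = [1.0] * len(losses)
    return sum(w * loss(output, target) for w, loss in zip(weights, losses))
```

Note the contrast with MultiOutputSumLoss: here several losses share one output, whereas there one loss is applied to several outputs.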
- classy_vision.losses.build_loss(config)¶
Builds a ClassyLoss from a config.
This assumes a 'name' key in the config which is used to determine which loss class to instantiate. For instance, a config {"name": "my_loss", "foo": "bar"} will find a class that was registered as "my_loss" (see register_loss()) and call .from_config on it.
In addition to losses registered with register_loss(), we also support instantiating losses available in the torch.nn.modules.loss module. Any keys in the config will get expanded to parameters of the loss constructor. For instance, the following call will instantiate a torch.nn.modules.CrossEntropyLoss:
build_loss({"name": "CrossEntropyLoss", "reduction": "sum"})
- classy_vision.losses.register_loss(name, bypass_checks=False)¶
Registers a ClassyLoss subclass.
This decorator allows Classy Vision to instantiate a subclass of ClassyLoss from a configuration file, even if the class itself is not part of the Classy Vision framework. To use it, apply this decorator to a ClassyLoss subclass, like this:
@register_loss("my_loss")
class MyLoss(ClassyLoss):
    ...
To instantiate a loss from a configuration file, see build_loss().
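The registry pattern behind register_loss() and build_loss() can be sketched in a few lines of plain Python. This is a hypothetical minimal version — the real build_loss also passes the full config through, falls back to torch.nn losses, and supports bypass_checks — but it shows how the decorator, the registry, and from_config fit together:

```python
# Minimal sketch of a name -> class loss registry (not the library code).
LOSS_REGISTRY = {}

def register_loss(name):
    """Decorator mapping a config 'name' to a loss class."""
    def decorator(cls):
        LOSS_REGISTRY[name] = cls
        return cls
    return decorator

def build_loss(config):
    """Look up config['name'] in the registry and build via from_config."""
    cls = LOSS_REGISTRY[config["name"]]
    rest = {k: v for k, v in config.items() if k != "name"}
    return cls.from_config(rest)

@register_loss("my_loss")
class MyLoss:
    """Toy registered loss; a real one would subclass ClassyLoss."""

    def __init__(self, foo="default"):
        self.foo = foo

    @classmethod
    def from_config(cls, config):
        return cls(**config)

loss = build_loss({"name": "my_loss", "foo": "bar"})
```

Because registration happens at import time, any module that defines a decorated loss only needs to be imported for its name to become buildable from a config.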