SeesawLoss

class mmpretrain.models.losses.SeesawLoss(use_sigmoid=False, p=0.8, q=2.0, num_classes=1000, eps=0.01, reduction='mean', loss_weight=1.0)[source]

Implementation of seesaw loss.

Refers to Seesaw Loss for Long-Tailed Instance Segmentation (CVPR 2021)
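
In brief, and paraphrasing the paper rather than the exact implementation, the loss replaces the standard softmax denominator with a reweighted one: for a sample of ground-truth class i with logits z, the contribution of every other class j is scaled by a seesaw factor

  \hat{\sigma}_i = \frac{e^{z_i}}{\sum_{j \ne i} \mathcal{S}_{ij}\, e^{z_j} + e^{z_i}},
  \qquad \mathcal{S}_{ij} = \mathcal{M}_{ij} \cdot \mathcal{C}_{ij},
  \qquad L_{\text{seesaw}} = -\log \hat{\sigma}_i

where the mitigation factor M_ij = min(1, (N_j / N_i)^p) is driven by the accumulated instance counts N and the exponent p, and the compensation factor C_ij = max(1, (sigma_j / max(sigma_i, eps))^q) is driven by the predicted probabilities sigma, the exponent q, and the smoothing value eps.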

Parameters:
  • use_sigmoid (bool) – Whether the prediction uses sigmoid instead of softmax. Only False is supported. Defaults to False.

  • p (float) – The exponent p of the mitigation factor. Defaults to 0.8.

  • q (float) – The exponent q of the compensation factor. Defaults to 2.0.

  • num_classes (int) – The number of classes. Defaults to 1000 for the ImageNet dataset.

  • eps (float) – The minimum value of the divisor used to smooth the computation of the compensation factor. Defaults to 1e-2.

  • reduction (str) – The method that reduces the loss to a scalar. Options are “none”, “mean” and “sum”. Defaults to “mean”.

  • loss_weight (float) – The weight of the loss. Defaults to 1.0.
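
Example (an illustrative sketch; the 10-class setup, batch size, and random tensors below are assumptions for demonstration, not part of the API):

>>> import torch
>>> from mmpretrain.models.losses import SeesawLoss
>>> # Hypothetical 10-class long-tailed classification problem.
>>> loss_fn = SeesawLoss(p=0.8, q=2.0, num_classes=10)
>>> cls_score = torch.randn(8, 10)       # raw logits, shape (N, C)
>>> labels = torch.randint(0, 10, (8,))  # class indices, shape (N,)
>>> loss = loss_fn(cls_score, labels)    # scalar tensor, since reduction='mean'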

forward(cls_score, labels, weight=None, avg_factor=None, reduction_override=None)[source]

Forward function.

Parameters:
  • cls_score (torch.Tensor) – The prediction with shape (N, C).

  • labels (torch.Tensor) – The ground-truth labels of the prediction, with shape (N,).

  • weight (torch.Tensor, optional) – Sample-wise loss weight.

  • avg_factor (int, optional) – Average factor that is used to average the loss. Defaults to None.

  • reduction_override (str, optional) – The method used to override the default reduction of the loss. Options are “none”, “mean” and “sum”. Defaults to None.

Returns:

The calculated loss.

Return type:

torch.Tensor
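
The optional arguments can be combined as in the sketch below (continuing the example above; the per-sample weights and the avg_factor value are illustrative assumptions):

>>> weight = torch.ones(8)                   # sample-wise loss weights
>>> per_sample = loss_fn(cls_score, labels, weight=weight,
...                      reduction_override='none')      # unreduced, one value per sample
>>> averaged = loss_fn(cls_score, labels, avg_factor=8)  # summed loss divided by avg_factor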
