Mixup

class mmpretrain.models.utils.batch_augments.Mixup(alpha)[source]

Mixup batch augmentation.

Mixup is a method that reduces the memorization of corrupt labels and increases robustness to adversarial examples. It was proposed in mixup: Beyond Empirical Risk Minimization.

Parameters:

alpha (float) – Parameter of the Beta distribution used to generate the mixing ratio. It should be a positive number. More details are in the note.

Note

The \(\alpha\) (alpha) determines a random distribution \(Beta(\alpha, \alpha)\). For each batch of data, we sample a mixing ratio (marked as \(\lambda\), lam) from the random distribution.
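As an illustrative sketch (not mmpretrain code), sampling \(\lambda\) from \(Beta(\alpha, \alpha)\) with NumPy might look like this; the variable names are hypothetical:

```python
import numpy as np

# Sketch: sample the Mixup mixing ratio lam from Beta(alpha, alpha).
# alpha = 1.0 makes Beta(1, 1) uniform on [0, 1]; smaller alpha pushes
# lam toward 0 or 1 (weaker mixing), larger alpha concentrates it near 0.5.
alpha = 1.0
rng = np.random.default_rng(0)
lam = rng.beta(alpha, alpha)  # a float in [0, 1]
```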

__call__(batch_inputs, batch_score)[source]

Mix the batch inputs and batch data samples.

mix(batch_inputs, batch_scores)[source]

Mix the batch inputs and the batch of one-hot format ground-truth labels.

Parameters:
  • batch_inputs (Tensor) – A batch of images tensor in the shape of (N, C, H, W).

  • batch_scores (Tensor) – A batch of one-hot format labels in the shape of (N, num_classes).

Returns:

The mixed inputs and labels.

Return type:

Tuple[Tensor, Tensor]
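A minimal NumPy sketch of the mixing step described above (the real method operates on torch Tensors; the function name and helper arguments here are illustrative assumptions, not the library's API):

```python
import numpy as np

def mixup_batch(batch_inputs, batch_scores, alpha=1.0, rng=None):
    """Sketch of Mixup: blend each sample with a randomly paired partner.

    batch_inputs: images of shape (N, C, H, W)
    batch_scores: one-hot labels of shape (N, num_classes)
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)                 # one mixing ratio per batch
    index = rng.permutation(len(batch_inputs))   # random pairing of samples
    mixed_inputs = lam * batch_inputs + (1 - lam) * batch_inputs[index]
    mixed_scores = lam * batch_scores + (1 - lam) * batch_scores[index]
    return mixed_inputs, mixed_scores

# Usage: mixed labels remain valid probability vectors (rows sum to 1).
inputs = np.random.rand(4, 3, 8, 8)
scores = np.eye(10)[[0, 1, 2, 3]]
mixed_inputs, mixed_scores = mixup_batch(inputs, scores, alpha=0.8,
                                         rng=np.random.default_rng(42))
```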
