- class mmpretrain.models.utils.batch_augments.Mixup(alpha)
Mixup batch augmentation.
Mixup is a method that reduces the memorization of corrupt labels and increases robustness to adversarial examples. It's proposed in mixup: Beyond Empirical Risk Minimization.
alpha (float) – Parameter of the Beta distribution used to generate the mixing ratio. It should be a positive number. More details are in the note below.
The \(\alpha\) (alpha) parameter determines the random distribution \(Beta(\alpha, \alpha)\). For each batch of data, we sample a mixing ratio (marked as \(\lambda\), lam) from this distribution.
- __call__(batch_inputs, batch_scores)
Mix the batch inputs and batch data samples.
- mix(batch_inputs, batch_scores)
Mix the batch inputs and batch one-hot format ground truth.
batch_inputs (Tensor) – A batch of images tensor in the shape of (N, C, H, W).
batch_scores (Tensor) – A batch of one-hot format labels in the shape of (N, num_classes).
- Returns:
The mixed inputs and labels.
- Return type:
Tuple[Tensor, Tensor]
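The mixing described above can be sketched as follows. This is a minimal, self-contained approximation of the technique, not the library's implementation: it samples \(\lambda\) from \(Beta(\alpha, \alpha)\), pairs each sample with a randomly shuffled partner, and linearly interpolates both images and one-hot labels. The function name mixup and the default alpha=1.0 are illustrative choices, not part of the mmpretrain API.

```python
import torch


def mixup(batch_inputs, batch_scores, alpha=1.0):
    """Sketch of Mixup over a batch.

    batch_inputs: images of shape (N, C, H, W).
    batch_scores: one-hot labels of shape (N, num_classes).
    """
    # Sample the mixing ratio lambda from Beta(alpha, alpha).
    lam = float(torch.distributions.Beta(alpha, alpha).sample())
    # Pair each sample with a random partner by shuffling the batch dimension.
    index = torch.randperm(batch_inputs.size(0))
    # Linearly interpolate both the images and the one-hot labels.
    mixed_inputs = lam * batch_inputs + (1 - lam) * batch_inputs[index]
    mixed_scores = lam * batch_scores + (1 - lam) * batch_scores[index]
    return mixed_inputs, mixed_scores
```

Note that because the labels are mixed with the same \(\lambda\) as the images, each mixed label row still sums to 1 and can be consumed by a soft-target cross-entropy loss.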