DenseNet

class mmpretrain.models.backbones.DenseNet(arch='121', in_channels=3, bn_size=4, drop_rate=0, compression_factor=0.5, memory_efficient=False, norm_cfg={'type': 'BN'}, act_cfg={'type': 'ReLU'}, out_indices=-1, frozen_stages=0, init_cfg=None)[source]

DenseNet.

A PyTorch implementation of Densely Connected Convolutional Networks.

Modified from the official repo and the PyTorch (torchvision) implementation.

Parameters:
  • arch (str | dict) –

    The model’s architecture. If a string, it should be one of the architectures in DenseNet.arch_settings. If a dict, it should include the following keys (see the usage sketch after this parameter list):

    • growth_rate (int): Each layer of a DenseBlock produces k feature maps, where k is referred to as the growth rate of the network.

    • depths (list[int]): Number of repeated layers in each DenseBlock.

    • init_channels (int): The number of output channels of the stem layer.

    Defaults to ‘121’.

  • in_channels (int) – Number of input image channels. Defaults to 3.

  • bn_size (int) – The channel expansion factor of the 1x1 bottleneck convolution layer. Defaults to 4.

  • drop_rate (float) – Drop rate of the dropout layer. Defaults to 0.

  • compression_factor (float) – The reduction rate of transition layers. Defaults to 0.5.

  • memory_efficient (bool) – If True, use gradient checkpointing, which is much more memory efficient but slower. See the paper Memory-Efficient Implementation of DenseNets for details. Defaults to False.

  • norm_cfg (dict) – The config dict for norm layers. Defaults to dict(type='BN').

  • act_cfg (dict) – The config dict for activation after each convolution. Defaults to dict(type='ReLU').

  • out_indices (Sequence | int) – Output from which stages. Defaults to -1, which means the last stage.

  • frozen_stages (int) – Stages to be frozen (all parameters fixed). Defaults to 0, which means not freezing any parameters.

  • init_cfg (dict, optional) – Initialization config dict.
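
Example: a minimal usage sketch (assuming mmpretrain and PyTorch are installed; the input size and the custom architecture values below are illustrative, not prescribed by this API):

    import torch
    from mmpretrain.models.backbones import DenseNet

    # Build DenseNet-121 from the predefined architecture string.
    model = DenseNet(arch='121')
    model.eval()

    inputs = torch.rand(1, 3, 224, 224)
    # The backbone returns a tuple of feature maps, one per requested stage
    # (only the last stage with the default out_indices=-1).
    outputs = model(inputs)
    for feat in outputs:
        print(feat.shape)

    # A custom architecture can instead be passed as a dict with the
    # growth_rate, depths and init_channels keys described above.
    custom_model = DenseNet(arch=dict(growth_rate=32,
                                      depths=[6, 12, 24, 16],
                                      init_channels=64))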
