Network architectures

Blocks

Convolution

class Convolution(dimensions, in_channels, out_channels, strides=1, kernel_size=3, act='PRELU', norm='INSTANCE', dropout=None, dilation=1, bias=True, conv_only=False, is_transposed=False)[source]

Constructs a convolution with optional dropout, normalization, and activation layers.

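A minimal usage sketch (the import path is an assumption and may differ between versions):

import torch
from monai.networks.blocks.convolutions import Convolution  # assumed import path

# 3D block: convolution -> instance norm -> (optional dropout) -> PReLU
conv = Convolution(dimensions=3, in_channels=1, out_channels=16, strides=2)
x = torch.randn(4, 1, 32, 32, 32)  # [Batch, channels, H, W, D]
y = conv(x)                        # strides=2 halves each spatial dim: [4, 16, 16, 16, 16]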

ResidualUnit

class ResidualUnit(dimensions, in_channels, out_channels, strides=1, kernel_size=3, subunits=2, act='PRELU', norm='INSTANCE', dropout=None, dilation=1, bias=True, last_conv_only=False)[source]

Constructs a residual unit: a chain of convolution subunits (each as in Convolution, with optional normalization, dropout, and activation) summed with a residual connection from the input.

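A usage sketch along the same lines (import path assumed as above):

import torch
from monai.networks.blocks.convolutions import ResidualUnit  # assumed import path

# two convolution subunits plus a residual connection; strides=2 downsamples
res = ResidualUnit(dimensions=3, in_channels=16, out_channels=32, strides=2)
x = torch.randn(4, 16, 16, 16, 16)
y = res(x)  # [4, 32, 8, 8, 8]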

Layers

Factories

Defines factories for creating layers in generic, extensible, and dimensionally independent ways. A separate factory object is created for each type of layer, and factory functions keyed to names are added to these objects. Whenever a layer is requested, the factory name and any necessary arguments are passed to the factory object. The return value is typically a type but can be any callable producing a layer object.

The factory objects contain functions keyed to names converted to upper case; these names can be referred to as members of the factory so that they can function as constant identifiers, e.g. instance normalisation is named Norm.INSTANCE.

For example, to get a transpose convolution layer the name is needed and then a dimension argument is provided which is passed to the factory function:

dimension = 3
name = Conv.CONVTRANS
conv = Conv[name, dimension]

This allows the dimension value to be set in the constructor, for example so that the dimensionality of a network is parameterizable. Not all factories require arguments after the name; the caller must be aware of which are required.
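The value returned here is the type itself (assuming Conv.CONVTRANS with dimension 3 resolves to torch.nn.ConvTranspose3d), which is then instantiated as a normal PyTorch layer:

# `conv` from the example above is a layer type, not an instance
layer = conv(in_channels=1, out_channels=4, kernel_size=3, stride=2)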

Defining new factories involves creating the object then associating it with factory functions:

fact = LayerFactory()

@fact.factory_function('test')
def make_something(x, y):
    # do something with x and y to choose which layer type to return
    return SomeLayerType
...

# request object from factory TEST with 1 and 2 as values for x and y
layer = fact[fact.TEST, 1, 2]

Typically the caller of a factory knows what arguments to pass (i.e. the dimensionality of the requested type), but the use of a factory can itself be parameterized with the factory name and the arguments to pass to the created type at instantiation time:

def use_factory(fact_args):
    # unpack the factory name and the keyword arguments for the created type
    fact_name, type_args = fact_args
    layer_type = fact[fact_name, 1, 2]
    return layer_type(**type_args)
...

# request object from factory TEST with 1 and 2 as values for x and y,
# then instantiate it with the given keyword arguments
kw_args = {'arg0': 0, 'arg1': True}
layer = use_factory((fact.TEST, kw_args))

LayerFactory

class LayerFactory[source]

Factory object for creating layers. It uses the given factory functions to actually produce the types or constructing callables; these functions are referred to by name and can be added at any time.

SkipConnection

class SkipConnection(submodule, cat_dim=1)[source]

Concatenates the forward pass input with the result from the given submodule along dimension cat_dim.
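A brief sketch (the import path monai.networks.layers.simplelayers is an assumption):

import torch
import torch.nn as nn
from monai.networks.layers.simplelayers import SkipConnection  # assumed import path

skip = SkipConnection(nn.Conv2d(8, 8, kernel_size=3, padding=1))
x = torch.randn(2, 8, 32, 32)
y = skip(x)  # input and submodule output concatenated on cat_dim=1: [2, 16, 32, 32]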


Flatten

class Flatten[source]

Flattens the given input in the forward pass to be [B,-1] in shape.
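For example (import path assumed, as for SkipConnection):

import torch
from monai.networks.layers.simplelayers import Flatten  # assumed import path

flat = Flatten()
x = torch.randn(4, 3, 8, 8)
y = flat(x)  # [4, 192]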


GaussianFilter

class GaussianFilter(spatial_dims, sigma, truncated=4.0, device=None)[source]
Parameters
  • spatial_dims (int) – number of spatial dimensions of the input image; the input must have shape (Batch, channels, H[, W, …]).

  • sigma (float) – standard deviation of the Gaussian kernel.

  • truncated (float) – the kernel is truncated at this many standard deviations.

  • device (torch.device) – device on which the tensor will be allocated.

__call__(x)[source]
Parameters

x (tensor) – input tensor of shape [Batch, channels, H[, W, D]].
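A usage sketch (import path assumed):

import torch
from monai.networks.layers.simplelayers import GaussianFilter  # assumed import path

gf = GaussianFilter(spatial_dims=2, sigma=1.5)
img = torch.randn(1, 1, 64, 64)  # [Batch, channels, H, W]
smoothed = gf(img)               # same shape, Gaussian-smoothed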

Nets

Densenet3D

class DenseNet(spatial_dims, in_channels, out_channels, init_features=64, growth_rate=32, block_config=(6, 12, 24, 16), bn_size=4, dropout_prob=0)[source]

DenseNet based on “Densely Connected Convolutional Networks” (https://arxiv.org/pdf/1608.06993.pdf). Adapted from the PyTorch Hub 2D version: https://github.com/pytorch/vision/blob/master/torchvision/models/densenet.py

Parameters
  • spatial_dims (int) – number of spatial dimensions of the input image.

  • in_channels (int) – number of input channels.

  • out_channels (int) – number of output classes.

  • init_features (int) – number of filters in the first convolution layer.

  • growth_rate (int) – how many filters to add each layer (k in the paper).

  • block_config (tuple) – how many layers in each pooling block.

  • bn_size (int) – multiplicative factor for the number of bottleneck layers (i.e. bn_size * k features in the bottleneck layer).

  • dropout_prob (float) – dropout rate after each dense layer.


densenet.densenet121()
densenet.densenet169()
densenet.densenet201()
densenet.densenet264()
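A minimal classification sketch; the import path, and the assumption that the convenience constructors forward their keyword arguments to DenseNet, may differ between versions:

import torch
from monai.networks.nets import densenet  # assumed import path

net = densenet.densenet121(spatial_dims=3, in_channels=1, out_channels=2)
x = torch.randn(2, 1, 64, 64, 64)
logits = net(x)  # [2, 2], one score per output class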

Highresnet

class ConvNormActi(spatial_dims, in_channels, out_channels, kernel_size, norm_type=None, acti_type=None, dropout_prob=None)[source]

Constructs a convolution layer with optional normalization, activation, and dropout, as configured by norm_type, acti_type, and dropout_prob.


class HighResBlock(spatial_dims, in_channels, out_channels, kernels=(3, 3), dilation=1, norm_type='instance', acti_type='relu', channel_matching='pad')[source]
Parameters
  • kernels (list of int) – each integer k in kernels corresponds to a convolution layer with kernel size k.

  • channel_matching ('pad'|'project') – handling residual branch and conv branch channel mismatches with either zero padding (‘pad’) or a trainable conv with kernel size 1 (‘project’).

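For instance, a block with two 3x3x3 convolutions and zero-padded channel matching might be constructed as follows (import path assumed):

from monai.networks.nets.highresnet import HighResBlock  # assumed import path

block = HighResBlock(spatial_dims=3, in_channels=16, out_channels=32,
                     kernels=(3, 3), channel_matching='pad')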

class HighResNet(spatial_dims=3, in_channels=1, out_channels=1, norm_type='batch', acti_type='relu', dropout_prob=None, layer_params=({'name': 'conv_0', 'n_features': 16, 'kernel_size': 3}, {'name': 'res_1', 'n_features': 16, 'kernels': (3, 3), 'repeat': 3}, {'name': 'res_2', 'n_features': 32, 'kernels': (3, 3), 'repeat': 3}, {'name': 'res_3', 'n_features': 64, 'kernels': (3, 3), 'repeat': 3}, {'name': 'conv_1', 'n_features': 80, 'kernel_size': 1}, {'name': 'conv_2', 'kernel_size': 1}))[source]

Reimplementation of highres3dnet based on Li et al., “On the compactness, efficiency, and representation of 3D convolutional networks: Brain parcellation as a pretext task”, IPMI ’17

Adapted from: https://github.com/NifTK/NiftyNet/blob/v0.6.0/niftynet/network/highres3dnet.py https://github.com/fepegar/highresnet

Parameters
  • spatial_dims (int) – number of spatial dimensions of the input image.

  • in_channels (int) – number of input channels.

  • out_channels (int) – number of output channels.

  • norm_type ('batch'|'instance') – feature normalisation with batchnorm or instancenorm.

  • acti_type ('relu'|'prelu'|'relu6') – non-linear activation using ReLU, PReLU, or ReLU6.

  • dropout_prob (float) – probability of the feature map to be zeroed (only applies to the penultimate conv layer).

  • layer_params (list of dictionaries) – specifies the key parameters of each layer/block.

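A segmentation sketch (import path assumed); the network is fully convolutional and preserves the spatial dimensions of its input:

import torch
from monai.networks.nets import HighResNet  # assumed import path

net = HighResNet(spatial_dims=3, in_channels=1, out_channels=4)
x = torch.randn(1, 1, 32, 32, 32)
logits = net(x)  # voxel-wise class scores: [1, 4, 32, 32, 32]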

Unet

class UNet(dimensions, in_channels, out_channels, channels, strides, kernel_size=3, up_kernel_size=3, num_res_units=0, act='PRELU', norm='INSTANCE', dropout=0)[source]

Constructs a UNet with an encoding/decoding path whose depth is defined by channels and strides; when num_res_units is greater than 0, residual units are used in place of plain convolutions.

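A usage sketch (import path assumed), with one stride per downsampling step (one fewer entry than channels); each stride of 2 halves the spatial dimensions along the encoding path:

import torch
from monai.networks.nets import UNet  # assumed import path

net = UNet(dimensions=3, in_channels=1, out_channels=2,
           channels=(16, 32, 64), strides=(2, 2), num_res_units=2)
x = torch.randn(1, 1, 64, 64, 64)
y = net(x)  # segmentation logits: [1, 2, 64, 64, 64]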

Unet

alias of monai.networks.nets.unet.UNet

unet

alias of monai.networks.nets.unet.UNet