Layers

Custom torch layers for neural architectures.

class ecgan.utils.layers.MinibatchDiscrimination(in_features, out_features, kernel_dims=16, calc_mean=False)[source]

Bases: torch.nn.modules.module.Module

Minibatch discrimination layer (Salimans et al. 2016), based on https://gist.github.com/t-ae/732f78671643de97bbe2c46519972491.

forward(x)[source]

Forward pass of the Minibatch Discriminator.

Return type

Tensor
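
For illustration, a minimal sketch of such a minibatch discrimination layer, loosely following the linked gist. The class name MinibatchDiscriminationSketch and all internals are assumptions; the actual ECGAN implementation may differ:

    import torch
    from torch import nn, Tensor

    class MinibatchDiscriminationSketch(nn.Module):
        """Sketch of minibatch discrimination; not the ECGAN implementation."""

        def __init__(self, in_features: int, out_features: int,
                     kernel_dims: int = 16, calc_mean: bool = False):
            super().__init__()
            self.out_features = out_features
            self.kernel_dims = kernel_dims
            self.calc_mean = calc_mean
            # Learned projection tensor T, shape (in_features, out_features * kernel_dims).
            self.T = nn.Parameter(torch.randn(in_features, out_features * kernel_dims))

        def forward(self, x: Tensor) -> Tensor:
            n = x.size(0)
            # Project each sample: (N, in_features) -> (N, out_features, kernel_dims).
            m = x.matmul(self.T).view(n, self.out_features, self.kernel_dims)
            # Pairwise L1 distances across the minibatch: (N, N, out_features).
            dist = (m.unsqueeze(0) - m.unsqueeze(1)).abs().sum(dim=3)
            # Similarity to all other samples; subtract exp(0) = 1 for self-distance.
            o = torch.exp(-dist).sum(dim=0) - 1
            if self.calc_mean:
                o = o / (n - 1)
            # Concatenate the minibatch statistics to the original features.
            return torch.cat([x, o], dim=1)

The output has in_features + out_features columns, so a discriminator can detect minibatches whose samples are suspiciously similar to each other, a common symptom of GAN mode collapse.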

class ecgan.utils.layers.MinibatchDiscriminationSimple[source]

Bases: torch.nn.modules.module.Module

Simplified minibatch discrimination (minibatch standard deviation) from Karras et al. 2018.

static forward(x)[source]

Forward pass of the Minibatch Discriminator.

Return type

Tensor
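
A plausible reading of this layer is the minibatch standard deviation trick from Karras et al. 2018: append the average per-feature standard deviation of the batch as one extra feature. A hedged sketch, assuming 2D input of shape (batch, features); the real ECGAN code may aggregate differently:

    import torch
    from torch import Tensor

    def minibatch_stddev(x: Tensor) -> Tensor:
        # Per-feature standard deviation across the batch, averaged to a scalar.
        std = x.std(dim=0, unbiased=False).mean()
        # Replicate the scalar for every sample and append it as one extra feature.
        stat = std.expand(x.size(0), 1)
        return torch.cat([x, stat], dim=1)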

ecgan.utils.layers.initialize_weights(network, init_config)[source]

Initialize weights of a Torch architecture.

Currently supported are:

  • 'normal': Sampling from a normal distribution. Parameters: mean, std

  • 'uniform': Sampling from a uniform distribution. Parameters: upper_bound, lower_bound

  • 'he': He initialization. He, K. et al. (2015)

  • 'glorot': Glorot initialization. Glorot, X. & Bengio, Y. (2010)

Biases and BatchNorm layers are not initialized with this function, as different strategies are applicable for these tensors/layers. PyTorch's default initialization from layer creation is therefore kept in these cases.

Return type

None
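
A hedged sketch of how such a dispatcher could look. The config keys (name, mean, std, lower_bound, upper_bound) mirror the parameter names listed above, but the exact schema of init_config is an assumption:

    from torch import nn

    def initialize_weights_sketch(network: nn.Module, init_config: dict) -> None:
        """Sketch only; the real internals of initialize_weights are assumed."""
        kind = init_config['name']  # hypothetical config key
        for module in network.modules():
            # Skip biases and BatchNorm: they keep PyTorch's default init.
            if not isinstance(module, (nn.Linear, nn.Conv1d, nn.Conv2d)):
                continue
            if kind == 'normal':
                nn.init.normal_(module.weight, init_config['mean'], init_config['std'])
            elif kind == 'uniform':
                nn.init.uniform_(module.weight,
                                 init_config['lower_bound'], init_config['upper_bound'])
            elif kind == 'he':
                nn.init.kaiming_normal_(module.weight)  # He et al. (2015)
            elif kind == 'glorot':
                nn.init.xavier_normal_(module.weight)   # Glorot & Bengio (2010)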

ecgan.utils.layers.initialize_batchnorm(module, **kwargs)[source]

Explicitly initialize batchnorm layers with a normal distribution.
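
For instance, a DCGAN-style batchnorm initialization could look like the following sketch; the mean/std defaults are assumptions, not the documented kwargs:

    from torch import nn

    def initialize_batchnorm_sketch(module: nn.Module,
                                    mean: float = 1.0, std: float = 0.02) -> None:
        # Sketch: scale weights around 1 and zero the shift, DCGAN-style.
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            nn.init.normal_(module.weight, mean, std)
            nn.init.zeros_(module.bias)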

ecgan.utils.layers.is_normalization_layer(module)[source]

Check whether a module is an input normalization layer.
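
A minimal sketch of such a check, assuming the common PyTorch normalization types; the actual set of layer types checked by ECGAN may differ:

    from torch import nn

    def is_normalization_layer_sketch(module: nn.Module) -> bool:
        # Assumed set of normalization layers; illustration only.
        return isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d,
                                   nn.InstanceNorm1d, nn.InstanceNorm2d, nn.LayerNorm))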