renate.models.layers.cn module#

class renate.models.layers.cn.ContinualNorm(num_features, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, device=None, dtype=None, num_groups=32)[source]#

Bases: _BatchNorm

Continual Normalization as a replacement for Batch Normalization.

Pham, Quang, Chenghao Liu, and Steven Hoi. “Continual normalization: Rethinking batch normalization for online continual learning.” International Conference on Learning Representations (2022).

It applies Group Normalization over a user-defined number of channel groups (num_groups), followed by Batch Normalization; see the usage sketch after the parameter list below.

Parameters:
  • num_features (int) – The number of input features in the channel dimension.

  • eps (float) – A value added to the denominator for numerical stability.

  • momentum (float) – The value used for the running_mean and running_var computation. Can be set to None for a cumulative moving average.

  • affine (bool) – Whether learnable affine parameters are going to be used in Batch Normalization.

  • track_running_stats (bool) – Whether running stats are tracked in Batch Normalization.

  • device – The device on which to store the parameters.

  • dtype – The data type of the learnable parameters.

  • num_groups (int) – The number of groups in the Group Normalization.
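
A minimal usage sketch, assuming 4D activations of shape (N, C, H, W); the tensor sizes and parameter values below are illustrative only:

```python
import torch

from renate.models.layers.cn import ContinualNorm

# Replace a BatchNorm2d layer with Continual Normalization over 32 groups.
cn = ContinualNorm(num_features=64, num_groups=32)

x = torch.randn(8, 64, 16, 16)  # illustrative batch of activations
out = cn(x)                     # same shape as the input: (8, 64, 16, 16)
```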

forward(input)[source]#

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the forward pass needs to be defined within this function, you should call the Module instance itself rather than this method, since the former takes care of running the registered hooks while the latter silently ignores them.
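
Conceptually, the forward computation applies Group Normalization (per-sample statistics over num_groups channel groups) and then Batch Normalization to its output. The sketch below illustrates that composition with standard PyTorch operations; it is a conceptual approximation, not the exact implementation of this method:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def continual_norm_sketch(x: torch.Tensor, num_groups: int, bn: nn.BatchNorm2d) -> torch.Tensor:
    # Group Normalization without affine parameters, then Batch Normalization.
    gn_out = F.group_norm(x, num_groups)  # per-sample statistics within each group
    return bn(gn_out)                     # mini-batch statistics across samples

bn = nn.BatchNorm2d(64)
y = continual_norm_sketch(torch.randn(8, 64, 16, 16), num_groups=32, bn=bn)
```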

extra_repr()[source]#

Set the extra representation of the module.

To print customized extra information, you should re-implement this method in your own modules. Both single-line and multi-line strings are acceptable.

num_features: int#
eps: float#
momentum: float#
affine: bool#
track_running_stats: bool#