Normalization Techniques for Training Very Deep Neural Networks
How can we efficiently train very deep neural network architectures? What are the best in-layer normalization options? Read on and find out.
Tags: normalization, batch-normalization, layer-normalization, group-normalization, regularization, neural-networks, article, tutorial

An in-depth analysis of different normalization methods for deep neural networks. A must-read if you are looking to get into the AI field or are preparing for an AI interview.

Topics covered:

  • Batch Normalization
  • Synchronized Batch Normalization
  • Layer Normalization
  • Instance Normalization
  • Weight Normalization
  • Group Normalization
  • Weight Standardization
  • SPADE
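
Before diving into the article, it helps to see what the activation-normalization layers in this list have in common: they differ mainly in the axes over which the mean and variance statistics are computed. Below is a minimal sketch, assuming PyTorch and an illustrative 4D activation tensor of shape (N, C, H, W); the shapes and layer arguments are examples, not taken from the original post.

```python
# Minimal sketch (assuming PyTorch): the activation-normalization layers
# listed above differ mainly in the axes over which the mean/variance
# statistics are computed for a (N, C, H, W) activation tensor.
import torch
import torch.nn as nn

x = torch.randn(8, 32, 16, 16)  # (batch N, channels C, height H, width W)

norms = {
    # Batch norm: one mean/var per channel, computed over (N, H, W).
    "batch": nn.BatchNorm2d(num_features=32),
    # Layer norm: one mean/var per sample, computed over (C, H, W).
    "layer": nn.LayerNorm(normalized_shape=[32, 16, 16]),
    # Instance norm: one mean/var per sample and channel, computed over (H, W).
    "instance": nn.InstanceNorm2d(num_features=32),
    # Group norm: one mean/var per sample and channel group; num_groups=1
    # behaves like layer norm, num_groups=C like instance norm.
    "group": nn.GroupNorm(num_groups=8, num_channels=32),
}

for name, layer in norms.items():
    y = layer(x)
    print(f"{name:8s} -> output shape {tuple(y.shape)}")  # shape is unchanged
```

Weight normalization and weight standardization instead operate on the convolution weights rather than the activations, and SPADE conditions the normalization parameters on a semantic map; these are covered in detail in the article.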

