How to make your network smaller and faster using the fastai library.
model-compression fastai knowledge-distillation sparsifying
EvoNorms: Evolving Normalization-Activation Layers
We use evolution to design new layers called EvoNorms, which outperform BatchNorm-ReLU on many tasks.
automl normalization activations activation-layers
Rethinking Batch Normalization in Transformers
We find that batch statistics in NLP exhibit large variance throughout training, which leads to poor BatchNorm performance.
normalization batch-normalization power-normalization transformers
How Batch Normalization Works
The article covers how Batch Normalization works and how it "normalizes" different input distributions to make gradient propagation easier.
batch-normalization batchnorm convolutional-neural-networks article
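The core idea the article above describes can be sketched in a few lines: Batch Normalization standardizes each feature over the batch, then applies a learnable affine transform. A minimal NumPy sketch (the function name and defaults here are illustrative, not from the article):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # x: (batch, features). Normalize each feature over the batch
    # dimension, then scale and shift with learnable gamma/beta.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Inputs with an arbitrary shift and scale come out roughly
# zero-mean and unit-variance per feature.
x = np.random.randn(32, 4) * 10 + 5
y = batch_norm(x)
```

At training time, frameworks also keep running estimates of the mean and variance for use at inference, which this sketch omits.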
Batch Normalization in Keras - Ablation Study
Use BatchNormalization with Keras and observe the effect of changing the batch size and learning rate, and of adding dropout.
batch-normalization keras ablation-studies code
Why Batch Norm Causes Exploding Gradients
Our beloved Batch Norm can actually cause exploding gradients, at least at initialization time.
normalization batch-normalization exploding-gradients weights-initialization
EvoNorm layers in TensorFlow 2
Presents implementations of EvoNormB0 and EvoNormS0 layers as proposed in Evolving Normalization-Activation Layers by Liu et al.
normalization batch-normalization automl batch-norm-relu
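For intuition on what these layers compute: EvoNorm-S0, as described in the Liu et al. paper, replaces the BatchNorm-ReLU pair with x * sigmoid(v * x) divided by a group standard deviation, followed by an affine transform. A rough NumPy sketch under those assumptions (the function signature and channels-last layout are choices made here, not taken from the linked implementation):

```python
import numpy as np

def evonorm_s0(x, v=1.0, gamma=1.0, beta=0.0, groups=4, eps=1e-5):
    # x: (N, H, W, C), channels-last. v, gamma, beta stand in for
    # the layer's learnable parameters (scalars here for simplicity).
    n, h, w, c = x.shape
    xg = x.reshape(n, h, w, groups, c // groups)
    # Per-sample, per-group variance over spatial dims and group channels.
    var = xg.var(axis=(1, 2, 4), keepdims=True)
    std = np.broadcast_to(np.sqrt(var + eps), xg.shape).reshape(x.shape)
    swish = x * (1.0 / (1.0 + np.exp(-v * x)))  # x * sigmoid(v * x)
    return swish / std * gamma + beta
```

Because the statistics are per-sample group statistics rather than batch statistics, S0 behaves the same at training and inference time, unlike BatchNorm.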